WorldWideScience

Sample records for client database applications

  1. Online data mining services for dynamic spatial databases I: system architecture and client applications

    OpenAIRE

    Costa, Manuel; Sousa, Inês; Fonseca, Alexandra; Henriques, Diana; Rosa, Paulo; Franco, Ivan; Capeta, Nuno; Teixeira, Luís; Cardoso, Jorge C. S.; Carvalho, Vasco

    2005-01-01

    This paper describes online data mining services for dynamic spatial databases connected to environmental monitoring networks. These services can use Artificial Neural Networks as data mining techniques to find temporal relations in monitored parameters. The execution of the data mining algorithms is performed at the server side and a distributed processing scheme is used to overcome problems of scalability. To support the discovery of temporal relations, two other families of online servi...

  2. A Multidatabase System as 4-Tiered Client-Server Distributed Heterogeneous Database System

    OpenAIRE

    Mohammad Ghulam Ali

    2009-01-01

    In this paper, we describe a multidatabase system as a 4-tiered Client-Server DBMS architecture. We discuss its functional components and provide an overview of its performance characteristics. The first component of this proposed system is a web-based interface or Graphical User Interface, which resides on top of the client application program; the second component of the system is a client application program running in an application server, which resides on top of the Global Database M...

  3. Multi-tiered Client/Server Database Application Based on Web

    Institute of Scientific and Technical Information of China (English)

    李文生; 潘世兵

    2001-01-01

    This paper discusses the computing model of multi-tiered client/server database applications based on the Web, and proposes a method and steps for constructing such applications with Delphi.

  4. CLIENT-TO-CLIENT STREAMING SCHEME FOR VOD APPLICATIONS

    OpenAIRE

    Nair, T. R. Gopala Krishnan; Dakshayini, M.

    2010-01-01

    In this paper, we propose an efficient client-to-client streaming approach that cooperatively streams video using a chaining technique with unicast communication among the clients. This approach addresses two major issues of VoD: 1) a prefix caching scheme that accommodates a larger number of videos closer to the clients, so that the request-service delay for the user can be minimized; 2) a cooperative proxy and client chaining scheme for streaming the videos using unicasting. This approach minimizes the clien...
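    A minimal sketch of the prefix-caching idea the abstract names: cache the first seconds of the most popular videos at the proxy until a cache budget is exhausted, so that popular requests start playing without contacting the origin server. Function names, the greedy policy and the catalog data are illustrative assumptions, not the authors' algorithm.

```python
# Hypothetical sketch of a prefix-caching decision: greedily cache the
# first `prefix_s` seconds of the most popular videos within a budget.

def choose_prefixes(videos, budget_mb, prefix_s=30):
    """videos: list of (video_id, popularity, bitrate_mbps) tuples."""
    chosen, used = [], 0.0
    # Most popular first: these prefixes save the most startup delays.
    for vid, pop, rate in sorted(videos, key=lambda v: -v[1]):
        size = rate * prefix_s / 8          # prefix size in MB
        if used + size <= budget_mb:
            chosen.append(vid)
            used += size
    return chosen

catalog = [("v1", 0.5, 2.0), ("v2", 0.3, 4.0), ("v3", 0.2, 8.0)]
print(choose_prefixes(catalog, budget_mb=40))   # ['v1', 'v2']
```

Real prefix-caching policies also weigh prefix length against chaining opportunities among clients; the greedy-by-popularity rule is only the simplest starting point.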

  5. Client-server, distributed database strategies in a healthcare record system for a homeless population.

    Science.gov (United States)

    Chueh, H C; Barnett, G O

    1993-01-01

    A computer-based healthcare record system being developed for Boston's Healthcare for the Homeless Program (BHCHP) uses client-server and distributed database technologies to enhance the delivery of healthcare to patients of this unusual population. The needs of physicians, nurses and social workers are specifically addressed in the application interface so that an integrated approach to healthcare for this population can be facilitated. These patients and their providers have unique medical information needs that are supported by both database and applications technology. To integrate the information capabilities with the actual practice of providers of care to the homeless, this computer-based record system is designed for remote and portable use over regular phone lines. An initial standalone system is being used at one major BHCHP site of care. This project describes methods for creating a secure, accessible, and scalable computer-based medical record using client-server, distributed database design. PMID:8130445

  6. RA radiological characterization database application

    International Nuclear Information System (INIS)

    Radiological characterization of the RA research reactor is one of the main activities in the first two years of the reactor decommissioning project. The raw characterization data from direct measurements or laboratory analyses (defined within the existing sampling and measurement programme) have to be interpreted, organized and summarized in order to prepare the final characterization survey report. This report should be made so that the radiological condition of the entire site is completely and accurately shown, with the radiological condition of the components clearly depicted. This paper presents an electronic database application, designed as a serviceable and efficient tool for characterization data storage, review and analysis, as well as for report generation. A relational database model was designed, and the application was built using Microsoft Access 2002 (SP1), a 32-bit RDBMS for desktop and client/server database applications that run under Windows XP. (author)

  7. Cloud Storage Client Application Analysis

    Directory of Open Access Journals (Sweden)

    Rakesh Malik

    2015-06-01

    The research proposed in this paper focuses on gathering evidence from devices with UNIX/Linux systems (in particular Ubuntu 14.04) and Android OS, and from Windows 8.1, in order to find artifacts left by cloud storage applications that suggest their use even after the deletion of the applications. The work performed aims to expand upon the prior work done by other researchers in the field of cloud forensics and to show an example of analysis. We show where and what type of data remnants can be found using our analysis and how this information can be used as evidence in a digital forensic investigation.

  8. Database and interface modifications: change management without affecting the clients

    International Nuclear Information System (INIS)

    The first Oracle-based Controls Configuration Database (CCDB) was developed in 1986, by which the controls system of CERN's Proton Synchrotron became data-driven. Since then, this mission-critical system has evolved tremendously, going through several generational changes in terms of the increasing complexity of the control system, software technologies and data models. Today, the CCDB covers the whole CERN accelerator complex and satisfies a much wider range of functional requirements. Despite its online usage, everyday operations of the machines must not be disrupted. This paper describes our approach to dealing with change while ensuring continuity. The successful strategy that has been put in place is based on the following guidelines: -) Involve end-users right from the start, throughout the design and development process; -) Provide four separate environments for development, unit and functional testing, integration testing (TestBed), and production; -) Analyze the impact of a change and try to apply only backward compatible changes; -) Communicate timely, clearly and transparently on scheduled interventions and their impact; and -) Coordinate the upgrades with impacted clients
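    The "backward compatible changes" guideline can be illustrated with a small sketch (here with Python's bundled SQLite rather than Oracle, and with hypothetical table and column names): adding a nullable column with a default leaves queries written by older clients working unchanged.

```python
# Sketch: a schema change that does not break existing clients.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE devices (name TEXT PRIMARY KEY, address INTEGER)")
db.execute("INSERT INTO devices VALUES ('MAG-1', 100)")

# A query written by an old client, before the schema change:
old_query = "SELECT name, address FROM devices"

# Upgrade: ADD COLUMN with a default is backward compatible.
db.execute("ALTER TABLE devices ADD COLUMN description TEXT DEFAULT ''")

# The old client still works, unaware of the new column.
print(db.execute(old_query).fetchall())   # [('MAG-1', 100)]
```

Renaming or dropping a column, by contrast, would break `old_query`, which is why such changes need the coordination and staged environments the paper describes.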

  9. Database And Interface Modifications: Change Management Without Affecting The Clients

    CERN Document Server

    Peryt, M; Martin Marquez, M; Zaharieva, Z

    2011-01-01

    The first Oracle®-based Controls Configuration Database (CCDB) was developed in 1986, by which the controls system of CERN’s Proton Synchrotron became data-driven. Since then, this mission-critical system has evolved tremendously going through several generational changes in terms of the increasing complexity of the control system, software technologies and data models. Today, the CCDB covers the whole CERN accelerator complex and satisfies a much wider range of functional requirements. Despite its online usage, everyday operations of the machines must not be disrupted. This paper describes our approach with respect to dealing with change while ensuring continuity. How do we manage the database schema changes? How do we take advantage of the latest web deployed application development frameworks without alienating the users? How do we minimize impact on the dependent systems connected to databases through various APIs? In this paper we will provide our answers to these questions, and to many more.

  10. A real time multi-server multi-client coherent database for a new high voltage system

    International Nuclear Information System (INIS)

    A high voltage system has been designed to allow multiple users (clients) access to the database of measured values and settings. This database is actively maintained in real time for a given mainframe containing multiple modules, each having its own database. With limited CPU and memory resources, the mainframe system provides a data coherency scheme for multiple clients which (1) allows a client to determine when and what values need to be updated, (2) allows changes made by one client to be detected by another client, and (3) does not depend on the mainframe system tracking client accesses
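    One simple way to satisfy all three properties is a per-channel change counter that clients snapshot and poll. The sketch below is an illustrative assumption (class and channel names are hypothetical), not the paper's actual implementation: the client decides when to update by comparing counters, one client's change is visible to all others, and the server keeps no per-client state.

```python
# Hypothetical sketch of a polling-based coherency scheme: the mainframe
# keeps a change counter per channel; each client compares its last-seen
# snapshot against the counters to find what to re-read.

class HVDatabase:
    def __init__(self):
        self.values = {}      # channel -> latest value
        self.versions = {}    # channel -> change counter

    def set(self, channel, value):
        self.values[channel] = value
        self.versions[channel] = self.versions.get(channel, 0) + 1

    def changed_since(self, snapshot):
        """Channels whose counter moved past the client's snapshot."""
        return {ch for ch, v in self.versions.items()
                if v > snapshot.get(ch, 0)}

db = HVDatabase()
db.set("hv.ch0", 1500)
client_a_snapshot = dict(db.versions)       # client A is now up to date
db.set("hv.ch0", 1450)                      # client B changes a setting
print(db.changed_since(client_a_snapshot))  # {'hv.ch0'}
```

Because the server stores only counters, it never tracks which client has read what, matching property (3).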

  11. Database Application Schema Forensics

    OpenAIRE

    Beyers, Hector Quintus; Olivier, Martin S.; Hancke, Gerhard P.

    2014-01-01

    The application schema layer of a Database Management System (DBMS) can be modified to deliver results that may warrant a forensic investigation. Table structures can be corrupted by changing the metadata of a database or operators of the database can be altered to deliver incorrect results when used in queries. This paper will discuss categories of possibilities that exist to alter the application schema with some practical examples. Two forensic environments are introduced where a forensic ...

  12. Android client application development for existing CRM solution

    OpenAIRE

    Kusyn, Martin

    2013-01-01

    This bachelor's thesis focuses on the development of a client application for the Android mobile operating system. The final application serves as a client for RAYNET Cloud CRM, an existing CRM solution. The first part of the thesis contains a brief introduction of RAYNET s.r.o., the creator of RAYNET Cloud CRM, an introduction of the CRM system itself, and a brief description of its functionality and implementation. Requirements of the mobile client application are discussed and its user ...

  13. Incorporating client-server database architecture and graphical user interface into outpatient medical records.

    OpenAIRE

    Fiacco, P. A.; Rice, W. H.

    1991-01-01

    Computerized medical record systems require structured database architectures for information processing. However, the data must be transferable across heterogeneous platforms and software systems. Client-server architecture allows for distributed processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model with a graphical user interface into an outpatient medical ...

  14. Product Licenses Database Application

    CERN Document Server

    Tonkovikj, Petar

    2016-01-01

    The goal of this project is to organize and centralize the data about software tools available to CERN employees, as well as to provide a system that simplifies the license management process by providing information about the available licenses and their expiry dates. The project development process consists of two steps: modeling the products (software tools), product licenses, legal agreements and other data related to these entities in a relational database, and developing the front-end user interface so that the user can interact with the database. The result is an ASP.NET MVC web application with interactive views for displaying and managing the data in the underlying database.

  15. Database and Expert Systems Applications

    DEFF Research Database (Denmark)

    Viborg Andersen, Kim; Debenham, John; Wagner, Roland

    submissions. The papers are organized in topical sections on workflow automation, database queries, data classification and recommendation systems, information retrieval in multimedia databases, Web applications, implementational aspects of databases, multimedia databases, XML processing, security, XML schemata, query evaluation, semantic processing, information retrieval, temporal and spatial databases, querying XML, organisational aspects of databases, natural language processing, ontologies, Web data extraction, semantic Web, data stream management, data extraction, distributed database systems...

  16. Creating and optimizing client-server applications on mobile devices

    OpenAIRE

    Anacleto, Ricardo; Luz, Nuno; Almeida, Ana; Figueiredo, Lino; Novais, Paulo

    2013-01-01

    Mobile devices are embedded systems with very limited capacities that need to be considered when developing a client-server application, mainly due to technical, ergonomic and economic implications for the mobile user. With the increasing popularity of mobile computing, many developers have faced problems caused by the low performance of devices. In this paper, we discuss how to create and optimize client-server applications for wireless/mobile environments, presenting techniques...

  17. New NED XML/VOtable Services and Client Interface Applications

    Science.gov (United States)

    Pevunova, O.; Good, J.; Mazzarella, J.; Berriman, G. B.; Madore, B.

    2005-12-01

    The NASA/IPAC Extragalactic Database (NED) provides data and cross-identifications for over 7 million extragalactic objects fused from thousands of survey catalogs and journal articles. The data cover all frequencies from radio through gamma rays and include positions, redshifts, photometry and spectral energy distributions (SEDs), sizes, and images. NED services have traditionally supplied data in HTML format for connections from Web browsers, and a custom ASCII data structure for connections by remote computer programs written in the C programming language. We describe new services that provide responses from NED queries in XML documents compliant with the international virtual observatory VOtable protocol. The XML/VOtable services support cone searches, all-sky searches based on object attributes (survey names, cross-IDs, redshifts, flux densities), and requests for detailed object data. Initial services have been inserted into the NVO registry, and others will follow soon. The first client application is a Style Sheet specification for rendering NED VOtable query results in Web browsers that support XML. The second prototype application is a Java applet that allows users to compare multiple SEDs. The new XML/VOtable output mode will also simplify the integration of data from NED into visualization and analysis packages, software agents, and other virtual observatory applications. We show an example SED from NED plotted using VOPlot. The NED website is: http://nedwww.ipac.caltech.edu.
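    The cone searches mentioned above reduce, at their core, to an angular-separation filter: return every object within a given radius of a sky position. The sketch below shows that core in Python using the haversine formula; the catalog entries and function names are illustrative, not NED's actual service interface.

```python
# Sketch of the geometry behind a cone search: keep objects whose
# angular separation from (ra, dec) is within radius_deg.
from math import radians, degrees, sin, cos, asin, sqrt

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation between two sky positions, in degrees."""
    ra1, dec1, ra2, dec2 = map(radians, (ra1, dec1, ra2, dec2))
    a = (sin((dec2 - dec1) / 2) ** 2
         + cos(dec1) * cos(dec2) * sin((ra2 - ra1) / 2) ** 2)
    return degrees(2 * asin(sqrt(a)))       # haversine formula

def cone_search(objects, ra, dec, radius_deg):
    """objects: list of (name, ra_deg, dec_deg) tuples."""
    return [o for o in objects
            if ang_sep_deg(ra, dec, o[1], o[2]) <= radius_deg]

catalog = [("Obj A", 2.79, 27.71), ("Obj B", 2.81, 27.64), ("Far", 10.0, -5.0)]
print(cone_search(catalog, 2.80, 27.70, radius_deg=0.1))
```

A real VOtable service wraps such results in XML with column metadata so that any VO-aware client can parse them.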

  18. Databases and their application

    NARCIS (Netherlands)

    E.C. Grimm; R.H.W Bradshaw; S. Brewer; S. Flantua; T. Giesecke; A.M. Lézine; H. Takahara; J.W.,Jr Williams

    2013-01-01

    During the past 20 years, several pollen database cooperatives have been established. These databases are now constituent databases of the Neotoma Paleoecology Database, a public domain, multiproxy, relational database designed for Quaternary-Pliocene fossil data and modern surface samples. The poll

  19. Client-Centric Adaptive Scheduling of Service-Oriented Applications

    Institute of Scientific and Technical Information of China (English)

    Jing Wang; Li-Yong Zhang; Yan-Bo Han

    2006-01-01

    The paper proposes a client-centric computing model that allows for adaptive execution of service-oriented applications. The model can flexibly dispatch application tasks to the client side and the network side, dynamically adjust an execution scheme to adapt to environmental changes, and is thus expected to achieve better scalability, higher performance and more controllable privacy. Scheduling algorithms and rescheduling strategies are proposed for the model. Experiments show that with the model the performance of service-oriented application execution can be improved.

  20. ATLAS database application enhancements using Oracle 11g

    CERN Document Server

    Dimitrov, G; The ATLAS collaboration; Blaszczyk, M; Sorokoletov, R

    2012-01-01

    The ATLAS experiment at the LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, and condition data replication to remote sites. The Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided into production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the 260 hosted database schemas (in the most common case each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN have...

  1. Database characterisation of HEP applications

    Science.gov (United States)

    Piorkowski, Mariusz; Grancher, Eric; Topurov, Anton

    2012-12-01

    Oracle-based database applications underpin many key aspects of operations for both the LHC accelerator and the LHC experiments. In addition to the overall performance, the predictability of the response is a key requirement to ensure smooth operations, and delivering predictability requires understanding the applications from the ground up. Fortunately, database management systems provide several tools to check, measure, analyse and gather useful information. We present our experiences characterising the performance of several typical HEP database applications, characterisations that were used to deliver improved predictability and scalability as well as to optimise the hardware platform choice as we migrated to new hardware and Oracle 11g.

  2. Database characterisation of HEP applications

    International Nuclear Information System (INIS)

    Oracle-based database applications underpin many key aspects of operations for both the LHC accelerator and the LHC experiments. In addition to the overall performance, the predictability of the response is a key requirement to ensure smooth operations, and delivering predictability requires understanding the applications from the ground up. Fortunately, database management systems provide several tools to check, measure, analyse and gather useful information. We present our experiences characterising the performance of several typical HEP database applications, characterisations that were used to deliver improved predictability and scalability as well as to optimise the hardware platform choice as we migrated to new hardware and Oracle 11g.

  3. Design of an application for credit scoring and client suggestion

    OpenAIRE

    Silva, Fábio; Analide, César

    2010-01-01

    Risk assessment of loan applications is vital for many financial institutions. Most financial institutions have already applied methods of credit scoring and risk assessment in order to evaluate their clients. These systems are often based on deterministic or statistical algorithms. In this context, techniques from artificial intelligence and data mining present themselves as valid alternatives for building such classification systems. In this paper some studies are conducted to evaluate ...

  4. An Object-Oriented Framework for Client-Server Applications

    International Nuclear Information System (INIS)

    When developing high-level accelerator applications it is often necessary to perform extensive calculations to generate a data set that will be used as an input for other applications. Depending on the size and complexity of these computations, regenerating the interim data sets can introduce errors or otherwise negatively impact system performance. If these computational data sets could be generated in advance and updated continuously from changes in the accelerator, the time and effort required in performing subsequent calculations could be substantially reduced. UNIX server applications are well suited to accommodate this need by providing a centralized repository for data or computational power. Because of the inherent difficulty in writing a robust server application, the development of the network communications software is often more burdensome than the computational engine. To simplify the task of building a client/server application, we have developed an object-oriented server shell which hides the complexity of the network software development from the programmer. This document will discuss how to implement a complete client/server application using this C++ class library with a minimal understanding of network communications mechanisms
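    The "server shell" pattern can be sketched in a few lines (here in Python with the standard `socketserver` module rather than the paper's C++ class library; all class names are illustrative): the shell owns every socket detail, and the application author inherits it and overrides only a computational hook.

```python
# Sketch of an object-oriented server shell: networking lives in the
# shell and handler; the application subclass supplies only compute().
import socketserver

class ServerShell(socketserver.ThreadingTCPServer):
    allow_reuse_address = True              # convenient on restart

class LineHandler(socketserver.StreamRequestHandler):
    def handle(self):
        line = self.rfile.readline().decode().strip()
        reply = self.server.compute(line)   # delegate to the app hook
        self.wfile.write((reply + "\n").encode())

class LengthServer(ServerShell):
    """Application code: only the computation, no networking at all."""
    def compute(self, request):
        return str(len(request))            # placeholder calculation

# server = LengthServer(("localhost", 9999), LineHandler)
# server.serve_forever()
```

A real accelerator application would replace `compute()` with the expensive model calculation and keep its results cached between requests, which is exactly the economy the paper is after.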

  5. Database tomography for commercial application

    Science.gov (United States)

    Kostoff, Ronald N.; Eberhart, Henry J.

    1994-01-01

    Database tomography is a method for extracting themes and their relationships from text. The algorithms employed begin with word frequency and word proximity analysis and build upon these results. When the word 'database' is used, think of medical or police records, patents, journals, or papers, etc. (any text information that can be stored on a computer). Database tomography features a full-text, user-interactive technique enabling the user to identify areas of interest, establish relationships, and map trends for a deeper understanding of an area of interest. Database tomography concepts and applications have been reported in journals and presented at conferences. One important feature of the database tomography algorithm is that it can be used on a database of any size, and will facilitate the user's ability to understand the volume of content therein. While employing the process to identify research opportunities, it became obvious that this promising technology has potential applications for business, science, engineering, law, and academe. Examples include evaluating marketing trends, strategies, relationships and associations. The database tomography process would also be a powerful component in the areas of competitive intelligence, national security intelligence and patent analysis. User interests and involvement cannot be overemphasized.
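    The two starting steps named above, word frequency and word proximity analysis, can be sketched directly; the window size and the toy document below are illustrative assumptions, not the authors' parameters.

```python
# Sketch of the first two database-tomography steps: word frequency
# and word proximity (co-occurrence within a small window).
from collections import Counter

def frequencies(text):
    """How often each word occurs in the text."""
    return Counter(text.lower().split())

def proximity_pairs(text, window=3):
    """Count word pairs appearing within `window` positions of each other."""
    words = text.lower().split()
    pairs = Counter()
    for i, w in enumerate(words):
        for u in words[i + 1:i + window]:
            pairs[tuple(sorted((w, u)))] += 1
    return pairs

doc = "client server database client database"
print(frequencies(doc)["database"])              # 2
print(proximity_pairs(doc)[("client", "server")])  # 2
```

Themes then emerge by clustering the high-frequency words that co-occur most often, which is the "build upon these results" step the abstract alludes to.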

  6. ATLAS database application enhancements using Oracle 11g

    International Nuclear Information System (INIS)

    The ATLAS experiment at the LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, and condition data replication to remote sites. The Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided into production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the 260 hosted database schemas (in the most common case each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN were upgraded to the newest Oracle version at the time: Oracle 11g Release 2. Oracle 11g comes with several key improvements compared to previous database engine versions. In this work we present our evaluation of the most relevant new features of Oracle 11g of interest for ATLAS applications and use cases. Notably we report on the performance and scalability enhancements obtained in production since the Oracle 11g deployment during Q1 2012, and we outline plans for future work in this area.

  7. ATLAS database application enhancements using Oracle 11g

    Science.gov (United States)

    Dimitrov, G.; Canali, L.; Blaszczyk, M.; Sorokoletov, R.

    2012-12-01

    The ATLAS experiment at the LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, and condition data replication to remote sites. The Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided into production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the 260 hosted database schemas (in the most common case each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN were upgraded to the newest Oracle version at the time: Oracle 11g Release 2. Oracle 11g comes with several key improvements compared to previous database engine versions. In this work we present our evaluation of the most relevant new features of Oracle 11g of interest for ATLAS applications and use cases. Notably we report on the performance and scalability enhancements obtained in production since the Oracle 11g deployment during Q1 2012, and we outline plans for future work in this area.

  8. Design and implementation of an enterprise information system utilizing a component based three-tier client/server database system

    OpenAIRE

    Akbay, Murat.; Lewis, Steven C.

    1999-01-01

    The Naval Security Group currently requires a modern architecture to merge existing command databases into a single Enterprise Information System through which each command may manipulate administrative data. There are numerous technologies available to build and implement such a system. Component-based architectures are extremely well suited to creating scalable and flexible three-tier Client/Server systems because the data and business logic are encapsulated within objects, allowing them t...

  9. MCIP Client Application for SCADA in Iiot Environment

    Directory of Open Access Journals (Sweden)

    Nicoleta Cristina GAITAN

    2015-09-01

    Modern automation system architectures include several subsystems among which an adequate sharing of the workload is required. These subsystems must work together to fulfil the tasks imposed by their common function, given by the business purpose to be fulfilled. In order to perform these tasks, the subsystems or components must communicate with each other; this is the critical function of the architecture of such a system. This article presents MCIP (Monitoring and Control of Industrial Processes), an object-oriented client application which allows the monitoring and control of industrial processes. As a novelty, the paper presents the architecture of the user object, which is actually a wrapper that allows the connection to the Communication Standard Interface bus, the characteristics of the IIoT (Industrial Internet of Things) object, and the correspondence between a server's address space and the address space of MCIP.

  10. Building Database-Powered Mobile Applications

    OpenAIRE

    Paul POCATILU

    2012-01-01

    Almost all mobile applications use persistency for their data. A common way for complex mobile applications is to store data in local relational databases. Almost all major mobile platforms include a relational database engine. These database engines expose specific APIs (Application Programming Interfaces) to be used by mobile application developers for data definition and manipulation. This paper focuses on database-based application models for several mobile platforms (Android, Symbian, Wind...

  11. MstApp, a rich client control applications framework at DESY

    International Nuclear Information System (INIS)

    The control systems for PETRA 3 (a dedicated synchrotron machine) and its pre-accelerators extensively use rich clients for the control room and the servers. Most of them are written with the help of a rich client Java framework: MstApp. They total 106 different consoles and 158 individual server applications. MstApp takes care of many common control system application aspects beyond communication. It provides a common look and feel: core menu items, a colour scheme for standard states of hardware components, and predefined standardized screen sizes/locations. It interfaces with our console application manager (CAM) and displays our communication link diagnostics tools on demand. MstApp supplies an accelerator context for each application; it handles printing, logging, re-sizing and unexpected application crashes. Thanks to our standardized deploy process, MstApp applications know their individual developers and can even e-mail them at the press of a button. Furthermore, a concept of different operation modes is implemented: view only, operating and expert use. Administration of the corresponding rights is done via web access to a database server. Initialization files on a web server are instantiated as Java objects with the help of the Java SE XMLDecoder. Data tables are read with the same mechanism. New MstApp applications can easily be created with in-house wizards like the NewProjectWizard or the DeviceServerWizard. MstApp improves the operator experience, application developer productivity and delivered software quality. (authors)

  12. Application Concept : Improvement of a UX-Design Office’s Client Experience Through Gamification

    OpenAIRE

    Johansson, Juha

    2014-01-01

    The purpose of this study was to find ways to improve the client experience of the Linja Design office through application design. In a design office like Linja Design, client meetings may take longer than planned, and the clients of the following meetings may need to wait for their contact person. Therefore Linja Design assigned the author to study whether there were ways to improve the experience of this situation. Linja as a design studio also believes strongly in gamification, and therefore this study...

  13. Building Database-Powered Mobile Applications

    Directory of Open Access Journals (Sweden)

    Paul POCATILU

    2012-01-01

    Almost all mobile applications use persistency for their data. A common way for complex mobile applications is to store data in local relational databases. Almost all major mobile platforms include a relational database engine. These database engines expose specific APIs (Application Programming Interfaces) to be used by mobile application developers for data definition and manipulation. This paper focuses on database-based application models for several mobile platforms (Android, Symbian, Windows CE/Mobile and Windows Phone). For each selected platform the API and specific database operations are presented.
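    The local-storage pattern the paper describes can be sketched with Python's bundled SQLite (mobile platforms expose equivalent engines through their own APIs, e.g. SQLite on Android); the table and function names are illustrative assumptions.

```python
# Sketch of a local relational store for an app: create the schema on
# first open, then insert and query through the engine's API.
import sqlite3

def open_db(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS notes (
                      id INTEGER PRIMARY KEY,
                      body TEXT NOT NULL)""")
    return db

db = open_db()
db.execute("INSERT INTO notes (body) VALUES (?)", ("offline draft",))
db.commit()
print(db.execute("SELECT body FROM notes").fetchall())  # [('offline draft',)]
```

On a device, `path` would point into the app's private storage directory instead of `:memory:`, so the data survives restarts and works offline.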

  14. Using a Framework to develop Client-Side App : A Javascript Framework for cross-platform application

    OpenAIRE

    Shakya, Udeep

    2014-01-01

    This project studies the convenience of using a framework to develop client-side applications based on Hypertext Markup Language 5 (HTML5), Cascading Style Sheets (CSS) and JavaScript technology. The application is intended to serve both as a web client application and as a mobile client application for multiple platforms. A survey-answering application which fetches questions (texts) from an Application Programming Interface (API) in the application server and uploads text, sound, video and picture...

  15. Exchanging the Context between OGC Geospatial Web clients and GIS applications using Atom

    Science.gov (United States)

    Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier

    2013-04-01

Currently, the discovery and sharing of geospatial information over the web still presents difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication presents an extension of Atom for redistributing references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds, as well as direct data, and share this status with other users who need the same information but use different client vendor products, in an interoperable way. The extensibility of the Atom format was essential to define a format that could be used in RSS-enabled web browsers, mass-market map viewers and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, the document can be viewed in the usual places where Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description and downloading features. OWS Context uses GeoRSS, so the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context document is to develop an XSLT that transforms the Atom feed into an HTML5 document showing the exact status of the client view window that saved the context document. To accomplish this, we use the width and height of the client window, and the extent of the view in world (geographic) coordinates, in order to calculate the scale of the map. Then we can mix elements in world coordinates (such as CF-NetCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles and direct SVG content). A smarter map browser application called MiraMon Map Browser is able to write a context document and read

  16. International Ventilation Cooling Application Database

    DEFF Research Database (Denmark)

    Holzer, Peter; Psomas, Theofanis Ch.; OSullivan, Paul

    2016-01-01

… systematically investigating the distribution of technologies and strategies within VC. The database is structured as both a ticking-list-like building-spreadsheet and a collection of building-datasheets. The content of both closely follows the Annex 62 State-Of-The-Art Report. The database has been filled, based on desktop research, by Annex 62 participants, namely by the authors. So far the VC database contains 91 buildings, located in Denmark, Ireland and Austria. Further contributions from other countries are expected. The building-datasheets offer illustrative descriptions of buildings of different…

  17. Multimedia database retrieval technology and applications

    CERN Document Server

    Muneesawang, Paisarn; Guan, Ling

    2014-01-01

    This book explores multimedia applications that emerged from computer vision and machine learning technologies. These state-of-the-art applications include MPEG-7, interactive multimedia retrieval, multimodal fusion, annotation, and database re-ranking. The application-oriented approach maximizes reader understanding of this complex field. Established researchers explain the latest developments in multimedia database technology and offer a glimpse of future technologies. The authors emphasize the crucial role of innovation, inspiring users to develop new applications in multimedia technologies

  18. Web Applications Security : A security model for client-side web applications

    OpenAIRE

    Prabhakara, Deepak

    2009-01-01

The Web has evolved to support sophisticated web applications. These web applications are exposed to a number of attacks and vulnerabilities. The existing security model is unable to cope with these increasing attacks, and there is a need for a new security model that not only provides the required security but also supports recent advances like AJAX and mashups. The attacks on client-side web applications can be attributed to four main reasons: 1) lack of a security context for Web Browsers...

  19. The First Android Client Application for the iLab Shared Architecture

    Directory of Open Access Journals (Sweden)

    Bogdan-Alexandru Deaky

    2012-02-01

This paper presents the first Android client application developed for online laboratories based on the iLab Shared Architecture (ISA). An important challenge was to properly connect to the ISA Service Broker, because its current version was developed with browser-based client applications in mind. The application was successfully tested on several real-world mobile devices, and the experience gained represents the basis for future changes to the Service Broker and for future teleengineering applications involving Android.

  20. Memory Storage Issues of Temporal Database Applications on Relational Database Management Systems

    OpenAIRE

    Sami M. Halawani; Nashwan A.A Romema

    2010-01-01

Problem statement: Many existing database applications manage time-varying data. These database applications are referred to as temporal databases or time-oriented database applications, and are considered repositories of time-dependent data. Many proposals have been introduced for developing time-oriented database applications, some of which suggest building support for Temporal Database Management Systems (TDBMS) on top of existing non-temporal DBMSs, while others suggest modifying the m...

  1. Prevention Of Cross-Site Scripting Attacks XSS On Web Applications In The Client Side

    Directory of Open Access Journals (Sweden)

    S Shalini

    2011-07-01

Cross-Site Scripting (XSS) attacks are currently the most popular security problem in modern web applications. These attacks exploit vulnerabilities in the code of web applications, resulting in serious consequences such as theft of cookies, passwords and other personal credentials. Cross-site scripting attacks occur when accessing information in intermediate trusted sites. A client-side solution acts as a web proxy that uses manually generated rules to mitigate cross-site scripting attempts. The client-side solution effectively protects against information leakage from the user's environment. Cross-site scripting attacks are easy to execute, but difficult to detect and prevent. This paper provides a client-side solution to mitigate cross-site scripting attacks. The existing client-side solutions degrade the performance of the client's system, resulting in a poor web surfing experience. This project provides a client-side solution that uses a step-by-step approach to protect against cross-site scripting without much degrading the user's web browsing experience.
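A rule-based filter of the kind the paper attributes to a client-side proxy can be illustrated with a tiny Python sketch. The two rules and the function name below are this sketch's own inventions, far simpler than any production XSS defense.

```python
import html
import re

# Rule 1: a pattern that strips embedded <script> blocks outright.
SCRIPT_RE = re.compile(r"<\s*script.*?>.*?<\s*/\s*script\s*>",
                       re.IGNORECASE | re.DOTALL)

def sanitize(fragment: str) -> str:
    # Drop script blocks, then escape what remains so any residual
    # markup renders inertly instead of executing.
    fragment = SCRIPT_RE.sub("", fragment)
    return html.escape(fragment)

clean = sanitize('hello <script>steal(document.cookie)</script> <b>world</b>')
```

Real filters must also handle event-handler attributes, `javascript:` URLs and encoding tricks; the point here is only the shape of rule-then-escape processing at the client side.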

  2. Research and application of ORACLE performance optimizing technologies for building airplane environment resource database

    Science.gov (United States)

    Zhang, Jianjun; Sun, Jianyong; Cheng, Conggao

    2013-03-01

Many problems exist in processing experimental aircraft vibration (temperature, humidity) data and generating the intermediate calculations during the construction of an airplane environment resource database, such as the need to deal with both structured and non-structured data, the weak data-processing capacity of the client browser, and massive network data transfers. To solve the above problems, several strategies for tuning and optimizing database performance are employed based on Oracle 11g, including data storage structure tuning, server memory configuration, disk I/O tuning and SQL statement tuning. The experimental results show that the performance of the airplane environment resource database is improved by about 80% compared with the database developed in the initial demonstration and validation phase. The application of the new optimization strategies to database construction lays a sound foundation for completing the airplane environment resource database.

  3. Maintaining Stored Procedures in Database Application

    Directory of Open Access Journals (Sweden)

    Santosh Kakade

    2012-06-01

Stored procedures and triggers have an irreplaceable importance in any database application, as they provide a powerful way to code application logic that can be stored on the server and executed according to the needs of the application. Writing stored procedures for a database application involves a set of SQL statements with an assigned name, stored in the database in compiled form so that they can be shared by a number of programs. The use of stored procedures can be helpful in controlling access to data (end-users may enter or change data but do not write procedures), preserving data integrity, and improving productivity, since the statements in a stored procedure only need to be written one time.
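The write-once, call-from-anywhere idea can be sketched in Python. SQLite has no stored procedures, so as a rough analogue this sketch registers application logic with the engine as a user-defined SQL function; the table, rule and names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, net REAL)")
conn.execute("INSERT INTO orders VALUES (1, 100.0)")

def gross(net):
    # Shared business rule written once, then callable from any SQL
    # statement instead of being duplicated in every client program.
    return round(net * 1.2, 2)

conn.create_function("gross", 1, gross)
total = conn.execute("SELECT gross(net) FROM orders WHERE id = 1").fetchone()[0]
```

In a server DBMS with real stored procedures (Oracle, SQL Server, MySQL) the same rule would live in the database itself, giving the access-control and integrity benefits the abstract describes.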

  4. Professional iOS database application programming

    CERN Document Server

    Alessi, Patrick

    2013-01-01

    Updated and revised coverage that includes the latest versions of iOS and Xcode Whether you're a novice or experienced developer, you will want to dive into this updated resource on database application programming for the iPhone and iPad. Packed with more than 50 percent new and revised material - including completely rebuilt code, screenshots, and full coverage of new features pertaining to database programming and enterprise integration in iOS 6 - this must-have book intends to continue the precedent set by the previous edition by helping thousands of developers master database

  5. Cross-platform development of the Smart Client application with Qt framework and QtQuick

    OpenAIRE

    Krajewski, Marek

    2016-01-01

In this thesis the Qt framework is evaluated as a tool that can support the cross-platform development of desktop, mobile and embedded applications. A hybrid client application is developed to assess its capabilities for creating a product that provides a good user experience on a wide range of target devices. The application is required to demonstrate implementation of the graphical user interface, network communication with a server and access to the native development e...

  6. Framework for Deploying Client/Server Distributed Database System for effective Human Resource Information Management Systems in Imo State Civil Service of Nigeria

    OpenAIRE

    Josiah Ahaiwe; Nwaokonkwo Obi

    2012-01-01

The information system is an integrated system that holds the financial and personnel records of persons working in various branches of the Imo state civil service. The purpose is to harmonize operations, reduce or, if possible, eliminate redundancy, and control the introduction of "ghost workers" and fraud in pension management. In this research work, an attempt is made to design a framework for deploying a client/server distributed database system for a human resource information management system wi...

  7. How to Use a Desktop Version of a DBMS for Client-Server Applications

    OpenAIRE

    Julian VASILEV

    2008-01-01

Database management systems (DBMSs) still have a very high price for small and medium-sized enterprises in Bulgaria. Desktop versions are free, but they cannot function in a multi-user environment. We will try to build an application server which opens a desktop version of a DBMS to many users. Thus, this approach will be appropriate for client-server applications. The author of the article gives a concise overview of the problem and a possible solution.

  8. Application of database systems in diabetes care.

    Science.gov (United States)

    Kopelman, P G; Sanderson, A J

    1996-01-01

The St Vincent Declaration includes a commitment to continuous quality improvement in diabetes care. This necessitates the collection of appropriate information to ensure that diabetes services are efficient, effective and equitable. The quantity of information, and the need for rapid access, mean that this must be computer-based. The choice of architecture and the design of a database for diabetes care must take into account available equipment and operational requirements. Hardware topology may be determined by the operating system and/or netware software. An effective database system will include: user-friendliness, rapid but secure access to data, a facility for multiple selections for analysis and audit, the ability to be used as part of the patient consultation process, the ability to interface or integrate with other applications, and cost efficiency. An example of a clinical information database for diabetes care, Diamond, is described. PMID:9244825

  9. LISA, the next generation: from a web-based application to a fat client.

    Science.gov (United States)

    Pierlet, Noëlla; Aerts, Werner; Vanautgaerden, Mark; Van den Bosch, Bart; De Deurwaerder, André; Schils, Erik; Noppe, Thomas

    2008-01-01

    The LISA application, developed by the University Hospitals Leuven, permits referring physicians to consult the electronic medical records of their patients over the internet in a highly secure way. We decided to completely change the way we secured the application, discard the existing web application and build a completely new application, based on the in-house developed hospital information system, used in the University Hospitals Leuven. The result is a fat Java client, running on a Windows Terminal Server, secured by a commercial SSL-VPN solution. PMID:18953122

  10. A Proposal of Client Application Architecture using Loosely Coupled Component Connection Method in Banking Branch System

    Science.gov (United States)

    Someya, Harushi; Mori, Yuichi; Abe, Masahiro; Machida, Isamu; Hasegawa, Atsushi; Yoshie, Osamu

Due to the deregulation of the financial industry, branches in the banking industry need to shift from operation-oriented bases to sales-oriented bases. Corresponding to this movement, new banking branch systems are being developed. The main characteristic of the new systems is that form operations that have traditionally been performed at each branch are brought into a centralized operation center, for the purpose of rationalizing the form operations and making them more efficient. The branches handle a wide variety of forms. The forms can often be described by common items, but the items embody different business logic, and each form has different relations among its items. There is also a need for users to develop the client applications themselves. Consequently, the challenge is to arrange a development environment that is highly reusable, easily customizable and user-developable. We propose a client application architecture that has a loosely coupled component connection method and allows applications to be developed by merely describing the screen configurations and their transitions in XML documents. By adopting our architecture, we developed the client applications of the centralized operation center for the latest banking branch system. Our experiments demonstrate good performance.
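The XML-driven approach above can be sketched with Python's standard `xml.etree` parser. The element names, attributes and screen ids below are guessed for illustration; the paper does not publish its document schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical screen-transition document of the kind the architecture
# uses: screens and their transitions are declared, not hard-coded.
SCREENS_XML = """
<screens initial="login">
  <screen id="login">
    <transition on="ok" to="menu"/>
  </screen>
  <screen id="menu">
    <transition on="deposit" to="deposit-form"/>
  </screen>
  <screen id="deposit-form"/>
</screens>
"""

root = ET.fromstring(SCREENS_XML)
transitions = {
    (s.get("id"), t.get("on")): t.get("to")
    for s in root.findall("screen")
    for t in s.findall("transition")
}

def next_screen(current: str, event: str) -> str:
    # Loosely coupled: components consult the transition table rather
    # than referencing each other directly.
    return transitions[(current, event)]
```

Adding a new form then means editing the XML document, not recompiling the client, which is the reusability property the paper is after.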

  11. An Evaluation of the Eclipse Rich Client Platform for a telecom management application

    OpenAIRE

    Frising, Philip

    2008-01-01

The Software Management Organizer (SMO) application is used by telecom operators for remote software and hardware handling of telecommunication equipment. The graphical user interface (GUI) provided by SMO is called SMO GUI and is costly to maintain, extend and test. The Eclipse Rich Client Platform (RCP) provides a platform for building component-based GUIs with rich functionality. This thesis evaluates how the Eclipse RCP can be used for building a new SMO GUI. The evaluation will be pe...

  12. Exemplary applications of the OECD fire database

    International Nuclear Information System (INIS)

In general, the data from NPP experience with fire events stored in the OECD FIRE Database can provide answers to several interesting questions and insights into phenomena, such as examples of frequent fire initiators and their root causes, of electrical equipment failure modes, of fire protection equipment malfunctions, and of impaired fire barriers. Exemplary applications of the OECD FIRE Database show that it is already possible to retrieve reasonable qualitative information and, to some extent, to obtain quantitative estimates, which can support the interpretation of the operating experience for specific events in the member countries participating in the OECD FIRE Project. The quantitative information will, of course, improve with the increasing number of reported events and with careful descriptions of the respective events that provide as much information as is available. In the third phase of the project, starting in 2010, the OECD FIRE Database will be further analyzed with respect to applications for first probabilistic safety assessment considerations, e.g. the positive and negative roles of human factors in fire ignition on the one hand, and in fire detection and extinguishing on the other. This has to be investigated in more detail to generate Fire PSA results with higher confidence. Positive effects of human behavior on fire extinguishing have already been identified in the existing database. One of the main questions which could be answered by the OECD FIRE Database is how fires can propagate from the initial fire compartment to other compartments, even if protective means for preventing fire spreading are available. For generating meaningful event and fault trees for various safety-significant fire scenarios, a clear and, as far as possible, detailed (with respect to time dependencies and safety significance) description of the initial fire event sequence and its consequences is essential. The coding of events has to reflect as far as feasible

  13. NSLS-II High Level Application Infrastructure And Client API Design

    International Nuclear Information System (INIS)

The beam commissioning software framework of the NSLS-II project adopts a client/server-based architecture to replace the more traditional monolithic high-level application approach. It is an open-structure platform, and we try to provide a narrow API set for client applications. With this narrow API, existing applications developed in different languages under different architectures can be ported to our platform with small modifications. This paper describes the system infrastructure design, the client API, system integration, and the latest progress. As a new third-generation synchrotron light source with ultra-low emittance, NSLS-II poses new requirements and challenges for controlling and manipulating the beam. A use-case study and a theoretical analysis have been performed to clarify the requirements and challenges for the high-level application (HLA) software environment. To satisfy those requirements and challenges, an adequate system architecture for the software framework is critical for beam commissioning, study and operation. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low-level hardware processing from numerical algorithm computing, physics modelling, data manipulation, plotting, and error handling. However, none of the existing approaches can satisfy the requirements. A new design has been proposed by introducing service-oriented architecture technology. The HLA is a combination of tools for accelerator physicists and operators, as in the traditional approach. In NSLS-II, these include monitoring applications and control routines. A scripting environment is very important for the latter part of the HLA, and both parts are designed on a common set of APIs. Physicists and operators are the users of these APIs, while control system engineers and a few accelerator physicists are their developers. With our client/server-based approach, we leave how to retrieve information to the

  14. Structuring modern web applications : A study of how to structure web clients to achieve modular, maintainable and longlived applications

    OpenAIRE

    MALMSTRÖM, TIM JOHAN

    2014-01-01

This degree project, conducted at Decerno AB, investigates what can be done to create client-side web applications that remain maintainable for a long time. The focus is on basing the application on an existing framework, which both simplifies the development process and helps keep the application well structured. Which framework is currently best is evaluated through a comparison of the currently most popular frameworks. The comparison is done using a set of categories that is defined ...

  15. Application of graph database for analytical tasks

    OpenAIRE

    Günzl, Richard

    2014-01-01

This diploma thesis is about graph databases, which belong to the category of database systems known as NoSQL databases, although graph databases go beyond NoSQL. Graph databases are useful in many cases thanks to their native storage of the interconnections between data, which brings advantageous properties in comparison with traditional relational database systems, especially in querying. The main goals of the thesis are: to describe the principles, properties and advantages of graph databases; to desi...

  16. 78 FR 68432 - Applicability Determination Index (ADI) Database System Recent Posting: Applicability...

    Science.gov (United States)

    2013-11-14

    ... AGENCY Applicability Determination Index (ADI) Database System Recent Posting: Applicability... the Applicability Determination Index (ADI) database system is available on the Internet through the... for each document posted on the ADI database system on October 30, 2013; the applicable category;...

  17. AIDA Asia. Artificial Insemination Database Application

    International Nuclear Information System (INIS)

The objectives of AIDA (Artificial Insemination Database Application) and its companion GAIDA (Guide to AI Data Analysis) are to address two major problems in on-farm research on livestock production. The first is the quality of the data collected, and the second is the intellectual rigor of the analyses and their associated results when statistically testing causal hypotheses. The solution is to develop a data management system such as AIDA and an analysis system such as GAIDA to estimate parameters that explain biological mechanisms for on-farm application. The system uses epidemiological study designs in the uncontrolled research environment of the farm, uses a database manager (Microsoft Access) to handle the data management issues encountered in preparing data for analysis, and then uses a statistical program (SYSTAT) to do preliminary analyses. These analyses enable the researcher to have a better understanding of the biological mechanisms involved in the data contained within the AIDA database. Using GAIDA as a guide, this preliminary analysis helps to determine the strategy for further in-depth analyses

  18. Software Application for Supporting the Education of Database Systems

    Science.gov (United States)

    Vágner, Anikó

    2015-01-01

The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in an Oracle Database Management System environment. The application has two parts: one is the database schema and its content, and the other is a C# application. The schema administrates and stores the tasks and the…

  19. Software Applications to Access Earth Science Data: Building an ECHO Client

    Science.gov (United States)

    Cohen, A.; Cechini, M.; Pilone, D.

    2010-12-01

Historically, developing an ECHO (NASA's Earth Observing System (EOS) ClearingHOuse) client required interaction with its SOAP API. SOAP, as a framework for web service communication, has numerous advantages for enterprise applications and Java/C#-type programming languages. However, as interest has grown in quick development cycles and more intriguing "mashups," ECHO has seen the SOAP API lose its appeal. In order to address these changing needs, ECHO has introduced two new interfaces that facilitate simple access to its metadata holdings. The first interface is built upon the OpenSearch format and the ESIP Federated Search framework. The second interface is built upon the Representational State Transfer (REST) architecture. Using the REST and OpenSearch APIs to access ECHO makes development with modern languages much simpler and more feasible. Client developers can leverage the simple interaction with ECHO to focus more of their time on the advanced functionality they present to users. To demonstrate the simplicity of developing with the REST API, participants will be led through a hands-on experience in which they develop an ECHO client that performs the following actions: login; provider discovery; provider-based dataset discovery; dataset, temporal, and spatial constraint-based granule discovery; and online data access.
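The constraint-based granule discovery step reduces to building a GET request, which is why REST/OpenSearch clients are so much lighter than SOAP ones. The base URL and parameter names in this sketch are illustrative placeholders, not the documented ECHO endpoint.

```python
from urllib.parse import urlencode

# Hypothetical OpenSearch-style endpoint; the real service URL and
# parameter vocabulary would come from the provider's OpenSearch
# description document.
BASE = "https://echo.example.nasa.gov/opensearch/granules"

def granule_query(dataset: str, start: str, end: str, bbox: str) -> str:
    params = {
        "shortName": dataset,      # dataset constraint
        "startTime": start,        # temporal constraints
        "endTime": end,
        "boundingBox": bbox,       # spatial constraint (W,S,E,N)
    }
    return BASE + "?" + urlencode(params)

url = granule_query("MOD09GA", "2010-01-01", "2010-01-31", "-10,40,0,50")
```

The resulting URL can be fetched with any HTTP library or pasted into a browser, which is exactly the low barrier to entry the abstract highlights over the SOAP API.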

  20. Handbook of video databases design and applications

    CERN Document Server

    Furht, Borko

    2003-01-01

INTRODUCTION: Introduction to Video Databases (Oge Marques and Borko Furht). VIDEO MODELING AND REPRESENTATION: Modeling Video Using Input/Output Markov Models with Application to Multi-Modal Event Detection (Ashutosh Garg, Milind R. Naphade, and Thomas S. Huang); Statistical Models of Video Structure and Semantics (Nuno Vasconcelos); Flavor: A Language for Media Representation (Alexandros Eleftheriadis and Danny Hong); Integrating Domain Knowledge and Visual Evidence to Support Highlight Detection in Sports Videos (Juergen Assfalg, Marco Bertini, Carlo Colombo, and Alberto Del Bimbo); A Generic Event Model and Sports Vid

  1. Application of OCR in Building Bibliographic Databases

    Directory of Open Access Journals (Sweden)

    A.R.D. Prasad

    1997-07-01

Bibliographic databases tend to be very verbose and pose a problem to libraries due to the huge amount of data entry involved. In this situation, the two technologies that offer solutions are retro-conversion and optical character recognition (OCR). The building of an intelligent system for automatic identification of bibliographic elements like title, author, publisher, etc. is discussed here. This paper also discusses the heuristics for identifying the elements and for resolving conflicts that arise in situations where more than one bibliographic element satisfies the criteria specified for identifying the various elements. This work is being carried out at the DRTC with the financial assistance of NISSAT.
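A toy version of such identification heuristics can be written in a few lines of Python. The rules below are invented for this sketch and are far simpler than the DRTC system's; they only illustrate the idea of cue-based classification with a default when no stronger cue fires.

```python
import re

def classify(line: str) -> str:
    """Guess which bibliographic element an OCR'd catalogue line is."""
    # Cue 1: publisher vocabulary anywhere in the line.
    if re.search(r"\b(Press|Publishers?|Verlag)\b", line):
        return "publisher"
    # Cue 2: surname-plus-initials shape, e.g. "Prasad, A.R.D."
    if re.fullmatch(r"(?:[A-Z][a-z]+,\s*)?[A-Z]\.(\s*[A-Z]\.)*", line.strip()):
        return "author"
    # Default: treat anything without a stronger cue as the title.
    return "title"
```

Conflict resolution in a real system would rank competing cues (a line matching both publisher and author patterns must be assigned to exactly one element), which is the heuristic problem the paper addresses.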

  2. Framework for Deploying Client/Server Distributed Database System for effective Human Resource Information Management Systems in Imo State Civil Service of Nigeria

    Directory of Open Access Journals (Sweden)

    Josiah Ahaiwe

    2012-08-01

The information system is an integrated system that holds the financial and personnel records of persons working in various branches of the Imo state civil service. The purpose is to harmonize operations, reduce or, if possible, eliminate redundancy, and control the introduction of "ghost workers" and fraud in pension management. In this research work, an attempt is made to design a framework for deploying a client/server distributed database system for a human resource information management system, with a scope on the Imo state civil service in Nigeria. The system consists of a relational database of personnel variables which can be shared by various levels of management in all the ministries and their branches located all over the state. The server is expected to be hosted in the accountant general's office. The system is capable of handling recruitment and promotion issues, training, monthly remuneration, pension and gratuity issues, employment history, etc.

  3. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    OpenAIRE

    Raied Salman

    2015-01-01

In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain databases for the organizations. The organization's departments can be located at different sites and connected by an intranet environment. In such an environment, the maintenance of database records becomes a complex assignment which needs to be resolved. In this paper an intranet application is designed an...

  4. NoSQL and SQL Databases for Mobile Applications. Case Study: MongoDB versus PostgreSQL

    Directory of Open Access Journals (Sweden)

    Marin FOTACHE

    2013-01-01

Compared with "classical" multi-tier web applications, mobile applications have common and specific requirements concerning data persistence and processing. In mobile apps, database features can be analyzed separately for the client layer (minimalistic, isolated, memory-only) and the server layer (data-rich, centralized, distributed, synchronized and disk-based). Currently, a few lite relational database products reign over persistence on the client platforms of mobile applications. There are two main objectives of this paper. The first is to investigate storage options for the major mobile platforms. The second is to point out some major differences between SQL and NoSQL datastores in terms of deployment, data model, schema design, and data definition and manipulation. As the NoSQL movement lacks standardization, from the NoSQL product family MongoDB was chosen as the reference, due to its strengths and popularity among developers. PostgreSQL serves as the representative of SQL DBMSs due to its popularity and conformity with SQL standards.
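The data-model difference the paper examines can be shown concretely: the same record as one nested document (MongoDB style) versus normalized tables (PostgreSQL style, with Python's `sqlite3` standing in for a relational server). The schema and field names are invented for illustration.

```python
import json
import sqlite3

# Document style: the whole record, including the nested list of
# readings, travels and is stored as one JSON document.
doc = {"user": "ana", "device": "phone",
       "readings": [{"t": 1, "v": 20.5}, {"t": 2, "v": 21.0}]}
serialized = json.dumps(doc)

# Relational style: the nested list is normalized into a child table
# joined back to its parent by a foreign key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, device TEXT);
CREATE TABLE readings (user_id INTEGER REFERENCES users(id), t INTEGER, v REAL);
""")
conn.execute("INSERT INTO users (id, name, device) VALUES (1, 'ana', 'phone')")
conn.executemany("INSERT INTO readings VALUES (1, ?, ?)",
                 [(r["t"], r["v"]) for r in doc["readings"]])
count = conn.execute("SELECT COUNT(*) FROM readings WHERE user_id = 1").fetchone()[0]
```

Schema design, joins and data manipulation differ along exactly this line, which is what the MongoDB-versus-PostgreSQL comparison in the paper works through.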

  5. Database and applications security integrating information security and data management

    CERN Document Server

    Thuraisingham, Bhavani

    2005-01-01

    This is the first book to provide an in-depth coverage of all the developments, issues and challenges in secure databases and applications. It provides directions for data and application security, including securing emerging applications such as bioinformatics, stream information processing and peer-to-peer computing. Divided into eight sections, each of which focuses on a key concept of secure databases and applications, this book deals with all aspects of technology, including secure relational databases, inference problems, secure object databases, secure distributed databases and emerging

  6. Web application for detailed real-time database transaction monitoring for CMS condition data

    Science.gov (United States)

    de Gruttola, Michele; Di Guida, Salvatore; Innocente, Vincenzo; Pierro, Antonio

    2012-12-01

In the upcoming LHC era, databases have become an essential part of the experiments collecting data from the LHC, in order to safely store, and consistently retrieve, the large amount of data produced by different sources. In the CMS experiment at CERN, all this information is stored in ORACLE databases, allocated on several servers both inside and outside the CERN network. In this scenario, the task of monitoring the different databases is a crucial database administration issue, since different information may be required depending on different users' tasks, such as data transfer, inspection, planning and security issues. We present here a web application based on a Python web framework and Python modules for data-mining purposes. To customize the GUI we record traces of user interactions that are used to build use-case models. In addition, the application detects errors in database transactions (for example, it identifies any mistake made by a user, an application failure, an unexpected network shutdown or a Structured Query Language (SQL) statement error) and provides warning messages from the different users' perspectives. Finally, in order to fulfill the requirements of the CMS experiment community, and to keep up with new developments in web client tools, our application was further developed and new features were deployed.

  7. Memory Storage Issues of Temporal Database Applications on Relational Database Management Systems

    Directory of Open Access Journals (Sweden)

    Sami M. Halawani

    2010-01-01

    Full Text Available Problem statement: Many existing database applications manage time-varying data. These applications are referred to as temporal databases, or time-oriented database applications, and are considered repositories of time-dependent data. Many proposals have been introduced for developing time-oriented database applications; some suggest building support for Temporal Database Management Systems (TDBMSs) on top of existing non-temporal DBMSs, while others suggest modifying the models of existing DBMSs or building a TDBMS from scratch. Approach: This study addressed several issues concerning the development of a technique that enables database designers to understand how time-varying behavior can be modeled and mapped into tabular form. Results: Conventional DBMSs do not have the capability to record and process the time-varying aspects of the real world. With the growing sophistication of DBMS applications, the lack of temporal support in conventional DBMSs raises serious problems when they are used to develop temporal databases. How to think about time and represent it in formal systems is the topic of this study. We examined how to implement time-varying applications in the SQL structured query language by introducing temporal data concepts that need to be simulated in DBMSs which lack temporal support. We proposed a temporal data model that combines the features of previous temporal models and reduces the cost of memory storage. Conclusion: We proposed a technique for implementing temporal databases on top of existing non-temporal DBMSs. This technique covers five main areas: temporal database conceptual design, temporal database logical design, integrity-constraint prevention in temporal databases, and modifying and querying temporal databases. We proposed a data model for the temporal database based on the data models discussed in the literature.
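The general idea of simulating temporal concepts on a non-temporal DBMS can be illustrated with explicit valid-time columns. The table, column and function names below are hypothetical, and SQLite stands in for the conventional DBMS:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Valid time is modeled as a half-open interval [valid_from, valid_to);
# the sentinel '9999-12-31' marks the current version of a row.
conn.execute("""CREATE TABLE salary (
    emp TEXT, amount INTEGER, valid_from TEXT, valid_to TEXT)""")

def update_salary(emp, amount, as_of):
    # A common temporal-update technique: close the current version,
    # then insert the new one starting at the change date.
    conn.execute("UPDATE salary SET valid_to=? "
                 "WHERE emp=? AND valid_to='9999-12-31'", (as_of, emp))
    conn.execute("INSERT INTO salary VALUES (?,?,?,'9999-12-31')",
                 (emp, amount, as_of))

def salary_at(emp, date):
    # Timeslice query: which version was valid on the given date?
    row = conn.execute("""SELECT amount FROM salary
        WHERE emp=? AND valid_from<=? AND ?<valid_to""",
        (emp, date, date)).fetchone()
    return row[0] if row else None

update_salary("lee", 3000, "2009-01-01")
update_salary("lee", 3500, "2010-06-01")
print(salary_at("lee", "2009-12-31"))  # 3000
print(salary_at("lee", "2011-01-01"))  # 3500
```

This keeps the full history in one table, at the storage cost of the two extra interval columns per row.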

  8. Databases and information systems: Applications in biogeography

    International Nuclear Information System (INIS)

    Some aspects of the new instrumentation and methodological elements that make up information systems in biodiversity (ISB) are described. The use of accurately georeferenced data opens up a broad range of available sources: natural history collections and the scientific literature require the use of databases and geographic information systems (GIS). The conceptualization of ISB and GIS, based on the use of extensive databases, has implied detailed modeling and the construction of authoritative archives: exhaustive catalogues of nomenclature and synonymies, complete bibliographic lists, lists of proposed names, and historical-geographic gazetteers with localities and their synonyms united under a global positioning system which produces a geospheric conception of the earth and its biota. Certain difficulties in the development of the system and the construction of the biological databases, such as quality control of the data, are explained. The use of such systems is basic in order to respond to many questions at the frontier of current studies of biodiversity and conservation. In particular, some applications in biogeography and their importance for modeling distributions, for identifying and contrasting areas of endemism and biological richness for conservation, and their use as tools in what we identify as predictive and experimental faunistics are detailed. Lastly, the process as well as its relevance is emphasized at national and regional levels.

  9. Just-in-time Database-Driven Web Applications

    OpenAIRE

    Ong, Kenneth R

    2003-01-01

    "Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that compri...

  10. Modular Workflow Engine for Distributed Services using Lightweight Java Clients

    CERN Document Server

    Vetter, R -M; Peetz, J -V

    2009-01-01

    In this article we introduce the concept and the first implementation of a lightweight client-server-framework as middleware for distributed computing. On the client side an installation without administrative rights or privileged ports can turn any computer into a worker node. Only a Java runtime environment and the JAR files comprising the workflow client are needed. To connect all clients to the engine one open server port is sufficient. The engine submits data to the clients and orchestrates their work by workflow descriptions from a central database. Clients request new task descriptions periodically, thus the system is robust against network failures. In the basic set-up, data up- and downloads are handled via HTTP communication with the server. The performance of the modular system could additionally be improved using dedicated file servers or distributed network file systems. We demonstrate the design features of the proposed engine in real-world applications from mechanical engineering. We have used ...
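The polling pattern described above can be reduced to a single-process sketch. The real engine distributes task descriptions over HTTP from a central database; here an in-memory queue and threads stand in, and all names are illustrative:

```python
import queue
import threading

# The "engine": a queue of task descriptions (here, numbers to square).
tasks = queue.Queue()
for n in (1, 2, 3, 4):
    tasks.put(n)

results = []
lock = threading.Lock()

def worker():
    # Each client repeatedly requests a new task; an empty queue plays
    # the role of "no work available", so clients simply stop polling.
    while True:
        try:
            n = tasks.get_nowait()
        except queue.Empty:
            return
        with lock:
            results.append(n * n)

clients = [threading.Thread(target=worker) for _ in range(2)]
for t in clients:
    t.start()
for t in clients:
    t.join()
print(sorted(results))  # [1, 4, 9, 16]
```

Because clients pull work rather than being pushed to, a client that disappears mid-run costs only its unfinished task, which mirrors the robustness-to-network-failure property claimed above.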

  11. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Raied Salman

    2015-11-01

    Full Text Available In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain the organizations' databases. An organization's departments can be located at different sites connected by an intranet environment. In such an environment the maintenance of database records becomes a complex task. In this paper an intranet application is designed and implemented using the object-oriented programming language Java and the object-relational database management system Oracle in a multithreaded operating-system environment.
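A minimal sketch of the multithreaded record-maintenance idea, assuming one shared connection guarded by a lock (the paper itself uses Java and Oracle; SQLite and the table names below stand in):

```python
import sqlite3
import threading

# One shared connection; check_same_thread=False lets several
# department threads use it, serialized by an explicit lock.
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE records (dept TEXT, item TEXT)")
db_lock = threading.Lock()

def maintain(dept, items):
    # Simulates a department workstation posting record updates.
    for item in items:
        with db_lock:
            conn.execute("INSERT INTO records VALUES (?, ?)", (dept, item))

threads = [threading.Thread(target=maintain, args=(d, ["a", "b"]))
           for d in ("sales", "hr", "it")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(conn.execute("SELECT COUNT(*) FROM records").fetchone()[0])  # 6
```

A production design would more likely give each thread its own connection and let the DBMS handle concurrency; the lock here just keeps the sketch self-contained.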

  12. Database application platform for earthquake numerical simulation

    Institute of Scientific and Technical Information of China (English)

    LUO Yan; ZHENG Yue-jun; CHEN Lian-wang; LU Yuan-zhong; HUANG Zhong-xian

    2006-01-01

    Introduction: In recent years, many kinds of seismological observation networks have been established, which have been continuously producing a large amount of digital information. In addition, there are many study results on 3D velocity-structure models and tectonic models of the crust (Huang and Zhao, 2006; Huang et al, 2003; Li and Mooney, 1998), which are valuable for studying the inner structure of the earth and the earthquake preparation process. It is badly needed to combine the observed data, experimental studies and theoretical analysis results by way of numerical simulation, and to develop a database and a corresponding application platform for numerical simulation; this is also a significant way to promote earthquake prediction.

  13. Techniques for multiple database integration

    OpenAIRE

    Whitaker, Barron D

    1997-01-01

    Approved for public release; distribution is unlimited There are several graphic client/server application development tools which can be used to easily develop powerful relational database applications. However, they do not provide a direct means of performing queries which require relational joins across multiple database boundaries. This thesis studies ways to access multiple databases. Specifically, it examines how a 'cross-database join' can be performed. A case study of techniques us...
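One technique for the "cross-database join" the thesis examines can be sketched with SQLite's ATTACH DATABASE command, which brings a second database file into the same query scope. This is only one of several possible approaches, and the table names are illustrative:

```python
import os
import sqlite3
import tempfile

d = tempfile.mkdtemp()
a_path, b_path = os.path.join(d, "a.db"), os.path.join(d, "b.db")

# Two physically separate databases, as in the multi-database setting.
a = sqlite3.connect(a_path)
a.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
a.execute("INSERT INTO customers VALUES (1, 'Ada'), (2, 'Bob')")
a.commit(); a.close()

b = sqlite3.connect(b_path)
b.execute("CREATE TABLE orders (customer_id INTEGER, total INTEGER)")
b.execute("INSERT INTO orders VALUES (1, 30), (1, 12), (2, 5)")
b.commit(); b.close()

# Attach the second database and join across the boundary.
conn = sqlite3.connect(a_path)
conn.execute("ATTACH DATABASE ? AS other", (b_path,))
rows = conn.execute("""SELECT c.name, SUM(o.total)
    FROM main.customers c JOIN other.orders o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name""").fetchall()
print(rows)  # [('Ada', 42), ('Bob', 5)]
```

For heterogeneous DBMSs, where no ATTACH equivalent exists, the join must instead be assembled client-side from two separate result sets, which is the harder case the thesis studies.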

  14. Heterogeneous Database integration for Web Applications

    OpenAIRE

    V. Rajeswari; Dr. Dharmishtan K. Varughese

    2009-01-01

    In the contemporary business and industrial environment, the variety of data used by organizations is increasing rapidly. Also, there is an increasing demand for access to this data. The size, complexity and variety of the databases used for data handling cause serious problems in manipulating this distributed information. Integrating all the information from different databases into one database is a challenging problem. XML has been in use in recent times to handle data in web applications...

  15. Application of a Database in the Monitoring of Workstations in a Local Area Network

    Directory of Open Access Journals (Sweden)

    Eyo O. Ukem

    2009-01-01

    Full Text Available Problem statement: Computer hardware fault management and repair can be a big challenge, especially if the number of staff available for the job is small. The task becomes more complicated if remote sites are managed and an engineer or technician has to be dispatched. Approach: The availability of relevant information when needed could ease the burden of maintenance by removing uncertainties. Such information could be accumulated in a database and accessed as needed. Results: This study considered such a database, to assist a third-party hardware maintenance firm in keeping track of its operations, including the machines that it services, together with their owners. A software application was developed in the Java programming language, in the form of a database, using Microsoft Access as the database management system. It was designed to run on a local area network and to allow remote workstations to log on to a central computer in a client/server configuration. With this application it was possible to enter fault reports into the database residing on the central computer from any workstation on the network. Conclusion/Recommendations: The information generated from this data can be used by the third-party hardware maintenance firm to speed up its service delivery, thus putting the firm in a position to render more responsive and efficient service to its customers.

  16. SECURE REMOTE CLIENT AUTHENTICATION

    Directory of Open Access Journals (Sweden)

    K.Pradeep,

    2010-10-01

    Full Text Available This paper discusses an application of secure remote client authentication. It presents smart cards and digital certification from third-party vendors; the smart-card schemes are based on algorithms that provide secure remote client authentication. These schemes vary significantly. In relation to today's security challenges, which include phishing, man-in-the-middle attacks and malicious software, secure remote client authentication plays a key role.

  17. SECURE REMOTE CLIENT AUTHENTICATION

    OpenAIRE

    K.Pradeep,; R.Usha Rani; E.Ravi Kumar; K.Nikhila,; Vijay Sankar

    2010-01-01

    This paper discusses an application of secure remote client authentication. It presents smart cards and digital certification from third-party vendors; the smart-card schemes are based on algorithms that provide secure remote client authentication. These schemes vary significantly. In relation to today's security challenges, which include phishing, man-in-the-middle attacks and malicious software, secure remote client authentication plays a key role.

  18. Exploring client logs towards characterizing the user behavior on web applications

    Science.gov (United States)

    Guarino de Vasconcelos, Leandro; Coelho dos Santos, Rafael D.; Baldochi, Laercio A.

    2013-05-01

    Analysis of user interaction with computer systems can be used for several purposes, the most common being analysis of the effectiveness of the interfaces used for interaction (in order to adapt or enhance its usefulness) and analysis of intention and behavior of the users when interacting with these systems. For web applications, often the analysis of user interaction is done using the web server logs collected for every document sent to the user in response to his/her request. In order to capture more detailed data on the users' interaction with sites, one could collect actions the user performs in the client side. An effective approach to this is the USABILICS system, which also allows the definition and analysis of tasks in web applications. The fine granularity of logs collected by USABILICS allows a much more detailed log of users' interaction with a web application. These logs can be converted into graphs where vertices are users' actions and edges are paths made by the user to accomplish a task. Graph analysis and visualization tools and techniques allow the analysis of actions taken in relation to an expected action path, or characterization of common (and uncommon) paths on the interaction with the application. This paper describes how to estimate users' behavior and characterize their intentions during interaction with a web application, presents the analysis and visualization tools on those graphs and shows some practical results with an educational site, commenting on the results and implications of the possibilities of using these techniques.
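The conversion of client-side logs into a graph, with users' actions as vertices and transitions as weighted edges, can be sketched as follows. The USABILICS internals are not reproduced here; the sessions and action names are invented:

```python
from collections import Counter

# Invented client-side action logs, one list per user session.
sessions = [
    ["home", "search", "results", "checkout"],
    ["home", "search", "results", "search"],
    ["home", "help"],
]

# Edge weights count how often one action directly follows another;
# the keys (src, dst) are the directed edges of the interaction graph.
edges = Counter()
for actions in sessions:
    for src, dst in zip(actions, actions[1:]):
        edges[(src, dst)] += 1

# Heavily weighted edges reveal common paths; rare edges reveal
# deviations from the expected task path.
for (src, dst), weight in sorted(edges.items(), key=lambda e: -e[1]):
    print(f"{src} -> {dst}: {weight}")
```

On such a graph, comparing the observed heavy paths against the path an expert defined for a task gives the behavior characterization the paper describes.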

  19. The EREC-STRESA database. Internet application

    International Nuclear Information System (INIS)

    A considerable amount of experimental data in the field of NPP safety and reliability has been produced and gathered at the Electrogorsk Research and Engineering Centre on NPPs Safety. In order to ensure proper preservation of, and easy access to, the data, the EREC Database was created. This paper gives a description of the EREC Database and the supporting web-based informatics platform STRESA. (author)

  20. Oracle database design for e-commerce application

    OpenAIRE

    Lihvoinen, Anna

    2009-01-01

    Published in September 2009 at Haaga-Helia University of Applied Sciences. The purpose of this thesis is to design a database for an e-commerce application which will later be implemented in Oracle Application Express (Apex) by Database Software Horizons. The design document includes ER diagrams, table descriptions, table source code, and testing results. Logical and physical database designs for relational modeling methods are applied in this work. The result of the work is a document...

  1. Bitcoin clients

    OpenAIRE

    Skudnov, Rostislav

    2012-01-01

    Bitcoin is a new decentralized electronic currency which gained popularity in the last two years. The usage of Bitcoin is facilitated by software commonly called Bitcoin clients. This thesis provides an overview of Bitcoin and cryptography behind it, discusses different types of Bitcoin clients and researches additional features implemented by them. It also analyzes further enhancements that can be made to clients and the Bitcoin protocol. Bitcoin clients are grouped into types and analyz...

  2. The Application of the Thin-Client/Server Computing Model in Community Libraries

    Institute of Scientific and Technical Information of China (English)

    赵秀丽; 杨静; 马爱华; 秦梅素

    2003-01-01

    This paper mainly describes the application of the Thin-Client/Server computing model in building electronic reading rooms in community libraries, as well as the concept, working mode and technical characteristics of the Thin-Client/Server computing model, and looks ahead to its application prospects in the future development of community libraries.

  3. Application of Windows Socket Technique to Communication Process of the Train Diagram Network System Based on Client/Server Structure

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper focuses on techniques for the design and realization of process communications in the computer-aided train diagram network system. The Windows Socket technique is adopted to program the client and the server, in order to create the system applications and to solve the problems of data transfer and data sharing in the system.
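The client/server data-transfer pattern described above can be sketched with the BSD-style socket API, which Windows Sockets closely mirrors. The message contents below are invented:

```python
import socket
import threading

# Server socket on the loopback interface; port 0 lets the OS pick
# a free port, which we then read back for the client.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def serve_once():
    conn, _ = server.accept()
    data = conn.recv(1024)         # receive a (fictional) diagram update
    conn.sendall(b"ACK:" + data)   # share the data back to the client
    conn.close()

t = threading.Thread(target=serve_once)
t.start()

# Client side: connect, send, and wait for the acknowledgement.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"diagram-update")
reply = client.recv(1024)
print(reply)  # b'ACK:diagram-update'
client.close()
t.join()
server.close()
```

In the real system each train-diagram workstation would hold such a connection to the server process, with an application-level protocol layered on top of the raw byte stream.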

  4. The Application of Genetic Algorithms and Multidimensional Distinguishing Model in Forecasting and Evaluating Credits for Mobile Clients

    Institute of Scientific and Technical Information of China (English)

    Li Zhan; Xu Ji-sheng

    2003-01-01

    To solve the arrearage problem that puzzles most mobile corporations, we propose an approach to forecast and evaluate the credit of mobile clients, devising a method that is a coalescence of a genetic algorithm and a multidimensional distinguishing model. At the end of this paper, the result of a test application in the Zhuhai Branch, GMCC, is provided. The precision of the forecasting and evaluation of the clients' credit is near 90%. This study is very significant to mobile communication corporations at all levels. The popularization of the techniques and the results would produce great benefits for both society and the economy.

  5. The Application of Genetic Algorithms and Multidimensional Distinguishing Model in Forecasting and Evaluating Credits for Mobile Clients

    Institute of Scientific and Technical Information of China (English)

    LiZhan; XuJi-sheng

    2003-01-01

    To solve the arrearage problem that puzzles most mobile corporations, we propose an approach to forecast and evaluate the credit of mobile clients, devising a method that is a coalescence of a genetic algorithm and a multidimensional distinguishing model. At the end of this paper, the result of a test application in the Zhuhai Branch, GMCC, is provided. The precision of the forecasting and evaluation of the clients' credit is near 90%. This study is very significant to mobile communication corporations at all levels. The popularization of the techniques and the results would produce great benefits for both society and the economy.

  6. Heterogeneous Database integration for Web Applications

    Directory of Open Access Journals (Sweden)

    V. Rajeswari

    2009-11-01

    Full Text Available In the contemporary business and industrial environment, the variety of data used by organizations is increasing rapidly. Also, there is an increasing demand for access to this data. The size, complexity and variety of the databases used for data handling cause serious problems in manipulating this distributed information. Integrating all the information from different databases into one database is a challenging problem. XML has been in use in recent times to handle data in web applications. XML (eXtensible Markup Language is a very open way of data communication. XML has become the undisputed standard both for data exchange and content management, and is supported by the giants of the software industry such as IBM, Oracle and Microsoft. The XML markup language should be the lingua franca of data interchange, but its rate of acceptance has been limited by a mismatch between XML and legacy databases. This, in turn, has created a need for a mapping tool to integrate XML and databases. This paper highlights the merging of heterogeneous database resources. This can be achieved by means of conversion of the relational model to an XML schema and vice versa, and by adding semantic constraints to the XML Schema. Recent industry developments in this field are used as the basis.
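The relational-to-XML conversion direction mentioned above can be sketched in a few lines. The element and attribute names below are illustrative, not a standard schema, and SQLite stands in for the legacy database:

```python
import sqlite3
import xml.etree.ElementTree as ET

# A small relational table standing in for a legacy database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO parts VALUES (?, ?)",
                 [(1, "bolt"), (2, "nut")])

# Map each row to an element: the key becomes an attribute,
# the remaining column becomes element content.
root = ET.Element("parts")
for pid, name in conn.execute("SELECT id, name FROM parts ORDER BY id"):
    part = ET.SubElement(root, "part", id=str(pid))
    part.text = name

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
# <parts><part id="1">bolt</part><part id="2">nut</part></parts>
```

The reverse mapping (XML schema back to tables, plus the semantic constraints the paper mentions) is the harder half of the integration problem.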

  7. Field installation of a distributed database application

    International Nuclear Information System (INIS)

    A PC-based equipment-failure reporting system was designed as a distributed database over a Wide Area Network. Speed-related communication problems were encountered, necessitating a redesign of the program. In the original design, data on local failures was to reside in the several local field offices. Division-level reporting was to be accomplished by concatenating the field offices' databases over the network. Validation information was to reside in several small databases located on the division file server; this data was to be accessed by the field offices as a remote link. The program executables were to reside on yet another file server belonging to the service organization which maintained the program code, and both the field and division offices were to access the program code as a remote link. This paper discusses the network and communications facilities which were in place, the original design, the performance encountered, the design changes and the present performance. There is also a brief discussion of possible future modifications.

  8. Fuzzy Modeling of Client Preference in Data-Rich Marketing Environments

    OpenAIRE

    2000-01-01

    Advances in computational methods have led, in the world of financial services, to huge databases of client and market information. In the past decade, various computational intelligence (CI) techniques have been applied in mining this data for obtaining knowledge and in-depth information about the clients and the markets. This paper discusses the application of fuzzy clustering in target selection from large databases for direct marketing (DM) purposes. Actual data from the campaigns of a la...

  9. Fuzzy Modeling of Client Preference in Data-Rich Marketing Environments

    OpenAIRE

    Setnes, M.; Kaymak, Uzay

    2000-01-01

    Advances in computational methods have led, in the world of financial services, to huge databases of client and market information. In the past decade, various computational intelligence (CI) techniques have been applied in mining this data for obtaining knowledge and in-depth information about the clients and the markets. This paper discusses the application of fuzzy clustering in target selection from large databases for direct marketing (DM) purposes. Actual data from the campa...

  10. Client Perspective

    International Nuclear Information System (INIS)

    Training Sections can best serve the needs of their clients by encouraging them to recognize that self-determination and overall training program ownership are the key ingredients of a successful program. In a support role, Training Sections should provide excellent lesson plans and instructors, good record keeping, and feedback vehicles. Most importantly, Training Sections should communicate closely with their clients and provide maximum flexibility to support overall client responsibilities

  11. A thermodynamic database for geophysical applications

    Science.gov (United States)

    Saxena, S. K.

    2013-12-01

    Several thermodynamic databases are available for calculating the equilibrium reference state of the model Earth. Prominent among these are the databases of (a) SLB (1), (b) HP (2) and (c) FSPW (3). The two major problems, as discussed at a meeting of the database scientists (4), lie in the formulation of solid solutions and equations of state. The models adopted in databases (1) and (2) do not account for multiple components in natural solids, and the sub-lattice or compound-energy models used in (3) require a large amount of fictive-compound and mixing-energy data for which there is no ongoing effort at present. The EOS formulation in (1) is based on the Mie-Gruneisen equation of state, and in (2) on a modification of the Tait EOS with limited parameters. Database (3) adopted the Birch-Murnaghan EOS and used it at high temperature by making the compressibility a function of temperature. The models in (2) and (3) lead to physically unacceptable values of entropy and heat capacity at extreme conditions. The problem is associated as much with the EOS formulation as with the adoption of a heat-capacity variation with temperature at 1 bar, as discussed by Brosh (5). None of the databases (1), (2) or (3) includes data on multicomponent fluids at extreme conditions. These problems have been addressed in the new database, modified after (3). It retains the solution models for solids as in (3), and adds the Brosh model (5) for solid solutions and the Belonoshko et al (6) model for the 13-component C-H-O-S fluid. The Superfluid model builds on a combination of experimental data on pure and mixed fluids at temperatures below 1000 K over several kilobars and molecular-dynamics-generated data at extreme conditions, and has been found to be consistent with all the recent experimental data.
New high pressure experiments on dissociation of volatile containing solids using laser- and externally-heated DAC are being conducted to obtain new pressure-volume-temperature data on fluids to extend the current kb
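For reference, the third-order Birch-Murnaghan EOS adopted by database (3) can be evaluated directly. The formula below is the standard textbook form; the parameter values used in the sanity checks are merely plausible round numbers, not values taken from any of the databases:

```python
def birch_murnaghan(V, V0, K0, K0_prime):
    """Third-order Birch-Murnaghan pressure P(V), in the units of K0.

    V0 is the zero-pressure volume, K0 the bulk modulus at V0,
    and K0_prime its pressure derivative.
    """
    eta = (V0 / V) ** (1.0 / 3.0)
    return (1.5 * K0 * (eta ** 7 - eta ** 5)
            * (1.0 + 0.75 * (K0_prime - 4.0) * (eta ** 2 - 1.0)))

# Sanity checks: zero pressure at the reference volume, and positive
# pressure under compression (illustrative V0, K0, K0' values).
print(birch_murnaghan(10.0, 10.0, 160.0, 4.0))       # 0.0
print(birch_murnaghan(9.0, 10.0, 160.0, 4.0) > 0.0)  # True
```

The temperature dependence that database (3) adds, by making the compressibility a function of temperature, is what produces the unphysical entropy and heat-capacity values at extreme conditions noted above.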

  12. Application of the Thin-Client/Server Architecture in the Library

    Institute of Scientific and Technical Information of China (English)

    陈春芳

    2005-01-01

    Drawing on the practical application of the Thin-Client/Server architecture in a library information system, this paper analyzes the technical requirements of library automation and the advantages and disadvantages of the Thin-Client/Server architecture. Managing the terminal server operation, the client devices and the network connections with these strengths and weaknesses in mind helps to further improve the usefulness of the Thin-Client/Server architecture in the library.

  13. Application of Integrated Database to the Casting Design

    Institute of Scientific and Technical Information of China (English)

    In-Sung Cho; Seung-Mok Yoo; Chae-Ho Lim; Jeong-Kil Choi

    2008-01-01

    The construction of an integrated database including casting shapes with their casting designs, technical knowledge, and the thermophysical properties of the casting alloys is introduced in the present study. A recognition technique for casting design by industrial computed tomography was used for the construction of the shape database. Technical knowledge of casting processes, for both ferrous and non-ferrous alloys, and of the manufacturing processes of the castings was accumulated, and a search engine for this knowledge was developed. A database of the thermophysical properties of the casting alloys was obtained via experimental study, and the properties were used for the in-house computer simulation of the casting process. The databases were linked with the intelligent casting expert system developed at the Center for e-Design, KITECH. It is expected that the databases can help non-casting experts to devise castings and their processes. Various examples of application using the databases are shown in the present study.

  14. Evolution and applications of plant pathway resources and databases

    DEFF Research Database (Denmark)

    Sucaet, Yves; Deva, Taru

    2011-01-01

    Plants are important sources of food, and plant products are essential for modern human life. Plants are increasingly gaining importance as drug and fuel resources, bioremediation tools and tools for recombinant technology. Considering these applications, database infrastructure for plant model systems deserves much more attention. The study of plant biological pathways, the interconnections between these pathways and plant systems biology as a whole has in general lagged behind human systems biology. In this article we review plant pathway databases and the resources that are currently available. We lay out trends and challenges in the ongoing efforts to integrate plant pathway databases and the applications of database integration. We also discuss how progress in non-plant communities can serve as an example for the improvement of the plant pathway database landscape and thereby allow...

  15. Client/server study

    Science.gov (United States)

    Dezhgosha, Kamyar; Marcus, Robert; Brewster, Stephen

    1995-01-01

    The goal of this project is to find cost-effective and efficient strategies/solutions to integrate existing databases, manage network, and improve productivity of users in a move towards client/server and Integrated Desktop Environment (IDE) at NASA LeRC. The project consisted of two tasks as follows: (1) Data collection, and (2) Database Development/Integration. Under task 1, survey questionnaires and a database were developed. Also, an investigation on commercially available tools for automated data-collection and net-management was performed. As requirements evolved, the main focus has been task 2 which involved the following subtasks: (1) Data gathering/analysis of database user requirements, (2) Database analysis and design, making recommendations for modification of existing data structures into relational database or proposing a common interface to access heterogeneous databases(INFOMAN system, CCNS equipment list, CCNS software list, USERMAN, and other databases), (3) Establishment of a client/server test bed at Central State University (CSU), (4) Investigation of multi-database integration technologies/ products for IDE at NASA LeRC, and (5) Development of prototypes using CASE tools (Object/View) for representative scenarios accessing multi-databases and tables in a client/server environment. Both CSU and NASA LeRC have benefited from this project. CSU team investigated and prototyped cost-effective/practical solutions to facilitate NASA LeRC move to a more productive environment. CSU students utilized new products and gained skills that could be a great resource for future needs of NASA.

  16. Patterns of client behavior with their most recent male escort: an application of latent class analysis.

    Science.gov (United States)

    Grov, Christian; Starks, Tyrel J; Wolff, Margaret; Smith, Michael D; Koken, Juline A; Parsons, Jeffrey T

    2015-05-01

    Research examining interactions between male escorts and clients has relied heavily on data from escorts, men working on the street, and behavioral data aggregated over time. In the current study, 495 clients of male escorts answered questions about sexual behavior with their last hire. Latent class analysis identified four client sets based on these variables. The largest (n = 200, 40.4 %, labeled Typical Escort Encounter) included men endorsing behavior prior research found typical of paid encounters (e.g., oral sex and kissing). The second largest class (n = 157, 31.7 %, Typical Escort Encounter + Erotic Touching) included men reporting similar behaviors, but with greater variety along a spectrum of touching (e.g., mutual masturbation and body worship). Those classed BD/SM and Kink (n = 76, 15.4 %) reported activity along the kink spectrum (BD/SM and role play). Finally, men classed Erotic Massage Encounters (n = 58, 11.7 %) primarily engaged in erotic touch. Clients reporting condomless anal sex were in the minority (12.2 % overall). Escorts who engage in anal sex with clients might be appropriate to train in HIV prevention and other harm reduction practices-adopting the perspective of "sex workers as sex educators." PMID:24777440

  17. Analysis of Turbulence Datasets using a Database Cluster: Requirements, Design, and Sample Applications

    Science.gov (United States)

    Meneveau, Charles

    2007-11-01

    The massive datasets now generated by Direct Numerical Simulations (DNS) of turbulent flows create serious new challenges. During a simulation, DNS provides only a few time steps at any instant, owing to storage limitations within the computational cluster. Therefore, traditional numerical experiments done during the simulation examine each time slice only a few times before discarding it. Conversely, if a few large datasets from high-resolution simulations are stored, they are practically inaccessible to most in the turbulence research community, who lack the cyber resources to handle the massive amounts of data. Even those who can compute at that scale must run simulations again forward in time in order to answer new questions about the dynamics, duplicating computational effort. The result is that most turbulence datasets are vastly underutilized and not available as they should be for creative experimentation. In this presentation, we discuss the desired features and requirements of a turbulence database that will enable its widest access to the research community. The guiding principle of large databases is ``move the program to the data'' (Szalay et al. ``Designing and mining multi-terabyte Astronomy archives: the Sloan Digital Sky Survey,'' in ACM SIGMOD, 2000). However, in the case of turbulence research, the questions and analysis techniques are highly specific to the client and vary widely from one client to another. This poses particularly hard challenges in the design of database analysis tools. We propose a minimal set of such tools that are of general utility across various applications. And, we describe a new approach based on a Web services interface that allows a client to access the data in a user-friendly fashion while allowing maximum flexibility to execute desired analysis tasks. Sample applications will be discussed. 
This work is performed by the interdisciplinary ITR group, consisting of the author and Yi Li(1), Eric Perlman(2), Minping Wan(1
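The ``move the program to the data'' principle described in this record can be sketched in a few lines: rather than shipping raw velocity fields to the client, the client names a registered analysis task and the cluster executes it next to the data. Everything below (the dataset, the task names, the registry) is invented for illustration, not taken from the actual system.

```python
# A tiny in-memory stand-in for one stored time slice of a turbulence dataset.
DATASET = {"u": [0.1, 0.4, -0.2, 0.3], "v": [0.0, -0.1, 0.2, 0.1]}

# A minimal set of general-utility analysis tools, registered server-side.
ANALYSES = {
    "mean": lambda field: sum(field) / len(field),
    "energy": lambda field: sum(x * x for x in field) / len(field),
}

def run_analysis(task, field_name, dataset=DATASET):
    """Execute a named analysis on a named field, on the server side."""
    if task not in ANALYSES:
        raise ValueError(f"unknown analysis task: {task}")
    return ANALYSES[task](dataset[field_name])
```

The client only ever transmits the task name and field name; the (potentially terabyte-scale) field itself never leaves the cluster.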

  18. Database applications in high energy physics

    International Nuclear Information System (INIS)

    High Energy physicists were using computers to process and store their data early in the history of computing. They addressed problems of memory management, job control, job generation, data standards, file conventions, multiple simultaneous usage, tape file handling and data management earlier than, or at the same time as, the manufacturers of computing equipment. The HEP community have their own suites of programs for these functions, and are now turning their attention to the possibility of replacing some of the functional components of their 'homebrew' systems with more widely used software and/or hardware. High on the 'shopping list' for replacement is data management. ECFA Working Group 11 has been working on this problem. This paper reviews the characteristics of existing HEP systems and existing database systems and discusses the way forward. (orig.)

  19. Application of the device database in the Python programming

    International Nuclear Information System (INIS)

The Device Database in the KEKB accelerator control system has been developed using a relational database. It contains many kinds of parameters of the devices, mainly magnets and magnet power supplies. The parameters consist of the wiring information, the addresses of the interfaces, the specifications of the hardware, the calibration constants, the magnetic field excitation functions and any other parameters needed for device control. These parameters are necessary not only for constructing the EPICS IOC database but also for providing information to the high-level application programs, most of which are written in script languages such as SAD or Python. Python in particular is often used to access the Device Database. For this purpose, a Python library module designed to handle tabular data from the relational database in memory has been developed. An overview of the library module is reported. (author)
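A minimal sketch of what such an in-memory tabular module might look like; the `Table` class, column names and parameter values below are invented for illustration and are not the actual KEKB library:

```python
class Table:
    """A small read-only tabular container, rows addressable by column name."""
    def __init__(self, columns, rows):
        self.columns = list(columns)
        self.rows = [dict(zip(columns, r)) for r in rows]

    def select(self, **criteria):
        """Return the rows whose fields match all given criteria."""
        return [r for r in self.rows
                if all(r.get(k) == v for k, v in criteria.items())]

# Example: hypothetical magnet parameters, as the abstract describes.
magnets = Table(
    ["name", "type", "calibration"],
    [("BM_01", "bend", 0.9981), ("QM_02", "quad", 1.0013)],
)
```

A high-level script can then query `magnets.select(type="quad")` without touching the relational backend on every lookup.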

  20. Registry of EPA Applications, Models, and Databases

    Data.gov (United States)

    U.S. Environmental Protection Agency — READ is EPA's authoritative source for information about Agency information resources, including applications/systems, datasets and models. READ is one component of...

  1. A relational database application in support of integrated neuroscience research.

    Science.gov (United States)

    Rudowsky, Ira; Kulyba, Olga; Kunin, Mikhail; Ogarodnikov, Dmitri; Raphan, Theodore

    2004-12-01

    The development of relational databases has significantly improved the performance of storage, search, and retrieval functions and has made it possible for applications that perform real-time data acquisition and analysis to interact with these types of databases. The purpose of this research was to develop a user interface for interaction between a data acquisition and analysis application and a relational database using the Oracle9i system. The overall system was designed to have an indexing capability that threads into the data acquisition and analysis programs. Tables were designed and relations within the database for indexing the files and information contained within the files were established. The system provides retrieval capabilities over a broad range of media, including analog, event, and video data types. The system's ability to interact with a data capturing program at the time of the experiment to create both multimedia files as well as the meta-data entries in the relational database avoids manual entries in the database and ensures data integrity and completeness for further interaction with the data by analysis applications. PMID:15657974
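The indexing idea in this record, metadata rows in a relational database pointing at the multimedia files written by the acquisition program, can be sketched with SQLite standing in for Oracle9i; the schema, table name and paths are hypothetical:

```python
import sqlite3

# Hypothetical schema: one metadata row per recorded file, so analysis
# applications can locate files by experiment and media type.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE recording (
    experiment_id TEXT, media_type TEXT, path TEXT, started_at TEXT)""")
conn.execute("INSERT INTO recording VALUES "
             "('exp42', 'video', '/data/exp42.avi', '2004-12-01')")
conn.execute("INSERT INTO recording VALUES "
             "('exp42', 'analog', '/data/exp42.dat', '2004-12-01')")

def files_for(experiment_id, media_type):
    """Retrieve the file paths indexed for one experiment and media type."""
    cur = conn.execute(
        "SELECT path FROM recording WHERE experiment_id=? AND media_type=?",
        (experiment_id, media_type))
    return [row[0] for row in cur.fetchall()]
```

Because the acquisition program inserts these rows at capture time, the index never drifts out of sync with the files, which is the data-integrity point the abstract makes.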

  2. Databases

    Data.gov (United States)

    National Aeronautics and Space Administration — The databases of computational and experimental data from the first Aeroelastic Prediction Workshop are located here. The databases file names tell their contents...

  3. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  4. Contracting preference relations for database applications

    OpenAIRE

    Mindolin, Denis; Chomicki, Jan

    2009-01-01

    The binary relation framework has been shown to be applicable to many real-life preference handling scenarios. Here we study preference contraction: the problem of discarding selected preferences. We argue that the property of minimality and the preservation of strict partial orders are crucial for contractions. Contractions can be further constrained by specifying which preferences should be protected. We consider two classes of preference relations: finite and finitely representable. We pre...

  5. Application of the Non—Stationary Oil Film Force Database

    Institute of Scientific and Technical Information of China (English)

WANG Wen; ZHANG Zhi-ming; et al.

    2001-01-01

The technique of a non-stationary oil film force database for hydrodynamic bearings is introduced and its potential applications in nonlinear rotor dynamics are demonstrated. Through simulations of the locus of the shaft center aided by the database technique, nonlinear stability analysis can be performed and the natural frequency can be obtained as well. The ease of "assembling" the individual bush forces from the database to form the bearing force makes it very convenient to evaluate the stability of various types of journal bearings. Examples are demonstrated to show how the database technique makes it possible to obtain abundant simulation results at the expense of very short calculation times.
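The core of such a database technique is replacing a per-timestep hydrodynamic solve with a table lookup plus interpolation. A one-dimensional sketch, with an invented grid of journal eccentricities and made-up nondimensional forces (the real database would be multi-dimensional):

```python
from bisect import bisect_left

# Placeholder grid: precomputed bush force vs. journal eccentricity.
ECCENTRICITY = [0.0, 0.2, 0.4, 0.6, 0.8]
FORCE = [0.0, 1.1, 2.9, 6.2, 14.0]   # illustrative nondimensional values

def bush_force(ecc):
    """Linearly interpolate the tabulated force at eccentricity `ecc`."""
    i = bisect_left(ECCENTRICITY, ecc)
    if i == 0:
        return FORCE[0]
    if i == len(ECCENTRICITY):
        return FORCE[-1]
    x0, x1 = ECCENTRICITY[i - 1], ECCENTRICITY[i]
    f0, f1 = FORCE[i - 1], FORCE[i]
    return f0 + (f1 - f0) * (ecc - x0) / (x1 - x0)
```

At each step of the shaft-center locus simulation, the integrator calls `bush_force` instead of solving the Reynolds equation, which is where the large speedup comes from.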

  6. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working, as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even if clients are geographically distributed, if data copies are located close to them. Despite its advantages, replication is not a straightforward technique to apply, and
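The fault-tolerance and load-distribution points above can be illustrated with a toy store that rotates reads across replicas and skips a failed one; the class and its behavior are a simplified sketch, not any real replication protocol:

```python
import itertools

class ReplicatedStore:
    """Toy replicated store: reads rotate round-robin across live replicas."""
    def __init__(self, replicas):
        self.replicas = list(replicas)          # each replica: a dict copy
        self._rr = itertools.cycle(range(len(self.replicas)))

    def read(self, key):
        # A failed replica (None) is skipped; any live copy serves the read.
        for _ in range(len(self.replicas)):
            i = next(self._rr)
            if self.replicas[i] is not None:
                return self.replicas[i][key]
        raise RuntimeError("no live replica")

store = ReplicatedStore([{"x": 1}, {"x": 1}, {"x": 1}])
store.replicas[1] = None                        # simulate one replica failing
```

What this sketch omits, keeping the copies consistent under writes, is exactly the part the abstract warns is "not a straightforward technique to apply".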

  7. A Green's function database platform for seismological research and education: applications and examples

    Science.gov (United States)

    Heimann, Sebastian; Kriegerowski, Marius; Dahm, Torsten; Simone, Cesca; Wang, Rongjiang

    2016-04-01

The study of seismic sources from measured waveforms requires synthetic elementary seismograms (Green's functions, GF) calculated for specific earth models and source-receiver geometries. Since the calculation of GFs is computationally expensive and requires careful parameter testing and quality control, pre-calculated GF databases, which can be re-used for different types of applications, can be of advantage. We developed a GF database web platform for the seismological community (http://kinherd.org/), where a researcher can share Green's function stores and retrieve synthetic seismograms on the fly for various point and extended earthquake source models, for many different earth models, at local, regional and global scale. This web service is part of a rich new toolset for the creation and handling of Green's functions and synthetic seismograms (http://emolch.github.com/pyrocko/gf). It can be used off-line or in client mode. We demonstrate core features of the GF platform with different applications on global, regional and local scales. These include the automatic inversion of kinematic source parameters from teleseismic body waves, the improved depth estimation of shallow induced earthquakes from regional seismological arrays, and the relative moment tensor inversion of local earthquakes from volcanically induced seismicity.

  8. Molten salts database for energy applications

    CERN Document Server

    Serrano-López, Roberto; Cuesta-López, Santiago

    2013-01-01

    The growing interest in energy applications of molten salts is justified by several of their properties. Their possibilities of usage as a coolant, heat transfer fluid or heat storage substrate, require thermo-hydrodynamic refined calculations. Many researchers are using simulation techniques, such as Computational Fluid Dynamics (CFD) for their projects or conceptual designs. The aim of this work is providing a review of basic properties (density, viscosity, thermal conductivity and heat capacity) of the most common and referred salt mixtures. After checking data, tabulated and graphical outputs are given in order to offer the most suitable available values to be used as input parameters for other calculations or simulations. The reviewed values show a general scattering in characterization, mainly in thermal properties. This disagreement suggests that, in several cases, new studies must be started (and even new measurement techniques should be developed) to obtain accurate values.
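Molten-salt densities of the kind reviewed here are commonly tabulated as linear correlations in temperature, rho(T) = a - b*T, which makes them trivial to expose as input parameters for CFD codes. The coefficients below are placeholders for illustration, not the paper's reviewed values:

```python
# Placeholder coefficients (a [kg/m^3], b [kg/(m^3*K)]) per salt mixture;
# a real table would carry reviewed values and validity ranges.
SALTS = {
    "FLiNaK": (2579.3, 0.624),
}

def density(salt, temp_k):
    """Evaluate the linear density correlation rho(T) = a - b*T."""
    a, b = SALTS[salt]
    return a - b * temp_k
```

The scatter the abstract reports between published measurements is why such a lookup should also record which source each (a, b) pair came from.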

  9. [The application of graphic (visual) databases in neurology and neuropathology].

    Science.gov (United States)

    Lechowicz, W; Milewska, D; Swiderski, W; Dymecki, J

    1994-01-01

The possibilities and principles of creating visual databases in the Windows environment are described; such databases could be used to compile a multimedia encyclopedia of selected nosological entities. Databases of this kind are particularly valuable in education, making it possible to locate data and compare them. The method of organizing "files" using standard programs of the Windows package, and the technique of coding images in the form of symbolic icons, are presented. Examples are given of graphic databases of neurological and neuropathological data developed in the Windows environment using professional application programs (framing and retouching of images in graphic editors, icon coding, macroinstructions for reading information) entered into the corresponding visual database of the analysed cases. PMID:8065541

  10. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  11. Development of Integrated PSA Database and Application Technology

    International Nuclear Information System (INIS)

The high quality of PSA is essential for risk-informed regulation and applications. The main elements of PSA are the model, methodology, reliability data, and tools. The purpose of the project is to develop the reliability database for the Korean nuclear power plants and a PSA analysis and management system. The reliability database system has been developed and reliability data have been collected for 4 types of data: the reactor trip, the piping, the component and the common cause failure. The database provides the reliability data for PSAs and risk-informed applications. The FTREX software is the fastest PSA quantification engine in the world. A license agreement between KAERI and EPRI was made to sell FTREX to the members of EPRI. The advanced PSA management system AIMS-PSA has been developed. The PSA model is stored in the database and solved by clicking one button. All the information necessary for the KSNP Level-1 and 2 PSA is stored in the PSA information database. It provides PSA users a useful means to review and analyze the PSA

  12. Mapper: A distributed object-oriented database application

    Science.gov (United States)

    Younger, Herbert; O'Reilly, John; Frogner, Bjorn

    1995-01-01

    This paper discusses the results of a Small Business Innovation Research (SBIR) project. The initial application involved decomposition of a large database across multiple processors to improve the speed of processing compound queries. The commercial outcome was a tourist information system with a point-to-point driving direction program called MAPPER. A distributed, object-oriented approach was used for the general design, while a spatial decomposition was used to divide the database into computationally manageable pieces. The resulting system is highly flexible with respect to both modifications and reuse.
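The spatial decomposition this record describes, dividing a large geographic database into computationally manageable pieces, can be sketched as binning points into grid cells so each processor owns one cell; the cell size and points below are invented:

```python
def cell_of(x, y, cell_size=10.0):
    """Map a coordinate to the grid cell that owns it."""
    return (int(x // cell_size), int(y // cell_size))

def decompose(points, cell_size=10.0):
    """Partition points into grid cells, one workload per cell."""
    cells = {}
    for x, y in points:
        cells.setdefault(cell_of(x, y, cell_size), []).append((x, y))
    return cells
```

A compound query (such as point-to-point driving directions) then only touches the processors whose cells the route crosses, which is where the speedup comes from.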

  13. An integrated medical image database and retrieval system using a web application server.

    Science.gov (United States)

    Cao, Pengyu; Hashiba, Masao; Akazawa, Kouhei; Yamakawa, Tomoko; Matsuto, Takayuki

    2003-08-01

We developed an Integrated Medical Image Database and Retrieval System (INIS) for easy access by medical staff. The INIS mainly consisted of four parts: specific servers to save medical images from multi-vendor modalities of CT, MRI, CR, ECG and endoscopy; an integrated image database (DB) server to save various kinds of images in DICOM format; a Web application server to connect clients to the integrated image DB; and Web browser terminals connected to an HIS. The INIS provided a common screen design for retrieving CT, MRI, CR, endoscopic and ECG images and radiological reports, allowing doctors to retrieve radiological images and corresponding reports, or ECG images of a patient, simultaneously on one screen. Doctors working in internal medicine on average accessed information 492 times a month. Doctors working in cardiology and gastroenterology accessed information 308 times a month. Using the INIS, medical staff could browse all or parts of a patient's medical images and reports. PMID:12909158

  14. Database application for changing data models in environmental engineering

    Energy Technology Data Exchange (ETDEWEB)

    Hussels, Ulrich; Camarinopoulos, Stephanos; Luedtke, Torsten; Pampoukis, Georgios [RISA Sicherheitsanalysen GmbH, Berlin-Charlottenburg (Germany)

    2013-07-01

Whenever a technical task is to be solved with the help of a database application and uncertainties regarding the structure, scope or level of detail of the data model exist (either currently or in the future), the use of a generic database application can considerably reduce the cost of implementation and maintenance. Simultaneously, the approach described in this contribution permits operating with different views on the data, and even finding and defining new views which had not been considered before. The prerequisite for this is that the preliminary information (structure as well as data) stored in the generic application matches the intended use. In this case, parts of the model developed with the generic approach can be reused and the corresponding efforts for a major rebuild can be saved. This significantly reduces the development time. At the same time, flexibility is achieved concerning the environmental data model, which is not given in the context of conventional developments. (orig.)

  15. Current research status, databases and application of single nucleotide polymorphism.

    Science.gov (United States)

    Javed, R; Mukesh

    2010-07-01

Single Nucleotide Polymorphisms (SNPs) are the most frequent form of DNA variation in the genome. SNPs are bi-allelic genetic markers, and the number of known SNPs is growing at a very fast rate. Current genomic databases contain information on several million SNPs. More than 6 million SNPs have been identified, and the information is publicly available through the efforts of the SNP Consortium and other databases. The NCBI plays a major role in facilitating the identification and cataloging of SNPs through the creation and maintenance of the public SNP database (dbSNP), which is used by the biomedical community worldwide and stimulates many areas of biological research, including the identification of the genetic components of disease. In this review article, we compile the existing SNP databases, their research status and their applications. PMID:21717869

  16. Server-side verification of client behavior in cryptographic protocols

    OpenAIRE

    Chi, Andrew; Cochran, Robert; Nesfield, Marie; Reiter, Michael K.; Sturton, Cynthia

    2016-01-01

    Numerous exploits of client-server protocols and applications involve modifying clients to behave in ways that untampered clients would not, such as crafting malicious packets. In this paper, we demonstrate practical verification of a cryptographic protocol client's messaging behavior as being consistent with the client program it is believed to be running. Moreover, we accomplish this without modifying the client in any way, and without knowing all of the client-side inputs driving its behav...

  17. The Network Configuration of an Object Relational Database Management System

    Science.gov (United States)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  18. Design and Test of Application-Specific Integrated Circuits by use of Mobile Clients

    Directory of Open Access Journals (Sweden)

    Michael Auer

    2009-02-01

The aim of this work is to develop a simultaneous multi-user access system, READ (Remote ASIC Design and Test), that allows users to perform tests and measurements remotely via clients running on mobile devices as well as on standard PCs. The system also facilitates the remote design of circuits with the PAC-Designer. The system is controlled by LabVIEW and was implemented using a data acquisition card from National Instruments. Such systems are especially suited for manufacturing process monitoring and control. The performance of simultaneous access was tested under load with a variable number of users. The server implements a queue that processes users' commands upon request.

  19. 顾及语义差异的基础地理信息客户数据库更新实施模型%Fundamental Geo-information Client Database Updating Model Considering Semantic Heterogeneities

    Institute of Scientific and Technical Information of China (English)

    王育红; 牛亚辉; 林艳

    2011-01-01

Client database updating refers to the process of utilizing the related information of the changed and updated features in the new version of a fundamental geographical information database to perform the corresponding cascade updates on a client database, ensuring that it also has good currency. Current research emphasizes the distribution and delivery of updating information, but how to implement client database updating efficiently, especially in the case of semantic heterogeneity, has not been fully considered. Aiming at this problem, the semantic heterogeneities between the two databases are first summarized, and their impacts on the updating implementation process are described in terms of efficiency and of the completeness, consistency and correctness of data. Finally, an updating implementation model consisting of three basic operations, semantic matching, updates extraction and updates integration, is proposed according to the theory of semantic mapping and transformation, and the execution strategies and key steps of these operations are discussed. Through this discussion and analysis, the concept of fundamental geo-information client database updating, its implementation requirements and technical difficulties, the corresponding solutions, and the key problems requiring further research are made clear. Based on current work and future research, an automated and efficient software tool for client database updating is promising to be developed and realized.

  20. Advancements in web-database applications for rabies surveillance

    OpenAIRE

    Bélanger Denise; Coté Nathalie; Gendron Bruno; Lelièvre Frédérick; Rees Erin E

    2011-01-01

    Abstract Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among pa...

  1. Design and implementation of a cartographic client application for mobile devices using SVG Tiny and J2ME

    Science.gov (United States)

    Hui, L.; Behr, F.-J.; Schröder, D.

    2006-10-01

The dissemination of digital geospatial data is now available on mobile devices such as PDAs (personal digital assistants) and smartphones. Mobile devices which support J2ME (Java 2 Micro Edition) offer users and developers an open interface, which they can use to develop or download software according to their own demands. A WMS (Web Map Service) can now deliver not only traditional raster images but also vector images. SVGT (Scalable Vector Graphics Tiny) is a subset of SVG (Scalable Vector Graphics), and because of its precise vector information, original styling and small file size, the SVGT format is well suited for geographic mapping purposes, especially on mobile devices with limited network bandwidth. This paper describes the development of a cartographic client for mobile devices using SVGT and J2ME technology. The mobile device was simulated on a desktop computer for a series of tests with a WMS, for example, sending requests, receiving the response data, and displaying images in both vector and raster formats. The analysis and design of the system structure, such as the user interface and code structure, are discussed; the limitations of mobile devices had to be taken into consideration for this application. The parsing of the XML document received from the WMS after a GetCapabilities request, and the visual rendering of SVGT and PNG (Portable Network Graphics) images, are important implementation issues. Finally, the client was tested successfully on Nokia S40/60 mobile phones.
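The capabilities-parsing step mentioned in this record can be sketched as pulling layer names and supported image formats out of a GetCapabilities response; the XML below is a trimmed, invented stand-in for a real WMS reply (the client in the paper does this in J2ME, but the idea is language-independent):

```python
import xml.etree.ElementTree as ET

# Invented, heavily trimmed stand-in for a WMS GetCapabilities response.
CAPS = """<WMT_MS_Capabilities>
  <Capability>
    <Request><GetMap><Format>image/svg+xml</Format>
      <Format>image/png</Format></GetMap></Request>
    <Layer><Name>roads</Name></Layer>
    <Layer><Name>rivers</Name></Layer>
  </Capability>
</WMT_MS_Capabilities>"""

def parse_capabilities(xml_text):
    """Extract the advertised GetMap formats and layer names."""
    root = ET.fromstring(xml_text)
    formats = [f.text for f in root.iter("Format")]
    layers = [n.text for n in root.iter("Name")]
    return formats, layers
```

A client checks the format list for `image/svg+xml` to decide whether it can request vector maps, falling back to PNG otherwise.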

  2. Mining the Galaxy Zoo Database: Machine Learning Applications

    Science.gov (United States)

    Borne, Kirk D.; Wallin, J.; Vedachalam, A.; Baehr, S.; Lintott, C.; Darg, D.; Smith, A.; Fortson, L.

    2010-01-01

The new Zooniverse initiative is addressing the data flood in the sciences through a transformative partnership between professional scientists, volunteer citizen scientists, and machines. As part of this project, we are exploring the application of machine learning techniques to data mining problems associated with the large and growing database of volunteer science results gathered by the Galaxy Zoo citizen science project. We will describe the basic challenge, some machine learning approaches, and early results. One of the motivators for this study is the acquisition (through the Galaxy Zoo results database) of approximately 100 million classification labels for roughly one million galaxies, yielding a tremendously large and rich set of training examples for improving automated galaxy morphological classification algorithms. In our first case study, the goal is to learn which morphological and photometric features in the Sloan Digital Sky Survey (SDSS) database correlate most strongly with user-selected galaxy morphological class. As a corollary to this study, we are also aiming to identify which galaxy parameters in the SDSS database correspond to galaxies that have been the most difficult to classify (based upon large dispersion in their volunteer-provided classifications). Our second case study will focus on similar data mining analyses and machine learning algorithms applied to the Galaxy Zoo catalog of merging and interacting galaxies. The outcomes of this project will have applications in future large sky surveys, such as the LSST (Large Synoptic Survey Telescope) project, which will generate a catalog of 20 billion galaxies and will produce an additional astronomical alert database of approximately 100 thousand events each night for 10 years -- the capabilities and algorithms that we are exploring will assist in the rapid characterization and classification of such massive data streams. This research has been supported in part through NSF award #0941610.
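The first case study, ranking features by how strongly they correlate with the volunteer-voted class, can be caricatured with a toy separation score (difference of per-class means per feature); the feature names, numbers and labels below are invented, and real analyses would use proper statistics on millions of galaxies:

```python
def feature_separation(samples, labels):
    """For each feature index, |mean over class 1 - mean over class 0|."""
    n_features = len(samples[0])
    scores = []
    for j in range(n_features):
        c0 = [s[j] for s, l in zip(samples, labels) if l == 0]
        c1 = [s[j] for s, l in zip(samples, labels) if l == 1]
        scores.append(abs(sum(c1) / len(c1) - sum(c0) / len(c0)))
    return scores

# Invented toy data: feature 0 = a color index, feature 1 = concentration.
X = [(0.2, 2.1), (0.3, 2.0), (0.8, 3.9), (0.9, 4.1)]
y = [0, 0, 1, 1]
```

A feature with a high separation score is a candidate input for the automated morphological classifier the abstract aims to improve.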

  3. Advancements in web-database applications for rabies surveillance

    Directory of Open Access Journals (Sweden)

    Bélanger Denise

    2011-08-01

    Full Text Available Abstract Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. Results RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements of RageDB to rabies surveillance databases include 1 automatic integration of multi-agency data and diagnostic results on a daily basis; 2 a web-based data editing interface that enables authorized users to add, edit and extract data; and 3 an interactive dashboard to help visualize data simply and efficiently, in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies suspect animals. We also discuss how sightings data can indicate public perception to the risk of racoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. Conclusions RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from

  4. Group-oriented coordination models for distributed client-server computing

    Science.gov (United States)

    Adler, Richard M.; Hughes, Craig S.

    1994-01-01

    This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.
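The decompose/dispatch/combine pattern described above is a scatter-gather: the following sketch dispatches one query to independent servers in parallel and merges the partial results into a single response. The "servers" are plain functions and all data is invented; a real system would issue network requests via a request broker.

```python
from concurrent.futures import ThreadPoolExecutor

# Two stand-in servers, each owning a disjoint partition of the data.
def server_a(query):
    return [r for r in ["alpha", "apricot"] if query in r]

def server_b(query):
    return [r for r in ["apex", "beta"] if query in r]

def coordinated_query(query, servers=(server_a, server_b)):
    """Dispatch the query to every server, then combine partial results."""
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(lambda s: s(query), servers))
    combined = []
    for part in partials:
        combined.extend(part)
    return combined
```

From the client's point of view this is one request and one response; the coordination layer hides how many servers were involved.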

  5. Realization of client/server management information system of coal mine based on ODBC in geology and survey

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Q.; Mao, S.; Yang, F.; Han, Z. [Shandong University of Science and Technology (China). Geoscience Department

    2000-08-01

The paper describes in detail the framework and application theory of Open Database Connectivity (ODBC), the formation of a client/server geological and surveying management information system, and the connection of the various databases. The constitution and functional realization of the geological management information system are then systematically introduced. 5 refs., 5 figs.

  6. Application Program Interface for the Orion Aerodynamics Database

    Science.gov (United States)

    Robinson, Philip E.; Thompson, James

    2013-01-01

The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities for built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems.
The

  7. AIDA Asia. Artificial Insemination Database Application. User manual. 1

    International Nuclear Information System (INIS)

Artificial Insemination Database Application (AIDA-Asia) is a computer application to store and analyze information from AI services (farms, females inseminated, semen, estrus characteristics, inseminator and pregnancy diagnosis data). The need for such an application arose during a consultancy undertaken by the author for the International Atomic Energy Agency (IAEA, Vienna) under the framework of its Regional Co-operative Agreement for Asia and the Pacific (RCA), which is implementing a project on 'Improving Animal Productivity and Reproductive Efficiency' (RAS/5/035). The detailed specifications for the application were determined through a Task Force Meeting of National Consultants from five RCA Member States, organized by the IAEA and held in Sri Lanka in April 2001. The application has been developed in MS Access 2000 and Visual Basic for Applications (VBA) 6.0. However, it can run as a stand-alone application through its own executable files. It is based on screen forms and command buttons for entering and editing information. The structure of the data, the design of the application and the VBA code cannot be seen or modified by users. However, the designated administrator of AIDA-Asia in each country can customize it

  8. New Trend of Database for the Internet Era --Object database and its application

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In the Internet era, the relational database in general use cannot be applied to some of the problems that must be solved. In this thesis, we describe the capabilities required of database management systems and compare them with the limitations of the RDB. We also introduce the object database and its efficiency, which will be the new trend in databases. We use "Jasmine2000" as a concrete example of an object database in business use and verify its efficiency through its applied cases. At the end, we point the way to the database's future.

  9. Client-Side Monitoring for Web Mining.

    Science.gov (United States)

    Fenstermacher, Kurt D.; Ginsburg, Mark

    2003-01-01

    Discusses mining Web data to draw conclusions about Web users and proposes a client-side monitoring system that supports flexible data collection and encompasses client-side applications beyond the Web browser to incorporate standard office productivity tools. Highlights include goals for client-side monitoring; framework for user monitoring,…

  10. Allied health applications of a computerized clinical log database system.

    Science.gov (United States)

    Boyce, K E; Winn, J S; Anderson, S L; Bryant, B G

    1999-01-01

    Preliminary research in the development and use of computerized clinical log records began in 1987 in an allied health college at a midwestern academic health center. This article reviews development and implementation of a computerized system for managing clinical log records to improve and enhance allied health educational programs in the radiation sciences. These clinical log databases are used for quantitative and qualitative analyses of student participation in clinical procedures, and educational planning for each student. Collecting and recording data from clinical log records serves as a valuable instructional tool for students, with both clinical and didactic applications. PMID:10389054

  11. A RAD approach to client/server system development

    International Nuclear Information System (INIS)

    The capability, richness, and leverage of inexpensive commercial operating systems, off-the-shelf applications, and powerful developing tools have made building feature-rich client/server systems possible in rapid time and at low cost--ushering in a new level of systems integration not before possible. The authors achieve rapid application development (RAD) by using a flexible and extendible client/service integration framework. The framework provides the means to integrate in-house and third-party software applications with databases and expert-system knowledge bases and, where appropriate, provides communication links among the applications. The authors discuss the integration framework's capabilities, explain its underlying system architecture, and outline the methods and tools used to customize and integrate many diverse applications

  12. Database application research in real-time data access of accelerator control system

    International Nuclear Information System (INIS)

    The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system that involves many types and large amounts of real-time data access during operation. Database systems have wide application prospects in large-scale accelerator control systems; replacing the various dedicated data structures with a mature, standardized database system is the future development direction of accelerator control. Based on database interface technology, real-time data access testing, and system optimization research, this article discusses the feasibility of applying database systems in accelerators and lays the foundation for the wide-scale application of database systems in the SSRF accelerator control system. (authors)

  13. Graph databases and their applications to social networks

    OpenAIRE

    BARTHA, Tomáš

    2015-01-01

    The main idea of this bachelor thesis is to give a theoretical introduction to NoSQL databases, focusing on graph databases and their use in social networks. A sample database is implemented using Neo4j, demonstrating use cases of the Cypher query language. In the last part, the sample data are analyzed using the Neo4j server and the query language, and the analysis results are assessed.
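The friend-of-a-friend query that Cypher expresses declaratively can be sketched imperatively in plain Python over an adjacency list; the names, data, and the Cypher fragment in the comment are invented for illustration:

```python
# Toy social graph as an adjacency list. In Neo4j the same data would be
# Person nodes with FRIEND relationships, queried roughly as:
#   MATCH (a:Person {name: 'Ann'})-[:FRIEND]->()-[:FRIEND]->(fof)
#   WHERE NOT (a)-[:FRIEND]->(fof) AND fof <> a
#   RETURN DISTINCT fof.name
friends = {
    "Ann": {"Bob", "Eva"},
    "Bob": {"Carl"},
    "Eva": {"Carl", "Dan"},
    "Carl": set(),
    "Dan": {"Ann"},
}

def friends_of_friends(graph, person):
    """People reachable in exactly two hops, excluding direct friends and self."""
    direct = graph[person]
    two_hop = set().union(*(graph[f] for f in direct)) if direct else set()
    return two_hop - direct - {person}

print(sorted(friends_of_friends(friends, "Ann")))  # → ['Carl', 'Dan']
```

The contrast is the point of graph databases: the traversal logic written out by hand above is a one-line pattern match in Cypher, and the storage engine indexes relationships so multi-hop queries avoid the join costs a relational schema would incur.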

  14. Database Sampling to Support the Development of Data-Intensive Applications

    OpenAIRE

    Bisbal, Jesus

    2000-01-01

    A prototype database is a model of a database which exhibits the desired properties, in terms of its schema and/or data values, of an operational database. Database prototyping has been proposed as a technique to support the database design process in particular, and the whole data-intensive application development process in general (e.g. requirements elicitation, software testing, experimentation with design alternatives). Existing work in this area has been widely ignored in...

  15. Development of application program and building database to increase facilities for using the radiation effect assessment computer codes

    International Nuclear Information System (INIS)

    The current radiation effect assessment system requires skilled technique in applying the various codes and a high level of specialized knowledge in each field. As a matter of fact, it is therefore very difficult for radiation users who lack such specialized knowledge to assess or recognize radiation effects properly. To address this, we previously developed five Windows-based computer codes (the radiation effect assessment system) for radiation-utilizing fields including nuclear power generation. A computer program is needed so that non-specialists can easily use these five codes. We therefore implemented an AI-based expert system that can infer the appropriate assessment by itself according to the characteristics of a given problem. The expert program can guide users, search data, and put questions directly to the administrator. Conceptually, considering the circumstances that a user applying the five computer codes may actually encounter, we addressed the following aspects. First, access to the necessary concepts and data must be improved. Second, acquiring the relevant reference theory and using the corresponding computer code must be easy. Third, a Q and A function, needed to resolve users' questions, had previously been left out of consideration. Finally, the database must be renewed continuously. To meet these needs, we developed a client program to organize reference data, built an access methodology (query) for the organized data, and added a visual display function for retrieved data. An instruction method (an effective procedure and methodology for acquiring the theory) referring to the five computer codes was implemented, and a data-structure access program (DBMS) was developed to renew the data continuously with ease. For the Q and A function, a Q and A board was implemented within the client program so that its users can search the content of questions and answers. (authors)

  16. Android as a platform for database application development case : Winha mobile

    OpenAIRE

    Muli, Joseph

    2013-01-01

    This thesis aims to help beginner Android developers and anyone interested in database integration on the Android operating system understand the fundamentals of how to design database applications, specifically for mobile devices. A review of Android applications has been made to give an overview of the general properties of the applications in relation to database creation and management. To accomplish this thesis, SQL (Structured Query Language) and Android application development w...
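Android's bundled database engine is SQLite, so the creation-and-management pattern the thesis covers can be sketched with Python's built-in sqlite3 module; the table, columns, and data are invented, and on Android the equivalent calls would go through SQLiteOpenHelper and SQLiteDatabase:

```python
import sqlite3

# In-memory database standing in for an app's local store.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE grades (
                    student TEXT NOT NULL,
                    course  TEXT NOT NULL,
                    grade   REAL NOT NULL)""")
rows = [("Alice", "Math", 4.5), ("Alice", "Physics", 3.0), ("Bob", "Math", 2.5)]
conn.executemany("INSERT INTO grades VALUES (?, ?, ?)", rows)
conn.commit()

# Parameterised query: the same placeholder pattern Android's
# rawQuery()/query() methods use to avoid SQL injection.
avg = conn.execute("SELECT AVG(grade) FROM grades WHERE student = ?",
                   ("Alice",)).fetchone()[0]
print(avg)  # → 3.75
conn.close()
```

The design point carries over directly: schema definition, parameterised statements, and explicit transaction boundaries are the same whether the SQLite file lives on a desktop or in an app's private storage.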

  17. Application of modern reliability database techniques to military system data

    Energy Technology Data Exchange (ETDEWEB)

    Bunea, Cornel [Department of Engineering Management and Systems Engineering, George Washington University, 1776 G Street, NW, Suite 9, Washington, DC 20052 (United States)], E-mail: cornel@gwu.edu; Mazzuchi, Thomas A.; Sarkani, Shahram; Chang, H.-C. [Department of Engineering Management and Systems Engineering, The George Washington University, 1776 G Street, NW, Suite 9, Washington, DC 20052 (United States)

    2008-01-15

    This paper focuses on analysis techniques for modern reliability databases, with an application to military system data. The analysis of a military system database consists of the following steps: clean the data and perform operations on it in order to obtain good estimators; present simple plots of the data; analyze the data with statistical and probabilistic methods. Each step is dealt with separately and the main results are presented. Competing risks theory is advocated as the mathematical support for the analysis. The general framework of competing risks theory is presented, together with simple independent and dependent competing risks models available in the literature. These models are used to identify the reliability and maintenance indicators required by the operating personnel. Model selection is based on graphical interpretation of the plotted data.
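The competing-risks view, in which each recorded event is the first of several possible causes to occur, can be illustrated with a small simulation under an assumed independent-exponential model; the hazard rates and cause labels below are invented and are not taken from the paper's data:

```python
import random

random.seed(1)
# Assumed hazard rates (per hour) for two competing causes of removal.
RATE_FAILURE, RATE_MAINT = 1 / 500.0, 1 / 300.0

def one_history():
    """Independent competing risks: observe only the first event and its cause."""
    t_fail = random.expovariate(RATE_FAILURE)
    t_maint = random.expovariate(RATE_MAINT)
    return min(t_fail, t_maint), "failure" if t_fail < t_maint else "maintenance"

histories = [one_history() for _ in range(100_000)]
p_failure = sum(1 for _, cause in histories if cause == "failure") / len(histories)
# For independent exponentials the theoretical cause probability is
# rf / (rf + rm) = (1/500) / (1/500 + 1/300) = 0.375.
print(round(p_failure, 3))
```

The simulation makes the paper's central difficulty concrete: the database records only the minimum of the latent times and its cause, so recovering the underlying failure distribution requires a competing-risks model rather than a naive fit to the observed times.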

  18. Clinical application of the integrated multicenter discharge summary database.

    Science.gov (United States)

    Takahiro, Suzuki; Shunsuke, Doi; Yutaka, Hatakeyama; Masayuki, Honda; Yasushi, Matsumura; Gen, Shimada; Mitsuhiro, Takasaki; Shusaku, Tsumoto; Hideto, Yokoi; Katsuhiko, Takabayashi

    2015-01-01

    We performed a multi-year project to collect discharge summaries from multiple hospitals, built a large text database from which to construct a common document vector space, and developed various applications. We extracted 243,907 discharge summaries from seven hospitals. There were differences in term structure and in the number of terms between the hospitals; however, the differences by disease were similar. We built the vector space using the TF-IDF method and performed a cross-match analysis of DPC selection among the seven hospitals. About 80% of cases were correctly matched. Using model data from other hospitals reduced the selection rate to around 10%; however, integrated model data from all hospitals restored the selection rate. PMID:26262419
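The TF-IDF weighting used to build the common document vector space can be sketched in a few lines of pure Python; the toy "summaries" below are invented, and real discharge summaries would first need language-appropriate tokenisation:

```python
import math
from collections import Counter

# Toy discharge summaries (invented), each reduced to a bag of terms.
docs = [
    "chest pain admitted cardiac catheterization discharged".split(),
    "pneumonia fever antibiotics discharged".split(),
    "chest pain pneumonia observed discharged".split(),
]

n = len(docs)
# Document frequency: in how many summaries each term appears.
df = Counter(term for doc in docs for term in set(doc))

def tfidf(doc):
    """TF-IDF vector: term frequency times inverse document frequency."""
    tf = Counter(doc)
    return {t: (c / len(doc)) * math.log(n / df[t]) for t, c in tf.items()}

vec = tfidf(docs[0])
# 'discharged' appears in every summary, so its IDF, and thus its weight, is zero.
print(vec["discharged"])  # → 0.0
```

This is why the scheme suits the cross-hospital setting: boilerplate terms shared by all summaries get weight zero, while disease-specific terms dominate the vectors used for the DPC cross-match.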

  19. Multimedia Database Applications: Issues and Concerns for Classroom Teaching

    CERN Document Server

    Yu, Chien

    2011-01-01

    The abundance of multimedia data and information is challenging educators to effectively search, browse, access, use, and store the data for their classroom teaching. However, many educators are still accustomed to teaching and searching for information using conventional methods, which often do not function well with multimedia data. Educators also need to interact with and manage a variety of digital media files efficiently. The purpose of this study is to review current multimedia database applications in teaching and learning, and further discuss some of the issues or concerns that educators may have while incorporating multimedia data into their classrooms. Some strategies and recommendations are also provided so that educators can use multimedia data more effectively in their teaching environments.

  20. Applications of the Cambridge Structural Database in chemical education.

    Science.gov (United States)

    Battle, Gary M; Ferrence, Gregory M; Allen, Frank H

    2010-10-01

    The Cambridge Structural Database (CSD) is a vast and ever growing compendium of accurate three-dimensional structures that has massive chemical diversity across organic and metal-organic compounds. For these reasons, the CSD is finding significant uses in chemical education, and these applications are reviewed. As part of the teaching initiative of the Cambridge Crystallographic Data Centre (CCDC), a teaching subset of more than 500 CSD structures has been created that illustrate key chemical concepts, and a number of teaching modules have been devised that make use of this subset in a teaching environment. All of this material is freely available from the CCDC website, and the subset can be freely viewed and interrogated using WebCSD, an internet application for searching and displaying CSD information content. In some cases, however, the complete CSD System is required for specific educational applications, and some examples of these more extensive teaching modules are also discussed. The educational value of visualizing real three-dimensional structures, and of handling real experimental results, is stressed throughout. PMID:20877495

  1. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  2. APPLICATION OF GEOGRAPHICAL PARAMETER DATABASE TO ESTABLISHMENT OF UNIT POPULATION DATABASE

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    GIS is becoming a good tool for handling geographical, economic, and population data, so more and more information can be obtained from these data. On the other hand, in some cases, whether for a calamity (hurricane, earthquake, flood, drought, etc.) or for decision-making (siting a broadcasting transmitter, building a chemical plant, etc.), the total population in the region influenced by the calamity or project must be evaluated. In this paper, a method is put forward to evaluate the population in such a region. By exploring the correlation between geographical parameters and the distribution of people in the same region through quantitative and qualitative analysis, a unit population database (1 km × 1 km) is established. In this way, the number of people in a given region can be estimated by adding up the population in every grid cell that falls within the region boundary. The geographical parameters are obtained from the topographic database and the DEM database at the 1:250 000 scale. A fundamental geographical parameter database covering county administrative boundaries and the 1 km × 1 km grid is set up, as is a county-level population database. Both the geographical parameter database and the unit population database offer sufficient conditions for quantitative analysis, and they will play an important role in research on data mining (DM), Decision Support Systems (DSS), and regional sustainable development.
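The estimation step described, summing the unit-population grid cells that fall inside a region boundary, can be sketched as follows; the cell populations and the region membership test are invented for illustration, and a real implementation would test each 1 km × 1 km cell against the region polygon:

```python
# 1 km x 1 km unit-population grid: population per cell, indexed by (row, col).
grid = {
    (0, 0): 120, (0, 1): 340, (0, 2): 80,
    (1, 0): 500, (1, 1): 910, (1, 2): 60,
    (2, 0): 70,  (2, 1): 220, (2, 2): 30,
}

def population_in_region(grid, inside):
    """Sum cell populations over every grid cell the region covers."""
    return sum(pop for cell, pop in grid.items() if inside(cell))

# Hypothetical region: the 2 x 2 block of cells in the upper-left corner.
region = lambda cell: cell[0] < 2 and cell[1] < 2
print(population_in_region(grid, region))  # → 1870
```

The accuracy of such an estimate rests entirely on how well the per-cell populations were modeled from the geographical parameters, which is why the paper's correlation analysis precedes the summation step.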

  3. Combining applications and remote databases view in a common SQL distributed genomic database

    OpenAIRE

    Gros, Pierre-Emmanuel; Hérisson, Joan; Ferey, Nicolas; Gherbi, Rachid

    2006-01-01

    Huge volumes of bioinformatics data are emerging from sequencing efforts, gene expression assays, X-ray crystallography of proteins, and many other activities. High-throughput experimental methods produce masses of data, so that the whole of biology has changed from a data-light science into a data-driven science. Currently there are a lot of databases and software tools dealing with these genomic data. In general, each tool and database uses a different type of data in exchange protocols, an...

  4. The Application of the Smart Phone Client in the Library

    Institute of Scientific and Technical Information of China (English)

    楼向英; 施干卫; 高春玲

    2011-01-01

    The existing technology models for mobile libraries are mainly SMS, WAP sites, 2D barcode applications, and smart phone application development. Smart phone application development falls into two types: web applications and desktop applications, the latter being the mobile phone client. This article surveys the use of smart phone clients in libraries at home and abroad and concludes that the client development model centered on iPhone and Android smart phones will gradually become the trend.

  5. TOPCAT's TAP Client

    CERN Document Server

    Taylor, Mark

    2015-01-01

    TAP, the Table Access Protocol, is a Virtual Observatory (VO) protocol for executing queries in remote relational databases using ADQL, an SQL-like query language. It is one of the most powerful components of the VO, but also one of the most complex to use, with an extensive stack of associated standards. We present here recent improvements to the client and GUI for interacting with TAP services from the TOPCAT table analysis tool. As well as managing query submission and result retrieval, the GUI attempts to provide the user with as much help as possible in locating services, understanding service metadata and capabilities, and constructing correct and useful ADQL queries. The implementation and design are, unlike previous versions, both usable and performant even for the largest TAP services.

  6. Client Centred Design

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Nielsen, Janni; Levinsen, Karin

    2008-01-01

    In this paper we argue for the use of Client Centred preparation phases when designing complex systems. Through Client Centred Design, human-computer interaction can extend its focus on end-users to also encompass the client's needs, context and resources.

  7. Application of China's National Forest Continuous Inventory Database

    Science.gov (United States)

    Xie, Xiaokui; Wang, Qingli; Dai, Limin; Su, Dongkai; Wang, Xinchuang; Qi, Guang; Ye, Yujing

    2011-12-01

    The maintenance of a timely, reliable and accurate spatial database on current forest ecosystem conditions and changes is essential to characterize and assess forest resources and support sustainable forest management. Information for such a database can be obtained only through a continuous forest inventory. The National Forest Continuous Inventory (NFCI) is the first level of China's three-tiered inventory system. The NFCI is administered by the State Forestry Administration; data are acquired by five inventory institutions around the country. Several important components of the database include land type, forest classification and age-class/age-group. The NFCI database in China is constructed based on 5-year inventory periods, resulting in some of the data not being timely when reports are issued. To address this problem, a forest growth simulation model has been developed to update the database for years between the periodic inventories. In order to aid in forest plan design and management, a three-dimensional virtual reality system of forest landscapes for selected units in the database (compartment or sub-compartment) has also been developed based on Virtual Reality Modeling Language. In addition, a transparent internet publishing system for a spatial database based on open source WebGIS (UMN Map Server) has been designed and utilized to enhance public understanding and encourage free participation of interested parties in the development, implementation, and planning of sustainable forest management.

  8. The clients perspective: Client and owner

    International Nuclear Information System (INIS)

    After Nine Mile Point's Licensed Operator Requalification Program was declared unsatisfactory by the Nuclear Regulatory Commission, the importance of quality training (the client's need) took on new meaning. During a year of recovery, the client's needs were determined, and training responded. The end result was a strong program, with ownership in the right place.

  9. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    Science.gov (United States)

    Valassi, A.; Bartoldus, R.; Kalkhof, A.; Salnikov, A.; Wache, M.

    2011-12-01

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier "CORAL server" deployed close to the database and a tree of "CORAL server proxies", providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.

  10. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    International Nuclear Information System (INIS)

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier 'CORAL server' deployed close to the database and a tree of 'CORAL server proxies', providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.

  11. Working with HITRAN Database Using Hapi: HITRAN Application Programming Interface

    Science.gov (United States)

    Kochanov, Roman V.; Hill, Christian; Wcislo, Piotr; Gordon, Iouli E.; Rothman, Laurence S.; Wilzewski, Jonas

    2015-06-01

    A HITRAN Application Programming Interface (HAPI) has been developed to give users much more flexibility and power on their local machines. HAPI is a programming interface to the main data-searching capabilities of the new "HITRANonline" web service (http://www.hitran.org). It provides the possibility to query spectroscopic data from the HITRAN database in a flexible manner, using either functions or a query language. Prominent current features of HAPI include: a) downloading line-by-line data from the HITRANonline site to a local machine; b) filtering and processing the data in SQL-like fashion; c) conventional Python structures (lists, tuples, and dictionaries) for representing spectroscopic data; d) the possibility to use a large set of third-party Python libraries to work with the data; e) a Python implementation of the HT lineshape, which can be reduced to a number of conventional line profiles; f) a Python implementation of total internal partition sums (TIPS-2011) for spectra simulations; g) high-resolution spectra calculation accounting for pressure, temperature and optical path length; h) instrumental functions to simulate experimental spectra; i) the possibility to extend HAPI's functionality with custom line profiles, partition sums and instrumental functions. Currently the API is a module written in Python and uses the Numpy library, providing fast array operations. The API is designed to deal with data in multiple formats such as ASCII, CSV, HDF5 and XSAMS. This work has been supported by NASA Aura Science Team Grant NNX14AI55G and NASA Planetary Atmospheres Grant NNX13AI59G. L.S. Rothman et al., JQSRT 130 (2013) 4-50. N.H. Ngo et al., JQSRT 129 (2013) 89-100. A.L. Laraia et al., Icarus 215(1) (2011) 391-400.

  12. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    CERN Document Server

    Valassi, A; Kalkhof, A; Salnikov, A; Wache, M

    2011-01-01

    The CORAL software is widely used at CERN for accessing the data stored by the LHC experiments using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several backends and deployment models, including local access to SQLite files, direct client access to Oracle and MySQL servers, and read-only access to Oracle through the FroNTier web server and cache. Two new components have recently been added to CORAL to implement a model involving a middle tier "CORAL server" deployed close to the database and a tree of "CORAL server proxy" instances, with data caching and multiplexing functionalities, deployed close to the client. The new components are meant to provide advantages for read-only and read-write data access, in both offline and online use cases, in the areas of scalability and performance (multiplexing for several incoming connections, optional data caching) and security (authentication via proxy certificates). A first implementation of the two new c...

  13. A Comparative Study of Relational and Non-Relational Database Models in a Web- Based Application

    OpenAIRE

    Cornelia Gyorödi; Robert Gyorödi; Roxana Sotoc

    2015-01-01

    The purpose of this paper is to present a comparative study between relational and non-relational database models in a web-based application, by executing various operations on both relational and on non-relational databases thus highlighting the results obtained during performance comparison tests. The study was based on the implementation of a web-based application for population records. For the non-relational database, we used MongoDB and for the relational database, we used MSSQL 2014. W...

  14. A Two-folded Impact Analysis of Schema Changes on Database Applications

    Institute of Scientific and Technical Information of China (English)

    Spyridon K.Gardikiotis; Nicos Malevris

    2009-01-01

    Database applications are becoming increasingly popular, mainly due to the advanced data management facilities that the underlying database management system offers compared against traditional legacy software applications. The interaction, however, of such applications with the database system introduces a number of issues, among which, this paper addresses the impact analysis of the changes performed at the database schema level. Our motivation is to provide the software engineers of database applications with automated methods that facilitate major maintenance tasks, such as source code corrections and regression testing, which should be triggered by the occurrence of such changes. The presented impact analysis is thus two-folded: the impact is analysed in terms of both the affected source code statements and the affected test suites concerning the testing of these applications. To achieve the former objective, a program slicing technique is employed, which is based on an extended version of the program dependency graph. The latter objective requires the analysis of test suites generated for database applications, which is accomplished by employing testing techniques tailored for this type of applications. Utilising both the slicing and the testing techniques enhances program comprehension of database applications, while also supporting the development of a number of practical metrics regarding their maintainability against schema changes. To evaluate the feasibility and effectiveness of the presented techniques and metrics, a software tool, called DATA, has been implemented. The experimental results from its usage on the TPC-C case study are reported and analysed.
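A crude first pass at the statement-level impact analysis described, finding which source statements a schema change touches, can be sketched as a scan of embedded SQL for the affected column; the DATA tool itself uses program slicing over an extended dependency graph, and the statements, line numbers, and schema below are invented:

```python
import re

# Embedded SQL statements extracted from a database application,
# keyed by source line number (all invented for illustration).
statements = {
    12: "SELECT cust_name, balance FROM account WHERE id = ?",
    40: "UPDATE account SET balance = ? WHERE id = ?",
    77: "INSERT INTO orders (item, qty) VALUES (?, ?)",
}

def impacted(statements, table, column):
    """Source lines whose SQL references the column of the table being changed."""
    pat = re.compile(rf"\b{re.escape(column)}\b", re.IGNORECASE)
    return sorted(line for line, sql in statements.items()
                  if table.lower() in sql.lower() and pat.search(sql))

# Schema change: rename account.balance. Lines 12 and 40 need correction,
# and any test suites exercising them need re-running.
print(impacted(statements, "account", "balance"))  # → [12, 40]
```

A text scan like this over-approximates badly (it misses dynamically built SQL and flags coincidental name matches), which is precisely why the paper resorts to dependency-graph slicing to trace the impact through the surrounding program statements as well.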

  15. Databases for INDUS-1 and INDUS-2

    International Nuclear Information System (INIS)

    The databases for Indus are relational databases designed to store various categories of data related to the accelerator. The data archiving and retrieving system in Indus is based on a client/server model. A general-purpose commercial database is used to store parameters and equipment data for the whole machine. The database manages configuration, on-line and historical databases. On-line and off-line applications distributed across several systems can store and retrieve the data from the database over the network. This paper describes the structure of the databases for Indus-1 and Indus-2 and their integration within the software architecture. The data analysis, design, resulting data-schema and implementation issues are discussed. (author)

  16. Application of Simulated Annealing to Clustering Tuples in Databases.

    Science.gov (United States)

    Bell, D. A.; And Others

    1990-01-01

    Investigates the value of applying principles derived from simulated annealing to clustering tuples in database design, and compares this technique with a graph-collapsing clustering method. It is concluded that, while the new method does give superior results, the expense involved in algorithm run time is prohibitive. (24 references) (CLB)
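A minimal version of the technique, annealing a partition of tuples so that frequently co-accessed tuples land in the same cluster, might look like the sketch below; the workload pairs, cost function, and cooling schedule are invented for illustration:

```python
import math
import random

random.seed(42)

# Pairs of tuples frequently accessed together (an invented workload).
affinity = {(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)}

def cost(assign):
    """Number of co-accessed pairs split across different clusters."""
    return sum(1 for a, b in affinity if assign[a] != assign[b])

def anneal(n_tuples=6, n_clusters=2, temp=2.0, cooling=0.995, steps=4000):
    assign = [random.randrange(n_clusters) for _ in range(n_tuples)]
    cur = cost(assign)
    best, best_cost = assign[:], cur
    for _ in range(steps):
        # Candidate move: reassign one random tuple to a random cluster.
        i, new_c = random.randrange(n_tuples), random.randrange(n_clusters)
        old_c = assign[i]
        assign[i] = new_c
        cand = cost(assign)
        # Metropolis rule: accept improvements, and occasionally accept
        # worse moves, with probability decaying as the temperature cools.
        if cand <= cur or random.random() < math.exp((cur - cand) / temp):
            cur = cand
            if cur < best_cost:
                best, best_cost = assign[:], cur
        else:
            assign[i] = old_c  # revert the rejected move
        temp *= cooling
    return best, best_cost

clusters, split_pairs = anneal()
print(split_pairs)  # co-access pairs split by the final clustering
```

A realistic tuple-clustering cost would also penalise oversized clusters (pages have fixed capacity), so the trivial everything-in-one-cluster solution would not win; the run-time expense the study reports comes from evaluating such a cost over many thousands of candidate moves.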

  17. Klaim-DB: A Modeling Language for Distributed Database Applications

    DEFF Research Database (Denmark)

    Wu, Xi; Li, Ximeng; Lluch Lafuente, Alberto;

    2015-01-01

    manipulation of structured data, with integrity and atomicity considerations. We present the formal semantics of KlaimDB and illustrate the use of the language in a scenario where the sales from different branches of a chain of department stores are aggregated from their local databases. It can be seen that...

  18. The Application of an Anatomical Database for Fetal Congenital Heart Disease

    Institute of Scientific and Technical Information of China (English)

    Li Yang; Qiu-Yan Pei; Yun-Tao Li; Zhen-Juan Yang

    2015-01-01

    Background: Fetal congenital heart anomalies are the most common congenital anomalies in live births. Fetal echocardiography (FECG) is the only prenatal diagnostic approach used to detect fetal congenital heart disease (CHD). FECG is not widely used, and the antenatal diagnosis rate of CHD varies considerably. Thus, mastering the anatomical characteristics of different kinds of CHD is critical for ultrasound physicians to improve FECG technology. The aim of this study is to investigate the applications of a fetal CHD anatomic database in an FECG teaching and training program. Methods: We evaluated 60 transverse-section databases covering 27 types of fetal CHD built in the Prenatal Diagnosis Center of Peking University People's Hospital. Each original database contained 400-700 cross-sectional digital images with a resolution of 3744 × 5616 pixels. We imported the databases into Amira 5.3.1 (Australia Visage Imaging Company, Australia) three-dimensional (3D) software. The database functions use a series of 3D software visual operations. The features of the fetal CHD anatomical database were analyzed to determine its applications in FECG continuing education and training. Results: The databases were rebuilt using the 3D software. The original and rebuilt databases can be displayed dynamically, continuously, and synchronously, and can be rotated at arbitrary angles. The sections from the dynamic displays and rotation angles are consistent with the sections in FECG. The databases successfully reproduced the anatomic structures and spatial relationships of different fetal CHDs. We established a fetal CHD anatomy training database and a standardized training database for FECG. Ultrasound physicians and students can learn the anatomical features of fetal CHD and FECG through either centralized training or distance education. Conclusions: The database of fetal CHD successfully reproduced the anatomic structures and spatial relationships of different kinds of fetal CHD. This database can be

  19. HTML thin client and transactions

    CERN Document Server

    Touchette, J F

    1999-01-01

    When writing applications for thin clients such as Web browsers, you face several challenges that do not exist with fat-client applications written in Visual Basic, Delphi, or Java. For one thing, your development tools do not include facilities for automatically building reliable, nonrepeatable transactions into applications. Consequently, you must devise your own techniques to prevent users from transmitting duplicate transactions. The author explains how to implement reliable, nonrepeatable transactions using a technique that is applicable to any Java Server Development Kit based architecture. Although the examples presented are based on the IBM WebSphere 2.1 Application Server, they do not make use of any IBM WebSphere extensions. In short, the concepts presented here can be implemented in Perl CGI and ASP scripts, and the sample code has been tested with JDK 1.1.6 and 1.2. (0 refs).
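    The duplicate-submission problem described above is commonly solved with a one-time (synchronizer) token embedded in the rendered form. The record does not include Touchette's actual code, so the following is a minimal Python sketch of the idea; the `TransactionGuard` class and its method names are illustrative, not from the paper:

```python
import secrets

class TransactionGuard:
    """One-time tokens that make a form submission non-repeatable."""

    def __init__(self):
        self._tokens = set()

    def issue(self):
        # Embed this token as a hidden field when rendering the form.
        token = secrets.token_hex(16)
        self._tokens.add(token)
        return token

    def submit(self, token, action):
        # The first submission consumes the token; replays are rejected.
        if token not in self._tokens:
            return "duplicate-rejected"
        self._tokens.discard(token)
        return action()

guard = TransactionGuard()
t = guard.issue()
print(guard.submit(t, lambda: "order-placed"))   # first POST succeeds
print(guard.submit(t, lambda: "order-placed"))   # replayed POST is rejected
```

    In a real thin-client deployment the token set would live in the server-side session store, since the browser itself holds no reliable state.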

  20. Applications and findings of an occupational exposure database for synthetic vitreous fibers.

    Science.gov (United States)

    Marchant, Gary; Bullock, Christopher; Carter, Charles; Connelly, Robert; Crane, Angus; Fayerweather, William; Johnson, Kathleen; Reynolds, Janis

    2009-03-01

    Occupational exposure databases are being used increasingly to characterize worker exposures in industries involving a variety of exposure scenarios. The glass and rock/slag segments of the synthetic vitreous fiber industry in the United States have developed a large (>14,000 samples) exposure database that can be used to estimate worker exposures based on industry sector, fiber type, product type, and job function. This article describes the development of this database as part of an industry-Occupational Safety and Health Administration collaborative Health and Safety Partnership Program and summarizes the findings and potential applications of the database. PMID:19116861

  1. Using Java Objects and Services for Database Business Applications

    OpenAIRE

    Dănuţ - Octavian Simion

    2013-01-01

    The paper presents the facilities and advantages of using Enterprise Java Objects in business applications and emphasizes aspects like simplicity, application portability, component reusability, the ability to build complex applications, separation of business logic from presentation logic, easy development of Web services, deployment in many operating environments, distributed deployment, application interoperability, integration with non-Java systems, and development tools. Enterprise JavaBeans - EJB ...

  2. Innovative Evaluation System – IESM: An Architecture for the Database Management System for Mobile Application

    OpenAIRE

    Joan, Lu; Sundaram, Aswin; Arumugam, Vidyapriyadarshini

    2011-01-01

    As mobile applications have been facing rapid development in recent years, especially in academic environments such as the student response systems [1-8] used in universities and other educational institutions, no effective and scalable Database Management System has yet been reported to support fast and reliable data storage and retrieval. This paper presents a Database Management Architecture for an Innovative Evaluation System based on Mobile Learning Applications. The need...

  3. Web Service Clients on Mobile Android Devices: A Study on Architectural Alternatives and Client Performance

    OpenAIRE

    Knutsen, Johannes

    2009-01-01

    This paper studies Android, a new open source software stack initiated by Google, and the possibilities of developing a mobile client for MPower, a service-oriented architecture platform based upon SOAP messaging. The study focuses on the architectural alternatives, their impacts on the mobile client application, Android's performance on SOAP messaging, and how Web services' design can be optimized to give well-performing Android clients. The results from this study show how different arch...

  4. Survey of standards applicable to a database management system

    Science.gov (United States)

    Urena, J. L.

    1981-01-01

    Industry, government, and NASA standards, and the status of standardization activities of standards-setting organizations applicable to the design, implementation and operation of a database management system for space-related applications are identified. The applicability of the standards to a general-purpose, multimission database management system is addressed.

  5. Databases for nuclear applications available at the NEA Data Bank through the internet

    International Nuclear Information System (INIS)

    The NEA Data Bank acts as the Member countries' centre of reference for computer programs, basic scientific nuclear data and chemical thermodynamic data. It does this by keeping its collections up to date and by providing scientists with a reliable and quick retrieval service for these programs and data. The information stored at the Data Bank is handled according to agreed quality assurance methods. In addition, most of the data sets and programs have been validated in international benchmark exercises. The NEA Data Bank's longstanding experience in database development and maintenance allows it to handle large volumes of information efficiently. Through extensive use of electronic networks, such as the Internet, it is able to provide a rapid service to its clients. These services are constantly being developed in response to customer needs. (orig.)

  6. Explorative Study of SQL Injection Attacks and Mechanisms to Secure Web Application Database- A Review

    Directory of Open Access Journals (Sweden)

    Chandershekhar Sharma

    2016-03-01

    Full Text Available The increasing innovations in web development technologies drive the augmentation of user-friendly web applications. With activities like online banking, shopping, booking and trading, these applications have become an integral part of everyone's daily routine. The profit-driven online business industry has also acknowledged this growth, because a thriving application provides a global platform to an organization. The database of a web application is its most valuable asset, storing sensitive information of individuals and organizations. SQLIA is the topmost threat, as it targets the database of a web application. It allows the attacker to gain control over the application, resulting in financial fraud, leaks of confidential data and even deletion of the database. The exhaustive survey of SQL injection attacks presented in this paper is based on empirical analysis. This comprises deploying the injection mechanism for each attack and its respective types on various websites, dummy databases and web applications. The paramount security mechanisms for web application databases are also discussed to mitigate SQL injection attacks.
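    The injection mechanism such surveys deploy can be illustrated with a minimal sketch. The snippet below uses Python with an in-memory SQLite database; the table and the attack payload are invented for illustration. It contrasts string concatenation, which is vulnerable, with placeholder binding, the standard mitigation:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def find_user_unsafe(name):
    # String concatenation: attacker-controlled input becomes SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(name):
    # Placeholder binding: input is treated strictly as data.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # returns every row: injection succeeded
print(find_user_safe(payload))    # returns []: injection neutralized
```

    The tautology payload `' OR '1'='1` turns the unsafe query's WHERE clause into one that is always true, which is exactly the class of attack the paper's mitigation mechanisms target.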

  7. Research and Application of Data Integrity Constraint Technology Based on Client/Server

    Institute of Scientific and Technical Information of China (English)

    鲁广英

    2010-01-01

    This paper discusses data integrity constraints based on the Client/Server architecture and the integrity-constraint mechanism that must be established, examining what data integrity constraints are and how they are implemented. Drawing on years of experience in developing information management systems based on the Client/Server architecture, and taking SQL Server and VB as the platform, it introduces methods for implementing data integrity constraints in a management information system.

  8. DEVELOPING FLEXIBLE APPLICATIONS WITH XML AND DATABASE INTEGRATION

    Directory of Open Access Journals (Sweden)

    Hale AS

    2004-04-01

    Full Text Available In recent years the most popular subject in the Information Systems area has been Enterprise Application Integration (EAI). It can be defined as the process of forming a standard connection between the different systems of an organization's information system environment. Mergers and acquisitions of corporations are the major reasons for the popularity of Enterprise Application Integration. The main purpose is to solve application integration problems while the similar systems of such corporations continue working together for some time. With the help of XML technology, it is possible to find solutions to the problems of application integration either within a corporation or between corporations.
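    As a concrete illustration of XML-based application integration, the sketch below parses an XML document exported by one hypothetical system and maps it onto the flat record a second system might expect. The schema, element names, and field names are invented for illustration:

```python
import xml.etree.ElementTree as ET

# An order as exported by a hypothetical source system.
doc = """<order id="42">
  <customer>ACME</customer>
  <item sku="A1" qty="3"/>
  <item sku="B2" qty="1"/>
</order>"""

def to_target_record(xml_text):
    # Map the source schema onto the flat record the target system expects.
    root = ET.fromstring(xml_text)
    return {
        "order_id": int(root.get("id")),
        "customer": root.findtext("customer"),
        "lines": [(i.get("sku"), int(i.get("qty"))) for i in root.findall("item")],
    }

print(to_target_record(doc))
```

    Because both sides agree only on the XML document, neither system needs to know the other's internal storage format, which is the decoupling EAI aims for.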

  9. Enterprise Android programming Android database applications for the enterprise

    CERN Document Server

    Mednieks, Zigurd; Dornin, Laird; Pan, Zane

    2013-01-01

    The definitive guide to building data-driven Android applications for enterprise systems Android devices represent a rapidly growing share of the mobile device market. With the release of Android 4, they are moving beyond consumer applications into corporate/enterprise use. Developers who want to start building data-driven Android applications that integrate with enterprise systems will learn how with this book. In the tradition of Wrox Professional guides, it thoroughly covers sharing and displaying data, transmitting data to enterprise applications, and much more. Shows Android developers w

  10. Comparison of the Frontier Distributed Database Caching System with NoSQL Databases

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Non-relational "NoSQL" databases such as Cassandra and CouchDB are best known for their ability to scale to large numbers of clients spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects, is based on traditional SQL databases but also has the same high scalability and wide-area distributability for an important subset of applications. This paper compares the architectures, behavior, performance, and maintainability of the two different approaches and identifies the criteria for choosing which approach to prefer over the other.
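    Frontier's core idea, caching the results of read-only SQL queries keyed by the query itself, can be modeled compactly. The sketch below is an illustrative Python version with an invented `QueryCache` class and a simple time-to-live policy; it is not Frontier's actual implementation:

```python
import sqlite3
import time

class QueryCache:
    """Cache read-only SQL results keyed by query text, with a time-to-live."""

    def __init__(self, conn, ttl=300.0):
        self.conn, self.ttl = conn, ttl
        self._cache = {}
        self.misses = 0

    def query(self, sql, params=()):
        key = (sql, params)
        hit = self._cache.get(key)
        if hit is not None and time.monotonic() - hit[0] < self.ttl:
            return hit[1]                       # served from cache
        self.misses += 1                        # cache miss: go to the database
        rows = self.conn.execute(sql, params).fetchall()
        self._cache[key] = (time.monotonic(), rows)
        return rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE conditions (run INTEGER, value REAL)")
conn.execute("INSERT INTO conditions VALUES (1, 3.14)")

cache = QueryCache(conn)
cache.query("SELECT value FROM conditions WHERE run = ?", (1,))
cache.query("SELECT value FROM conditions WHERE run = ?", (1,))
print(cache.misses)  # 1: the second identical query never reached the database
```

    Scaling this pattern out to many distributed caches is what gives the SQL-backed approach the wide-area distributability the abstract compares against NoSQL stores.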

  11. Perceived Counselor Characteristics, Client Expectations, and Client Satisfaction with Counseling.

    Science.gov (United States)

    Heppner, P. Paul; Heesacker, Martin

    1983-01-01

    Examined interpersonal influence process within counseling including relationship between perceived counselor expertness, attractiveness, and trustworthiness and client satisfaction; between client expectations on perceived counselor expertness, attractiveness, trustworthiness, and client satisfaction; and effects of actual counselor experience…

  12. Explorative Study of SQL Injection Attacks and Mechanisms to Secure Web Application Database- A Review

    OpenAIRE

    Chandershekhar Sharma; Jain, Dr. S. C.; Dr. Arvind K Sharma

    2016-01-01

    The increasing innovations in web development technologies direct the augmentation of user friendly web applications. With activities like - online banking, shopping, booking, trading etc. these applications have become an integral part of everyone’s daily routine. The profit driven online business industry has also acknowledged this growth because a thriving application provides the global platform to an organization. Database of web application is the most valuable asset which stores sensit...

  13. Client Centred Design

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Nielsen, Janni; Tweddell Levinsen, Karin

    2004-01-01

    Abstract In this paper the Human Computer Interaction (HCI) Research Group reports on the pre-phase of an e-learning project, which was carried out in collaboration with the client. The project involved an initial exploration of the problem spaces, possibilities and challenges for an online...... on resources and competencies already existing in the client organisation. We asked: What is it we know? Uncovering the prerequisites and background of and with the client allowed us concurrently to identify: What do we not know? Working iteratively in collaboration with the client, allowed us to...... build on existing resources and networks, suggesting a design, which also included end-users community needs and work-context. Our argument is that if a preparation phase both seeks to confirm knowledge and contemplate what is not yet known, giving attention to the context and need of the client (i...

  14. Effective use of Java Data objects in developing database applications. Advantages and disadvantages

    OpenAIRE

    Zilidis, Paschalis.

    2004-01-01

    Approved for public release; distribution is unlimited Currently, the most common approach in developing database applications is to use an object-oriented language for the frontend module and a relational database for the backend datastore. The major disadvantage of this approach is the well-known "impedance mismatch" in which some form of mapping is required to connect the objects in the frontend and the relational tuples in the backend. Java Data Objects (JDO) technology is recently pro...
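    The "impedance mismatch" mentioned above is the hand-written mapping layer between frontend objects and backend relational tuples that JDO aims to eliminate. A minimal Python sketch of such a mapper (class, table, and column names invented) shows the boilerplate involved:

```python
import sqlite3

class Employee:
    """A plain frontend object with no knowledge of the datastore."""
    def __init__(self, emp_id, name, salary):
        self.emp_id, self.name, self.salary = emp_id, name, salary

class EmployeeMapper:
    """Hand-written mapping between Employee objects and relational tuples."""

    def __init__(self, conn):
        self.conn = conn
        conn.execute("CREATE TABLE IF NOT EXISTS employee "
                     "(emp_id INTEGER PRIMARY KEY, name TEXT, salary REAL)")

    def save(self, e):
        # Flatten the object into a tuple for the relational backend.
        self.conn.execute("INSERT OR REPLACE INTO employee VALUES (?, ?, ?)",
                          (e.emp_id, e.name, e.salary))

    def load(self, emp_id):
        # Reconstitute an object from a relational tuple.
        row = self.conn.execute(
            "SELECT emp_id, name, salary FROM employee WHERE emp_id = ?",
            (emp_id,)).fetchone()
        return Employee(*row) if row else None

conn = sqlite3.connect(":memory:")
mapper = EmployeeMapper(conn)
mapper.save(Employee(7, "Ada", 90000.0))
print(mapper.load(7).name)  # Ada
```

    Every persistent class needs a mapper like this; JDO's promise is to generate this plumbing transparently so objects persist without explicit SQL.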

  15. Discovering Knowledge from AIS Database for Application in VTS

    Science.gov (United States)

    Tsou, Ming-Cheng

    The widespread use of the Automatic Identification System (AIS) has had a significant impact on maritime technology. AIS enables the Vessel Traffic Service (VTS) not only to offer commonly known functions such as identification, tracking and monitoring of vessels, but also to provide rich real-time information that is useful for marine traffic investigation, statistical analysis and theoretical research. However, due to the rapid accumulation of AIS observation data, the VTS platform is often unable to absorb and analyze it quickly and effectively. Traditional observation and analysis methods are becoming less suitable for the modern AIS generation of VTS. In view of this, we applied the same data mining technique used for business intelligence discovery (in Customer Relation Management (CRM) business marketing) to the analysis of AIS observation data. This recasts the marine traffic problem as a business-marketing problem and integrates technologies such as Geographic Information Systems (GIS), database management systems, data warehousing and data mining to facilitate the discovery of hidden and valuable information in a huge amount of observation data. Consequently, this provides the marine traffic managers with a useful strategic planning resource.

  16. Application of the Trend Filtering Algorithm on the MACHO Database

    CERN Document Server

    Szulagyi, J; Welch, D L

    2009-01-01

    Due to the strong effect of systematics/trends in variable star observations, we employ the Trend Filtering Algorithm (TFA) on a subset of the MACHO database and search for variable stars. TFA has been applied successfully in planetary transit searches, where weak, short-lasting periodic dimmings are sought in the presence of noise and various systematics (due to, e.g., imperfect flat fielding, crowding, etc). These latter effects introduce colored noise in the photometric time series that can lead to a complete miss of the signal. By using a large number of available photometric time series of a given field, TFA utilizes the fact that the same types of systematics appear in several/many time series of the same field. As a result, we fit each target time series by a (least-square-sense) optimum linear combination of templates and frequency-analyze the residuals. Once a signal is found, we reconstruct the signal by employing the full model, including the signal, systematics and noise. We apply TFA on the brigh...

  17. DEVELOPMENT OF AN APPLICATION WITH THE PURPOSE OF MAINTAINING A DATABASE

    Directory of Open Access Journals (Sweden)

    Wilson Fadlo Curi

    2011-04-01

    Full Text Available Nowadays, sustainable development is the great paradigm of human development. Because of that, new methodologies for the planning and management of systems, especially those for water resources, are being developed; these forms of evaluation are no longer restricted to mere economic evaluation but are also subjected to social and environmental sustainability evaluation. The use of databases is essential for the manipulation of the most diverse kinds of data and information, which can be used for storing historical data and other information necessary for future use. Therefore, this article focuses mainly on presenting an application developed to manipulate tables in a database, to allow and facilitate the inclusion, elimination and renaming of tables and table fields, in order to substitute some SQL commands available in the various available database programs. This application will thus add value to the decision support system being developed by GOTA (Group of Water Total Optimization), which in many cases needs to make changes in its database to obtain greater flexibility in data manipulation, such as registers of reservoirs, irrigated perimeters, meteorological stations, gauging stations, institutions, etc. The application allows intelligent and fast manipulation of tables in a database; the present version runs on the PostgreSQL database and was developed on the Java platform, which permits its installation on several types of operating systems and many kinds of computers.
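    The kind of table-maintenance helpers the article describes, wrapping DDL so users need not type SQL commands directly, can be sketched as below. This uses Python and SQLite rather than the article's Java/PostgreSQL stack, and the helper names are illustrative. Note that identifiers cannot be bound as placeholders, so these helpers assume trusted, application-supplied names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reservoir (name TEXT)")

def rename_table(conn, old, new):
    # Wraps the DDL so the application user never types SQL directly.
    # Identifiers are quoted but assumed trusted (placeholders can't bind them).
    conn.execute('ALTER TABLE "%s" RENAME TO "%s"' % (old, new))

def add_field(conn, table, field, sqltype):
    # Adds a new column to an existing table.
    conn.execute('ALTER TABLE "%s" ADD COLUMN "%s" %s' % (table, field, sqltype))

def list_tables(conn):
    return [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]

rename_table(conn, "reservoir", "reservoirs")
add_field(conn, "reservoirs", "capacity_m3", "REAL")
print(list_tables(conn))  # ['reservoirs']
```

    Exposing only such helpers in a GUI is what lets the decision support system's users restructure registers (reservoirs, stations, etc.) without knowing SQL.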

  18. PeDaB - the personal dosimetry database at the research centre Juelich

    International Nuclear Information System (INIS)

    In May 1997, the mainframe-based registration, processing and archiving of personal monitoring data at the research centre Juelich (FZJ) was transferred to a client-server system. A complex database application was developed. The client user interface is a Windows-based Microsoft ACCESS application which is connected to an ORACLE database via ODBC and TCP/IP. The conversion covered all areas of personal dosimetry, including internal and external exposure as well as administrative areas. A higher degree of flexibility, data security and integrity was achieved. (orig.)

  19. El cliente interno

    OpenAIRE

    Juan J. López Sobejano

    2007-01-01

    For some years now, theoretical references to concepts such as "relationship marketing", "internal marketing" and "internal client", all of them interrelated, have been increasing. The use of these new conceptual constructs reflects a new approach to company-client relations that at times does not materialize in day-to-day practice. In particular, the expression "internal client" shows nothing less than a new way of understanding the production process, mainly in companies ...

  20. Data-based considerations for electronic family health history applications.

    Science.gov (United States)

    Peace, Jane; Valdez, Rupa Sheth; Lutz, Kristin F

    2012-01-01

    Family health history contains important information about the genetic and environmental factors that contribute to patterns of health and illness in families. Applications for collecting, managing, and analyzing family health history could be improved if their design were informed by an understanding of how consumers think about and report family health history. This article presents a descriptive analysis of themes from family health history interviews that have implications for development, selection, and use of family health history tools. Important themes included ways in which family is defined, including nonbiological family members and pets; ideas about health and disease, including degree of exposure and individual perceptions; and barriers to reporting family health history, including large biological families and uncertainty. Some themes identified (eg, uncertainty) have been recognized previously and continue to be important considerations. Other themes identified, such as perceptions about severity of illness or conditions and causal relationships, are newly recognized and may have implications for nurses and other providers designing, selecting, and using family health history applications. PMID:21915045

  1. Campaign Consultants - Client Payments

    Data.gov (United States)

    City of San Francisco — Campaign Consultants are required to report "economic consideration" promised by or received from clients in exchange for campaign consulting services during the...

  2. Helping clients build credit

    OpenAIRE

    Vikki Frank

    2007-01-01

    Until now people who repaid loans from community groups had not been on credit bureaus’ radar. Now Credit Builders Alliance is partnering with Experian to help clients of community lenders build strong credit histories.

  3. Structure design and establishment of database application system for alien species in Shandong Province, China

    Institute of Scientific and Technical Information of China (English)

    GUO Wei-hua; LIU Heng; DU Ning; ZHANG Xin-shi; WANG Ren-qing

    2007-01-01

    This paper presents a case study on the structure design and establishment of a database application system for alien species in Shandong Province, integrating Geographic Information System, computer network, and database technology into the research of alien species. The modules of the alien species database, including classified data input, statistics and analysis, species pictures and distribution maps, and out date input, were approached with Visual Studio.net 2003 and Microsoft SQL Server 2000. The alien species information contains information on classification, species distinguishing characteristics, biological characteristics, original area, distribution area, the entering fashion and route, invasion time, invasion reason, interaction with endemic species, growth state, danger state and spatial information, i.e. distribution maps. On the above basis, several models including application, checking, modifying, printing, adding and returning models were developed. Furthermore, through the establishment of index tables and index maps, we can also spatially query data like pictures, text and GIS map data. This research established a technological platform for sharing information about scientific resources on alien species in Shandong Province, offering a basis for the dynamic inquiry of alien species, warning technology for prevention and a fast reaction system. The database application system possesses the principles of good practicability, a friendly user interface and convenient usage. It can supply full and accurate information inquiry services on alien species for users and provides functions for dynamically managing the database for the administrator.

  4. Efficient Mobile Client Caching Supporting Transaction Semantics

    Directory of Open Access Journals (Sweden)

    IlYoung Chung

    2000-05-01

    Full Text Available In mobile client-server database systems, caching of frequently accessed data is an important technique that reduces contention on the narrow-bandwidth wireless channel. As the server in mobile environments may not have any information about the state of its clients' caches (a stateless server), using a broadcasting approach to transmit updated data lists to numerous concurrent mobile clients is attractive. In this paper, a caching policy is proposed to maintain cache consistency for mobile computers. The proposed protocol adopts asynchronous (non-periodic) broadcasting as the cache invalidation scheme and supports transaction semantics in mobile environments. With the asynchronous broadcasting approach, the proposed protocol can improve throughput by reducing the abortion of transactions with low communication costs. We study the performance of the protocol by means of simulation experiments.
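    The invalidation-broadcast scheme can be modeled compactly. The sketch below (Python; the `MobileClientCache` class and its method names are invented for illustration) shows a client cache that serves reads locally until a server broadcast invalidates its entries:

```python
class MobileClientCache:
    """Client-side cache that applies asynchronous invalidation broadcasts."""

    def __init__(self):
        self._cache = {}

    def read(self, key, fetch):
        # Serve from cache when possible; otherwise fetch over the wireless link.
        if key not in self._cache:
            self._cache[key] = fetch(key)
        return self._cache[key]

    def on_broadcast(self, invalidated_keys):
        # The stateless server broadcasts updated-data lists to all clients;
        # each client simply drops the affected entries.
        for key in invalidated_keys:
            self._cache.pop(key, None)

server_data = {"x": 1}
client = MobileClientCache()
client.read("x", server_data.get)         # fetched once, cached as 1
server_data["x"] = 2
client.read("x", server_data.get)         # still 1: stale until invalidated
client.on_broadcast(["x"])
print(client.read("x", server_data.get))  # 2: refetched after invalidation
```

    Because the server keeps no per-client state, one broadcast serves any number of concurrent clients, which is what makes the approach attractive on a narrow shared channel.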

  5. Application of MS Client in Laboratories without CD-ROM Drives

    Institute of Scientific and Technical Information of China (English)

    刘维学

    2004-01-01

    MS Client (Microsoft Network Client V3.0 for MS-DOS) is installed on diskful DOS workstations so that they can connect to a terminal server and share the terminal server's resources, solving the problem of installing an operating system on computers without CD-ROM drives.

  6. Simple Application of a Web Database

    Institute of Scientific and Technical Information of China (English)

    吕品; 张绍成; 岳承君

    2001-01-01

    This paper mainly discusses the connection and application of a simple Web database based on MS Access under the Windows NT platform.

  7. The National Landslide Database and GIS for Great Britain: construction, development, data acquisition, application and communication

    Science.gov (United States)

    Pennington, Catherine; Dashwood, Claire; Freeborough, Katy

    2014-05-01

    The National Landslide Database has been developed by the British Geological Survey (BGS) and is the focus for national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with the future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain with over 16,500 records of landslide events, each documented as fully as possible. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture, and crowd-sourcing information through social media and other online resources. This information is invaluable for the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domain map currently under development rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures and an understanding of causative factors and their spatial distribution, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region. Communication of landslide data through the Natural Hazard Partnership (NHP) contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP and data collected for the National Landslide Database is used widely for the creation of these assessments. The National Landslide Database is freely available via an online GIS and is used by a

  8. Application of the international union of radioecologists soil-to-plant database to Canadian settings

    International Nuclear Information System (INIS)

    Over ten years the International Union of Radioecologists (IUR) has amassed a large database on the soil-to-plant transfer of elements. Many of the studies were established using standardized experimental protocols, so that the opportunity for intercomparison is unparalleled among environmental transfer parameter databases. Although most contributors are from Europe, the database includes contributions from Canada and other non-European nations. For Cs, Sr, Co, Pu and Np there are 2035, 920, 782, 679 and 465 records, and there are also some records for Ag, Am, Ce, Cm, I, La, Mn, Ni, Pb, Po, Ra, Ru, Sb, Tc, Th, U and Zn. An earlier analysis of the database was used to support the statistical distributions of the soil-to-plant transfer parameters applied for the assessment of Canada's nuclear fuel waste concept. The present project re-examines the full database and broadens the application. In many cases, the crops, soils and environmental conditions investigated by the IUR are very applicable to Canadian settings. For several elements, there are sufficient data to develop useful relationships between soil-to-plant transfer and the properties of the soil, although the variation of the transfer values remains large. The resulting estimated parameter values will be useful in many nuclear environmental safety assessments. (author)

  9. Taming the Tiger: How to Cope with Real Database Products in Transactional Federations for Internet Applications

    OpenAIRE

    Schenkel, Ralf; Weikum, Gerhard

    2000-01-01

    Data consistency in transactional federations is a key requirement of advanced E-service applications on the Internet, such as electronic auctions or real-estate purchase. Federated concurrency control needs to be aware of the fact that virtually all commercial database products support sub-serializability isolation levels, such as Snapshot Isolation, and that applications make indeed use of such local options. This paper discusses the problems that arise with regard t...

  10. Professional iPhone and iPad Database Application Programming

    CERN Document Server

    Alessi, Patrick

    2010-01-01

    A much-needed resource on database development and enterprise integration for the iPhone. An enormous demand exists for getting iPhone applications into the enterprise and this book guides you through all the necessary steps for integrating an iPhone app within an existing enterprise. Experienced iPhone developers will learn how to take advantage of the built-in capabilities of the iPhone to confidently implement a data-driven application for the iPhone.: Shows you how to integrate iPhone applications into enterprise class systems; Introduces development of data-driven applications on the iPho

  11. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    Science.gov (United States)

    Zhou, Hui

    Carrying out an office and departmental target responsibility system is an inevitable outcome of higher education reform, in which statistical processing of student information is an important part of student performance review. On the basis of an analysis of student evaluation, a student information management database application system is designed in this paper using relational database management system software. In order to implement the functions of student information management, the functional requirements, overall structure, data sheets and fields, data sheet associations and software codes are designed in detail.

  12. Report on the IAEA technical meeting on database of evaluated cross sections for ion beam applications

    International Nuclear Information System (INIS)

    Results of the IAEA Technical Meeting on Database of Evaluated Cross Sections for Ion Beam Applications held at the IAEA Headquarters, Vienna, Austria, 29 to 30 October 2003, are summarized in this report. The meeting discussed the nuclear data needs for ion beam analysis and produced recommendations concerning the compilation, assessment and evaluation of cross section data for ion beam analysis, as well as the related databases and their formats and their inclusion in the data collections of the IAEA Nuclear Data Section. (author)

  13. A New Communication Theory on Complex Information and a Groundbreaking New Declarative Method to Update Object Databases

    OpenAIRE

    Virkkunen, Heikki

    2016-01-01

    In this article I introduce a new communication theory for complex information represented as a direct graph of nodes. In addition, I introduce an application for the theory, a new radical method, embed, that can be used to update object databases declaratively. The embed method revolutionizes updating of object databases. One embed method call can replace dozens of lines of complicated updating code in a traditional client program of an object database, which is a huge improvement. As a decl...

  14. Dual diagnosis clients' treatment satisfaction - a systematic review

    Directory of Open Access Journals (Sweden)

    Stirling John

    2011-04-01

    Full Text Available Abstract Background The aim of this systematic review is to synthesize existing evidence about treatment satisfaction among clients with substance misuse and mental health co-morbidity (dual diagnosis, DD). Methods We examined satisfaction with treatment received, variations in satisfaction levels by type of treatment intervention and by diagnosis (i.e. DD clients vs. single diagnosis clients), and the influence of factors other than treatment type on satisfaction. Peer-reviewed studies published in English since 1970 were identified by searching electronic databases using pre-defined search strings. Results Across the 27 studies that met inclusion criteria, high average satisfaction scores were found. In most studies, integrated DD treatment yielded greater client satisfaction than standard treatment without an explicit DD focus. In standard treatment without a DD focus, DD clients tended to be less satisfied than single diagnosis clients. Whilst the evidence base on client and treatment variables related to satisfaction is small, it suggests that client demographics and symptom severity are unrelated to treatment satisfaction. However, satisfaction tended to be linked to other treatment process and outcome variables. Findings are limited in that many studies had very small sample sizes, did not use validated satisfaction instruments and may not have controlled for potential confounders. A framework for further research in this important area is discussed. Conclusions High satisfaction levels with current treatment provision, especially among those in integrated treatment, should enhance therapeutic optimism among practitioners dealing with DD clients.

  15. An extensible web interface for databases and its application to storing biochemical data

    CERN Document Server

    Angelopoulos, Nicos

    2010-01-01

    This paper presents a generic web-based database interface implemented in Prolog. We discuss the advantages of the implementation platform and demonstrate the system's applicability in providing access to integrated biochemical data. Our system exploits two libraries of SWI-Prolog to create a schema-transparent interface within a relational setting. As is expected in declarative programming, the interface was written with minimal programming effort, owing to the high level of the language and its suitability to the task. We highlight two of Prolog's features that are well suited to the task at hand: the term representation of structured documents, and the relational nature of Prolog, which facilitates transparent integration of relational databases. Although we developed the system for accessing in-house biochemical and genomic data, the interface is generic and provides a number of extensible features. We describe some of these features with references to our research databases. Finally we outline an in-house library that...

  16. A fusion algorithm for joins based on collections in Odra (Object Database for Rapid Application development)

    CERN Document Server

    Satish, Laika

    2011-01-01

    In this paper we present the functionality of ODRA (Object Database for Rapid Application development), a database programming system currently under development that is based fully on object-oriented principles. Its database programming language is SBQL (Stack-Based Query Language). We discuss several concepts in ODRA, e.g. how ODRA works, how the ODRA runtime environment operates, and ODRA's interoperability with .NET and Java, as well as a view of ODRA's integration with web services and XML. Query optimization is one of the stages currently under development in ODRA. We therefore present the prior work done in ODRA on query optimization, and we also present a new fusion algorithm showing how ODRA can handle joins over collections such as sets, lists, and arrays for query optimization.

  17. The development and application of a thermodynamic database for magnesium alloys

    Science.gov (United States)

    Shang, Shunli; Zhang, Hui; Ganeshan, Swetha; Liu, Zi-Kui

    2008-12-01

    The available thermodynamic databases for magnesium alloys are discussed in this paper. Of particular interest are the features of a magnesium database developed by the authors with 19 elements: Mg-Al-Ca-Ce-Cu-Fe-K-La-Li-Mn-Na-Nd-Pr-Si-Sn-Sr-Y-Zn-Zr. Using this database, two applications are presented. One is the phase evolution in AZ61 magnesium alloy including the variations of phase fractions, alloying compositions, and partition coefficients of alloying elements as a function of temperature (or solid fraction). The other is to understand sodium-induced high-temperature embrittlement in the Al-Mg alloy, which is ascribed to the formation of a liquid phase due to the presence of sodium traces.

  18. Perspectives on a Big Data Application: What Database Engineers and IT Students Need to Know

    Directory of Open Access Journals (Sweden)

    E. Erturk

    2015-10-01

    Full Text Available Cloud Computing and Big Data are important and related current trends in the world of information technology. They will have a significant impact on the curricula of computer engineering and information systems at universities and higher education institutions. Learning about big data is useful for both working database professionals and students, in accordance with the increase in jobs requiring these skills. It is also important to address a broad gamut of database engineering skills, i.e. database design, installation, and operation. Therefore the authors have investigated MongoDB, a popular NoSQL database, both from the perspective of industry retraining for database specialists and for teaching. This paper demonstrates some practical activities that can be done by students at the Eastern Institute of Technology, New Zealand. In addition to testing and preparing new content for future students, this paper contributes to the very recent and emerging academic literature in this area. This paper concludes with general recommendations for IT educators, database engineers, and other IT professionals.

  19. Development of a Personal Digital Assistant (PDA) based client/server NICU patient data and charting system.

    OpenAIRE

    Carroll, A. E.; Saluja, S.; Tarczy-Hornoch, P.

    2001-01-01

    Personal Digital Assistants (PDAs) offer clinicians the ability to enter and manage critical information at the point of care. Although PDAs have always been designed to be intuitive and easy to use, recent advances in technology have made them even more accessible. The ability to link data on a PDA (client) to a central database (server) allows for near-unlimited potential in developing point of care applications and systems for patient data management. Although many stand-alone systems exis...

  20. La contrainte client

    Directory of Open Access Journals (Sweden)

    Guillaume Tiffon

    2011-04-01

    Full Text Available The client constraint. A comparative analysis of cashiers and physiotherapists. This article shows that although client contact is ambivalent, insofar as it is both a source of constraint and of recognition, in some cases, such as that of supermarket cashiers, it is primarily a constraint, in that clients control the work that takes place "before their eyes", whereas in other cases, such as that of physiotherapists, it primarily gives meaning to the work and thereby arouses the workers' commitment. The article highlights how the client constraint takes on different forms depending on the spatial and temporal configuration in which the service relation unfolds, and on the skills differential between the protagonists involved in that relation.

  1. Service Management Database for DSN Equipment

    Science.gov (United States)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
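    The event-logging pattern described above, persisting each classified event in a log table and then publishing it to a messaging system for subscribed applications, can be sketched in miniature. The sketch below is illustrative only, not SMDB code: it uses Python's sqlite3 and a queue.Queue as stand-ins for the Oracle database and the JMS topic, and all table, column and function names are invented.

    ```python
    import sqlite3
    import queue

    # Hypothetical stand-ins: sqlite3 for Oracle, queue.Queue for the JMS topic.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE event_log (
            event_id   INTEGER PRIMARY KEY AUTOINCREMENT,
            severity   TEXT NOT NULL,   -- structured classification, e.g. INFO/WARN/ERROR
            source     TEXT NOT NULL,   -- originating application
            message    TEXT NOT NULL,
            created_at TEXT DEFAULT CURRENT_TIMESTAMP
        )
    """)

    jms_topic = queue.Queue()  # subscribed monitoring applications would consume this

    def log_event(severity, source, message):
        """Persist the event for historical analysis, then publish it."""
        cur = conn.execute(
            "INSERT INTO event_log (severity, source, message) VALUES (?, ?, ?)",
            (severity, source, message))
        conn.commit()
        jms_topic.put({"id": cur.lastrowid, "severity": severity,
                       "source": source, "message": message})
        return cur.lastrowid

    event_id = log_event("WARN", "scheduler", "antenna config overdue")
    print(event_id)                      # 1
    print(jms_topic.get()["severity"])   # WARN
    ```

    Because every event is both stored and published, real-time monitors and historical troubleshooting queries read the same record, which is the property the abstract highlights.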

  2. The Development of Mobile Client Application in Yogyakarta Tourism and Culinary Information System Based on Social Media Integration

    Directory of Open Access Journals (Sweden)

    Novrian Fajar Hidayat

    2012-10-01

    Full Text Available Social networks are currently an important part of many people's lives, and their large user bases make them an effective publication channel. One of the many things that can be published on social networks is tourism. Indonesia has rich tourism and culinary resources, especially in the Special District of Yogyakarta, and these resources can be published and shared using social networks. In addition, advances in mobile technology and smartphones make it easier to access social networks through the Internet. The release of Windows Phone 7 brought new color to the smartphone world: it comes with an elegant Metro Style interface, and its standardized specification makes it suitable for integrating social networks with tourism and culinary information for the Special District of Yogyakarta. This research aims to integrate social networks with tourism and culinary information for Yogyakarta. The method used in this research is the ICONIX method, which combines the waterfall and agile methods. The result of this study is an application that runs on Windows Phone 7 and consumes a web service. The application provides information that helps tourists easily find culinary and tourism destinations in Yogyakarta.

  3. Cliente Moodle para Android

    OpenAIRE

    Castillo Montaño, Enrique

    2014-01-01

    This project consists of creating an application that provides access to Moodle sites from an Android device. It is not tied to a single Moodle site; the client is generic. The application is free software.

  4. Counseling Bisexual Clients.

    Science.gov (United States)

    Smiley, Elizabeth B.

    1997-01-01

    Provides a brief conceptual statement about bisexuality. Offers a review of existing research studies, and suggests issues to consider when counseling bisexual clients. Defines bisexuality and discusses prevalence studies, identity development, and implications for counseling. Claims that bisexuality challenges traditional rules about sexual…

  5. Training Evaluation: Clients' Roles.

    Science.gov (United States)

    Hashim, Junaidah

    2001-01-01

    A study was conducted of 262 training providers in Malaysia, where providers must be government approved. Government regulation, client demands for high quality, and an economic downturn that focused attention on training costs have all influenced evaluation in a positive direction. (SK)

  6. The establish and application of equipment reliability database in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Taking the Daya Bay Nuclear Power Plant as an example, this paper discusses the collection and handling of equipment reliability data, the calculation of reliability parameters, and the establishment and application of reliability databases. The data sources include equipment design information, operation information, maintenance information, and periodic test records. The equipment reliability database, built on a base of operating experience, provides an effective tool for thoroughly and objectively recording the operating history and present condition of the plant's equipment and for supervising equipment performance, especially that of safety-related equipment. It provides very practical information for enhancing the safety and availability management of the equipment and ensuring the safe and economical operation of the plant, and it provides essential data for research and applications in safety management, reliability analysis, probabilistic safety assessment, reliability-centered maintenance, and economic management in nuclear power plants. (authors)

  7. Current trends and new challenges of databases and web applications for systems driven biological research

    Directory of Open Access Journals (Sweden)

    Do Han Kim

    2010-12-01

    Full Text Available The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design and architecture of computational infrastructure, including databases and web applications. Several solutions have been proposed to meet these expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand the different technologies and approaches. Having familiarized themselves with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases (DBs) and web applications. We mention their roles in the integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to have a key role in the future of systems-driven biomedical research.

  8. Application of kernel functions for accurate similarity search in large chemical databases

    OpenAIRE

    2010-01-01

    Background Similarity search in chemical structure databases is an important problem with many applications in chemical genomics, drug design, and efficient chemical probe screening, among others. It is widely believed that structure-based methods provide an efficient way to do the query. Recently, various graph kernel functions have been designed to capture the intrinsic similarity of graphs. Though successful in constructing accurate predictive and classification models, graph kernel functions...

  9. Extending Binary Large Object Support to Open Grid Services Architecture-Data Access and Integration Middleware Client Toolkit

    Directory of Open Access Journals (Sweden)

    Kiran K. Patnaik

    2011-01-01

    Full Text Available Problem statement: The OGSA-DAI middleware allows data resources to be federated and accessed via web services on the web or within grids or clouds. It provides a client API for writing programs that access the exposed databases. Migrating existing applications to the new technology and using a new API to access the data of a DBMS with BLOBs is difficult and discouraging. A JDBC driver is a much more convenient alternative to the existing mechanism: it provides an extension to the OGSA-DAI middleware and allows applications to use databases exposed in a grid through OGSA-DAI 3.0. However, the driver does not support Binary Large Objects (BLOBs). Approach: The driver was enhanced to support BLOBs using the OGSA-DAI client API. It transforms JDBC calls into an OGSA-DAI workflow request and sends the request to the server using Web Services (WS). The client API of OGSA-DAI uses activities that are connected to form a workflow and executed using a pipeline; this workflow mechanism is embedded into the driver. The WS container dispatches the request to the OGSA-DAI middleware for processing, and the result is then transformed back into an instance of a ResultSet implementation using the OGSA-DAI client API before it is returned to the user. Results: Tests on the handling of BLOBs (images, flash files and videos) ranging in size from 1 KB to 2 GB were carried out on Oracle, MySQL and PostgreSQL databases using our enhanced JDBC driver, and it performed well. Conclusion: The enhanced JDBC driver can now offer users with no experience in Grid computing, specifically in OGSA-DAI, the possibility of giving their applications the ability to access databases exposed on the grid with minimal effort.
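    The core requirement the enhanced driver addresses, passing a binary payload through a standard database driver interface unchanged so the application code stays ordinary, can be illustrated in miniature. The sketch below is not OGSA-DAI or JDBC code: it uses Python's sqlite3 DB-API driver as a stand-in, and the table and file names are invented.

    ```python
    import sqlite3

    # sqlite3's DB-API driver stands in for the enhanced grid-aware driver:
    # the application neither knows nor cares how the BLOB travels.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE media (name TEXT PRIMARY KEY, content BLOB)")

    payload = bytes(range(256)) * 4  # pretend this is an image or video chunk

    # Parameter binding passes the binary payload through the driver unmodified.
    conn.execute("INSERT INTO media VALUES (?, ?)", ("clip.bin", payload))
    conn.commit()

    stored = conn.execute(
        "SELECT content FROM media WHERE name = ?", ("clip.bin",)).fetchone()[0]
    assert stored == payload  # round-trip is byte-exact
    print(len(stored))  # 1024
    ```

    The enhanced driver described in the abstract does the same thing at a larger scale, translating these calls into OGSA-DAI workflows behind the driver interface.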

  10. Wireless access to a pharmaceutical database: A demonstrator for data driven Wireless Application Prorocol (WAP) applications in medical information processing

    DEFF Research Database (Denmark)

    Hansen, Michael Schacht; Dørup, Jens

    2001-01-01

    catalogue to Wireless Application Protocol using open source freeware at all steps. METHODS: We used Apache 1.3 web software on a Linux server. Data containing the Danish pharmaceutical catalogue were imported from an ASCII file into a MySQL 3.22.32 database using a Practical Extraction and Report Language...... number catalogue and Dictaphone function via answering machine technology. An independent Wireless Application Protocol gateway may be placed within hospital firewalls, which may be an advantage with respect to security. However, if Wireless Application Protocol phones are to become effective tools...
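    The import step described here, loading a delimited ASCII catalogue into a relational database that a per-request page handler can then query, can be sketched as follows. This is an illustrative miniature, not the authors' Perl/MySQL code: it uses Python's csv and sqlite3 modules, and the field names and sample rows are invented.

    ```python
    import csv
    import io
    import sqlite3

    # Hypothetical miniature of the catalogue file; the real import read an
    # ASCII file of the Danish pharmaceutical catalogue.
    ascii_file = io.StringIO(
        "name;form;strength\n"
        "Paracetamol;tablet;500 mg\n"
        "Ibuprofen;tablet;200 mg\n")

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE drug (name TEXT, form TEXT, strength TEXT)")

    # DictReader yields one mapping per line, matching the named placeholders.
    reader = csv.DictReader(ascii_file, delimiter=";")
    conn.executemany(
        "INSERT INTO drug VALUES (:name, :form, :strength)", reader)
    conn.commit()

    # A WAP page handler would run a lookup like this for each phone request.
    row = conn.execute(
        "SELECT strength FROM drug WHERE name = ?", ("Ibuprofen",)).fetchone()
    print(row[0])  # 200 mg
    ```

    Once the catalogue is in the database, serving it over WAP reduces to formatting query results as WML, which is the part the abstract's server stack handles.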

  11. Quality control in diagnostic radiology: software (Visual Basic 6) and database applications

    International Nuclear Information System (INIS)

    A Quality Assurance programme in diagnostic radiology is being implemented by the Ministry of Health (MoH) in Malaysia. Under this programme, the performance of an x-ray machine used for diagnostic purposes is tested using an approved procedure, commonly known as quality control in diagnostic radiology. The quality control or performance tests are carried out by a class H licence holder, the licence being issued under the Atomic Energy Licensing Act 1984. A few computer applications (software) available on the market can be used for this purpose. A computer application (software) using Visual Basic 6 and Microsoft Access is being developed to expedite data handling, analysis and storage as well as report writing for the quality control tests. In this paper, important features of the software for quality control tests are explained in brief. A simple database, linked to the software, is being established for this purpose. Problems encountered in the preparation of the database are discussed in this paper. A few examples of practical usage of the software and database applications are presented in brief. (Author)
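    A quality-control database of this kind can be reduced to a table of nominal values, tolerances and measured readings, with report writing becoming a query that flags out-of-tolerance results. The sketch below is a hypothetical Python/SQLite miniature, not the Visual Basic 6/Access application described above; all test names, tolerances and readings are invented.

    ```python
    import sqlite3

    # Invented schema: one row per performance-test measurement.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE qc_test (
            machine   TEXT,
            test_name TEXT,
            nominal   REAL,
            tolerance REAL,   -- allowed deviation, same units as nominal
            measured  REAL
        )
    """)
    readings = [
        ("XR-01", "kVp accuracy", 80.0, 4.0, 81.2),    # within tolerance
        ("XR-01", "timer accuracy", 0.10, 0.01, 0.13), # out of tolerance
    ]
    conn.executemany("INSERT INTO qc_test VALUES (?, ?, ?, ?, ?)", readings)
    conn.commit()

    # Report writing reduces to a query that flags out-of-tolerance results.
    failures = conn.execute("""
        SELECT test_name FROM qc_test
        WHERE ABS(measured - nominal) > tolerance
    """).fetchall()
    print([name for (name,) in failures])  # ['timer accuracy']
    ```

    Keeping the tolerance alongside each nominal value lets the same query serve any machine or test type without changes to the reporting code.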

  12. Thermochemistry in BWR. An overview of applications of program codes and databases

    International Nuclear Information System (INIS)

    The Swedish work on thermodynamics of metal-water systems relevant to BWR conditions has been ongoing since the 70ies, and at present time a compilation and adaptation of codes and thermodynamic databases are in progress. In the previous work, basic thermodynamic data were compiled for parts of the system Fe-Cr-Ni-Co-Zn-S-H2O at 25-300 °C. Since some thermodynamic information necessary for temperature extrapolations of data up to 300 °C was not published in the earlier works, these data have now been partially recalculated. This applies especially to the parameters of the HKF-model, which are used to extrapolate the thermodynamic data for ionic and neutral aqua species from 25 °C to BWR temperatures. Using the completed data, e.g. the change in standard Gibbs energy (ΔG0) and the equilibrium constant (log K) can be calculated for further applications at BWR/LWR conditions. In addition a computer program is currently being developed at Studsvik for the calculation of equilibrium conductivity in high temperature water. The program is intended for PWR applications, but can also be applied to BWR environment. Data as described above will be added to the database of this program. It will be relatively easy to further develop the program e.g. to calculate Pourbaix diagrams, and these graphs could then be calculated at any temperature. This means that there will be no limitation to the temperatures and total concentrations (usually 10-6 to 10-8 mol/kg) as reported in earlier work. It is also easy to add a function generating ΔG0 and log K values at selected temperatures. One of the fundamentals for this work was also to overview and collect publicly available thermodynamic program codes and databases of relevance for BWR conditions found in open sources. The focus has been on finding already done compilations and reviews, and some 40 codes and 15 databases were found. Codes and data-bases are often integrated and such a package is often developed for applications

  13. A portable, GUI-based, object-oriented client-server architecture for computer-based patient record (CPR) systems.

    Science.gov (United States)

    Schleyer, T K

    1995-01-01

    Software applications for computer-based patient records require substantial development investments. Portable, open software architectures are one way to delay or avoid software application obsolescence. The Clinical Management System at Temple University School of Dentistry uses a portable, GUI-based, object-oriented client-server architecture. Two main criteria determined this approach: preservation of the investment in software development and a smooth migration path to a computer-based patient record. The application is separated into three layers: graphical user interface, database interface, and application functionality. Implementation with generic cross-platform development tools ensures maximum portability. PMID:7662879
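    The three-layer separation described above, GUI, database interface, and application functionality, can be sketched generically. The classes below are illustrative Python, not the Temple system's code; all names are invented, and an in-memory dict stands in for the real storage engine.

    ```python
    class DatabaseInterface:
        """Layer 2: hides the storage engine behind a neutral API."""
        def __init__(self):
            self._patients = {}  # stand-in for the real database
        def save(self, patient_id, record):
            self._patients[patient_id] = record
        def load(self, patient_id):
            return self._patients[patient_id]

    class ClinicalLogic:
        """Layer 3: application functionality, unaware of storage or UI."""
        def __init__(self, db):
            self.db = db
        def register_patient(self, patient_id, name):
            self.db.save(patient_id, {"name": name, "visits": []})
        def add_visit(self, patient_id, note):
            record = self.db.load(patient_id)
            record["visits"].append(note)
            self.db.save(patient_id, record)

    # Layer 1 (the GUI) would call only ClinicalLogic; swapping the database
    # or the interface toolkit then leaves the other two layers untouched.
    logic = ClinicalLogic(DatabaseInterface())
    logic.register_patient("p1", "A. Example")
    logic.add_visit("p1", "routine check-up")
    print(logic.db.load("p1")["visits"])  # ['routine check-up']
    ```

    This isolation is what gives such an architecture its migration path: replacing the storage layer does not obsolete the investment in the other layers.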

  14. PmiRExAt: plant miRNA expression atlas database and web applications.

    Science.gov (United States)

    Gurjar, Anoop Kishor Singh; Panwar, Abhijeet Singh; Gupta, Rajinder; Mantri, Shrikant S

    2016-01-01

    High-throughput small RNA (sRNA) sequencing technology enables an entirely new perspective for plant microRNA (miRNA) research and has immense potential to unravel regulatory networks. Novel insights gained through data mining in the publicly available rich resource of sRNA data will help in designing biotechnology-based approaches for crop improvement to enhance plant yield and nutritional value. Bioinformatics resources enabling meta-analysis of miRNA expression across multiple plant species are still evolving. Here, we report PmiRExAt, a new online database resource that serves as a plant miRNA expression atlas. The web-based repository comprises miRNA expression profiles and a query tool for 1859 wheat, 2330 rice and 283 maize miRNAs. The database interface offers open and easy access to miRNA expression profiles and helps in identifying tissue-preferential, differential and constitutively expressing miRNAs. A feature enabling expression study of conserved miRNAs across multiple species is also implemented. A custom expression analysis feature enables expression analysis of novel miRNAs in a total of 117 datasets. New sRNA datasets can also be uploaded for analysing miRNA expression profiles for 73 plant species. The PmiRExAt application program interface, a Simple Object Access Protocol (SOAP) web service, allows other programmers to remotely invoke the methods written for performing programmatic search operations on the PmiRExAt database. Database URL: http://pmirexat.nabi.res.in. PMID:27081157

  15. Client Perceptions of Pretreatment Change

    Science.gov (United States)

    Kindsvatter, Aaron; Osborn, Cynthia J.; Bubenzer, Donald; Duba, Jill D.

    2010-01-01

    The authors suggest that when counselors have a rich understanding of pretreatment changes, they are better able to assist clients in capitalizing on such changes. The current study examined client perceptions of pretreatment changes. Thirty-six clients completed Q-sorts pertaining to pretreatment changes they experienced. Four factors pertaining…

  16. 数据库保护的原则及适用法律综述%An Overview of the Principles of Database Protection and Its Applicable Laws

    Institute of Scientific and Technical Information of China (English)

    相丽玲; 黄富国

    2003-01-01

    At present, the principles of database protection are varied in developed countries. So are the applicable laws. This article summarizes these principles and applicable laws in an attempt to provide reference material for China in her legislation for databases.

  17. Client Centred Design

    OpenAIRE

    Ørngreen, Rikke N.; Nielsen, Janni; Levinsen, Karin

    2004-01-01

    Abstract In this paper the Human Computer Interaction (HCI) Research Group reports on the pre-phase of an e-learning project, which was carried out in collaboration with the client. The project involved an initial exploration of the problem spaces, possibilities and challenges for an online accredited Continued Medical Education (CME) programme at the Lundbeck Institute. The CME programme aims at end-users, who are primarily general practitioners, but also specialists (psychiatrist and psyc...

  18. Unpacking the Client(s): Constructions, Positions and Client-Consultant Dynamics

    OpenAIRE

    Alvesson, Mats; Kärreman, Dan; Sturdy, Andrew; Handley, Karen

    2006-01-01

    Increasing attention is being given to professional services in organisation and management theory. Whether the focus is on organisational forms or service processes such as knowledge transfer, the role of clients is often seen as central. However, typically, clients continue to be presented in a largely static, pre-structured and even monolithic way. While some recognition is given to the diversity of organisational clients and, to a lesser extent, individual clients, little attention has be...

  19. Architectural models for client interaction on service-oriented platforms

    NARCIS (Netherlands)

    Bonino da Silva Santos, L.O.; Ferreira Pires, L.; Sinderen, van M.J.; Sinderen, van M.J.

    2007-01-01

    Service-oriented platforms can provide different levels of functionality to the client applications as well as different interaction models. Depending on the platform’s goals and the computing capacity of their expected clients the platform functionality can range from just an interface to support t

  20. Facet Analysis of the Client Needs Assessment Instrument.

    Science.gov (United States)

    Dancer, L. Suzanne; Stanley, Lawrence R.

    The structure of the revised Client Needs Assessment Instrument (CNAI) is examined. In 1978-79, the Texas Department of Human Resources (DHR) developed the CNAI to provide an index of applicants' and clients' capacity for self-care by measuring the respondents' levels of functioning in: (1) physical health; (2) daily living activities; (3) mental…

  1. Introduction to ASP.NET 4 AJAX Client Templates

    CERN Document Server

    Shoemaker, Craig

    2010-01-01

    This Wrox Blox will teach you how to create and customize ASP.NET 4 AJAX Preview 4 Client Templates. The author shows you how to use declarative as well as imperative data-binding techniques to address the simple to advanced UI requirements. He also covers how the observer pattern is fully implemented in ASP.NET 4 AJAX and, when used in conjunction with the Client Template markup extensions, provides a developer experience much like XAML-based applications like WPF and Silverlight. This Wrox Blox walks you through how to implement examples that fetch data from ASP.NET Web Forms using Page Meth

  2. Global geologic applications of the Space Shuttle earth observations photography database

    Science.gov (United States)

    Lulla, Kamlesh; Helfert, Michael; Evans, Cynthia; Wilkinson, M. J.; Pitts, David; Amsbury, David

    1993-01-01

    The advantages of astronaut photography during Space Shuttle missions are briefly examined, and the scope and applications of the Space Shuttle earth observations photography database are discussed. The global and multidisciplinary nature of the database is illustrated by several examples of geologic applications. These include the eruption of Mount Pinatubo (Philippine Islands), heat flow and ice cover on Lake Baikal in Siberia (Russia), and windblown dust in South America. It is noted that hand-held photography from the U.S. Space Shuttle provides unique remotely sensed data for geologic applications because of the combination of varying perspectives, look angles, and illumination, and the changing resolution resulting from different lenses and altitudes.

  3. Image storage, cataloguing and retrieval using a personal computer database software application

    International Nuclear Information System (INIS)

    Full text: Interesting images and cases are collected and collated by most nuclear medicine practitioners throughout the world. Changing imaging technology has altered the way in which images are presented and reported, with less reliance on 'hard copy' for both reporting and archiving purposes. Digital image generation and storage are rapidly replacing film in both radiological and nuclear medicine practice. An interesting-case filing system based on personal computer database software is described and demonstrated. The digital image storage format allows instant access to both case information (e.g. history and examination, scan report or teaching point) and the relevant images. The database design allows rapid selection of cases and images appropriate to a particular diagnosis, scan type, age or other search criteria. Correlative X-ray, CT, MRI and ultrasound images can also be stored and accessed. The application is in use at The New Children's Hospital as an aid to postgraduate medical education, with new cases being regularly added to the database

  4. Relational database hybrid model, of high performance and storage capacity for nuclear engineering applications

    International Nuclear Information System (INIS)

    The objective of this work is to present the relational database named FALCAO. It was created and implemented to support the storage of the monitored variables of the IEA-R1 research reactor, located at the Instituto de Pesquisas Energeticas e Nucleares, IPEN/CNEN-SP. The logical data model and its direct influence on the integrity of the provided information are carefully considered. The concepts and steps of normalization and denormalization, including the entities and relations involved in the logical model, are presented. The effects of the model rules on the acquisition, loading and availability of the final information are also presented from a performance standpoint, since the acquisition process loads and provides large amounts of information in small intervals of time. The SACD application, through its functionalities, presents the information stored in the FALCAO database in a practical and optimized form. The implementation of the FALCAO database was successful, and its existence leads to a considerably favorable situation. It is now essential to the routine of the researchers involved, not only due to the substantial improvement of the process but also due to the reliability associated with it. (author)
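    A normalized layout of the kind the abstract describes, with the entities separated so that the high-volume acquisition table stays narrow while descriptive attributes live in a reference table, can be sketched as follows. This is a hypothetical Python/SQLite miniature, not FALCAO's actual schema; all table and column names are invented.

    ```python
    import sqlite3

    # Invented normalized schema for monitored reactor variables.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE variable (            -- one row per monitored quantity
            var_id INTEGER PRIMARY KEY,
            name   TEXT UNIQUE,
            unit   TEXT
        );
        CREATE TABLE reading (             -- high-volume acquisition table
            var_id INTEGER REFERENCES variable(var_id),
            ts     TEXT,
            value  REAL
        );
    """)
    conn.execute("INSERT INTO variable VALUES (1, 'core_temp', 'degC')")
    conn.executemany("INSERT INTO reading VALUES (?, ?, ?)",
                     [(1, "2024-01-01T10:00", 41.2),
                      (1, "2024-01-01T10:01", 41.5)])
    conn.commit()

    # A reporting application (in the role of SACD) joins the tables on demand.
    row = conn.execute("""
        SELECT v.name, v.unit, MAX(r.value)
        FROM reading r JOIN variable v ON v.var_id = r.var_id
    """).fetchone()
    print(row)  # ('core_temp', 'degC', 41.5)
    ```

    Keeping the acquisition table to a foreign key, timestamp and value is one way a design can sustain frequent loads while the join reconstructs the full description only when a report needs it.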

  5. Love Me Clients!.

    OpenAIRE

    Mora Casas, Javier

    2015-01-01

    Avanttic Consultoría Tecnológica has developed an idea and launched an initiative, creating a FORTE agreement with the Universidad de Castilla-La Mancha to carry out a project, and I am the one entrusted with its execution. The application, Love Me Clients!, aims to be a multichannel communications platform for managing publications on social networks, email, SMS and push notifications from a single point, thereby obtaining an increase in prod...

  6. Teaching job interview skills to retarded clients.

    OpenAIRE

    Hall, C.; Sheldon-Wildgen, J; Sherman, J. A.

    1980-01-01

    Six retarded adults were taught job application and interview skills including introducing oneself, filling out a standard job application form, answering questions, and asking questions. A combination of instructions, modeling, role playing, and positive and corrective feedback was used across a multiple baseline experimental design. After training, the clients' performance in each area improved substantially over baseline levels. In addition, the newly taught skills appeared to generalize t...

  7. Three-Dimensional Audio Client Library

    Science.gov (United States)

    Rizzi, Stephen A.

    2005-01-01

    The Three-Dimensional Audio Client Library (3DAudio library) is a group of software routines written to facilitate development of both stand-alone (audio only) and immersive virtual-reality application programs that utilize three-dimensional audio displays. The library is intended to enable the development of three-dimensional audio client application programs by use of a code base common to multiple audio server computers. The 3DAudio library calls vendor-specific audio client libraries and currently supports the AuSIM Gold-Server and Lake Huron audio servers. 3DAudio library routines contain common functions for (1) initiation and termination of a client/audio server session, (2) configuration-file input, (3) positioning functions, (4) coordinate transformations, (5) audio transport functions, (6) rendering functions, (7) debugging functions, and (8) event-list-sequencing functions. The 3DAudio software is written in the C++ programming language and currently operates under the Linux, IRIX, and Windows operating systems.
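    The positioning and coordinate-transformation functions listed above all reduce to expressing a source position in the listener's frame before rendering. As a hedged illustration (the 3DAudio library's real API is not shown here; the function name and yaw convention below are assumptions), a yaw-only version of that transform might look like:

```python
import math

def listener_relative(source_xy, listener_xy, yaw_deg):
    """Transform a source position from world coordinates into the
    listener's frame, the kind of step a 3D-audio library performs
    before rendering. Compass-style yaw: 0 deg faces +y, 90 deg +x.
    Returns (right, ahead) offsets in the listener's frame."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    yaw = math.radians(yaw_deg)
    ahead = dx * math.sin(yaw) + dy * math.cos(yaw)
    right = dx * math.cos(yaw) - dy * math.sin(yaw)
    return right, ahead

# A listener facing east (yaw 90) hears a source due east straight ahead.
r, a = listener_relative((1.0, 0.0), (0.0, 0.0), 90.0)
```

    The listener-relative azimuth for the renderer would then be `atan2(right, ahead)`; a full library would also handle pitch, roll, and elevation.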

  8. Application of embedded database to digital power supply system in HIRFL

    International Nuclear Information System (INIS)

    Background: This paper introduces the application of an embedded MySQL database in the real-time monitoring system of the digital power supply system at the Heavy Ion Research Facility in Lanzhou (HIRFL). Purpose: The aim is to optimize the real-time monitoring system of the digital power supply system for better performance. Methods: The MySQL database is designed and implemented under the Linux operating system running on an ARM processor, together with the related functions for real-time data monitoring, such as collection, storage and query. All status parameters of the digital power supply system are collected and communicated to the ARM by an FPGA, while the user interface on the ARM side is built with Qt toolkits. Results: Actual operation indicates that the digital power supply can realize real-time data monitoring, collection, storage and related functions. Conclusion: Practical application has revealed several aspects that can be improved, which we will optimize in the future. (authors)

  9. Database Access through Java Technologies

    Directory of Open Access Journals (Sweden)

    Nicolae MERCIOIU

    2010-09-01

    Full Text Available As a high-level development environment, Java technologies offer support for developing distributed, platform-independent applications, providing a robust set of methods to access databases and to create software components on both the server side and the client side. Analyzing the evolution of Java data-access tools, we notice that they have evolved from simple methods permitting queries, insertion, update and deletion of data to advanced features such as distributed transactions, cursors and batch processing. Client-server architectures allow, through JDBC (Java Database Connectivity), the execution of SQL (Structured Query Language) statements and the manipulation of results in an independent and consistent manner. The JDBC API (Application Programming Interface) creates the level of abstraction needed to issue SQL queries against any DBMS (Database Management System). The native JDBC driver, the ODBC (Open Database Connectivity)-JDBC bridge, and the classes and interfaces of the JDBC API are described. The four steps needed to build a JDBC-driven application are presented briefly, emphasizing how each step is accomplished and the expected results. In each step the characteristics of the database systems and the way the JDBC programming interface adapts to each one are evaluated. The data types provided by the SQL2 and SQL3 standards are analyzed in comparison with the Java data types, emphasizing the discrepancies between them as well as the methods of the ResultSet object that allow conversion between different data types. Next, starting from the role of metadata and studying the Java programming interfaces that allow querying result sets, we describe the advanced data-access features of JDBC. As an alternative to result sets, RowSets add new functionalities that
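    The four steps the abstract outlines (open a connection, create a statement, execute SQL, process the result set) have a close analogue in Python's DB-API; a minimal sketch using the built-in sqlite3 module, with an illustrative table rather than any schema from the article:

```python
import sqlite3

# The four JDBC-style steps map directly onto Python's DB-API:
conn = sqlite3.connect(":memory:")   # step 1: open a connection
cur = conn.cursor()                  # step 2: create a statement/cursor
cur.execute("CREATE TABLE clients (id INTEGER PRIMARY KEY, name TEXT)")
cur.executemany("INSERT INTO clients (name) VALUES (?)",
                [("Ana",), ("Boris",)])
conn.commit()
cur.execute("SELECT id, name FROM clients ORDER BY id")  # step 3: execute SQL
rows = cur.fetchall()                # step 4: walk the result set
conn.close()
```

    In JDBC the same flow would use `DriverManager.getConnection`, `createStatement`, `executeQuery`, and a `ResultSet` loop; the parameterized `?` placeholders correspond to a `PreparedStatement`.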

  10. Development and application of indices using large volcanic databases for a global hazard and risk assessment

    Science.gov (United States)

    Brown, Sarah; Auker, Melanie; Cottrell, Elizabeth; Delgado Granados, Hugo; Loughlin, Sue; Ortiz Guerrero, Natalie; Sparks, Steve; Vye-Brown, Charlotte; Taskforce, Indices

    2015-04-01

    monitoring levels around the world; this is designed to be complementary to WOVOdat (the World Organisation of Volcano Observatories: Database of Volcanic Unrest). An index developed from this has been adapted and applied to a global dataset, showing that approximately one third of historically active volcanoes have levels of ground-based monitoring that may permit analysis of magma movements and activity forecasts. Some unmonitored volcanoes score highly for both hazard and population risk. The development and application of such indices depend on the availability and accessibility of large, systematic, sustainable and compatible databases. These indices help to harmonise approaches and allow first-order assessments, highlighting gaps in knowledge and areas where research and investment are recommended.

  11. Client/Server数据库结构、应用软件设计及其访问机制%Construction, Applications Design and Accessing Mechanism of Client/Server Database

    Institute of Scientific and Technical Information of China (English)

    黄丽霞

    2000-01-01

    This paper comprehensively discusses the development and main characteristics of client/server database system architectures, proposes the design tasks for a new generation of client-side and server-side database application software, and further explores various techniques and approaches for accessing databases.

  12. In-database processing of a large collection of remote sensing data: applications and implementation

    Science.gov (United States)

    Kikhtenko, Vladimir; Mamash, Elena; Chubarov, Dmitri; Voronina, Polina

    2016-04-01

    Large archives of remote sensing data are now available to scientists, yet the need to work with individual satellite scenes or product files constrains studies that span a wide temporal range or spatial extent. The resources (storage capacity, computing power and network bandwidth) required for such studies are often beyond the capabilities of individual geoscientists. This problem has been tackled before in remote sensing research and has inspired several information systems; some of them, such as NASA Giovanni [1] and Google Earth Engine, have already proved their utility for science. Analysis tasks involving large volumes of numerical data are not unique to the Earth sciences. Recent advances in data science are enabled by the development of in-database processing engines that bring processing closer to storage, use declarative query languages to facilitate parallel scalability, and provide a high-level abstraction of the whole dataset. We build on the idea of bridging the gap between file archives containing remote sensing data and databases by integrating the files into a relational database as foreign data sources and performing analytical processing inside the database engine. A higher-level query language can thereby efficiently address problems of arbitrary size: from accessing the data associated with a specific pixel or grid cell to complex aggregation over spatial or temporal extents spanning a large number of individual data files. This approach was implemented using PostgreSQL for a Siberian regional archive of satellite data products holding hundreds of terabytes of measurements from multiple sensors and missions taken over a decade-long span. While preserving the original storage layout, and therefore compatibility with existing applications, the in-database processing engine provides a toolkit for provisioning remote sensing data in scientific workflows and applications. The use of SQL, a widely used high-level declarative query language, simplifies interoperability
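    The idea of pushing aggregation into the engine instead of shipping files to the client can be sketched in miniature with any relational store; here a hypothetical measurements table in SQLite stands in for the PostgreSQL archive described above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE measurements (region TEXT, day INTEGER, value REAL)")
conn.executemany(
    "INSERT INTO measurements VALUES (?, ?, ?)",
    [("north", 1, 2.0), ("north", 2, 4.0), ("south", 1, 10.0)],
)
# One declarative query aggregates over the whole archive inside the
# engine; the client never downloads the individual records/files.
rows = conn.execute(
    "SELECT region, AVG(value) FROM measurements GROUP BY region ORDER BY region"
).fetchall()
conn.close()
```

    In the system described above the same pattern scales because the engine parallelizes the scan and reads the original files in place as foreign data sources.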

  13. Therapist Homophobia, Client Sexual Orientation, and Source of Client HIV Infection as Predictors of Therapist Reactions to Clients with HIV.

    Science.gov (United States)

    Hayes, Jeffrey A.; Erkis, Andrew J.

    2000-01-01

    Analyses revealed that therapists responded with less empathy, attributed less responsibility to the client for problem solving, assessed the client's functioning to be worse, and were less willing to work with the client when the client's source of HIV infection was something other than drugs, when the client was gay, and when the therapist was more…

  14. Comprehensive database of human E3 ubiquitin ligases: application to aquaporin-2 regulation.

    Science.gov (United States)

    Medvar, Barbara; Raghuram, Viswanathan; Pisitkun, Trairak; Sarkar, Abhijit; Knepper, Mark A

    2016-07-01

    Aquaporin-2 (AQP2) is regulated in part via vasopressin-mediated changes in protein half-life that are in turn dependent on AQP2 ubiquitination. Here we addressed the question, "What E3 ubiquitin ligase is most likely to be responsible for AQP2 ubiquitination?" using large-scale data integration based on Bayes' rule. The first step was to bioinformatically identify all E3 ligase genes coded by the human genome. The 377 E3 ubiquitin ligases identified in the human genome, consisting predominantly of HECT, RING, and U-box proteins, have been used to create a publicly accessible and downloadable online database (https://hpcwebapps.cit.nih.gov/ESBL/Database/E3-ligases/). We also curated a second database of E3 ligase accessory proteins that includes BTB domain proteins, cullins, SOCS-box proteins, and F-box proteins. Using Bayes' theorem to integrate information from multiple large-scale proteomic and transcriptomic datasets, we ranked these 377 E3 ligases with respect to their probability of interaction with AQP2. Application of Bayes' rule identified the E3 ligases most likely to interact with AQP2 as (in order of probability): NEDD4 and NEDD4L (tied for first), AMFR, STUB1, ITCH, ZFPL1. Significantly, the two E3 ligases tied for top rank have also been studied extensively in the reductionist literature as regulatory proteins in renal tubule epithelia. The concordance of conclusions from reductionist and systems-level data provides strong motivation for further studies of the roles of NEDD4 and NEDD4L in the regulation of AQP2 protein turnover. PMID:27199454
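    The Bayes'-rule integration the authors describe combines a prior probability of interaction with evidence from several independent datasets. A toy sketch (the likelihood ratios and ligase names below are invented for illustration, not taken from the paper):

```python
from math import prod

def posterior_odds(prior_odds, likelihood_ratios):
    """Naive-Bayes integration: multiply the prior odds of interaction
    by one likelihood ratio per independent dataset (each ratio is
    P(observation | interacts) / P(observation | does not))."""
    return prior_odds * prod(likelihood_ratios)

# Hypothetical evidence for two candidate ligases; with 377 candidates,
# a flat prior of 1/377 is one simple starting point.
candidates = {
    "LIGASE_A": posterior_odds(1 / 377, [8.0, 3.0]),
    "LIGASE_B": posterior_odds(1 / 377, [2.0, 1.5]),
}
ranking = sorted(candidates, key=candidates.get, reverse=True)
```

    Ranking by posterior odds rather than absolute probability is enough here, since a shared flat prior cancels out of the ordering.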

  15. Knowledge Management and Database Marketing Applications = Bilgi Yönetimi ve Veritabanlı Pazarlama Uygulamaları

    Directory of Open Access Journals (Sweden)

    Erol EREN

    2004-01-01

    Full Text Available The aim of this article is to determine whether database marketing is used as a tool of knowledge management, and to investigate its applications among Turkish ready-to-wear retailers. Retailers who use knowledge technologies can collect and turn data into useful information and knowledge, and then use them in database marketing systems. This study investigates whether ready-to-wear retailers have such database marketing systems, and how they set up and operate these systems if they do. There is no doubt that businesses in the information technologies sector will be interested in the topic and the results.

  16. Analysis and Design of Soils and Terrain Digital Database (SOTER) Management System Based on Object-Oriented Method

    Institute of Scientific and Technical Information of China (English)

    ZHANG HAITAO; ZHOU YONG; R. V. BIRNIE; A. SIBBALD; REN YI

    2003-01-01

    A SOTER management system was developed through iterative analysis, design, programming, testing and refinement based on the object-oriented method. The attribute database management function is inherited and expanded in the new system, and the integrity and security of the SOTER database are enhanced. The attribute database management, the spatial database management and the model base are integrated into SOTER based on the Component Object Model (COM), and a graphical user interface (GUI) for Windows is used to interact with clients, making the SOTER database easy to create and maintain and promoting the quantification and automation of soil information applications.

  17. Applications of TsunAWI: Operational scenario database in Indonesia, case studies in Chile

    Science.gov (United States)

    Rakowsky, Natalja; Harig, Sven; Immerz, Antonia; Androsov, Alexey; Hiller, Wolfgang; Schröter, Jens

    2016-04-01

    The numerical simulation code TsunAWI was developed in the framework of the German-Indonesian Tsunami Early Warning System (GITEWS). The numerical simulation of prototypic tsunami scenarios plays a decisive role both in a priori risk assessment for coastal regions and in the early warning process itself, and TsunAWI is suited for both tasks. It is based on a finite element discretisation, employs unstructured grids with high resolution along the coast, and includes inundation. This contribution presents two fields of application. In the Indonesian tsunami early warning system, the existing TsunAWI scenario database covers the Sunda subduction zone from Sumatra to the Lesser Sunda Islands with 715 epicenters and 4500 scenarios. In a collaboration with Geoscience Australia, we support the scientific staff at the Indonesian warning center in extending the database to the remaining tectonic zones in the Indonesian Archipelago; the extension began with North Sulawesi and the West and East Maluku Islands. For the Hydrographic and Oceanographic Service of the Chilean Navy (SHOA), we calculated a small database of 100 scenarios (sources by Universidad de Chile) for a lightweight decision support system prototype (built by DLR). The earthquake and tsunami events on 1 April 2014 and 16 November 2016 showed the practical use of this approach in comparison to hindcasts of these events.

  18. Group Work with Transgender Clients

    Science.gov (United States)

    Dickey, Lore M.; Loewy, Michael I.

    2010-01-01

    Drawing on the existing literature, the authors' research and clinical experiences, and the first author's personal journey as a member and leader of the transgender community, this article offers a brief history of group work with transgender clients followed by suggestions for group work with transgender clients from a social justice…

  19. Secure thin client architecture for DICOM image analysis

    Science.gov (United States)

    Mogatala, Harsha V. R.; Gallet, Jacqueline

    2005-04-01

    This paper presents a concept of a Secure Thin Client (STC) architecture for Digital Imaging and Communications in Medicine (DICOM) image analysis over the Internet. The STC architecture provides in-depth analysis and design of customized reports for DICOM images using drag-and-drop and data warehouse technology. Using a personal computer and a common set of browsing software, STC can be used for analyzing and reporting detailed patient information, type of examination, date, Computed Tomography (CT) dose index, and other relevant information stored within the images' header files as well as in the hospital databases. The STC architecture has three tiers. The first tier consists of a drag-and-drop web-based interface and web server, which provide customized analysis and reporting ability to the users. The second tier consists of an online analytical processing (OLAP) server and database system, which serves fast, real-time, aggregated multidimensional data using OLAP technology. The third tier consists of a smart-algorithm-based software program which extracts DICOM tags from CT images in this particular application, irrespective of CT vendor, and transfers these tags into a secure database system. This architecture provides the Winnipeg Regional Health Authority (WRHA) with quality indicators for CT examinations in the hospitals. It also provides health care professionals with an analytical tool to optimize radiation dose and image quality parameters. The information is provided to the user by way of a secure sockets layer (SSL) and role-based security criteria over the Internet. Although this particular application has been developed for the WRHA, this paper also discusses the effort to extend the architecture to other hospitals in the region. Any DICOM tag from any imaging modality could be tracked with this software.

  20. CAVIAR: CLASSIFICATION VIA AGGREGATED REGRESSION AND ITS APPLICATION IN CLASSIFYING OASIS BRAIN DATABASE.

    Science.gov (United States)

    Chen, Ting; Rangarajan, Anand; Vemuri, Baba C

    2010-04-14

    This paper presents a novel classification via aggregated regression algorithm - dubbed CAVIAR - and its application to the OASIS MRI brain image database. The CAVIAR algorithm simultaneously combines a set of weak learners based on the assumption that the weight combination for the final strong hypothesis in CAVIAR depends on both the weak learners and the training data. A regularization scheme using the nearest neighbor method is imposed in the testing stage to avoid overfitting. A closed form solution to the cost function is derived for this algorithm. We use a novel feature - the histogram of the deformation field between the MRI brain scan and the atlas which captures the structural changes in the scan with respect to the atlas brain - and this allows us to automatically discriminate between various classes within OASIS [1] using CAVIAR. We empirically show that CAVIAR significantly increases the performance of the weak classifiers by showcasing the performance of our technique on OASIS. PMID:21151847

  1. Channel Access Client Toolbox for Matlab

    International Nuclear Information System (INIS)

    This paper reports on the MATLAB Channel Access (MCA) Toolbox, a MATLAB [1] interface to the EPICS Channel Access (CA) client library. We are developing the toolbox for SPEAR3 accelerator controls, but it is of general use for accelerator and experimental physics applications programming. It is packaged as a MATLAB toolbox to allow easy development of complex CA client applications entirely in MATLAB. The benefits include the ability to calculate and display parameters that use EPICS process variables as inputs, the availability of MATLAB graphics tools for user interface design, and integration with the MATLAB-based accelerator modeling software, Accelerator Toolbox [2-4]. Another purpose of this paper is to propose a feasible path to synergy between accelerator control systems and accelerator simulation codes, an idea known as the on-line accelerator model.

  2. Probabilistic Databases

    CERN Document Server

    Suciu, Dan; Koch, Christop

    2011-01-01

    Probabilistic databases are databases where the value of some attributes or the presence of some records are uncertain and known only with some probability. Applications in many areas such as information extraction, RFID and scientific data management, data cleaning, data integration, and financial risk assessment produce large volumes of uncertain data, which are best modeled and processed by a probabilistic database. This book presents the state of the art in representation formalisms and query processing techniques for probabilistic data. It starts by discussing the basic principles for rep

  3. System analysis for the 300 MW nuclear power plant design database

    International Nuclear Information System (INIS)

    The structure of the 300 MW nuclear power station design database system, developed by the Shanghai Nuclear Engineering Research and Design Institute, is discussed. The system consists of an IBM RS/6000 workstation running ORACLE 7.0 as the server, networked through a TCP/IP Ethernet with a number of PCs. The application software for the database system, at both the server and the clients, features good data integrity, reliability, network transparency, and a friendly user interface, achieved by using stored procedures, triggers and other new database technologies.

  4. Java ME Clients for XML Web Services

    Directory of Open Access Journals (Sweden)

    Paul POCATILU

    2008-01-01

    Full Text Available Using Web services in developing applications has many advantages, such as the existence of standards, multiple software platforms that support them, and many areas of usage. These advantages derive from XML and Web technologies. This paper describes the stages in the development of a Web service client for the Java ME platform and presents examples based on kSOAP and JSR 172.
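    At its core, what a kSOAP-based Java ME client assembles on the device is a SOAP 1.1 envelope. A language-neutral sketch in Python of that message construction (the method name, namespace and parameter below are placeholders, not a real service):

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(method, namespace, params):
    """Build a minimal SOAP 1.1 envelope: Envelope > Body > method call,
    with one child element per parameter."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{namespace}}}{method}")
    for name, value in params.items():
        ET.SubElement(call, name).text = str(value)
    return ET.tostring(env, encoding="unicode")

# Hypothetical operation on a hypothetical service:
xml_doc = build_soap_request("getGrade", "urn:demo", {"studentId": 42})
```

    The envelope would then be POSTed over HTTP; on Java ME, kSOAP hides this serialization behind `SoapObject`, while JSR 172 generates typed stubs from the service's WSDL.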

  5. ASP Applications to Access SQL Server 2000 Database%应用ASP访问SQL Server 2000数据库

    Institute of Scientific and Technical Information of China (English)

    徐涌; 郑瑞银

    2011-01-01

    In building Web database sites, combining ASP technology with the SQL Server database makes database management more convenient and secure. How to use ASP to access a SQL Server 2000 database is therefore key to Web database site development. This paper first introduces the ADO components that ASP uses to access a SQL Server 2000 database, and then describes the four main steps for establishing a connection between ASP and SQL Server 2000.

  6. Dogmatism within the Counselor-Client Dyad

    Science.gov (United States)

    Tosi, Donald J.

    1970-01-01

    Different levels of counselor and client dogmatism combined additively in terms of their effect on client ratings of the relationship. Client ratings of the relationship were progressively higher as more openness occurred in the dyad. (Author)

  7. A Responsive Client for Distributed Visualization

    Science.gov (United States)

    Bollig, E. F.; Jensen, P. A.; Erlebacher, G.; Yuen, D. A.; Momsen, A. R.

    2006-12-01

    As grids, web services and distributed computing continue to gain popularity in the scientific community, demand for virtual laboratories likewise increases. Today, organizations such as the Virtual Laboratory for Earth and Planetary Sciences (VLab) are dedicated to developing web-based portals that perform various simulations remotely while abstracting away details of the underlying computation. Two of the biggest challenges in portal-based computing are fast visualization and smooth interrogation without overtaxing client resources. In response to this challenge, we have expanded on our previous data storage strategy and thick-client visualization scheme [1] to develop a client-centric distributed application that uses remote visualization of large datasets and makes use of the local graphics processor for improved interactivity. Rather than waste precious client resources on visualization, a combination of 3D graphics and 2D server bitmaps is used to simulate the look and feel of local rendering. Java Web Start and Java Bindings for OpenGL enable install-on-demand functionality as well as low-level access to client graphics on all platforms. Powerful visualization services based on VTK and auto-generated by the WATT compiler [2] are accessible through a standard web API. Data is permanently stored on compute nodes while separate visualization nodes fetch data requested by clients, caching it locally to prevent unnecessary transfers. We will demonstrate application capabilities in the context of simulated charge density visualization within the VLab portal. In addition, we will address generalizations of our application to interact with a wider range of WATT services, and performance bottlenecks.
[1] Ananthuni, R., Karki, B.B., Bollig, E.F., da Silva, C.R.S., Erlebacher, G., "A Web-Based Visualization and Reposition Scheme for Scientific Data," In Press, Proceedings of the 2006 International Conference on Modeling Simulation and Visualization Methods (MSV

  8. Distributed data collection for a database of radiological image interpretations

    Science.gov (United States)

    Long, L. Rodney; Ostchega, Yechiam; Goh, Gin-Hua; Thoma, George R.

    1997-01-01

    The National Library of Medicine, in collaboration with the National Center for Health Statistics and the National Institute for Arthritis and Musculoskeletal and Skin Diseases, has built a system for collecting radiological interpretations for a large set of x-ray images acquired as part of the data gathered in the second National Health and Nutrition Examination Survey. This system is capable of delivering across the Internet 5- and 10-megabyte x-ray images to Sun workstations equipped with X Window based 2048 X 2560 image displays, for the purpose of having these images interpreted for the degree of presence of particular osteoarthritic conditions in the cervical and lumbar spines. The collected interpretations can then be stored in a database at the National Library of Medicine, under control of the Illustra DBMS. This system is a client/server database application which integrates (1) distributed server processing of client requests, (2) a customized image transmission method for faster Internet data delivery, (3) distributed client workstations with high resolution displays, image processing functions and an on-line digital atlas, and (4) relational database management of the collected data.

  9. Application of Google Maps API service for creating web map of information retrieved from CORINE land cover databases

    Directory of Open Access Journals (Sweden)

    Kilibarda Milan

    2010-01-01

    Full Text Available Today, the Google Maps API, an Ajax-based standard web service, enables users to publish interactive web maps, opening new possibilities relative to classical analogue maps. CORINE land cover databases are recognized as fundamental reference data sets for numerous spatial analyses. The theoretical and practical aspects of the Google Maps API cartographic service are considered for the case of creating a web map of changes in urban areas in Belgrade and its surroundings from 2000 to 2006, obtained from CORINE databases.

  10. Design of the system of maintenance operations occupational safety and health database application of nuclear power station

    International Nuclear Information System (INIS)

    Based on the KKS codes of building equipment in nuclear power stations, this paper introduces a method for establishing a maintenance-operation occupational safety and health database application system. Through this application system, all kinds of hazardous factors in nuclear power station maintenance operations can be summarized systematically, making it convenient for staff to learn the hazards of maintenance operations and their prevention measures, thereby achieving the management concept of 'precaution crucial, continuous improvement' advocated by OSHMS. (authors)

  11. Risk taking among diabetic clients.

    Science.gov (United States)

    Joseph, D H; Schwartz-Barcott, D; Patterson, B

    1992-01-01

    Diabetic clients must make daily decisions about their health care needs. Observational and anecdotal evidence suggests that vast differences exist between the kinds of choices diabetic clients make and the kinds of chances they are willing to take. The purpose of this investigation was to develop a diabetic risk-assessment tool. This instrument, which is based on subjective expected utility theory, measures risk-prone and risk-averse behavior. Initial findings from a pilot study of 18 women clients who are on insulin indicate that patterns of risk behavior exist in the areas of exercise, skin care, and diet. PMID:1729123
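    Subjective expected utility theory, on which the assessment tool is based, scores each choice as the probability-weighted sum of the utilities the client assigns to its possible outcomes. A toy calculation with invented probabilities and utilities (not values from the study):

```python
def expected_utility(outcomes):
    """Subjective expected utility: sum of (subjective probability *
    utility) over the outcomes the client believes are possible."""
    return sum(p * u for p, u in outcomes)

# Hypothetical choice: skipping a glucose check feels fine 90% of the
# time (utility 5) but occasionally goes badly wrong (utility -50),
# while checking is a small, certain gain (utility 3).
skip_check = expected_utility([(0.9, 5), (0.1, -50)])
do_check = expected_utility([(1.0, 3)])
risk_averse_choice = do_check > skip_check
```

    Under these numbers checking dominates, yet a client who discounts the low-probability bad outcome will behave risk-prone, which is exactly the pattern such an instrument tries to measure.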

  12. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    Science.gov (United States)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies, since they describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables its user to search boring logs rapidly and visualize them using augmented reality (AR). For the development of the application, a standard borehole database appropriate for a mobile borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of the corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions over large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole records.
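    The AR module's core task, deciding where (and whether) a borehole marker appears in the camera view, comes down to comparing the bearing from the device to the borehole with the device's compass heading. A hedged sketch of that calculation (an illustration of the general AR-overlay technique, not the application's actual code):

```python
import math

def bearing_deg(observer, target):
    """Initial great-circle bearing from observer to target, with
    (lat, lon) in degrees; 0 = north, 90 = east."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*observer, *target))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def screen_offset_deg(observer, target, device_heading_deg):
    """Signed angle between the camera heading and the borehole bearing,
    normalized to [-180, 180); the marker is drawn only when this falls
    inside the camera's horizontal field of view."""
    return (bearing_deg(observer, target) - device_heading_deg + 180) % 360 - 180
```

    A production AR module would additionally project the offset through the camera's intrinsics and account for device pitch and roll.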

  13. Client Mobile Software Design Principles for Mobile Learning Systems

    Directory of Open Access Journals (Sweden)

    Qing Tan

    2009-01-01

    Full Text Available In a client-server mobile learning system, client mobile software must run on the mobile phone to acquire, package, and send the student's interaction data via the mobile communications network to the connected mobile application server. The server receives and processes the client data in order to offer appropriate content and learning activities. To develop mobile learning systems, a number of very important issues must be addressed: mobile phones have scarce computing resources; they are heterogeneous devices running various mobile operating systems; they have limited user/device interaction capabilities and high data communications costs; and they must provide for device mobility and portability. In this paper we propose five principles for designing client mobile learning software. A location-based adaptive mobile learning system is presented as a proof of concept to demonstrate the applicability of these design principles.

  14. The European power plant infrastructure-Presentation of the Chalmers energy infrastructure database with applications

    International Nuclear Information System (INIS)

    This paper presents a newly established database of the European power plant infrastructure (power plants, fuel infrastructure, fuel resources and CO2 storage options) for the EU25 member states (MS) and applies the database in a general discussion of the European power plant and natural gas infrastructure, as well as in a simple simulation analysis of British and German power generation up to the year 2050 with respect to the phase-out of existing generation capacity, fuel mix and fuel dependency. The results are discussed with respect to the age structure of the current production plants, CO2 emissions, natural gas dependency, and CO2 capture and storage (CCS) under stringent CO2 emission constraints. The analysis of the information from the power plant database, which includes planned projects, shows large variations in power plant infrastructure between the MS and a clear shift to natural gas-fuelled power plants during the last decade. The data indicate that this shift may continue in the short term up to 2010, since the majority of planned plants are natural gas fired. The gas plants are, however, geographically concentrated in southern and northwest Europe. The data also show considerable activity in the upstream gas sector to accommodate the ongoing shift to gas, with pipelines, liquefaction plants and regasification terminals being built and gas fields being prepared for production. At the same time, utilities are integrating upwards in the fuel chain in order to secure supply, while oil and gas companies are moving downwards in the fuel chain to secure access to markets. However, it is not yet possible to state whether the ongoing shift to natural gas will continue in the medium term, i.e. after 2010, since this will depend on a number of factors as specified below. Recently there have also been announcements of the construction of a number of new coal plants. The results of the simulations for the German and British power sectors show that combination of a relatively low

  15. Identifying Social Impacts in Product Supply Chains: Overview and Application of the Social Hotspot Database

    Directory of Open Access Journals (Sweden)

    Gregory Norris

    2012-08-01

    Full Text Available One emerging tool to measure the social-related impacts in supply chains is Social Life Cycle Assessment (S-LCA), a derivative of the well-established environmental LCA technique. LCA has recently started to gain popularity among large corporations and initiatives, such as The Sustainability Consortium or the Sustainable Apparel Coalition; both have made the technique a cornerstone of their applied-research programs. The Social Hotspots Database (SHDB) is an overarching, global database that eases the data collection burden in S-LCA studies. Proposed “hotspots” are production activities or unit processes (also defined as country-specific sectors) in the supply chain that may be at risk for social issues to be present. The SHDB enables efficient application of S-LCA by allowing users to prioritize production activities for which site-specific data collection is most desirable. Data for three criteria are used to inform prioritization: (1) labor intensity in worker hours per unit process; (2) risk for, or opportunity to affect, relevant social themes or sub-categories related to Human Rights, Labor Rights and Decent Work, Governance and Access to Community Services; and (3) gravity of a social issue. The Worker Hours Model was developed using a global input/output economic model and wage rate data. Nearly 200 reputable sources of statistical data have been used to develop 20 Social Theme Tables by country and sector. This paper presents an overview of the SHDB development and features, as well as results from a pilot study conducted on strawberry yogurt. This study, one of seven Social Scoping Assessments mandated by The Sustainability Consortium, identifies the potential social hotspots existing in the supply chain of strawberry yogurt. With this knowledge, companies that manufacture or sell yogurt can refine their data collection efforts in order to put their social responsibility performance in perspective and effectively set up programs and

  16. Fine-grained policy control in U.S. Army Research Laboratory (ARL) multimodal signatures database

    Science.gov (United States)

    Bennett, Kelly; Grueneberg, Keith; Wood, David; Calo, Seraphin

    2014-06-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) consists of a number of colocated relational databases representing a collection of data from various sensors. Role-based access to this data is granted to external organizations such as DoD contractors and other government agencies through a client Web portal. In the current MMSDB system, access control is only at the database and firewall level. In order to offer finer grained security, changes to existing user profile schemas and authentication mechanisms are usually needed. In this paper, we describe a software middleware architecture and implementation that allows fine-grained access control to the MMSDB at a dataset, table, and row level. Result sets from MMSDB queries issued in the client portal are filtered with the use of a policy enforcement proxy, with minimal changes to the existing client software and database. Before resulting data is returned to the client, policies are evaluated to determine if the user or role is authorized to access the data. Policies can be authored to filter data at the row, table or column level of a result set. The system uses various technologies developed in the International Technology Alliance in Network and Information Science (ITA) for policy-controlled information sharing and dissemination. Use of the Policy Management Library provides a mechanism for the management and evaluation of policies to support finer grained access to the data in the MMSDB system. The GaianDB is a policy-enabled, federated database that acts as a proxy between the client application and the MMSDB system.
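
    The table-, row- and column-level filtering described above can be sketched in a few lines. This is a hypothetical illustration of a policy enforcement proxy; the role names, policy structure and `enforce` helper are invented and are not the ARL/ITA Policy Management Library API:

```python
# Sketch of a policy enforcement proxy that filters query result sets
# before they reach the client. Policy model (role -> allowed tables,
# allowed columns, row predicate) is invented for illustration.

POLICIES = {
    "contractor": {
        "allowed_tables": {"acoustic", "seismic"},
        "allowed_columns": {"id", "sensor", "timestamp"},
        "row_filter": lambda row: row.get("classification") == "unclassified",
    },
}

def enforce(role: str, table: str, rows: list) -> list:
    """Apply table-, row- and column-level policy to a result set."""
    policy = POLICIES.get(role)
    if policy is None or table not in policy["allowed_tables"]:
        return []  # table-level denial
    kept = [r for r in rows if policy["row_filter"](r)]  # row level
    return [{k: v for k, v in r.items() if k in policy["allowed_columns"]}
            for r in kept]  # column level

rows = [
    {"id": 1, "sensor": "mic-3", "timestamp": 100, "classification": "unclassified"},
    {"id": 2, "sensor": "mic-4", "timestamp": 101, "classification": "restricted"},
]
print(enforce("contractor", "acoustic", rows))
# → [{'id': 1, 'sensor': 'mic-3', 'timestamp': 100}]
```

    The proxy sits between client and database, so neither side needs schema changes, which mirrors the "minimal changes to the existing client software and database" goal stated in the abstract.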

  17. Failure rates in Barsebaeck-1 reactor coolant pressure boundary piping. An application of a piping failure database

    Energy Technology Data Exchange (ETDEWEB)

    Lydell, B. [RSA Technologies, Vista, CA (United States)

    1999-05-01

    This report documents an application of a piping failure database to estimate the frequency of leak and rupture in reactor coolant pressure boundary piping. The study used Barsebaeck-1 as reference plant. The study tried two different approaches to piping failure rate estimation: 1) PSA-style, simple estimation using Bayesian statistics, and 2) fitting of statistical distributions to failure data. A large, validated database on piping failures (like the SKI-PIPE database) supports both approaches. In addition to documenting leak and rupture frequencies, the SKI report describes the use of piping failure data to estimate the frequency of medium and large loss of coolant accidents (LOCAs). This application study was co-sponsored by Barsebaeck Kraft AB and SKI Research. (41 refs, figs, tabs)
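
    The first, PSA-style approach can be illustrated with the standard gamma-Poisson conjugate update: a gamma prior Gamma(a, b) on the failure rate, updated with k observed failures over T reactor-years of exposure, gives the posterior Gamma(a + k, b + T). The prior parameters and observed counts below are invented for illustration and are not taken from the SKI-PIPE database:

```python
# PSA-style Bayesian point estimate of a piping failure frequency.
# Gamma(a, b) prior + Poisson likelihood (k failures in T reactor-years)
# -> Gamma(a + k, b + T) posterior; the posterior mean is (a + k)/(b + T).

def gamma_poisson_update(a, b, k, T):
    """Return posterior (shape, rate) and the posterior mean failure rate."""
    a_post, b_post = a + k, b + T
    return a_post, b_post, a_post / b_post

# Jeffreys-like vague prior (a=0.5, b=0) updated with 2 leaks in 400 ry
a_post, b_post, rate = gamma_poisson_update(0.5, 0.0, 2, 400.0)
print(f"posterior mean: {rate:.2e} failures per reactor-year")  # → 6.25e-03
```

    The second approach, fitting a statistical distribution to the pooled failure data, would replace this closed-form update with a maximum-likelihood or least-squares fit over the database records.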

  19. GenExp: an interactive web-based genomic DAS client with client-side data rendering.

    Directory of Open Access Journals (Sweden)

    Bernat Gel Moreno

    Full Text Available BACKGROUND: The Distributed Annotation System (DAS) offers a standard protocol for sharing and integrating annotations on biological sequences. There are more than 1000 DAS sources available and the number is steadily increasing. Clients are an essential part of the DAS system and integrate data from several independent sources in order to create a useful representation to the user. While web-based DAS clients exist, most of them do not have direct interaction capabilities such as dragging and zooming with the mouse. RESULTS: Here we present GenExp, a web-based and fully interactive visual DAS client. GenExp is a genome-oriented DAS client capable of creating informative representations of genomic data, zooming out from base level to complete chromosomes. It proposes a novel approach to genomic data rendering and uses the latest HTML5 web technologies to create the data representation inside the client browser. Thanks to client-side rendering, most position changes do not need a network request to the server, so responses to zooming and panning are almost immediate. In GenExp it is possible to explore the genome intuitively, moving it with the mouse just like in geographical map applications. Additionally, in GenExp it is possible to have more than one data viewer at the same time and to save the current state of the application to revisit it later on. CONCLUSIONS: GenExp is a new interactive web-based client for DAS that addresses some of the shortcomings of the existing clients. It uses client-side data rendering techniques, resulting in easier genome browsing and exploration. GenExp is open source under the GPL license and it is freely available at http://gralggen.lsi.upc.edu/recerca/genexp.
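
    The client-side rendering idea can be sketched independently of the browser: once the viewport (start base and bases-per-pixel scale) lives in the client, panning and zooming reduce to coordinate arithmetic with no server round trip. The `Viewport` class below is a hypothetical illustration, not GenExp code:

```python
# Map-style pan/zoom over genomic coordinates, kept entirely client-side.
# Only when the visible range leaves the cached data would a DAS request
# to the server be needed.

class Viewport:
    def __init__(self, start, bases_per_pixel, width_px=800):
        self.start = start          # leftmost visible base
        self.bpp = bases_per_pixel  # zoom level: bases per screen pixel
        self.width = width_px

    def to_pixel(self, base):
        return (base - self.start) / self.bpp

    def pan(self, dx_px):
        """Drag with the mouse: shift the window by dx_px pixels."""
        self.start += dx_px * self.bpp

    def zoom(self, factor, anchor_px):
        """Zoom about the cursor so the base under it stays put."""
        anchor_base = self.start + anchor_px * self.bpp
        self.bpp *= factor
        self.start = anchor_base - anchor_px * self.bpp

v = Viewport(start=1_000_000, bases_per_pixel=100)
v.zoom(0.5, anchor_px=400)       # zoom in 2x around the screen centre
print(v.to_pixel(1_040_000))     # the anchored base stays at pixel 400 → 400.0
```

    In GenExp the same arithmetic drives an HTML5 canvas; the sketch shows why most interactions need no network request.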

  20. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, installation of the designed database into the database system Oracle Database 10g Express Edition, and demonstration of administration tasks in this database system. The database design was verified by means of a purpose-built access application.

  1. THE USE OF CORBA CONNECTIONS WITH MIDAS MULTI-TIER APPLICATION PROGRAMMING IN A HOTEL RESERVATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Irwan Kristanto Julistiono

    2001-01-01

    Full Text Available This paper presents a multi-tier hotel reservation system built with CORBA technology, accessible both from a web browser and from a dedicated client program. Client software connects to the application server over a CORBA connection, and the client and application server reach SQL Server 7.0 via ODBC. There are two types of client: a web client and a Delphi client. The web browser client application is built with Delphi's ActiveX Form technology, in which the application is developed much like an ordinary form, though it integrates poorly with the HTML language. A multi-tier application using CORBA generally has the advantage that, besides being extensible, the system is designed with multiple database servers, multiple middle servers and multiple clients, through which the whole system can be integrated. Its weakness is the complexity of CORBA itself, which makes the system difficult to understand, while the multi-tier design requires a particular procedure to determine which server is chosen by the client. Abstract in Bahasa Indonesia, translated: In this paper a multi-tier system using CORBA technology is built for a hotel reservation program, accessed both with a web browser and with a client program. The software used as database server is SQL Server 7.0. The Delphi client program is connected through a CORBA connection to the application server, and through ODBC the application server is connected to SQL Server 7.0. There are two client applications: one using the local network and one using the global network/web browser. For the web browser client application, Delphi's ActiveX Form technology is used, in which the system is built like an ordinary form, except that it has a shortcoming in its integration with the HTML language.
    The use of a multi-tier system with CORBA generally has the advantage that, besides allowing further development, the system is designed with multiple database servers, multiple middle servers and multiple clients, through which

  2. Development of DQM software infrastructure: storing and reading the monitoring information from the histograms filled by online client applications into relational tables.

    CERN Document Server

    Andrzejczak, Adam

    2015-01-01

    In CMS, the online DQM stores monitoring information from several heterogeneous data sources into histograms, which are later sent to the DQMGUI for visualization. A system for handling monitoring data is crucial for operating the detector and recognizing whether or not it is undergoing failures; in particular, relational databases are currently the best option for hosting such data. In this context a new DQM plugin, DQMDatabaseWriter, was developed; it provides an interface that other DQM modules can use to drop desired data into the relational database. In addition, a Python script makes it possible to read and visualize already saved records.
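
    The plugin's core idea, flattening a monitoring histogram into relational rows that can be queried and read back later, can be sketched as follows. The schema and function names are invented for illustration, and the sketch uses an in-memory SQLite database rather than the production database:

```python
# Store histogram bin contents as (run, histogram, bin, content) rows,
# then read them back ordered by bin index.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE histogram_bins (
    run INTEGER, histo TEXT, bin INTEGER, content REAL,
    PRIMARY KEY (run, histo, bin))""")

def write_histogram(run, name, bin_contents):
    conn.executemany(
        "INSERT INTO histogram_bins VALUES (?, ?, ?, ?)",
        [(run, name, i, c) for i, c in enumerate(bin_contents)])

def read_histogram(run, name):
    cur = conn.execute(
        "SELECT content FROM histogram_bins "
        "WHERE run = ? AND histo = ? ORDER BY bin", (run, name))
    return [row[0] for row in cur]

write_histogram(273158, "pixel_occupancy", [12.0, 48.5, 50.1, 9.0])
print(read_histogram(273158, "pixel_occupancy"))  # → [12.0, 48.5, 50.1, 9.0]
```

    A relational layout like this is what makes the "read and visualize already saved records" step a plain SQL query.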

  3. Web-Based Satellite Products Database for Meteorological and Climate Applications

    Science.gov (United States)

    Phan, Dung; Spangenberg, Douglas A.; Palikonda, Rabindra; Khaiyer, Mandana M.; Nordeen, Michele L.; Nguyen, Louis; Minnis, Patrick

    2004-01-01

    The need for ready access to satellite data and associated physical parameters such as cloud properties has been steadily growing. Air traffic management, weather forecasters, energy producers, and weather and climate researchers, among others, can utilize more satellite information than in the past. Thus, it is essential that such data are made available in near real-time and as archival products in an easy-access and user-friendly environment. A host of Internet web sites currently provide a variety of satellite products for various applications. Each site has a unique contribution with appeal to a particular segment of the public and scientific community. This is no less true for the NASA Langley Clouds and Radiation (NLCR) website (http://www-pm.larc.nasa.gov), which has been evolving over the past 10 years to support a variety of research projects. This website was originally developed to display cloud products derived from the Geostationary Operational Environmental Satellite (GOES) over the Southern Great Plains for the Atmospheric Radiation Measurement (ARM) Program. It has evolved into a site providing a comprehensive database of near real-time and historical satellite products used for meteorological, aviation, and climate studies. To encourage the user community to take advantage of the site, this paper summarizes the various products and projects supported by the website and discusses future options for new datasets.

  4. Resistance, Reactance, and the Difficult Client.

    Science.gov (United States)

    Dowd, E. Thomas; Sanders, Daniel

    1994-01-01

    Describes effect of client resistance and reactance in counseling and methods for assessing these phenomena. Conceptualizes client symptoms as ego-syntonic, where symptom is consonant with client's self-image, or ego-dystonic, where it is not. Uses concepts in deriving counseling strategies for working with difficult clients according to model…

  5. Database of open-framework aluminophosphate syntheses: introduction and application (Ⅰ)

    Institute of Scientific and Technical Information of China (English)

    YAN Yan; LI JiYang; QI Miao; ZHANG Xiao; YU JiHong; XU RuRen

    2009-01-01

    The database of open-framework aluminophosphate (AlPO) syntheses has been established, which includes about 1600 synthetic records. Data analysis has been done on the basis of the framework composition, structure dimension, pore ring, and organic template. This database will serve as useful guidance for the rational synthesis of microporous functional materials.

  7. The ARAC client system: network-based access to ARAC

    International Nuclear Information System (INIS)

    The ARAC Client System allows users (such as emergency managers and first responders) with commonly available desktop and laptop computers to utilize the central ARAC system over the Internet or any other communications link using Internet protocols. Providing cost-effective fast access to the central ARAC system greatly expands the availability of the ARAC capability. The ARAC Client system consists of (1) local client applications running on the remote user's computer, and (2) "site servers" that provide secure access to selected central ARAC system capabilities and run on a scalable number of dedicated workstations residing at the central facility. The remote client applications allow users to describe a real or potential chem-bio event, electronically send this information to the central ARAC system, which performs model calculations, and quickly receive and visualize the resulting graphical products. The site servers will support simultaneous access to ARAC capabilities by multiple users. The ARAC Client system is based on object-oriented client/server and distributed computing technologies using CORBA and Java, and consists of a large number of interacting components.

  8. Main-memory database VS Traditional database

    OpenAIRE

    Rehn, Marcus; Sunesson, Emil

    2013-01-01

    There has been a surge of new databases in recent years. Applications today create a higher demand on database performance than ever before. Main-memory databases have come into the market quite recently and they are just now catching a lot of interest from many different directions. Main-memory databases are a type of database that stores all of its data in the primary memory. They provide a big increase in performance to a lot of different applications. This work evaluates the difference in...

  9. Database and a worldwide web-application server as a tool for remote participation at TEXTOR-94

    International Nuclear Information System (INIS)

    This paper describes the use of a central database for storing the TEXTOR-94 machine parameters as defined by the engineer in charge. Furthermore, the possibility of writing an online experiment logbook is given to the physicist in charge. An online display of the comments for a discharge, given by the physicist, is installed. In addition, the plasma quantities necessary for the characterization of a discharge are evaluated and stored within the database. An overview picture is created from the stored data with important time traces such as plasma current, density and temperature, and stored as a Graphics Interchange Format (GIF) file in the database. With a central World Wide Web application server and a commercial web browser, the stored information can be accessed from different locations, especially from the home bases of the three Trilateral Euregio Cluster partners.

  10. A fusion algorithm for joins based on collections in Odra-Object Database for Rapid Application development

    Directory of Open Access Journals (Sweden)

    Laika Satish

    2011-07-01

    Full Text Available In this paper we present the functionality of a database programming platform currently under development called ODRA (Object Database for Rapid Application development), which is built entirely on object-oriented principles. Its database programming language is SBQL (Stack-Based Query Language). We discuss some concepts in ODRA, e.g. how ODRA works, how the ODRA runtime environment operates, and the interoperability of ODRA with .NET and Java, together with a view of ODRA working with web services and XML. Query optimization is currently among the stages under development in ODRA. We therefore present the prior work done in ODRA related to query optimization, and we also present a new fusion algorithm showing how ODRA can handle joins based on collections such as sets, lists, and arrays for query optimization.
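
    A join based on collections can be illustrated outside SBQL: when one side of the join holds a collection-valued attribute, the join unnests that collection, and a hash table built on the other side avoids the naive nested loop. The data model and names below are hypothetical, not ODRA code:

```python
# Hash join over a collection-valued attribute: departments hold a list
# of employee ids; the join unnests the list and probes a hash index.

departments = [
    {"name": "R&D", "emp_ids": [1, 3]},
    {"name": "Sales", "emp_ids": [2]},
]
employees = [{"id": 1, "name": "Ana"}, {"id": 2, "name": "Bo"},
             {"id": 3, "name": "Cy"}]

def join_on_collection(depts, emps):
    by_id = {e["id"]: e for e in emps}       # build side: hash on id
    return [(d["name"], by_id[i]["name"])    # probe side: unnest the list
            for d in depts for i in d["emp_ids"] if i in by_id]

print(join_on_collection(departments, employees))
# → [('R&D', 'Ana'), ('R&D', 'Cy'), ('Sales', 'Bo')]
```

    An optimizer choosing between this hash strategy and a nested loop, depending on the collection kind (set, list, or array) and its cardinality, is the sort of decision a join-fusion algorithm has to make.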

  11. Assessment of an enhanced ResultSet component for accessing relational databases

    OpenAIRE

    Pereira, Óscar M.; Aguiar, Rui L.; Santos, Maribel Yasmina

    2010-01-01

    Call Level Interfaces (CLI) provide services aimed at easing the integration of database components and components from client applications. CLI support native SQL statements, in this way keeping the expressiveness and performance of SQL. Thus, they cannot be discarded as a valid option whenever SQL expressiveness and SQL performance are considered key requirements. Despite the aforementioned performance advantage, CLI do not comprise other important performance features, as c...

  12. What Makes Underwriting and Non-Underwriting Clients of Brokerage Firms Receive Different Recommendations? An Application of Uplift Random Forest Model

    Directory of Open Access Journals (Sweden)

    Shaowen Hua

    2016-04-01

    Full Text Available I explore company characteristics which explain the difference in analysts’ recommendations for companies that were underwritten (affiliated) versus non-underwritten (unaffiliated) by analysts’ brokerage firms. Prior literature documents that analysts issue more optimistic recommendations to underwriting clients of analysts’ brokerage employers. Extant research uses regression models to find general associations between recommendations and financial qualities of companies, with or without an underwriting relationship. However, regression models cannot identify the qualities that cause the most difference in recommendations between affiliated and unaffiliated companies. I adopt the uplift random forest model, a popular technique in recent marketing and healthcare research, to identify the type of companies that earn analysts’ favor. I find that companies with stable earnings in the past, higher book-to-market ratio, smaller size, worsened earnings, and lower forward PE ratio are likely to receive higher recommendations if they are affiliated with analysts than if they are unaffiliated with analysts. With the uplift random forest model, I show that analysts pay more attention to price-related than earnings-related metrics when they value affiliated versus unaffiliated companies. This paper contributes to the literature by introducing an effective predictive model to capital market research and shedding additional light on the usefulness of analysts’ reports.
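
    The uplift idea itself is simple to illustrate: estimate the outcome separately for the "treated" group (underwriting affiliation) and the control group, then rank segments by the difference. A real uplift random forest splits on this difference directly rather than averaging within fixed segments; the data and segment names below are invented:

```python
# Two-group uplift estimate per segment: mean recommendation score for
# affiliated observations minus the mean for unaffiliated ones.
from collections import defaultdict
from statistics import mean

# (segment, affiliated?, recommendation score on a 1-5 scale)
obs = [("small-cap", True, 4.6), ("small-cap", False, 3.1),
       ("small-cap", True, 4.4), ("large-cap", True, 3.8),
       ("large-cap", False, 3.7), ("small-cap", False, 3.3)]

def uplift_by_segment(rows):
    groups = defaultdict(lambda: {True: [], False: []})
    for seg, treated, score in rows:
        groups[seg][treated].append(score)
    return {seg: mean(g[True]) - mean(g[False]) for seg, g in groups.items()}

print(uplift_by_segment(obs))  # small-cap shows a far larger uplift
```

    Here the "small-cap" segment shows an uplift of about 1.3 score points versus about 0.1 for "large-cap", which is the kind of contrast the paper's model surfaces across many company characteristics at once.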

  13. Assessment Database (ADB)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Assessment Database (ADB) is a relational database application for tracking water quality assessment data, including use attainment, and causes and sources of...

  14. Development of a Single-Crystal Mineral Elasticity Database and Applications to Crustal and Upper Mantle Mineralogy

    Science.gov (United States)

    Duffy, T. S.

    2013-12-01

    The single-crystal elastic stiffness tensor fully characterizes the anisotropic elastic response of minerals. An understanding of how such elastic properties vary with pressure, temperature, structure, and composition is needed to interpret seismic data for the Earth. Additionally, elastic moduli are important for understanding many solid-state phenomena including mechanical stability, interatomic interactions, material strength, compressibility, and phase transition mechanisms. A database of single-crystal elastic properties of minerals and related phases is being assembled. This dataset currently incorporates over 400 sets of elastic constant measurements on more than 270 separate phases. In addition to compiling the individual elastic stiffnesses, the database also allows calculation of a variety of additional properties including anisotropy factors, bulk and linear compressibilities, and stability criteria, as well as evaluation of aggregate properties including bounds and averages of bulk, shear, and Young's modulus, Poisson's ratio and elastic wave speeds. Extensions of the database to include high pressure and high temperature data as well as theoretical calculations are being planned. Examples of application of this database to geophysical problems will be highlighted. Specific applications to be discussed include: 1) variation of elastic anisotropy with pressure for mantle and crustal minerals; 2) evaluation of elasticity data for pyroxenes revealing major structural and chemical controls on elasticity as well as remaining ambiguities and uncertainties.
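
    As an example of the aggregate properties mentioned, the Voigt bounds on bulk and shear modulus follow directly from the 6x6 stiffness matrix in Voigt notation. The sketch below uses roughly olivine-like stiffnesses for illustration only; the values are approximate:

```python
# Voigt (upper) bounds on aggregate bulk and shear modulus from the
# single-crystal stiffnesses Cij (Voigt notation, GPa):
#   K_V = [C11 + C22 + C33 + 2(C12 + C13 + C23)] / 9
#   G_V = [C11 + C22 + C33 - (C12 + C13 + C23) + 3(C44 + C55 + C66)] / 15

def voigt_moduli(C):
    """C is a 6x6 stiffness matrix in Voigt notation (GPa)."""
    diag = C[0][0] + C[1][1] + C[2][2]
    off = C[0][1] + C[0][2] + C[1][2]
    shear = C[3][3] + C[4][4] + C[5][5]
    K = (diag + 2 * off) / 9.0
    G = (diag - off + 3 * shear) / 15.0
    return K, G

# Approximately olivine-like orthorhombic stiffnesses (GPa)
C = [[0.0] * 6 for _ in range(6)]
C[0][0], C[1][1], C[2][2] = 320.0, 197.0, 234.0
C[0][1], C[0][2], C[1][2] = 69.0, 71.0, 75.0
C[3][3], C[4][4], C[5][5] = 64.0, 77.0, 79.0

K, G = voigt_moduli(C)
print(f"K_Voigt = {K:.1f} GPa, G_Voigt = {G:.1f} GPa")
```

    Pairing these with the corresponding Reuss lower bounds gives the Voigt-Reuss-Hill averages and isotropic wave speeds commonly quoted for aggregates.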

  15. Research on Security Issues in Oracle Database Applications

    Institute of Scientific and Technical Information of China (English)

    姚树春

    2014-01-01

    The Oracle database is the most direct way for enterprises to share resources, and its security has become one of the key factors that enterprises consider today. Although database systems bring many conveniences in handling data, they also introduce numerous security risks into the system. To ensure that the database management system is secure and reliable, and to guarantee security at the application level, this paper studies measures for dealing with security issues in Oracle database applications.

  16. The South African National Vegetation Database: History, development, applications, problems and future

    Directory of Open Access Journals (Sweden)

    Leslie W. Powrie

    2012-01-01

    Full Text Available Southern Africa has been recognised as one of the most interesting and important areas of the world from an ecological and evolutionary point of view. The establishment and development of the National Vegetation Database (NVD) of South Africa enabled South Africa to contribute to environmental planning and conservation management in this floristically unique region. In this paper, we aim to provide an update on the development of the NVD since it was last described, near its inception, more than a decade ago. The NVD was developed using the Turboveg software environment, and currently comprises 46 697 vegetation plots (relevés) sharing 11 690 plant taxa and containing 968 943 species occurrence records. The NVD was primarily founded to serve vegetation classification and mapping goals but soon became recognised as an important tool in conservation assessment and target setting. The NVD has directly helped produce the National Vegetation Map, National Forest Type Classification, South African National Biodiversity Assessment and Forest Type Conservation Assessment. With further development of the NVD and more consistent handling of the legacy data (old data sets), the current limitations regarding certain types of application of the data should be significantly reduced. However, the use of the current NVD in multidisciplinary research has certainly not been fully explored. With the availability of new pools of well-trained vegetation surveyors, the NVD will continue to be purpose driven and serve the needs of biological survey in pursuit of sustainable use of the vegetation and flora resources of the southern African subcontinent.

  17. Principles of a New TLD Database System

    International Nuclear Information System (INIS)

    The personnel exposures to ionizing radiation at the NRC-Negev are evaluated at the Dosimetry Laboratory. The laboratory operates an independent database system based on a combination of the "Open Access" and "Paradox" data processors. It has many limitations and is not as efficient as modern systems are. This, and the Y2K constraint concerning the "Open Access" data processor, originated the project of designing and operating a new and modern database system for the NRCN dosimetry laboratory. The system was designed to handle the output files from TLD-Readers, analyze the readings, calculate exposures and manage the dosimetry files, which contain the detailed penetrating and skin doses. In addition, the system manages the attribution of TLD cards and exposures to employees. The system provides a fully automated database containing all the machine readings and calculated exposures of workers, with minimal interference by the system operators. The system reads the TLD-Reader files automatically, calculates the exposures for each TLD card and assigns these exposures to the employee who carries it. Records are produced which are sorted according to date, and they form the personnel dosimetry database. The system was written for a single "Harshaw 6600 TLD-Reader" but can be easily adjusted to work with different kinds of TLD-Readers. There is no limit to the capacity of worker and card records, and it can be adjusted to different kinds of calculation procedures. The system is a client/server application where the server side is written with "Microsoft SQL Server" and "Microsoft Access 97" is on the client side.
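
    The automated flow described, reading card outputs, converting readings to dose, and crediting the employee who carries each card, can be sketched as follows. The file format, calibration factor and identifiers are all invented for illustration and do not reflect the actual Harshaw 6600 output:

```python
# Parse a hypothetical TLD-Reader output (one "card_id,counts" line per
# reading), convert counts to dose, and credit the employee carrying
# that card, accumulating across multiple readings.
from collections import defaultdict

CARD_TO_EMPLOYEE = {"C-1017": "worker-42", "C-2210": "worker-7"}
CAL_FACTOR = 0.012  # mSv per reader count; hypothetical calibration

def process_reader_file(lines):
    doses = defaultdict(float)
    for line in lines:
        card, counts = line.strip().split(",")
        employee = CARD_TO_EMPLOYEE.get(card)
        if employee:                      # ignore unassigned cards
            doses[employee] += int(counts) * CAL_FACTOR
    return dict(doses)

readings = ["C-1017,150", "C-2210,90", "C-1017,50"]
result = process_reader_file(readings)
print(result)
```

    The per-employee records produced this way, sorted by date, are what would accumulate into the personnel dosimetry database.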

  18. Bringing the client back in

    DEFF Research Database (Denmark)

    Danneris, Sophie; Nielsen, Mathias Herup

    2016-01-01

    Categorising the ‘job readiness’ of the unemployed client is a task of utmost importance for active labour market policies. Scholarly attention on the topic has mostly focused on either questions of political legitimacy or questions of how categories are practically negotiated in meetings between...... welfare system and client. This paper suggests a comparative design in which the government rhetoric of job readiness is contrasted with findings from a qualitative longitudinal study into the lived experiences of recent welfare reforms in Denmark. Thus, our study set out to explore how job readiness is......-known poststructuralist risk of reducing welfare clients to mere formable objects. Furthermore, the analysis presents a critical view on current categorisation practices, as it strongly and in great detail exemplifies what current government rhetoric fails to address....

  19. An interactive end-user software application for a deep-sea photographic database

    Digital Repository Service at National Institute of Oceanography (India)

    Jaisankar, S.; Sharma, R.

    A photographic database is created for cataloguing data from underwater deep-tow photographic surveys conducted in the Central Indian Ocean Basin. This includes digitizing, encoding and merging different types of data and information obtained from...

  20. Cloud Databases: A Paradigm Shift in Databases

    Directory of Open Access Journals (Sweden)

    Indu Arora

    2012-07-01

    Full Text Available Relational databases ruled the Information Technology (IT) industry for almost 40 years. But the last few years have seen a sea change in the way IT is being used and viewed. Stand-alone applications have been replaced with web-based applications, dedicated servers with multiple distributed servers, and dedicated storage with network storage. Cloud computing has become a reality due to its lower cost, scalability and pay-as-you-go model. It is one of the biggest changes in IT after the rise of the World Wide Web. Cloud databases such as BigTable, Sherpa and SimpleDB are becoming popular. They address the limitations of existing relational databases related to scalability, ease of use and dynamic provisioning. Cloud databases are mainly used for data-intensive applications such as data warehousing, data mining and business intelligence. These applications are read-intensive, scalable and elastic in nature. Transactional data management applications such as banking, airline reservation, online e-commerce and supply chain management applications are write-intensive. Databases supporting such applications require ACID (Atomicity, Consistency, Isolation and Durability) properties, but these databases are difficult to deploy in the cloud. The goal of this paper is to review the state of the art in cloud databases and various architectures. It further assesses the challenges of developing cloud databases that meet user requirements and discusses popularly used cloud databases.

  1. Perspectives on a Big Data Application: What Database Engineers and IT Students Need to Know

    OpenAIRE

    E. Erturk; Jyoti, K.

    2015-01-01

    Cloud Computing and Big Data are important and related current trends in the world of information technology. They will have significant impact on the curricula of computer engineering and information systems at universities and higher education institutions. Learning about big data is useful for both working database professionals and students, in accordance with the increase in jobs requiring these skills. It is also important to address a broad gamut of database engineering skills, i.e. da...

  2. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. Discussions focus on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, the nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy

  3. Dealing Bandwidth to Mobile Clients Using Games

    Science.gov (United States)

    Sofokleous, Anastasis A.; Angelides, Marios C.

    This chapter exploits a gaming approach to bandwidth sharing in a network of non-cooperative clients whose aim is to satisfy their selfish objectives and be served in the shortest time, and who share limited knowledge of one another. The chapter models this problem as a game in which players consume the bandwidth of a video streaming server. The rest of this chapter is organized in four sections: the following section presents resource allocation taxonomies; after that comes a section on game theory, from which our approach is sourced, and its application to resource allocation. The penultimate section presents our gaming approach to resource allocation. The final section concludes.

  4. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    International Nuclear Information System (INIS)

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. 
More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  5. Key-linked on-line databases for clinical research.

    Science.gov (United States)

    Müller, Thomas H

    2012-01-01

    Separating patient identification data from clinical data and/or information about biomaterial samples is an effective data protection measure, especially in clinical research employing "on-line", i.e., web-based, data capture. In this paper, we show that this specialised technique can be generalised into a network architecture of interconnected on-line databases potentially serving a variety of purposes. The basic idea of this approach consists of maintaining logical links, i.e., common record keys, between corresponding data structures in pairs of databases while keeping the actual key values hidden from clients. For client systems, simultaneous access to corresponding records is mediated by temporary access tokens. At the relational level, these links are represented by arbitrary unique record keys common to both databases. This architecture allows for integration of related data in different databases without replicating or permanently sharing this data in one place. Each participating on-line database can determine the degree of integration by specifying linkage keys only for those data structures that may be logically connected to other data. Logical links can be designed for specific use cases. In addition, each database controls user access by enforcing its own authorisation scheme. Another advantage is that individual database owners retain considerable leeway in adapting to changing local requirements without compromising the integration into the network. Beyond protecting individual subject identification data, this architecture permits splitting a cooperatively used data pool to achieve many kinds of objectives. Application examples could be clinical registries needing subject contact information for follow-up, biomaterial banks with or without genetic information, and automatic or assisted integration of data from electronic medical records into research data. PMID:22874246
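
    The token-mediated linkage described above can be sketched in a few lines. This is a hedged illustration, not the paper's implementation: the two store dictionaries, the `register`/`issue_token`/`fetch_clinical` helpers and the TTL are all invented for the example.

```python
import secrets
import time

identity_db = {}   # hidden linkage key -> subject identification data
clinical_db = {}   # hidden linkage key -> clinical data
tokens = {}        # temporary token -> (linkage key, expiry time)

def register(identity, clinical):
    """Store corresponding records in both databases under one hidden key."""
    key = secrets.token_hex(16)          # common record key, never exposed to clients
    identity_db[key] = identity
    clinical_db[key] = clinical
    return key

def issue_token(key, ttl=60):
    """Hand a client a short-lived handle instead of the linkage key itself."""
    token = secrets.token_urlsafe(16)
    tokens[token] = (key, time.time() + ttl)
    return token

def fetch_clinical(token):
    """Resolve a token to the linked clinical record; the key stays hidden."""
    key, expiry = tokens[token]
    if time.time() > expiry:
        raise PermissionError("access token expired")
    return clinical_db[key]
```

    A client holding only the token can reach the linked clinical record for as long as the token is valid, while the common record key never leaves the server side.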

  6. The abandoned surface mining sites in the Czech Republic: mapping and creating a database with a GIS web application

    OpenAIRE

    Pokorný, Richard; Peterková, Marie Tereza

    2016-01-01

    Based on the vectorization of the 55-volume book series the Quarry Inventories of the Czechoslovak Republic/Czechoslovak Socialist Republic, published in the years 1932–1961, a new comprehensive database was built comprising 9958 surface mining sites of raw materials, which were active in the first half of the 20th century. The mapped area covers 40.9 % of the territory of the Czech Republic. For the purposes of visualization, a map application, the Quarry Inventories Online...

  7. A large channel count multi client data acquisition system for superconducting magnet system of SST-1

    International Nuclear Information System (INIS)

    The magnet system of the Steady-state Superconducting Tokamak-1 at the Institute for Plasma Research, Gandhinagar, India, consists of sixteen Toroidal field and nine Poloidal field Superconducting coils together with a pair of resistive PF coils, an air core ohmic transformer and a pair of vertical field coils. These coils are instrumented with various cryogenic grade sensors and voltage taps to monitor its operating status and health during different operational scenarios. A VME based data acquisition system with remote system architecture is implemented for data acquisition and control of the complete magnet operation. Client-Server based architecture is implemented with remote hardware configuration and continuous online/offline monitoring. A JAVA based platform independent client application is developed for data analysis and data plotting. The server has multiple data pipeline architecture to send data to storage database, online plotting application, numerical display screen, and run time calculation. This paper describes software architecture, design and implementation of the data acquisition system. (author)

  8. Open client/server computing and middleware

    CERN Document Server

    Simon, Alan R

    2014-01-01

    Open Client/Server Computing and Middleware provides a tutorial-oriented overview of open client/server development environments and how client/server computing is being done.This book analyzes an in-depth set of case studies about two different open client/server development environments-Microsoft Windows and UNIX, describing the architectures, various product components, and how these environments interrelate. Topics include the open systems and client/server computing, next-generation client/server architectures, principles of middleware, and overview of ProtoGen+. The ViewPaint environment

  9. KALIMER database development

    International Nuclear Information System (INIS)

    KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development, using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R and D programme. IOC is a linkage control system between sub-projects to share and integrate the research results for KALIMER. The 3D CAD database is a schematic overview of the KALIMER design structure. And the reserved documents database is developed to manage various documents and reports since project accomplishment

  10. Development of a dose database in the refuelling scenario of a nuclear power plant for a virtual reality application

    International Nuclear Information System (INIS)

    Operators in Nuclear Power Plants can receive high doses during refuelling operations. A training program simulating refuelling operations will be useful to reduce the doses received by workers as well as to minimise operation time. With this goal in mind, a Virtual Reality application is developed in the frame of the CIPRES Project (Calculos Interactivos de Proteccion Radiologica en un Entorno de Simulacion - Interactive Calculations of Radiological Protection in a Simulation Environment), an R&D project sponsored by IBERINCO and developed jointly by IBERINCO and the Nuclear Engineering Department of the Polytechnic University of Valencia. The Virtual Reality application requires the possibility of displaying doses, both instantaneous and accumulated, at all times during operator training. Therefore, it is necessary to elaborate a database containing dose rates at every point of the refuelling plant. This database is elaborated from Radiological Protection Surveillance data measured throughout the plant during refuelling operations. To estimate doses throughout the refuelling plant, some interpolation routines have been used. Different assumptions have been adopted in order to perform the interpolation and obtain consistent data. In this paper, the procedures developed to elaborate the dose database for the Virtual Reality application are presented and analysed
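
    The abstract does not say which interpolation routines were used, so the following is only one plausible sketch: inverse-distance weighting of surveyed dose-rate samples onto an arbitrary query point in the plant. The `idw_dose` function, its coordinates and the power parameter are invented for the example.

```python
def idw_dose(point, measurements, power=2):
    """Inverse-distance-weighted dose rate at `point`.

    measurements: list of ((x, y), dose_rate) survey samples.
    Nearer samples dominate; a query that coincides with a sample
    returns that sample's measured value exactly.
    """
    num = den = 0.0
    for (mx, my), dose in measurements:
        d2 = (point[0] - mx) ** 2 + (point[1] - my) ** 2
        if d2 == 0.0:
            return dose                       # query sits on a survey point
        w = 1.0 / d2 ** (power / 2)           # weight falls off as 1/d^power
        num += w * dose
        den += w
    return num / den
```

    A scheme like this lets the trainer display a dose-rate estimate at any operator position, while consistency checks against neighbouring survey points guard the assumptions mentioned in the abstract.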

  11. The applicability of public LCI databases in the framework of an integrated product policy in the area of electronics industry - a case study with the Swiss database ''ecoinvent''

    Energy Technology Data Exchange (ETDEWEB)

    Hischier, R.; Lehmann, M. [Swiss Federal Labs. for Materials Testing and Research, Technology and Society Lab, St. Gallen (Switzerland)

    2004-07-01

    Within the last couple of years, several initiatives for the creation of national Life-Cycle Inventory (LCI) databases have been taken - in Europe e.g. in Germany, in Denmark, in Sweden and in Switzerland. This presentation describes the content of such national LCI databases from the viewpoint of the electronics industry and shows its crucial importance in the framework of the application of the integrated product policy (IPP) to this sector. (orig.)

  12. GRAPH DATABASES AND GRAPH VIZUALIZATION

    OpenAIRE

    Klančar, Jure

    2013-01-01

    The thesis presents graph databases. Graph databases are a part of NoSQL databases, which is why this thesis presents basics of NoSQL databases as well. We have focused on advantages of graph databases compared to relational databases. We have used one of the native graph databases (Neo4j) to present more detailed processing of graph databases. To get more acquainted with graph databases and their principles, we developed a simple application that uses a Neo4j graph database to...

  13. Literature Review and Database of Relations Between Salinity and Aquatic Biota: Applications to Bowdoin National Wildlife Refuge, Montana

    Science.gov (United States)

    Gleason, Robert A.; Tangen, Brian A.; Laubhan, Murray K.; Finocchiaro, Raymond G.; Stamm, John F.

    2009-01-01

    Long-term accumulation of salts in wetlands at Bowdoin National Wildlife Refuge (NWR), Mont., has raised concern among wetland managers that increasing salinity may threaten plant and invertebrate communities that provide important habitat and food resources for migratory waterfowl. Currently, the U.S. Fish and Wildlife Service (USFWS) is evaluating various water management strategies to help maintain suitable ranges of salinity to sustain plant and invertebrate resources of importance to wildlife. To support this evaluation, the USFWS requested that the U.S. Geological Survey (USGS) provide information on salinity ranges of water and soil for common plants and invertebrates on Bowdoin NWR lands. To address this need, we conducted a search of the literature on occurrences of plants and invertebrates in relation to salinity and pH of the water and soil. The compiled literature was used to (1) provide a general overview of salinity concepts, (2) document published tolerances and adaptations of biota to salinity, (3) develop databases that the USFWS can use to summarize the range of reported salinity values associated with plant and invertebrate taxa, and (4) perform database summaries that describe reported salinity ranges associated with plants and invertebrates at Bowdoin NWR. The purpose of this report is to synthesize information to facilitate a better understanding of the ecological relations between salinity and flora and fauna when developing wetland management strategies. A primary focus of this report is to provide information to help evaluate and address salinity issues at Bowdoin NWR; however, the accompanying databases, as well as concepts and information discussed, are applicable to other areas or refuges. The accompanying databases include salinity values reported for 411 plant taxa and 330 invertebrate taxa. 
The databases are available in Microsoft Excel version 2007 (http://pubs.usgs.gov/sir/2009/5098/downloads/databases_21april2009.xls) and contain

  14. Turkish Cloud-Radiation Database (CRD) and Its Application with CRD Bayesian Probability Algorithm

    Science.gov (United States)

    Oztopal, A.; Mugnai, A.; Casella, D.; Formenton, M.; Sano, P.; Sonmez, I.; Sen, Z.; Hsaf Team

    2010-12-01

    ABSTRACT Determining ground rainfall amounts from the few Special Sensor Microwave Imager/Sounder (SSMI/S) channels is a very difficult task. Ground rainfall cannot be observed from space directly, but knowledge of cloud physics helps to estimate the amount of ground rainfall. SSMI/S carries much information about the atmospheric structure, but it cannot provide cloud microphysical structural information. In such a situation the rainfall algorithm must incorporate, besides the SSMI/S data, cloud microphysical properties from an external data source. These properties can be obtained quite simply with the help of a Cloud Resolving Model (CRM). A Radiative Transfer Model (RTM) then helps to determine the SSMI/S brightness temperatures (TBs), which can be correlated with the simulated cloud properties to generate the Cloud-Radiation Database (CRD). SSMI/S satellite data and the CRD provide a common basis for rainfall prediction through the CRD Bayesian probability algorithm, which combines the two sets of data in a scientific manner; the first applications of this algorithm, which is still in use today, are due to various researchers. In this work, the University of Wisconsin Non-hydrostatic Modeling System (UW-NMS) CRM, first developed by Prof. Gregory J. Tripoli, is employed to generate the CRD. The model is also used by the Turkish Meteorological Service, benefiting from its radar network data, and 14 simulations are realized in this study. Moreover, one case study is carried out using a 3x3 spatial filter, and the radar data and the result of the CRD Bayesian probability algorithm are compared with each other. For the rainfall event of 9 September 2009 at 03:40 GMT over comparatively flat terrain, the retrieved values match far better, and hence the spatial rainfall occurrence extent and

  15. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    Science.gov (United States)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Actual frame transforms are updated by their owner. FM updates site and saved frames for the surface tree. As the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients including ARM and RSM (Remote Sensing Mast) update their related rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
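
    The centralized frame-tree lookup described above can be sketched as follows. This is a hedged illustration only: real FM transforms are full 6-DOF poses, whereas the sketch assumes translation-only 2D offsets, and the frame names and `transform` helper are invented for the example.

```python
# frame -> (parent frame, offset of this frame's origin in the parent)
frames = {
    "site":  (None, (0.0, 0.0)),
    "rover": ("site", (3.0, 1.0)),
    "rsm":   ("rover", (0.5, 0.2)),
    "arm":   ("rover", (-0.4, 0.1)),
}

def to_root(frame):
    """Accumulate the offset of `frame` expressed in the root frame."""
    x = y = 0.0
    while frame is not None:
        parent, (dx, dy) = frames[frame]
        x, y = x + dx, y + dy
        frame = parent
    return x, y

def transform(src, dst):
    """Offset of `src` origin expressed in `dst`, obtained via the tree."""
    sx, sy = to_root(src)
    dx, dy = to_root(dst)
    return sx - dx, sy - dy
```

    Because every client updates only the frames it owns, any module can query a transform between any two frames without hand-computing chained coordinate entries, which is the error-prone step the frame tree eliminates.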

  16. Knowledge discovery in databases of biomechanical variables: application to the sit to stand motor task

    OpenAIRE

    Benvenuti Francesco; Starita Antonina; Della Croce Ugo; Vannozzi Giuseppe; Cappozzo Aurelio

    2004-01-01

    Abstract Background The interpretation of data obtained in a movement analysis laboratory is a crucial issue in clinical contexts. Collection of such data in large databases might encourage the use of modern techniques of data mining to discover additional knowledge with automated methods. In order to maximise the size of the database, simple and low-cost experimental set-ups are preferable. The aim of this study was to extract knowledge inherent in the sit-to-stand task as performed by healt...

  17. Design of remote weather monitor system based on embedded web database

    International Nuclear Information System (INIS)

    The remote weather monitoring system is designed using embedded Web database technology, with the S3C2410 microprocessor as its core. The monitoring system can simultaneously monitor multi-channel sensor signals, and can give a dynamic Web-page display of various types of meteorological information on a remote computer. The construction and application of the Web database under embedded Linux are introduced in detail. Test results show that the client accesses the Web page via GPRS or the Internet, acquires data, and displays the values of various types of meteorological information in an intuitive graphical way. (authors)
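
    The web-facing side of such a monitor can be sketched with a tiny HTTP endpoint that serves the latest readings to a remote client. This is a hedged stand-in, not the paper's system: the sensor names, values and JSON format are invented, and Python's standard-library `http.server` substitutes for the embedded web server.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# simulated latest multi-channel sensor readings (names invented)
readings = {"temperature_c": 21.4, "humidity_pct": 48, "pressure_hpa": 1013}

class WeatherHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # serve the current readings so a remote client can poll them
        body = json.dumps(readings).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), WeatherHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
data = json.loads(urllib.request.urlopen(url).read())  # remote client's view
server.shutdown()
```

    In the real system the handler would read from the embedded Web database rather than an in-memory dictionary, and the client would render the values graphically.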

  18. Teaching Three-Dimensional Structural Chemistry Using Crystal Structure Databases. 3. The Cambridge Structural Database System: Information Content and Access Software in Educational Applications

    Science.gov (United States)

    Battle, Gary M.; Allen, Frank H.; Ferrence, Gregory M.

    2011-01-01

    Parts 1 and 2 of this series described the educational value of experimental three-dimensional (3D) chemical structures determined by X-ray crystallography and retrieved from the crystallographic databases. In part 1, we described the information content of the Cambridge Structural Database (CSD) and discussed a representative teaching subset of…

  19. Snoezelen: benefits for nursing older clients.

    Science.gov (United States)

    Morrissey, M; Biela, C

    1997-10-01

    In this article, the authors examine the possible benefits of Snoezelen for older clients. The authors suggest that nurses can be instrumental in developing and creating innovative therapeutic environments for this vulnerable client group. PMID:9370672

  20. Client/server models for transparent, distributed computational resources

    International Nuclear Information System (INIS)

    Client/server models are proposed to address issues of shared resources in a distributed, heterogeneous UNIX environment. The recent development of an automated Remote Procedure Call (RPC) interface generator has simplified the development of client/server models; previously, implementation of the models was only possible at the UNIX socket level. An overview of RPCs and the interface generator will be presented, including a discussion of the generation and installation of remote services, the RPC paradigm, and the three levels of RPC programming. Two applications, the Nuclear Plant Analyzer (NPA) and a fluids simulation using molecular modelling, will be presented to demonstrate how client/server models using RPCs and External Data Representations (XDR) have been used in production/computation situations. The NPA incorporates a client/server interface for the transfer/translation of TRAC or RELAP results from the UNICOS Cray to a UNIX workstation. The fluids simulation program utilizes the client/server model to access the Cray via a single function, allowing it to become a shared co-processor to the workstation application. 5 refs., 6 figs
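
    The RPC idea above, a remote service invoked as if it were a local function, can be sketched with Python's standard-library XML-RPC modules standing in for the generated RPC stubs. Everything here is illustrative: `translate_results` is an invented stand-in for the server-side results translation, not the NPA's actual interface.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def translate_results(values):
    """Invented stand-in for server-side translation of simulation results."""
    return [v * 2.0 for v in values]

# server side: register the remote service (port 0 picks a free port)
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(translate_results)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# client side: the remote call reads exactly like a local function call
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.translate_results([1.0, 2.5])
server.shutdown()
```

    The marshalling that XML-RPC does here is the role XDR plays in the paper: both hide data-representation differences between heterogeneous hosts from the application code.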

  1. The construction and application of the AMSR-E global microwave emissivity database

    International Nuclear Information System (INIS)

    Land surface microwave emissivity is an important parameter to describe the characteristics of terrestrial microwave radiation, and is the necessary input amount for inversion various geophysical parameters. We use brightness temperature of the Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) and synchronous land surface temperature and atmospheric temperature-humidity profile data obtained from the MODIS which aboard on satellite AQUA the same as AMSR-E, to retrieved microwave emissivity under clear sky conditions. After quality control, evaluation and design, the global microwave emissivity database of AMSR-E under clear sky conditions is established. This database include 2002–2011 years, different regions, different surface coverage, dual-polarized, 6.9,10.65, 18.7, 23.8, 36.5 and 89GHz, ascending and descending orbit, spatial resolution 25km, global 0.05 degrees, instantaneous and half-month averaged emissivity data. The database can provide the underlying surface information for precipitation algorithm, water-vapor algorithm, and long-resolution mode model (General Circulation Model (GCM) etc.). It also provides underlying surface information for the satellite simulator, and provides basic prior knowledge of land surface radiation for future satellite sensors design. The emissivity database or the fast emissivity obtained can get ready for climate model, energy balance, data assimilation, geophysical model simulation, inversion and estimates of the physical parameters under the cloud cover conditions

  2. Document control system as an integral part of RA documentation database application

    International Nuclear Information System (INIS)

    The decision about the final shutdown of the RA research reactor at the Vinca Institute was made in 2002, and the preparations for its decommissioning have therefore begun. All activities are supervised by the International Atomic Energy Agency (IAEA), which also provides technical and expert support. This paper describes the document control system, which is an integral part of the existing RA documentation database. (author)

  3. Application Research of Dynamic Web Databases

    Institute of Scientific and Technical Information of China (English)

    乔立龙

    2015-01-01

    Database technology is now relatively mature and rigorously structured, but it still lacks flexibility. Combining databases with the Web can greatly expand the application domains of databases, and this combination is currently a hot topic in database research. The dynamic Web database technology introduced here is implemented through middleware: the middleware connects the Web server and the database server. Middleware not only allows front-end users to access the data sources of heterogeneous back-end databases, making the middleware layer transparent to them, but also ensures an open access interface.
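
    The middleware arrangement described in this record, one uniform query interface in front of heterogeneous back ends, can be sketched as follows. The class names, the `fetch`/`query` methods and the datasets are all invented for the example; a real middleware layer would speak actual database protocols.

```python
class SqlBackend:
    """Invented stand-in for a relational back end."""
    def __init__(self, rows):
        self.rows = rows
    def fetch(self, key):
        return self.rows.get(key)

class KeyValueBackend:
    """Invented stand-in for a non-relational back end with its own native API."""
    def __init__(self, store):
        self.store = store
    def fetch(self, key):
        return self.store.get(key)

class Middleware:
    """Single open interface; which back end holds the data stays invisible."""
    def __init__(self):
        self.routes = {}                    # dataset name -> backend
    def register(self, dataset, backend):
        self.routes[dataset] = backend
    def query(self, dataset, key):
        return self.routes[dataset].fetch(key)

mw = Middleware()
mw.register("orders", SqlBackend({"o1": "widget"}))
mw.register("sessions", KeyValueBackend({"s9": "alice"}))
```

    The Web tier calls only `mw.query(...)`; swapping or adding a back end changes nothing on the front end, which is the transparency the abstract describes.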

  4. Client participation in the rehabilitation process

    OpenAIRE

    Wressle, Ewa

    2002-01-01

    This thesis evaluates the rehabilitation process with respect to client participation. The Swedish version of a client-centred structure, the Canadian Occupational Performance Measure (COPM), is evaluated from the perspectives of the clients, the occupational therapists and the members of a rehabilitation team. Data have been collected through diaries, the COPM, assessments of ability to perform activities of daily living, mobility, self-assessments of pain and health, interviews with clients...

  5. Knowledge discovery in databases of biomechanical variables: application to the sit to stand motor task

    Directory of Open Access Journals (Sweden)

    Benvenuti Francesco

    2004-10-01

    Full Text Available Abstract Background The interpretation of data obtained in a movement analysis laboratory is a crucial issue in clinical contexts. Collection of such data in large databases might encourage the use of modern techniques of data mining to discover additional knowledge with automated methods. In order to maximise the size of the database, simple and low-cost experimental set-ups are preferable. The aim of this study was to extract knowledge inherent in the sit-to-stand task as performed by healthy adults, by searching relationships among measured and estimated biomechanical quantities. An automated method was applied to a large amount of data stored in a database. The sit-to-stand motor task was already shown to be adequate for determining the level of individual motor ability. Methods The technique of search for association rules was chosen to discover patterns as part of a Knowledge Discovery in Databases (KDD) process applied to a sit-to-stand motor task observed with a simple experimental set-up and analysed by means of a minimum measured input model. Selected parameters and variables of a database containing data from 110 healthy adults, of both genders and of a large range of age, performing the task were considered in the analysis. Results A set of rules and definitions was found characterising the patterns shared by the investigated subjects. Time events of the task turned out to be highly interdependent at least in their average values, showing a high level of repeatability of the timing of the performance of the task. Conclusions The distinctive patterns of the sit-to-stand task found in this study, associated with those that could be found in similar studies focusing on subjects with pathologies, could be used as a reference for the functional evaluation of specific subjects performing the sit-to-stand motor task.
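
    The "search for association rules" step can be illustrated with the two standard measures it relies on, support and confidence, computed over a toy table of observations. The feature names and records below are invented for the example and are not the study's biomechanical variables.

```python
# each record is the set of discretised features observed in one trial
records = [
    {"fast_rise", "young", "low_sway"},
    {"fast_rise", "young", "low_sway"},
    {"slow_rise", "older", "high_sway"},
    {"fast_rise", "older", "low_sway"},
]

def support(itemset):
    """Fraction of records containing every item in `itemset`."""
    return sum(itemset <= r for r in records) / len(records)

def confidence(antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    return support(antecedent | consequent) / support(antecedent)

# candidate rule: fast_rise -> low_sway
rule_sup = support({"fast_rise", "low_sway"})
rule_conf = confidence({"fast_rise"}, {"low_sway"})
```

    A KDD process of the kind described would enumerate many candidate rules this way and keep only those whose support and confidence exceed chosen thresholds.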

  6. Client Server design and implementation issues in the Accelerator Control System environment

    International Nuclear Information System (INIS)

    In distributed system communication software design, the Client Server model has been widely used. This paper addresses the design and implementation issues of such a model, particularly when used in Accelerator Control Systems. In designing the Client Server model one needs to decide how the services will be defined for a server, what types of messages the server will respond to, which data formats will be used for the network transactions, and how the server will be located by the client. Special consideration needs to be given to error handling, both on the server and the client side. Since the server is usually located on a machine other than the client, easy and informative server diagnostic capability is required. The higher-level abstraction provided by the Client Server model simplifies application writing; however, fine control over network parameters is essential to improve performance. The above-mentioned design issues and implementation trade-offs are discussed in this paper

  7. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    Science.gov (United States)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

    The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Center (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
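
    The empirical-conversion step at the heart of magnitude homogenization can be sketched as an ordinary least-squares fit mapping a native scale (here mb, as an example) onto moment magnitude Mw from paired catalogue solutions. The pairs below are synthetic, and a simple linear model is only one of the forms such tools might fit.

```python
# synthetic paired solutions: (mb from a local network, Mw from a global one)
pairs = [(4.0, 4.3), (4.5, 4.7), (5.0, 5.2), (5.5, 5.8), (6.0, 6.2)]

# ordinary least squares for Mw ≈ a * mb + b
n = len(pairs)
sx = sum(mb for mb, _ in pairs)
sy = sum(mw for _, mw in pairs)
sxx = sum(mb * mb for mb, _ in pairs)
sxy = sum(mb * mw for mb, mw in pairs)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

def to_mw(mb):
    """Convert a native-scale magnitude into the harmonized Mw scale."""
    return a * mb + b
```

    Applying such a fitted model to every event recorded only in the native scale is what turns several heterogeneous bulletins into a single magnitude-homogeneous catalogue.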

  8. Web database development

    OpenAIRE

    Tsardas, Nikolaos A.

    2001-01-01

    This thesis explores the concept of Web Database Development using Active Server Pages (ASP) and Java Server Pages (JSP). These are among the leading technologies in the web database development. The focus of this thesis was to analyze and compare the ASP and JSP technologies, exposing their capabilities, limitations, and differences between them. Specifically, issues related to back-end connectivity using Open Database Connectivity (ODBC) and Java Database Connectivity (JDBC), application ar...

  9. Abductive Equivalential Translation and its application to Natural Language Database Interfacing

    CERN Document Server

    Rayner, M

    1994-01-01

    The thesis describes a logical formalization of natural-language database interfacing. We assume the existence of a ``natural language engine'' capable of mediating between surface linguistic string and their representations as ``literal'' logical forms: the focus of interest will be the question of relating ``literal'' logical forms to representations in terms of primitives meaningful to the underlying database engine. We begin by describing the nature of the problem, and show how a variety of interface functionalities can be considered as instances of a type of formal inference task which we call ``Abductive Equivalential Translation'' (AET); functionalities which can be reduced to this form include answering questions, responding to commands, reasoning about the completeness of answers, answering meta-questions of type ``Do you know...'', and generating assertions and questions. In each case, a ``linguistic domain theory'' (LDT) $\\Gamma$ and an input formula $F$ are given, and the goal is to construct a fo...

  10. Development of the Database for Environmental Sound Research and Application (DESRA): Design, Functionality, and Retrieval Considerations

    Directory of Open Access Journals (Sweden)

    Brian Gygi

    2010-01-01

    Theoretical and applied environmental sounds research is gaining prominence but progress has been hampered by the lack of a comprehensive, high quality, accessible database of environmental sounds. An ongoing project to develop such a resource is described, which is based upon experimental evidence as to the way we listen to sounds in the world. The database will include a large number of sounds produced by different sound sources, with a thorough background for each sound file, including experimentally obtained perceptual data. In this way DESRA can contain a wide variety of acoustic, contextual, semantic, and behavioral information related to an individual sound. It will be accessible on the Internet and will be useful to researchers, engineers, sound designers, and musicians.

  11. The South African National Vegetation Database: History, development, applications, problems and future

    OpenAIRE

    Leslie W. Powrie; Ladislav Mucina; Michael C. Rutherford

    2012-01-01

    Southern Africa has been recognised as one of the most interesting and important areas of the world from an ecological and evolutionary point of view. The establishment and development of the National Vegetation Database (NVD) of South Africa enabled South Africa to contribute to environmental planning and conservation management in this floristically unique region. In this paper, we aim to provide an update on the development of the NVD since it was last described, near its inception, more t...

  12. An Empirical Study of the Applications of Classification Techniques in Students Database

    OpenAIRE

    Tariq O. Fadl Elsid; Mergani. A. Eltahir

    2014-01-01

    University servers and databases store a huge amount of data, including personal details, registration details, evaluation assessments, performance profiles, and much more, for students and lecturers alike. The main problem facing any system administrator or user is that this data grows every second and is stored on the servers in different types and formats; learning about students from a huge amount of data including personal details, registration details, evaluation assessment...

  13. Proteomic database mining opens up avenues utilizing extracellular protein phosphorylation for novel therapeutic applications

    OpenAIRE

    Yalak, Garif; Olsen, Bjorn R.

    2015-01-01

    Summary Recent advances in extracellular signaling suggest that extracellular protein phosphorylation is a regulatory mechanism outside the cell. The list of reported active extracellular protein kinases and phosphatases is growing, and phosphorylation of an increasing number of extracellular matrix molecules and extracellular domains of trans-membrane proteins is being documented. Here, we use public proteomic databases, collagens – the major components of the extracellular matrix, extracell...

  14. ECG Database Applicable for Development and Testing of Pace Detection Algorithms

    Directory of Open Access Journals (Sweden)

    Irena Jekova

    2014-12-01

    This paper presents an ECG database, named 'PacedECGdb' (available at http://biomed.bas.bg/bioautomation/2014/vol_18.4/files/PacedECGdb.zip), which contains different arrhythmias generated by an HKP (Heidelberger Praxisklinik) simulator, combined with artificially superimposed pacing pulses that cover wide ranges of rising-edge durations (from <10 µs to 100 µs) and total pulse durations (from 100 µs to 2 ms) and correspond to various pacemaker modes. It comprises a total of 1404 recordings - 780 representing 'pure' ECG with pacing pulses and 624 comprising paced ECGs corrupted by tremor. The signals are recorded with 9.81 µV/LSB amplitude resolution at a 128 kHz sampling rate in order to preserve the steep rising and trailing edges of the pace pulses. To the best of our knowledge, 'PacedECGdb' is the first publicly available paced ECG database. It can be used for the development and testing of methods for pace detection in the ECG. The inclusion of ECGs corrupted by tremor (the only physiological noise that can compromise pacing-pulse detection) is an advantage, since such signals can be used to define the signal-to-noise level for correct operation of an algorithm, or to improve the noise immunity of a method under development. The open access of the database makes it suitable for comparative studies involving different algorithms.
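The sampling-rate choice above can be checked with simple arithmetic: at 128 kHz, how many samples land on a pacing-pulse edge? The helper below uses the figures quoted in the record (128 kHz, 9.81 µV/LSB, 10 µs and 100 µs edges); the function names are my own.

```python
# Samples available across a pacing-pulse edge at a given sampling rate.
# The numeric constants come from the database description; the helpers
# themselves are just arithmetic for illustration.

FS_HZ = 128_000          # sampling rate
LSB_UV = 9.81            # amplitude resolution, microvolts per LSB

def samples_per_edge(edge_us, fs_hz=FS_HZ):
    """Number of samples falling within an edge of the given duration (µs)."""
    return edge_us * 1e-6 * fs_hz

fast_edge = samples_per_edge(10)    # a 10 µs edge spans barely more than 1 sample
slow_edge = samples_per_edge(100)   # a 100 µs edge is well resolved

def lsb_to_uv(raw):
    """Convert a raw ADC count to microvolts."""
    return raw * LSB_UV
```

Even at 128 kHz the fastest edges are only marginally resolved, which is why an ordinary clinical sampling rate (e.g. 500 Hz) cannot capture them at all.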

  15. A public turbulence database cluster and applications to study Lagrangian evolution of velocity increments in turbulence

    CERN Document Server

    Li, Yi; Wan, Minping; Yang, Yunke; Meneveau, Charles; Burns, Randal; Chen, Shiyi; Szalay, Alexander; Eyink, Gregory

    2008-01-01

    A public database system archiving a direct numerical simulation (DNS) data set of isotropic, forced turbulence is described in this paper. The data set consists of the DNS output on $1024^3$ spatial points and 1024 time-samples spanning about one large-scale turn-over timescale. This complete $1024^4$ space-time history of turbulence is accessible to users remotely through an interface that is based on the Web-services model. Users may write and execute analysis programs on their host computers, while the programs make subroutine-like calls that request desired parts of the data over the network. The users are thus able to perform numerical experiments by accessing the 27 Terabytes of DNS data using regular platforms such as laptops. The architecture of the database is explained, as are some of the locally defined functions, such as differentiation and interpolation. Test calculations are performed to illustrate the usage of the system and to verify the accuracy of the methods. The database is then used to a...
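The "subroutine-like calls that request desired parts of the data" can be illustrated with a small client-side sketch. The real service uses a Web-services interface; here the transport is a stand-in `fetch` function, and all point values are invented.

```python
# Illustrative sketch of the access pattern described above: the client asks
# the remote turbulence database only for the points it needs, in batches,
# instead of downloading the full field. `fetch` stands in for the remote
# Web-services call; the batching and velocities are illustrative.

def get_velocity(points, time, fetch, batch_size=2):
    """Request velocity vectors at (x, y, z) points, batch by batch."""
    result = []
    for i in range(0, len(points), batch_size):
        batch = points[i:i + batch_size]
        result.extend(fetch(time, batch))   # one remote call per batch
    return result

# Stand-in for the remote service: returns a dummy velocity per point.
def fake_fetch(time, batch):
    return [(0.1 * x, 0.1 * y, 0.1 * z) for (x, y, z) in batch]

vel = get_velocity([(1, 0, 0), (0, 2, 0), (0, 0, 3)], time=0.0, fetch=fake_fetch)
```

This is what lets a laptop-scale analysis program work against the 27-terabyte archive: only the requested sub-volumes ever cross the network.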

  16. Call center: Focused on the client

    OpenAIRE

    Leal-Alonso-de-Castañeda, José Enrique

    2003-01-01

    Today's company must be prepared to respond to the client exactly as the client expects, because the aim is not a one-time customer but a loyal one. The globalization of the economy and of market access demands that the company be capable of attracting clients not only with a quality service, but also with quality customer care. The implementation of a Call Center (customer service centre, call handling centre) therefore constitutes a business strategy th...

  17. Database design for a kindergarten Pastelka

    OpenAIRE

    Grombíř, Tomáš

    2010-01-01

    This bachelor thesis deals with analysis, creation of database for a kindergarten and installation of the designed database into the database system MySQL. Functionality of the proposed database was verified through an application written in PHP.

  18. Collaborating with Your Clients Using Social Media & Mobile Communications

    Science.gov (United States)

    Typhina, Eli; Bardon, Robert E.; Gharis, Laurie W.

    2015-01-01

    Many Extension educators are still learning how to effectively integrate social media into their programs. By using the right social media platforms and mobile applications to create engaged online communities, Extension educators can collaborate with clients to produce and share information, expanding and enhancing their social media and…

  19. Systemic Power, Disciplinary Agency, and Developer–Business Client Relations

    DEFF Research Database (Denmark)

    Rowlands, Bruce; Kautz, Karlheinz

    2013-01-01

    This paper presents Hardy's multi-dimensional model of power and illustrates its application to the field of IS. Findings from a case study of developer-business client power relations within a large financial institution are presented. Our findings indicate that, from the developers' perspective, the client exercised near-complete control, with developers unwittingly playing a cooperative but submissive role. Our study makes two principal contributions. First, we combine Hardy's (1996) multi-dimensional power framework and the principles of Pickering's (1995) version of disciplinary agency to...

  20. The Database Query Support Processor (QSP)

    Science.gov (United States)

    1993-01-01

    The number and diversity of databases available to users continues to increase dramatically. Currently, the trend is towards decentralized, client-server architectures that (on the surface) are less expensive to acquire, operate, and maintain than information architectures based on centralized, monolithic mainframes. The database query support processor (QSP) effort evaluates the performance of a network-level, heterogeneous database access capability. Air Force Materiel Command's Rome Laboratory has developed an approach, based on ANSI standard X3.138-1988, 'The Information Resource Dictionary System (IRDS)', to seamless access to heterogeneous databases through extensions to data dictionary technology. To successfully query a decentralized information system, users must know what data are available from which source, or have the knowledge and system privileges necessary to find out. Privacy and security considerations prohibit free and open access to every information system in every network. Even in completely open systems, the time required to locate relevant data (in systems of any appreciable size) would be better spent analyzing the data, assuming the original question was not forgotten. Extensions to data dictionary technology have the potential to more fully automate the search and retrieval of relevant data in a decentralized environment. Substantial amounts of time and money could be saved by not having to teach users what data reside in which systems and how to access each of those systems. Information describing data and how to get it could be removed from the application and placed in a dedicated repository where it belongs. The result is simplified applications that are less brittle and less expensive to build and maintain. Software technology providing the required functionality is off the shelf. The key difficulty is in defining the metadata required to support the process. The database query support processor effort will provide
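The data-dictionary idea above, a repository describing which system holds which data element so applications need not hard-code source knowledge, can be sketched minimally. All repository entries and names below are invented for illustration.

```python
# Minimal sketch of a data dictionary / metadata repository: given a data
# element, report which source system and table hold it, so the search for
# relevant data can be automated. Entries are invented for illustration.

REPOSITORY = {
    "part_number":   {"system": "inventory_db", "table": "parts"},
    "supplier_name": {"system": "procurement_db", "table": "suppliers"},
    "unit_cost":     {"system": "procurement_db", "table": "contracts"},
}

def locate(element):
    """Return (system, table) for a data element, or None if unknown."""
    entry = REPOSITORY.get(element)
    return (entry["system"], entry["table"]) if entry else None

def systems_for_query(elements):
    """All source systems a query over these elements must reach."""
    return sorted({REPOSITORY[e]["system"] for e in elements if e in REPOSITORY})
```

Moving this mapping out of each application and into a shared repository is exactly the "dedicated repository where it belongs" the abstract argues for.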

  1. Hardened Client Platforms for Secure Internet Banking

    Science.gov (United States)

    Ronchi, C.; Zakhidov, S.

    We review the security of e-banking platforms with particular attention to the exploitable attack vectors of three main attack categories: Man-in-the-Middle, Man-in-the-PC and Man-in-the-Browser. It will be shown that the most serious threats come from combination attacks capable of hacking any transaction without the need to control the authentication process. Using this approach, the security of any authentication system can be bypassed, including those using SecureID Tokens, OTP Tokens, Biometric Sensors and Smart Cards. We will describe and compare two recently proposed e-banking platforms, the ZTIC and the USPD, both of which are based on the use of dedicated client devices, but with diverging approaches with respect to the need of hardening the Web client application. It will be shown that the use of a Hardened Browser (or H-Browser) component is critical to force attackers to employ complex and expensive techniques and to reduce the strength and variety of social engineering attacks down to physiological fraud levels.

  2. Reliability database of IEA-R1 Brazilian research reactor: Applications to the improvement of installation safety

    International Nuclear Information System (INIS)

    In this paper the main features of the reliability database being developed at Ipen-Cnen/SP for IEA-R1 reactor are briefly described. Besides that, the process for collection and updating of data regarding operation, failure and maintenance of IEA-R1 reactor components is presented. These activities have been conducted by the reactor personnel under the supervision of specialists in Probabilistic Safety Analysis (PSA). The compilation of data and subsequent calculation are based on the procedures defined during an IAEA Coordinated Research Project which Brazil took part in the period from 2001 to 2004. In addition to component reliability data, the database stores data on accident initiating events and human errors. Furthermore, this work discusses the experience acquired through the development of the reliability database covering aspects like improvements in the reactor records as well as the application of the results to the optimization of operation and maintenance procedures and to the PSA carried out for IEA-R1 reactor. (author)

  3. The abandoned surface mining sites in the Czech Republic: mapping and creating a database with a GIS web application

    Science.gov (United States)

    Pokorný, Richard; Peterková, Marie Tereza

    2016-05-01

    Based on the vectorization of the 55-volume book series the Quarry Inventories of the Czechoslovak Republic/Czechoslovak Socialist Republic, published in the years 1932-1961, a new comprehensive database was built comprising 9958 surface mining sites for raw materials that were active in the first half of the 20th century. The mapped area covers 40.9% of the territory of the Czech Republic. For visualization purposes, a map application, the Quarry Inventories Online, was created that enables the data to be explored.

  4. A database application for pre-processing, storage and comparison of mass spectra derived from patients and controls

    Directory of Open Access Journals (Sweden)

    Sillevis Smitt Peter A

    2006-09-01

    Abstract Background: Statistical comparison of peptide profiles in biomarker discovery requires fast, user-friendly software for high-throughput data analysis. Important features are flexibility in changing input variables and statistical analysis of peptides that are differentially expressed between patient and control groups. In addition, integration of the mass spectrometry data with the results of other experiments, such as microarray analysis, and with information from other databases requires central storage of the profile matrix, where protein IDs can be added to peptide masses of interest. Results: A new database application is presented to detect and identify significantly differentially expressed peptides in peptide profiles obtained from body fluids of patient and control groups. The presented modular software is capable of central storage of mass spectra and results in fast analysis. The software architecture consists of four pillars: (1) a Graphical User Interface written in Java, (2) a MySQL database, which contains all metadata, such as experiment numbers and sample codes, (3) an FTP (File Transfer Protocol) server to store all raw mass spectrometry files and processed data, and (4) the software package R, which is used for modular statistical calculations, such as the Wilcoxon-Mann-Whitney rank sum test. Statistical analysis by the Wilcoxon-Mann-Whitney test in R demonstrates that peptide profiles of two patient groups, (1) breast cancer patients with leptomeningeal metastases and (2) prostate cancer patients in end-stage disease, can be distinguished from those of control groups. Conclusion: The database application is capable of distinguishing patient Matrix-Assisted Laser Desorption Ionization (MALDI-TOF) peptide profiles from control groups using large datasets. The modular architecture of the application makes it possible to adapt the application to handle also large-sized data from MS/MS and Fourier Transform Ion Cyclotron Resonance (FT-ICR) mass
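The statistical core of the comparison above, the Wilcoxon-Mann-Whitney test, is run in R by the described system; a pure-Python sketch of the U statistic (without tie correction or p-value computation) shows the idea. The intensity values are invented.

```python
# Sketch of a Mann-Whitney U comparison of one peptide's intensities between
# a patient group and a control group. The paper runs the full test in R;
# this illustrative version computes only the U statistic, and the intensity
# values are invented.

def mann_whitney_u(group_a, group_b):
    """U statistic for group_a: count of (a, b) pairs with a > b (ties count 0.5)."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

patients = [10.2, 12.5, 11.8, 13.1]   # peptide intensity, patient group
controls = [8.9, 9.4, 10.0, 9.1]      # peptide intensity, control group

u_stat = mann_whitney_u(patients, controls)   # maximum possible is 4*4 = 16
```

A U near the maximum (every patient value exceeds every control value) is what separates the two groups' peptide profiles.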

  5. Web application for genetic modification flux with database to estimate metabolic fluxes of genetic mutants.

    Science.gov (United States)

    Mohd Ali, Noorlin; Tsuboi, Ryo; Matsumoto, Yuta; Koishi, Daisuke; Inoue, Kentaro; Maeda, Kazuhiro; Kurata, Hiroyuki

    2016-07-01

    Computational analysis of metabolic fluxes is essential for understanding the structure and function of a metabolic network and for rationally designing genetically modified mutants for engineering purposes. We previously presented the genetic modification flux (GMF) method, which predicts the flux distribution of a broad range of genetically modified mutants. To enhance the feasibility and usability of GMF, we have developed a web application with a metabolic network database to predict the flux distribution of genetically modified mutants. One hundred and twelve data sets of Escherichia coli, Corynebacterium glutamicum, Saccharomyces cerevisiae, and Chinese hamster ovary cells were registered as standard models. PMID:26777238

  6. An on-line scaling method for improving scalability of a database cluster

    Institute of Scientific and Technical Information of China (English)

    JANG Yong-Il; LEE Chung-ho; LEE Jae-dong; BAE Hae-young

    2004-01-01

    The explosive growth of the Internet and of database applications has driven databases to be more scalable and available, and able to support on-line scaling without interrupting service. To support more client queries without downtime and without degrading response time, more nodes have to be added while the database is running. This paper presents an overview of a scalable and available database that satisfies these characteristics, and proposes a novel on-line scaling method. Our method improves on the existing on-line scaling method for faster response times and higher throughput. The proposed method reduces unnecessary network use, i.e., it decreases the amount of data copying by reusing backup data. Also, the on-line scaling operation can be processed in parallel by selecting suitable nodes as the new nodes. Our performance study shows that the method results in a significant reduction in data copy time.
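Why on-line scaling is dominated by data copying can be shown with a placement experiment. This is not the paper's algorithm (which reuses backup copies); it only illustrates how much data a naive modulo placement relocates when one node is added. The hash and key names are illustrative.

```python
# Illustration of the scaling cost the paper attacks: with naive modulo
# placement, adding a fifth node to a four-node cluster relocates most rows.
# zlib.crc32 is used as a deterministic hash; keys are invented.
import zlib

def h(key):
    return zlib.crc32(key.encode())

keys = [f"row{i}" for i in range(1000)]

before = {k: h(k) % 4 for k in keys}     # placement on 4 nodes
after = {k: h(k) % 5 for k in keys}      # placement after adding a node
moved = sum(1 for k in keys if before[k] != after[k])
fraction_moved = moved / len(keys)       # roughly 0.8 with modulo placement
```

A key stays put only when its hash gives the same residue mod 4 and mod 5, i.e. about 4 times in 20, so roughly 80% of the data would have to be copied, which is the cost that reusing backup data and parallel copying are meant to cut.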

  7. Improved materials management through client/server computing

    International Nuclear Information System (INIS)

    Materials management and procurement impact every organization within an electric utility, from power generation to customer service. An efficient materials management and procurement system can help improve productivity and minimize operating costs. It is no longer sufficient to simply automate materials management using inventory control systems. Smart companies are building centralized data warehouses and using the client/server style of computing to provide real-time data access. This paper describes how Alabama Power Company, Southern Company Services and Digital Equipment Corporation transformed two existing applications, a purchase order application within DEC's ALL-IN-1 environment and a materials management application within an IBM CICS environment, into a data warehouse/client-server application. An application server is used to overcome incompatibilities between computing environments and to provide easy, real-time access to information residing in multi-vendor environments

  8. DFACS - DATABASE, FORMS AND APPLICATIONS FOR CABLING AND SYSTEMS, VERSION 3.30

    Science.gov (United States)

    Billitti, J. W.

    1994-01-01

    DFACS is an interactive multi-user computer-aided engineering tool for system level electrical integration and cabling engineering. The purpose of the program is to provide the engineering community with a centralized database for entering and accessing system functional definitions, subsystem and instrument-end circuit pinout details, and harnessing data. The primary objective is to provide an instantaneous single point of information interchange, thus avoiding error-prone, time-consuming, and costly multiple-path data shuttling. The DFACS program, which is centered around a single database, has built-in menus that provide easy data input and access for all involved system, subsystem, and cabling personnel. The DFACS program allows parallel design of circuit data sheets and harness drawings. It also recombines raw information to automatically generate various project documents and drawings including the Circuit Data Sheet Index, the Electrical Interface Circuits List, Assembly and Equipment Lists, Electrical Ground Tree, Connector List, Cable Tree, Cabling Electrical Interface and Harness Drawings, Circuit Data Sheets, and ECR List of Affected Interfaces/Assemblies. Real time automatic production of harness drawings and circuit data sheets from the same data reservoir ensures instant system and cabling engineering design harmony. DFACS also contains automatic wire routing procedures and extensive error checking routines designed to minimize the possibility of engineering error. DFACS is designed to run on DEC VAX series computers under VMS using Version 6.3/01 of INGRES QUEL/OSL, a relational database system which is available through Relational Technology, Inc. The program is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard media) or a TK50 tape cartridge. DFACS was developed in 1987 and last updated in 1990. DFACS is a copyrighted work with all copyright vested in NASA. 
DEC, VAX and VMS are trademarks of Digital Equipment Corporation

  9. An Embedded Database Application for the Aggregation of Farming Device Data

    DEFF Research Database (Denmark)

    Iftikhar, Nadeem; Pedersen, Torben Bach

    2010-01-01

    In order to store the massive amounts of data produced by farming devices, and to keep data that spans long intervals of time for analysis, reporting and maintenance purposes, it is desirable to reduce the size of the data by maintaining it at different aggregate levels. The older data can be ... data aggregation effectively. Furthermore, the aggregation is achieved by using either two ratio-based aggregation methods or a time-granularity-based aggregation method. A detailed description of the embedded database technology on a tractor computer is also presented in this paper.
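Time-granularity-based aggregation of the kind described can be sketched as a roll-up of raw readings into coarser buckets. This is an illustrative sketch, not the paper's implementation; the bucket size (one hour) and the readings are invented.

```python
# Sketch of time-granularity-based aggregation: raw device readings are
# rolled up into hourly averages so that older data occupies less space.
# Bucket size and sample readings are illustrative assumptions.
from collections import defaultdict

def aggregate_hourly(readings):
    """readings: list of (timestamp_seconds, value) -> {hour_start: mean value}."""
    buckets = defaultdict(list)
    for ts, value in readings:
        buckets[ts - ts % 3600].append(value)   # floor timestamp to the hour
    return {hour: sum(vs) / len(vs) for hour, vs in buckets.items()}

raw = [(0, 10.0), (1800, 14.0), (3600, 20.0), (5400, 22.0)]
hourly = aggregate_hourly(raw)
```

Applied repeatedly with coarser granularities (day, week), this keeps recent data fine-grained while old data shrinks to summary rows.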

  10. A public turbulence database cluster and applications to study Lagrangian evolution of velocity increments in turbulence

    OpenAIRE

    Li, Yi; Perlman, Eric; Wan, Minping; Yang, Yunke; Meneveau, Charles; Burns, Randal; Chen, Shiyi; Szalay, Alexander; Eyink, Gregory

    2008-01-01

    A public database system archiving a direct numerical simulation (DNS) data set of isotropic, forced turbulence is described in this paper. The data set consists of the DNS output on $1024^3$ spatial points and 1024 time-samples spanning about one large-scale turn-over timescale. This complete $1024^4$ space-time history of turbulence is accessible to users remotely through an interface that is based on the Web-services model. Users may write and execute analysis programs on their host comput...

  11. On Intelligent Database Systems

    OpenAIRE

    Dennis McLeod; Paul Yanover

    1992-01-01

    In response to the limitations of contemporary database management systems in addressing the requirements of many potential application environments, and in view of the characteristics of emerging interconnected systems, we examine research directions involving adding more ‘intelligence’ to database systems. Three major thrusts in the intelligent database systems area are discussed. The first involves increasing the modeling power to represent an application environment. The second emphasis c...

  12. Client/server approach to image capturing

    Science.gov (United States)

    Tuijn, Chris; Stokes, Earle

    1998-01-01

    The diversity of digital image capturing devices on the market today is quite astonishing and ranges from low-cost CCD scanners to digital cameras (for both action and still scenes), mid-range CCD scanners for desktop publishing and pre-press applications, and high-end CCD flatbed scanners and drum scanners with photomultiplier technology. Each device and market segment has its own specific needs, which explains the diversity of the associated scanner applications. What all those applications have in common is the need to communicate with a particular device to import digital images; after the import, additional image processing might be needed, as well as color management operations. Although the specific requirements of these applications may differ considerably, a number of image capturing and color management facilities, as well as other services, are needed which can be shared. In this paper, we propose a client/server architecture for scanning and image editing applications which can be used as a common component for all these applications. One of the principal components of the scan server is the input capturing module. The specification of the input jobs is based on a generic input device model. Through this model we abstract from the specific scanner parameters and define scan job definitions by a number of absolute parameters. As a result, scan job definitions are less dependent on a particular scanner and have a more universal meaning. In this context, we also elaborate on the interaction between the generic parameters and the color characterization (i.e., the ICC profile). Other topics that are covered are the scheduling and parallel processing capabilities of the server, the image processing facilities, the interaction with the ICC engine, the communication facilities (both in-memory and over the network) and the different client architectures (stand-alone applications, TWAIN servers, plug-ins, OLE or Apple-event driven

  13. Automated detection of clustered microcalcifications on mammograms: CAD system application to MIAS database

    International Nuclear Information System (INIS)

    To investigate the detection performance of our automated detection scheme for clustered microcalcifications on mammograms, we applied our computer-aided diagnosis (CAD) system to the database of the Mammographic Image Analysis Society (MIAS) in the UK. Forty-three mammograms from this database were used in this study. In our scheme, the breast region is first extracted by determining the skinline. Histograms of the original images are used to segment the high-density area within the breast region from the fatty area around the skinline. A contrast correction technique is then employed, and gradient vectors of the image density are calculated on the contrast-corrected images. To extract features specific to the pattern of microcalcifications, triple-ring filter analysis is employed, followed by a variable-ring filter for more accurate detection. The features of the detected candidate areas are then characterized by feature analysis, and the areas which satisfy the specific criteria are classified and displayed as clusters. As a result, the sensitivity was 95.8% with a false-positive rate of 1.8 clusters per image. This demonstrates that the automated detection of clustered microcalcifications in our CAD system is reliable as an aid to radiologists. (author)

  14. Application of the International Union of Radioecologists soil-to-plant database to Canadian settings

    International Nuclear Information System (INIS)

    The International Union of Radioecologists (IUR) has compiled a very large database of soil-to-plant transfer factors. These factors are ratios of the radionuclide concentration in dry plants divided by the corresponding concentration in dry soil to a specified depth or thickness. In this report the factors are called CR values, for concentration ratio. The CR values are empirical and are considered element-specific. The IUR database contains extensive data for Cs, Sr, Co, Pu and Np, and includes records for Am, Ce, Cm, I, La, Mn, Ni, Pb, Po, Ra, Ru, Sb, Tc, Th, U and Zn. Where there was a large amount of data, interpolation over ranges of soil conditions was possible. The tables presented here summarize the data in a way that should be immediately useful to modellers. Values are averaged for a number of crop types and species. Correction factors are developed to facilitate interpolation among soil conditions. The data tables in this report do not substitute for site-specific measurements, but they will provide data where measurement is impossible and give a background against which to check more recent data. (author) 4 refs., 48 tabs
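The way a modeller applies a CR value is simple arithmetic: predicted dry-plant concentration equals CR times dry-soil concentration. The CR value and soil concentration below are invented, not taken from the IUR tables.

```python
# Arithmetic behind a soil-to-plant concentration ratio (CR):
#   plant concentration (dry wt) = CR * soil concentration (dry wt)
# The CR value and soil concentration are hypothetical, for illustration only.

def plant_concentration(cr, soil_conc_bq_per_kg):
    """Predict dry-plant activity concentration from dry-soil concentration."""
    return cr * soil_conc_bq_per_kg

cr_cs_grass = 0.05     # hypothetical Cs CR for grass (NOT an IUR table value)
soil = 200.0           # Bq/kg dry soil, illustrative
plant = plant_concentration(cr_cs_grass, soil)
```

Because CR values are defined on a dry-weight basis to a specified soil depth, mixing in fresh-weight plant data or a different sampling depth would silently bias such a calculation.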

  15. A Spatiotemporal Database to Track Human Scrub Typhus Using the VectorMap Application.

    Directory of Open Access Journals (Sweden)

    Daryl J Kelly

    2015-12-01

    Scrub typhus is a potentially fatal mite-borne febrile illness, primarily of the Asia-Pacific Rim. With an endemic area greater than 13 million km2 and millions of people at risk, scrub typhus remains an underreported, often misdiagnosed febrile illness. A comprehensive, updatable map of the true distribution of cases has been lacking, and therefore the true risk of disease within the very large endemic area remains unknown. The purpose of this study was to establish a database and map to track human scrub typhus. An online search using PubMed and the United States Armed Forces Pest Management Board Literature Retrieval System was performed to identify articles describing human scrub typhus cases both within and outside the traditionally accepted endemic regions. Using World Health Organization guidelines, stringent criteria were used to establish diagnoses for inclusion in the database. The preliminary screening of 181 scrub typhus publications yielded 145 publications that met the case criterion, 267 case records, and 13 serosurvey records that could be georeferenced, describing 13,739 probable or confirmed human cases in 28 countries. A map service has been established within VectorMap (www.vectormap.org) to explore the role that the relative locations of vectors, hosts, and the pathogen play in the transmission of mite-borne scrub typhus. The online display of scrub typhus cases in VectorMap illustrates their presence and provides an up-to-date geographic distribution of proven scrub typhus cases.

  16. Database design: Community discussion board

    OpenAIRE

    Klepetko, Radim

    2009-01-01

    The goal of this thesis is to design a database for a discussion board application that provides classic discussion-board functionality plus Web 2.0 features. The emphasis lies on a precise description of the application requirements, which is then used to design an optimal database model independent of the technological implementation (the chosen database system). At the end of the thesis, the database design is tested using the MySQL database system.

  17. CoopSC: A Cooperative Database Caching Architecture

    OpenAIRE

    Vancea, A; Stiller, B.

    2010-01-01

    Semantic caching is a technique for optimizing the evaluation of database queries by caching the results of old queries and using them when answering new queries. CoopSC is a cooperative database caching architecture which extends the classic semantic caching approach by allowing clients to share their local caches in a cooperative manner. Cache entries of all clients are indexed in a distributed data structure constructed on top of a Peer-to-Peer (P2P) overlay netwo...
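The classic semantic-caching idea the abstract builds on can be illustrated with a toy single-attribute range cache: a new query is answered locally when a previously cached range subsumes it. This sketch is illustrative only (no remainder queries, no cooperative peer sharing), and all names are invented.

```python
# Toy illustration of semantic caching: keep results of earlier range
# predicates; answer a new range query from the cache when an old range
# covers it, otherwise fetch from the server. Single-attribute ranges only;
# CoopSC additionally shares such caches across P2P-indexed peers.

class SemanticCache:
    def __init__(self, fetch):
        self.fetch = fetch              # server-side query function
        self.entries = []               # list of ((lo, hi), rows)

    def query(self, lo, hi):
        for (clo, chi), rows in self.entries:
            if clo <= lo and hi <= chi:             # cached range subsumes query
                return [r for r in rows if lo <= r <= hi]
        rows = self.fetch(lo, hi)                   # cache miss: ask server
        self.entries.append(((lo, hi), rows))
        return rows

calls = []
def server_fetch(lo, hi):
    calls.append((lo, hi))              # record each round trip to the server
    return [v for v in range(100) if lo <= v <= hi]

cache = SemanticCache(server_fetch)
first = cache.query(10, 50)     # miss: one server call
second = cache.query(20, 30)    # hit: answered entirely from the cached range
```

A full implementation would also issue a "remainder" query for the uncached part of a partially overlapping range; the cooperative extension then lets the remainder be served from another client's cache instead of the server.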

  18. Construction and Application of the Underlying Database Based on CHS (Capital Harness System)

    Institute of Scientific and Technical Information of China (English)

    李荫荣

    2015-01-01

    Capital Symbol and Capital Library are modules of the Capital Harness System (CHS) software, which supports platform-based, standardized and data-driven design of automotive electrical systems. This paper describes how to create the underlying database for harness design in the Capital Symbol and Capital Library modules during CHS application, a process consisting of two parts: graphic drawing and data creation. It serves as a guide to building the CHS underlying database, providing the basis for completing subsequent automotive wiring-harness design work efficiently and with high quality.

  19. jSPyDB, an open source database-independent tool for data management

    Science.gov (United States)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools available for accessing Databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web-based tool written using Python and Javascript. It relies on jQuery and python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install, and to configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for a better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not provide users the possibility to directly execute any SQL statement.
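
The three-layer split described in the abstract (browser client, backend server, and databases reachable only from the server) can be illustrated with a minimal sketch. The table, query name and handler below are hypothetical, not jSPyDB's actual API; the point is that users select from pre-defined, parameterized queries rather than submitting raw SQL, and receive JSON the browser-side layer can render.

```python
import json
import sqlite3

# Minimal sketch of the server-side layer: only the server touches the
# database, and clients pick from pre-defined queries instead of sending
# arbitrary SQL statements (the names below are illustrative).

PREDEFINED_QUERIES = {
    "runs_after": "SELECT id, name FROM runs WHERE id > ?",
}

def handle_request(conn, query_name, params):
    sql = PREDEFINED_QUERIES[query_name]          # no user-supplied SQL
    rows = conn.execute(sql, params).fetchall()   # parameterized execution
    return json.dumps([{"id": r[0], "name": r[1]} for r in rows])

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO runs VALUES (?, ?)", [(1, "a"), (2, "b")])
payload = handle_request(conn, "runs_after", (1,))
```

Restricting clients to a whitelist of parameterized queries is one simple way to realize the security property the abstract mentions: users never execute SQL directly.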

  20. jSPyDB, an open source database-independent tool for data management

    International Nuclear Information System (INIS)

    Nowadays, the number of commercial tools available for accessing Databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web-based tool written using Python and Javascript. It relies on jQuery and python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install, and to configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for a better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not provide users the possibility to directly execute any SQL statement.

  1. Development Method of Wireless Application Based on Wireless Markup Language and Web Database Technology

    Institute of Scientific and Technical Information of China (English)

    ZHANG Li; SHAO Shi-huang; WANG Jian; YIN Mei-hua

    2002-01-01

Wireless technology is a newly emerging delivery network, and development schemes for the wireless Internet are currently receiving wide attention. In order to let international visitors surf an education website at any time and anywhere via mobile handsets, communication methods for Web databases such as CGI, ISAPI and JDBC have been analyzed, and a new approach based on Active Server Pages and the Wireless Markup Language (ASP-WML) is presented. Dynamic refreshing of the wireless website's homepage and automatic querying of its main information have been realized. Finally, the wireless website of Dong Hua University is taken as an example to verify the feasibility of the wireless website design described above.

  2. Photon and electron interaction databases and their use in medical applications

    International Nuclear Information System (INIS)

This paper discusses the All Particle Method photon and electron interaction and atomic relaxation databases, which were initially developed for use in medical applications. Currently these databases are being used in both medical and industrial applications. The All Particle Method databases are designed to allow modelling of individual collisions in as much detail as possible. Elastic scattering can be modelled as single, as opposed to multiple, scattering events. Ionization can be modelled at the atomic subshell level, defining which subshell was ionized, the spectrum of the initially emitted electron, as well as the spectra of electrons and photons emitted as the atom relaxes back to neutrality. These databases are currently being used in applications involving rather small spatial regions, where detailed calculations of individual events are required. While initially designed for use in medical applications, they are now being used in a variety of industrial applications, e.g., transport in microelectronics

  3. Client Involvement in Home Care Practice

    DEFF Research Database (Denmark)

    Glasdam, Stinne; Henriksen, Nina; Kjær, Lone;

    2013-01-01

‘Client involvement’ has been a mantra within health policies, education curricula and healthcare institutions over many years, yet very little is known about how ‘client involvement’ is practised in home-care services. The aim of this article is to analyse ‘client involvement’ in practice seen..., political and administrative frames that rule home-care practice. Client involvement is shown within four constructed analytical categories: ‘Structural conditions of providing and receiving home care’; ‘Client involvement inside the home: performing a professional task and living an everyday life...’; ‘Client involvement outside the home: liberal business and mutual goal setting’; and ‘Converting a home to a working place: refurnishing a life’. The meaning of involvement depends on the position from which it is viewed. On the basis of this analysis, we raise the question of the extent to which...

  4. Psychoanalytic psychotherapy with a client with bulimia nervosa.

    Science.gov (United States)

    Lunn, Susanne; Daniel, Sarah I F; Poulsen, Stig

    2016-06-01

This case study presents the progress of one patient with bulimia nervosa who was originally very compromised in psychological domains that are the focus of analytic treatment, and includes in-session therapeutic process and a range of outcomes, for example, eating disorder symptoms, attachment status, and reflective functioning. Nested in a study showing more rapid behavioral improvement in subjects receiving cognitive behavior therapy than in subjects receiving psychoanalytic psychotherapy, the case highlights the importance of supplementing RCTs with single case studies and the need to adapt the therapeutic approach, as well as the current therapeutic dialogue, to the individual client. (PsycINFO Database Record) PMID: 27267505

  5. Client Update: A Solution for Service Evolution

    OpenAIRE

    Ouederni, Meriem; Salaün, Gwen; Pimentel, Ernesto

    2011-01-01

In service-based systems, service evolution might raise critical communication issues, since the client cannot be aware of changes that have occurred on the black-box service side. In this paper, we propose an automated process to adapt the client to the changes that have occurred. Our approach relies on a compatibility measuring method and changes the client interface to ensure system compatibility. This solution is fully automated inside a prototype tool w...

  6. Briefing: The ICE intelligent client capability framework

    OpenAIRE

    Madter, N; Bower, DA

    2015-01-01

    Recent aspirations to transform the delivery of major capital programmes and projects in the public sector are focusing on the achievement of value for money, whole‐life asset management and sustainable procurement, embodied in the principles of the Intelligent Client. However, there is little support offered to those working in client functions to promote the development of the skills and behaviours that underpin effective client decision-making. In line with the re-launch Infrastructure UK'...

  7. Do client fees help or hurt?

    Science.gov (United States)

    Barnett, B

    1998-01-01

    This article discusses the impact of client fees for family planning (FP) services on cost recovery and level of user services in developing countries. The UN Population Fund reports that developing country governments currently pay 75% of the costs of FP programs. Donors contribute 15%, and clients pay 10%. Current pressures are on FP services to broaden and improve their scope, while user demand is increasing. Program managers should consider the program's need for funds and the clients' willingness to pay. Clients are willing to pay about 1% of their income for contraception. A study of sterilization acceptance in Mexico finds that the average monthly case load declined by 10% after the 1st price increase from $43 to $55 and declined by 58% after the 2nd price increase to $60. Fewer low-income clients requested sterilization. A CEMOPLAF study in Ecuador finds that in three price increase situations the number of clients seeking services declined, but the economic mix of clients remained about the same. The decline was 20% in the group with a 20% price increase and 26% in the 40% increase group. In setting fees, the first need is to determine unit costs. The Futures Group International recommends considering political, regulatory, and institutional constraints for charging fees; priorities for revenue use; protection for poor clients; and monitoring of money collection and expenditure. Management Sciences for Health emphasizes consideration of the reasons for collection of fees, client affordability, and client perception of quality issues. Sliding scales can be used to protect poor clients. Charging fees for laboratory services can subsidize poor clients. A Bangladesh program operated a restaurant and catering service in order to subsidize FP services. Colombia's PROFAMILIA sells medical and surgical services and a social marketing program in order to expand clinics. PMID:12293239

  8. An extensive surface model database for population- related information: concept and application

    OpenAIRE

    I Bracken

    1993-01-01

    Information technology has had a substantial impact on methods of geographical study. Although this varies between different application fields, effective use of the new technology requires careful consideration of the underlying concepts of geographic data representation. Here, an improved data model is described by means of raster techniques to represent population-related data in the form of surfaces. This type of model is seen as having broad applicability to the representation of socioec...

  9. Integrating the DLD dosimetry system into the Almaraz NPP Corporative Database

    International Nuclear Information System (INIS)

The article discusses the experience acquired during the integration of a new MGP Instruments DLD dosimetry system into the Almaraz NPP corporate database and general communications network, following a client-server philosophy and taking into account the computing standards of the plant. The most important results obtained are: integration of DLD dosimetry information into corporate databases, permitting the use of new applications; sharing of existing personnel information with the DLD dosimetry application, thereby avoiding the redundant work of introducing data and improving the quality of the information; easier maintenance of the DLD system, both software and hardware; maximum exploitation, from the computing point of view, of the initial investment; and adaptation of the application to the applicable legislation. (Author)

  10. Management system development to establish an alumni database: application to a nuclear institution

    International Nuclear Information System (INIS)

Following the professional evolution of alumni has been a long-standing aspiration. To address it, a system to collect data from graduate alumni was developed at IPEN - Nuclear and Energy Research Institute, with support from CNEN - National Nuclear Energy Commission. This system was introduced in 2006, during the 30-year celebration of the Nuclear Technology Graduate Program of IPEN, held in association with the University of Sao Paulo - USP. The main purpose is to follow the career development of the alumni, mainly those not employed in any of the institutes linked to CNEN. The developed system allowed the creation of a database comprising information about academic degrees, professional status and the extent of the alumni's contribution to society. It also makes it possible to follow whether the knowledge obtained remained restricted to universities and research institutes or reached private companies. The system allows several statistics to be compiled concerning not only the alumni but also the professors. In this work the first results of the data collection are presented, containing more than 750 responses from a total of around 1340 alumni. The final purpose is to upgrade this system to collect data, from the several institutes linked to CNEN, on both graduate and undergraduate alumni. (author)

  11. An Analysis of Database Replication Technologies with Regard to Deep Space Network Application Requirements

    Science.gov (United States)

    Connell, Andrea M.

    2011-01-01

    The Deep Space Network (DSN) has three communication facilities which handle telemetry, commands, and other data relating to spacecraft missions. The network requires these three sites to share data with each other and with the Jet Propulsion Laboratory for processing and distribution. Many database management systems have replication capabilities built in, which means that data updates made at one location will be automatically propagated to other locations. This project examines multiple replication solutions, looking for stability, automation, flexibility, performance, and cost. After comparing these features, Oracle Streams is chosen for closer analysis. Two Streams environments are configured - one with a Master/Slave architecture, in which a single server is the source for all data updates, and the second with a Multi-Master architecture, in which updates originating from any of the servers will be propagated to all of the others. These environments are tested for data type support, conflict resolution, performance, changes to the data structure, and behavior during and after network or server outages. Through this experimentation, it is determined which requirements of the DSN can be met by Oracle Streams and which cannot.
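
The difference between the two tested topologies can be illustrated with a toy model (this is an illustration of multi-master replication in general, not of Oracle Streams). Each replica maps a key to a (timestamp, value) pair; propagation pushes entries to the other replicas, and concurrent updates to the same key are resolved last-writer-wins, one simple form of the conflict resolution the experiment examines. The key name below is made up.

```python
# Toy multi-master replication with last-writer-wins conflict resolution.
# In a Master/Slave setup only one replica would ever be the update source;
# here either replica may originate updates, so conflicts must be resolved.

def apply_update(replica, key, ts, value):
    current = replica.get(key)
    if current is None or ts > current[0]:   # the newer timestamp wins
        replica[key] = (ts, value)

def propagate(source, targets):
    for key, (ts, value) in source.items():
        for target in targets:
            apply_update(target, key, ts, value)

# two masters receive conflicting updates to the same (hypothetical) key
a, b = {}, {}
apply_update(a, "antenna_mode", ts=1, value="telemetry")
apply_update(b, "antenna_mode", ts=2, value="command")
propagate(a, [b])
propagate(b, [a])
# after propagation in both directions, the replicas converge on the later write
```

Real systems must also handle equal timestamps, clock skew, and outages (the behaviors the DSN tests probed), which this sketch deliberately omits.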

  12. Constructing Database for Drugs and its Application to Biological Sample by HPTLC and GC/MS

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Y.C.; Park, S.W.; Lim, M.A.; Baeck, S.K.; Park, S.Y.; Lee, J.S.; Lee, J.S. [National Institute of Scientific investigation, Seoul (Korea); Lho, D.S. [Korea Institute of Science and Technology, Seoul (Korea)

    2000-04-01

For the identification of unknown drugs in biological samples, we applied a rapid, sensitive and selective high performance thin layer chromatography (HPTLC) method using an automated TLC sampler and an ultraviolet (UV) scanner. We constructed an HPTLC database (DB) of two hundred five drugs using Rf values and UV spectra (scan 200-360 nm), as well as a gas chromatography/mass spectrometry (GC/MS) DB of ninety-six drugs using retention times relative (RRT) to lidocaine and mass spectra. After extracting drugs from biological samples by solid phase extraction (Clean Screen ZSDAU020), we applied them to the HPTLC and GC/MS DBs. Drugs extracted from biological samples showed good matching ratios to the HPTLC DB, and these drugs were confirmed by GC/MS. In conclusion, this DB system is thought to be a very useful method for the screening of unknown drugs in biological samples. (author). 9 refs., 2 tabs., 6 figs.
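
The screening step (matching a measured Rf value against the database within a tolerance) can be sketched as follows; the drug names, Rf values and tolerance are invented for illustration and are not taken from the paper's DB.

```python
# Sketch of Rf-based screening against a reference database. A measured Rf
# value returns all candidate drugs within a tolerance; candidates would then
# be confirmed by a second technique such as GC/MS. Values are hypothetical.

HPTLC_DB = {"caffeine": 0.58, "diazepam": 0.72, "morphine": 0.21}

def screen(rf_measured, tolerance=0.03):
    """Return reference drugs whose Rf lies within `tolerance` of the measurement."""
    return sorted(name for name, rf in HPTLC_DB.items()
                  if abs(rf - rf_measured) <= tolerance)

candidates = screen(0.57)
```

A full implementation would additionally compare UV spectra, since Rf alone rarely identifies a compound uniquely.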

  13. [Application characteristics and situation analysis of volatile oils in database of Chinese patent medicine].

    Science.gov (United States)

    Wang, Sai-Jun; Wu, Zhen-Feng; Yang, Ming; Wang, Ya-Qi; Hu, Peng-Yi; Jie, Xiao-Lu; Han, Fei; Wang, Fang

    2014-09-01

    Aromatic traditional Chinese medicines have a long history in China, with wide varieties. Volatile oils are active ingredients extracted from aromatic herbal medicines, which usually contain tens or hundreds of ingredients, with many biological activities. Therefore, volatile oils are often used in combined prescriptions and made into various efficient preparations for oral administration or external use. Based on the sources from the database of Newly Edited National Chinese Traditional Patent Medicines (the second edition), the author selected 266 Chinese patent medicines containing volatile oils in this paper, and then established an information sheet covering such items as name, dosage, dosage form, specification and usage, and main functions. Subsequently, on the basis of the multidisciplinary knowledge of pharmaceutics, traditional Chinese pharmacology and basic theory of traditional Chinese medicine, efforts were also made in the statistics of the dosage form and usage, variety of volatile oils and main functions, as well as the status analysis on volatile oils in terms of the dosage form development, prescription development, drug instruction and quality control, in order to lay a foundation for the further exploration of the market development situations of volatile oils and the future development orientation. PMID:25522633

  14. Student stock exchange application development on Android mobile platform

    OpenAIRE

    Balažic, Mitja

    2011-01-01

    The thesis describes the development of a mobile application that runs on Android platform and enables stock trading on Student stock exchange (Študentska borza). Mobile operating system Android and software development kit paired with Eclipse are first described. Web server Apache running PHP and MySQL database were used for server-side application. Technologies like REST, JSON and SSL are used for communication between client and server. Central part of the thesis describes application a...

  15. A qualitative meta-analysis examining clients' experiences of psychotherapy: A new agenda.

    Science.gov (United States)

    Levitt, Heidi M; Pomerville, Andrew; Surace, Francisco I

    2016-08-01

This article argues that psychotherapy practitioners and researchers should be informed by the substantive body of qualitative evidence that has been gathered to represent clients' own experiences of therapy. The current meta-analysis examined qualitative research studies analyzing clients' experiences within adult individual psychotherapy that appeared in English-language journals. This omnibus review integrates research from across psychotherapy approaches and qualitative methods, focusing on the cross-cutting question of how clients experience therapy. It utilized an innovative method in which 67 studies were subjected to a grounded theory meta-analysis in order to develop a hierarchy of data, and then 42 additional studies were added into this hierarchy using a content meta-analytic method, summing to 109 studies in total. Findings highlight the critical psychotherapy experiences for clients, based upon robust findings across these research studies. Process-focused principles for practice are generated that can enrich therapists' understanding of their clients in key clinical decision-making moments. Based upon these findings, an agenda is suggested in which research is directed toward heightening therapists' understanding of clients and recognizing them as agents of change within sessions, supporting the client-as-self-healer paradigm. This research aims to improve therapists' sensitivity to clients' experiences and thus can expand therapists' attunement and intentionality in shaping interventions in accordance with whichever theoretical orientation is in use. The article advocates for the full integration of the qualitative literature in psychotherapy research, in which variables are conceptualized in reference to an understanding of clients' experiences in sessions. (PsycINFO Database Record) PMID: 27123862

  16. Client Server Model Based DAQ System for Real-Time Air Pollution Monitoring

    Directory of Open Access Journals (Sweden)

    Vetrivel. P

    2014-01-01

The proposed system consists of a client-server model based data-acquisition unit. The embedded web server integrates the pollution server and a DAQ that collects air pollutant levels (CO, NO2, and SO2). The pollution server is designed with modern resource-constrained embedded systems in mind. In contrast, an application server is designed for the efficient execution of programs and scripts supporting the construction of various applications. While a pollution server mainly deals with sending HTML for display in a web browser on the client terminal, an application server provides access to server-side logic for pollutant levels to be used by client application programs. The embedded web server is an ARM MCB2300 board with Internet connectivity; this standalone device both gathers air pollutant levels and serves them, and is accessed by various clients.
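
A minimal version of such a pollution server can be sketched with Python's standard library (on the actual MCB2300 board this would be C firmware; the endpoint path and readings below are illustrative): the server holds the latest pollutant levels and returns them to any HTTP client.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical latest readings from the DAQ (units omitted for brevity)
READINGS = {"CO": 4.2, "NO2": 0.053, "SO2": 0.019}

class PollutionHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # serve the current pollutant levels as JSON to any client
        body = json.dumps(READINGS).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), PollutionHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/levels") as resp:
    data = json.loads(resp.read())
server.shutdown()
```

Serving JSON rather than pre-rendered HTML matches the abstract's distinction: it makes the same endpoint usable both by browsers and by client application programs.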

  17. Database replication

    OpenAIRE

    Popov, P. T.; Stankovic, V.

    2014-01-01

A fault-tolerant node for synchronous heterogeneous database replication and a method for performing a synchronous heterogeneous database replication at such a node are provided. A processor executes a computer program to generate a series of database transactions to be carried out at the fault-tolerant node. The fault-tolerant node comprises at least two relational database management systems, each of which is a different relational database management system product, each implementing snapsh...

  18. Communicative Databases

    OpenAIRE

    Yu, Kwang-I

    1981-01-01

A hierarchical organization stores its information in a large number of databases. These databases are interrelated, forming a closely-coupled database system. Traditional information systems and current database management systems do not have a means of expressing these relationships. This thesis describes a model of the information structure of the hierarchical organization that identifies the nature of database relationships. It also describes the design and implementatio...

  19. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  20. SISSY: An example of a multi-threaded, networked, object-oriented database application

    International Nuclear Information System (INIS)

The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third-party software applications (NetCentral Station, Wingz, DataLink), all of which act together as a single application to monitor and analyze the PDSF

  1. The Influence of Client Fees on Evaluations By Clients of Counseling Outcome.

    Science.gov (United States)

    Shipton, Brian; Spain, Armelle

    1980-01-01

    Psychoanalytic theory and cognitive dissonance theory predict that clients who pay a fee for counseling benefit more than clients who do not pay. Results of this study suggest that paying a fee does not significantly influence counseling outcome as measured by client evaluations of counselors. (Author)

  2. Materials Properties Database for Selection of High-Temperature Alloys and Concepts of Alloy Design for SOFC Applications

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Z Gary; Paxton, Dean M.; Weil, K. Scott; Stevenson, Jeffry W.; Singh, Prabhakar

    2002-11-24

    To serve as an interconnect / gas separator in an SOFC stack, an alloy should demonstrate the ability to provide (i) bulk and surface stability against oxidation and corrosion during prolonged exposure to the fuel cell environment, (ii) thermal expansion compatibility with the other stack components, (iii) chemical compatibility with adjacent stack components, (iv) high electrical conductivity of the surface reaction products, (v) mechanical reliability and durability at cell exposure conditions, (vii) good manufacturability, processability and fabricability, and (viii) cost effectiveness. As the first step of this approach, a composition and property database was compiled for high temperature alloys in order to assist in determining which alloys offer the most promise for SOFC interconnect applications in terms of oxidation and corrosion resistance. The high temperature alloys of interest included Ni-, Fe-, Co-base superal

  3. SISMA (Site of Italian Strong Motion Accelerograms): a Web-Database of Ground Motion Recordings for Engineering Applications

    International Nuclear Information System (INIS)

    The paper describes a new website called SISMA, i.e. Site of Italian Strong Motion Accelerograms, which is an Internet portal intended to provide natural records for use in engineering applications for dynamic analyses of structural and geotechnical systems. SISMA contains 247 three-component corrected motions recorded at 101 stations from 89 earthquakes that occurred in Italy in the period 1972-2002. The database of strong motion accelerograms was developed in the framework of a joint project between Sapienza University of Rome and University of California at Los Angeles (USA) and is described elsewhere. Acceleration histories and pseudo-acceleration response spectra (5% damping) are available for download from the website. Recordings can be located using simple search parameters related to seismic source and the recording station (e.g., magnitude, Vs30, etc) as well as ground motion characteristics (e.g. peak ground acceleration, peak ground velocity, peak ground displacement, Arias intensity, etc.)

  4. The X-Files: Investigating Alien Performance in a Thin-client World

    OpenAIRE

    Gunther, Neil J.

    2000-01-01

    Many scientific applications use the X11 window environment; an open source windows GUI standard employing a client/server architecture. X11 promotes: distributed computing, thin-client functionality, cheap desktop displays, compatibility with heterogeneous servers, remote services and administration, and greater maturity than newer web technologies. This paper details the author's investigations into close encounters with alien performance in X11-based seismic applications running on a 200-n...

  5. Web Technologies And Databases

    OpenAIRE

    Irina-Nicoleta Odoraba

    2011-01-01

A database is a collection of many types of occurrences of logical records, containing relationships between records and elementary data aggregates. A database management system (DBMS) is a set of programs for creating and operating a database. Theoretically, any relational DBMS can be used to store the data needed by a Web server. In practice, it has been observed that simple DBMSs such as FoxPro or Access are not suitable for Web sites that are used intensively. For large-scale Web applications...

  6. Nuclear Science References Database

    OpenAIRE

    PRITYCHENKO B.; Běták, E.; B. Singh; Totans, J.

    2013-01-01

    The Nuclear Science References (NSR) database together with its associated Web interface, is the world's only comprehensive source of easily accessible low- and intermediate-energy nuclear physics bibliographic information for more than 210,000 articles since the beginning of nuclear science. The weekly-updated NSR database provides essential support for nuclear data evaluation, compilation and research activities. The principles of the database and Web application development and maintenance...

  7. Storing an OWL 2 Ontology in a Relational Database Structure

    OpenAIRE

    Gorskis, Henrihs; Borisov, Arkady

    2015-01-01

    This paper examines the possibility of storing OWL 2 based ontology information in a classical relational database and reviews some existing methods for ontology databases. In most cases a database is a fitting solution for storing and sharing information among systems, clients or agents. Similarly, in order to make domain ontology information more accessible to systems, in a comparable way, it can be stored and provided in a database form. As of today, there is no consensus on a specific ont...

  8. DataBase on demand

    CERN Document Server

    Aparicio, Ruben Gaspar; Coterillo Coz, I

    2012-01-01

At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the actual centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, e.g. presently the open community version of MySQL and single-instance Oracle database servers. This article describes a technology approach to face this challenge, a service level agreement (the SLA that the project provides), and an evolution of possible scenarios.

  9. DataBase on Demand

    Science.gov (United States)

    Gaspar Aparicio, R.; Gomez, D.; Coterillo Coz, I.; Wojcik, D.

    2012-12-01

At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the actual centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, e.g. presently the open community version of MySQL and single-instance Oracle database servers. This article describes a technology approach to face this challenge, a service level agreement (the SLA that the project provides), and an evolution of possible scenarios.

  10. DataBase on Demand

    International Nuclear Information System (INIS)

    At CERN a number of key database applications run on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, presently the open community version of MySQL and a single-instance Oracle database server. This article describes the technology approach taken to face this challenge, the service level agreement (SLA) that the project provides, and possible evolution scenarios.

  11. ERDDAP - An Easier Way for Diverse Clients to Access Scientific Data From Diverse Sources

    Science.gov (United States)

    Mendelssohn, R.; Simons, R. A.

    2008-12-01

    ERDDAP is a new open-source, web-based service that aggregates data from other web services: OPeNDAP grid servers (THREDDS), OPeNDAP sequence servers (Dapper), NOS SOAP service, SOS (IOOS, OOStethys), microWFS, DiGIR (OBIS, BMDE). Regardless of the data source, ERDDAP makes all datasets available to clients via standard (and enhanced) DAP requests and makes some datasets accessible via WMS. A client's request also specifies the desired format for the results, e.g., .asc, .csv, .das, .dds, .dods, htmlTable, XHTML, .mat, netCDF, .kml, .png, or .pdf (formats more directly useful to clients). ERDDAP interprets a client request, requests the data from the data source (in the appropriate way), reformats the data source's response, and sends the result to the client. Thus ERDDAP makes data from diverse sources available to diverse clients via standardized interfaces. Clients don't have to install libraries to get data from ERDDAP because ERDDAP is RESTful and resource-oriented: a URL completely defines a data request and the URL can be used in any application that can send a URL and receive a file. This also makes it easy to use ERDDAP in mashups with other web services. ERDDAP could be extended to support other protocols. ERDDAP's hub and spoke architecture simplifies adding support for new types of data sources and new types of clients. ERDDAP includes metadata management support, catalog services, and services to make graphs and maps.
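
    Because ERDDAP is RESTful, a data request is nothing more than a URL whose file-type suffix selects the response format. As a minimal sketch of that idea (the server address, dataset id, and the helper function name below are hypothetical, not part of ERDDAP), a tabledap-style request URL can be assembled like this:

```python
def erddap_url(server, dataset, variables, constraints=(), fmt="csv"):
    """Build an ERDDAP-style tabledap request URL.

    The URL alone fully describes the request: the dataset id, the
    response format (file-type suffix such as .csv or .nc), the
    projected variables, and any constraint expressions.
    """
    query = ",".join(variables)
    if constraints:
        query += "&" + "&".join(constraints)
    return f"{server}/tabledap/{dataset}.{fmt}?{query}"

# hypothetical server and dataset, for illustration only
url = erddap_url(
    "https://example.org/erddap",
    "sst_monthly",
    ["time", "latitude", "longitude", "sst"],
    ["time>=2008-01-01", "latitude>=30"],
)
print(url)
```

    Any client that can send a URL and receive a file (a browser, curl, or a plotting script) can consume the result, with no ERDDAP-specific library required.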

  12. Handling of network and database instabilities in CORAL

    International Nuclear Information System (INIS)

    The CORAL software is widely used by the LHC experiments for storing and accessing data using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several back-ends and deployment models, direct client access to Oracle servers being one of the most important use cases. Since 2010, several problems have been reported by the LHC experiments in their use of Oracle through CORAL, involving application errors, hangs or crashes after the network or the database servers became temporarily unavailable. CORAL already provided some handling of these instabilities, which are due to external causes and cannot be avoided; however, in some cases this handling proved insufficient, and in others it was itself the cause of further problems, such as the hangs and crashes mentioned above. As a consequence, a major redesign of the CORAL plugins has been implemented, with the aim of making the software more robust against these database and network glitches. The new implementation ensures that CORAL automatically reconnects to Oracle databases in a transparent way whenever possible, and gently terminates the application when this is not possible. Internally, this is done by resetting all relevant parameters of the underlying back-end technology (OCI, the Oracle Call Interface). This presentation reports on the status of this work at the time of the CHEP2012 conference, covering the design and implementation of these new features and the outlook for future developments in this area.

  13. Handling of network and database instabilities in CORAL

    Science.gov (United States)

    Trentadue, R.; Valassi, A.; Kalkhof, A.

    2012-12-01

    The CORAL software is widely used by the LHC experiments for storing and accessing data using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several back-ends and deployment models, direct client access to Oracle servers being one of the most important use cases. Since 2010, several problems have been reported by the LHC experiments in their use of Oracle through CORAL, involving application errors, hangs or crashes after the network or the database servers became temporarily unavailable. CORAL already provided some handling of these instabilities, which are due to external causes and cannot be avoided; however, in some cases this handling proved insufficient, and in others it was itself the cause of further problems, such as the hangs and crashes mentioned above. As a consequence, a major redesign of the CORAL plugins has been implemented, with the aim of making the software more robust against these database and network glitches. The new implementation ensures that CORAL automatically reconnects to Oracle databases in a transparent way whenever possible, and gently terminates the application when this is not possible. Internally, this is done by resetting all relevant parameters of the underlying back-end technology (OCI, the Oracle Call Interface). This presentation reports on the status of this work at the time of the CHEP2012 conference, covering the design and implementation of these new features and the outlook for future developments in this area.
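
    The reconnect-then-give-up behaviour described in the abstract can be illustrated with a small sketch. The class and function names here are invented, and CORAL's real implementation resets OCI session state rather than a toy object:

```python
class ConnectionLost(Exception):
    """Stands in for a network or database-server glitch."""

class FlakyConnection:
    """Toy session whose first two queries fail with a dropped link."""
    def __init__(self):
        self.failures_left = 2

    def reconnect(self):
        pass  # in CORAL this would reset the relevant OCI parameters

    def query(self, sql):
        if self.failures_left > 0:
            self.failures_left -= 1
            raise ConnectionLost("network glitch")
        return f"rows for: {sql}"

def run_with_reconnect(conn, sql, max_retries=3):
    """Retry a query, transparently reconnecting after each glitch, and
    terminate gently (a clean error, not a hang) when retries run out."""
    for attempt in range(max_retries + 1):
        try:
            return conn.query(sql)
        except ConnectionLost:
            if attempt == max_retries:
                raise RuntimeError("database unavailable, giving up")
            conn.reconnect()

result = run_with_reconnect(FlakyConnection(), "SELECT 1")
print(result)
```

    The key design point mirrored here is that transient failures are absorbed transparently, while a persistent outage surfaces as a clean, catchable error instead of a hang or crash.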

  14. Can You Diagnose Me Now? A Proposal to Modify FDA's Regulation of Smartphone Mobile Health Applications with a Pre-Market Notification and Application Database System.

    Science.gov (United States)

    McInerney, Stephen

    2015-01-01

    Mobile applications provide limitless possibilities for the future of medical care. Yet these changes have also created concerns about patient safety. Under the Federal Food, Drug, and Cosmetic Act (FDCA), the Food and Drug Administration (FDA) has the authority to regulate a much broader spectrum of products beyond traditional medical devices like stethoscopes or pacemakers. The regulatory question is not whether FDA has the statutory authority to regulate health-related software, but rather how it will exercise its regulatory authority. In September 2013, FDA published guidance on Mobile Medical Applications; in it, the Agency limited its oversight to a small subset of medical-related mobile applications, referred to as "mobile medical applications." For the guidance to be effective, FDA must continue to work directly with all actors--including innovators, doctors, and patients--as the market for mobile health applications continues to develop. This Article argues that FDA should adopt a two-step plan--a pre-market notification program and a mobile medical application database--to aid in the successful implementation of its 2013 guidance. By doing so, FDA will ensure that this burgeoning market can reach its fullest potential. PMID:26292476

  15. Developing a virtual reality application for training Nuclear Power Plant operators: Setting up a database containing dose rates in the refuelling plant

    International Nuclear Information System (INIS)

    Operators in Nuclear Power Plants can receive high doses during refuelling operations. A training programme for simulating refuelling operations will be useful in reducing the doses received by workers as well as minimising operation time. With this goal in mind, a virtual reality application is developed within the framework of the CIPRES project. The application requires doses, both instantaneous and accumulated, to be displayed at all times during operator training. Therefore, it is necessary to set up a database containing dose rates at every point in the refuelling plant. This database is based on radiological protection surveillance data measured in the plant during refuelling operations. Some interpolation routines have been used to estimate doses through the refuelling plant. Different assumptions have been adopted in order to perform the interpolation and obtain consistent data. In this paper, the procedures developed to set up the dose database for the virtual reality application are presented and analysed. (authors)

  16. Developing a virtual reality application for training nuclear power plant operators: setting up a database containing dose rates in the refuelling plant.

    Science.gov (United States)

    Ródenas, J; Zarza, I; Burgos, M C; Felipe, A; Sánchez-Mayoral, M L

    2004-01-01

    Operators in Nuclear Power Plants can receive high doses during refuelling operations. A training programme for simulating refuelling operations will be useful in reducing the doses received by workers as well as minimising operation time. With this goal in mind, a virtual reality application is developed within the framework of the CIPRES project. The application requires doses, both instantaneous and accumulated, to be displayed at all times during operator training. Therefore, it is necessary to set up a database containing dose rates at every point in the refuelling plant. This database is based on radiological protection surveillance data measured in the plant during refuelling operations. Some interpolation routines have been used to estimate doses through the refuelling plant. Different assumptions have been adopted in order to perform the interpolation and obtain consistent data. In this paper, the procedures developed to set up the dose database for the virtual reality application are presented and analysed. PMID:15266073
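
    The paper does not specify which interpolation routines were used, but the kind of estimation such a dose database needs can be sketched with a simple inverse-distance weighting routine; the survey readings below are invented for illustration:

```python
def idw_dose_rate(points, x, y, power=2.0):
    """Estimate the dose rate at (x, y) by inverse-distance weighting
    of measured survey points given as (xi, yi, dose_i) triples."""
    num = den = 0.0
    for xi, yi, dose in points:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return dose          # exactly at a measured survey point
        weight = 1.0 / d2 ** (power / 2.0)
        num += weight * dose
        den += weight
    return num / den

# invented radiological survey readings: (position, dose rate)
survey = [(0, 0, 10.0), (10, 0, 2.0), (0, 10, 4.0)]
at_point = idw_dose_rate(survey, 0, 0)   # query at a measured point
between = idw_dose_rate(survey, 5, 5)    # equidistant from all three points
```

    At a measured point the routine returns the survey reading itself; elsewhere nearby readings dominate the estimate, which matches the paper's requirement that interpolated values stay consistent with the surveillance data.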

  17. A virtual repository approach to clinical and utilization studies: application in mammography as alternative to a national database.

    OpenAIRE

    Ohno-Machado, L.; Boxwala, A. A.; Ehresman, J.; Smith, D N; Greenes, R. A.

    1997-01-01

    A national mammography database was proposed, based on a centralized architecture for collecting, monitoring, and auditing mammography data. We have developed an alternative architecture relying on Internet-based distributed queries to heterogeneous databases. This architecture creates a "virtual repository", or a federated database which is constructed dynamically, for each query and makes use of data available in legacy systems. It allows the construction of custom-tailored databases at ind...

  18. SISSY: An example of a multi-threaded, networked, object-oriented database application

    International Nuclear Information System (INIS)

    The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third party software applications (NetCentral Station, Wingz, DataLink) all of which act together as a single application to monitor and analyze the PDSF

  19. Indoor Location Fingerprinting with Heterogeneous Clients

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun

    2011-01-01

    Heterogeneous wireless clients measure signal strength differently. This is a fundamental problem for indoor location fingerprinting, and it has a high impact on the positioning accuracy. Mapping-based solutions have been presented that require manual and error-prone calibration for each new client...
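
    Mapping-based solutions of this kind typically fit a simple function that translates one device's signal-strength scale onto another's. As an illustrative sketch (the RSSI calibration pairs below are invented), a least-squares linear fit might look like:

```python
def fit_linear_mapping(xs, ys):
    """Least-squares fit ys ~= a*xs + b, e.g. mapping RSSI values
    reported by a reference client onto a new client's scale."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# hypothetical calibration pairs: (reference dBm, new-device dBm)
ref = [-40, -55, -70, -85]
new = [-35, -47, -59, -71]
a, b = fit_linear_mapping(ref, new)
```

    Once fitted, the mapping lets fingerprints collected with the reference device be reused for the new client, which is exactly the calibration step the abstract describes as manual and error-prone.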

  20. Improving UK client-contractor relations

    International Nuclear Information System (INIS)

    The client's aim in any decommissioning project is that the originally intended end point is achieved, within budget and on time. The contractor's aim is to have a satisfied client, so that both are happy to work together again, and to have a reasonable return for his efforts. How can these - not incompatible - aims best be achieved? (UK)

  1. Counselors' Accounts of Their Clients' Spiritual Experiences.

    Science.gov (United States)

    Holden, Janice Miner

    2000-01-01

    Introduces a special section within this issue of Counseling and Values that focuses on counselors' accounts of their clients' transpersonal experiences. The eight articles in this special section discuss ten types of transpersonal experiences. Clients range in age from early 20s to early 80s. Experiences occurred in various settings and were…

  2. Meeting Client Resistance and Reactance with Reverence.

    Science.gov (United States)

    Cowan, Eric W.; Presbury, Jack H.

    2000-01-01

    Explores the meaning and function of client resistance in counseling from various theoretical orientations. A relational model of resistance is offered that redefines traditional formulations and has implications for clinical treatment. A vignette illustrates that the counselor's contribution to the emergence of client resistance is a relevant…

  3. Client Motivation and Rehabilitation Counseling Outcome.

    Science.gov (United States)

    Salomone, Paul R.

    This study investigates the relationship between client motivation or lack of motivation for vocational rehabilitation services, and rehabilitation outcome. Clients who had received services at a rehabilitation center during a two year period were rated on their level of motivation for rehabilitation services using the contents of diagnostic…

  4. YASGUI: Not Just Another SPARQL Client

    NARCIS (Netherlands)

    L. Rietveld; R. Hoekstra

    2013-01-01

    This paper introduces YASGUI, a user-friendly SPARQL client. We compare YASGUI with other SPARQL clients, and show the added value and ease of integrating Web APIs, services, and new technologies such as HTML5. Finally, we discuss some of the challenges we encountered in using these technologies for

  5. Organizational and Client Commitment among Contracted Employees

    Science.gov (United States)

    Coyle-Shapiro, Jacqueline A-M.; Morrow, Paula C.

    2006-01-01

    This study examines affective commitment to employing and client organizations among long-term contracted employees, a new and growing employment classification. Drawing on organizational commitment and social exchange literatures, we propose two categories of antecedents of employee commitment to client organizations. We tested our hypotheses…

  6. Training Therapists about Client Expectations of Psychotherapy.

    Science.gov (United States)

    Soley, Georgia; Marshall, Renee; Chambliss, Catherine

    Research has indicated that premature termination of therapy is sometimes due to a conflict in goal and outcome expectations between therapists and family members of clients. The present study requested both therapists and parents of child clients to complete questionnaires to determine if there is congruence between therapist and parental…

  7. GRAD: On Graph Database Modeling

    OpenAIRE

    Ghrab, Amine; Romero, Oscar; Skhiri, Sabri; Vaisman, Alejandro; Zimányi, Esteban

    2016-01-01

    Graph databases have emerged as the fundamental technology underpinning trendy application domains where traditional databases are not well-equipped to handle complex graph data. However, current graph databases support basic graph structures and integrity constraints with no standard algebra. In this paper, we introduce GRAD, a native and generic graph database model. GRAD goes beyond traditional graph database models, which support simple graph structures and constraints. Instead, GRAD pres...

  8. Semantics-based metrics and algorithms for dynamic content in web database applications

    OpenAIRE

    Papastavrou, Stavros C.; Παπασταύρου, Σταύρος Κ.

    2009-01-01

    Dynamic Web Content technology links traditional databases with the World Wide Web, allowing databases to be viewed and updated through dynamic web pages. With the advent of the Common Gateway Interface (CGI), dynamic content technology has helped carry traditional popular applications over to the online world. Examples of popular applications include...

  9. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  10. The Handling of Data Stored in Databases

    OpenAIRE

    Marius-Cristian Apetrei; Bogdanel Marian Dragut; Dominic Perez-Danielescu

    2013-01-01

    This paper describes some techniques for manipulating data stored in databases. It first outlines the main features of a network, their classification, the client-server concept, and the data manipulation language. The paper is divided into two parts: the first deals with the ADO.NET object and its characteristics, and the second with methods for handling data stored in the database.

  11. Combinatorial Design of Some Database Application Technologies Based on B/S

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Evolving from the principal-subordinate structure of C/S to a flexible, multi-leveled distributed structure, i.e. the B/S architecture, so as to form a wide, distributed and orderly Internet/Intranet integrated management information system, is the trend of development of application software worldwide. The advantages and disadvantages of the two modes, C/S and B/S, are compared. It is pointed out that at present a B/S mode alone cannot yet fully fulfil the demands of some complicated data processing, inform...

  12. Open source database of images DEIMOS: extension for large-scale subjective image quality assessment

    Science.gov (United States)

    Vítek, Stanislav

    2014-09-01

    DEIMOS (Database of Images: Open Source) is an open-source database of images and video sequences for testing, verification and comparison of various image and/or video processing techniques such as compression, reconstruction and enhancement. This paper deals with an extension of the database that allows large-scale web-based subjective image quality assessment to be performed. The extension implements both an administrative and a client interface. The proposed system is aimed mainly at mobile communication devices and takes advantage of HTML5 technology, meaning that participants do not need to install any application and the assessment can be performed in a web browser. The assessment campaign administrator can select images from the large database and then apply rules defined by various test procedure recommendations. The standard test procedures may be fully customized and saved as a template. Alternatively, the administrator can define a custom test, using images from the pool and other components, such as evaluating forms and ongoing questionnaires. The image sequence is delivered to the online client, e.g. a smartphone or tablet, as a fully automated assessment sequence, or the viewer can control the timing of the assessment if required. Environmental data and viewing conditions (e.g. illumination, vibrations, GPS coordinates, etc.) may be collected and subsequently analyzed.

  13. jSPyDB, an open source database-independent tool for data management

    CERN Document Server

    Pierro, Giuseppe Antonio

    2010-01-01

    Nowadays, the number of commercial tools for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open source, they provide interfaces only for a specific kind of database, and they are platform-dependent and very CPU- and memory-intensive. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler for different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. ...
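
    The server-side layer described above, i.e. the only component allowed to touch the database, can be sketched in a few lines. For a self-contained example this sketch uses Python's built-in sqlite3 module in place of SQLAlchemy, with an in-memory table of invented data:

```python
import json
import sqlite3

def fetch_rows(conn, sql):
    """Server-side handler: run a read-only query and return the rows as
    JSON-serialisable dicts for the browser client to render."""
    conn.row_factory = sqlite3.Row
    return [dict(row) for row in conn.execute(sql)]

# in-memory database with invented data, standing in for a real back-end
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO runs VALUES (?, ?)", [(1, "ok"), (2, "bad")])
print(json.dumps(fetch_rows(conn, "SELECT * FROM runs WHERE status = 'ok'")))
```

    Keeping database connectivity on the server and shipping only plain JSON to the browser is what lets the front end stay a thin, safe display layer, as the abstract describes.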

  14. Community Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This excel spreadsheet is the result of merging at the port level of several of the in-house fisheries databases in combination with other demographic databases...

  15. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  16. Database Systems - Present and Future

    OpenAIRE

    Ion LUNGU; Manole VELICANU; Iuliana BOTHA

    2009-01-01

    Database systems nowadays play an increasingly important role in the knowledge-based society, in which computers have penetrated all fields of activity and the Internet tends to develop worldwide. In the current informatics context, the development of applications with databases is the work of specialists. Using databases, accessing a database from various applications, and related concepts have become accessible to all categories of IT users. This paper aims to summariz...

  17. Hydrogen Leak Detection Sensor Database

    Science.gov (United States)

    Baker, Barton D.

    2010-01-01

    This slide presentation reviews the characteristics of the Hydrogen Sensor database. The database is the result of NASA's continuing interest in and improvement of its ability to detect and assess gas leaks in space applications. The database specifics and a snapshot of an entry in the database are reviewed. Attempts were made to determine the applicability of each of the 65 sensors for ground and/or vehicle use.

  18. Content independence in multimedia databases

    OpenAIRE

    Vries, de, P.M.

    2001-01-01

    A database management system is a general-purpose software system that facilitates the processes of defining, constructing, and manipulating databases for various applications. This article investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. The notions of content abstraction and content independence are introduced, which clearly expose the unique challenges (for database architecture) of applications in...

  19. Using Solid State Drives as a Mid-Tier Cache in Enterprise Database OLTP Applications

    Science.gov (United States)

    Khessib, Badriddine M.; Vaid, Kushagra; Sankar, Sriram; Zhang, Chengliang

    When originally introduced, flash-based solid state drives (SSDs) exhibited very high random read throughput with low sub-millisecond latencies. However, in addition to their steep prices, SSDs suffered from slow write rates and reliability concerns related to cell wear. For these reasons, they were relegated to a niche status in the consumer and personal computer market. Since then, several architectural enhancements have been introduced that led to a substantial increase in random write performance as well as a reasonable improvement in reliability. From a purely performance point of view, these high I/O rates and improved reliability make SSDs an ideal choice for enterprise On-Line Transaction Processing (OLTP) applications. However, from a price/performance point of view, the case for SSDs may not be clear. Enterprise-class SSD price per GB continues to be at least 10x higher than that of conventional magnetic hard disk drives (HDDs), despite a considerable drop in flash chip prices.
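
    A mid-tier arrangement like the one the title describes can be sketched as a two-level LRU: a small DRAM tier in front of a larger SSD tier, with the HDD as backing store. The class design and tier sizes below are invented for illustration and do not reflect the paper's actual system:

```python
from collections import OrderedDict

class TieredCache:
    """Two-level LRU sketch: a small DRAM tier over a larger SSD tier,
    both in front of the slow backing store (the HDD)."""
    def __init__(self, backing, dram_size=2, ssd_size=4):
        self.backing = backing
        self.dram = OrderedDict()
        self.ssd = OrderedDict()
        self.dram_size = dram_size
        self.ssd_size = ssd_size
        self.hdd_reads = 0  # how often the slow tier had to be touched

    def get(self, key):
        if key in self.dram:
            self.dram.move_to_end(key)  # refresh LRU position
            return self.dram[key]
        if key in self.ssd:
            value = self.ssd[key]       # SSD hit: no HDD access needed
        else:
            self.hdd_reads += 1
            value = self.backing[key]   # miss in both tiers: slow HDD read
            self._put(self.ssd, key, value, self.ssd_size)
        self._put(self.dram, key, value, self.dram_size)
        return value

    @staticmethod
    def _put(tier, key, value, capacity):
        tier[key] = value
        tier.move_to_end(key)
        if len(tier) > capacity:
            tier.popitem(last=False)    # evict the least recently used

backing = {i: i * i for i in range(10)}  # pretend this dict is the HDD
cache = TieredCache(backing)
for key in (1, 2, 3):
    cache.get(key)  # three cold misses, each hitting the HDD once
```

    The price/performance argument hinges on the middle tier absorbing reads that fall out of DRAM before they reach the HDD, which is exactly what happens when a key evicted from the small tier is still resident in the larger one.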

  20. Discover knowledge in databases: Mining of data and applications; Descubrir conocimiento en bases de datos: Mineria de datos y aplicaciones

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Martinez, Andres F [Instituto de Investigaciones Electricas, Temixco, Morelos (Mexico); Morales Manzanares, Eduardo [Instituto Tecnologico de Estudios Superiores de Monterrey (ITESM), Campus Cuernavaca, Morelos (Mexico)

    2000-07-01

    In recent years there has been enormous growth in the capacity to generate and store information, due to the increasing automation of processes in general and to advances in information storage capacity. Unfortunately, information analysis techniques have not shown an equivalent development, so there is a need for a new generation of computing techniques and tools that can assist decision makers in the automatic and intelligent analysis of large volumes of information. Finding useful knowledge among great amounts of data is the main objective of the area of knowledge discovery in databases. The present article aims to disseminate the process of discovering knowledge in databases in general and the concept of data mining in particular; to establish the relation that exists between the process of discovering knowledge in databases and data mining; and to set out the characteristics and complexities of searching for useful patterns in data. The main data mining methods are also described, together with the application areas where these algorithms have had the greatest success.

  1. A press database on natural risks and its application in the study of floods in Northeastern Spain

    Directory of Open Access Journals (Sweden)

    M. C. Llasat

    2009-12-01

    Full Text Available The aim of this work is to introduce a systematic press database on natural hazards and climate change in Catalonia (NE of Spain) and to analyze its potential application to social-impact studies. For this reason, a review of the concepts of risk, hazard, vulnerability and social perception is also included. This database has been built for the period 1982–2007 and contains all the news related with those issues published by the oldest still-active newspaper in Catalonia. Some parameters are registered for each article and for each event, including criteria that enable us to determine the importance accorded to it by the newspaper, and a compilation of information about it. This Access database allows each article to be classified on the basis of the seven defined topics and key words, as well as summary information about the format and structuring of the news item itself, the social impact of the event and data about the magnitude or intensity of the event. The coverage given to this type of news has been assessed because of its influence on the construction of the social perception of natural risk and climate change, and as a potential source of information about them. The treatment accorded by the press to different risks is also considered. More than 14 000 press articles have been classified. Results show that the largest number of news items for the period 1982–2007 relates to forest fires and droughts, followed by floods and heavy rainfalls, although floods are the major risk in the region of study. Two flood events recorded in 2002 have been analyzed in order to show an example of the role of press information as an indicator of risk perception.

  2. A press database on natural risks and its application in the study of floods in Northeastern Spain

    Science.gov (United States)

    Llasat, M. C.; Llasat-Botija, M.; López, L.

    2009-12-01

    The aim of this work is to introduce a systematic press database on natural hazards and climate change in Catalonia (NE of Spain) and to analyze its potential application to social-impact studies. For this reason, a review of the concepts of risk, hazard, vulnerability and social perception is also included. This database has been built for the period 1982-2007 and contains all the news related with those issues published by the oldest still-active newspaper in Catalonia. Some parameters are registered for each article and for each event, including criteria that enable us to determine the importance accorded to it by the newspaper, and a compilation of information about it. This Access database allows each article to be classified on the basis of the seven defined topics and key words, as well as summary information about the format and structuring of the news item itself, the social impact of the event and data about the magnitude or intensity of the event. The coverage given to this type of news has been assessed because of its influence on the construction of the social perception of natural risk and climate change, and as a potential source of information about them. The treatment accorded by the press to different risks is also considered. More than 14 000 press articles have been classified. Results show that the largest number of news items for the period 1982-2007 relates to forest fires and droughts, followed by floods and heavy rainfalls, although floods are the major risk in the region of study. Two flood events recorded in 2002 have been analyzed in order to show an example of the role of press information as an indicator of risk perception.

  3. The new ALICE DQM client: a web access to ROOT-based objects

    Science.gov (United States)

    von Haller, B.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Delort, C.; Dénes, E.; Diviá, R.; Fuchs, U.; Niedziela, J.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; Wegrzynek, A.

    2015-12-01

    A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) plays an essential role in the experiment operation by providing shifters with immediate feedback on the data being recorded in order to quickly identify and overcome problems. Immediate access to the DQM results is needed not only by shifters in the control room but also by detector experts worldwide. As a consequence, a new web application has been developed to dynamically display and manipulate the ROOT-based objects produced by the DQM system in a flexible and user-friendly interface. The architecture and design of the tool, its main features and the technologies that were used, both on the server and the client side, are described. In particular, we detail how we took advantage of the most recent ROOT JavaScript I/O and web server library to give interactive access to ROOT objects stored in a database. We also describe the use of modern web techniques and packages such as AJAX, DHTMLX and jQuery, which have been instrumental in the successful implementation of a reactive and efficient application. We finally present the resulting application and how code quality was ensured. We conclude with a roadmap for future technical and functional developments.

  4. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  5. Conditioning Probabilistic Databases

    CERN Document Server

    Koch, Christoph

    2008-01-01

Past research on probabilistic databases has studied the problem of answering queries on a static database. Application scenarios of probabilistic databases, however, often involve the conditioning of a database using additional information in the form of new evidence. The conditioning problem is thus to transform a probabilistic database of priors into a posterior probabilistic database which is materialized for subsequent query processing or further refinement. It turns out that the conditioning problem is closely related to the problem of computing exact tuple confidence values. It is known that exact confidence computation is an NP-hard problem. This has led researchers to consider approximation techniques for confidence computation. However, neither conditioning nor exact confidence computation can be solved using such techniques. In this paper we present efficient techniques for both problems. We study several problem decomposition methods and heuristics that are based on the most successful search techn...
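The conditioning problem described above can be made concrete with a toy example. The sketch below is an illustration of the problem setting only (not the paper's decomposition techniques): it conditions a tuple-independent probabilistic database on evidence by brute-force enumeration of possible worlds, and the 2^n enumeration is exactly what makes exact confidence computation intractable in general.

```python
from itertools import product

def condition(tuples, evidence):
    """Condition a tuple-independent probabilistic database on evidence.

    tuples:   dict mapping tuple id -> prior marginal probability
    evidence: predicate over a world (a set of present tuple ids)
    Returns posterior marginal probabilities, computed by brute-force
    enumeration of all 2^n possible worlds.
    """
    ids = list(tuples)
    post = {t: 0.0 for t in ids}
    z = 0.0  # probability mass of worlds consistent with the evidence
    for bits in product([0, 1], repeat=len(ids)):
        world = {t for t, b in zip(ids, bits) if b}
        p = 1.0
        for t, b in zip(ids, bits):
            p *= tuples[t] if b else 1.0 - tuples[t]
        if evidence(world):
            z += p
            for t in world:
                post[t] += p
    return {t: post[t] / z for t in ids}

# Priors for three independent tuples (hypothetical example data).
priors = {"a": 0.5, "b": 0.5, "c": 0.9}
# Evidence: at least one of a, b is present.
posterior = condition(priors, lambda w: "a" in w or "b" in w)
```

Here the posterior probability of tuple `a` rises from 0.5 to 2/3 once the evidence rules out worlds containing neither `a` nor `b`, while `c`, being independent of the evidence, stays at 0.9.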

  6. Autism genetic database (AGD): a comprehensive database including autism susceptibility gene-CNVs integrated with known noncoding RNAs and fragile sites

    Directory of Open Access Journals (Sweden)

    Talebizadeh Zohreh

    2009-09-01

Background: Autism is a highly heritable complex neurodevelopmental disorder, and identifying its genetic basis has therefore been challenging. To date, numerous susceptibility genes and chromosomal abnormalities have been reported in association with autism, but most discoveries either fail to be replicated or account for a small effect. Thus, in most cases the underlying causative genetic mechanisms are not fully understood. In the present work, the Autism Genetic Database (AGD) was developed as a literature-driven, web-based, and easy-to-access database designed with the aim of creating a comprehensive repository for all the currently reported genes and genomic copy number variations (CNVs) associated with autism, in order to further facilitate the assessment of these autism susceptibility genetic factors. Description: AGD is a relational database that organizes data resulting from exhaustive literature searches for reported susceptibility genes and CNVs associated with autism. Furthermore, genomic information about human fragile sites and noncoding RNAs was also downloaded and parsed from miRBase, snoRNA-LBME-db, piRNABank, and the MIT/ICBP siRNA database. A web client genome browser enables viewing of the features, while a web client query tool provides access to more specific information for the features. When applicable, links to external databases including GenBank, PubMed, miRBase, snoRNA-LBME-db, piRNABank, and the MIT siRNA database are provided. Conclusion: AGD comprises a comprehensive list of susceptibility genes and copy number variations reported to date in association with autism, as well as all known human noncoding RNA genes and fragile sites. Such a unique and inclusive autism genetic database will facilitate the evaluation of autism susceptibility factors in relation to known human noncoding RNAs and fragile sites, impacting on human diseases. As a result, this new autism database offers a valuable tool for the research

  7. Team-client Relationships And Extreme Programming

    Directory of Open Access Journals (Sweden)

    John Karn

    2008-01-01

This paper describes a study that examined the relationship between software engineering teams who adhered to the extreme programming (XP) methodology and their project clients. The study involved observing teams working on projects for clients who had commissioned a piece of software to be used in the real world. Interviews were conducted during and at the end of each project to obtain client opinion on how the project had progressed. Of interest to the researchers were opinions on the frequency of feedback, how the team captured requirements, whether or not the iterative approach of XP proved to be helpful, and the level of contextual and software engineering knowledge the client had at the start of the project. In theory, fidelity to XP should result in enhanced communication, reduced expectation gaps, and greater client satisfaction. Our results suggest that this depends heavily on the communication skills of the team and of the client, the expectations of the client, and the nature of the project.

  8. ANDROID BASED REMOTE DESKTOP CLIENT

    Directory of Open Access Journals (Sweden)

    AJIT KOTKAR

    2013-04-01

Android based remote desktop client is a remote control system that allows you to view and interact with one computer (known as the "server") from another computer or cellular phone (Android OS) anywhere on the intranet. A viewer is provided on the cellular phone that enables the user to see and manipulate the desktop of remote systems such as MS Windows. The system to be accessed must be running a server and must be attached to a network. A proxy is used to send the image of the desktop to the cellular phone, to convert between different devices, to suppress network traffic, and to support recovery from an unscheduled disconnection. A prototype of the proposed system has been implemented using Android and will be tested on an Android Virtual Device emulator. To reduce user effort and solve problems inherent to the cellular phone's small screen, several functions are provided in the cellular viewer. The Virtual Network Computing (VNC) protocol is used to access the graphical user interface of the remote computer; it is based on the concept of a Remote Frame Buffer (RFB). The system uses Remote Method Invocation (RMI) and screen-image capturing APIs to implement VNC. All functions, such as mouse clicking, opening files, and playing media, can be performed on the server computer.

  9. THE TECHNOLOGY OF WEB DATABASE AND ITS APPLICATIONS IN GEOPHYSICAL EXPLORATION

    Institute of Scientific and Technical Information of China (English)

    孙旭; 鲍新毅; 李灿平; 刘飚

    2001-01-01

The importance of developing Web database applications in geophysical exploration is discussed in this paper. The principles and features of a variety of Web database techniques, as well as their prospects, are also described and analyzed. Finally, an example using real data from a particular region is given to illustrate the application of a Web database to geophysical exploration.

  10. Client-Oriented Approach: Forming the System of Management of the Bank Relations with Clients

    Directory of Open Access Journals (Sweden)

    Zavadska Diana V.

    2015-03-01

The aim of the article is to develop the theoretical principles of forming bank relations with clients as part of implementing a client-oriented strategy. As a result of the conducted research, definitions of client orientation and of the mechanism and system of management are presented. The system of managing the bank's relations with clients, and the purpose and objectives of its formation, have been substantiated. The hierarchy of subjects forming and managing the process of the bank's relations with clients is presented, and ways of implementing the functions of the mechanism of managing client relations in practice are revealed. It is argued that, to implement the client-oriented approach, a banking institution should have a comprehensive view of its clients' behavior, a detailed understanding of which will allow for more accurate segmentation and the building of individualized partnership relations. Implementing the principle of totality of client relationship levels and comprehensive knowledge, developing employee behavior techniques and special techniques for working with the most valuable clients, and using analytics and forecasting tools will provide targeting of marketing campaigns and lead to minimization of additional costs, satisfaction of every client, loyalty, an increased market share, growth in sales volume, and increased profits for the banking institution.

  11. Esourcing capability model for client organizations

    CERN Document Server

    Hefley, Bill

    2010-01-01

The eSourcing Capability Model for Client Organizations (eSCM-CL) is a best-practices model that enables client organizations to appraise and improve their capability to foster the development of more effective relationships and to better manage those relationships. This title helps readers successfully implement a full range of client-organization tasks, ranging from developing the organization's sourcing strategy, planning for sourcing and service provider selection, and initiating an agreement with service providers, to managing service delivery and completing the agreement. The eSCM-CL has been

  12. La Aplicacion de las Bases de Datos al Estudio Historico del Espanol (The Application of Databases to the Historical Study of Spanish).

    Science.gov (United States)

    Nadal, Gloria Claveria; Lancis, Carlos Sanchez

    1997-01-01

Notes that the application of databases to the study of the history of a language is a method that allows for substantial improvement in investigative quality. Illustrates this with the application of this method to two studies of the history of Spanish developed in the Language and Information Seminar of the Independent University…

  13. GrayStarServer: Server-side spectrum synthesis with a browser-based client-side user interface

    CERN Document Server

    Short, C Ian

    2016-01-01

    I present GrayStarServer (GSS), a stellar atmospheric modeling and spectrum synthesis code of pedagogical accuracy that is accessible in any web browser on commonplace computational devices and that runs on a time-scale of a few seconds. The addition of spectrum synthesis annotated with line identifications extends the functionality and pedagogical applicability of GSS beyond that of its predecessor, GrayStar3 (GS3). The spectrum synthesis is based on a line list acquired from the NIST atomic spectra database, and the GSS post-processing and user interface (UI) client allows the user to inspect the plain text ASCII version of the line list, as well as to apply macroscopic broadening. Unlike GS3, GSS carries out the physical modeling on the server side in Java, and communicates with the JavaScript and HTML client via an asynchronous HTTP request. I also describe other improvements beyond GS3 such as more realistic modeling physics and use of the HTML element for higher quality plotting and rendering of result...

  14. rasdaman Array Database: current status

    Science.gov (United States)

    Merticariu, George; Toader, Alexandru

    2015-04-01

    defines request types for inserting, updating and deleting coverages. A web client, designed for both novice and experienced users, is also available for the service and its extensions. The client offers an intuitive interface that allows users to work with multi-dimensional coverages by abstracting the specifics of the standard definitions of the requests. The Web Coverage Processing Service defines a language for on-the-fly processing and filtering multi-dimensional raster coverages. rasdaman exposes this service through the WCS processing extension. Demonstrations are provided online via the Earthlook website (earthlook.org) which presents use-cases from a wide variety of application domains, using the rasdaman system as processing engine.

  15. DICOM image integration into an electronic medical record using thin viewing clients

    Science.gov (United States)

    Stewart, Brent K.; Langer, Steven G.; Taira, Ricky K.

    1998-07-01

Purpose -- To integrate radiological DICOM images into our existing web-browsable Electronic Medical Record (MINDscape). Over the last five years the University of Washington has created a clinical data repository combining, in a distributed relational database, information from multiple departmental databases (MIND). A text-based view of this data, called the Mini Medical Record (MMR), has been available for three years. MINDscape, unlike the text-based MMR, provides a platform-independent, web browser view of the MIND dataset that can easily be linked to other information resources on the network. We have now added the integration of radiological images into MINDscape through a DICOM web server. Methods/New Work -- We have integrated a commercial web server that acts as a DICOM Storage Class Provider for our computed radiography (CR), computed tomography (CT), digital fluoroscopy (DF), magnetic resonance (MR) and ultrasound (US) scanning devices. These images can be accessed through CGI queries or by linking the image server database using ODBC or SQL gateways. This allows the use of dynamic HTML links to the images on the DICOM web server from MINDscape, so that the radiology reports already resident in the MIND repository can be married with the associated images through the unique examination accession number generated by our Radiology Information System (RIS). The web browser plug-in used provides a wavelet decompression engine (up to 16 bits per pixel) and performs the following image manipulation functions: window/level, flip, invert, sort, rotate, zoom, cine-loop and save as JPEG. Results -- Radiological DICOM image sets (CR, CT, MR and US) are displayed with associated exam reports for referring physicians and clinicians anywhere within the widespread academic medical center on PCs, Macs, X-terminals and Unix computers. The system is also being used for home teleradiology applications. Conclusion -- Radiological DICOM images can be made available
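The "dynamic HTML links" that marry reports to images can be sketched as follows; the server URL, CGI parameter name, and accession-number format are hypothetical stand-ins, since the abstract does not specify them.

```python
# Hypothetical sketch: building a dynamic HTML link that joins a radiology
# report to its image set via the exam accession number. The URL and the
# "accession" parameter name are illustrative assumptions.
DICOM_SERVER = "https://dicom.example.edu/cgi-bin/view"

def image_link(accession_number, label="View images"):
    """Return an HTML anchor tying a report to its images."""
    return '<a href="{0}?accession={1}">{2}</a>'.format(
        DICOM_SERVER, accession_number, label)

link = image_link("A1234567")
```

Because the accession number is generated once by the RIS and stored with both the report and the images, the link can be assembled on the fly without duplicating image metadata in the clinical repository.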

  16. Rich Internet application development with the Vaadin Java Framework

    OpenAIRE

    Pratt, Aaron

    2011-01-01

The purpose of this work was to design and create a new, custom web application to assist the client in planning musical events that involve organizing groups of people and set lists of songs specific to each event. Future versions should handle communicating with the people who are scheduled, as well as managing a song database with music-related editing features. This thesis documents the planning and implementation of this application and describes the various technologies and tools ...

  17. Counselor Values and the Pregnant Adolescent Client.

    Science.gov (United States)

    Kennedy, Bebe C.; And Others

    1984-01-01

    Reviews options counselors can suggest to pregnant adolescents, including abortion, adoption, marriage, and single parenthood. Discusses the need for counselors to be aware of their own values and help the client explore her values. (JAC)

  18. Accommodating blind and partially sighted clients

    OpenAIRE

    England, Gary; Gebbels, Tim; Whelan, Chantelle; Freeman, Sarah

    2014-01-01

    Veterinary surgeons provide an important service to blind and partially sighted guide dog owners. By adopting basic disability awareness and visual impairment training, practices can ensure that the assistance needs of those clients are met, facilitating access to veterinary care.

  19. Managing Client Values in Construction Design

    DEFF Research Database (Denmark)

    Thyssen, Mikael Hygum; Emmitt, Stephen; Bonke, Sten;

    2008-01-01

In construction projects the client will comprise owner, end-users, and the wider society, representatives of which may have conflicting goals and values; and these may not be fully realized by the stakeholders themselves. Therefore it is a great challenge to capture and manage the values of the multiple stakeholders that constitute the "client". However, seeing client satisfaction as the end-goal of construction, it is imperative to make client values explicit in the early project phase and to ensure that these values are reflected in all subsequent phases of design and construction. … capturing and managing client values within a lean framework. This paper describes the initial findings of a joint research project between academia and industry practitioners that seeks to develop the workshop method into a state-of-the-art approach in construction design management. This includes an…

  20. Design and Application of an Integrated Database for a Large Remote Sensing Processing System

    Institute of Scientific and Technical Information of China (English)

    李军; 刘高焕; 迟耀斌; 朱重光

    2001-01-01

In large remote sensing image processing and application systems, it is often necessary to obtain various background or thematic data in real time; this process is one of dynamic data integration. The integrated database is a framework for the integrated use of data built on top of various thematic databases. In practice, it is very difficult to integrate different kinds of data into one database managed by commercial GIS or image processing software such as ARC/INFO or ERDAS. This paper describes an integrated database management system: a framework based on different kinds of databases, such as an image database, vector spatial database, spatial-entity spectrum characteristics database, spatial-entity image sample database, control point (tics) database, documents database, models database, and product database. The querying and retrieving functions, which are the basic functions of the integrated database management system, depend on metadata divided into three parts: database metadata, dataset metadata and attribute-field metadata. Finally, the author introduces the concept of a virtual database, a logical database built on other physical databases, and describes its structure and its application in the product-making system of a large remote sensing application in detail.

  1. Statistical Analysis of Charpy Transition Temperature Shift in Reactor Pressure Vessel Steels: Application of Nuclear Materials Database(MatDB)

    International Nuclear Information System (INIS)

The MDPortal contains various technical documents on the degradation and development of nuclear materials. In addition, the nuclear materials database (MatDB) was recently launched at KAERI. MatDB covers the mechanical properties of various nuclear structural materials used in components: the reactor pressure vessel, steam generator, and primary and secondary piping. In this study, we briefly introduce MatDB and analyze the Charpy transition temperature shift (TTS) in reactor pressure vessel steels of Korean nuclear power plants retrieved from MatDB, demonstrating an application of MatDB to a real case of material degradation in NPPs. MatDB currently includes tensile, Charpy, fatigue, and J-R curve results; other properties such as creep, fracture toughness, and SCC degradation will be added in the future. The data from MatDB were successfully applied to the TTS analysis of Korean RPV steels in surveillance tests

  2. Building a medical multimedia database system to integrate clinical information: an application of high-performance computing and communications technology.

    Science.gov (United States)

    Lowe, H J; Buchanan, B G; Cooper, G F; Vries, J K

    1995-01-01

    The rapid growth of diagnostic-imaging technologies over the past two decades has dramatically increased the amount of nontextual data generated in clinical medicine. The architecture of traditional, text-oriented, clinical information systems has made the integration of digitized clinical images with the patient record problematic. Systems for the classification, retrieval, and integration of clinical images are in their infancy. Recent advances in high-performance computing, imaging, and networking technology now make it technologically and economically feasible to develop an integrated, multimedia, electronic patient record. As part of The National Library of Medicine's Biomedical Applications of High-Performance Computing and Communications program, we plan to develop Image Engine, a prototype microcomputer-based system for the storage, retrieval, integration, and sharing of a wide range of clinically important digital images. Images stored in the Image Engine database will be indexed and organized using the Unified Medical Language System Metathesaurus and will be dynamically linked to data in a text-based, clinical information system. We will evaluate Image Engine by initially implementing it in three clinical domains (oncology, gastroenterology, and clinical pathology) at the University of Pittsburgh Medical Center. PMID:7703940

  3. Creating successful US client-contractor relations

    International Nuclear Information System (INIS)

The recent growth in nuclear facility decommissioning worldwide has generated renewed interest in the client-contractor relationship and in how best to plan and contract the work. What roles should the client (usually the owner utility), the prime contractor and the subcontractors play, and which basic planning tools, contract types, work scope changes, worker productivity factors and monitoring methods are needed to ensure the work is performed satisfactorily? (UK)

  4. Measuring Money Mismanagement Among Dually Diagnosed Clients

    OpenAIRE

    Black, Ryan A.; Rounsaville, Bruce J.; Rosenheck, Robert A; Conrad, Kendon J.; Ball, Samuel A.; Rosen, Marc I.

    2008-01-01

    Clients dually diagnosed with psychiatric and substance abuse disorders may be adversely affected if they mismanage their Social Security or public support benefits. Assistance managing funds, including assignment of a representative payee, is available but there are no objective assessments of money mismanagement. In this study, a Structured Clinical Interview for Money Mismanagement was administered twice at one-week intervals to 46 clients receiving disability payments and was compared to ...

  5. Client satisfaction and usefulness to external stakeholders from an audit client perspective

    OpenAIRE

    Peter Öhman; Einar Häckner; Dag Sörbom

    2012-01-01

    Purpose – The purpose of the paper is to develop, test, and improve a structural equation model (SEM) of client satisfaction with the audit, and of client perception of the usefulness of the audit to external stakeholders. Design/methodology/approach – A questionnaire was mailed to audit clients, i.e. managers of Swedish limited companies with 50 or more employees; 627 useable questionnaires were returned, giving a response rate of 43 percent. Data were processed using the SEM software LISREL...

  6. Threshold detection for the generalized Pareto distribution: Review of representative methods and application to the NOAA NCDC daily rainfall database

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonios; Puliga, Michelangelo; Deidda, Roberto

    2016-04-01

In extreme excess modeling, one fits a generalized Pareto (GP) distribution to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches, such as nonparametric methods that are intended to locate the change point between extreme and nonextreme regions of the data, graphical methods where one studies the dependence of GP-related metrics on the threshold level u, and Goodness-of-Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u at which a GP distribution model is applicable. Here we review representative methods for GP threshold detection, discuss fundamental differences in their theoretical bases, and apply them to 1714 over-centennial daily rainfall records from the NOAA-NCDC database. We find that nonparametric methods are generally not reliable, while methods that are based on GP asymptotic properties lead to unrealistically high threshold and shape parameter estimates. The latter is justified by theoretical arguments, and it is especially the case in rainfall applications, where the shape parameter of the GP distribution is low; i.e., on the order of 0.1-0.2. Better performance is demonstrated by graphical methods and GoF metrics that rely on preasymptotic properties of the GP distribution. For daily rainfall, we find that GP threshold estimates range between 2 and 12 mm/d with a mean value of 6.5 mm/d, while the existence of quantization in the empirical records, as well as variations in their size, constitute the two most important factors that may significantly affect the accuracy of the obtained results.
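One of the graphical methods alluded to above can be sketched with the mean-excess (mean residual life) function e(u) = E[X − u | X > u], which is linear in u wherever a GP model for the excesses holds; one conventionally picks the lowest u beyond which the plot looks linear. The example below is a minimal, stdlib-only sketch on synthetic exponential data (a GP with shape 0, for which e(u) is constant at the scale parameter), not the NOAA-NCDC records.

```python
import random

def mean_excess(data, thresholds):
    """Mean-excess values e(u) for a list of candidate thresholds u."""
    out = []
    for u in thresholds:
        exc = [x - u for x in data if x > u]
        out.append(sum(exc) / len(exc) if exc else float("nan"))
    return out

# Synthetic daily-rainfall-like data: exponential with mean 6.5 (mm/d),
# i.e. a GP distribution with shape 0 and scale 6.5.
random.seed(42)
sample = [random.expovariate(1 / 6.5) for _ in range(100000)]

# e(u) should hover near 6.5 at every threshold for this distribution.
excesses = mean_excess(sample, thresholds=[2, 4, 6, 8, 10, 12])
```

For heavier-tailed data (positive GP shape), e(u) would instead increase linearly with u above the true threshold, and the slope encodes the shape parameter.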

  7. The GIS of the Central Apennines Geodetic Network (CA-GeoNet): Database Description and Application for Crustal Deformation Analysis

    Science.gov (United States)

    Cristofoletti, P.; Esposito, A.; Anzidei, M.; Galvani, A.; Baldi, P.; Pesci, A.; Casula, G.; Serpelloni, E.; Basili, R.

    2002-12-01

During the last few years we set up and surveyed a GPS geodetic network to investigate the active tectonic areas of the Central Apennine, using a combination of permanent and not-permanent geodetic stations. The final goal is to evaluate the geodetic strain rate and the coseismic deformations of this seismically active area. For an optimal management and mapping of the CA-GeoNet (Central Apennine Geodetic Network) a Geographical Information System (GIS) has been developed. The GIS is used to analyze geodetic sources and improve the analysis of crustal deformations and has been realized on a PC platform using MapInfo 6.0 and ArcGIS 8.1 software. The GIS manages an SQL database consisting of different classes (Geodesy, Topography, Geography, Seismicity and Geology) administrated according to Thematic Layers. A GIS is required for the multidisciplinary approach and management of large multi-scaled data sets, geographically referenced and with continuous or discrete coverage; it is particularly designed to analyze GPS sources and to improve crustal deformation analysis related with tectonic structures and seismicity. Through GIS we can display site displacements, strain rate maps and create new layers gained by numerical and spatial analysis. A tailor-made application to support co-seismic deformation scenarios related with historical and instrumental earthquakes and seismic sources has been created. Our procedures can be successfully applied to design new geodetic networks in seismically active areas with respect to the known seismotectonic features. This dynamic approach in planning and managing GPS networks for geodynamic applications provides a useful tool for geophysical research, earthquake impact and civil protection management.

  8. The Gis of The Central Apennines Geodetic Network (ca-geonet): Database Description and Application For Crustal Deformation Analyses

    Science.gov (United States)

    Cristofoletti, P.; Esposito, A.; Anzidei, M.; Baldi, P.; Basili, R.; Casula, G.; Galvani, A.; Pesci, A.; Serpelloni, E.

During the last few years we set up and surveyed a GPS geodetic network to investigate the active tectonic areas of the Central Apennine, using a combination of permanent and not-permanent geodetic stations. The final goal is to evaluate the geodetic strain rate and the coseismic deformations of this seismically active area. For an optimal management and mapping of the CA-GeoNet (Central Apennine Geodetic Network) a Geographical Information System (GIS) has been developed. It has been realized on a PC platform using MapInfo 6.0 and ArcGIS 8.1 software. The GIS manages a database consisting of different classes (Geodesy, Topography, Geography, Seismicity and Geology) administrated according to Thematic Layers. A GIS is required for the multidisciplinary approach and management of large multi-scaled data sets, geographically referenced and with continuous or discrete coverage; it is particularly designed to analyze GPS sources and to improve crustal deformation analysis related with tectonic structures and seismicity. Through GIS we can display site displacements, strain rate maps and create new layers gained by numerical and spatial analysis. A tailor-made application to support co-seismic deformation scenarios related with historical and instrumental earthquakes and seismic sources has been created. Our procedures can be successfully applied to design new geodetic networks in seismically active areas with respect to the known seismotectonic features. This dynamic approach in planning and managing GPS networks for geodynamic applications provides a useful tool for geophysical research, earthquake impact and civil protection management.

  9. HINDI LANGUAGE INTERFACE TO DATABASES

    OpenAIRE

    Himani Jain

    2011-01-01

The need for a Hindi language interface has become increasingly acute as native speakers use databases for storing data. A large number of e-governance applications, such as agriculture, weather forecasting, railways and legacy matters, use databases. So, to use such database applications with ease, people who are more comfortable with the Hindi language require these applications to accept a simple sentence in Hindi and process it to generate a SQL query, which is further executed on the da...
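As a purely illustrative sketch of the idea (not the paper's system, which would involve real Hindi-language processing), a minimal interface can map sentence templates to SQL. The transliterated patterns, table names, and column names below are all hypothetical assumptions.

```python
import re

# Illustrative only: a tiny pattern-based mapper from transliterated
# Hindi query templates to SQL. Tables/columns are hypothetical.
PATTERNS = [
    # "<city> ka mausam dikhao" ~ "show the weather of <city>"
    (re.compile(r"^(?P<city>\w+) ka mausam dikhao$"),
     "SELECT * FROM weather WHERE city = '{city}'"),
    # "sabhi treno ki suchi dikhao" ~ "show the list of all trains"
    (re.compile(r"^sabhi treno ki suchi dikhao$"),
     "SELECT * FROM trains"),
]

def to_sql(sentence):
    """Translate a matching sentence into SQL, else raise ValueError."""
    for pattern, template in PATTERNS:
        m = pattern.match(sentence.strip().lower())
        if m:
            return template.format(**m.groupdict())
    raise ValueError("no pattern matched: " + sentence)

query = to_sql("Dilli ka mausam dikhao")
```

A real interface would use proper morphological analysis rather than fixed templates, and would emit parameterized queries instead of interpolating user text into SQL.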

  10. An Information System of Human Body Composition Based on Android Client

    Directory of Open Access Journals (Sweden)

    Bing Liu

    2014-12-01

This paper proposes an information system for human body composition based on an Android client. The system consists of the Android client, the measurement unit, the database server, the FTP server, the web server and portable storage devices. It is able to collect, restore, synchronize, and batch import and export user profile information and human body composition information. The merits of the system are that the development cycle is shortened, the cost and energy consumption of equipment are reduced, and portability and mobility are enhanced. The system has also optimized the communication of human body composition measurements; as a result, the client and the measurement unit are robust and capable of handling faults and deficiencies in the communication process. With a more reliable system, accurate transmission of data can be guaranteed.

  11. Clients' and therapists' stories about psychotherapy.

    Science.gov (United States)

    Adler, Jonathan M

    2013-12-01

    This article provides an overview of the emerging field of research on clients' stories about their experiences in psychotherapy. The theory of narrative identity suggests that individuals construct stories about their lives in order to provide the self with a sense of purpose and unity. Psychotherapy stories serve both psychological functions. Focusing on the theme of agency as a vehicle for operationalizing purpose and coherence as a way of operationalizing unity, this article will describe the existing scholarship connecting psychotherapy stories to clients' psychological well-being. Results from cross-sectional qualitative and quantitative studies as well as longitudinal research indicate a connection between the stories clients tell about therapy and their psychological well-being, both over the course of treatment and after it is over. In addition, a preliminary analysis of therapists' stories about their clients' treatment is presented. These analyses reveal that the way therapists recount a particular client's therapy does not impact the relationships between clients' narratives and their improvement. The article concludes with a discussion of how this body of scholarship might be fruitfully applied in the realm of clinical practice. PMID:22812587

  12. Asynchronous data change notification between database server and accelerator controls system

    International Nuclear Information System (INIS)

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMSs) provide an explicit DCN mechanism. Even for those DBMSs which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work. This makes the setup of DCN between a database server and interested clients tedious and time-consuming. In accelerator control systems, there are many well established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMSs which provide database trigger functionality. ADCN between a database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMSs, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
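
The trigger-plus-reflection-server pattern described above can be sketched in a few lines. This is a minimal illustration using SQLite in place of Oracle or MS SQL Server; the table names, the `magnet_settings` schema, and the polling function are invented for the example, not taken from the paper.

```python
import sqlite3

# A database trigger appends each update to a change log; a reflection
# server polls the log and pushes new rows to subscribed clients.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE magnet_settings (name TEXT PRIMARY KEY, value REAL);
    -- Change log filled by the trigger, playing the role of the DCN queue
    CREATE TABLE change_log (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        name TEXT, new_value REAL
    );
    CREATE TRIGGER notify_update AFTER UPDATE ON magnet_settings
    BEGIN
        INSERT INTO change_log (name, new_value) VALUES (NEW.name, NEW.value);
    END;
""")

def poll_changes(conn, last_id):
    """Reflection-server side: fetch log entries newer than last_id."""
    return conn.execute(
        "SELECT id, name, new_value FROM change_log WHERE id > ?",
        (last_id,)).fetchall()

conn.execute("INSERT INTO magnet_settings VALUES ('dipole_current', 1.0)")
conn.execute("UPDATE magnet_settings SET value = 2.5 WHERE name = 'dipole_current'")
conn.commit()

for row_id, name, value in poll_changes(conn, 0):
    print(f"notify client: {name} -> {value}")
```

In a production setup the polling loop would run inside the EPICS/CDEV/ADO server process, which then fans the change out to clients through its normal SET/GET API.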

  13. Constraint Databases and Geographic Information Systems

    OpenAIRE

    Revesz, Peter

    2007-01-01

    Constraint databases and geographic information systems share many applications. However, constraint databases can go beyond geographic information systems in efficient spatial and spatiotemporal data handling methods and in advanced applications. This survey mainly describes ways that constraint databases go beyond geographic information systems. However, the survey points out that in some areas constraint databases can learn also from geographic information systems.

  14. Client Engagement Characteristics Associated with Problem Gambling Treatment Outcomes

    Science.gov (United States)

    Dowling, Nicki A.; Cosic, Sanja

    2011-01-01

    Previous research examining the factors associated with problem gambling treatment outcomes has examined client factors and to date, treatment characteristics, therapist factors, and client-therapist interactions have essentially remained unexplored. This study aimed to investigate how client engagement variables (client-rated therapeutic…

  15. A virtual repository approach to clinical and utilization studies: application in mammography as alternative to a national database.

    Science.gov (United States)

    Ohno-Machado, L; Boxwala, A A; Ehresman, J; Smith, D N; Greenes, R A

    1997-01-01

    A national mammography database was proposed, based on a centralized architecture for collecting, monitoring, and auditing mammography data. We have developed an alternative architecture relying on Internet-based distributed queries to heterogeneous databases. This architecture creates a "virtual repository", or a federated database which is constructed dynamically for each query, and makes use of data available in legacy systems. It allows the construction of custom-tailored databases at individual sites that can serve the dual purposes of providing data (a) to researchers through a common mammography repository and (b) to clinicians and administrators at participating institutions. We implemented this architecture in a prototype system at the Brigham and Women's Hospital to show its feasibility. Common queries are translated dynamically into database-specific queries, and the results are aggregated for immediate display or download by the user. Data reside in two different databases and consist of structured mammography reports, coded per BIRADS Standardized Mammography Lexicon, as well as pathology results. We prospectively collected data on 213 patients, and showed that our system can perform distributed queries effectively. We also implemented graphical exploratory analysis tools to allow visualization of results. Our findings indicate that the architecture is not only feasible, but also flexible and scalable, constituting a good alternative to a national mammography database. PMID:9357650
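
The "virtual repository" idea, one query dispatched to several site databases and the result sets merged for display, can be sketched as follows. The schema, patient IDs, and BIRADS values here are invented for illustration; SQLite stands in for the heterogeneous site databases.

```python
import sqlite3

# Each site keeps its own database; the federated layer runs the same
# query against every site and aggregates the rows.
def make_site_db(findings):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE mammo_reports (patient_id TEXT, birads INTEGER)")
    db.executemany("INSERT INTO mammo_reports VALUES (?, ?)", findings)
    return db

site_a = make_site_db([("p1", 2), ("p2", 4)])
site_b = make_site_db([("p3", 4), ("p4", 5)])

def federated_query(sites, sql, params=()):
    """Dispatch one query to every site and merge the result sets."""
    merged = []
    for db in sites:
        merged.extend(db.execute(sql, params).fetchall())
    return merged

suspicious = federated_query(
    [site_a, site_b],
    "SELECT patient_id, birads FROM mammo_reports WHERE birads >= ?", (4,))
print(suspicious)  # [('p2', 4), ('p3', 4), ('p4', 5)]
```

In the real architecture the per-site step would also translate the common query into each legacy system's dialect before dispatching it.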

  16. Conceptual considerations for CBM databases

    International Nuclear Information System (INIS)

    We consider a concept of databases for the CBM experiment. For this purpose, an analysis of the databases for large experiments at the LHC at CERN has been performed. Special features of various DBMSs utilized in physics experiments, including relational and object-oriented DBMSs as the most applicable ones for the tasks of these experiments, were analyzed. A set of databases for the CBM experiment, DBMSs for their development, as well as use cases for the considered databases are suggested.

  17. JDD, Inc. Database

    Science.gov (United States)

    Miller, David A., Jr.

    2004-01-01

    JDD, Inc. is a maintenance and custodial contracting company whose mission is to provide its clients in the private and government sectors "quality construction, construction management and cleaning services in the most efficient and cost effective manners" (JDD, Inc. Mission Statement). This company provides facilities support for Fort Riley in Fort Riley, Kansas and the NASA John H. Glenn Research Center at Lewis Field here in Cleveland, Ohio. JDD, Inc. is owned and operated by James Vaughn, who started as a painter at NASA Glenn and has been working here for the past seventeen years. This summer I worked under Devan Anderson, who is the safety manager for JDD, Inc. in the Logistics and Technical Information Division at Glenn Research Center. The LTID provides all transportation, secretarial, and security needs and contract management of these various services for the center. As a safety manager, my mentor provides Occupational Safety and Health Administration (OSHA) compliance to all JDD, Inc. employees and handles all other issues (Environmental Protection Agency issues, workers compensation, safety and health training) relating to job safety. My summer assignment was not considered "groundbreaking research" like many other summer interns have done in the past, but it is just as important and beneficial to JDD, Inc. I initially created a database using Microsoft Excel to classify and categorize data pertaining to numerous safety training certification courses instructed by our safety manager during the course of the fiscal year. This early portion of the database consisted of only data (training field index, employees who were present at these training courses and who were absent) from the training certification courses. Once I completed this phase of the database, I decided to expand the database and add as many dimensions to it as possible. Throughout the last seven weeks, I have been compiling more data from day to day operations and been adding the

  18. Using virtual Lustre clients on the WAN for analysis of data from high energy physics experiments

    International Nuclear Information System (INIS)

    We describe the work on creating system images of Lustre virtual clients in the ExTENCI project (Extending Science Through Enhanced National CyberInfrastructure), using several virtualization technologies (Xen, VMware, VirtualBox, KVM). These virtual machines can be built at several levels, from a basic Linux installation (we use Scientific Linux 5 as an example), adding a Lustre client with Kerberos authentication, up to complete clients including local or distributed (based on CernVM-FS) installations of the full CERN and project-specific software stack for typical LHC experiments. The level and size of the images are determined by the users on demand. Various sites and individual users can simply download and use them out of the box on Linux/UNIX, Windows and Mac OS X based hosts. We compare the performance of virtual clients with that of real physical systems for typical high energy physics applications like Monte Carlo simulations or analysis of data stored in ROOT trees.

  19. Application of LabVIEW in radiation monitor and database management system of Xi'an pulse reactor

    International Nuclear Information System (INIS)

    The radiation monitoring and database management system of XAPR uses a single-chip microcomputer to collect, process and transmit signals, and uses an RS-485 bus to constitute a testing network. The monitoring and database management software was developed with LabVIEW. This software can monitor the radiation levels of each area of XAPR in real time and can manage the dose database. The test run of the system indicates that the system is stable and reliable, and achieves its targets, so the system can be put into service. (authors)

  20. Design and deployment of a large brain-image database for clinical and nonclinical research

    Science.gov (United States)

    Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.

    2004-04-01

    An efficient database is an essential component of organizing diverse information on image metadata and patient information for research in medical imaging. This paper describes the design, development and deployment of a large database system serving as a brain image repository that can be used across different platforms in various medical research. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research- and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading and sharing as well as database querying and management, with security and data anonymization concerns well taken care of. The structure of the database is a multi-tier client-server architecture with a Relational Database Management System, Security Layer, Application Layer and User Interface. An image source adapter has been developed to handle most of the popular image formats. The database has a user interface based on web browsers and is easy to handle. We have used the Java programming language for its platform independence and vast function libraries. The brain image database can sort data according to clinically relevant information. This can be effectively used in research from the clinicians' points of view. The database is suitable for validation of algorithms on large populations of cases. Medical images for processing can be identified and organized based on information in image metadata. Clinical research in various pathologies can thus be performed with greater efficiency, and large image repositories can be managed more effectively. A prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.

  1. Consistency and Security in Mobile Real Time Distributed Database (MRTDDB): A Combinational Giant Challenge

    Science.gov (United States)

    Gupta, Gyanendra Kr.; Sharma, A. K.; Swaroop, Vishnu

    2010-11-01

    Many types of information systems are widely used in various fields. With the rapid development of computer networks, information system users care more about data sharing in networks. In a traditional relational database, data consistency is enforced by a consistency control mechanism: when a data object is locked in sharing mode, other transactions can only read it, but cannot update it. If the traditional consistency control method is still used, the system's concurrency will be adversely affected. So there are many new requirements for consistency control and security in MRTDDB. The problem is not limited to one type of data (e.g. mobile or real-time databases). There are many aspects of data consistency problems in MRTDDB, such as inconsistency between attribute and type of data, or the inconsistency of topological relations after objects have been modified. In this paper, many cases of consistency are discussed. As mobile computing becomes popular and databases grow with information sharing, security is a big issue for researchers. Consistency and security of data are a big challenge for researchers, because whenever the data is not consistent and secure, no operation on the data (e.g. a transaction) is productive. It becomes more and more crucial when transactions are used in non-traditional environments like mobile, distributed, real-time and multimedia databases. In this paper we raise the different aspects and analyze the available solutions for consistency and security of databases. Traditional database security has focused primarily on creating user accounts and managing user privileges to database objects. But the use of these databases in mobile and nomadic computing creates new opportunities for research. The widespread use of databases over the web, heterogeneous client-server architectures, application servers, and networks creates a critical need to amplify this focus. In this paper we also discuss an overview of the new and old

  2. A Permutation Gigantic Issues in Mobile Real Time Distributed Database : Consistency & Security

    Directory of Open Access Journals (Sweden)

    Gyanendra Kr. Gupta

    2011-02-01

    Full Text Available Several shapes of information systems are broadly used in a variety of system models. With the rapid development of computer networks, information system users are more concerned about data sharing in networks. In a conventional relational database, data consistency is enforced by a consistency control mechanism: when a data object is locked in sharing mode, other transactions can only read it, but cannot update it. If the traditional consistency control method is still used, the system's concurrency will be adversely affected. So there are many new requirements for consistency control and security in Mobile Real Time Distributed Databases (MRTDDB). The problem is not limited to one type of data (e.g. mobile or real-time databases). There are many aspects of data consistency problems in MRTDDB, such as inconsistency between characteristic and type of data, or the inconsistency of topological relations after objects have been modified. In this paper, many cases of consistency are discussed. As mobile computing becomes popular and databases grow with information sharing, security is a big issue for researchers. Both consistency and security of data are a big challenge for researchers, because whenever the data is not consistent and secure, no operation on the data (e.g. a transaction) is productive. It becomes more and more crucial when transactions are used in non-traditional environments like mobile, distributed, real-time and multimedia databases. In this paper we raise the different aspects and analyze the available solutions for consistency and security of databases. Traditional database security has focused primarily on creating user accounts and managing user rights to database objects. But the use of these databases in mobile and drifting computing creates new prospects for research. The widespread use of databases over the web, heterogeneous client-server architectures, application servers, and networks creates a critical need to

  3. Database Manager

    Science.gov (United States)

    Martin, Andrew

    2010-01-01

    It is normal practice today for organizations to store large quantities of records of related information as computer-based files or databases. Purposeful information is retrieved by performing queries on the data sets. The purpose of DATABASE MANAGER is to communicate to students the method by which the computer performs these queries. This…

  4. Maize databases

    Science.gov (United States)

    This chapter is a succinct overview of maize data held in the species-specific database MaizeGDB (the Maize Genomics and Genetics Database), and selected multi-species data repositories, such as Gramene/Ensembl Plants, Phytozome, UniProt and the National Center for Biotechnology Information (NCBI), ...

  5. Prototype for a generic thin-client remote analysis environment for CMS

    International Nuclear Information System (INIS)

    The multi-tiered architecture of the highly-distributed CMS computing systems necessitates a flexible data distribution and analysis environment. The authors describe a prototype analysis environment which functions efficiently over wide area networks, using a server installed at the Caltech/UCSD Tier 2 prototype to analyze CMS data stored at various locations using a thin client. The analysis environment is based on existing HEP (Anaphe) and CMS (CARF, ORCA, IGUANA) software technology on the server, accessed from a variety of clients. A Java Analysis Studio (JAS, from SLAC) plug-in is being developed as a reference client. The server is operated as a 'black box' on the proto-Tier2 system. ORCA Objectivity databases (e.g. an existing large CMS Muon sample) are hosted on the master and slave nodes, and remote clients can request processing of queries across the server nodes and get the histogram results returned and rendered in the client. The server is implemented in pure C++ and uses XML-RPC as a language-neutral transport. This has several benefits, including much better scalability, better integration with CARF-ORCA, and importantly, makes the work directly useful to other non-Java general-purpose analysis and presentation tools such as Hippodraw, Lizard, or ROOT
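
The language-neutral XML-RPC transport described above can be sketched with a toy server exposing a histogram query and a thin client calling it. The method name `query_histogram`, the sample values, and the binning are invented for illustration; Python's standard library stands in for the C++ server.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Server-side histogramming of a (hypothetical) data sample.
def histogram(values, nbins, lo, hi):
    counts = [0] * nbins
    width = (hi - lo) / nbins
    for v in values:
        if lo <= v < hi:
            counts[int((v - lo) / width)] += 1
    return counts

# "Black box" server: register the query and serve it in a background thread.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(
    lambda nbins, lo, hi: histogram([0.1, 0.4, 0.5, 0.9], nbins, lo, hi),
    "query_histogram")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Thin client: request the histogram and render (here, just print) it.
client = ServerProxy(f"http://127.0.0.1:{port}")
print(client.query_histogram(2, 0.0, 1.0))  # [2, 2]
server.shutdown()
```

Because the payload is plain XML-RPC, the same call could be issued from Java (JAS), C++ (ROOT), or any other client language, which is exactly the portability benefit the abstract claims.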

  6. Database Reports Over the Internet

    Science.gov (United States)

    Smith, Dean Lance

    2002-01-01

    Most of the summer was spent developing software that would permit existing test report forms to be printed over the web on a printer that is supported by Adobe Acrobat Reader. The data is stored in a DBMS (Database Management System). The client asks for the information from the database using an HTML (Hyper Text Markup Language) form in a web browser. JavaScript is used with the forms to assist the user and verify the integrity of the entered data. Queries to a database are made in SQL (Structured Query Language), a widely supported standard for making queries to databases. Java servlets, programs written in the Java programming language running under the control of network server software, interrogate the database and complete a PDF form template kept in a file. The completed report is sent to the browser requesting the report. Some errors are sent to the browser in an HTML web page, others are reported to the server. Access to the databases was restricted since the data are being transported to new DBMS software that will run on new hardware. However, the SQL queries were made to Microsoft Access, a DBMS that is available on most PCs (Personal Computers). Access does support the SQL commands that were used, and a database was created with Access that contained typical data for the report forms. Some of the problems and features are discussed below.
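
The query-then-fill-template path, an SQL query whose row populates a report form, can be sketched as follows. SQLite stands in for Microsoft Access, a plain-text template stands in for the PDF form, and the table and field names are invented for the example.

```python
import sqlite3
from string import Template

# Typical data for a report form, as in the Access test database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE test_reports (test_id TEXT, engineer TEXT, result TEXT)")
db.execute("INSERT INTO test_reports VALUES ('T-001', 'Smith', 'PASS')")

# The servlet's role: run the SQL query and substitute the row into the
# report template (a PDF form template in the original system).
template = Template("Test $test_id run by $engineer: $result")

def render_report(db, test_id):
    row = db.execute(
        "SELECT test_id, engineer, result FROM test_reports WHERE test_id = ?",
        (test_id,)).fetchone()
    return template.substitute(test_id=row[0], engineer=row[1], result=row[2])

print(render_report(db, "T-001"))  # Test T-001 run by Smith: PASS
```

In the deployed system this rendering step runs server-side and the filled PDF is streamed back to the requesting browser.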

  7. Nuclear Science References Database

    International Nuclear Information System (INIS)

    The Nuclear Science References (NSR) database together with its associated Web interface, is the world's only comprehensive source of easily accessible low- and intermediate-energy nuclear physics bibliographic information for more than 210,000 articles since the beginning of nuclear science. The weekly-updated NSR database provides essential support for nuclear data evaluation, compilation and research activities. The principles of the database and Web application development and maintenance are described. Examples of nuclear structure, reaction and decay applications are specifically included. The complete NSR database is freely available at the websites of the National Nuclear Data Center (http://www.nndc.bnl.gov/nsr) and the International Atomic Energy Agency (http://www-nds.iaea.org/nsr)

  8. Failure database and tools for wind turbine availability and reliability analyses. The application of reliability data for selected wind turbines

    DEFF Research Database (Denmark)

    Kozine, Igor; Christensen, P.; Winther-Jensen, M.

    2000-01-01

    The objective of this project was to develop and establish a database for collecting reliability and reliability-related data, for assessing the reliability of wind turbine components and subsystems and wind turbines as a whole, as well as for assessing wind turbine availability while ranking the ...... similar safety systems. The database was established with the Microsoft Access Database Management System; the software for reliability and availability assessments was created with Visual Basic....... contributions at both the component and system levels. The project resulted in a software package combining a failure database with programs for predicting WTB availability and the reliability of all the components and systems, especially the safety system. The report consists of a description of the theoretical

  9. Database computing in HEP

    International Nuclear Information System (INIS)

    The major SSC experiments are expected to produce up to 1 Petabyte of data per year each. Once the primary reconstruction is completed by farms of inexpensive processors, I/O becomes a major factor in further analysis of the data. We believe that the application of database techniques can significantly reduce the I/O performed in these analyses. We present examples of such I/O reductions in prototypes based on relational and object-oriented databases of CDF data samples

  10. Taxes in Europe Database

    OpenAIRE

    European Commission DG Taxation and Customs Union

    2009-01-01

    The Taxes in Europe database is the European Commission's on-line information tool covering the main taxes in force in the EU Member States. Access is free for all users. The system contains information on around 650 taxes, as provided to the European Commission by the national authorities. The "Taxes in Europe" database contains, for each individual tax, information on its legal basis, assessment base, main exemptions, applicable rate(s), economic and statistical classification, as well as t...

  11. Image Reference Database in Teleradiology: Migrating to WWW

    Science.gov (United States)

    Pasqui, Valdo

    The paper presents a multimedia Image Reference Database (IRDB) used in teleradiology. The application was developed at the University of Florence in the framework of the European Community TELEMED Project. TELEMED overall goals and IRDB requirements are outlined and the resulting architecture is described. IRDB is a multisite database containing radiological images, selected because of their scientific interest, and their related information. The architecture consists of a set of IRDB Installations which are accessed from Viewing Stations (VS) located at different medical sites. The interaction between VS and IRDB Installations follows the client-server paradigm and uses an OSI level-7 protocol named Telemed Communication Language. After reviewing the Florence prototype implementation and experimentation, IRDB migration to the World Wide Web (WWW) is discussed. A possible scenario for implementing IRDB on the basis of the WWW model is depicted in order to exploit WWW servers' and browsers' capabilities. Finally, the advantages of this conversion are outlined.

  12. A NOVEL REDIS SECURITY BEST PRACTICES FOR NOSQL DATABASES

    OpenAIRE

    Jeelani Ahmed

    2016-01-01

    In the last decade, the field of databases has evolved rapidly. Organizations are migrating from relational to non-relational databases due to the current trends of Big Data, Big Users and Cloud Computing. Business data processing is the main market of relational databases, and it turns out to be harder to manage big users and big data in a cloud domain. To model data, these databases use a rigid, schema-based approach and are designed to run on a single machine...

  13. On Simplifying Features in OpenStreetMap database

    Science.gov (United States)

    Qian, Xinlin; Tao, Kunwang; Wang, Liang

    2015-04-01

    Currently the visualization of OpenStreetMap data uses a tile server which stores map tiles that have been rendered from vector data in advance. However, tiled maps lack functionality such as data editing and customized styling. To enable such advanced functionality, client-side processing and rendering of geospatial data is needed. Considering the voluminous size of the OpenStreetMap data, simply sending the results of region queries against the OSM database to the client is prohibitive. To make the OSM data retrieved from the database suitable for the client to receive and render, it must be filtered and simplified at the server side to limit its volume. We propose a database extension for the OSM database that makes it possible to simplify geospatial objects such as ways and relations during data queries. Several auxiliary tables and PL/pgSQL functions are presented so that geospatial features can be simplified by omitting unimportant vertices. There are five components in the database extension: vertex weight computation by a polyline and polygon simplification algorithm; vertex weight storage in auxiliary tables; filtering and selection of vertices using a specific threshold value during spatial queries; assembly of simplified geospatial objects from the filtered vertices; and vertex weight updating after geospatial objects are edited. The database extension is implemented on an OSM APIDB using PL/pgSQL. The database contains a subset of the OSM database: geographic data of the United Kingdom, about 100 million vertices roughly occupying 100 GB of disk. JOSM is used to retrieve the data from the database using a revised data-access API and render the geospatial objects in real time. When serving simplified data to clients, the database allows the user to set a bound on the simplification error or a bound on the response time for each data query. Experimental results show the effectiveness and efficiency of the proposed methods in building a
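
The vertex-weighting idea behind the extension can be sketched with a Douglas-Peucker-style pass: each interior vertex is assigned a weight (its perpendicular distance at the step where the algorithm would introduce it), after which a query keeps only vertices above a threshold. This is a hedged illustration in Python, not the paper's PL/pgSQL implementation; the sample polyline and threshold are invented.

```python
# Perpendicular distance from point p to the line through a and b.
def perp_dist(p, a, b):
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5

def vertex_weights(line):
    """Douglas-Peucker recursion that records each vertex's distance as its weight."""
    weights = [float("inf")] + [0.0] * (len(line) - 2) + [float("inf")]
    def rec(i, j):
        if j - i < 2:
            return
        k, d = max(((m, perp_dist(line[m], line[i], line[j]))
                    for m in range(i + 1, j)), key=lambda t: t[1])
        weights[k] = d
        rec(i, k)
        rec(k, j)
    rec(0, len(line) - 1)
    return weights

line = [(0, 0), (1, 0.1), (2, 2), (3, 0.1), (4, 0)]
w = vertex_weights(line)
# A query keeps endpoints plus vertices whose weight exceeds its threshold.
simplified = [p for p, wt in zip(line, w) if wt > 1.0]
print(simplified)  # [(0, 0), (2, 2), (4, 0)]
```

Storing the precomputed weights in an auxiliary table lets the server answer queries at any threshold with a simple filter rather than re-running the simplification per request, which is the point of the database extension.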

  14. Comparative study on Authenticated Sub Graph Similarity Search in Outsourced Graph Database

    Directory of Open Access Journals (Sweden)

    N. D. Dhamale

    2015-11-01

    Full Text Available Today security is very important in database systems. Advanced database systems face a great challenge raised by the emergence of massive, complex structural data in bioinformatics, chem-informatics, and many other applications. Since exact matching is often too restrictive, similarity search of complex structures becomes a vital operation that must be supported efficiently. Subgraph similarity search is used in graph databases to retrieve graphs whose subgraphs are similar to a given query graph. It has been proven successful in a wide range of applications including bioinformatics and chem-informatics. Due to the cost of providing efficient similarity search services on ever-increasing graph data, database outsourcing is an appealing solution for database owners. In this paper, we study authentication techniques that follow the popular filtering-and-verification framework, built around an authentication-friendly metric index called GMTree. Specifically, the similarity search is transformed into a search in a graph metric space, and small verification objects (VOs) are derived to be transmitted to query clients. To further optimize GMTree, we study a sampling-based pivot selection method and an authenticated version of MCS computation.
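
The verification-object principle, the client checking an outsourced server's answer against a digest published by the data owner, can be illustrated with a plain Merkle tree over the database entries. This is a hedged sketch of the general idea only; GMTree itself authenticates searches in a graph metric space, and the `g1`..`g4` entries here are invented placeholders.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Owner side: publish one root hash over all database entries."""
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def prove(leaves, idx):
    """Server side: the VO is the list of sibling hashes up to the root."""
    level = [h(x) for x in leaves]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        path.append((level[idx ^ 1], idx % 2))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        idx //= 2
    return path

def verify(root, leaf, path):
    """Client side: rebuild the root from the answer and its VO."""
    node = h(leaf)
    for sib, is_right in path:
        node = h(sib + node) if is_right else h(node + sib)
    return node == root

graphs = [b"g1", b"g2", b"g3", b"g4"]   # stand-ins for serialized graphs
root = merkle_root(graphs)
vo = prove(graphs, 2)
print(verify(root, b"g3", vo))  # True
```

A tampered or substituted answer fails verification, since the recomputed root no longer matches the published one; this is the same guarantee the paper's VOs provide for filtering-and-verification results.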

  15. 网络环境下的非结构化数据库应用研究%Application Research on Unstructured Database in the Network

    Institute of Scientific and Technical Information of China (English)

    王颖; 李建敏

    2015-01-01

    Based on unstructured database technology, this paper analyzes the current situation of database applications in the network environment, discusses the construction of network databases and the analysis of unstructured data, and uses an application example for explanation.

  16. The Influence of a Client Preference on Auditor Judgment: An Investigation of Temporal Effects and Client Trustworthiness

    OpenAIRE

    Jenkins, James Gregory Jr.

    1998-01-01

    The purpose of this dissertation is to investigate auditors' judgments and decisions in the presence of an explicitly stated client preference. This investigation considers two factors. First, the temporal placement (i.e., timing) of the client preference is varied to allow for an examination of differential effects associated with the receipt of an early client preference and a late client preference. Second, client trustworthiness is varied so that participants may have a basis upon whic...

  17. Various Database Attacks and its Prevention Techniques

    OpenAIRE

    K.A.VarunKumar; M.Prabakaran; Ajay Kaurav; S.Sibi Chakkaravarthy; Thiyagarajan, S; Pokala Venkatesh

    2014-01-01

    With the increasing popularity of the internet, the use of database applications has also spread widely. There are serious threats because hackers make various attempts to steal the data in databases. Attacks like SQL injection and cross-site scripting may change the information in databases, which decreases the trustworthiness of the database. Intrusion detection systems are used to detect whether an attack is being carried out on the database. In this paper we survey different types of database attacks...

  18. SHORT SURVEY ON GRAPHICAL DATABASE

    Directory of Open Access Journals (Sweden)

    Harsha R Vyavahare

    2015-08-01

    Full Text Available This paper explores the features of graph databases and data models. The popularity of work with graph models and datasets has increased in recent decades. Graph databases have a number of advantages over relational databases. This paper takes a short review of graph and hypergraph concepts from mathematics, so that we can understand the existing difficulties in the implementation of graph models. The past few decades saw hundreds of research contributions in the DBS field with graph databases. However, research on general-purpose DBS management and mining that suits a variety of applications is still very much active. The review is done based on the application of graph model techniques in databases, within the framework of graph-based approaches, with the aim of implementation of different graphical databases and tabular databases

  19. RDD Databases

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database was established to oversee documents issued in support of fishery research activities including experimental fishing permits (EFP), letters of...

  20. Dealer Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The dealer reporting databases contain the primary data reported by federally permitted seafood dealers in the northeast. Electronic reporting was implemented May...

  1. Flash Caching on the Storage Client

    OpenAIRE

    Holland, David A.; Angelino, Elaine Lee; Wald, Gideon; Seltzer, Margo I.

    2013-01-01

    Flash memory has recently become popular as a caching medium. Most uses to date are on the storage server side. We investigate a different structure: flash as a cache on the client side of a networked storage environment. We use trace-driven simulation to explore the design space. We consider a wide range of configurations and policies to determine the potential that client-side caches might offer and how best to arrange them. Our results show that the flash cache writeback policy does not signifi...
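
A trace-driven simulation of a client-side cache can be remarkably small. The sketch below, with an invented trace and block numbers, counts hits and write-back flushes for an LRU flash cache; it illustrates the methodology, not the paper's actual simulator or policies.

```python
from collections import OrderedDict

# Client-side flash cache: LRU replacement with write-back
# (dirty blocks are flushed to the storage server only on eviction).
class FlashCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()   # block number -> dirty flag
        self.hits = self.misses = self.flushes = 0

    def access(self, block, write=False):
        if block in self.blocks:
            self.hits += 1
            self.blocks.move_to_end(block)        # refresh LRU position
        else:
            self.misses += 1
            if len(self.blocks) >= self.capacity:
                _, dirty = self.blocks.popitem(last=False)  # evict LRU block
                if dirty:
                    self.flushes += 1             # write-back to the server
            self.blocks[block] = False
        if write:
            self.blocks[block] = True

# Replay a (hypothetical) trace of (block, is_write) accesses.
trace = [(1, True), (2, False), (1, False), (3, False), (2, False)]
cache = FlashCache(capacity=2)
for block, is_write in trace:
    cache.access(block, write=is_write)
print(cache.hits, cache.misses, cache.flushes)  # 1 4 1
```

Sweeping `capacity` and swapping the eviction or writeback policy over a real block-level trace is exactly the kind of design-space exploration the abstract describes.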

  2. National database

    DEFF Research Database (Denmark)

    Kristensen, Helen Grundtvig; Stjernø, Henrik

    1995-01-01

    Article about a national database for nursing research established at the Danish Institute for Health and Nursing Research. The aim of the database is to gather knowledge about research and development activities within nursing.

  3. THE ROLE OF DATABASE MARKETING IN THE OPERATIONALIZATION OF THE SERVICES RELATIONSHIP MARKETING

    OpenAIRE

    DUMITRESCU Luigi; Mircea FUCIU

    2010-01-01

    Relationship marketing aims at building a durable relationship between the enterprise and the final client, identified at an individual level. The particular character of relationship marketing rests on two main concepts: individuality and the relation. This paper presents the concepts of relationship marketing, database marketing and geomarketing. We present the importance of implementing a marketing database in a service-providing enterprise and its implications on one hand for the client...

  4. Handling of network and database instabilities in CORAL

    CERN Document Server

    Trentadue, R; Kalkhof, A

    2012-01-01

    The CORAL software is widely used by the LHC experiments for storing and accessing data using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several back-ends and deployment models, direct client access to Oracle servers being one of the most important use cases. Since 2010, several problems have been reported by the LHC experiments in their use of Oracle through CORAL, involving application errors, hangs or crashes after the network or the database servers became temporarily unavailable. CORAL already provided some level of handling of these instabilities, which are due to external causes and cannot be avoided, but this proved to be insufficient in some cases and to be itself the cause of other problems, such as the hangs and crashes mentioned before, in other cases. As a consequence, a major redesign of the CORAL plugins has been implemented, with the aim of making the software more robust against these database and network glitches. The new imple...

  5. Concierge: Personal database software for managing digital research resources

    Directory of Open Access Journals (Sweden)

    Hiroyuki Sakai

    2007-11-01

    Full Text Available This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp).
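The plug-in extensibility described above can be sketched as a simple registry that maps plug-in names to classes and dispatches work to them. This is a minimal illustration of the pattern, not Concierge's actual API; all names here (`register`, `PLUGINS`, `describe_with`) are hypothetical.

```python
# Minimal sketch of a plug-in registry in the spirit of Concierge's
# extensibility model. All names are illustrative assumptions.
PLUGINS = {}

def register(name):
    """Decorator that records a plug-in class under a given name."""
    def wrap(cls):
        PLUGINS[name] = cls
        return cls
    return wrap

@register("literature")
class LiteraturePlugin:
    def describe(self, item):
        return f"literature entry: {item}"

@register("labnotebook")
class NotebookPlugin:
    def describe(self, item):
        return f"notebook entry: {item}"

def describe_with(name, item):
    """Dispatch an item to the plug-in registered under `name`."""
    return PLUGINS[name]().describe(item)

print(describe_with("literature", "Smith 2007"))  # → literature entry: Smith 2007
```

Installing an optional plug-in then amounts to importing a module whose decorated classes add themselves to the registry at import time.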

  6. Design of Database Applications Based on the Web

    Institute of Scientific and Technical Information of China (English)

    马克

    2001-01-01

    Through an analysis of methods for publishing Web database information on the Internet/intranet, this article discusses techniques for accessing databases using the component objects of ASP (Active Server Pages) and ADO (ActiveX Data Objects), and shows that ASP technology offers good database compatibility.

  7. Improvement of the efficiency of artificial insemination services through the use of radioimmunoassay and a computer database application

    International Nuclear Information System (INIS)

    A study was conducted at several locations in four provinces of Indonesia to evaluate and increase the efficiency of artificial insemination (AI) services provided to cattle farmers and to improve the feeding and reproductive management practices. Radioimmunoassay (RIA) for progesterone measurement was used together with the computer program Artificial Insemination Database Application (AIDA) to monitor the success of AI and for the early diagnosis of non-pregnancy and reproductive disorders in dairy and beef cattle. Baseline surveys showed that the average calving to first service interval (CFSI) ranged from 121.3 ± 78.2 days in West Java to 203.5 ± 118.3 in West Sumatra, and the conception rate (CR) to first AI ranged from 27% in South Sulawesi to 44% in West Java. Supplementary feeding with urea-molasses multi-nutrient blocks (UMMB) combined with training of farmers on improved husbandry practices reduced the CFSI from 150.6 ± 66.3 days to 102.3 ± 36.5 days and increased the CR from 27% to 49% in South Sulawesi. Similar interventions in West Java reduced the CFSI from 121.3 ± 78.2 days to 112.1 ± 80.9 days and increased the CR from 34% to 37%. Results from measurement of progesterone in milk or blood samples collected on days 0, 10-12 and 22-24 after AI showed that 25% of the animals were non-cyclic or anovulatory, while 8.7% were pregnant at the time of AI. Investigation of cows with breeding problems using measurement of progesterone in combination with clinical examination revealed a range of problems, including true anoestrus, sub-oestrus or missed oestrus, persistent CL and luteal cysts. The ability to make an accurate diagnosis enabled the provision of appropriate advice or treatment for overcoming the problems. Anti-progesterone serum and 125I-Progesterone tracer for use in RIA were produced locally and were found to have acceptable characteristics. The tracer had good specific activity and stability for up to 12 weeks. The production of standards

  8. Database for foundry engineers – simulationDB – a modern database storing simulation results

    Directory of Open Access Journals (Sweden)

    P. Malinowski

    2010-11-01

    Full Text Available Purpose: The main aim of this paper is to build a dedicated database system for collecting, analysing and searching simulation results. Design/methodology/approach: The system was prepared using a client-server architecture; a graphical user interface (GUI) was then developed. Findings: A new database system for foundries was created. Practical implications: System development is in progress, and practical implementation will take place in an iron foundry next year. Originality/value: The original value of this paper is an innovative database system for storing and analysing simulation results.

  9. The EarthServer project: Exploiting Identity Federations, Science Gateways and Social and Mobile Clients for Big Earth Data Analysis

    Science.gov (United States)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Messina, Antonio; Pappalardo, Marco; Passaro, Gianluca

    2013-04-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. Six Lighthouse Applications are being established in EarthServer, each of which poses distinct challenges on Earth Data Analytics: Cryospheric Science, Airborne Science, Atmospheric Science, Geology, Oceanography, and Planetary Science. Altogether, they cover all Earth Science domains; the Planetary Science use case has been added to challenge concepts and standards in non-standard environments. In addition, EarthLook (maintained by Jacobs University) showcases use of OGC standards in 1D through 5D use cases. In this contribution we will report on the first applications integrated in the EarthServer Science Gateway and on the clients for mobile appliances developed to access them. We will also show how federated and social identity services can allow Big Earth Data Providers to expose their data in a distributed environment keeping a strict and fine-grained control on user authentication and authorisation. 
The degree of fulfilment of the EarthServer implementation with the recommendations made in the recent TERENA Study on
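The "database query language as client/server interface" idea above can be illustrated by composing an OGC WCPS query and the HTTP GET request that carries it. This is a hedged sketch: the endpoint URL and coverage name are invented placeholders, not actual EarthServer services.

```python
# Sketch of building a WCPS ProcessCoverages request as a KVP GET URL.
# The endpoint and coverage name below are hypothetical examples.
from urllib.parse import urlencode

def wcps_request(endpoint, query):
    """Return a GET URL carrying a WCPS query as a KVP parameter."""
    params = {"service": "WCS", "version": "2.0.1",
              "request": "ProcessCoverages", "query": query}
    return endpoint + "?" + urlencode(params)

# Average a space-time coverage over a lat/long window, encoded as CSV.
query = ('for c in (temperature_4d) '
         'return encode(avg(c[Lat(30:40), Long(-10:0)]), "csv")')
url = wcps_request("https://example.org/rasdaman/ows", query)
print(url)
```

A mobile or web client would issue this URL and let the server-side Array Database evaluate the query, so only the small result (here, one averaged value) travels over the network.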

  10. Features client-server bus seats reservation technology in the long-distance connection

    OpenAIRE

    Radchenko, K. O.; National Technical University of Ukraine «KPI»; Ruzhevskyi, M. S.; National Technical University of Ukraine «KPI»; Shrol, A. Yu.; National Technical University of Ukraine «KPI»

    2016-01-01

    This paper describes the features of a client-server technology for seat booking by a bus driver, using software developed for mobile devices and tablets based on the Android operating system. The application allows the driver on a long-distance route to send data about the occupied seats in the cabin to the MTE server while the bus is moving. The application has a user-friendly interface. Android Studio and the Android SDK are used for the client-server communication capabilities.

  11. Attracting Clients to Service-Oriented Programs.

    Science.gov (United States)

    Disney, Diane M.

    One of a series of manuals developed by the Home and Community-Based Career Education Project, the outreach component publication describes how the project went about attracting clients for its adult vocational counseling services. Sections include: creating a publicity campaign, using an advertising agency, creating products for the mass media,…

  12. Counselor Loss: Terminating the Helped Client.

    Science.gov (United States)

    Miller, Mark J.

    1981-01-01

    Discusses counselor feelings of loss due to client departure from therapy. Describes components of loss within a five-stage model including denial, anger, bargaining, depression, and acceptance. Outlines strategies for coping with counselor loss. Suggests feelings of loss are natural. (RC)

  13. Practical Client Puzzle from Repeated Squaring

    NARCIS (Netherlands)

    Jeckmans, A.

    2009-01-01

    Cryptographic puzzles have been proposed by Merkle [15] to relay secret information between parties over an insecure channel. Client puzzles, a type of cryptographic puzzle, have been proposed by Juels and Brainard [8] to defend a server against denial of service attacks. However there is no general

  14. Energy companies need to cuddle their clients

    International Nuclear Information System (INIS)

    Due to the liberalized energy market in Europe, more than 20% of customers have chosen another electricity supplier. In spite of that, many energy suppliers do not yet operate as client-oriented businesses, according to the Ernst and Young report 'Trend in Energy 2000'. Energy companies should become more active in relationship management.

  15. Borderline Clients: Practice Implications of Recent Research.

    Science.gov (United States)

    Johnson, Harriette C.

    1991-01-01

    Reviews current research on treatment of borderline clients with medication, individual counseling, and family interventions. Notes that recent studies indicate that borderline personality is heterogeneous condition in which different underlying disorders (affective, schizotypal, and neurological) may be present. Reviews effectiveness of various…

  16. Finding Happiness for Ourselves and Our Clients.

    Science.gov (United States)

    Miller, Geri

    2001-01-01

    Reviews D. G. Myers' (2000) examination of the contributing factors of happiness: money, relationships, and religion. Discusses the implications of these factors for counseling with specific recommendations made for counselors regarding their own self-care and their work with their clients. (GCP)

  17. Counselling the Borderline Client: An Interpersonal Approach.

    Science.gov (United States)

    Angus, Lynne; Gillies, Laurie A.

    1994-01-01

    Critically reviews characteristics thought to be associated with "difficult clients" in light of research findings emerging from psychiatric and psychological literature pertaining to Borderline Personality Disorder. Reviews recent developments for brief treatment strategies and describes development of short-term interpersonal therapy program.…

  18. Psychotherapists' Attitudes toward Homosexual Psychotherapy Clients.

    Science.gov (United States)

    Garfinkle, Ellen M.; Morin, Stephen F.

    1978-01-01

    Continued research into the sex-role expectations which therapists hold toward clients is an issue of particular relevance to the gay community. The training of psychotherapists should pay attention to both sex-role expectations and homosexual stereotypes as potential sources of bias in therapists' perceptions and evaluations of homosexual…

  19. Network Intrusion Detection System Based on Client Honeypot

    Institute of Scientific and Technical Information of China (English)

    忻俊

    2015-01-01

    With changing user demands and the rapid development of Web application technology, Web applications have become more open and place greater emphasis on sharing and interaction. This has made Web applications the mainstream of today's network applications, but also a new target for hackers. Hackers implant malicious program code into websites, so that Web-based threats keep multiplying, and the Web has become one of the major infection vectors for information security attacks. This paper introduces a malicious web page detection method, the Client Honeypot. A client honeypot uses the client side to actively interact with a Web server in order to probe for and trap attacks, in contrast to the passive detection model of traditional intrusion detection systems. This study builds on and improves the open source tool HoneyC to implement malicious web page detection.
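The active-probing idea behind a client honeypot can be sketched as fetching page content and matching it against signatures of known attack patterns. This is only an illustration of the low-interaction approach; the signature list below is invented, not HoneyC's actual rule set, and a real honeypot would visit live URLs rather than canned strings.

```python
# Hedged sketch of a low-interaction client-honeypot content check.
# The signatures are illustrative examples, not a real rule set.
import re

SIGNATURES = [
    r"<iframe[^>]+visibility\s*:\s*hidden",   # hidden iframe injection
    r"document\.write\(unescape\(",           # obfuscated script loader
]

def scan(html):
    """Return the signatures that the page content triggers."""
    return [sig for sig in SIGNATURES if re.search(sig, html, re.I)]

benign = "<html><body>hello</body></html>"
malicious = '<iframe style="visibility:hidden" src="http://bad.example"></iframe>'
print(scan(benign))          # → []
print(len(scan(malicious)))  # → 1
```

A higher-interaction honeypot would instead render the page in an instrumented browser and watch for unexpected state changes, trading throughput for detection depth.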

  20. Database Driven Web Systems for Education.

    Science.gov (United States)

    Garrison, Steve; Fenton, Ray

    1999-01-01

    Provides technical information on publishing to the Web. Demonstrates some new applications in database publishing. Discusses the difference between static and database-driven Web pages. Reviews failures and successes of a Web database system. Addresses the question of how to build a database-driven Web site, discussing connectivity software, Web…

  1. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon berries at ripening initiation

    Directory of Open Access Journals (Sweden)

    Smith Derek

    2009-01-01

    Full Text Available Abstract Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide database searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening.

  2. Database driven scheduling for batch systems

    International Nuclear Information System (INIS)

    Experiments at the Jefferson Laboratory will soon be generating data at the rate of 1 TB/day. In this paper, the authors present a database driven scheme that they are currently implementing in order to ensure the safe archival and subsequent reconstruction of this data. They use a client-server architecture implemented in Java to serve data between the experiments, the mass storage, and the processor farm
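The database-driven scheduling scheme described above can be sketched with a shared jobs table from which clients atomically claim the next pending file. The table and column names are illustrative, not the Jefferson Lab schema, and SQLite stands in for the production database.

```python
# Minimal sketch of database-driven scheduling: each client claims the
# next pending file inside a transaction, so no two clients process
# the same file. Schema and names are illustrative assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE jobs (file TEXT, state TEXT DEFAULT 'pending')")
db.executemany("INSERT INTO jobs (file) VALUES (?)",
               [("run001.dat",), ("run002.dat",)])

def claim_next(conn):
    """Atomically mark one pending job as running and return its file name."""
    with conn:  # implicit transaction: the claim is all-or-nothing
        row = conn.execute(
            "SELECT rowid, file FROM jobs "
            "WHERE state = 'pending' ORDER BY rowid LIMIT 1").fetchone()
        if row is None:
            return None
        conn.execute("UPDATE jobs SET state = 'running' WHERE rowid = ?",
                     (row[0],))
        return row[1]

print(claim_next(db))  # → run001.dat
print(claim_next(db))  # → run002.dat
print(claim_next(db))  # → None
```

Because the claim runs inside a transaction, the same pattern scales from one client to a farm of processors all polling the same table.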

  3. Mobile Database Applications in Ad Hoc Networks

    Institute of Scientific and Technical Information of China (English)

    范俊; 李晓宇

    2012-01-01

    Applying the traditional mobile database model to an Ad Hoc network brings problems such as increased communication cost. This paper therefore improves the traditional model by adding a local server as an agent, forming a mobile database model composed of three kinds of nodes: mobile computers, local servers and a master server. Furthermore, two algorithms are proposed to solve the problems of transaction redoing and data synchronization between the local server and the master server, so that mobile computers can access the database efficiently and correctly. Experimental results show that the mobile database model achieves good stability.
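The local-server idea above can be sketched as an agent that serves writes from its own cache while the master is unreachable, logging each transaction and replaying the log when connectivity returns. This is a hedged illustration of the general redo-log pattern; the class and method names are invented and the paper's actual two algorithms differ in detail.

```python
# Sketch of local-server/master-server synchronization via a redo log.
# All names here are illustrative assumptions, not the paper's design.
class MasterServer:
    def __init__(self):
        self.data = {}
    def apply(self, key, value):
        self.data[key] = value

class LocalServer:
    def __init__(self, master):
        self.master = master
        self.cache = {}      # serves the mobile host while disconnected
        self.log = []        # pending transactions, in commit order
        self.online = False

    def write(self, key, value):
        self.cache[key] = value
        if self.online:
            self.master.apply(key, value)
        else:
            self.log.append((key, value))  # redo on the master later

    def reconnect(self):
        """Replay logged transactions so the master catches up."""
        self.online = True
        for key, value in self.log:
            self.master.apply(key, value)
        self.log.clear()

master = MasterServer()
local = LocalServer(master)
local.write("sensor", 1)   # master unreachable: queued in the redo log
local.reconnect()          # redo log replayed to the master
print(master.data)         # → {'sensor': 1}
```

Replaying in commit order preserves the last-writer outcome for each key, which is the core of keeping the two servers consistent after a partition.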

  4. Prioritizing Project Performance Criteria within Client Perspective

    Directory of Open Access Journals (Sweden)

    Arazi Idrus

    2011-10-01

    Full Text Available Successful performance in a construction project helps to deliver good products to the client. At present, there is no standard approach used by clients to evaluate project performance, as project success carries different definitions for different people. Some use the traditional project performance measures of cost, quality and time, while others use additional non-traditional measures such as the environment, health and safety, level of technology and contractor planning. The purpose of this study is to identify and rank the actual criteria used by local clients in current practice to measure the performance of a construction project during construction as well as upon completion. The ranking is based on the relative importance of the criteria as perceived by project performance decision makers working for clients’ organizations within the Malaysian construction industry, using their accumulated experience and judgment. The objective of this study was investigated through a postal questionnaire covering a selected sample. Data were analyzed using mean, variance, frequency and severity index analyses. The results of this paper show that Quality of finished project, Construction cost and Construction time were the three criteria considered most crucial by the respondents for evaluating project performance in current practice in Malaysia. The paper provides a practical aid for project performance decision makers working for clients’ organizations within the Malaysian construction industry to enhance and improve their practices in measuring their clients’ project performance, so that their clients would enjoy higher satisfaction levels from their projects. Moreover, the paper would serve as a guide to contractors by helping them to understand that Quality of finished project, Construction cost and Construction time are the criteria given high priority by clients in measuring the performance of a

  5. Towards the Interoperability of Web, Database, and Mass Storage Technologies for Petabyte Archives

    Science.gov (United States)

    Moore, Reagan; Marciano, Richard; Wan, Michael; Sherwin, Tom; Frost, Richard

    1996-01-01

    At the San Diego Supercomputer Center, a massive data analysis system (MDAS) is being developed to support data-intensive applications that manipulate terabyte-sized data sets. The objective is to support scientific application access to data whether it is located at a Web site, stored as an object in a database, and/or stored in an archival storage system. We are developing a suite of demonstration programs which illustrate how Web, database (DBMS), and archival storage (mass storage) technologies can be integrated. An application presentation interface is being designed that integrates data access to all of these sources. We have developed a data movement interface between the Illustra object-relational database and the NSL UniTree archival storage system running in production mode at the San Diego Supercomputer Center. With this interface, an Illustra client can transparently access data on UniTree under the control of the Illustra DBMS server. The current implementation is based on the creation of a new DBMS storage manager class, and a set of library functions that allow the manipulation and migration of data stored as Illustra 'large objects'. We have extended this interface to allow a Web client application to control data movement between its local disk, the Web server, the Illustra DBMS server, and the UniTree mass storage environment. This paper describes some of the current approaches to successfully integrating these technologies. This framework is measured against a representative sample of environmental data extracted from the San Diego Bay Environmental Data Repository. Practical lessons are drawn and critical research areas are highlighted.

  6. Biological Databases

    Directory of Open Access Journals (Sweden)

    Kaviena Baskaran

    2013-12-01

    Full Text Available Biology has entered a new era of distributing information based on databases, and these database collections have become a primary means of publishing information. This data publishing is done through the Internet Gopher, where powerful research tools offer easy and affordable access to information resources. The more important task now is the development of high-quality, professionally operated electronic data publishing sites. To enhance this service, appropriate editorial policies for electronic data publishing have been established, and the editors of articles shoulder the responsibility.

  7. A Robust Client Verification in cloud enabled m-Commerce using Gaining Protocol

    CERN Document Server

    N., Chitra Kiran

    2012-01-01

    The proposed system highlights a novel approach: an exclusive verification process using a gain protocol for ensuring security for both parties (client and service provider) in an m-commerce application with a cloud-enabled service. The proposed system is based on the ability to verify clients with a trusted handheld device, depending on the set of frequent events and actions to be carried out. The framework of the proposed work was designed after collecting real-time data sets from an Android-enabled handset, which, when subjected to the gain protocol, results in the detection of malicious behavior by illegal clients in the network. The real-time experiment was performed with the applicable datasets gathered, which show the best results for identifying threats from the last two months of collected data.

  8. A Robust Client Verification in Cloud Enabled m-Commerce using Gaining Protocol

    Directory of Open Access Journals (Sweden)

    Chitra Kiran N.

    2011-11-01

    Full Text Available The proposed system highlights a novel approach: an exclusive verification process using a gain protocol for ensuring security for both parties (client and service provider) in an m-commerce application with a cloud-enabled service. The proposed system is based on the ability to verify clients with a trusted handheld device, depending on the set of frequent events and actions to be carried out. The framework of the proposed work was designed after collecting real-time data sets from an Android-enabled handset, which, when subjected to the gain protocol, results in the detection of malicious behavior by illegal clients in the network. The real-time experiment was performed with the applicable datasets gathered, which show the best results for identifying threats from the last two months of collected data.

  9. Web Technologies And Databases

    Directory of Open Access Journals (Sweden)

    Irina-Nicoleta Odoraba

    2011-04-01

    Full Text Available A database is a collection of many types of occurrences of logical records, containing relationships between records and elementary data aggregates. A database management system (DBMS) is a set of programs for creating and operating a database. Theoretically, any relational DBMS can be used to store the data needed by a Web server. In practice, it has been observed that simple DBMSs such as FoxPro or Access are not suitable for Web sites that are used intensively. Large-scale Web applications need high-performance DBMSs able to run multiple applications simultaneously. HyperText Markup Language (HTML) is used to create hypertext documents for web pages. The purpose of HTML is the presentation of information (paragraphs, fonts, tables) rather than the semantic description of the document.

  10. Semi-structured interview instrument on client satisfaction for therapeutic community clients

    OpenAIRE

    Iyare, Sade

    2015-01-01

    Therapeutic community (TC) treatment is used around the world to treat drug addicts. The Perheiden yhdistetyn hoidon yksikkö (Pyy) unit of the Helsinki Deaconess Institute is specialized in the drug rehabilitation of families with children. Based on Cox's Interactive Model of Client Health Behavior (2003), there is a connection between client satisfaction and the results of the treatment. TC is known to be an efficient method of treating drug addicts, but there is still very little data...

  11. Base-on Cloud Computing A new type of distributed application server system design

    Directory of Open Access Journals (Sweden)

    Ying-ying Chen

    2012-11-01

    Full Text Available At this stage, application server systems such as e-commerce platforms, instant messaging systems and enterprise information systems can suffer lost connections and data latency because of excessive concurrent requests and limitations in the application server and system architectures; in serious cases, the server blocks entirely. The new type of application server system contains four parts: a client program, transfer servers, application servers and databases. The application server is the core of the system; its performance determines the system's performance. At the same time, the application servers and transfer servers can be designed as open web services, and they can be implemented as a distributed architecture over a number of hardware servers, which can effectively handle highly concurrent client application requests.

  12. Involvement of the Client in Home Care Practice

    DEFF Research Database (Denmark)

    Glasdam, Stinne; Kjær, Lone; Præstegaard, Jeanette

    2011-01-01

    one client, his cohabitant family and the involved healthcare professionals. Results: Client involvement in home care service is shown within the constructed categories: the schism between wishing for and actually being helped; the chronological order can be negotiated, not the content; liberal...... business gives the client full influence on the treatment and the therapist gold; and converting a home into both a working place and a home. Client involvement in practice seems to be very limited. Conclusion: All in all, involvement of clients in home care service seems to be more of a political illusion......Background: Through the last 35 years, ‘client involvement’ has been a mantra within health policies, education curriculums and health care institutions, yet very little is known about how ‘client involvement’ is practiced in meetings between clients and health professionals. Aim: To analyse and...

  13. Gender Dysphoria: The Therapist's Dilemma--The Client's Choice.

    Science.gov (United States)

    Sherebrin, Hannah

    1996-01-01

    Therapist's role and dilemmas faced in treating a gender dysphoric client are discussed. Examines ethical and moral issues relating to transsexualism and discusses the appropriateness of art therapy as a treatment for transsexual clients. (SNR)

  14. Client Centeredness and Health Reform: Key Issues for Occupational Therapy

    OpenAIRE

    Mroz, Tracy M.; Pitonyak, Jennifer S.; Fogelberg, Donald; Leland, Natalie E.

    2015-01-01

    Occupational therapy has the philosophical underpinnings to provide expanded and more effective client-centered care that emphasizes the active engagement of the client and recognizes the greater contexts of his or her life.

  15. Voice and Communication Therapy for Transgender/Transsexual Clients

    Science.gov (United States)

    Voice and Communication Therapy for Clients Who Are Transgender and/or Transsexual. What does the ... pathologist do when working with clients who are transgender/transsexual? What organizations have more information?

  16. Counselor Stress in Relation to Disabled and Minority Clients

    Science.gov (United States)

    Vander Kolk, Charles J.

    1977-01-01

    Physiological and self-reported reactions of counselors in training to five disabled clients and a minority client were examined. Implications for counselor practice, education, and in-service education are discussed. (Author)

  17. Cryptanalysis of Some Client-to-Client Password-Authenticated Key Exchange Protocols

    Directory of Open Access Journals (Sweden)

    Tianjie Cao

    2009-06-01

    Client-to-Client Password-Authenticated Key Exchange (C2C-PAKE) protocols allow two clients to establish a common session key based on their passwords. In a secure C2C-PAKE protocol, no computationally bounded adversary learns anything about the session keys shared between the two clients; in particular, a participating server should not learn anything about the session keys. Server-compromise impersonation resilience is another desirable security property for a C2C-PAKE protocol: compromising the password verifier of any client A should not enable an outside adversary to share a session key with A. Recently, Kwon and Lee proposed four C2C-PAKE protocols in the three-party setting, and Zhu et al. proposed a C2C-PAKE protocol in the cross-realm setting. All the proposed protocols are claimed to resist server compromise. However, in this paper, we show that Kwon and Lee's protocols and Zhu et al.'s protocol are vulnerable to server-compromise attacks, and that a malicious server can mount man-in-the-middle attacks and eavesdrop on the communication between the two clients.
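    The man-in-the-middle threat this abstract describes can be illustrated with a textbook attack on *unauthenticated* Diffie-Hellman key exchange. The sketch below is purely illustrative, with toy parameters; it does not model the Kwon-Lee or Zhu et al. protocols themselves, only the generic failure mode that PAKE protocols are designed to prevent: a malicious relay (such as a compromised server) substituting its own public values in each direction.

    ```python
    # Illustrative only: why key exchange without authentication fails.
    # Toy parameters; not a model of any specific C2C-PAKE protocol.
    import secrets

    p = 2**89 - 1   # a Mersenne prime used as a toy modulus
    g = 3           # generator chosen for illustration

    def dh_keypair():
        """Return a (private, public) Diffie-Hellman key pair mod p."""
        priv = secrets.randbelow(p - 2) + 1
        return priv, pow(g, priv, p)

    # Clients A and B intend to agree on a shared session key.
    a_priv, a_pub = dh_keypair()
    b_priv, b_pub = dh_keypair()

    # Mallory (e.g. a malicious server relaying the messages) replaces
    # each client's public value with her own in transit.
    m_priv, m_pub = dh_keypair()

    # A computes a key from what it believes is B's public value
    # (really Mallory's); B does the same in the other direction.
    key_a = pow(m_pub, a_priv, p)
    key_b = pow(m_pub, b_priv, p)

    # Mallory derives both keys and can now read and modify all traffic.
    key_a_mallory = pow(a_pub, m_priv, p)
    key_b_mallory = pow(b_pub, m_priv, p)

    assert key_a == key_a_mallory and key_b == key_b_mallory
    ```

    A PAKE protocol defeats this by binding the exchanged values to the clients' passwords, so a relay that does not know the password cannot produce substitute values that both sides accept.
    
    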

  18. How Social Workers Demonstrate Respect for Elderly Clients

    OpenAIRE

    Sung, Kyu-taik; Dunkle, Ruth E.

    2009-01-01

    Although respect is a crucial aspect of social work practice, few studies have examined how social workers convey their respect for elderly clients. This study explored the various forms of respect demonstrated by social workers when they were with older clients. Fifty social workers serving elderly clients were surveyed by a questionnaire with closed- and open-ended questions. Based on data on the way the social workers respected their elderly clients, the study identified seven forms most f...

  19. Analysis of isotropic turbulence using a public database and the Web service model, and applications to study subgrid models

    Science.gov (United States)

    Meneveau, Charles; Yang, Yunke; Perlman, Eric; Wan, Minping; Burns, Randal; Szalay, Alex; Chen, Shiyi; Eyink, Gregory

    2008-11-01

    A public database system archiving a direct numerical simulation (DNS) data set of isotropic, forced turbulence is used for studying basic turbulence dynamics. The data set consists of the DNS output on 1024-cubed spatial points and 1024 time-samples spanning about one large-scale turn-over timescale. This complete space-time history of turbulence is accessible to users remotely through an interface that is based on the Web-services model (see http://turbulence.pha.jhu.edu). Users may write and execute analysis programs on their host computers, while the programs make subroutine-like calls that request desired parts of the data over the network. The architecture of the database is briefly explained, as are some of the new functions such as Lagrangian particle tracking and spatial box-filtering. These tools are used to evaluate and compare subgrid stresses and models.
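    The box-filtering and subgrid-stress evaluation mentioned in the abstract can be sketched locally. The snippet below is a minimal numpy illustration using a synthetic random field in place of DNS data (real data would be fetched through the database's Web-service interface, whose API is not reproduced here): it applies a separable top-hat (box) filter with periodic boundaries and forms the subgrid-scale stress tau_ij = filter(u_i u_j) - filter(u_i) filter(u_j).

    ```python
    # Minimal sketch of box-filtering and subgrid-scale (SGS) stress.
    # Uses a synthetic random field, not actual DNS data.
    import numpy as np

    def box_filter(field, w):
        """Top-hat filter of odd width w along each axis, periodic BCs."""
        half = w // 2
        out = field.copy()
        for axis in range(field.ndim):
            acc = np.zeros_like(out)
            for s in range(-half, half + 1):
                acc += np.roll(out, s, axis=axis)
            out = acc / (2 * half + 1)
        return out

    rng = np.random.default_rng(0)
    n, w = 32, 5
    u = rng.standard_normal((3, n, n, n))   # toy 3-component velocity field

    # SGS stress: tau_ij = filter(u_i u_j) - filter(u_i) filter(u_j)
    ubar = np.stack([box_filter(u[i], w) for i in range(3)])
    tau = np.empty((3, 3, n, n, n))
    for i in range(3):
        for j in range(3):
            tau[i, j] = box_filter(u[i] * u[j], w) - ubar[i] * ubar[j]
    ```

    Because the box filter is a convex average, the trace of tau is non-negative pointwise (by Jensen's inequality), a basic sanity check on any such filtering implementation.
    
    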

  20. Cloud Databases: A Paradigm Shift in Databases

    OpenAIRE

    Indu Arora; Anu Gupta

    2012-01-01

    Relational databases ruled the Information Technology (IT) industry for almost 40 years. But the last few years have seen sea changes in the way IT is used and viewed. Stand-alone applications have been replaced with web-based applications, dedicated servers with multiple distributed servers, and dedicated storage with network storage. Cloud computing has become a reality due to its lower cost, scalability and pay-as-you-go model. It is one of the biggest changes in IT after the rise of Wor...