WorldWideScience

Sample records for client database applications

  1. A Multidatabase System as 4-Tiered Client-Server Distributed Heterogeneous Database System

    OpenAIRE

    Mohammad Ghulam Ali

    2009-01-01

    In this paper, we describe a multidatabase system as a 4-tiered Client-Server DBMS architecture. We discuss its functional components and provide an overview of its performance characteristics. The first component of the proposed system is a web-based interface or Graphical User Interface, which resides on top of the Client Application Program; the second component of the system is a client application program running in an application server, which resides on top of the Global Database M...

  2. Multi-tiered Client/Server Database Application Based on the Web

    Institute of Scientific and Technical Information of China (English)

    李文生; 潘世兵

    2001-01-01

    This paper discusses the computing model of multi-tiered client/server database applications based on the Web, and proposes a method and steps for constructing such applications with Delphi.

  3. CLIENT-TO-CLIENT STREAMING SCHEME FOR VOD APPLICATIONS

    OpenAIRE

    T R Gopala Krishnan Nair; Dakshayini, M

    2010-01-01

    In this paper, we propose an efficient client-to-client streaming approach that cooperatively streams video using a chaining technique with unicast communication among the clients. This approach addresses two major issues of VoD: 1) a prefix caching scheme to accommodate more videos closer to the client, so that the request-service delay for the user can be minimized; 2) a cooperative proxy and client chaining scheme for streaming the videos using unicasting. This approach minimizes the clien...

  4. Database and interface modifications: change management without affecting the clients

    International Nuclear Information System (INIS)

    The first Oracle-based Controls Configuration Database (CCDB) was developed in 1986, by which the controls system of CERN's Proton Synchrotron became data-driven. Since then, this mission-critical system has evolved tremendously, going through several generational changes in terms of the increasing complexity of the control system, software technologies and data models. Today, the CCDB covers the whole CERN accelerator complex and satisfies a much wider range of functional requirements. Despite its online usage, everyday operations of the machines must not be disrupted. This paper describes our approach to dealing with change while ensuring continuity. The successful strategy that has been put in place is based on the following guidelines: -) Involve end-users right from the start, throughout the design and development process; -) Provide four separate environments: development, unit and functional testing, integration testing (TestBed), and production; -) Analyze the impact of a change and try to apply only backward-compatible changes; -) Communicate in a timely, clear and transparent manner on scheduled interventions and their impact; and -) Coordinate the upgrades with impacted clients

  5. Cloud Storage Client Application Analysis

    Directory of Open Access Journals (Sweden)

    Rakesh Malik

    2015-06-01

    Full Text Available The research proposed in this paper focuses on gathering evidence from devices with UNIX/Linux systems (in particular Ubuntu 14.04 and Android OS) and Windows 8.1, in order to find artifacts left by cloud storage applications that suggest their use even after the deletion of the applications. The work performed aims to expand upon the prior work done by other researchers in the field of cloud forensics and to show an example of analysis. We show where and what type of data remnants can be found using our analysis and how this information can be used as evidence in a digital forensic investigation.
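The entry above describes locating data remnants left behind by cloud storage clients. A minimal sketch of such an artifact sweep is shown below; the file-name patterns are invented placeholders, since a real investigation would use the patterns documented for the specific client under analysis.

```python
import os
import re

# Hypothetical artifact-name patterns; real investigations would use
# patterns documented for the specific cloud client under analysis.
ARTIFACT_PATTERNS = [
    re.compile(r"snapshot\.db$"),       # e.g. a client's local metadata store
    re.compile(r"sync_history\.log$"),
    re.compile(r"\.cloudclient"),
]

def find_artifacts(root):
    """Walk `root` and return paths whose file names match a known pattern."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if any(p.search(name) for p in ARTIFACT_PATTERNS):
                hits.append(os.path.join(dirpath, name))
    return sorted(hits)
```

Matching surviving file names against known artifact patterns is only the first step; the hits would then be examined for timestamps and content.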

  6. Database Application Schema Forensics

    OpenAIRE

    Hector Quintus Beyers; Olivier, Martin S; Hancke, Gerhard P.

    2014-01-01

    The application schema layer of a Database Management System (DBMS) can be modified to deliver results that may warrant a forensic investigation. Table structures can be corrupted by changing the metadata of a database or operators of the database can be altered to deliver incorrect results when used in queries. This paper will discuss categories of possibilities that exist to alter the application schema with some practical examples. Two forensic environments are introduced where a forensic ...

  7. A real time multi-server multi-client coherent database for a new high voltage system

    International Nuclear Information System (INIS)

    A high voltage system has been designed to allow multiple users (clients) access to the database of measured values and settings. This database is actively maintained in real time for a given mainframe containing multiple modules each having their own database. With limited CPU nd memory resources the mainframe system provides a data coherency scheme for multiple clients which (1) allows the client to determine when and what values need to be updated, (2) allows for changes from one client to be detected by another client, and (3) does not depend on the mainframe system tracking client accesses
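The coherency scheme summarized above can be sketched with a global sequence number: the server stamps every change, and each client pulls everything newer than its last-seen stamp, so the server never needs to track individual clients. The following Python sketch is an illustrative reconstruction, not the system's actual protocol.

```python
# Illustrative sketch of a client-pull coherency scheme: the mainframe
# stamps every change with a monotonically increasing sequence number,
# and clients ask only for values newer than what they already hold.

class Mainframe:
    def __init__(self):
        self.seq = 0
        self.values = {}            # channel -> (value, seq of last change)

    def set(self, channel, value):
        self.seq += 1
        self.values[channel] = (value, self.seq)

    def changes_since(self, client_seq):
        """Return all updates newer than the client's last-seen sequence."""
        return {ch: (v, s) for ch, (v, s) in self.values.items() if s > client_seq}

class Client:
    def __init__(self, mainframe):
        self.mf = mainframe
        self.seq = 0                # last sequence number seen by this client
        self.cache = {}

    def refresh(self):
        # The client decides when and what to update; changes made through
        # one client become visible to another on its next refresh.
        for ch, (v, s) in self.mf.changes_since(self.seq).items():
            self.cache[ch] = v
            self.seq = max(self.seq, s)
```

Because the server only answers "what changed after sequence N", its memory cost is independent of the number of clients, matching the limited-resources constraint described in the abstract.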

  8. Incorporating client-server database architecture and graphical user interface into outpatient medical records.

    OpenAIRE

    Fiacco, P. A.; Rice, W. H.

    1991-01-01

    Computerized medical record systems require structured database architectures for information processing. However, the data must be able to be transferred across heterogeneous platform and software systems. Client-Server architecture allows for distributive processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model with a graphical user interface into an outpatient medical ...

  9. Database Application Schema Forensics

    Directory of Open Access Journals (Sweden)

    Hector Quintus Beyers

    2014-12-01

    The application schema layer of a Database Management System (DBMS) can be modified to deliver results that may warrant a forensic investigation. Table structures can be corrupted by changing the metadata of a database, or operators of the database can be altered to deliver incorrect results when used in queries. This paper will discuss categories of possibilities that exist to alter the application schema, with some practical examples. Two forensic environments in which a forensic investigation can take place are introduced. Arguments are provided for why these environments are important. Methods are presented for how these environments can be achieved for the application schema layer of a DBMS. A process is proposed for how forensic evidence should be extracted from the application schema layer of a DBMS. The application schema forensic evidence identification process can be applied to a wide range of forensic settings.

  10. Product Licenses Database Application

    CERN Document Server

    Tonkovikj, Petar

    2016-01-01

    The goal of this project is to organize and centralize the data about software tools available to CERN employees, as well as provide a system that would simplify the license management process by providing information about the available licenses and their expiry dates. The project development process consists of two steps: modeling the products (software tools), product licenses, legal agreements and other data related to these entities in a relational database, and developing the front-end user interface so that the user can interact with the database. The result is an ASP.NET MVC web application with interactive views for displaying and managing the data in the underlying database.

  11. Databases and their application

    NARCIS (Netherlands)

    E.C. Grimm; R.H.W. Bradshaw; S. Brewer; S. Flantua; T. Giesecke; A.M. Lézine; H. Takahara; J.W. Williams Jr

    2013-01-01

    During the past 20 years, several pollen database cooperatives have been established. These databases are now constituent databases of the Neotoma Paleoecology Database, a public domain, multiproxy, relational database designed for Quaternary-Pliocene fossil data and modern surface samples. The poll

  12. Client-Centric Adaptive Scheduling of Service-Oriented Applications

    Institute of Scientific and Technical Information of China (English)

    Jing Wang; Li-Yong Zhang; Yan-Bo Han

    2006-01-01

    The paper proposes a client-centric computing model that allows for adaptive execution of service-oriented applications. The model can flexibly dispatch application tasks to the client side and the network side, dynamically adjust an execution scheme to adapt to environmental changes, and is thus expected to achieve better scalability, higher performance and more controllable privacy. Scheduling algorithms and rescheduling strategies are proposed for the model. Experiments show that with the model the performance of service-oriented application execution can be improved.

  13. Database and Expert Systems Applications

    DEFF Research Database (Denmark)

    Viborg Andersen, Kim; Debenham, John; Wagner, Roland

    This book constitutes the refereed proceedings of the 16th International Conference on Database and Expert Systems Applications, DEXA 2005, held in Copenhagen, Denmark, in August 2005. The 92 revised full papers presented together with 2 invited papers were carefully reviewed and selected from 390 submissions. The papers are organized in topical sections on workflow automation, database queries, data classification and recommendation systems, information retrieval in multimedia databases, Web applications, implementational aspects of databases, multimedia databases, XML processing, security, reasoning and learning, network management and mobile systems, expert systems and decision support, and information modelling.

  14. Database characterisation of HEP applications

    Science.gov (United States)

    Piorkowski, Mariusz; Grancher, Eric; Topurov, Anton

    2012-12-01

    Oracle-based database applications underpin many key aspects of operations for both the LHC accelerator and the LHC experiments. In addition to the overall performance, the predictability of the response is a key requirement to ensure smooth operations, and delivering predictability requires understanding the applications from the ground up. Fortunately, database management systems provide several tools to check, measure, analyse and gather useful information. We present our experiences characterising the performance of several typical HEP database applications; these performance characterisations were used to deliver improved predictability and scalability, as well as to optimise the hardware platform choice as we migrated to new hardware and Oracle 11g.

  15. Database characterisation of HEP applications

    International Nuclear Information System (INIS)

    Oracle-based database applications underpin many key aspects of operations for both the LHC accelerator and the LHC experiments. In addition to the overall performance, the predictability of the response is a key requirement to ensure smooth operations, and delivering predictability requires understanding the applications from the ground up. Fortunately, database management systems provide several tools to check, measure, analyse and gather useful information. We present our experiences characterising the performance of several typical HEP database applications; these performance characterisations were used to deliver improved predictability and scalability, as well as to optimise the hardware platform choice as we migrated to new hardware and Oracle 11g.

  16. Database Transformations for Biological Applications

    Energy Technology Data Exchange (ETDEWEB)

    Overton, C.; Davidson, S. B.; Buneman, P.; Tannen, V.

    2001-04-11

    The goal of this project was to develop tools to facilitate data transformations between heterogeneous data sources found throughout biomedical applications. Such transformations are necessary when sharing data between different groups working on related problems as well as when querying data spread over different databases, files and software analysis packages.

  17. International Ventilation Cooling Application Database

    DEFF Research Database (Denmark)

    Holzer, Peter; Psomas, Theofanis Ch.; OSullivan, Paul

    2016-01-01

    The currently running International Energy Agency, Energy and Conservation in Buildings, Annex 62 Ventilative Cooling (VC) project, is coordinating research towards extended use of VC. Within this Annex 62 the joint research activity of International VC Application Database has been carried out, ...

  18. ATLAS database application enhancements using Oracle 11g

    International Nuclear Information System (INIS)

    The ATLAS experiment at LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, and condition data replication to remote sites. The Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided into production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the 260 hosted database schemas (in the most common case, each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN were upgraded to the newest Oracle version at the time: Oracle 11g Release 2. Oracle 11g comes with several key improvements compared to previous database engine versions. In this work we present our evaluation of the most relevant new features of Oracle 11g of interest for ATLAS applications and use cases. Notably, we report on the performance and scalability enhancements obtained in production since the Oracle 11g deployment during Q1 2012, and we outline plans for future work in this area.

  19. ATLAS database application enhancements using Oracle 11g

    Science.gov (United States)

    Dimitrov, G.; Canali, L.; Blaszczyk, M.; Sorokoletov, R.

    2012-12-01

    The ATLAS experiment at LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, and condition data replication to remote sites. The Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided into production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the 260 hosted database schemas (in the most common case, each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN were upgraded to the newest Oracle version at the time: Oracle 11g Release 2. Oracle 11g comes with several key improvements compared to previous database engine versions. In this work we present our evaluation of the most relevant new features of Oracle 11g of interest for ATLAS applications and use cases. Notably, we report on the performance and scalability enhancements obtained in production since the Oracle 11g deployment during Q1 2012, and we outline plans for future work in this area.

  20. Design and implementation of an enterprise information system utilizing a component based three-tier client/server database system

    OpenAIRE

    Akbay, Murat.; Lewis, Steven C.

    1999-01-01

    The Naval Security Group currently requires a modern architecture to merge existing command databases into a single Enterprise Information System through which each command may manipulate administrative data. There are numerous technologies available to build and implement such a system. Component-based architectures are extremely well-suited for creating scalable and flexible three-tier Client/Server systems because the data and business logic are encapsulated within objects, allowing them t...

  1. Building Database-Powered Mobile Applications

    OpenAIRE

    Paul POCATILU

    2012-01-01

    Almost all mobile applications use persistency for their data. A common way for complex mobile applications is to store data in local relational databases. Almost all major mobile platforms include a relational database engine. These database engines expose a specific API (Application Programming Interface) to be used by mobile application developers for data definition and manipulation. This paper focuses on database-based application models for several mobile platforms (Android, Symbian, Wind...
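As an illustration of the kind of data-definition and data-manipulation API such engines expose, the following sketch uses Python's sqlite3 module as a stand-in for the platform-specific bindings (SQLite is the embedded relational engine behind most mobile platforms' database APIs, e.g. Android's SQLiteOpenHelper).

```python
import sqlite3

# SQLite stands in here for a mobile platform's local database engine;
# an in-memory database keeps the sketch self-contained.
conn = sqlite3.connect(":memory:")

# Data definition
conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")

# Data manipulation, with a parameterized statement (the idiomatic way
# to pass user data on every platform's database API)
conn.execute("INSERT INTO notes (body) VALUES (?)", ("first note",))
conn.commit()

rows = conn.execute("SELECT id, body FROM notes").fetchall()
```

The platform APIs differ in surface syntax, but the definition/manipulation split shown here is common to all of them.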

  2. MCIP Client Application for SCADA in IIoT Environment

    Directory of Open Access Journals (Sweden)

    Nicoleta Cristina GAITAN

    2015-09-01

    Modern automation system architectures include several subsystems, among which an adequate sharing of the workload is required. These subsystems must work together to fulfil the tasks imposed by the common function, given by the business purpose to be fulfilled. In order to perform these tasks, these subsystems or components must communicate with each other, this being the critical function of the architecture of such a system. This article presents MCIP (Monitoring and Control of Industrial Processes), an object-oriented client application which allows the monitoring and control of industrial processes. As a novelty, the paper presents the architecture of the user object, which is actually a wrapper that allows the connection to the Communication Standard Interface bus, the characteristics of the IIoT (Industrial Internet of Things) object, and the correspondence between a server's address space and the address space of MCIP.

  3. Using a Framework to develop Client-Side App : A Javascript Framework for cross-platform application

    OpenAIRE

    Shakya, Udeep

    2014-01-01

    This project aims to study the comfort of using a framework to develop client-side applications based on Hypertext Markup Language 5 (HTML5), Cascading Style Sheets (CSS) and JavaScript technology. The application is intended to serve both as a web client application and as a mobile client application for multiple platforms. A survey-answering application which fetches questions (texts) from an Application Programming Interface (API) in the application server and uploads text, sound, video and picture...

  4. Multimedia database retrieval technology and applications

    CERN Document Server

    Muneesawang, Paisarn; Guan, Ling

    2014-01-01

    This book explores multimedia applications that emerged from computer vision and machine learning technologies. These state-of-the-art applications include MPEG-7, interactive multimedia retrieval, multimodal fusion, annotation, and database re-ranking. The application-oriented approach maximizes reader understanding of this complex field. Established researchers explain the latest developments in multimedia database technology and offer a glimpse of future technologies. The authors emphasize the crucial role of innovation, inspiring users to develop new applications in multimedia technologies

  5. Exchanging the Context between OGC Geospatial Web clients and GIS applications using Atom

    Science.gov (United States)

    Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier

    2013-04-01

    Currently, the discovery and sharing of geospatial information over the web still presents difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication presents an extension of Atom to redistribute references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds and direct data, and share this status with other users that need the same information and use different client vendor products, in an interoperable way. The extensibility of the Atom format was essential to define a format that could be used in RSS-enabled web browsers, mass-market map viewers and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, it is possible to view the document in common places where Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description and downloading features. OWS Context uses GeoRSS so that the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context is to develop an XSLT to transform the Atom feed into an HTML5 document that shows the exact status of the client view window that saved the context document. To accomplish this, we use the width and height of the client window, and the extent of the view in world (geographic) coordinates, in order to calculate the scale of the map. Then, we can mix elements in world coordinates (such as CF-NetCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles and direct SVG content).
A smarter map browser application called MiraMon Map Browser is able to write a context document and read
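The scale computation described in this entry (client window size in pixels plus view extent in world coordinates) can be sketched as follows. The 96 dpi screen resolution and the equatorial degrees-to-metres conversion are simplifying assumptions for the sketch, not values from the paper.

```python
# Sketch of estimating a map's scale denominator from the client window
# width in pixels and the view extent in geographic coordinates.
# Assumptions: a 96 dpi display and an equatorial metres-per-degree
# conversion (both would vary in a real client).
METERS_PER_DEGREE = 111_320            # approximate, at the equator
METERS_PER_PIXEL_AT_96DPI = 0.0254 / 96  # one pixel's physical size

def scale_denominator(width_px, lon_min, lon_max):
    """Scale denominator = ground metres per pixel / physical metres per pixel."""
    ground_meters = (lon_max - lon_min) * METERS_PER_DEGREE
    ground_meters_per_pixel = ground_meters / width_px
    return ground_meters_per_pixel / METERS_PER_PIXEL_AT_96DPI
```

A 1000-pixel-wide window showing one degree of longitude near the equator works out to roughly 1:420,000 under these assumptions.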

  6. Web Applications Security : A security model for client-side web applications

    OpenAIRE

    Prabhakara, Deepak

    2009-01-01

    The Web has evolved to support sophisticated web applications. These web applications are exposed to a number of attacks and vulnerabilities. The existing security model is unable to cope with these increasing attacks and there is a need for a new security model that not only provides the required security but also supports recent advances like AJAX and mashups. The attacks on client-side Web Applications can be attributed to four main reasons – 1) lack of a security context for Web Browsers...

  7. The First Android Client Application for the iLab Shared Architecture

    Directory of Open Access Journals (Sweden)

    Bogdan-Alexandru Deaky

    2012-02-01

    This paper presents the first Android client application developed for online laboratories based on the iLab Shared Architecture. An important challenge was to properly connect to the ISA Service Broker, because its current version was developed with browser-based client applications in mind. The application was successfully tested on several real-world mobile devices, and the experience gained represents the basis for future changes in the Service Broker and for future teleengineering applications that involve Android.

  8. Maintaining Stored Procedures in Database Application

    Directory of Open Access Journals (Sweden)

    Santosh Kakade

    2012-06-01

    Stored procedures and triggers have an irreplaceable importance in any database application, as they provide a powerful way to code application logic that can be stored on the server and executed according to the needs of the application. Writing stored procedures for a database application involves a set of SQL statements with an assigned name that is stored in the database in compiled form so that it can be shared by a number of programs. The use of stored procedures can be helpful in controlling access to data (end-users may enter or change data but do not write procedures), preserving data integrity, and improving productivity, since statements in a stored procedure only need to be written one time.
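To illustrate the idea of logic defined once and shared by many queries, the following Python sketch uses SQLite, which has no stored procedures; registering a function with the engine via create_function serves here as a self-contained stand-in for server-side procedures in systems such as Oracle or SQL Server. The table and discount rule are invented for the example.

```python
import sqlite3

# SQLite stands in for a server DBMS so the sketch is runnable anywhere.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 100.0), (2, 250.0)])

def apply_discount(amount):
    """Business logic written once; every query that calls it shares it."""
    return amount * 0.9 if amount >= 200 else amount

# Register the logic with the engine, analogous to installing a stored
# procedure/function on the server in Oracle or SQL Server.
conn.create_function("apply_discount", 1, apply_discount)

rows = conn.execute(
    "SELECT id, apply_discount(amount) FROM orders ORDER BY id"
).fetchall()
```

As in a real stored procedure, changing the discount rule in one place changes it for every statement that calls it.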

  9. Professional iOS database application programming

    CERN Document Server

    Alessi, Patrick

    2013-01-01

    Updated and revised coverage that includes the latest versions of iOS and Xcode. Whether you're a novice or experienced developer, you will want to dive into this updated resource on database application programming for the iPhone and iPad. Packed with more than 50 percent new and revised material - including completely rebuilt code, screenshots, and full coverage of new features pertaining to database programming and enterprise integration in iOS 6 - this must-have book intends to continue the precedent set by the previous edition by helping thousands of developers master database

  10. Research and application of ORACLE performance optimizing technologies for building airplane environment resource database

    Science.gov (United States)

    Zhang, Jianjun; Sun, Jianyong; Cheng, Conggao

    2013-03-01

    Many problems exist in processing experimental aircraft vibration (temperature, humidity) data and generating the intermediate calculations during the construction of an airplane environment resource database, such as the need to deal with both structured and non-structured data, the weak capacity of the client browser for data processing, and massive network data transfers. To solve the above problems, some strategies for tuning and optimizing database performance are employed based on Oracle 11g, including data storage structure tuning, the memory configuration of the server, disk I/O tuning and SQL statement tuning. The experimental results show that the performance of the airplane environment resource database is enhanced by about 80% compared with the database developed in the initial demonstration and validation phase. The application of the new optimization strategies to database construction can lay a sound foundation for completing the airplane environment resource database.
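One of the tuning strategies mentioned above is SQL statement tuning, where a common first step is checking whether a frequent query scans the whole table or uses an index. This illustrative sketch uses SQLite rather than Oracle so it stays self-contained (Oracle's analogue is EXPLAIN PLAN); the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vibration (ts INTEGER, sensor TEXT, value REAL)")

# Before indexing: the query plan reports a full-table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT value FROM vibration WHERE sensor = ?", ("s1",)
).fetchall()
before = plan[0][3]   # the human-readable 'detail' column of the plan row

# Add an index on the filtered column, then re-check the plan.
conn.execute("CREATE INDEX idx_sensor ON vibration (sensor)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT value FROM vibration WHERE sensor = ?", ("s1",)
).fetchall()
after = plan[0][3]    # now a search using idx_sensor
```

Comparing the two plan strings before and after the index is the same workflow the abstract's SQL statement tuning refers to, scaled down to one query.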

  11. Connecting traces: understanding client-server interactions in Ajax applications

    NARCIS (Netherlands)

    Matthijssen, N.; Zaidman, A.; Storey, M.; Bull, I.; Van Deursen, A.

    2010-01-01

    Ajax-enabled web applications are a new breed of highly interactive, highly dynamic web applications. Although Ajax allows developers to create rich web applications, Ajax applications can be difficult to comprehend and thus to maintain. For this reason, we have created FireDetective, a tool that us

  12. Cross-platform development of the Smart Client application with Qt framework and QtQuick

    OpenAIRE

    Krajewski, Marek

    2016-01-01

    In this thesis the Qt Framework is evaluated as a tool that can support the cross-platform development of desktop, mobile and embedded applications. Hence, a hybrid client application is developed to assess its capabilities for creating a product providing a good user experience on a wide range of target devices. The application is required to demonstrate implementation of the Graphical User Interface, network communication with a server and access to the native development e...

  13. Application of database systems in diabetes care.

    Science.gov (United States)

    Kopelman, P G; Sanderson, A J

    1996-01-01

    The St Vincent Declaration includes a commitment to continuous quality improvement in diabetes care. This necessitates the collection of appropriate information to ensure that diabetes services are efficient, effective and equitable. The quantity of information, and the need for rapid access, mean that this must be computer-based. The choice of architecture and the design of a database for diabetes care must take into account available equipment and operational requirements. Hardware topology may be determined by the operating system and/or netware software. An effective database system will include: user-friendliness, rapid but secure access to data, a facility for multiple selections for analysis and audit, the ability to be used as part of the patient consultation process, the ability to interface or integrate with other applications, and cost efficiency. An example of a clinical information database for diabetes care, Diamond, is described. PMID:9244825

  14. Exemplary applications of the OECD fire database

    International Nuclear Information System (INIS)

    In general, the data from NPP experience with fire events stored in the OECD FIRE Database can provide answers to several interesting questions and insights into phenomena, such as examples of frequent fire initiators and their root causes, of electrical equipment failure modes, of fire protection equipment malfunctions, and of impaired fire barriers. Exemplary applications of the OECD FIRE Database show that it is already possible to retrieve reasonable qualitative information and to obtain, to some extent, quantitative estimates, which can support the interpretation of the operating experience for specific events in the member countries participating in the OECD FIRE Project. The quantitative information will, of course, increase with the increasing number of reported events and a careful description of the respective events to provide as much information as available. In the third phase of the Project, starting in 2010, the OECD FIRE Database will be further analyzed with respect to applications for first probabilistic safety assessment considerations, e.g. the positive and negative role of the human factor in fire ignition on the one hand, and, on the other hand, in fire detection and extinguishing. This has to be investigated in more detail to generate Fire PSA results with higher confidence. Positive effects of human behavior on fire extinguishing are already identified in the existing Database. One of the main questions which could be answered by the OECD FIRE Database is how fires can propagate from the initial fire compartment to other compartments, even if there are protective means available for the prevention of fire spreading. For generating meaningful event and fault trees for various safety-significant fire scenarios, a clear and as far as possible detailed (with respect to time dependencies and safety significance) description of the initial fire event sequence and its consequences is essential.
The coding of events has to reflect as far as feasible

  15. Application of graph database for analytical tasks

    OpenAIRE

    Günzl, Richard

    2014-01-01

    This diploma thesis is about graph databases, which belong to the category of database systems known as NoSQL databases, although graph databases go beyond typical NoSQL databases. Graph databases are useful in many cases thanks to the native storage of interconnections between data, which brings advantageous properties in comparison with traditional relational database systems, especially in querying. The main goal of the thesis is: to describe the principles, properties and advantages of graph databases; to desi...

  16. LISA, the next generation: from a web-based application to a fat client.

    Science.gov (United States)

    Pierlet, Noëlla; Aerts, Werner; Vanautgaerden, Mark; Van den Bosch, Bart; De Deurwaerder, André; Schils, Erik; Noppe, Thomas

    2008-01-01

    The LISA application, developed by the University Hospitals Leuven, permits referring physicians to consult the electronic medical records of their patients over the internet in a highly secure way. We decided to completely change the way we secured the application, discard the existing web application and build a completely new application, based on the in-house developed hospital information system, used in the University Hospitals Leuven. The result is a fat Java client, running on a Windows Terminal Server, secured by a commercial SSL-VPN solution.

  17. A Proposal of Client Application Architecture using Loosely Coupled Component Connection Method in Banking Branch System

    Science.gov (United States)

    Someya, Harushi; Mori, Yuichi; Abe, Masahiro; Machida, Isamu; Hasegawa, Atsushi; Yoshie, Osamu

    Due to the deregulation of the financial industry, branches in banking need to shift from operation-oriented bases to sales-oriented bases. To support this movement, new banking branch systems are being developed. The main characteristic of the new systems is that form operations traditionally performed at each branch are moved into a centralized operation center for the sake of rationalization and efficiency. The branches handle a wide variety of forms. The forms can often be described by common items, but those items carry different business logic, and each form defines different relations among its items. There is also a need for users to develop the client applications themselves. Consequently, the challenge is to provide a development environment that is highly reusable, easily customizable, and usable by end-user developers. We propose a client application architecture that uses a loosely coupled component connection method and allows applications to be developed by describing only the screen configurations and their transitions in XML documents. By adopting our architecture, we developed client applications of the centralized operation center for the latest banking branch system. Our experiments demonstrate good performance.
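The idea of driving a client purely from declared screen configurations and transitions can be sketched as below. The XML element and attribute names here are illustrative assumptions, not the actual format used by the banking branch system described above:

```python
import xml.etree.ElementTree as ET

# Hypothetical screen-flow description: "screen", "transition", "on" and "to"
# are invented names for illustration only.
FLOW = """
<application start="login">
  <screen id="login"><transition on="ok" to="menu"/></screen>
  <screen id="menu">
    <transition on="deposit" to="deposit_form"/>
    <transition on="quit" to="login"/>
  </screen>
  <screen id="deposit_form"><transition on="done" to="menu"/></screen>
</application>
"""

def load_flow(xml_text):
    """Build a (screen, event) -> next-screen transition table from the XML."""
    root = ET.fromstring(xml_text)
    table = {}
    for screen in root.findall("screen"):
        for tr in screen.findall("transition"):
            table[(screen.get("id"), tr.get("on"))] = tr.get("to")
    return root.get("start"), table

start, table = load_flow(FLOW)
current = table[(start, "ok")]   # login --ok--> menu
```

Because the components never reference each other directly, only the XML document has to change to rearrange the screen flow.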

  18. An Evaluation of the Eclipse Rich Client Platform for a telecom management application

    OpenAIRE

    Frising, Philip

    2008-01-01

    The Software Management Organizer (SMO) application is used by telecom operators for remote software and hardware handling of telecommunication equipment. The graphical user interface (GUI) provided by SMO is called SMO GUI and is costly to maintain, extend and test. The Eclipse Rich Client Platform (RCP) provides a platform for building component-based GUIs with rich functionality. This thesis evaluates how the Eclipse RCP can be used for building a new SMO GUI. The evaluation will be pe...

  19. AIDA Asia. Artificial Insemination Database Application

    International Nuclear Information System (INIS)

    The objectives of AIDA (Artificial Insemination Database Application) and its companion GAIDA (Guide to AI Data Analysis) are to address two major problems in on-farm research on livestock production. The first is the quality of the data collected and the second is the intellectual rigor of the analyses and their associated results when statistically testing causal hypotheses. The solution is to develop a data management system such as AIDA and an analysis system such as GAIDA to estimate parameters that explain biological mechanisms for on-farm application. The system uses epidemiological study designs in the uncontrolled research environment of the farm, uses a database manager (Microsoft Access) to handle the data management issues encountered in preparing data for analysis, and then uses a statistical program (SYSTAT) to do preliminary analyses. These analyses enable the researcher to gain a better understanding of the biological mechanisms reflected in the data contained within the AIDA database. Using GAIDA as a guide, this preliminary analysis helps to determine the strategy for further in-depth analyse

  20. NSLS-II High Level Application Infrastructure And Client API Design

    International Nuclear Information System (INIS)

    The beam commissioning software framework of the NSLS-II project adopts a client/server architecture to replace the more traditional monolithic high level application approach. It is an open platform, and we aim to provide a narrow API set for client applications. With this narrow API, existing applications developed in different languages under different architectures can be ported to our platform with small modifications. This paper describes the system infrastructure design, the client API, system integration, and the latest progress. As a new 3rd generation synchrotron light source with ultra-low emittance, there are new requirements and challenges in controlling and manipulating the beam. A use case study and a theoretical analysis have been performed to clarify the requirements and challenges for the high level application (HLA) software environment. To satisfy those requirements and challenges, an adequate system architecture for the software framework is critical for beam commissioning, study and operation. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low level hardware processing from numerical algorithm computing, physics modelling, data manipulation, plotting, and error handling. However, none of the existing approaches can satisfy the requirements. A new design has been proposed by introducing service oriented architecture technology. The HLA is a combination of tools for accelerator physicists and operators, as in the traditional approach. In NSLS-II, these include monitoring applications and control routines. A scripting environment is very important for the latter part of the HLA, and both parts are designed on a common set of APIs. Physicists and operators are the users of these APIs, while control system engineers and a few accelerator physicists are their developers. With our client/server based approach, we leave how to retrieve information to the

  1. Structuring modern web applications : A study of how to structure web clients to achieve modular, maintainable and longlived applications

    OpenAIRE

    MALMSTRÖM, TIM JOHAN

    2014-01-01

    This degree project, conducted at Decerno AB, investigates what can be done to create client-side web applications that remain maintainable for a long time. The focus is on basing the application on an existing framework, which both simplifies the development process and helps keep the application well structured. Which framework is currently the best is evaluated through a comparison of the most popular frameworks. The comparison is done using a set of categories that is defined ...

  2. Software Application for Supporting the Education of Database Systems

    Science.gov (United States)

    Vágner, Anikó

    2015-01-01

    The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in Oracle Database Management System environment. The application has two parts, one is the database schema and its content, and the other is a C# application. The schema is to administrate and store the tasks and the…

  3. Handbook of video databases design and applications

    CERN Document Server

    Furht, Borko

    2003-01-01

    INTRODUCTION: Introduction to Video Databases (Oge Marques and Borko Furht). VIDEO MODELING AND REPRESENTATION: Modeling Video Using Input/Output Markov Models with Application to Multi-Modal Event Detection (Ashutosh Garg, Milind R. Naphade, and Thomas S. Huang); Statistical Models of Video Structure and Semantics (Nuno Vasconcelos); Flavor: A Language for Media Representation (Alexandros Eleftheriadis and Danny Hong); Integrating Domain Knowledge and Visual Evidence to Support Highlight Detection in Sports Videos (Juergen Assfalg, Marco Bertini, Carlo Colombo, and Alberto Del Bimbo); A Generic Event Model and Sports Vid

  4. Software Applications to Access Earth Science Data: Building an ECHO Client

    Science.gov (United States)

    Cohen, A.; Cechini, M.; Pilone, D.

    2010-12-01

    Historically, developing an ECHO (NASA's Earth Observing System (EOS) ClearingHOuse) client required interaction with its SOAP API. SOAP, as a framework for web service communication, has numerous advantages for enterprise applications and Java/C# type programming languages. However, as interest has grown in quick development cycles and more intriguing "mashups," ECHO has seen the SOAP API lose its appeal. In order to address these changing needs, ECHO has introduced two new interfaces facilitating simple access to its metadata holdings. The first interface is built upon the OpenSearch format and the ESIP Federated Search framework. The second interface is built upon the Representational State Transfer (REST) architecture. Using the REST and OpenSearch APIs to access ECHO makes development with modern languages much more feasible and simpler. Client developers can leverage the simple interaction with ECHO to focus more of their time on the advanced functionality they are presenting to users. To demonstrate the simplicity of developing with the REST API, participants will be led through a hands-on experience where they will develop an ECHO client that performs the following actions: login; provider discovery; provider-based dataset discovery; dataset, temporal, and spatial constraint based granule discovery; and online data access.
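The appeal of an OpenSearch/REST interface over SOAP is that a query is just a URL. The sketch below assembles such a granule-search URL; the endpoint and parameter names (`dataset_id`, `temporal`, `boundingBox`) are illustrative assumptions, not the actual ECHO API contract:

```python
from urllib.parse import urlencode

def build_granule_search_url(base_url, dataset_id, temporal=None, bbox=None,
                             page=1, page_size=20):
    """Assemble an OpenSearch-style granule query URL.

    Parameter names are hypothetical, chosen only to illustrate how temporal
    and spatial constraints become plain query-string parameters.
    """
    params = {"dataset_id": dataset_id, "page": page, "page_size": page_size}
    if temporal:
        params["temporal"] = ",".join(temporal)                 # ISO 8601 start,end
    if bbox:
        params["boundingBox"] = ",".join(str(c) for c in bbox)  # W,S,E,N
    return base_url + "?" + urlencode(params)

url = build_granule_search_url(
    "https://example.org/opensearch/granules",
    "MOD021KM",
    temporal=("2010-01-01T00:00:00Z", "2010-01-31T23:59:59Z"),
    bbox=(-180, -90, 180, 90),
)
```

Any HTTP library (or a browser) can then issue the request, which is exactly why this style suits quick development cycles better than SOAP envelopes.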

  5. Framework for Deploying Client/Server Distributed Database System for effective Human Resource Information Management Systems in Imo State Civil Service of Nigeria

    Directory of Open Access Journals (Sweden)

    Josiah Ahaiwe

    2012-08-01

    Full Text Available The information system is an integrated system that holds financial and personnel records of persons working in the various branches of the Imo state civil service. The purpose is to harmonize operations, reduce or if possible eliminate redundancy, and control the introduction of “ghost workers” and fraud in pension management. In this research work, an attempt is made to design a framework for deploying a client/server distributed database system for a human resource information management system, with a scope on the Imo state civil service in Nigeria. The system consists of a relational database of personnel variables which can be shared by various levels of management in all the ministries and their branches located all over the state. The server is expected to be hosted in the accountant general’s office. The system is capable of handling recruitment and promotion issues, training, monthly remunerations, pension and gratuity issues, employment history, etc.

  6. Database and applications security integrating information security and data management

    CERN Document Server

    Thuraisingham, Bhavani

    2005-01-01

    This is the first book to provide an in-depth coverage of all the developments, issues and challenges in secure databases and applications. It provides directions for data and application security, including securing emerging applications such as bioinformatics, stream information processing and peer-to-peer computing. Divided into eight sections, each of which focuses on a key concept of secure databases and applications, this book deals with all aspects of technology, including secure relational databases, inference problems, secure object databases, secure distributed databases and emerging

  7. NoSQL and SQL Databases for Mobile Applications. Case Study: MongoDB versus PostgreSQL

    Directory of Open Access Journals (Sweden)

    Marin FOTACHE

    2013-01-01

    Full Text Available Compared with "classical" multi-tier web applications, mobile applications have common and specific requirements concerning data persistence and processing. In mobile apps, database features can be analyzed separately for the client (minimalistic, isolated, memory-only) and the server (data-rich, centralized, distributed, synchronized and disk-based) layers. Currently, a few lite relational database products dominate persistence on the client platforms of mobile applications. This paper has two main objectives. The first is to investigate storage options for the major mobile platforms. The second is to point out some major differences between SQL and NoSQL datastores in terms of deployment, data model, schema design, data definition and manipulation. As the NoSQL movement lacks standardization, MongoDB was chosen as the reference from the NoSQL product family, due to its strengths and popularity among developers. PostgreSQL serves as the representative of SQL DBMSs due to its popularity and conformity with SQL standards.
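The schema-design difference between the two camps can be shown in a few lines: where a relational design splits an order and its line items across normalized tables, a document store such as MongoDB typically embeds the child rows in the parent document. A minimal, database-free sketch of that denormalization (field names invented for illustration):

```python
def rows_to_document(order_row, item_rows):
    """Embed child rows inside the parent record, document-store style.

    order_row: dict of columns from an "orders" table row.
    item_rows: list of dicts from an "order_items" table, already filtered
               to this order (the relational design would use a foreign key).
    """
    doc = dict(order_row)            # copy parent columns
    doc["items"] = [dict(r) for r in item_rows]   # embed instead of joining
    return doc

order = {"_id": 7, "client": "ana"}
items = [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]
doc = rows_to_document(order, items)
```

Reads that previously required a join become a single document fetch, at the cost of duplication and harder cross-document queries, which is the core trade-off the paper compares.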

  8. Web application for detailed real-time database transaction monitoring for CMS condition data

    Science.gov (United States)

    de Gruttola, Michele; Di Guida, Salvatore; Innocente, Vincenzo; Pierro, Antonio

    2012-12-01

    In the upcoming LHC era, databases have become an essential part of the experiments collecting data from the LHC, in order to safely store, and consistently retrieve, the wide amount of data produced by different sources. In the CMS experiment at CERN, all this information is stored in ORACLE databases hosted on several servers, both inside and outside the CERN network. In this scenario, monitoring the different databases is a crucial database administration issue, since different information may be required depending on users' tasks such as data transfer, inspection, planning and security. We present here a web application based on a Python web framework and Python modules for data mining. To customize the GUI we record traces of user interactions that are used to build use case models. In addition, the application detects errors in database transactions (for example, mistakes made by users, application failures, unexpected network shutdowns or Structured Query Language (SQL) statement errors) and provides warning messages from the different users' perspectives. Finally, in order to fulfill the requirements of the CMS experiment community, and to keep pace with developments in web client tools, our application was further developed and new features were deployed.
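The error-detection idea, catching failed statements and turning them into warnings instead of crashes, can be sketched in miniature. This is a generic illustration using SQLite, not the CMS/ORACLE tool itself:

```python
import sqlite3

warnings = []   # collected warning messages, one per failed statement

def guarded_execute(conn, sql, params=()):
    """Run a statement; on a SQL error, record a warning and keep going."""
    try:
        return conn.execute(sql, params)
    except sqlite3.Error as exc:
        warnings.append(f"{type(exc).__name__}: {exc}")
        return None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE conditions (run INTEGER, tag TEXT)")
guarded_execute(conn, "INSERT INTO conditions VALUES (?, ?)", (1, "beam"))
guarded_execute(conn, "INSERT INTO nonexistent VALUES (1)")   # bad statement
count = conn.execute("SELECT COUNT(*) FROM conditions").fetchone()[0]
```

A monitoring web application would surface the `warnings` list to the relevant users rather than printing it, but the interception point is the same.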

  9. Just-in-time Database-Driven Web Applications

    OpenAIRE

    Ong, Kenneth R

    2003-01-01

    "Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that compri...

  10. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Raied Salman

    2015-11-01

    Full Text Available In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain the organizations' databases. An organization's departments can be located at different sites connected by an intranet. In such an environment, maintaining database records becomes a complex task that needs to be resolved. In this paper an intranet application is designed and implemented using the object-oriented programming language Java and the object-relational database management system Oracle in a multithreaded operating system environment.
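The core concern of a multithreaded database application is serializing access to a shared connection so concurrent writers do not corrupt each other's work. A minimal sketch of that pattern (shown with Python threads and SQLite rather than the Java/Oracle stack the paper uses):

```python
import sqlite3
import threading

# One shared connection; SQLite connections are not thread-safe by default,
# so a lock serializes every statement.
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE records (worker INTEGER, n INTEGER)")
lock = threading.Lock()

def worker(worker_id, count):
    """Each department/thread inserts its own records through the shared link."""
    for n in range(count):
        with lock:
            conn.execute("INSERT INTO records VALUES (?, ?)", (worker_id, n))

threads = [threading.Thread(target=worker, args=(i, 10)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
total = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
```

A production design would more likely use a connection pool (one connection per thread) than a single locked connection, but the synchronization requirement is the same.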

  11. Database application platform for earthquake numerical simulation

    Institute of Scientific and Technical Information of China (English)

    LUO Yan; ZHENG Yue-jun; CHEN Lian-wang; LU Yuan-zhong; HUANG Zhong-xian

    2006-01-01

    Introduction: In recent years, all kinds of seismological observation networks have been established, and they continuously produce large volumes of digital information. In addition, there are many research results on 3D velocity structure models and tectonic models of the crust (Huang and Zhao, 2006; Huang et al, 2003; Li and Mooney, 1998), which are valuable for studying the inner structure of the earth and the earthquake preparation process. There is a pressing need to combine the observed data, experimental studies and theoretical analysis results by means of numerical simulation, and to develop a database and a corresponding application platform for numerical simulation; this is also a significant way to promote earthquake prediction.

  12. Techniques for multiple database integration

    OpenAIRE

    Whitaker, Barron D

    1997-01-01

    Approved for public release; distribution is unlimited. There are several graphical client/server application development tools which can be used to easily develop powerful relational database applications. However, they do not provide a direct means of performing queries which require relational joins across multiple database boundaries. This thesis studies ways to access multiple databases. Specifically, it examines how a 'cross-database join' can be performed. A case study of techniques us...
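One concrete way to perform a cross-database join, where the engine supports it, is to attach a second database file to an open connection so its tables become addressable in a single SQL statement. SQLite's `ATTACH DATABASE` does exactly this; the sketch below joins an employees table in one file against a salaries table in another (file and table names invented for illustration):

```python
import os
import sqlite3
import tempfile

# Two physically separate database files.
tmpdir = tempfile.mkdtemp()
hr_path = os.path.join(tmpdir, "hr.db")
pay_path = os.path.join(tmpdir, "payroll.db")

with sqlite3.connect(hr_path) as c:
    c.execute("CREATE TABLE employees (id INTEGER, name TEXT)")
    c.execute("INSERT INTO employees VALUES (1, 'okon')")
with sqlite3.connect(pay_path) as c:
    c.execute("CREATE TABLE salaries (emp_id INTEGER, amount REAL)")
    c.execute("INSERT INTO salaries VALUES (1, 5200.0)")

# Attach the second database, then join across the boundary in one query.
conn = sqlite3.connect(hr_path)
conn.execute("ATTACH DATABASE ? AS payroll", (pay_path,))
row = conn.execute(
    "SELECT e.name, s.amount "
    "FROM employees e JOIN payroll.salaries s ON s.emp_id = e.id"
).fetchone()
```

When the engines involved offer no such mechanism, the fallback is the application-level join the thesis studies: query each database separately and merge the result sets in client code.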

  13. Modular Workflow Engine for Distributed Services using Lightweight Java Clients

    CERN Document Server

    Vetter, R -M; Peetz, J -V

    2009-01-01

    In this article we introduce the concept and the first implementation of a lightweight client-server-framework as middleware for distributed computing. On the client side an installation without administrative rights or privileged ports can turn any computer into a worker node. Only a Java runtime environment and the JAR files comprising the workflow client are needed. To connect all clients to the engine one open server port is sufficient. The engine submits data to the clients and orchestrates their work by workflow descriptions from a central database. Clients request new task descriptions periodically, thus the system is robust against network failures. In the basic set-up, data up- and downloads are handled via HTTP communication with the server. The performance of the modular system could additionally be improved using dedicated file servers or distributed network file systems. We demonstrate the design features of the proposed engine in real-world applications from mechanical engineering. We have used ...
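The periodic-polling design that makes the system robust against network failures can be reduced to a small loop: each client repeatedly asks for a task description, works, and tolerates empty polls. A minimal single-process sketch, with an in-memory queue standing in for the engine's central database and HTTP transport:

```python
import queue
import threading

tasks = queue.Queue()   # stands in for the engine's central task store
results = {}

def workflow_client(poll_interval=0.01):
    """Poll for task descriptions until a shutdown sentinel (None) arrives."""
    while True:
        try:
            task = tasks.get(timeout=poll_interval)
        except queue.Empty:
            continue            # nothing to do; poll again (robust to idle gaps)
        if task is None:
            break               # orderly shutdown
        name, payload = task
        results[name] = payload * 2   # placeholder for the real computation

for i in range(5):
    tasks.put((f"job{i}", i))
tasks.put(None)

t = threading.Thread(target=workflow_client)
t.start()
t.join()
```

Because the client initiates every exchange, a dropped connection costs nothing but a retry on the next poll, which is the robustness property the article emphasizes.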

  14. DOMAIN-DRIVEN DESIGN APPLICATION AND IMPLEMENTATION OF INFORMATION SYSTEMS FOR CLIENTS QUEUING SUBJECT AREAS

    Directory of Open Access Journals (Sweden)

    P. P. Oleynik

    2015-11-01

    Full Text Available The paper deals with the applicability of domain-driven design to information systems for client-queuing subject areas. The following optimality criteria were put forward for the final implementation: the possibility of automating, with a single system, both a small institution and a whole network of institutions; an advanced graphical interface with support for touch screens; multi-user entry of client orders; a flexible application architecture with the ability of future enhancement; and the ability to integrate with a variety of peripherals. The necessity of each criterion is shown. To estimate implementability, a test information system automating a queuing system was designed. The unified modeling language UML is used. The functionality of each class and its associations with other classes are described. Attention is paid to the design of tree (hierarchical) structures and to the procedure for selecting base classes based on an analysis of existing common attributes. For the system implementation, the author's own development environment, SharpArchitect RAD Studio, is used, offering an MDA approach for implementing systems based on a standardized meta object system. A graphical view of the developed order form prototype is presented, its composition and structure are described, and a notation developed by the author is given that simplifies the prototyping process. Approaches to differentiating access rights for different user roles are shown. Conformity of the resulting implementation to each selected optimality criterion is determined. Recommendations for further system development are given.

  15. Application of a Database in the Monitoring of Workstations in a Local Area Network

    Directory of Open Access Journals (Sweden)

    Eyo O. Ukem

    2009-01-01

    Full Text Available Problem statement: Computer hardware fault management and repair can be a big challenge, especially if the number of staff available for the job is small. The task becomes more complicated if remote sites are managed and an engineer or technician has to be dispatched. Approach: The availability of relevant information when needed could ease the burden of maintenance by removing uncertainties. Such required information could be accumulated in a database and accessed as needed. Results: This study considered such a database, to assist a third-party hardware maintenance firm in keeping track of its operations, including the machines that it services, together with their owners. A software application was developed in the Java programming language, in the form of a database, using Microsoft Access as the database management system. It was designed to run on a local area network and to allow remote workstations to log on to a central computer in a client/server configuration. With this application it was possible to enter fault reports into the database residing on the central computer from any workstation on the network. Conclusion/Recommendations: The information generated from this data can be used by the third-party hardware maintenance firm to speed up its service delivery, thus putting the firm in a position to render more responsive and efficient service to its customers.

  16. Applicability of the ReproQ client experiences questionnaire for quality improvement in maternity care.

    Science.gov (United States)

    Scheerhagen, Marisja; van Stel, Henk F; Tholhuijsen, Dominique J C; Birnie, Erwin; Franx, Arie; Bonsel, Gouke J

    2016-01-01

    Background. The ReproQuestionnaire (ReproQ) measures the client's experience with maternity care, following the WHO responsiveness model. In 2015, the ReproQ was appointed as the national client experience questionnaire and will be added to the national list of indicators in maternity care. For use of the ReproQ in quality improvement, the questionnaire should be able to identify best and worst practices. To achieve this, the ReproQ should be reliable and able to identify relevant differences. Methods and Findings. We sent questionnaires to 17,867 women six weeks after labor (response 32%). Additionally, we invited 915 women for the retest (response 29%). Next we determined the test-retest reliability, the Minimally Important Difference (MID) and six known-group comparisons, using two scoring methods: the percentage of women with at least one negative experience, and the mean score. The reliability of both the percentage negative experience and the mean score was 'good' (absolute agreement = 79%; intraclass correlation coefficient = 0.78). The MID was 11% for the percentage negative and 0.15 for the mean score. Application of the MIDs revealed relevant differences in women's experience with regard to professional continuity, setting continuity and travel time. Conclusions. The measurement characteristics of the ReproQ support its use in the quality improvement cycle. Test-retest reliability was good, and the observed minimally important difference allows for discrimination of good and poor performers, also at the level of specific features of performance. PMID:27478690

  17. Oracle database design for e-commerce application

    OpenAIRE

    Lihvoinen, Anna

    2009-01-01

    Published in September 2009 at Haaga-Helia University of Applied Sciences. The purpose of this thesis is to design a database for an e-commerce application, which will be further implemented in Oracle Application Express (Apex) by Database Software Horizons. The design document includes ER diagrams, table descriptions, table source code, and testing results. Logical and physical database design for relational modeling methods is applied in this work. The result of the work is document...

  18. An integrated medical image database and retrieval system using a web application server.

    Science.gov (United States)

    Cao, Pengyu; Hashiba, Masao; Akazawa, Kouhei; Yamakawa, Tomoko; Matsuto, Takayuki

    2003-08-01

    We developed an Integrated Medical Image Database and Retrieval System (INIS) for easy access by medical staff. The INIS consists mainly of four parts: specific servers to save medical images from multi-vendor modalities (CT, MRI, CR, ECG and endoscopy); an integrated image database (DB) server to save the various kinds of images in DICOM format; a web application server to connect clients to the integrated image DB; and web browser terminals connected to the HIS. The INIS provides a common screen design for retrieving CT, MRI, CR, endoscopic and ECG images and radiological reports, which allows doctors to retrieve radiological images and corresponding reports, or ECG images, of a patient simultaneously on one screen. Doctors working in internal medicine accessed information on average 492 times a month. Doctors working in cardiology and gastroenterology accessed information 308 times a month. Using the INIS, medical staff could browse all or part of a patient's medical images and reports.

  19. SECURE REMOTE CLIENT AUTHENTICATION

    Directory of Open Access Journals (Sweden)

    K. Pradeep

    2010-10-01

    Full Text Available This paper discusses an application of secure remote client authentication. It presents smart cards and digital certification from third-party vendors; the smart cards are based on an algorithm that provides secure remote client authentication. These schemes vary significantly. In relation to today's security challenges, which include phishing, man-in-the-middle attacks and malicious software, secure remote client authentication plays a key role.

  20. SECURE REMOTE CLIENT AUTHENTICATION

    OpenAIRE

    K. Pradeep; R. Usha Rani; E. Ravi Kumar; K. Nikhila; Vijay Sankar

    2010-01-01

    This paper discusses an application of secure remote client authentication. It presents smart cards and digital certification from third-party vendors; the smart cards are based on an algorithm that provides secure remote client authentication. These schemes vary significantly. In relation to today's security challenges, which include phishing, man-in-the-middle attacks and malicious software, secure remote client authentication plays a key role.

  1. Heterogeneous Database integration for Web Applications

    Directory of Open Access Journals (Sweden)

    V. Rajeswari

    2009-11-01

    Full Text Available In the contemporary business and industrial environment, the variety of data used by organizations is increasing rapidly, as is the demand for accessing this data. The size, complexity and variety of the databases used for data handling cause serious problems in manipulating this distributed information. Integrating all the information from different databases into one database is a challenging problem. XML has been used in recent times to handle data in web applications. XML (eXtensible Markup Language) is a very open way of data communication, and it has become the undisputed standard both for data exchange and content management. XML is supported by the giants of the software industry such as IBM, Oracle and Microsoft. The XML markup language should be the lingua franca of data interchange, but its rate of acceptance has been limited by a mismatch between XML and legacy databases. This, in turn, has created a need for a mapping tool to integrate XML and databases. This paper highlights the merging of heterogeneous database resources, which can be achieved by converting a relational model to an XML schema and vice versa, and by adding semantic constraints to the XML schema. Recent industry developments in this field are taken as the basis.
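The relational-to-XML direction of such a mapping commonly follows one simple convention: one child element per row, one sub-element per column. A minimal sketch of that convention (table and column names invented for illustration):

```python
import xml.etree.ElementTree as ET

def table_to_xml(table_name, columns, rows):
    """Map a relational table to XML: <table><row><col>value</col>...</row>...</table>."""
    root = ET.Element(table_name)
    for row in rows:
        rec = ET.SubElement(root, "row")
        for col, val in zip(columns, row):
            ET.SubElement(rec, col).text = str(val)
    return root

root = table_to_xml("customers", ["id", "name"], [(1, "Rajeswari"), (2, "Iyer")])
xml_text = ET.tostring(root, encoding="unicode")
```

The reverse mapping, and the attachment of semantic constraints (keys, types) to the generated XML Schema, is where the real integration effort described in the paper lies; this sketch covers only the structural row-to-element step.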

  2. Bitcoin clients

    OpenAIRE

    Skudnov, Rostislav

    2012-01-01

    Bitcoin is a new decentralized electronic currency which gained popularity in the last two years. The usage of Bitcoin is facilitated by software commonly called Bitcoin clients. This thesis provides an overview of Bitcoin and cryptography behind it, discusses different types of Bitcoin clients and researches additional features implemented by them. It also analyzes further enhancements that can be made to clients and the Bitcoin protocol. Bitcoin clients are grouped into types and analyz...

  3. Field installation of a distributed database application

    International Nuclear Information System (INIS)

    A PC-based equipment failure reporting system was designed as a distributed database over a Wide Area Network. Speed-related communication problems were encountered, necessitating a redesign of the program. In the original design, data for local failures was to reside in several local field offices. Division-level reporting was to be accomplished by concatenating the field offices' databases over the network. Validation information was to reside in several small databases located on the division file server; this data was to be accessed by the field offices as a remote link. The program executables were to reside on yet another file server belonging to the service organization which maintained the program code. Both the field and division offices were to access the program code as a remote link. This paper discusses the network and communications facilities which were in place, the original design, the performance encountered, the design changes and the present performance. There is also a brief discussion of possible future modifications.

  4. The Application of the Thin-Client/Server Computing Model in the Community Library

    Institute of Scientific and Technical Information of China (English)

    赵秀丽; 杨静; 马爱华; 秦梅素

    2003-01-01

    This paper mainly describes the application of the Thin-Client/Server computing model in building an electronic reading room in a community library, explains the concept, working mode and technical characteristics of the Thin-Client/Server computing model, and looks ahead to the prospects for applying the Thin-Client/Server computing model in the future development of community libraries.

  5. Application of Windows Socket Technique to Communication Process of the Train Diagram Network System Based on Client/Server Structure

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper focuses on techniques for the design and realization of process communications in a computer-aided train diagram network system. The Windows Socket technique is adopted to program the client and the server, to create the system applications and to solve the problems of data transfer and data sharing in the system.
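The client/server socket exchange at the heart of such a system follows the standard pattern: the server binds, listens and accepts; the client connects and sends; the server replies. A minimal, self-contained sketch using Berkeley-style sockets (the BSD socket API that Windows Sockets mirrors), with an invented one-line message format:

```python
import socket
import threading

def serve_once(server_sock):
    """Accept one client, receive a message, acknowledge it."""
    conn, _addr = server_sock.accept()
    with conn:
        data = conn.recv(1024)           # e.g. a train-diagram update
        conn.sendall(b"ACK:" + data)     # acknowledge back to the client

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))            # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,)).start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"train 42 departs 08:15")
reply = client.recv(1024)
client.close()
server.close()
```

A real train-diagram system would frame messages (length prefixes or delimiters) and keep connections open for repeated updates, but the accept/connect/send/recv skeleton is the same.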

  6. The Application of Genetic Algorithms and Multidimensional Distinguishing Model in Forecasting and Evaluating Credits for Mobile Clients

    Institute of Scientific and Technical Information of China (English)

    Li Zhan; Xu Ji-sheng

    2003-01-01

    To solve the arrearage problem that puzzles most mobile corporations, we propose an approach to forecast and evaluate the credit of mobile clients, devising a method that combines a genetic algorithm with a multidimensional distinguishing model. At the end of this paper, the result of a test application in the Zhuhai Branch, GMCC, is provided. The precision of the forecasting and evaluation of the clients' credit is near 90%. This study is very significant to mobile communication corporations at all levels. The popularization of the techniques and the result would produce great benefits for both society and the economy.
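A genetic algorithm of the kind combined here with a distinguishing model evolves candidate solutions through selection, crossover and mutation. The toy sketch below evolves bitstrings against a stand-in fitness function (in the paper's setting, fitness would come from the multidimensional distinguishing model's classification accuracy, which is not reproduced here):

```python
import random

random.seed(1)

def fitness(bits):
    """Toy stand-in for a credit-scoring fitness (count of 1-bits)."""
    return sum(bits)

def evolve(pop_size=20, length=16, generations=30):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # elitist selection: keep top half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]            # one-point crossover
            if random.random() < 0.1:            # occasional mutation
                i = random.randrange(length)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In the credit-evaluation application, each chromosome would instead encode the weights or thresholds of the distinguishing model, and the GA would search for the encoding that best separates reliable from defaulting clients.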

  7. The Application of Genetic Algorithms and Multidimensional Distinguishing Model in Forecasting and Evaluating Credits for Mobile Clients

    Institute of Scientific and Technical Information of China (English)

    Li Zhan; Xu Ji-sheng

    2003-01-01

    To solve the arrearage problem that puzzled most of the mobile corporations, we propose an approach to forecast and evaluate the credits for mobile clients, devising a method that is of the coalescence of genetic algorithm and multidimensional distinguishing model. In the end of this paper, a result of a testing application in Zhuhai Branch, GMCC was provided. The precision of the forecasting and evaluation of the client's credit is near 90%. This study is very significant to the mobile communication corporation at all levels. The popularization of the techniques and the result would produce great benefits of both society and economy.

  8. A thermodynamic database for geophysical applications

    Science.gov (United States)

    Saxena, S. K.

    2013-12-01

    Several thermodynamic databases are available for calculating the equilibrium reference state of the model earth. Prominent among these are the databases of (a) SLB (1), (b) HP (2) and (c) FSPW (3). The two major problems, as discussed in a meeting of the database scientists (4), lie in the formulation of solid solutions and equations of state. The models adopted in databases (1) and (2) do not account for multi-components in natural solids, and the sub-lattice or compound-energy models used in (3) require a lot of fictive compound and mixing energy data for which there is at present no ongoing effort. The EOS formulation in (1) is based on the Mie-Gruneisen equation of state and in (2) on a modification of the Tait EOS with limited parameters. Database (3) adopted the Birch-Murnaghan EOS and used it for high temperature by making compressibility a function of temperature. The models in (2) and (3) lead to physically unacceptable values of entropy and heat capacity at extreme conditions. The problem is as much associated with the EOS formulation as with the adoption of a heat capacity variation with temperature at 1 bar, as discussed by Brosh (5). None of the databases (1), (2) or (3) includes a database on multicomponent fluid at extreme conditions. These problems have been addressed in the new database modified after (3). It retains the solution models for solids as in (3) and adds the Brosh model (5) for solid solutions and the Belonoshko et al (6) model for the 13-component C-H-O-S fluid. The Superfluid model builds on the combination of experimental data on pure and mixed fluids at temperatures lower than 1000 K over several kilobars and molecular-dynamics-generated data at extreme conditions, and has been found to be consistent with all the recent experimental data. 
New high pressure experiments on dissociation of volatile containing solids using laser- and externally-heated DAC are being conducted to obtain new pressure-volume-temperature data on fluids to extend the current kb

  9. Application of Integrated Database to the Casting Design

    Institute of Scientific and Technical Information of China (English)

    In-Sung Cho; Seung-Mok Yoo; Chae-Ho Lim; Jeong-Kil Choi

    2008-01-01

    Construction of an integrated database including casting shapes with their casting designs, technical knowledge, and thermophysical properties of the casting alloys is introduced in the present study. A recognition technique for casting design by industrial computed tomography was used for the construction of the shape database. Technical knowledge of the casting processes, for ferrous and non-ferrous alloys and the manufacturing processes of the castings, was accumulated, and a search engine for the knowledge was developed. A database of thermophysical properties of the casting alloys was obtained via experimental study, and the properties were used for the in-house computer simulation of the casting process. The databases were linked with the intelligent casting expert system developed at the Center for e-Design, KITECH. It is expected that the databases can help non-experts devise castings and their processes. Various examples of application using the databases are shown in the present study.

  10. Evolution and applications of plant pathway resources and databases

    DEFF Research Database (Denmark)

    Sucaet, Yves; Deva, Taru

    2011-01-01

    Plants are important sources of food and plant products are essential for modern human life. Plants are increasingly gaining importance as drug and fuel resources, bioremediation tools and as tools for recombinant technology. Considering these applications, database infrastructure for plant model systems deserves much more attention. Study of plant biological pathways, the interconnection between these pathways and plant systems biology on the whole has in general lagged behind human systems biology. In this article we review plant pathway databases and the resources that are currently available. We lay out trends and challenges in the ongoing efforts to integrate plant pathway databases and the applications of database integration. We also discuss how progress in non-plant communities can serve as an example for the improvement of the plant pathway database landscape.

  11. Klaim-DB: A Modeling Language for Distributed Database Applications

    DEFF Research Database (Denmark)

    Wu, Xi; Li, Ximeng; Lluch Lafuente, Alberto;

    2015-01-01

    We present the modelling language Klaim-DB for distributed database applications. Klaim-DB borrows the distributed nets of the coordination language Klaim but essentially re-incarnates the tuple spaces of Klaim as databases, and provides high-level language abstractions for the access and manipulation of structured data, with integrity and atomicity considerations. We present the formal semantics of Klaim-DB and illustrate the use of the language in a scenario where the sales from different branches of a chain of department stores are aggregated from their local databases.

  12. Client-Side Data Processing and Training for Multispectral Imagery Applications in the GOES-R Era

    Science.gov (United States)

    Fuell, Kevin; Gravelle, Chad; Burks, Jason; Berndt, Emily; Schultz, Lori; Molthan, Andrew; Leroy, Anita

    2016-01-01

    RGB imagery can be created locally (i.e., client-side) from single-band imagery already on the system, with little impact given the recommended change to the texture cache in AWIPS II. Training and reference material accessible to forecasters within their operational display system improves RGB interpretation and application, as demonstrated at the OPG. Application examples from experienced forecasters are needed to support the larger community's use of RGB imagery, and these can be integrated into the user's display system.

  13. Database security and encryption technology research and application

    Science.gov (United States)

    Zhu, Li-juan

    2013-03-01

    The main purpose of this paper is to discuss the current problem of database information leakage and the important role played by message encryption techniques in database security, as well as the principle of MD5 encryption technology and its use in websites and applications. The article is divided into an introduction, an overview of MD5 encryption technology, the use of MD5 encryption technology, and a final summary. In terms of requirements and application, this paper gives readers a more detailed and clear understanding of the principle of MD5 encryption technology, its importance to database security, and its use.
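    The MD5 digest step the abstract describes can be sketched with Python's standard `hashlib`. Note that MD5 is cryptographically broken; a salted, deliberately slow hash is the usual recommendation today, so a PBKDF2 variant is shown alongside for comparison (the password and salt values are invented).

```python
import hashlib

password = b"s3cret"

# Classic MD5 digest as discussed in the abstract: a 32-hex-character hash.
md5_digest = hashlib.md5(password).hexdigest()

# A safer alternative for stored credentials: salted PBKDF2 with many rounds.
salt = b"per-user-random-salt"   # in practice, random bytes per user
strong = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000).hex()
```

    Storing only the digest (never the plaintext) is what limits the damage of the information-leakage problem the paper discusses.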

  14. Thin-Client/Server架构在图书馆中的应用%Application of Thin- Client/Server in Library

    Institute of Scientific and Technical Information of China (English)

    陈春芳

    2005-01-01

    Drawing on the practical application of the Thin-Client/Server architecture in a library information system, this paper analyzes the technical requirements of library automation and the advantages and disadvantages of the Thin-Client/Server architecture. Managing terminal server operation, client device usage and network connections with these strengths and weaknesses in mind helps to further improve the usefulness of the Thin-Client/Server architecture in libraries.

  15. Database applications in high energy physics

    International Nuclear Information System (INIS)

    High Energy physicists were using computers to process and store their data early in the history of computing. They addressed problems of memory management, job control, job generation, data standards, file conventions, multiple simultaneous usage, tape file handling and data management earlier than, or at the same time as, the manufacturers of computing equipment. The HEP community have their own suites of programs for these functions, and are now turning their attention to the possibility of replacing some of the functional components of their 'homebrew' systems with more widely used software and/or hardware. High on the 'shopping list' for replacement is data management. ECFA Working Group 11 has been working on this problem. This paper reviews the characteristics of existing HEP systems and existing database systems and discusses the way forward. (orig.)

  16. Analysis of Turbulence Datasets using a Database Cluster: Requirements, Design, and Sample Applications

    Science.gov (United States)

    Meneveau, Charles

    2007-11-01

    The massive datasets now generated by Direct Numerical Simulations (DNS) of turbulent flows create serious new challenges. During a simulation, DNS provides only a few time steps at any instant, owing to storage limitations within the computational cluster. Therefore, traditional numerical experiments done during the simulation examine each time slice only a few times before discarding it. Conversely, if a few large datasets from high-resolution simulations are stored, they are practically inaccessible to most in the turbulence research community, who lack the cyber resources to handle the massive amounts of data. Even those who can compute at that scale must run simulations again forward in time in order to answer new questions about the dynamics, duplicating computational effort. The result is that most turbulence datasets are vastly underutilized and not available as they should be for creative experimentation. In this presentation, we discuss the desired features and requirements of a turbulence database that will enable its widest access to the research community. The guiding principle of large databases is "move the program to the data" (Szalay et al. "Designing and mining multi-terabyte Astronomy archives: the Sloan Digital Sky Survey," in ACM SIGMOD, 2000). However, in the case of turbulence research, the questions and analysis techniques are highly specific to the client and vary widely from one client to another. This poses particularly hard challenges in the design of database analysis tools. We propose a minimal set of such tools that are of general utility across various applications. And, we describe a new approach based on a Web services interface that allows a client to access the data in a user-friendly fashion while allowing maximum flexibility to execute desired analysis tasks. Sample applications will be discussed. 
This work is performed by the interdisciplinary ITR group, consisting of the author and Yi Li(1), Eric Perlman(2), Minping Wan(1
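    The "move the program to the data" principle quoted above can be sketched with an in-memory SQLite stand-in: an aggregate evaluated inside the database engine ships one scalar to the client, while the client-side equivalent must fetch every row first. The table of "velocity samples" is invented toy data, not the actual turbulence archive.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE samples (t REAL, u REAL)")
db.executemany("INSERT INTO samples VALUES (?, ?)",
               [(i * 0.1, float(i % 5)) for i in range(1000)])

# Server-side computation: only one scalar crosses the "wire"...
(mean_in_db,) = db.execute("SELECT AVG(u) FROM samples").fetchone()

# ...versus client-side computation over all fetched rows.
rows = db.execute("SELECT u FROM samples").fetchall()
mean_on_client = sum(u for (u,) in rows) / len(rows)
```

    For a multi-terabyte DNS archive the second path is exactly what the authors argue the community cannot afford; pushing analysis into the database is the scalable route.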

  17. Application of the device database in the Python programming

    International Nuclear Information System (INIS)

    The Device Database has been developed using the relational database in the KEKB accelerator control system. It contains many kinds of parameters of the devices, mainly magnets and magnet power supplies. The parameters consist of the wiring information, the addresses of the interfaces, the specification of the hardware, the calibration constants, the magnetic field excitation functions and any other parameters needed for device control. These parameters are necessary not only for constructing the EPICS IOC database but also for providing information to the high-level application programs, most of which are written in script languages such as SAD or Python. Python in particular is often used to access the Device Database. For this purpose, a Python library module designed to handle tabular data of the relational database in memory has been developed. An overview of the library module is reported. (author)
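    The idea of a Python helper over tabular device parameters can be sketched with the standard-library `sqlite3` module as the in-memory relational store. The table, column names and device records below are invented illustrations, not the actual KEKB schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE magnet (
    name TEXT PRIMARY KEY,
    address TEXT,          -- interface address
    calibration REAL       -- calibration constant
)""")
conn.executemany("INSERT INTO magnet VALUES (?, ?, ?)",
                 [("QF1", "crate01/slot3", 1.002),
                  ("QD2", "crate01/slot4", 0.998)])

def lookup(name):
    """Return the stored parameters for one device, or None if unknown."""
    cur = conn.execute(
        "SELECT address, calibration FROM magnet WHERE name = ?", (name,))
    return cur.fetchone()
```

    A high-level script would call `lookup("QF1")` instead of issuing SQL itself, which is the convenience such a library module provides.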

  18. Databases

    Data.gov (United States)

    National Aeronautics and Space Administration — The databases of computational and experimental data from the first Aeroelastic Prediction Workshop are located here. The database file names tell their contents...

  19. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  20. Registry of EPA Applications, Models, and Databases

    Data.gov (United States)

    U.S. Environmental Protection Agency — READ is EPA's authoritative source for information about Agency information resources, including applications/systems, datasets and models. READ is one component of...

  1. Patterns of client behavior with their most recent male escort: an application of latent class analysis.

    Science.gov (United States)

    Grov, Christian; Starks, Tyrel J; Wolff, Margaret; Smith, Michael D; Koken, Juline A; Parsons, Jeffrey T

    2015-05-01

    Research examining interactions between male escorts and clients has relied heavily on data from escorts, men working on the street, and behavioral data aggregated over time. In the current study, 495 clients of male escorts answered questions about sexual behavior with their last hire. Latent class analysis identified four client sets based on these variables. The largest (n = 200, 40.4 %, labeled Typical Escort Encounter) included men endorsing behavior prior research found typical of paid encounters (e.g., oral sex and kissing). The second largest class (n = 157, 31.7 %, Typical Escort Encounter + Erotic Touching) included men reporting similar behaviors, but with greater variety along a spectrum of touching (e.g., mutual masturbation and body worship). Those classed BD/SM and Kink (n = 76, 15.4 %) reported activity along the kink spectrum (BD/SM and role play). Finally, men classed Erotic Massage Encounters (n = 58, 11.7 %) primarily engaged in erotic touch. Clients reporting condomless anal sex were in the minority (12.2 % overall). Escorts who engage in anal sex with clients might be appropriate to train in HIV prevention and other harm reduction practices, adopting the perspective of "sex workers as sex educators." PMID:24777440

  2. A relational database application in support of integrated neuroscience research.

    Science.gov (United States)

    Rudowsky, Ira; Kulyba, Olga; Kunin, Mikhail; Ogarodnikov, Dmitri; Raphan, Theodore

    2004-12-01

    The development of relational databases has significantly improved the performance of storage, search, and retrieval functions and has made it possible for applications that perform real-time data acquisition and analysis to interact with these types of databases. The purpose of this research was to develop a user interface for interaction between a data acquisition and analysis application and a relational database using the Oracle9i system. The overall system was designed to have an indexing capability that threads into the data acquisition and analysis programs. Tables were designed and relations within the database for indexing the files and information contained within the files were established. The system provides retrieval capabilities over a broad range of media, including analog, event, and video data types. The system's ability to interact with a data capturing program at the time of the experiment to create both multimedia files as well as the meta-data entries in the relational database avoids manual entries in the database and ensures data integrity and completeness for further interaction with the data by analysis applications. PMID:15657974
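    The indexing-and-retrieval idea described above (files of several media types catalogued so analysis programs can find them later) can be sketched with `sqlite3` standing in for Oracle9i. The schema, experiment names and file paths are invented placeholders, not the paper's actual tables.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE recording (
    id INTEGER PRIMARY KEY,
    experiment TEXT,
    media TEXT,      -- 'analog', 'event' or 'video'
    path TEXT
)""")
db.executemany(
    "INSERT INTO recording (experiment, media, path) VALUES (?, ?, ?)",
    [("exp42", "analog", "/data/exp42/eye.raw"),
     ("exp42", "video", "/data/exp42/head.avi"),
     ("exp43", "event", "/data/exp43/events.log")])

def files_for(experiment):
    """All files recorded for one experiment, keyed by media type."""
    cur = db.execute(
        "SELECT media, path FROM recording WHERE experiment = ? ORDER BY media",
        (experiment,))
    return dict(cur.fetchall())
```

    Writing these rows at acquisition time, as the paper advocates, is what avoids manual entry and keeps the index complete.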

  3. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  4. Application of the Non—Stationary Oil Film Force Database

    Institute of Scientific and Technical Information of China (English)

    WANG Wen; ZHANG Zhi-ming; et al

    2001-01-01

    The technique of a non-stationary oil film force database for hydrodynamic bearings is introduced and its potential applications in nonlinear rotor dynamics are demonstrated. Through simulations of the locus of the shaft center aided by the database technique, nonlinear stability analysis can be performed and the natural frequency can be obtained as well. The ease of "assembling" the individual bush forces from the database to form the bearing force makes it very convenient to evaluate the stability of various types of journal bearings. Examples are demonstrated to show how the database technique makes it possible to obtain technically rich simulation results at the expense of very short calculation times.
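    The database idea above, pre-computed forces stored on a grid and later looked up instead of re-solving the oil film equations, can be sketched with a one-dimensional table and linear interpolation. The grid and force values below are made-up placeholders; a real database would be multi-dimensional in the journal's position and velocity.

```python
import bisect

eccentricity = [0.0, 0.2, 0.4, 0.6, 0.8]    # grid of journal positions
force        = [0.0, 1.1, 2.9, 6.4, 15.2]   # pre-computed bush forces (toy)

def bearing_force(e):
    """Linearly interpolate the pre-computed force table at eccentricity e."""
    i = bisect.bisect_right(eccentricity, e) - 1
    i = max(0, min(i, len(eccentricity) - 2))   # clamp to valid segment
    x0, x1 = eccentricity[i], eccentricity[i + 1]
    f0, f1 = force[i], force[i + 1]
    return f0 + (f1 - f0) * (e - x0) / (x1 - x0)
```

    Each step of a shaft-locus simulation then costs a table lookup rather than a full hydrodynamic solve, which is where the "short calculation time" of the technique comes from.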

  5. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working, as the available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even if clients are geographically distributed, if data copies are located close to the clients. Despite its advantages, replication is not a straightforward technique to apply, and
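    The two benefits named above, load distribution and take-over after a replica failure, can be sketched with a round-robin selector over replica records. The replicas here are plain dicts with an invented health flag; a real system would probe connections instead.

```python
import itertools

replicas = [{"name": "r1", "up": True},
            {"name": "r2", "up": True},
            {"name": "r3", "up": True}]

cycle = itertools.cycle(range(len(replicas)))

def pick_replica():
    """Return the next healthy replica name, skipping failed ones."""
    for _ in range(len(replicas)):
        r = replicas[next(cycle)]
        if r["up"]:
            return r["name"]
    raise RuntimeError("no replicas available")

served = [pick_replica() for _ in range(4)]   # load spreads across replicas
replicas[1]["up"] = False                     # r2 fails...
after_failure = {pick_replica() for _ in range(4)}   # ...survivors take over
```

    This only covers read routing; keeping the replicas' contents consistent under writes is the hard part the abstract alludes to.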

  6. Molten salts database for energy applications

    CERN Document Server

    Serrano-López, Roberto; Cuesta-López, Santiago

    2013-01-01

    The growing interest in energy applications of molten salts is justified by several of their properties. Their possible use as a coolant, heat transfer fluid or heat storage substrate requires refined thermo-hydrodynamic calculations. Many researchers use simulation techniques, such as Computational Fluid Dynamics (CFD), for their projects or conceptual designs. The aim of this work is to provide a review of the basic properties (density, viscosity, thermal conductivity and heat capacity) of the most common and most frequently cited salt mixtures. After checking the data, tabulated and graphical outputs are given in order to offer the most suitable available values to be used as input parameters for other calculations or simulations. The reviewed values show a general scatter in characterization, mainly in thermal properties. This disagreement suggests that, in several cases, new studies must be started (and even new measurement techniques should be developed) to obtain accurate values.

  7. A Green's function database platform for seismological research and education: applications and examples

    Science.gov (United States)

    Heimann, Sebastian; Kriegerowski, Marius; Dahm, Torsten; Simone, Cesca; Wang, Rongjiang

    2016-04-01

    The study of seismic sources from measured waveforms requires synthetic elementary seismograms (Green's functions, GF) calculated for specific earth models and source-receiver geometries. Since the calculation of GFs is computationally expensive and requires careful parameter testing and quality control, pre-calculated GF databases, which can be re-used for different types of applications, can be of advantage. We developed a GF database web platform for the seismological community (http://kinherd.org/), where a researcher can share Green's function stores and retrieve synthetic seismograms on the fly for various point and extended earthquake source models for many different earth models at local, regional and global scales. This web service is part of a rich new toolset for the creation and handling of Green's functions and synthetic seismograms (http://emolch.github.com/pyrocko/gf). It can be used off-line or in client mode. We demonstrate core features of the GF platform with different applications on global, regional and local scales. These include the automatic inversion of kinematic source parameters from teleseismic body waves, the improved depth estimation of shallow induced earthquakes from regional seismological arrays, and the relative moment tensor inversion of local earthquakes from volcanically induced seismicity.

  8. [The application of graphic (visual) databases in neurology and neuropathology].

    Science.gov (United States)

    Lechowicz, W; Milewska, D; Swiderski, W; Dymecki, J

    1994-01-01

    The possibilities and principles of creating visual databases in the Windows environment are described; such databases could be used to develop a multimedia encyclopedia of selected nosological entities. They are particularly important in education, making it possible to find data and compare them. The method of organizing "files" using the standard programs of the Windows package, and the method of applying the image-coding technique in the form of symbolic icons, are presented. Examples are given of graphic bases of neurological and neuropathological data developed in the Windows environment using professional application programs (framing and retouching of images in graphic editors, coding of icons, macroinstructions for reading the information entered into the visual database) for the analysed cases. PMID:8065541

  9. An Application to WIN/ISIS Database on Local Network

    Directory of Open Access Journals (Sweden)

    Robert Lechien

    2005-07-01

    Full Text Available A translated article presenting an application of the WIN/ISIS database on a local network. It starts with the main definitions, then shows how to install WIN/ISIS on a PC and how to install it on the local network server.

  10. Development of Integrated PSA Database and Application Technology

    International Nuclear Information System (INIS)

    The high quality of PSA is essential for risk-informed regulation and applications. The main elements of PSA are the model, the methodology, the reliability data, and the tools. The purpose of the project is to develop a reliability database for the Korean nuclear power plants and a PSA analysis and management system. The reliability database system has been developed and reliability data have been collected for four types of data: reactor trip, piping, component and common cause failure. The database provides reliability data for PSAs and risk-informed applications. The FTREX software is the fastest PSA quantification engine in the world. A license agreement between KAERI and EPRI was made to sell FTREX to the members of EPRI. The advanced PSA management system AIMS-PSA has been developed. The PSA model is stored in the database and solved by clicking one button. All the information necessary for the KSNP Level-1 and 2 PSA is stored in the PSA information database. It provides PSA users a useful means to review and analyze the PSA

  11. Current research status, databases and application of single nucleotide polymorphism.

    Science.gov (United States)

    Javed, R; Mukesh

    2010-07-01

    Single Nucleotide Polymorphisms (SNPs) are the most frequent form of DNA variation in the genome. SNPs are genetic markers which are bi-allelic in nature, and their catalogues grow at a very fast rate. Current genomic databases contain information on several million SNPs. More than 6 million SNPs have been identified, and the information is publicly available through the efforts of the SNP Consortium and other databases. The NCBI plays a major role in facilitating the identification and cataloging of SNPs through the creation and maintenance of the public SNP database (dbSNP) by the worldwide biomedical community, and stimulates many areas of biological research, including the identification of the genetic components of disease. In this review article, we compile the existing SNP databases, their research status and their applications. PMID:21717869

  12. An integrated medical image database and retrieval system using a web application server.

    Science.gov (United States)

    Cao, Pengyu; Hashiba, Masao; Akazawa, Kouhei; Yamakawa, Tomoko; Matsuto, Takayuki

    2003-08-01

    We developed an Integrated Medical Image Database and Retrieval System (INIS) for easy access by medical staff. The INIS consisted mainly of four parts: specific servers to save medical images from multi-vendor modalities of CT, MRI, CR, ECG and endoscopy; an integrated image database (DB) server to save various kinds of images in DICOM format; a Web application server to connect clients to the integrated image DB; and Web browser terminals connected to an HIS system. The INIS provided a common screen design to retrieve CT, MRI, CR, endoscopic and ECG images and radiological reports, allowing doctors to retrieve radiological images and corresponding reports, or the ECG images of a patient, simultaneously on one screen. Doctors working in internal medicine accessed information on average 492 times a month. Doctors working in cardiology and gastroenterology accessed information 308 times a month. Using the INIS, medical staff could browse all or parts of a patient's medical images and reports. PMID:12909158

  13. The Network Configuration of an Object Relational Database Management System

    Science.gov (United States)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  14. Advancements in web-database applications for rabies surveillance

    OpenAIRE

    Bélanger Denise; Coté Nathalie; Gendron Bruno; Lelièvre Frédérick; Rees Erin E

    2011-01-01

    Abstract Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among pa...

  15. Server-side verification of client behavior in cryptographic protocols

    OpenAIRE

    Chi, Andrew; Cochran, Robert; Nesfield, Marie; Reiter, Michael K.; Sturton, Cynthia

    2016-01-01

    Numerous exploits of client-server protocols and applications involve modifying clients to behave in ways that untampered clients would not, such as crafting malicious packets. In this paper, we demonstrate practical verification of a cryptographic protocol client's messaging behavior as being consistent with the client program it is believed to be running. Moreover, we accomplish this without modifying the client in any way, and without knowing all of the client-side inputs driving its behav...

  16. 顾及语义差异的基础地理信息客户数据库更新实施模型%Fundamental Geo-information Client Database Updating Model Considering Semantic Heterogeneities

    Institute of Scientific and Technical Information of China (English)

    王育红; 牛亚辉; 林艳

    2011-01-01

    Client database updating refers to the process of utilizing the information about changed and updated features in the new version of a fundamental geographic information database to perform the corresponding cascade updates on a client database, ensuring that it too remains current. Current research emphasizes the distribution and delivery of updating information, but how to efficiently implement the updating of the client database, especially in the case of semantic heterogeneity, has not been fully considered. Aiming at this problem, the semantic heterogeneities between the two databases are first summarized, and their impacts on the updating implementation process are described in terms of efficiency and the completeness, consistency and correctness of the data. Finally, an updating implementation model consisting of three basic operations, semantic matching, updates extraction and updates integration, is proposed according to the theory of semantic mapping and transformation, and the execution strategies and key steps of these operations are discussed. Through this analysis, the concept of fundamental geo-information client database updating, its implementation requirements and technical difficulties, the corresponding solutions, and the key problems requiring further research are made clear. Based on current work and future research, an automated and efficient software tool for client database updating can be developed.
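    The three operations named in the abstract can be sketched on toy feature dictionaries. The feature IDs and field names ("rd_name" vs "road_name", standing in for the semantic heterogeneity between master and client schemas) are invented for illustration.

```python
# Two versions of the master database and one client database (toy data).
master_new = {101: {"rd_name": "Main St", "lanes": 4},
              102: {"rd_name": "Oak Ave", "lanes": 2}}
master_old = {101: {"rd_name": "Main St", "lanes": 2},
              102: {"rd_name": "Oak Ave", "lanes": 2}}
client_db  = {101: {"road_name": "Main St", "lanes": 2},
              102: {"road_name": "Oak Ave", "lanes": 2}}

# 1. Semantic matching: map master fields onto the client's field names.
field_map = {"rd_name": "road_name", "lanes": "lanes"}

# 2. Updates extraction: find features whose attributes changed.
changed = {fid: attrs for fid, attrs in master_new.items()
           if attrs != master_old.get(fid)}

# 3. Updates integration: translate the changes and apply them cascadingly.
for fid, attrs in changed.items():
    client_db[fid] = {field_map[k]: v for k, v in attrs.items()}
```

    Real implementations must also handle split/merged features and schema fields with no counterpart, which is exactly where the semantic heterogeneities the paper catalogues bite.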

  17. Advancements in web-database applications for rabies surveillance

    Directory of Open Access Journals (Sweden)

    Bélanger Denise

    2011-08-01

    Full Text Available Abstract Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. Results RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements of RageDB over previous rabies surveillance databases include (1) automatic integration of multi-agency data and diagnostic results on a daily basis; (2) a web-based data editing interface that enables authorized users to add, edit and extract data; and (3) an interactive dashboard to help visualize data simply and efficiently in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies-suspect animals. We also discuss how sightings data can indicate public perception of the risk of raccoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. Conclusions RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from

  18. Design and Test of Application-Specific Integrated Circuits by use of Mobile Clients

    Directory of Open Access Journals (Sweden)

    Michael Auer

    2009-02-01

    Full Text Available The aim of this work is to develop a simultaneous multi-user access system, READ (Remote ASIC Design and Test), that allows users to perform tests and measurements remotely via clients running on mobile devices as well as on standard PCs. The system also facilitates the remote design of circuits with the PAC-Designer. The system is controlled by LabVIEW and was implemented using a data acquisition card from National Instruments. Such systems are especially suited for manufacturing process monitoring and control. The performance of simultaneous access was tested under load with a variable number of users. The server implements a queue that processes users' commands upon request.
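    The server-side command queue described above can be sketched with Python's `queue` and `threading` modules (rather than the paper's LabVIEW): commands from simultaneous users are serialized through one queue and processed by a single worker, so the instrument is never driven by two users at once. The user names and command strings are invented.

```python
import queue
import threading

commands = queue.Queue()
results = {}

def worker():
    """Process queued (user, command) pairs one at a time."""
    while True:
        user, cmd = commands.get()
        if cmd == "STOP":
            break
        results[user] = f"done:{cmd}"     # stand-in for a real measurement
        commands.task_done()

t = threading.Thread(target=worker)
t.start()

# Two "simultaneous" users enqueue their measurement requests.
for user, cmd in [("alice", "measure_vdd"), ("bob", "measure_iddq")]:
    commands.put((user, cmd))
commands.join()                # wait until both commands are processed
commands.put(("", "STOP"))     # shut the worker down
t.join()
```

    The queue gives first-come, first-served fairness; a production system would add per-user quotas and time-outs on top.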

  19. Design and implementation of a cartographic client application for mobile devices using SVG Tiny and J2ME

    Science.gov (United States)

    Hui, L.; Behr, F.-J.; Schröder, D.

    2006-10-01

    The dissemination of digital geospatial data now extends to mobile devices such as PDAs (personal digital assistants) and smart phones. Mobile devices that support J2ME (Java 2 Micro Edition) offer users and developers an open interface, which they can use to develop or download software according to their own needs. Currently a WMS (Web Map Service) can deliver not only traditional raster images but also vector images. SVGT (Scalable Vector Graphics Tiny) is a subset of SVG (Scalable Vector Graphics); because of its precise vector information, original styling and small file size, the SVGT format is well suited to cartographic purposes, especially on mobile devices with limited network bandwidth. This paper describes the development of a cartographic client for mobile devices using SVGT and J2ME technology. A mobile device is simulated on a desktop computer for a series of tests against a WMS, for example sending requests, receiving the response data, and then displaying images in both vector and raster formats. The analysis and design of the system structure, such as the user interface and code organization, are discussed; the limitations of mobile devices must be taken into consideration for such applications. The parsing of the XML document received from the WMS after a GetCapabilities request and the visual rendering of SVGT and PNG (Portable Network Graphics) images are important implementation issues. Finally, the client was tested successfully on Nokia S40/60 mobile phones.
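
    The GetCapabilities parsing step described above can be sketched with Python's standard XML library. The capabilities document below is a minimal, hypothetical fragment (real WMS responses are far larger and namespaced); only the layer-name extraction is shown:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical WMS GetCapabilities response; real documents
# are much larger and use XML namespaces, omitted here for clarity.
CAPABILITIES_XML = """
<WMT_MS_Capabilities version="1.1.1">
  <Capability>
    <Layer>
      <Title>Base map</Title>
      <Layer><Name>roads</Name><Title>Road network</Title></Layer>
      <Layer><Name>rivers</Name><Title>Hydrology</Title></Layer>
    </Layer>
  </Capability>
</WMT_MS_Capabilities>
"""

def list_named_layers(xml_text):
    """Return the <Name> of every layer advertised by the service."""
    root = ET.fromstring(xml_text)
    return [name.text for name in root.iter("Name")]

layers = list_named_layers(CAPABILITIES_XML)
print(layers)  # ['roads', 'rivers']
```

A client would present these names to the user and include the chosen one in its subsequent GetMap request.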

  20. Object relationship notation (ORN) for database applications enhancing the modeling and implementation of associations

    CERN Document Server

    Ehlmann, Bryon K

    2009-01-01

    Conceptually, a database consists of objects and relationships. Object Relationship Notation (ORN) is a simple notation that more precisely defines relationships by combining UML multiplicities with uniquely defined referential actions. "Object Relationship Notation (ORN) for Database Applications: Enhancing the Modeling and Implementation of Associations" shows how ORN can be used in UML class diagrams and database definition languages (DDLs) to better model and implement relationships and thus more productively develop database applications. For the database developer, it presents many exa
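
    As a sketch of the kind of referential action that ORN maps onto a DDL, the snippet below runs a cascading delete in an in-memory SQLite database. The table and column names are invented; this illustrates one referential action, not ORN's notation itself:

```python
import sqlite3

# A one-to-many association whose delete behavior is expressed as an
# ordinary SQL referential action (names are hypothetical).
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only on request
con.execute("CREATE TABLE department (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("""
    CREATE TABLE employee (
        id INTEGER PRIMARY KEY,
        dept_id INTEGER REFERENCES department(id) ON DELETE CASCADE
    )""")
con.execute("INSERT INTO department VALUES (1, 'R&D')")
con.execute("INSERT INTO employee VALUES (10, 1), (11, 1)")

con.execute("DELETE FROM department WHERE id = 1")  # cascades to employees
remaining = con.execute("SELECT COUNT(*) FROM employee").fetchone()[0]
print(remaining)  # 0
```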

  1. New Trend of Database for the Internet Era --Object database and its application

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In the Internet era, the relational database in general use cannot solve some of the problems that must be addressed. In this paper, we describe the capabilities required of database management systems and compare them with the limitations of RDBs. We also introduce the object database and its efficiency, which will be the new trend in databases. We use "Jasmine2000" as a concrete example of an object database in business use and verify its efficiency through its applied cases. At the end, we point out the direction of the database's future.

  2. AIDA Asia. Artificial Insemination Database Application. User manual. 1

    International Nuclear Information System (INIS)

    Artificial Insemination Database Application (AIDA-Asia) is a computer application to store and analyze information from AI Services (farms, females, inseminated, semen, estrus characteristics, inseminator and pregnancy diagnosis data). The need for such an application arose during a consultancy undertaken by the author for the International Atomic Energy Agency (IAEA, Vienna) under the framework of its Regional Co-operative Agreement for Asia and the Pacific (RCA) which is implementing a project on 'Improving Animal Productivity and Reproductive Efficiency' (RAS/5/035). The detailed specifications for the application were determined through a Task Force Meeting of National Consultants from five RCA Member States, organized by the IAEA and held in Sri Lanka in April 2001. The application has been developed in MS Access 2000 and Visual Basic for Applications (VBA) 6.0. However, it can run as a stand-alone application through its own executable files. It is based on screen forms for data entry or editing of information and command buttons. The structure of the data, the design of the application and VBA codes cannot be seen and cannot be modified by users. However, the designated administrator of AIDA-Asia in each country can customize it

  3. Realization of client/server management information system of coal mine based on ODBC in geology and survey

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Q.; Mao, S.; Yang, F.; Han, Z. [Shandong University of Science and Technology (China). Geoscience Department

    2000-08-01

    The paper describes in detail the framework and the application theory of Open Database Connectivity (ODBC), the formation of a client/server system of geological and surveying management information system, and the connection of the various databases. Then systematically, the constitution and functional realization of the geological management information system are introduced. 5 refs., 5 figs.
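
    ODBC's contribution is a single call-level interface over heterogeneous databases. As a loose analogy only, Python's DB-API offers the same kind of uniformity; in the sketch below sqlite3 stands in for an ODBC data source, and the table of geological boreholes is invented:

```python
import sqlite3

# One parameterized query function; the same code shape works against any
# DB-API driver, just as ODBC client code works against any ODBC driver.
def deep_boreholes(conn, min_depth):
    cur = conn.execute(
        "SELECT name, depth_m FROM borehole WHERE depth_m > ? ORDER BY depth_m",
        (min_depth,))
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE borehole (name TEXT, depth_m REAL)")
conn.executemany("INSERT INTO borehole VALUES (?, ?)",
                 [("BH-1", 85.0), ("BH-2", 130.5), ("BH-3", 210.0)])
print(deep_boreholes(conn, 100))  # [('BH-2', 130.5), ('BH-3', 210.0)]
```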

  4. Understanding Ajax applications by connecting client and server-side execution traces

    NARCIS (Netherlands)

    Zaidman, A.E.; Matthijssen, N.; Storey, M.A.; Van Deursen, A.

    2012-01-01

    Ajax-enabled Web applications are a new breed of highly interactive, highly dynamic Web applications. Although Ajax allows developers to create rich Web applications, Ajax applications can be difficult to comprehend and thus to maintain. For this reason, we have created FireDetective, a tool that us

  5. The Application and Implementation of the NoSQL Idea in Client-side Programs

    Institute of Scientific and Technical Information of China (English)

    屠强; 徐宁

    2013-01-01

    When NoSQL is mentioned, people first think of the Web and of databases, and a database is usually deployed on a server, so what does it have to do with client-side applications? One of the most common NoSQL storage models is the key-value store, and key-value storage is implemented with the familiar map or hashtable data structures, which are anything but unfamiliar to client developers: maps (or hashtables) appear in every corner of client-side programs, where they are used to hold all kinds of data. This paper describes the implementation of exactly such a client-side NoSQL key-value store application. The approach is both traditional, in that it uses a map to store data, and novel, in that it is used to achieve decoupling between modules.
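
    A minimal sketch of the idea, assuming nothing from the paper beyond the use of a map as a key-value store: modules share data (and can react to changes) through the store instead of referencing each other directly. All names below are illustrative:

```python
# An in-process key-value store (a plain dict/map) used as a blackboard so
# client modules exchange data without importing each other.
class KeyValueStore:
    def __init__(self):
        self._data = {}
        self._watchers = {}

    def put(self, key, value):
        self._data[key] = value
        for callback in self._watchers.get(key, []):
            callback(value)                 # notify interested modules

    def get(self, key, default=None):
        return self._data.get(key, default)

    def watch(self, key, callback):
        self._watchers.setdefault(key, []).append(callback)

# Module A publishes; module B reacts -- neither knows about the other.
store = KeyValueStore()
seen = []
store.watch("user.name", seen.append)       # module B registers interest
store.put("user.name", "alice")             # module A writes
print(store.get("user.name"), seen)         # alice ['alice']
```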

  6. Graph databases and their applications to social networks

    OpenAIRE

    BARTHA, Tomáš

    2015-01-01

    The main idea of this bachelor thesis is to give a theoretical introduction to NoSQL databases, and mainly to graph databases and their use in social networks. A sample database is implemented using the Neo4j technology, demonstrating Cypher query language use cases. In the last part we analyze the sample database data using the Neo4j server and the query language, and assess the results of the analysis.
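
    For readers unfamiliar with graph queries, the sketch below shows, in plain Python over an invented adjacency list, the friends-of-friends traversal that a Cypher query such as `MATCH (p:Person {name:'Ann'})-[:FRIEND]->()-[:FRIEND]->(fof) RETURN fof` expresses declaratively:

```python
# A tiny social graph as an adjacency list (data is made up).
friends = {
    "Ann": {"Bob", "Cat"},
    "Bob": {"Ann", "Dan"},
    "Cat": {"Dan"},
    "Dan": {"Bob"},
}

def friends_of_friends(graph, person):
    """People reachable in exactly two FRIEND hops, excluding the person
    and their direct friends."""
    direct = graph.get(person, set())
    two_hops = set()
    for friend in direct:
        two_hops |= graph.get(friend, set())
    return two_hops - direct - {person}

print(sorted(friends_of_friends(friends, "Ann")))  # ['Dan']
```

A graph database evaluates such traversals natively instead of via relational joins, which is the property the thesis exploits for social-network data.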

  7. Database Sampling to Support the Development of Data-Intensive Applications

    OpenAIRE

    Bisbal, Jesus

    2000-01-01

    A prototype database is a model of a database which exhibits the desired properties, in terms of its schema and/or data values, of an operational database. Database prototyping has been proposed as a technique to support the database design process in particular, and the whole data-intensive application development process in general (e.g. requirements elicitation, software testing, experimentation with design alternatives). Existing work on this area has been widely ignored in...

  8. Assessing client-caregiver relationships and the applicability of the 'student-teacher relationship scale' for people with intellectual disabilities

    NARCIS (Netherlands)

    J.M. Roeden; M.A. Maaskant; H.M.Y. Koomen; M.J.J.M. Candel; L.M.G. Curfs

    2012-01-01

    Improvements in client-caregiver relationships may lead to improvements in the quality of life of clients with intellectual disabilities (ID). For this reason, interventions aimed at influencing these relationships are important. To gain insight into the nature and intention of these relationships i

  9. Android as a platform for database application development case : Winha mobile

    OpenAIRE

    Muli, Joseph

    2013-01-01

    This thesis aims to help beginner Android developers, and anyone interested in database integration on the Android operating system, understand the fundamentals of designing database applications, specifically for mobile devices. A review of Android applications has been made to give an overview of the general properties of the applications in relation to database creation and management. To accomplish this thesis, SQL (Structured Query Language) and Android application development w...

  10. Multimedia Database Applications: Issues and Concerns for Classroom Teaching

    CERN Document Server

    Yu, Chien

    2011-01-01

    The abundance of multimedia data and information is challenging educators to effectively search, browse, access, use, and store the data for their classroom teaching. However, many educators are still accustomed to teaching and searching for information using conventional methods, which often do not function well with multimedia data. Educators also need to interact with and manage a variety of digital media files efficiently. The purpose of this study is to review current multimedia database applications in teaching and learning, and further discuss some of the issues or concerns that educators may have while incorporating multimedia data into their classrooms. Some strategies and recommendations are also provided so that educators can use multimedia data more effectively in their teaching environments.

  11. Applications of the Cambridge Structural Database in chemical education.

    Science.gov (United States)

    Battle, Gary M; Ferrence, Gregory M; Allen, Frank H

    2010-10-01

    The Cambridge Structural Database (CSD) is a vast and ever growing compendium of accurate three-dimensional structures that has massive chemical diversity across organic and metal-organic compounds. For these reasons, the CSD is finding significant uses in chemical education, and these applications are reviewed. As part of the teaching initiative of the Cambridge Crystallographic Data Centre (CCDC), a teaching subset of more than 500 CSD structures has been created that illustrate key chemical concepts, and a number of teaching modules have been devised that make use of this subset in a teaching environment. All of this material is freely available from the CCDC website, and the subset can be freely viewed and interrogated using WebCSD, an internet application for searching and displaying CSD information content. In some cases, however, the complete CSD System is required for specific educational applications, and some examples of these more extensive teaching modules are also discussed. The educational value of visualizing real three-dimensional structures, and of handling real experimental results, is stressed throughout.

  13. APPLICATION OF GEOGRAPHICAL PARAMETER DATABASE TO ESTABLISHMENT OF UNIT POPULATION DATABASE

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    GIS is becoming a powerful tool for handling geographical, economic, and population data, allowing more and more information to be derived from them. In some cases, such as a calamity (hurricane, earthquake, flood, drought, etc.) or a decision-making problem (siting a broadcasting transmitter, building a chemical plant, etc.), we have to estimate the total population of the region affected by the calamity or the project. This paper puts forward a method to estimate the population of such a region. By exploring the correlation between geographical parameters and the distribution of people in the same region through quantitative and qualitative analysis, a unit population database (1 km × 1 km) is established. The number of people in a given region can then be estimated by summing the population of every grid cell falling within the region boundary. The geographical parameters are obtained from the topographic database and the DEM database at the scale of 1:250 000. A fundamental geographical parameter database covering county administrative boundaries and the 1 km × 1 km grid is set up, as well as a population database at county level. Both the geographical parameter database and the unit population database offer sufficient conditions for quantitative analysis. They will play an important role in the research fields of data mining (DM), decision-making support systems (DSS), and regional sustainable development.
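
    The summation step can be sketched as follows; the grid values and the region predicate are invented, standing in for the real 1 km × 1 km unit population database and a polygon containment test:

```python
# (x, y) grid cell -> estimated population of that 1 km x 1 km cell
# (values invented for illustration).
unit_population = {
    (0, 0): 120, (0, 1): 450, (1, 0): 80, (1, 1): 900, (2, 1): 300,
}

def region_population(grid, cell_in_region):
    """Sum the population of every grid cell the region predicate accepts."""
    return sum(pop for cell, pop in grid.items() if cell_in_region(cell))

# Example region: all cells with x <= 1 (a stand-in for a real polygon test).
print(region_population(unit_population, lambda c: c[0] <= 1))  # 1550
```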

  14. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  15. The Application of the Smart Phone Client in the Library

    Institute of Scientific and Technical Information of China (English)

    楼向英; 施干卫; 高春玲

    2011-01-01

    The existing technology patterns for mobile libraries are mainly SMS, WAP sites, 2D barcode applications, and smartphone application development. Smartphone application development falls into two broad types: web applications and desktop-style applications, i.e. mobile phone clients. This article surveys the use of smartphone clients by libraries at home and abroad and concludes that client development centered on iPhone and Android smartphones will gradually become the trend.

  16. TOPCAT's TAP Client

    CERN Document Server

    Taylor, Mark

    2015-01-01

    TAP, the Table Access Protocol, is a Virtual Observatory (VO) protocol for executing queries in remote relational databases using ADQL, an SQL-like query language. It is one of the most powerful components of the VO, but also one of the most complex to use, with an extensive stack of associated standards. We present here recent improvements to the client and GUI for interacting with TAP services from the TOPCAT table analysis tool. As well as managing query submission and result retrieval, the GUI attempts to provide the user with as much help as possible in locating services, understanding service metadata and capabilities, and constructing correct and useful ADQL queries. The implementation and design are, unlike previous versions, both usable and performant even for the largest TAP services.
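
    As an illustration of what a TAP client assembles under the hood, the sketch below builds (but does not send) a synchronous query request; the service URL and table name are placeholders:

```python
from urllib.parse import urlencode

service = "https://example.org/tap"   # hypothetical TAP service endpoint
adql = "SELECT TOP 5 ra, dec FROM gaia_source WHERE parallax > 10"

# Parameters of a TAP synchronous query per the TAP standard.
params = urlencode({
    "REQUEST": "doQuery",
    "LANG": "ADQL",
    "QUERY": adql,
})
url = f"{service}/sync?{params}"
print(url)
```

A GUI client such as TOPCAT adds service discovery, metadata browsing and ADQL validation on top of this basic request/response cycle.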

  17. Application of China's National Forest Continuous Inventory Database

    Science.gov (United States)

    Xie, Xiaokui; Wang, Qingli; Dai, Limin; Su, Dongkai; Wang, Xinchuang; Qi, Guang; Ye, Yujing

    2011-12-01

    The maintenance of a timely, reliable and accurate spatial database on current forest ecosystem conditions and changes is essential to characterize and assess forest resources and support sustainable forest management. Information for such a database can be obtained only through a continuous forest inventory. The National Forest Continuous Inventory (NFCI) is the first level of China's three-tiered inventory system. The NFCI is administered by the State Forestry Administration; data are acquired by five inventory institutions around the country. Several important components of the database include land type, forest classification and age-class/age-group. The NFCI database in China is constructed based on 5-year inventory periods, resulting in some of the data not being timely when reports are issued. To address this problem, a forest growth simulation model has been developed to update the database for years between the periodic inventories. In order to aid in forest plan design and management, a three-dimensional virtual reality system of forest landscapes for selected units in the database (compartment or sub-compartment) has also been developed based on Virtual Reality Modeling Language. In addition, a transparent internet publishing system for a spatial database based on open source WebGIS (UMN Map Server) has been designed and utilized to enhance public understanding and encourage free participation of interested parties in the development, implementation, and planning of sustainable forest management.

  18. [Knowledge discovery in database and its application in clinical diagnosis].

    Science.gov (United States)

    Lui, Hui; Qiu, Tianshuang

    2004-08-01

    Nowadays the tremendous amount of data has far exceeded our human ability for comprehension, and this has been particularly true for the medical database. However, traditional statistical techniques are no longer adequate for analyzing this vast collection of data. Knowledge discovery in database and data mining play an important role in analyzing data and uncovering important data patterns. This paper briefly presents the concepts of knowledge discovery in database and data mining, then describes the rough set theory, and gives some examples based on rough set.
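
    A miniature example of the rough-set machinery the paper applies to clinical data: the lower and upper approximations of a diagnosis over invented patient records, under the indiscernibility relation "identical symptoms":

```python
# id -> (symptoms, diagnosis); the records are invented for illustration.
records = {
    1: (("fever", "cough"), "flu"),
    2: (("fever", "cough"), "flu"),
    3: (("fever", "rash"), "measles"),
    4: (("fever", "cough"), "cold"),  # indiscernible from 1 and 2
}

def approximations(records, diagnosis):
    """Lower approximation: indiscernibility classes certainly inside the
    diagnosis; upper approximation: classes possibly inside it."""
    classes = {}
    for pid, (symptoms, _) in records.items():
        classes.setdefault(symptoms, set()).add(pid)
    target = {pid for pid, (_, d) in records.items() if d == diagnosis}
    lower = set().union(*(c for c in classes.values() if c <= target))
    upper = set().union(*(c for c in classes.values() if c & target))
    return lower, upper

lower, upper = approximations(records, "flu")
print(sorted(lower), sorted(upper))  # [] [1, 2, 4]
```

The empty lower approximation signals that "flu" cannot be decided from symptoms alone here, which is exactly the kind of boundary-region insight rough sets contribute to clinical diagnosis.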

  19. A Comparative Study of Relational and Non-Relational Database Models in a Web- Based Application

    Directory of Open Access Journals (Sweden)

    Cornelia Gyorödi

    2015-11-01

    Full Text Available The purpose of this paper is to present a comparative study between relational and non-relational database models in a web-based application, by executing various operations on both relational and on non-relational databases thus highlighting the results obtained during performance comparison tests. The study was based on the implementation of a web-based application for population records. For the non-relational database, we used MongoDB and for the relational database, we used MSSQL 2014. We will also present the advantages of using a non-relational database compared to a relational database integrated in a web-based application, which needs to manipulate a big amount of data.
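
    The modeling difference at the heart of such comparisons can be sketched in a few lines, with sqlite3 standing in for the relational side and a plain dict for the document side (the field names are invented, not taken from the paper's schema):

```python
import sqlite3

person = {"name": "Maria", "city": "Oradea", "phones": ["111", "222"]}

# Relational: the repeating group needs a second table and a join.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
db.execute("CREATE TABLE phone (person_id INTEGER, number TEXT)")
db.execute("INSERT INTO person VALUES (1, ?, ?)", (person["name"], person["city"]))
db.executemany("INSERT INTO phone VALUES (1, ?)", [(p,) for p in person["phones"]])
rows = db.execute("""SELECT p.name, ph.number
                     FROM person p JOIN phone ph ON ph.person_id = p.id
                     ORDER BY ph.number""").fetchall()

# Document: the whole record lives in one object, no join required.
collection = [person]
doc = next(d for d in collection if d["name"] == "Maria")

print(rows)            # [('Maria', '111'), ('Maria', '222')]
print(doc["phones"])   # ['111', '222']
```

Which model performs better depends on access patterns, which is what the paper's timing experiments measure.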

  20. Working with HITRAN Database Using Hapi: HITRAN Application Programming Interface

    Science.gov (United States)

    Kochanov, Roman V.; Hill, Christian; Wcislo, Piotr; Gordon, Iouli E.; Rothman, Laurence S.; Wilzewski, Jonas

    2015-06-01

    A HITRAN Application Programming Interface (HAPI) has been developed to give users much more flexibility and power on their local machines. HAPI is a programming interface for the main data-searching capabilities of the new "HITRANonline" web service (http://www.hitran.org). It provides the possibility to query spectroscopic data from the HITRAN database in a flexible manner using either functions or query language. Some of the prominent current features of HAPI are: a) Downloading line-by-line data from the HITRANonline site to a local machine b) Filtering and processing the data in SQL-like fashion c) Conventional Python structures (lists, tuples, and dictionaries) for representing spectroscopic data d) Possibility to use a large set of third-party Python libraries to work with the data e) Python implementation of the HT lineshape which can be reduced to a number of conventional line profiles f) Python implementation of total internal partition sums (TIPS-2011) for spectra simulations g) High-resolution spectra calculation accounting for pressure, temperature and optical path length h) Providing instrumental functions to simulate experimental spectra i) Possibility to extend HAPI's functionality by custom line profiles, partition sums and instrumental functions Currently the API is a module written in Python and uses the Numpy library, providing fast array operations. The API is designed to deal with data in multiple formats such as ASCII, CSV, HDF5 and XSAMS. This work has been supported by NASA Aura Science Team Grant NNX14AI55G and NASA Planetary Atmospheres Grant NNX13AI59G. L.S. Rothman et al. JQSRT, Volume 130, 2013, Pages 4-50 N.H. Ngo et al. JQSRT, Volume 129, November 2013, Pages 89-100 A. L. Laraia at al. Icarus, Volume 215, Issue 1, September 2011, Pages 391-400
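
    The "SQL-like" filtering style of item b) can be illustrated over the conventional Python structures of item c); the line data and the `select` helper below are invented for illustration and are not the real `hapi` API:

```python
# Invented line records: wavenumber (cm-1), intensity, lower-state energy.
lines = [
    {"nu": 3401.2, "sw": 1.1e-21, "elower": 120.3},
    {"nu": 3555.6, "sw": 4.0e-23, "elower": 980.1},
    {"nu": 4090.9, "sw": 7.5e-22, "elower": 301.7},
]

def select(table, where):
    """SELECT * FROM table WHERE where(row) -- as a list comprehension."""
    return [row for row in table if where(row)]

strong = select(lines, lambda r: 3400 <= r["nu"] <= 4100 and r["sw"] > 1e-22)
print([r["nu"] for r in strong])  # [3401.2, 4090.9]
```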

  1. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    Science.gov (United States)

    Valassi, A.; Bartoldus, R.; Kalkhof, A.; Salnikov, A.; Wache, M.

    2011-12-01

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier "CORAL server" deployed close to the database and a tree of "CORAL server proxies", providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.
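
    The caching and multiplexing benefit of such a proxy tier can be sketched generically; the classes below are stand-ins for illustration, not CORAL's actual interfaces:

```python
# A middle tier that multiplexes many client reads onto one backend and
# serves repeated queries from its cache. The "backend" is a stand-in for
# a database server.
class Backend:
    def __init__(self):
        self.queries_served = 0

    def query(self, sql):
        self.queries_served += 1
        return f"rows for: {sql}"        # pretend result set

class CachingProxy:
    def __init__(self, backend):
        self.backend = backend
        self.cache = {}

    def query(self, sql):                # read-only queries are cacheable
        if sql not in self.cache:
            self.cache[sql] = self.backend.query(sql)
        return self.cache[sql]

backend = Backend()
proxy = CachingProxy(backend)
for _ in range(1000):                    # a thousand clients, same query
    proxy.query("SELECT * FROM trigger_config")
print(backend.queries_served)  # 1
```

This is why a farm of thousands of trigger processes can be configured without a thousandfold load on the database.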

  2. A Comparative Study of Relational and Non-Relational Database Models in a Web- Based Application

    OpenAIRE

    Cornelia Gyorödi; Robert Gyorödi; Roxana Sotoc

    2015-01-01

    The purpose of this paper is to present a comparative study between relational and non-relational database models in a web-based application, by executing various operations on both relational and on non-relational databases thus highlighting the results obtained during performance comparison tests. The study was based on the implementation of a web-based application for population records. For the non-relational database, we used MongoDB and for the relational database, we used MSSQL 2014. W...

  3. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    CERN Document Server

    Valassi, A; Kalkhof, A; Salnikov, A; Wache, M

    2011-01-01

    The CORAL software is widely used at CERN for accessing the data stored by the LHC experiments using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several backends and deployment models, including local access to SQLite files, direct client access to Oracle and MySQL servers, and read-only access to Oracle through the FroNTier web server and cache. Two new components have recently been added to CORAL to implement a model involving a middle tier "CORAL server" deployed close to the database and a tree of "CORAL server proxy" instances, with data caching and multiplexing functionalities, deployed close to the client. The new components are meant to provide advantages for read-only and read-write data access, in both offline and online use cases, in the areas of scalability and performance (multiplexing for several incoming connections, optional data caching) and security (authentication via proxy certificates). A first implementation of the two new c...

  4. Client Centred Design

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Nielsen, Janni; Levinsen, Karin

    2008-01-01

    In this paper we argue for the use of Client Centred preparation phases when designing complex systems. Through Client Centred Design, human-computer interaction can extend its focus on end-users to also encompass the client's needs, context and resources.

  5. A Two-folded Impact Analysis of Schema Changes on Database Applications

    Institute of Scientific and Technical Information of China (English)

    Spyridon K.Gardikiotis; Nicos Malevris

    2009-01-01

    Database applications are becoming increasingly popular, mainly due to the advanced data management facilities that the underlying database management system offers compared against traditional legacy software applications. The interaction, however, of such applications with the database system introduces a number of issues, among which, this paper addresses the impact analysis of the changes performed at the database schema level. Our motivation is to provide the software engineers of database applications with automated methods that facilitate major maintenance tasks, such as source code corrections and regression testing, which should be triggered by the occurrence of such changes. The presented impact analysis is thus two-folded: the impact is analysed in terms of both the affected source code statements and the affected test suites concerning the testing of these applications. To achieve the former objective, a program slicing technique is employed, which is based on an extended version of the program dependency graph. The latter objective requires the analysis of test suites generated for database applications, which is accomplished by employing testing techniques tailored for this type of applications. Utilising both the slicing and the testing techniques enhances program comprehension of database applications, while also supporting the development of a number of practical metrics regarding their maintainability against schema changes. To evaluate the feasibility and effectiveness of the presented techniques and metrics, a software tool, called DATA, has been implemented. The experimental results from its usage on the TPC-C case study are reported and analysed.
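
    A toy version of the first kind of impact analysis, finding the source statements affected when a schema change drops a column. The statements and the matcher are invented; the paper uses dependence-graph slicing, not the regex scan sketched here:

```python
import re

# source line number -> embedded SQL statement (invented examples)
statements = {
    12: "SELECT name, salary FROM employee",
    47: "UPDATE employee SET salary = salary * 1.1",
    88: "SELECT name FROM department",
}

def affected_by_dropped_column(stmts, column):
    """Source lines whose SQL references the dropped column."""
    pattern = re.compile(rf"\b{re.escape(column)}\b")
    return sorted(line for line, sql in stmts.items() if pattern.search(sql))

print(affected_by_dropped_column(statements, "salary"))  # [12, 47]
```

The affected lines then seed the second analysis: selecting the regression-test suites that exercise them.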

  6. Databases for INDUS-1 and INDUS-2

    International Nuclear Information System (INIS)

    The databases for Indus are relational databases designed to store various categories of data related to the accelerator. The data archiving and retrieving system in Indus is based on a client/server model. A general purpose commercial database is used to store parameters and equipment data for the whole machine. The database manages configuration, on-line and historical databases. On-line and off-line applications distributed in several systems can store and retrieve the data from the database over the network. This paper describes the structure of databases for Indus-1 and Indus-2 and their integration within the software architecture. The data analysis, design, resulting data-schema and implementation issues are discussed. (author)

  7. A novel application in the study of client language: Alcohol and marijuana-related statements in substance-using adolescents during a simulation task.

    Science.gov (United States)

    Ladd, Benjamin O; Garcia, Tracey A; Anderson, Kristen G

    2016-09-01

    The current study explored whether laboratory-based techniques can provide a strategy for studying client language as a mechanism of behavior change. Specifically, this study examined the potential of a simulation task to elicit healthy talk, or self-motivational statements in favor of healthy behavior, related to marijuana and alcohol use. Participants (N = 84) were adolescents reporting at least 10 lifetime substance use episodes recruited from various community settings in an urban Pacific Northwest setting. Participants completed the Adolescent Simulated Intoxication Digital Elicitation (A-SIDE), a validated paradigm for assessing substance use decision making in peer contexts. Participants responded to 4 types of offers in the A-SIDE: (a) marijuana, (b) food (marijuana control), (c) alcohol, and (d) soda (alcohol control). Using a validated coding scheme adapted for the current study, client language during a structured interview assessing participants' response to the simulated offers was evaluated. Associations between percent healthy talk (PHT, calculated by dividing the number of healthy statements by the sum of all substance-related statements) and cross-sectional outcomes of interest (previous substance use, substance use expectancies, and behavioral willingness) were explored. The frequency of substance-related statements differed in response to offer type; rate of PHT did not. PHT was associated with behavioral willingness to accept the offer. However, PHT was not associated with decontextualized measures of substance use. Associations between PHT and global expectancies were limited. Simulation methods may be useful in investigating the impact of context on self-talk and to systematically explore client language as a mechanism of change. (PsycINFO Database Record) PMID: 27454368
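
    The outcome measure as the abstract defines it is simple arithmetic; the function name and the zero-statement convention below are ours, not the article's:

```python
def percent_healthy_talk(healthy, unhealthy):
    """PHT = healthy statements / all substance-related statements, as a %.
    Returns 0.0 when no substance-related statements were coded (our
    convention; the article does not specify this edge case)."""
    total = healthy + unhealthy
    return 100.0 * healthy / total if total else 0.0

print(percent_healthy_talk(6, 4))  # 60.0
```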

  8. Application of Simulated Annealing to Clustering Tuples in Databases.

    Science.gov (United States)

    Bell, D. A.; And Others

    1990-01-01

    Investigates the value of applying principles derived from simulated annealing to clustering tuples in database design, and compares this technique with a graph-collapsing clustering method. It is concluded that, while the new method does give superior results, the expense involved in algorithm run time is prohibitive. (24 references) (CLB)

  9. Database Application for a Youth Market Livestock Production Education Program

    Science.gov (United States)

    Horney, Marc R.

    2013-01-01

    This article offers an example of a database designed to support teaching animal production and husbandry skills in county youth livestock programs. The system was used to manage production goals, animal growth and carcass data, photos and other imagery, and participant records. These were used to produce a variety of customized reports to help…

  10. The Application of an Anatomical Database for Fetal Congenital Heart Disease

    Institute of Scientific and Technical Information of China (English)

    Li Yang; Qiu-Yan Pei; Yun-Tao Li; Zhen-Juan Yang

    2015-01-01

    Background: Fetal congenital heart anomalies are the most common congenital anomalies in live births. Fetal echocardiography (FECG) is the only prenatal diagnostic approach used to detect fetal congenital heart disease (CHD). FECG is not widely used, and the antenatal diagnosis rate of CHD varies considerably. Thus, mastering the anatomical characteristics of different kinds of CHD is critical for ultrasound physicians to improve FECG technology. The aim of this study is to investigate the applications of a fetal CHD anatomic database in FECG teaching and training programs. Methods: We evaluated 60 transverse section databases including 27 types of fetal CHD built in the Prenatal Diagnosis Center in Peking University People's Hospital. Each original database contained 400-700 cross-sectional digital images with a resolution of 3744 pixels × 5616 pixels. We imported the database into Amira 5.3.1 (Australia Visage Imaging Company, Australia) three-dimensional (3D) software. The database functions use a series of 3D software visual operations. The features of the fetal CHD anatomical database were analyzed to determine its applications in FECG continuing education and training. Results: The database was rebuilt using the 3D software. The original and rebuilt databases can be displayed dynamically, continuously, and synchronically and can be rotated at arbitrary angles. The sections from the dynamic displays and rotating angles are consistent with the sections in FECG. The database successfully reproduced the anatomic structures and spatial relationship features of different fetal CHDs. We established a fetal CHD anatomy training database and a standardized training database for FECG. Ultrasound physicians and students can learn the anatomical features of fetal CHD and FECG through either centralized training or distance education. Conclusions: The database of fetal CHD successfully reproduced the anatomic structures and spatial relationship of different kinds of fetal CHD. This database can be used in FECG teaching and training.

  11. Using Java Objects and Services for Database Business Applications

    OpenAIRE

    Dănuţ - Octavian Simion

    2013-01-01

    The paper presents the facilities and advantages of using Enterprise Java Objects in business applications and emphasizes aspects like simplicity, application portability, component reusability, ability to build complex applications, separation of business logic from presentation logic, easy development of Web services, deployment in many operating environments, distributed deployment, application interoperability, integration with non-Java systems and development tools. Enterprise JavaBeans - EJB ...

  12. Applications of interest : a relational database approach to managing control system software applications.

    Energy Technology Data Exchange (ETDEWEB)

    Quock, D. B.; Arnold, N.; Dohan, D.; Anderson, J.; Clemons, D. (APS Engineering Support Division)

    2008-04-14

    Large accelerator facilities such as the Advanced Photon Source (APS) typically are operated by a diverse set of integrated control systems, such as front-end controllers, PLCs, and FPGAs. This type of control system structure encompasses numerous engineering documents, distributed real-time control system databases, source code, user displays, and other components. The complexity of the control system is further increased because its life cycle is never-ending: change is constant, and the accelerator itself generates new operational problems on a regular basis. This overall controls environment raises the question of how best to provide a means for control system engineers to easily and quickly troubleshoot unique functions of the control system, find relevant information, and understand the impact of changes to one part of the control system on other applications. The answer to this question lies in being able to associate pertinent drawings, manuals, source code, hardware, and expert developers in an efficient and logical manner. Applications of Interest is a relational database software tool created for the purpose of providing alternative views of the supporting information behind each distinct control system application at the APS.
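
    The core of such a tool is relational association: each application linked to its supporting artifacts through a join table, so one query returns everything relevant. A miniature sketch under invented table and column names (the actual APS schema is not described in the abstract), using SQLite for self-containment:

    ```python
    import sqlite3

    # Hypothetical miniature of an "Applications of Interest"-style schema:
    # each control-system application is linked to supporting artifacts
    # (drawings, manuals, source code, experts) through a join table.
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE application (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE artifact (id INTEGER PRIMARY KEY, kind TEXT, title TEXT);
    CREATE TABLE app_artifact (
        app_id INTEGER REFERENCES application(id),
        artifact_id INTEGER REFERENCES artifact(id)
    );
    """)
    con.execute("INSERT INTO application VALUES (1, 'vacuum-control')")
    con.executemany("INSERT INTO artifact VALUES (?, ?, ?)",
                    [(1, 'drawing', 'Vacuum rack layout'),
                     (2, 'source',  'vac_ioc.c')])
    con.executemany("INSERT INTO app_artifact VALUES (?, ?)", [(1, 1), (1, 2)])

    # One query yields every artifact behind one application.
    rows = con.execute("""
        SELECT a.kind, a.title FROM artifact a
        JOIN app_artifact l ON l.artifact_id = a.id
        WHERE l.app_id = 1 ORDER BY a.id
    """).fetchall()
    print(rows)  # [('drawing', 'Vacuum rack layout'), ('source', 'vac_ioc.c')]
    ```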

  13. HTML thin client and transactions

    CERN Document Server

    Touchette, J F

    1999-01-01

    When writing applications for thin clients such as Web browsers, you face several challenges that do not exist with fat-client applications written in Visual Basic, Delphi, or Java. For one thing, your development tools do not include facilities for automatically building reliable, nonrepeatable transactions into applications. Consequently, you must devise your own techniques to prevent users from transmitting duplicate transactions. The author explains how to implement reliable, nonrepeatable transactions using a technique that is applicable to any Java Server Development Kit based architecture. Although the examples presented are based on the IBM WebSphere 2.1 Application Server, they do not make use of any IBM WebSphere extensions. In short, the concepts presented here can be implemented in Perl CGI and ASP scripts, and the sample code has been tested with JDK 1.1.6 and 1.2. (0 refs).
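
    The standard way to make a form submission non-repeatable, consistent with what the abstract describes, is a one-time token: the server embeds a fresh token in each rendered form and honors a submission only if its token has not been consumed. A language-agnostic sketch (function names are ours, not the article's):

    ```python
    import secrets

    # One-time-token technique for non-repeatable transactions: a
    # resubmitted (duplicate) form carries an already-consumed token
    # and is rejected.
    issued = set()

    def render_form() -> str:
        token = secrets.token_hex(8)
        issued.add(token)
        return token  # would be embedded as a hidden form field

    def submit(token: str) -> bool:
        if token in issued:      # first submission: consume the token
            issued.discard(token)
            return True
        return False             # replay/duplicate: reject

    t = render_form()
    print(submit(t))   # True  - transaction accepted
    print(submit(t))   # False - duplicate transmission blocked
    ```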

  14. Applications of the Cambridge Structural Database in organic chemistry and crystal chemistry.

    Science.gov (United States)

    Allen, Frank H; Motherwell, W D Samuel

    2002-06-01

    The Cambridge Structural Database (CSD) and its associated software systems have formed the basis for more than 800 research applications in structural chemistry, crystallography and the life sciences. Relevant references, dating from the mid-1970s, and brief synopses of these papers are collected in a database, DBUse, which is freely available via the CCDC website. This database has been used to review research applications of the CSD in organic chemistry, including supramolecular applications, and in organic crystal chemistry. The review concentrates on applications that have been published since 1990 and covers a wide range of topics, including structure correlation, conformational analysis, hydrogen bonding and other intermolecular interactions, studies of crystal packing, extended structural motifs, crystal engineering and polymorphism, and crystal structure prediction. Applications of CSD information in studies of crystal structure precision, the determination of crystal structures from powder diffraction data, together with applications in chemical informatics, are also discussed.

  15. Survey of standards applicable to a database management system

    Science.gov (United States)

    Urena, J. L.

    1981-01-01

    Industry, government, and NASA standards, and the status of standardization activities of standards setting organizations applicable to the design, implementation and operation of a data base management system for space related applications are identified. The applicability of the standards to a general purpose, multimission data base management system is addressed.

  16. GIS Methodic and New Database for Magmatic Rocks. Application for Atlantic Oceanic Magmatism.

    Science.gov (United States)

    Asavin, A. M.

    2001-12-01

    Several geochemical databases are now available on the Internet. A distinguishing feature of the information they store is that every sample carries geographic coordinates, yet the database software as a rule uses this spatial information only in the search procedures of the user interface. GIS software (Geographical Information System software), on the other hand, such as ARC/INFO, which is used to create and analyze thematic geological, geochemical, and geophysical e-maps, is deeply bound to the geographic coordinates of samples. We have joined the capabilities of GIS systems and a relational geochemical database in dedicated software. Our geochemical information system was created at the Vernadsky State Geological Museum and the Institute of Geochemistry and Analytical Chemistry in Moscow. We have tested the system with geochemical data on oceanic rocks from the Atlantic and Pacific oceans, about 10,000 chemical analyses. The GIS content consists of e-map covers of the globe; the Atlantic part includes a gravity map (with a 2'' grid), ocean-bottom heat flow, altimetric maps, seismic activity, a tectonic map, and a geological map. Combining this content makes it possible to create new geochemical maps and to couple spatial analysis with numerical geochemical modeling of volcanic processes in an ocean segment. The system has so far been tested with a thick-client technology. The interface between the GIS system Arc/View and the database resides in a sequence of SQL queries, whose result is a simple DBF file with geographic coordinates. This file is the starting point for creating geochemical and other special e-maps of an ocean region. For geophysical data we used a more complex method: from Arc/View we created a grid cover for polygon spatial geophysical information.
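
    The SQL layer the abstract describes amounts to coordinate-filtered selection: pull the samples inside a region's bounding box and hand the rows to the mapping side. A hedged sketch with an invented schema and invented numbers, using SQLite in place of the actual backend:

    ```python
    import sqlite3

    # Toy "samples with coordinates" table; a bounding-box query selects
    # the records needed to build a regional (e.g. Atlantic) geochemical map.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sample (id INTEGER, lat REAL, lon REAL, sio2 REAL)")
    con.executemany("INSERT INTO sample VALUES (?, ?, ?, ?)", [
        (1, -10.0, -25.0, 49.5),   # Atlantic
        (2,  12.0, -45.0, 51.2),   # Atlantic
        (3,   5.0, 140.0, 48.1),   # Pacific (outside the box)
    ])
    atlantic = con.execute(
        "SELECT id FROM sample WHERE lat BETWEEN ? AND ? "
        "AND lon BETWEEN ? AND ? ORDER BY id",
        (-60, 60, -70, 0)).fetchall()
    print(atlantic)  # [(1,), (2,)]
    ```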

  17. Explorative Study of SQL Injection Attacks and Mechanisms to Secure Web Application Database- A Review

    Directory of Open Access Journals (Sweden)

    Chandershekhar Sharma

    2016-03-01

    Full Text Available Continuing innovation in web development technologies is driving the growth of user-friendly web applications. With activities like online banking, shopping, booking, and trading, these applications have become an integral part of everyone's daily routine. The profit-driven online business industry has also acknowledged this growth, since a thriving application gives an organization a global platform. A web application's database is its most valuable asset, storing sensitive information about individuals and organizations. SQLIA is the topmost threat because it targets the database of a web application: it allows an attacker to gain control over the application, resulting in financial fraud, leaks of confidential data, and even deletion of the database. The exhaustive survey of SQL injection attacks presented in this paper is based on empirical analysis, comprising the deployment of the injection mechanism for each attack type on various websites, dummy databases, and web applications. Security mechanisms for the web application database that mitigate SQL injection attacks are also discussed.
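
    The standard mitigation for the attack class surveyed above is parameterized queries, which keep attacker-supplied text bound as data rather than interpreted as SQL. A minimal self-contained demonstration (table and payload invented for illustration):

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE users (name TEXT, secret TEXT)")
    con.execute("INSERT INTO users VALUES ('alice', 's3cret')")

    malicious = "x' OR '1'='1"

    # Unsafe: string concatenation lets the payload rewrite the WHERE clause.
    unsafe = con.execute(
        "SELECT name FROM users WHERE name = '" + malicious + "'").fetchall()

    # Safe: the placeholder binds the payload as a literal string.
    safe = con.execute(
        "SELECT name FROM users WHERE name = ?", (malicious,)).fetchall()

    print(unsafe)  # [('alice',)] - injection succeeded
    print(safe)    # []           - payload matched nothing
    ```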

  18. Databases for nuclear applications available at the NEA Data Bank through the internet

    International Nuclear Information System (INIS)

    The NEA Data Bank acts as the Member countries' centre of reference for computer programs, basic scientific nuclear data and chemical thermodynamic data. It does this by keeping its collections up-to-date and by providing scientists with a reliable and quick retrieval service of these programs and data. The information stored at the Data Bank is handled according to agreed quality assurance methods. In addition, most of the data sets and programs have been validated in international benchmark exercises. The NEA Data Bank's longstanding experience in database development and maintenance allows it to efficiently handle large volumes of information. Through extensive use of electronic networks, such as the Internet, it is able to provide a rapid service to its clients. These services are constantly being developed in response to customer needs. (orig.)

  19. Enterprise Android programming Android database applications for the enterprise

    CERN Document Server

    Mednieks, Zigurd; Dornin, Laird; Pan, Zane

    2013-01-01

    The definitive guide to building data-driven Android applications for enterprise systems Android devices represent a rapidly growing share of the mobile device market. With the release of Android 4, they are moving beyond consumer applications into corporate/enterprise use. Developers who want to start building data-driven Android applications that integrate with enterprise systems will learn how with this book. In the tradition of Wrox Professional guides, it thoroughly covers sharing and displaying data, transmitting data to enterprise applications, and much more. Shows Android developers w

  20. Reducing client waiting time.

    Science.gov (United States)

    1992-01-01

    This first issue of Family Planning (FP) Manager focuses on how to analyze client waiting time and reduce long waits easily and inexpensively. Client flow analysis can be used by managers and staff to identify organizational factors affecting waiting time. Symptoms of long waiting times are overcrowded waiting rooms, clients not returning for services, staff complaints about rushing and waiting, and hurried counseling sessions. Client satisfaction is very important in order to retain FP users. Simple procedures such as routing return visits differently can make a difference in program effectiveness. Assessment of the number of first visits, the number of revisits, and types of methods and services that the clinic provides is a first step. Client flow analysis involves assigning a number to each client on registration, attaching the client flow form to the medical chart, entering the FP method and type of visit, asking staff to note the time at each station, and summarizing data in a master chart. The staff should be involved in plotting data for each client to show waiting versus staff contact time through the use of color coding for each type of staff contact. Bottlenecks become very visible when charted. The amount of time spent at each station can be measured, and gaps in client's contact with staff can be identified. An accurate measure of total waiting time can be obtained. A quick assessment can be made by recording arrival and departure times for each client in one morning or afternoon of a peak day. The procedure is to count the number of clients waiting at 15-minute intervals. The process should be repeated every 3-6 months to observe changes. If waiting times appear long, a more thorough assessment is needed on both a peak and a typical day. An example is given of a completed chart and graph of results with sample data. Managers need to set goals for client flow, streamline client routes, and utilize waiting time wisely by providing educational talks.
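
    The arithmetic behind the client-flow chart is simple: waiting time is total visit time minus time actually spent with staff. A sketch with invented timestamps (minutes since clinic opening):

    ```python
    def flow_summary(arrival, contacts, departure):
        """contacts: list of (start, end) staff-contact intervals.
        Returns (total visit time, staff contact time, waiting time)."""
        contact_time = sum(end - start for start, end in contacts)
        total_time = departure - arrival
        return total_time, contact_time, total_time - contact_time

    total, contact, waiting = flow_summary(
        arrival=0,
        contacts=[(35, 40), (70, 85)],   # e.g. registration, counseling
        departure=95)
    print(total, contact, waiting)  # 95 20 75 -> 75 minutes spent waiting
    ```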

  1. Web Service Clients on Mobile Android Devices: A Study on Architectural Alternatives and Client Performance

    OpenAIRE

    Knutsen, Johannes

    2009-01-01

    This paper studies Android, a new open source software stack initiated by Google, and the possibilities of developing a mobile client for MPower, a service oriented architecture platform based upon SOAP messaging. The study focuses on the architectural alternatives, their impacts on the mobile client application, Android's performance on SOAP messaging, and how Web services' design can be optimized to give well-performing Android clients. The results from this study show how different arch...

  2. Based on ethernet development of OPC client application program%基于以太网的OPC客户端应用程序实现

    Institute of Scientific and Technical Information of China (English)

    段宝利

    2012-01-01

    Siemens establishes an S7 connection between the Simatic Net OPC Server and S7-series PLCs over Ethernet. An OPC client application program based on Ethernet achieves interactive access to field PLC data and remote control. The client program developed here has been successfully applied to interactive data access for Siemens and GE PLCs in different environments, fully realizing software reuse.

  3. Comparison of the Frontier Distributed Database Caching System with NoSQL Databases

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Non-relational "NoSQL" databases such as Cassandra and CouchDB are best known for their ability to scale to large numbers of clients spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects, is based on traditional SQL databases but also has the same high scalability and wide-area distributability for an important subset of applications. This paper compares the architectures, behavior, performance, and maintainability of the two different approaches and identifies the criteria for choosing which approach to prefer over the other.
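
    The scalability Frontier gains over a plain SQL backend comes from read-through caching: repeated identical queries are answered from cache layers instead of the central database. A toy sketch of that pattern (all names invented; the real system caches over HTTP proxies):

    ```python
    # Read-through cache: misses go to the backing store and are memoized,
    # so repeated identical queries never hit the backend again.
    backend_hits = 0

    def backend_query(key):
        global backend_hits
        backend_hits += 1
        return key.upper()          # stand-in for an expensive SQL query

    cache = {}

    def cached_query(key):
        if key not in cache:        # miss: fetch and memoize
            cache[key] = backend_query(key)
        return cache[key]

    print(cached_query("geometry"), cached_query("geometry"), backend_hits)
    # second call is served from cache; the backend saw only one query
    ```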

  4. A Call for Feminist Research: A Limited Client Perspective

    Science.gov (United States)

    Murray, Kirsten

    2006-01-01

    Feminist approaches embrace a counselor stance that is both collaborative and supportive, seeking client empowerment. On review of feminist family and couple counseling literature of the past 20 years using several academic databases, no research was found that explored a client's experience of feminist-informed family and couple counseling. The…

  5. Based on Technology of Client/Server Data Integrity Constraints Research and Application%基于Client/Server数据完整性约束的技术研究与应用

    Institute of Scientific and Technical Information of China (English)

    鲁广英

    2010-01-01

    This paper discusses data integrity constraints under the Client/Server architecture, arguing that an integrity-constraint mechanism must be established, and examines what data integrity constraints are and how to implement them. Drawing on years of experience developing information management systems based on the Client/Server architecture, and taking SQL Server and VB as the platform, it introduces methods for enforcing data integrity constraints in a management information system.

  6. Effective use of Java Data objects in developing database applications. Advantages and disadvantages

    OpenAIRE

    Zilidis, Paschalis.

    2004-01-01

    Approved for public release; distribution is unlimited Currently, the most common approach in developing database applications is to use an object-oriented language for the frontend module and a relational database for the backend datastore. The major disadvantage of this approach is the well-known "impedance mismatch" in which some form of mapping is required to connect the objects in the frontend and the relational tuples in the backend. Java Data Objects (JDO) technology is recently pro...

  7. Application of the Trend Filtering Algorithm on the MACHO Database

    CERN Document Server

    Szulagyi, J; Welch, D L

    2009-01-01

    Due to the strong effect of systematics/trends in variable star observations, we employ the Trend Filtering Algorithm (TFA) on a subset of the MACHO database and search for variable stars. TFA has been applied successfully in planetary transit searches, where weak, short-lasting periodic dimmings are sought in the presence of noise and various systematics (due to, e.g., imperfect flat fielding, crowding, etc). These latter effects introduce colored noise in the photometric time series that can lead to a complete miss of the signal. By using a large number of available photometric time series of a given field, TFA utilizes the fact that the same types of systematics appear in several/many time series of the same field. As a result, we fit each target time series by a (least-square-sense) optimum linear combination of templates and frequency-analyze the residuals. Once a signal is found, we reconstruct the signal by employing the full model, including the signal, systematics and noise. We apply TFA on the brigh...

  8. Discovering Knowledge from AIS Database for Application in VTS

    Science.gov (United States)

    Tsou, Ming-Cheng

    The widespread use of the Automatic Identification System (AIS) has had a significant impact on maritime technology. AIS enables the Vessel Traffic Service (VTS) not only to offer commonly known functions such as identification, tracking and monitoring of vessels, but also to provide rich real-time information that is useful for marine traffic investigation, statistical analysis and theoretical research. However, due to the rapid accumulation of AIS observation data, the VTS platform is often unable quickly and effectively to absorb and analyze it. Traditional observation and analysis methods are becoming less suitable for the modern AIS generation of VTS. In view of this, we applied the same data mining technique used for business intelligence discovery (in Customer Relation Management (CRM) business marketing) to the analysis of AIS observation data. This recasts the marine traffic problem as a business-marketing problem and integrates technologies such as Geographic Information Systems (GIS), database management systems, data warehousing and data mining to facilitate the discovery of hidden and valuable information in a huge amount of observation data. Consequently, this provides the marine traffic managers with a useful strategic planning resource.

  9. DEVELOPMENT OF AN APPLICATION WITH THE PURPOSE OF MAINTAINING A DATABASE

    Directory of Open Access Journals (Sweden)

    Wilson Fadlo Curi

    2011-04-01

    Full Text Available Sustainable development is today the great paradigm of human development. Because of this, new methodologies are being developed for the planning and management of systems, especially water-resource systems, and evaluation is no longer restricted to mere economic evaluation but is also subjected to social and environmental sustainability evaluation. The use of databases is essential for manipulating the most diverse kinds of data and information, which can be utilized for storing historical data and other information necessary for future use. Therefore, this article focuses mainly on presenting the application developed to manipulate tables in a database: it allows and facilitates the inclusion, elimination, and renaming of tables and table fields, in order to substitute some SQL commands available in the various database products. This application will add value to the decision support system being developed by GOTA (Group of Water Total Optimization), which in many cases needs to make changes in its database to obtain greater flexibility in data manipulation, such as registers of reservoirs, irrigated perimeters, meteorological stations, gauging stations, institutions, etc. The application allows intelligent and fast manipulation of tables in a database; the present version runs on the PostgreSQL database and was developed on the Java platform, which permits its installation on several types of operating systems and many kinds of computers.
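
    The table-maintenance operations the abstract lists (create, rename, add fields) are thin wrappers over SQL DDL. A hedged sketch of that idea: function names are ours, the original runs on PostgreSQL with Java, and SQLite stands in here so the example is self-contained.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")

    def create_table(name, columns_sql):
        con.execute(f"CREATE TABLE {name} ({columns_sql})")

    def rename_table(old, new):
        con.execute(f"ALTER TABLE {old} RENAME TO {new}")

    def add_column(table, column_sql):
        con.execute(f"ALTER TABLE {table} ADD COLUMN {column_sql}")

    # e.g. maintaining a register of reservoirs without hand-written SQL
    create_table("reservoir", "id INTEGER PRIMARY KEY, name TEXT")
    add_column("reservoir", "capacity_hm3 REAL")
    rename_table("reservoir", "reservoirs")

    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    print(tables)  # ['reservoirs']
    ```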

  10. Perceived Counselor Characteristics, Client Expectations, and Client Satisfaction with Counseling.

    Science.gov (United States)

    Heppner, P. Paul; Heesacker, Martin

    1983-01-01

    Examined interpersonal influence process within counseling including relationship between perceived counselor expertness, attractiveness, and trustworthiness and client satisfaction; between client expectations on perceived counselor expertness, attractiveness, trustworthiness, and client satisfaction; and effects of actual counselor experience…

  11. Wireless access to a pharmaceutical database: A demonstrator for data driven Wireless Application Prorocol (WAP) applications in medical information processing

    DEFF Research Database (Denmark)

    Hansen, Michael Schacht; Dørup, Jens

    2001-01-01

    script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. RESULTS: A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. CONCLUSIONS: We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further…

  12. Data-based considerations for electronic family health history applications.

    Science.gov (United States)

    Peace, Jane; Valdez, Rupa Sheth; Lutz, Kristin F

    2012-01-01

    Family health history contains important information about the genetic and environmental factors that contribute to patterns of health and illness in families. Applications for collecting, managing, and analyzing family health history could be improved if their design were informed by an understanding of how consumers think about and report family health history. This article presents a descriptive analysis of themes from family health history interviews that have implications for development, selection, and use of family health history tools. Important themes included ways in which family is defined, including nonbiological family members and pets; ideas about health and disease, including degree of exposure and individual perceptions; and barriers to reporting family health history, including large biological families and uncertainty. Some themes identified (eg, uncertainty) have been recognized previously and continue to be important considerations. Other themes identified, such as perceptions about severity of illness or conditions and causal relationships, are newly recognized and may have implications for nurses and other providers designing, selecting, and using family health history applications. PMID:21915045

  13. PeDaB - the personal dosimetry database at the research centre Juelich

    International Nuclear Information System (INIS)

    In May 1997, the mainframe-based registration, processing and archiving of personal monitoring data at the research centre Juelich (FZJ) was transferred to a client-server system. A complex database application was developed. The client user interface is a Windows-based Microsoft ACCESS application which is connected to an ORACLE database via ODBC and TCP/IP. The conversion covered all areas of personal dosimetry, including internal and external exposure as well as administrative areas. A higher degree of flexibility, data security and integrity was achieved. (orig.)

  14. Construction, database integration, and application of an Oenothera EST library.

    Science.gov (United States)

    Mrácek, Jaroslav; Greiner, Stephan; Cho, Won Kyong; Rauwolf, Uwe; Braun, Martha; Umate, Pavan; Altstätter, Johannes; Stoppel, Rhea; Mlcochová, Lada; Silber, Martina V; Volz, Stefanie M; White, Sarah; Selmeier, Renate; Rudd, Stephen; Herrmann, Reinhold G; Meurer, Jörg

    2006-09-01

    Coevolution of cellular genetic compartments is a fundamental aspect in eukaryotic genome evolution that becomes apparent in serious developmental disturbances after interspecific organelle exchanges. The genus Oenothera represents a unique, at present the only available, resource to study the role of the compartmentalized plant genome in diversification of populations and speciation processes. An integrated approach involving cDNA cloning, EST sequencing, and bioinformatic data mining was chosen using Oenothera elata with the genetic constitution nuclear genome AA with plastome type I. The Gene Ontology system grouped 1621 unique gene products into 17 different functional categories. Application of arrays generated from a selected fraction of ESTs revealed significantly differing expression profiles among closely related Oenothera species possessing the potential to generate fertile and incompatible plastid/nuclear hybrids (hybrid bleaching). Furthermore, the EST library provides a valuable source of PCR-based polymorphic molecular markers that are instrumental for genotyping and molecular mapping approaches. PMID:16829020

  15. Structure design and establishment of database application system for alien species in Shandong Province, China

    Institute of Scientific and Technical Information of China (English)

    GUO Wei-hua; LIU Heng; DU Ning; ZHANG Xin-shi; WANG Ren-qing

    2007-01-01

    This paper presents a case study of the structural design and establishment of a database application system for alien species in Shandong Province, integrating Geographic Information System, computer network, and database technologies into alien-species research. The modules of the alien species database, including classified data input, statistics and analysis, species pictures and distribution maps, and data output, were implemented with Visual Studio .NET 2003 and Microsoft SQL Server 2000. The alien species information contains classification, distinguishing characteristics of the species, biological characteristics, area of origin, distribution area, mode and route of entry, invasion time, invasion reason, interaction with endemic species, growth state, danger state, and spatial information, i.e., distribution maps. On this basis, several modules were developed for application, checking, modifying, printing, adding, and returning records. Furthermore, through the establishment of index tables and index maps, data such as pictures, text, and GIS maps can also be queried spatially. This research established a technological platform for sharing scientific information on alien species in Shandong Province, offering the basis for dynamic queries on alien species, early-warning technology for prevention, and a fast-reaction system. The database application system is practical, has a friendly user interface, and is convenient to use. It can supply users with full and accurate information-query services on alien species and provides the administrator with functions for dynamically managing the database.

  16. Customizable neuroinformatics database system: XooNIps and its application to the pupil platform.

    Science.gov (United States)

    Yamaji, Kazutsuna; Sakai, Hiroyuki; Okumura, Yoshihiro; Usui, Shiro

    2007-07-01

    The developing field of neuroinformatics includes technologies for the collection and sharing of neuro-related digital resources. These resources will be of increasing value for understanding the brain. Developing a database system to integrate these disparate resources is necessary to make full use of these resources. This study proposes a base database system termed XooNIps that utilizes the content management system called XOOPS. XooNIps is designed for developing databases in different research fields through customization of the option menu. In a XooNIps-based database, digital resources are stored according to their respective categories, e.g., research articles, experimental data, mathematical models, stimulations, each associated with their related metadata. Several types of user authorization are supported for secure operations. In addition to the directory and keyword searches within a certain database, XooNIps searches simultaneously across other XooNIps-based databases on the Internet. Reviewing systems for user registration and for data submission are incorporated to impose quality control. Furthermore, XOOPS modules containing news, forums, schedules, blogs, and other information can be combined to enhance XooNIps functionality. These features provide better scalability, extensibility, and customizability to the general neuroinformatics community. The application of this system to data, models, and other information related to human pupils is described here.
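
    The cross-database search described above is a federated query: the same keyword search is issued to each independent site and the hits are merged. A scaled-down sketch with two in-memory SQLite databases standing in for separate XooNIps instances (schema and data invented):

    ```python
    import sqlite3

    def make_site(items):
        """One toy 'site' with a single searchable item table."""
        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE item (title TEXT)")
        con.executemany("INSERT INTO item VALUES (?)", [(t,) for t in items])
        return con

    sites = [make_site(["pupil diameter model", "retina dataset"]),
             make_site(["pupil light reflex data", "cortex atlas"])]

    def federated_search(keyword):
        hits = []
        for con in sites:                      # query each database in turn
            hits += [r[0] for r in con.execute(
                "SELECT title FROM item WHERE title LIKE ?",
                (f"%{keyword}%",))]
        return sorted(hits)                    # merge the per-site results

    print(federated_search("pupil"))
    # ['pupil diameter model', 'pupil light reflex data']
    ```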

  17. Simple Application of Web Database

    Institute of Scientific and Technical Information of China (English)

    吕品; 张绍成; 岳承君

    2001-01-01

    In this paper, the connection and application of a simple Web database based on MS Access under the Windows NT platform are discussed.

  18. Monet: a next-generation database kernel for query-intensive applications

    NARCIS (Netherlands)

    Boncz, P.A.

    2002-01-01

    Monet is a database kernel targeted at query-intensive, heavy analysis applications (the opposite of transaction processing), which include OLAP and data mining, but also go beyond the business domain in GIS processing, multi-media retrieval and XML. The clean sheet approach of Monet tries to depart

  19. Examining the Factors That Contribute to Successful Database Application Implementation Using the Technology Acceptance Model

    Science.gov (United States)

    Nworji, Alexander O.

    2013-01-01

    Most organizations spend millions of dollars due to the impact of improperly implemented database application systems as evidenced by poor data quality problems. The purpose of this quantitative study was to use, and extend, the technology acceptance model (TAM) to assess the impact of information quality and technical quality factors on database…

  20. Processes involved in client-nominated relationship building incidents: Client attachment, attachment to therapist, and session impact.

    Science.gov (United States)

    Janzen, Jennifer; Fitzpatrick, Marilyn; Drapeau, Martin

    2008-09-01

    Thirty volunteer clients of trainee therapists nominated an incident that was critical in the development of their therapeutic relationship. Clients completed the Client Attachment to Therapist Scale (CATS), the Experiences in Close Relationships Scale (ECRS), and the Session Impacts Scale (SIS). Clients reported an increase in attachment security with their therapists, along with perceptions of support and relief and increasing exploration following the relationship building incident. While clients' avoidant attachment was unrelated to attachment to the therapist prior to the incidents, in subsequent sessions avoidance was related to a change in secure attachment to therapist. Finally, client attachment to therapist but not general attachment was significantly related to in-session exploration. Findings are discussed in light of attachment theory and convergence with findings from the field of social psychology. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  1. Client Centred Design

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Nielsen, Janni; Tweddell Levinsen, Karin

    2004-01-01

    Abstract In this paper the Human Computer Interaction (HCI) Research Group reports on the pre-phase of an e-learning project, which was carried out in collaboration with the client. The project involved an initial exploration of the problem spaces, possibilities and challenges for an online...... on existing resources and networks, suggesting a design, which also included end-users community needs and work-context. Our argument is that if a preparation phase both seeks to confirm knowledge and contemplate what is not yet known, giving attention to the context and need of the client (i.e. not only end......-users,) then it is possible to build on existing resources within the client organisation, leading to grounding of design decisions and a match between the e-learning environment designed and the capabilities of the client....

  2. Campaign Consultants - Client Payments

    Data.gov (United States)

    City of San Francisco — Campaign Consultants are required to report “economic consideration” promised by or received from clients in exchange for campaign consulting services during the...

  3. Helping clients build credit

    OpenAIRE

    Vikki Frank

    2007-01-01

    Until now people who repaid loans from community groups had not been on credit bureaus’ radar. Now Credit Builders Alliance is partnering with Experian to help clients of community lenders build strong credit histories.

  4. Efficient Mobile Client Caching Supporting Transaction Semantics

    Directory of Open Access Journals (Sweden)

    IlYoung Chung

    2000-05-01

    Full Text Available In mobile client-server database systems, caching of frequently accessed data is an important technique that reduces contention on the narrow-bandwidth wireless channel. As the server in mobile environments may not have any information about the state of its clients' caches (a stateless server), using a broadcasting approach to transmit the updated data lists to numerous concurrent mobile clients is attractive. In this paper, a caching policy is proposed to maintain cache consistency for mobile computers. The proposed protocol adopts asynchronous (non-periodic) broadcasting as the cache invalidation scheme, and supports transaction semantics in mobile environments. With the asynchronous broadcasting approach, the proposed protocol can improve throughput by reducing the abortion of transactions with low communication costs. We study the performance of the protocol by means of simulation experiments.
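    The invalidation-report idea the abstract describes can be sketched as a toy model: the server asynchronously broadcasts the ids of updated items, each client drops the stale cached copies, and any in-progress transaction that has read an invalidated item must abort. This is an illustrative sketch only, not the authors' protocol; all class and method names are hypothetical.

    ```python
    # Toy sketch of client-side cache invalidation via asynchronous broadcast.
    # Names are hypothetical; the real protocol is defined in the cited paper.
    class MobileClientCache:
        def __init__(self):
            self.cache = {}          # item id -> cached value
            self.read_set = set()    # items read by the in-progress transaction
            self.aborted = False

        def read(self, key, fetch):
            """Serve from cache if possible; otherwise pay the wireless fetch."""
            if key not in self.cache:
                self.cache[key] = fetch(key)
            self.read_set.add(key)
            return self.cache[key]

        def on_invalidation_report(self, updated_ids):
            """Handle an asynchronous (non-periodic) server broadcast."""
            for key in updated_ids:
                self.cache.pop(key, None)        # drop stale copies
            if self.read_set & set(updated_ids):
                self.aborted = True              # transaction saw stale data
    ```

    A real implementation would also re-fetch invalidated items on demand and restart aborted transactions; the sketch only shows the consistency bookkeeping.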

  5. Professional iPhone and iPad Database Application Programming

    CERN Document Server

    Alessi, Patrick

    2010-01-01

    A much-needed resource on database development and enterprise integration for the iPhone. An enormous demand exists for getting iPhone applications into the enterprise and this book guides you through all the necessary steps for integrating an iPhone app within an existing enterprise. Experienced iPhone developers will learn how to take advantage of the built-in capabilities of the iPhone to confidently implement a data-driven application for the iPhone: Shows you how to integrate iPhone applications into enterprise class systems; Introduces development of data-driven applications on the iPho

  6. Migrating to the Cloud IT Application, Database, and Infrastructure Innovation and Consolidation

    CERN Document Server

    Laszewski, Tom

    2011-01-01

    Whether your company is planning on database migration, desktop application migration, or has IT infrastructure consolidation projects, this book gives you all the resources you'll need. It gives you recommendations on tools, strategy and best practices and serves as a guide as you plan, determine effort and budget, design, execute and roll your modern Oracle system out to production. Focusing on Oracle grid relational database technology and Oracle Fusion Middleware as the target cloud-based architecture, your company can gain organizational efficiency, agility, increase innovation and reduce

  7. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    Science.gov (United States)

    Zhou, Hui

    Carrying out the office and departmental target responsibility system is an inevitable outcome of higher education reform, in which statistical processing of student information is an important part of student performance review. On the basis of an analysis of student evaluation, the student information management database application system is designed in this paper using relational database management system software. In order to implement the student information management function, the functional requirements, overall structure, data sheets and fields, data sheet associations and software code are designed in detail.

  8. Application of MS Client in a Laboratory without CD-ROM Drives

    Institute of Scientific and Technical Information of China (English)

    刘维学

    2004-01-01

    MS Client (Microsoft Network Client V3.0 for MS-DOS) is installed on a DOS workstation with a disk drive so that it can connect to a terminal server and share the terminal server's resources, solving the problem of installing an operating system on computers without CD-ROM drives.

  9. An extensible web interface for databases and its application to storing biochemical data

    CERN Document Server

    Angelopoulos, Nicos

    2010-01-01

    This paper presents a generic web-based database interface implemented in Prolog. We discuss the advantages of the implementation platform and demonstrate the system's applicability in providing access to integrated biochemical data. Our system exploits two libraries of SWI-Prolog to create a schema-transparent interface within a relational setting. As is expected in declarative programming, the interface was written with minimal programming effort due to the high level of the language and its suitability to the task. We highlight two of Prolog's features that are well suited to the task at hand: term representation of structured documents and relational nature of Prolog which facilitates transparent integration of relational databases. Although we developed the system for accessing in-house biochemical and genomic data the interface is generic and provides a number of extensible features. We describe some of these features with references to our research databases. Finally we outline an in-house library that...

  10. The development and application of a thermodynamic database for magnesium alloys

    Science.gov (United States)

    Shang, Shunli; Zhang, Hui; Ganeshan, Swetha; Liu, Zi-Kui

    2008-12-01

    The available thermodynamic databases for magnesium alloys are discussed in this paper. Of particular interest are the features of a magnesium database developed by the authors with 19 elements: Mg-Al-Ca-Ce-Cu-Fe-K-La-Li-Mn-Na-Nd-Pr-Si-Sn-Sr-Y-Zn-Zr. Using this database, two applications are presented. One is the phase evolution in AZ61 magnesium alloy, including the variations of phase fractions, alloying compositions, and partition coefficients of alloying elements as a function of temperature (or solid fraction). The other is to understand sodium-induced high-temperature embrittlement in the Al-Mg alloy, which is ascribed to the formation of a liquid phase due to the presence of sodium traces.

  11. A New Communication Theory on Complex Information and a Groundbreaking New Declarative Method to Update Object Databases

    OpenAIRE

    Virkkunen, Heikki

    2016-01-01

    In this article I introduce a new communication theory for complex information represented as a directed graph of nodes. In addition, I introduce an application for the theory, a new radical method, embed, that can be used to update object databases declaratively. The embed method revolutionizes updating of object databases. One embed method call can replace dozens of lines of complicated updating code in a traditional client program of an object database, which is a huge improvement. As a decl...

  12. Perspectives on a Big Data Application: What Database Engineers and IT Students Need to Know

    Directory of Open Access Journals (Sweden)

    E. Erturk

    2015-10-01

    Full Text Available Cloud Computing and Big Data are important and related current trends in the world of information technology. They will have significant impact on the curricula of computer engineering and information systems at universities and higher education institutions. Learning about big data is useful for both working database professionals and students, in accordance with the increase in jobs requiring these skills. It is also important to address a broad gamut of database engineering skills, i.e. database design, installation, and operation. Therefore the authors have investigated MongoDB, a popular application, both from the perspective of industry retraining for database specialists and for teaching. This paper demonstrates some practical activities that can be done by students at the Eastern Institute of Technology New Zealand. In addition to testing and preparing new content for future students, this paper contributes to the very recent and emerging academic literature in this area. This paper concludes with general recommendations for IT educators, database engineers, and other IT professionals.
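    For readers new to the topic, the document-store model that such MongoDB teaching exercises revolve around can be illustrated with a minimal in-memory sketch: schema-less documents inserted into a collection and retrieved by field-matching queries. This intentionally mimics a tiny subset of the insert/find style; it is not MongoDB or the pymongo API.

    ```python
    # Minimal in-memory sketch of a document store: schema-less documents,
    # filter-based queries. Illustrative only; not a real MongoDB client.
    class MiniCollection:
        def __init__(self):
            self._docs = []

        def insert_one(self, doc):
            """Store a document; documents need not share a schema."""
            self._docs.append(dict(doc))

        def find(self, query):
            """Return documents matching every key/value pair in `query`."""
            return [d for d in self._docs
                    if all(d.get(k) == v for k, v in query.items())]

    courses = MiniCollection()
    courses.insert_one({"name": "db101", "level": "intro"})
    courses.insert_one({"name": "nosql", "level": "intro", "tool": "MongoDB"})
    courses.insert_one({"name": "tuning", "level": "advanced"})
    intro = courses.find({"level": "intro"})   # two documents match
    ```

    The point of such an exercise is that documents in one collection can carry different fields (here only one course has a "tool" field), which is the key departure from fixed relational schemas.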

  13. Dual diagnosis clients' treatment satisfaction - a systematic review

    Directory of Open Access Journals (Sweden)

    Stirling John

    2011-04-01

    Full Text Available Abstract Background The aim of this systematic review is to synthesize existing evidence about treatment satisfaction among clients with substance misuse and mental health co-morbidity (dual diagnoses, DD. Methods We examined satisfaction with treatment received, variations in satisfaction levels by type of treatment intervention and by diagnosis (i.e. DD clients vs. single diagnosis clients, and the influence of factors other than treatment type on satisfaction. Peer-reviewed studies published in English since 1970 were identified by searching electronic databases using pre-defined search strings. Results Across the 27 studies that met inclusion criteria, high average satisfaction scores were found. In most studies, integrated DD treatment yielded greater client satisfaction than standard treatment without explicit DD focus. In standard treatment without DD focus, DD clients tended to be less satisfied than single diagnosis clients. Whilst the evidence base on client and treatment variables related to satisfaction is small, it suggested client demographics and symptom severity to be unrelated to treatment satisfaction. However, satisfaction tended to be linked to other treatment process and outcome variables. Findings are limited in that many studies had very small sample sizes, did not use validated satisfaction instruments and may not have controlled for potential confounders. A framework for further research in this important area is discussed. Conclusions High satisfaction levels with current treatment provision, especially among those in integrated treatment, should enhance therapeutic optimism among practitioners dealing with DD clients.

  14. Service Management Database for DSN Equipment

    Science.gov (United States)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
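    The integrated event-logging pattern described above — persist each classified event to the database, then publish it to a messaging system for subscribed applications — can be sketched in miniature. The class below is an illustrative stand-in, not the actual SMDB implementation; plain callbacks substitute for JMS consumers.

    ```python
    # Sketch of a persist-then-publish event log. Illustrative only:
    # the archive list stands in for a database table, callbacks for JMS.
    import datetime

    class EventLog:
        def __init__(self):
            self.archive = []        # historical record for troubleshooting
            self.subscribers = []    # callbacks standing in for JMS consumers

        def subscribe(self, callback):
            self.subscribers.append(callback)

        def log(self, category, message):
            event = {
                "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                "category": category,   # structured classification of events
                "message": message,
            }
            self.archive.append(event)        # persist first
            for deliver in self.subscribers:  # then publish to subscribers
                deliver(event)
            return event
    ```

    Persisting before publishing mirrors the design goal stated in the abstract: monitoring applications can consume events in real time, while the archive remains available for historical analysis.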

  15. Current trends and new challenges of databases and web applications for systems driven biological research

    Directory of Open Access Journals (Sweden)

    Pradeep Kumar eSreenivasaiah

    2010-12-01

    Full Text Available The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design and architecture of the computational infrastructure, including databases and web applications. Several solutions have been proposed to meet these expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand the different technologies and approaches. Having familiarized themselves with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases (DBs) and web applications. We mention their roles in the integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to have a key role in the future of systems-driven biomedical research.

  16. Development of a Personal Digital Assistant (PDA) based client/server NICU patient data and charting system.

    OpenAIRE

    Carroll, A. E.; Saluja, S.; Tarczy-Hornoch, P.

    2001-01-01

    Personal Digital Assistants (PDAs) offer clinicians the ability to enter and manage critical information at the point of care. Although PDAs have always been designed to be intuitive and easy to use, recent advances in technology have made them even more accessible. The ability to link data on a PDA (client) to a central database (server) allows for near-unlimited potential in developing point of care applications and systems for patient data management. Although many stand-alone systems exis...

  17. Application of kernel functions for accurate similarity search in large chemical databases

    OpenAIRE

    2010-01-01

    Background Similarity search in chemical structure databases is an important problem with many applications in chemical genomics, drug design, and efficient chemical probe screening among others. It is widely believed that structure-based methods provide an efficient way to do the query. Recently various graph kernel functions have been designed to capture the intrinsic similarity of graphs. Though successful in constructing accurate predictive and classification models, graph kernel functions...

  18. Quality control in diagnostic radiology: software (Visual Basic 6) and database applications

    International Nuclear Information System (INIS)

    A Quality Assurance programme in diagnostic radiology is being implemented by the Ministry of Health (MoH) in Malaysia. Under this programme the performance of an x-ray machine used for diagnostic purposes is tested by using the approved procedure, which is commonly known as quality control in diagnostic radiology. The quality control or performance tests are carried out by a class H licence holder issued under the Atomic Energy Licensing Act 1984. There are a few computer applications (software) available in the market which can be used for this purpose. A computer application (software) using Visual Basic 6 and Microsoft Access is being developed to expedite data handling, analysis and storage as well as report writing of the quality control tests. In this paper important features of the software for quality control tests are explained in brief. A simple database is being established for this purpose which is linked to the software. Problems encountered in the preparation of the database are discussed in this paper. A few examples of practical usage of the software and database applications are presented in brief. (Author)

  19. TuBaFrost 5: multifunctional central database application for a European tumor bank.

    Science.gov (United States)

    Isabelle, M; Teodorovic, I; Morente, M M; Jaminé, D; Passioukov, A; Lejeune, S; Therasse, P; Dinjens, W N M; Oosterhuis, J W; Lam, K H; Oomen, M H A; Spatz, A; Ratcliffe, C; Knox, K; Mager, R; Kerr, D; Pezzella, F; van de Vijver, M; van Boven, H; Alonso, S; Kerjaschki, D; Pammer, J; Lopez-Guerrero, J A; Llombart Bosch, A; Carbone, A; Gloghini, A; van Veen, E-B; van Damme, B; Riegman, P H J

    2006-12-01

    Developing a tissue bank database has become more than just logically arranging data in tables combined with a search engine. Current demand for high quality samples and data, and the ever-changing legal and ethical regulations mean that the application must reflect TuBaFrost rules and protocols for the collection, exchange and use of tissue. To ensure continuation and extension of the TuBaFrost European tissue bank, the custodianship of the samples, and hence the decision over whether to issue samples to requestors, remains with the local collecting centre. The database application described in this article has been developed to facilitate this open structure virtual tissue bank model serving a large group. It encompasses many key tasks, without the requirement for personnel, hence minimising operational costs. The Internet-accessible database application enables search, selection and request submission for requestors, whereas collectors can upload and edit their collection. Communication between requestor and involved collectors is started with automatically generated e-mails. PMID:17029787

  20. La contrainte client

    Directory of Open Access Journals (Sweden)

    Guillaume Tiffon

    2011-04-01

    Full Text Available The client constraint: a comparative analysis of cashiers and physiotherapists. This article shows that despite the ambivalence of client contact, insofar as it is both a source of constraint and recognition, in some cases, such as that of cashiers, it is primarily a constraint: clients control the work that takes place "before their eyes", whereas in other cases, such as that of physiotherapists, it above all contributes to giving meaning to work and, thereby, to arousing the commitment of workers. The article highlights how the client constraint takes on different forms depending on the spatial and temporal configuration in which the service relation unfolds, and on the skills differential between the protagonists involved in this relation.

  1. Error adjustments for file linking methods using encrypted unique client identifier (eUCI) with application to recently released prisoners who are HIV+.

    Science.gov (United States)

    Gutman, R; Sammartino, C J; Green, T C; Montague, B T

    2016-01-15

    Incarceration provides an opportunity to test for HIV, provide treatment such as highly active anti-retroviral therapy, and link infected persons to comprehensive HIV care upon their release. A key factor in assessing the success of a program that links released individuals to care is the time from release to receiving care in the community (linkage time). To estimate the linkage time, records from correction systems are linked to Ryan White Clinic data using an encrypted Unique Client Identifier (eUCI). Most of the records linked using the eUCI belong to the same individual; however, in some cases it may link records incorrectly, or fail to identify records that should have been linked. We propose a Bayesian procedure that relies on the relationships between variables that appear in either of the data sources, as well as variables that exist in both, to identify correctly linked records among all linked records. The procedure generates K datasets in which each pair of linked records is identified as a true link or a false link. The K datasets are analyzed independently, and the results are combined using Rubin's multiple imputation rules. A small validation dataset is used to examine different statistical models and to inform the prior distributions of the parameters. In comparison with previously proposed methods, the proposed method utilizes all of the available data and is both flexible and computationally efficient. In addition, this approach can be applied in other file linking applications.
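    Rubin's multiple-imputation combining rules mentioned above have a simple closed form: the pooled point estimate is the mean of the K per-dataset estimates, and the total variance adds the average within-dataset variance to an inflated between-dataset variance. A minimal sketch of that pooling step, as an illustrative helper rather than the authors' code:

    ```python
    # Rubin's combining rules for K analyses of multiply-imputed (here,
    # multiply-linked) datasets. Illustrative helper, not the paper's code.
    import statistics

    def combine_rubin(estimates, variances):
        """Pool K per-dataset estimates and their within-dataset variances."""
        K = len(estimates)
        q_bar = sum(estimates) / K             # pooled point estimate
        w_bar = sum(variances) / K             # average within-dataset variance
        b = statistics.variance(estimates)     # between-dataset variance (n-1)
        t = w_bar + (1 + 1 / K) * b            # total variance of q_bar
        return q_bar, t
    ```

    The (1 + 1/K) factor inflates the between-dataset component to account for using a finite number of linked datasets; as K grows, the total variance approaches w_bar + b.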

  2. PmiRExAt: plant miRNA expression atlas database and web applications.

    Science.gov (United States)

    Gurjar, Anoop Kishor Singh; Panwar, Abhijeet Singh; Gupta, Rajinder; Mantri, Shrikant S

    2016-01-01

    High-throughput small RNA (sRNA) sequencing technology enables an entirely new perspective for plant microRNA (miRNA) research and has immense potential to unravel regulatory networks. Novel insights gained through data mining in the publically available rich resource of sRNA data will help in designing biotechnology-based approaches for crop improvement to enhance plant yield and nutritional value. Bioinformatics resources enabling meta-analysis of miRNA expression across multiple plant species are still evolving. Here, we report PmiRExAt, a new online database resource that caters a plant miRNA expression atlas. The web-based repository comprises miRNA expression profiles and a query tool for 1859 wheat, 2330 rice and 283 maize miRNAs. The database interface offers open and easy access to miRNA expression profiles and helps in identifying tissue-preferential, differential and constitutively expressing miRNAs. A feature enabling expression study of conserved miRNAs across multiple species is also implemented. A custom expression analysis feature enables expression analysis of novel miRNAs in a total of 117 datasets. New sRNA datasets can also be uploaded for analysing miRNA expression profiles for 73 plant species. The PmiRExAt application program interface, a Simple Object Access Protocol (SOAP) web service, allows other programmers to remotely invoke the methods written for performing programmatic search operations on the PmiRExAt database. Database URL: http://pmirexat.nabi.res.in. PMID:27081157

  3. The Development of Mobile Client Application in Yogyakarta Tourism and Culinary Information System Based on Social Media Integration

    Directory of Open Access Journals (Sweden)

    Novrian Fajar Hidayat

    2012-10-01

    Full Text Available Social networks are currently an important part of many people's lives. The large number of users makes social networks an effective publication channel. One of the many things that can be published on social networks is tourism. Indonesia has many tourism and culinary attractions, especially in the Special District of Yogyakarta. Tourism and culinary resources in Yogyakarta can be published and shared using social networks. In addition, the development of mobile technology and smartphones makes it easier to access social networks through the Internet. The release of Windows Phone 7 brought new color to the smartphone world. Windows Phone 7 comes with an elegant interface, Metro Style. Besides that, its standardized specification makes Windows Phone 7 suitable for integrating social networks with tourism and culinary information on the Special District of Yogyakarta. This research is expected to integrate social networks with tourism and culinary information on Yogyakarta. The method used in this research is the ICONIX method, one of the methods that combines waterfall and agile approaches. The results of this study take the form of an application that runs on Windows Phone 7 and consumes a web service. This application provides information, especially for tourists, so that they can easily find culinary and tourism attractions in Yogyakarta.

  4. Psychotherapy for Suicidal Clients.

    Science.gov (United States)

    Lester, David

    1994-01-01

    Reviews various systems of psychotherapy for suitability for suicidal clients. Discusses psychoanalysis, cognitive therapy, primal therapy, transactional analysis, Gestalt therapy, reality therapy, person-centered therapy, existential analysis, and Jungian analysis in light of available treatment options. Includes 36 citations. (Author/CRR)

  5. Training Evaluation: Clients' Roles.

    Science.gov (United States)

    Hashim, Junaidah

    2001-01-01

    A study was conducted of 262 training providers in Malaysia, where providers must be government approved. Government regulation, client demands for high quality, and an economic downturn that focused attention on training costs have all influenced evaluation in a positive direction. (SK)

  6. Extending Binary Large Object Support to Open Grid Services Architecture-Data Access and Integration Middleware Client Toolkit

    Directory of Open Access Journals (Sweden)

    Kiran K. Patnaik

    2011-01-01

    Full Text Available Problem statement: OGSA-DAI middleware allows data resources to be federated and accessed via web services on the web or within grids or clouds. It provides a client API for writing programs that access the exposed databases. Migrating existing applications to the new technology and using a new API to access the data of a DBMS with BLOBs is difficult and discouraging. A JDBC driver is a much more convenient alternative to the existing mechanism; it provides an extension to the OGSA-DAI middleware and allows applications to use databases exposed in a grid through OGSA-DAI 3.0. However, the driver does not support Binary Large Objects (BLOBs). Approach: The driver is enhanced to support BLOBs using the OGSA-DAI client API. It transforms the JDBC calls into an OGSA-DAI workflow request and sends it to the server using Web Services (WS). The client API of OGSA-DAI uses activities that are connected to form a workflow and executed using a pipeline. This workflow mechanism is embedded into the driver. The WS container dispatches the request to the OGSA-DAI middleware for processing and the result is then transformed back into an instance of a ResultSet implementation using the OGSA-DAI client API, before it is returned to the user. Results: Tests on handling of BLOBs (images, flash files and videos) ranging in size from 1 KB to 2 GB were carried out on Oracle, MySQL and PostgreSQL databases using our enhanced JDBC driver, and it performed well. Conclusion: The enhanced JDBC driver can now offer users with no experience in grid computing, specifically in OGSA-DAI, the possibility of giving their applications the ability to access databases exposed on the grid with minimal effort.
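    The driver's core move — translating a JDBC-style call into a pipeline of connected activities shipped to the server, then presenting the response as a ResultSet-like view — can be sketched as an adapter. The class, activity names and transport callable below are illustrative stand-ins, not the actual OGSA-DAI client API.

    ```python
    # Adapter sketch: a JDBC-like statement that builds an activity workflow
    # and ships it over a web-service transport. All names are hypothetical.
    class WorkflowStatement:
        def __init__(self, resource, transport):
            self.resource = resource      # id of the database exposed on the grid
            self.transport = transport    # callable that sends a workflow, returns rows

        def execute_query(self, sql):
            """Translate one query call into a pipeline of connected activities."""
            workflow = [
                {"activity": "SQLQuery", "resource": self.resource, "sql": sql},
                {"activity": "SerializeTuples"},       # would chunk BLOB columns
                {"activity": "DeliverToRequestor"},
            ]
            raw_rows = self.transport(workflow)        # web-service round trip
            return list(raw_rows)                      # ResultSet-like view
    ```

    In this shape, BLOB support lives in the serialization activity (chunking large values across messages) while the calling application keeps using the familiar execute-query-then-iterate pattern.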

  7. An Overview of the Principles of Database Protection and Its Applicable Laws

    Institute of Scientific and Technical Information of China (English)

    相丽玲; 黄富国

    2003-01-01

    At present, the principles of database protection are varied in developed countries. So are the applicable laws. This article summarizes these principles and applicable laws in an attempt to provide reference material for China in her legislation for databases.

  8. The Competitive Advantage: Client Service.

    Science.gov (United States)

    Leffel, Linda G.; DeBord, Karen B.

    The adult education literature contains a considerable amount of research on and discussion of client service in the marketing process, management and staff roles in service- and product-oriented businesses, and the importance of client service and service quality to survival in the marketplace. By applying the principles of client-oriented…

  9. Improvement of AMGA Python Client Library for Belle II Experiment

    Science.gov (United States)

    Kwak, Jae-Hyuck; Park, Geunchul; Huh, Taesang; Hwang, Soonwook

    2015-12-01

    This paper describes the recent improvement of the AMGA (ARDA Metadata Grid Application) python client library for the Belle II Experiment. We were drawn to the action items related to library improvement after in-depth discussions with the developer of the Belle II distributed computing system. The improvement includes client-side metadata federation support in python, DIRAC SSL library support as well as API refinement for synchronous operation. Some of the improvements have already been applied to the AMGA python client library as bundled with the Belle II distributed computing software. The recent mass Monte-Carlo (MC) production campaign shows that the AMGA python client library is reliably stable.

  10. An integrated intranet and dynamic database application for the Security Manager at Naval Postgraduate School

    OpenAIRE

    Perry, Sonja Michele

    2002-01-01

    Approved for public release; distribution is unlimited. This thesis presents an analysis, design and implementation of the Naval Postgraduate School's Sensitive Compartmented Information Facility (SCIF) consolidated Access database and website. The database was designed using a Microsoft Access 2000 relational database. This new database consolidates two previously separate personnel and classified inventories databases. The SCIF website was created utilizing Macromedia's Dreamweaver MX. A...

  11. ARQUITETURA E PROTOCOLO PARA APLICAÇÃO VISUAL THIN CLIENT

    Directory of Open Access Journals (Sweden)

    Vanius Roberto Bittencourt

    2011-12-01

    Full Text Available In this paper, different solutions to an increasingly common need today are studied: running visual thin-client applications. To address common problems, an open architecture and protocol for client-server applications with a visual thin client on Windows are proposed.

  12. Unpacking the Client(s): Constructions, Positions and Client-Consultant Dynamics

    OpenAIRE

    Alvesson, Mats; Kärreman, Dan; Sturdy, Andrew; Handley, Karen

    2006-01-01

    Increasing attention is being given to professional services in organisation and management theory. Whether the focus is on organisational forms or service processes such as knowledge transfer, the role of clients is often seen as central. However, typically, clients continue to be presented in a largely static, pre-structured and even monolithic way. While some recognition is given to the diversity of organisational clients and, to a lesser extent, individual clients, little attention has be...

  13. “科研在线”平台文档的移动客户端%Mobile Client Application of Document Based on Research Online

    Institute of Scientific and Technical Information of China (English)

    王巧; 郑依华; 南凯

    2013-01-01

    Cloud computing and mobile intelligent terminals have greatly changed people's lives and brought more convenience to collaborative work. The collaborative document library of the Research Online (科研在线) platform is a collaboration tool based on cloud storage, providing users with team-oriented document collaboration and management services. The work of this paper is to design and implement an iOS mobile client for the Research Online document library. The system functions were derived from an analysis of user scenarios, and the system framework was designed according to the characteristics of mobile applications. The paper describes the system implementation mainly from three key technical aspects: network programming, data caching and login authentication.

  14. Image storage, cataloguing and retrieval using a personal computer database software application

    International Nuclear Information System (INIS)

    Full text: Interesting images and cases are collected and collated by most nuclear medicine practitioners throughout the world. Changing imaging technology has altered the way in which images may be presented and are reported, with less reliance on 'hard copy' for both reporting and archiving purposes. Digital image generation and storage are rapidly replacing film in both radiological and nuclear medicine practice. An interesting-case filing system based on personal computer database software is described and demonstrated. The digital image storage format allows instant access to both case information (e.g. history and examination, scan report or teaching point) and the relevant images. The database design allows rapid selection of cases and images appropriate to a particular diagnosis, scan type, age or other search criteria. Correlative X-ray, CT, MRI and ultrasound images can also be stored and accessed. The application is in use at The New Children's Hospital as an aid to postgraduate medical education, with new cases being regularly added to the database

  15. Multi-center, multi-topic heart sound databases and their applications.

    Science.gov (United States)

    Xie, Meilan; Xiao, Shouzhong; Liu, Tianhu; Yi, Qijian; You, Fengzhi; Guo, Xingming; Shao, Yong; Huo, Junmimg; Du, Deqi; Xu, Dongmei; Wu, Wenzhu; Xiao, Zifu; Yang, Yong; Guo, Weizhen

    2012-02-01

    This paper describes a large resource of multi-center, multi-topic heart sound databases based on measured data from more than 9,000 heart sound samples (saved in WAV file format). According to the research topic, these samples were stored in different folders (corresponding to different research topics and distributed over the various cooperative research centers), most of which were stored as subfolders in a pooled folder at the principal center. The measured data from these samples were used to create different topic-specific databases, and relevant data for a specific topic can be pooled in a large database for further analysis. This resource is shared by members of the related centers for their own specific topics. Its applications include evaluation of the cardiac safety of pregnant women; evaluation of cardiac reserve for children, athletes, addicts, astronauts and general populations; and studies on a bedside method for evaluating cardiac energy, reversal of the S1-S2 ratio, etc.

  16. Relational database hybrid model, of high performance and storage capacity for nuclear engineering applications

    International Nuclear Information System (INIS)

    The objective of this work is to present the relational database named FALCAO, created and implemented to support the storage of the monitored variables in the IEA-R1 research reactor, located at the Instituto de Pesquisas Energeticas e Nucleares, IPEN/CNEN-SP. The logical data model and its direct influence on the integrity of the provided information are carefully considered. The concepts and steps of normalization and denormalization, including the entities and relations involved in the logical model, are presented, as are the effects of the model's rules on the acquisition, loading and availability of the final information from a performance standpoint, since the acquisition process loads and provides large amounts of information in short intervals of time. The SACD application, through its functionalities, presents the information stored in the FALCAO database in a practical and optimized form. The implementation of the FALCAO database was successful, and its existence leads to a considerably favorable situation. It is now essential to the routine of the researchers involved, due not only to the substantial improvement of the process but also to the reliability associated with it. (author)

  17. Architectural models for client interaction on service-oriented platforms

    NARCIS (Netherlands)

    Bonino da Silva Santos, L.O.; Ferreira Pires, L.; Sinderen, van M.J.; Sinderen, van M.J.

    2007-01-01

    Service-oriented platforms can provide different levels of functionality to the client applications as well as different interaction models. Depending on the platform’s goals and the computing capacity of their expected clients the platform functionality can range from just an interface to support t

  18. Application of embedded database to digital power supply system in HIRFL

    International Nuclear Information System (INIS)

    Background: This paper introduces the application of an embedded MySQL database in the real-time monitoring system of the digital power supply system at the Heavy Ion Research Facility in Lanzhou (HIRFL). Purpose: The aim is to optimize the real-time monitoring system of the digital power supply system for better performance. Methods: The MySQL database is designed and implemented under the Linux operating system running on an ARM processor, together with the related functions for real-time data monitoring, such as collection, storage and query. All status parameters of the digital power supply system are collected and communicated to the ARM by an FPGA, while the user interface is realized with Qt toolkits on the ARM end. Results: Actual operation indicates that the digital power supply can realize real-time data monitoring, collection, storage and related functions. Conclusion: Through practical application, we have found some aspects that can be improved, and we will try to optimize them in the future. (authors)

  19. Database Access through Java Technologies

    Directory of Open Access Journals (Sweden)

    Nicolae MERCIOIU

    2010-09-01

    Full Text Available As a high-level development environment, the Java technologies offer support for developing distributed, platform-independent applications, providing a robust set of methods to access databases and to create software components on the server side as well as on the client side. Analyzing the evolution of Java data-access tools, we notice that they evolved from simple methods permitting queries, insertion, update and deletion of data to advanced implementations such as distributed transactions, cursors and batch files. Client-server architectures allow, through JDBC (Java Database Connectivity), the execution of SQL (Structured Query Language) instructions and the manipulation of the results in an independent and consistent manner. The JDBC API (Application Programming Interface) creates the level of abstraction needed to allow SQL queries against any DBMS (Database Management System). The native JDBC driver, the ODBC (Open Database Connectivity)-JDBC bridge, and the classes and interfaces of the JDBC API are described. The four steps needed to build a JDBC-driven application are presented briefly, with emphasis on how each step is accomplished and the expected results. In each step, the characteristics of the database systems and the way the JDBC programming interface adapts to each one are evaluated. The data types provided by the SQL2 and SQL3 standards are analyzed in comparison with the Java data types, emphasizing the discrepancies between them, as well as the methods of the ResultSet object that allow conversion between different types of data. Next, starting from the role of metadata and studying the Java programming interfaces that allow querying result sets, we describe the advanced features of data mining with JDBC. As an alternative to result sets, the Rowsets add new functionalities that
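
    The four canonical steps of a JDBC-driven application (open a connection, create a statement, execute a query, walk the result set) have a close analogue in Python's DB-API. As an illustrative sketch, here they are with the built-in sqlite3 module standing in for a JDBC driver and DBMS; the table and data are invented for the demo:

    ```python
    import sqlite3

    # The four JDBC-style steps, shown with Python's DB-API and an
    # in-memory SQLite database as a stand-in for a JDBC driver + DBMS.

    # 1. "Load the driver" / open a connection
    conn = sqlite3.connect(":memory:")

    # 2. Create a statement (a cursor, in DB-API terms)
    cur = conn.cursor()
    cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    cur.executemany("INSERT INTO users (name) VALUES (?)", [("Ada",), ("Grace",)])

    # 3. Execute an SQL query
    cur.execute("SELECT id, name FROM users ORDER BY id")

    # 4. Walk the result set, converting SQL values to host-language types
    rows = [(row_id, name.upper()) for row_id, name in cur.fetchall()]
    print(rows)   # [(1, 'ADA'), (2, 'GRACE')]
    conn.close()
    ```

    Step 4 is where the type-mapping concerns discussed in the record surface: each SQL value must be converted into a host-language type, whether via JDBC's ResultSet getters or DB-API's automatic conversions.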

  20. In-database processing of a large collection of remote sensing data: applications and implementation

    Science.gov (United States)

    Kikhtenko, Vladimir; Mamash, Elena; Chubarov, Dmitri; Voronina, Polina

    2016-04-01

    Large archives of remote sensing data are now available to scientists, yet the need to work with individual satellite scenes or product files constrains studies that span a wide temporal range or spatial extent. The resources (storage capacity, computing power and network bandwidth) required for such studies are often beyond the capabilities of individual geoscientists. This problem has been tackled before in remote sensing research and has inspired several information systems. Some of them, such as NASA Giovanni [1] and Google Earth Engine, have already proved their utility for science. Analysis tasks involving large volumes of numerical data are not unique to the Earth Sciences. Recent advances in data science are enabled by the development of in-database processing engines that bring processing closer to storage, use declarative query languages to facilitate parallel scalability, and provide a high-level abstraction of the whole dataset. We build on the idea of bridging the gap between file archives containing remote sensing data and databases by integrating files into a relational database as foreign data sources and performing analytical processing inside the database engine. Thereby a higher-level query language can efficiently address problems of arbitrary size: from accessing the data associated with a specific pixel or grid cell to complex aggregation over spatial or temporal extents spanning a large number of individual data files. This approach was implemented using PostgreSQL for a Siberian regional archive of satellite data products holding hundreds of terabytes of measurements from multiple sensors and missions taken over a decade-long span. While preserving the original storage layout, and therefore compatibility with existing applications, the in-database processing engine provides a toolkit for provisioning remote sensing data in scientific workflows and applications.
The use of SQL - a widely used higher level declarative query language - simplifies interoperability
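
    The core idea, pushing aggregation into the database engine rather than looping over files in the client, can be illustrated with a small sketch. SQLite and invented measurement rows stand in here for PostgreSQL and real satellite products; once per-scene values are exposed as rows, one declarative query aggregates over an arbitrary extent:

    ```python
    import sqlite3

    # Sketch of in-database aggregation: per-scene measurements exposed as
    # rows, so a single SQL query covers any temporal/spatial extent.
    # Schema and values are illustrative only.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE scene (
        sensor TEXT, acquired TEXT, cell_id INTEGER, value REAL)""")
    conn.executemany(
        "INSERT INTO scene VALUES (?, ?, ?, ?)",
        [("modis", "2010-07-01", 42, 0.61),
         ("modis", "2011-07-01", 42, 0.58),
         ("modis", "2012-07-01", 42, 0.64),
         ("modis", "2010-07-01", 43, 0.40)])

    # Decade-span mean for one grid cell, computed inside the engine:
    (mean,) = conn.execute(
        """SELECT AVG(value) FROM scene
           WHERE cell_id = ? AND acquired BETWEEN ? AND ?""",
        (42, "2010-01-01", "2019-12-31")).fetchone()
    print(round(mean, 2))   # 0.61
    ```

    The client never touches the individual "scenes": the engine scans, filters and aggregates, which is exactly what makes the approach scale to archives of hundreds of terabytes.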

  1. Teaching job interview skills to retarded clients.

    OpenAIRE

    Hall, C.; Sheldon-Wildgen, J; Sherman, J. A.

    1980-01-01

    Six retarded adults were taught job application and interview skills including introducing oneself, filling out a standard job application form, answering questions, and asking questions. A combination of instructions, modeling, role playing, and positive and corrective feedback was used across a multiple baseline experimental design. After training, the clients' performance in each area improved substantially over baseline levels. In addition, the newly taught skills appeared to generalize t...

  2. Comprehensive database of human E3 ubiquitin ligases: application to aquaporin-2 regulation.

    Science.gov (United States)

    Medvar, Barbara; Raghuram, Viswanathan; Pisitkun, Trairak; Sarkar, Abhijit; Knepper, Mark A

    2016-07-01

    Aquaporin-2 (AQP2) is regulated in part via vasopressin-mediated changes in protein half-life that are in turn dependent on AQP2 ubiquitination. Here we addressed the question, "What E3 ubiquitin ligase is most likely to be responsible for AQP2 ubiquitination?" using large-scale data integration based on Bayes' rule. The first step was to bioinformatically identify all E3 ligase genes coded by the human genome. The 377 E3 ubiquitin ligases identified in the human genome, consisting predominantly of HECT, RING, and U-box proteins, have been used to create a publicly accessible and downloadable online database (https://hpcwebapps.cit.nih.gov/ESBL/Database/E3-ligases/). We also curated a second database of E3 ligase accessory proteins that included BTB domain proteins, cullins, SOCS-box proteins, and F-box proteins. Using Bayes' theorem to integrate information from multiple large-scale proteomic and transcriptomic datasets, we ranked these 377 E3 ligases with respect to their probability of interaction with AQP2. Application of Bayes' rule identified the E3 ligases most likely to interact with AQP2 as (in order of probability): NEDD4 and NEDD4L (tied for first), AMFR, STUB1, ITCH, ZFPL1. Significantly, the two E3 ligases tied for top rank have also been studied extensively in the reductionist literature as regulatory proteins in renal tubule epithelia. The concordance of conclusions from reductionist and systems-level data provides strong motivation for further studies of the roles of NEDD4 and NEDD4L in the regulation of AQP2 protein turnover. PMID:27199454
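
    The integration step can be illustrated with a toy version of Bayes' rule ranking: each dataset contributes a likelihood ratio for a candidate, and, assuming the datasets are conditionally independent given the hypothesis, the ratios multiply into posterior odds. The numbers below are invented for illustration and are not the study's data:

    ```python
    from math import prod

    # Toy Bayes' rule ranking in the spirit of the E3-ligase integration:
    # posterior odds = prior odds * product of per-dataset likelihood ratios.

    def posterior(prior, likelihood_ratios):
        """P(interaction | evidence), assuming conditionally
        independent datasets."""
        odds = prior / (1 - prior) * prod(likelihood_ratios)
        return odds / (1 + odds)

    prior = 1 / 377                    # any of the 377 ligases, a priori
    candidates = {
        "NEDD4": [12.0, 8.0, 5.0],     # strong signal in three datasets (invented)
        "ZFPL1": [3.0, 1.5, 1.0],
        "OTHER": [1.0, 1.0, 1.0],      # no evidence either way
    }
    ranked = sorted(candidates,
                    key=lambda c: posterior(prior, candidates[c]),
                    reverse=True)
    print(ranked)   # ['NEDD4', 'ZFPL1', 'OTHER']
    ```

    With a likelihood ratio of 1 in every dataset, the posterior stays at the prior; evidence in several independent datasets compounds multiplicatively, which is what lets a few large-scale datasets separate a handful of candidates from hundreds.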

  3. Knowledge Management and Database Marketing Applications = Bilgi Yönetimi ve Veritabanlı Pazarlama Uygulamaları

    Directory of Open Access Journals (Sweden)

    Erol EREN

    2004-01-01

    Full Text Available The aim of this article is to learn whether database marketing is used as a tool of knowledge management, and to investigate its applications among Turkish ready-to-wear retailers. Retailers who use knowledge technologies can collect and turn data into useful information and knowledge, and then use them in database marketing systems. This study investigates whether ready-to-wear retailers have such database marketing systems, and how they set up and operate these systems if they have one. There is no doubt that businesses in the information technologies sector would be interested in the topic and the results.

  4. Exploring the Ligand-Protein Networks in Traditional Chinese Medicine: Current Databases, Methods, and Applications

    Directory of Open Access Journals (Sweden)

    Mingzhu Zhao

    2013-01-01

    Full Text Available Traditional Chinese medicine (TCM), which has thousands of years of clinical application in China and other Asian countries, is the pioneer of the "multicomponent-multitarget" approach and network pharmacology. Although there is no doubt about its efficacy, it is difficult to elucidate a convincing underlying mechanism of TCM due to its complex composition and unclear pharmacology. The use of ligand-protein networks has been gaining significant value in the history of drug discovery, while its application in TCM is still at an early stage. This paper first surveys TCM databases for virtual screening, which have greatly expanded in size and data diversity in recent years. On that basis, different screening methods and strategies for identifying active ingredients and targets of TCM are outlined based on the amount of network information available, on both the ligand bioactivity and protein structure sides. Furthermore, applications of successful in silico target identification attempts are discussed in detail, along with experiments in exploring the ligand-protein networks of TCM. Finally, it is concluded that the prospective application of ligand-protein networks can be used not only to predict the protein targets of a small molecule, but also to explore the mode of action of TCM.

  5. Client/Server数据库结构、应用软件设计及其访问机制%Construction, Applications Design and Accessing Mechanism of Client/Server Database

    Institute of Scientific and Technical Information of China (English)

    黄丽霞

    2000-01-01

    This paper comprehensively discusses the development and main characteristics of client/server database system architecture, proposes the design tasks for a new generation of client-side and server-side database application software, and further explores various techniques and approaches for accessing databases.

  6. Applications of TsunAWI: Operational scenario database in Indonesia, case studies in Chile

    Science.gov (United States)

    Rakowsky, Natalja; Harig, Sven; Immerz, Antonia; Androsov, Alexey; Hiller, Wolfgang; Schröter, Jens

    2016-04-01

    The numerical simulation code TsunAWI was developed in the framework of the German-Indonesian Tsunami Early Warning System (GITEWS). The numerical simulation of prototypic tsunami scenarios plays a decisive role in the a priori risk assessment for coastal regions and in the early warning process itself. TsunAWI is suited for both tasks. It is based on a finite element discretisation, employs unstructured grids with high resolution along the coast, and includes inundation. This contribution presents two fields of application. In the Indonesian tsunami early warning system, the existing TsunAWI scenario database covers the Sunda subduction zone from Sumatra to the Lesser Sunda Islands with 715 epicenters and 4500 scenarios. In a collaboration with Geoscience Australia, we support the scientific staff at the Indonesian warning center in extending the database to the remaining tectonic zones in the Indonesian Archipelago. The extension has started with North Sulawesi and the West and East Maluku Islands. For the Hydrographic and Oceanographic Service of the Chilean Navy (SHOA), we calculated a small scenario database of 100 scenarios (sources by Universidad de Chile) for a lightweight decision support system prototype (built by DLR). The earthquake and tsunami events on 1 April 2014 and 16 November 2016 showed the practical use of this approach in comparison to hindcasts of these events.

  7. Analysis and Design of Soils and Terrain Digital Database (SOTER) Management System Based on Object-Oriented Method

    Institute of Scientific and Technical Information of China (English)

    ZHANG HAITAO; ZHOU YONG; R. V. BIRNIE; A. SIBBALD; REN YI

    2003-01-01

    A SOTER management system was developed through analysis, design, programming, testing and repeated iteration and refinement based on the object-oriented method. The attribute database management function is inherited and expanded in the new system, and the integrity and security of the SOTER database are enhanced. The attribute database management, the spatial database management and the model base are integrated into SOTER based on the Component Object Model (COM), and a graphical user interface (GUI) for Windows is used to interact with clients, making it easy to create and maintain the SOTER database and convenient to promote the quantification and automation of soil information application.

  8. Probabilistic Databases

    CERN Document Server

    Suciu, Dan; Koch, Christoph

    2011-01-01

    Probabilistic databases are databases where the value of some attributes or the presence of some records are uncertain and known only with some probability. Applications in many areas such as information extraction, RFID and scientific data management, data cleaning, data integration, and financial risk assessment produce large volumes of uncertain data, which are best modeled and processed by a probabilistic database. This book presents the state of the art in representation formalisms and query processing techniques for probabilistic data. It starts by discussing the basic principles for rep
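
    A minimal sketch of the simplest model treated in this literature, the tuple-independent database: each record carries a marginal probability of being present, and for a Boolean "does any tuple match?" query over independent tuples, the answer probability is 1 − ∏(1 − pᵢ) over the matching tuples. The RFID-style data below is invented for illustration:

    ```python
    # Tuple-independent probabilistic database: each tuple is present with
    # its own marginal probability, independently of the others.

    tuples = [
        {"tag": "RFID-17", "location": "dock", "p": 0.6},
        {"tag": "RFID-17", "location": "dock", "p": 0.5},
        {"tag": "RFID-99", "location": "yard", "p": 0.9},
    ]

    def prob_exists(rows, predicate):
        """P(at least one matching tuple is present) = 1 - prod(1 - p_i)."""
        miss = 1.0
        for row in rows:
            if predicate(row):
                miss *= 1.0 - row["p"]   # probability this tuple is absent
        return 1.0 - miss

    p = prob_exists(tuples, lambda r: r["location"] == "dock")
    print(round(p, 3))   # 0.8  (= 1 - 0.4 * 0.5)
    ```

    Query evaluation thus becomes probability computation; for more complex queries (joins, negation) the computation can become #P-hard, which is what makes query processing over probabilistic data a research topic in its own right.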

  9. DEVELOPMENT OF A DATABASE SECURITY POLICY METADATA MODEL AND ITS APPLICATION

    Directory of Open Access Journals (Sweden)

    Dilek TAPUCU ARPAÇAY

    2004-04-01

    Full Text Available Data stored in a database must be protected by pre-defined security policies designed for the system. The security policies must be defined in terms of the value of the information and the related risks. In this paper, two different security policies are described following the "Generally Accepted System Security Principles". For these policies, metadata and metamodels are created, and the metamodels are applied using XML Schema. The structure of the elements of each model is then presented. With the realized application, determining the organization's policy and rearranging it for changing conditions becomes easy. Thus, with changes made in the metamodels, the related policies are carried forward in the process, and the metamodels can be moved between different systems.

  10. 案主自决原则在临终关怀中的应用及伦理冲突%The Application and Ethical Conflicts in Client Self- determination Principle in Hospice Care

    Institute of Scientific and Technical Information of China (English)

    定光莉

    2011-01-01

    The modern medical model stresses humanistic concern for the dying person, and social workers play a critical role in hospice care. The client self-determination principle, one of the fundamental principles of social work services, fully reflects the combination of the hospice care purpose of "improving the quality of life" with the social work service attitude of "respecting the client's rights". This paper discusses the scope of application of the client self-determination principle in hospice care under China's specific national conditions, and analyzes three types of ethical conflicts that restrict its application: the conflict between concealing the patient's condition for medical protection and the disclosure principle of informed consent; the conflict between the traditional Chinese concept of filial piety and respect for the client's right to choose; and the conflict between a taboo attitude toward death and discussing death and making arrangements in advance. Combining social work cases, the paper analyzes these contradictions and seeks solutions. Social workers should use professional methods such as interviews, participant observation and discussion, together with practical skills of guidance, influence and support, to coordinate the client's wishes, the family's psychology and public opinion, and to grasp the proper scale of applying the client self-determination principle. At the same time, traditional conceptions of death should be changed in hospice care, and death education and ethics education should be improved.

  11. Research Study on the Migration of Clients on Banking Market

    Directory of Open Access Journals (Sweden)

    Cornelia Tureac

    2013-02-01

    Full Text Available In this paper we present the relevance and importance of knowing the reasons for clients' migration to competing banking institutions. The main reason for being the client of several banks is the fierce competition between credit institutions, which has changed the banking market. Based on a case study within Raiffeisen Bank, we researched and present the reasons for discontinuation of banking ties and the migration of clients to other banks. The research methodology consisted of a point-of-contact analysis conducted by sending out a questionnaire, through which 105 migrating clients were identified, of whom 89 were former clients of Raiffeisen Bank. Since both the specialized literature and practice offer very little information about the migration behavior of banking clients, especially in the category of small and medium enterprises, the present research was not limited to Raiffeisen Bank clients but covered all 105 respondents who discontinued their connection with the bank totally or partially. It can be concluded that the attitude of bank clients has a considerable influence on migration behavior. The most "unfaithful" banking clients fall into the category of "clients oriented towards the conditions."

  12. ASP Applications to Access SQL Server 2000 Database%应用ASP访问SQL Server 2000数据库

    Institute of Scientific and Technical Information of China (English)

    徐涌; 郑瑞银

    2011-01-01

    In building Web database sites, combining ASP technology with the SQL Server database makes database management more convenient and secure. How to use ASP to access a SQL Server 2000 database is therefore key to Web database site development. This paper first introduces the ADO components used by ASP to access a SQL Server 2000 database, and then describes the four main steps for establishing a connection between ASP and the SQL Server 2000 database.

  13. System analysis for the 300 MW nuclear power plant design database

    International Nuclear Information System (INIS)

    The structure of the design database system for the 300 MW nuclear power station, developed by the Shanghai Nuclear Engineering Research and Design Institute, is discussed. The system consists of an IBM RS/6000 workstation running ORACLE 7.0 as the server, networked through a TCP/IP Ethernet with a number of PCs. The application software of the database system, at both the server and the clients, features good data integrity, reliability, network transparency and a friendly user interface through the use of stored procedures, triggers and other new database technologies
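
    The role of triggers in enforcing integrity inside the database can be sketched as follows. SQLite is used here purely for a runnable demo; the Oracle 7.0 syntax used in the actual system differs, and the schema is invented:

    ```python
    import sqlite3

    # Integrity enforced in the database itself: a trigger rejects any
    # update that would lower a document's revision number, so no client
    # can bypass the rule.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE design_doc (
        doc_id   INTEGER PRIMARY KEY,
        revision INTEGER NOT NULL
    );
    CREATE TRIGGER no_revision_rollback
    BEFORE UPDATE OF revision ON design_doc
    WHEN NEW.revision < OLD.revision
    BEGIN
        SELECT RAISE(ABORT, 'revision cannot decrease');
    END;
    """)
    conn.execute("INSERT INTO design_doc VALUES (1, 3)")

    err = None
    try:
        conn.execute("UPDATE design_doc SET revision = 2 WHERE doc_id = 1")
    except sqlite3.IntegrityError as e:
        err = str(e)
    print(err)   # prints the trigger's abort message
    ```

    Because the rule lives server-side, every client (GUI, batch loader, ad-hoc SQL) is subject to it, which is precisely the data-integrity benefit the record attributes to stored procedures and triggers.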

  14. Secure thin client architecture for DICOM image analysis

    Science.gov (United States)

    Mogatala, Harsha V. R.; Gallet, Jacqueline

    2005-04-01

    This paper presents a concept of Secure Thin Client (STC) Architecture for Digital Imaging and Communications in Medicine (DICOM) image analysis over the Internet. The STC Architecture provides in-depth analysis and design of customized reports for DICOM images using drag-and-drop and data warehouse technology. Using a personal computer and a common set of browsing software, STC can be used for analyzing and reporting detailed patient information, type of examination, date, Computed Tomography (CT) dose index, and other relevant information stored within the image header files as well as in the hospital databases. The STC Architecture is a three-tier architecture. The first tier consists of a drag-and-drop web-based interface and web server, which provides customized analysis and reporting capabilities to users. The second tier consists of an online analytical processing (OLAP) server and database system, which serves fast, real-time, aggregated multi-dimensional data using OLAP technology. The third tier consists of a smart-algorithm-based software program which extracts DICOM tags from CT images in this particular application, irrespective of CT vendor, and transfers these tags into a secure database system. This architecture provides the Winnipeg Regional Health Authority (WRHA) with quality indicators for CT examinations in its hospitals. It also provides health care professionals with an analytical tool to optimize radiation dose and image quality parameters. The information is provided to the user via a secure socket layer (SSL) and role-based security criteria over the Internet. Although this particular application has been developed for WRHA, this paper also discusses the effort to extend the architecture to other hospitals in the region. Any DICOM tag from any imaging modality could be tracked with this software.

  15. Constructing Genograms with Lesbian Clients.

    Science.gov (United States)

    Magnuson, Sandy; And Others

    1995-01-01

    The intergenerational family therapy method of genogram construction may be a useful technique for increasing levels of differentiation in clients with a lesbian sexual orientation. Provides a description, rationale, and illustration of how genogram construction may be used by family counselors to treat lesbian clients. (JBJ)

  16. Group Work with Transgender Clients

    Science.gov (United States)

    Dickey, Lore M.; Loewy, Michael I.

    2010-01-01

    Drawing on the existing literature, the authors' research and clinical experiences, and the first author's personal journey as a member and leader of the transgender community, this article offers a brief history of group work with transgender clients followed by suggestions for group work with transgender clients from a social justice…

  17. Vocational Indecision and Rehabilitation Clients.

    Science.gov (United States)

    Strohmer, Douglas C.; And Others

    1984-01-01

    Assessed the vocational decision-making problems of rehabilitation clients (N=60). Revealed that decision-making problems of clients can be grouped into three areas: employment readiness, self-appraisal, and decision-making readiness. Suggested that vocationally decided and undecided subjects differ significantly in the extent to which they have…

  18. Design of the system of maintenance operations occupational safety and health database application of nuclear power station

    International Nuclear Information System (INIS)

    Based on the KKS codes of building equipment in nuclear power stations, this paper introduces a method of establishing an occupational safety and health database application system for maintenance operations. Through this application system, all kinds of hazard factors of maintenance operations in a nuclear power station can be systematically summarized, making it convenient for staff to learn the hazard factors of maintenance operations and the corresponding prevention measures, thereby achieving the management concept of 'precaution crucial, continuous improvement' advocated by OSHMS. (authors)

  19. Application of Google Maps API service for creating web map of information retrieved from CORINE land cover databases

    Directory of Open Access Journals (Sweden)

    Kilibarda Milan

    2010-01-01

    Full Text Available Today, the Google Maps API, an Ajax-based standard web service, enables users to publish interactive web maps, opening new possibilities relative to classical analogue maps. CORINE land cover databases are recognized as fundamental reference data sets for numerous spatial analyses. The theoretical and applied aspects of the Google Maps API cartographic service are considered for the case of creating a web map of changes in urban areas in Belgrade and its surroundings from 2000 to 2006, obtained from CORINE databases.

  20. Channel Access Client Toolbox for Matlab

    International Nuclear Information System (INIS)

    This paper reports on the MATLAB Channel Access (MCA) Toolbox, a MATLAB [1] interface to the EPICS Channel Access (CA) client library. We are developing the toolbox for SPEAR3 accelerator controls, but it is of general use for accelerator and experimental physics applications programming. It is packaged as a MATLAB toolbox to allow easy development of complex CA client applications entirely in MATLAB. The benefits include: the ability to calculate and display parameters that use EPICS process variables as inputs, the availability of MATLAB graphics tools for user interface design, and integration with the MATLAB-based accelerator modeling software, Accelerator Toolbox [2-4]. Another purpose of this paper is to propose a feasible path to a synergy between accelerator control systems and accelerator simulation codes, the idea known as the on-line accelerator model.

  1. Identifying Social Impacts in Product Supply Chains:Overview and Application of the Social Hotspot Database

    Directory of Open Access Journals (Sweden)

    Gregory Norris

    2012-08-01

    Full Text Available One emerging tool to measure the social-related impacts in supply chains is Social Life Cycle Assessment (S-LCA), a derivative of the well-established environmental LCA technique. LCA has recently started to gain popularity among large corporations and initiatives, such as The Sustainability Consortium or the Sustainable Apparel Coalition. Both have made the technique a cornerstone of their applied-research program. The Social Hotspots Database (SHDB) is an overarching, global database that eases the data collection burden in S-LCA studies. Proposed “hotspots” are production activities or unit processes (also defined as country-specific sectors) in the supply chain that may be at risk for social issues to be present. The SHDB enables efficient application of S-LCA by allowing users to prioritize production activities for which site-specific data collection is most desirable. Data for three criteria are used to inform prioritization: (1) labor intensity in worker hours per unit process, (2) risk for, or opportunity to affect, relevant social themes or sub-categories related to Human Rights, Labor Rights and Decent Work, Governance and Access to Community Services, and (3) gravity of a social issue. The Worker Hours Model was developed using a global input/output economic model and wage rate data. Nearly 200 reputable sources of statistical data have been used to develop 20 Social Theme Tables by country and sector. This paper presents an overview of the SHDB development and features, as well as results from a pilot study conducted on strawberry yogurt. This study, one of seven Social Scoping Assessments mandated by The Sustainability Consortium, identifies the potential social hotspots existing in the supply chain of strawberry yogurt. With this knowledge, companies that manufacture or sell yogurt can refine their data collection efforts in order to put their social responsibility performance in perspective and effectively set up programs and

  2. Java ME Clients for XML Web Services

    Directory of Open Access Journals (Sweden)

    Paul POCATILU

    2008-01-01

    Full Text Available Using Web services in developing applications has many advantages like the existence of standards, multiple software platforms that support them, and many areas of usage. These advantages derive from the XML and Web technologies. This paper describes the stages in the development of a Web service client for Java ME platform and presents examples based on kSOAP and JSR 172.
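
    The development stages described above reduce to constructing a SOAP envelope for the target operation and posting it to the service endpoint. As a language-neutral sketch of the envelope-building step (in Python rather than Java ME; the operation name `getQuote`, the parameter `symbol`, and the namespace `http://example.com/ws` are hypothetical, not taken from the paper):

```python
# Minimal sketch of the request side of a SOAP 1.1 web service client.
# The service namespace and operation name below are invented examples.
from xml.etree import ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, params, ns="http://example.com/ws"):
    """Build a SOAP 1.1 request envelope and return it as a string."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_NS)
    op = ET.SubElement(body, "{%s}%s" % (ns, operation))
    for name, value in params.items():
        arg = ET.SubElement(op, "{%s}%s" % (ns, name))
        arg.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

request = build_soap_request("getQuote", {"symbol": "XYZ"})
```

    A kSOAP client on Java ME performs the same construction with `SoapObject` and `SoapSerializationEnvelope`, then ships the envelope over `HttpTransport`.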

  3. Failure rates in Barsebaeck-1 reactor coolant pressure boundary piping. An application of a piping failure database

    Energy Technology Data Exchange (ETDEWEB)

    Lydell, B. [RSA Technologies, Vista, CA (United States)

    1999-05-01

    This report documents an application of a piping failure database to estimate the frequency of leak and rupture in reactor coolant pressure boundary piping. The study used Barsebaeck-1 as the reference plant and tried two different approaches to piping failure rate estimation: (1) PSA-style, simple estimation using Bayesian statistics, and (2) fitting of a statistical distribution to failure data. A large, validated database on piping failures (such as the SKI-PIPE database) supports both approaches. In addition to documenting leak and rupture frequencies, the SKI report describes the use of piping failure data to estimate the frequency of medium and large loss of coolant accidents (LOCAs). This application study was co-sponsored by Barsebaeck Kraft AB and SKI Research. 41 refs, figs, tabs
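
    The "PSA-style, simple estimation using Bayesian statistics" mentioned above can be sketched with a conjugate gamma prior on the failure rate, updated with a Poisson count of observed failures. The prior parameters and the data below are illustrative only, not values from the Barsebaeck-1 study:

```python
# Gamma-Poisson conjugate update for a pipe failure rate (per pipe-year).

def gamma_poisson_update(alpha, beta, failures, exposure_years):
    """Return posterior (alpha, beta) and the posterior mean failure rate
    for a gamma(alpha, beta) prior and Poisson-distributed failure counts."""
    post_alpha = alpha + failures
    post_beta = beta + exposure_years
    return post_alpha, post_beta, post_alpha / post_beta

# Diffuse (Jeffreys-like) prior; 2 leaks observed over 500 pipe-years.
a, b, rate = gamma_poisson_update(0.5, 0.0, 2, 500.0)
# Posterior mean rate = 2.5 / 500 = 5.0e-3 per pipe-year.
```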

  4. Dogmatism within the Counselor-Client Dyad

    Science.gov (United States)

    Tosi, Donald J.

    1970-01-01

    Different levels of counselor and client dogmatism combined additively in terms of their effect on client ratings of the relationship. Client ratings of the relationship were progressively higher as more openness occurred in the dyad. (Author)

  5. A Responsive Client for Distributed Visualization

    Science.gov (United States)

    Bollig, E. F.; Jensen, P. A.; Erlebacher, G.; Yuen, D. A.; Momsen, A. R.

    2006-12-01

    As grids, web services and distributed computing continue to gain popularity in the scientific community, demand for virtual laboratories likewise increases. Today organizations such as the Virtual Laboratory for Earth and Planetary Sciences (VLab) are dedicated to developing web-based portals to perform various simulations remotely while abstracting away details of the underlying computation. Two of the biggest challenges in portal-based computing are fast visualization and smooth interrogation without overtaxing client resources. In response to this challenge, we have expanded on our previous data storage strategy and thick-client visualization scheme [1] to develop a client-centric distributed application that utilizes remote visualization of large datasets and makes use of the local graphics processor for improved interactivity. Rather than waste precious client resources on visualization, a combination of 3D graphics and 2D server bitmaps is used to simulate the look and feel of local rendering. Java Web Start and Java Bindings for OpenGL enable install-on-demand functionality as well as low-level access to client graphics on all platforms. Powerful visualization services based on VTK and auto-generated by the WATT compiler [2] are accessible through a standard web API. Data is permanently stored on compute nodes while separate visualization nodes fetch data requested by clients, caching it locally to prevent unnecessary transfers. We will demonstrate application capabilities in the context of simulated charge density visualization within the VLab portal. In addition, we will address generalizations of our application to interact with a wider number of WATT services, and performance bottlenecks.
[1] Ananthuni, R., Karki, B.B., Bollig, E.F., da Silva, C.R.S., Erlebacher, G., "A Web-Based Visualization and Reposition Scheme for Scientific Data," In Press, Proceedings of the 2006 International Conference on Modeling Simulation and Visualization Methods (MSV

  6. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, installation of the designed database into the database system Oracle Database 10g Express Edition, and demonstration of administration tasks in this database system. The design was verified by means of a developed access application.

  7. Fine-grained policy control in U.S. Army Research Laboratory (ARL) multimodal signatures database

    Science.gov (United States)

    Bennett, Kelly; Grueneberg, Keith; Wood, David; Calo, Seraphin

    2014-06-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) consists of a number of co-located relational databases representing a collection of data from various sensors. Role-based access to this data is granted to external organizations such as DoD contractors and other government agencies through a client Web portal. In the current MMSDB system, access control is only at the database and firewall level. In order to offer finer-grained security, changes to existing user profile schemas and authentication mechanisms are usually needed. In this paper, we describe a software middleware architecture and implementation that allows fine-grained access control to the MMSDB at a dataset, table, and row level. Result sets from MMSDB queries issued in the client portal are filtered with the use of a policy enforcement proxy, with minimal changes to the existing client software and database. Before resulting data is returned to the client, policies are evaluated to determine if the user or role is authorized to access the data. Policies can be authored to filter data at the row, table or column level of a result set. The system uses various technologies developed in the International Technology Alliance in Network and Information Science (ITA) for policy-controlled information sharing and dissemination [1]. Use of the Policy Management Library provides a mechanism for the management and evaluation of policies to support finer-grained access to the data in the MMSDB system. The GaianDB is a policy-enabled, federated database that acts as a proxy between the client application and the MMSDB system.
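
    This is not the ITA Policy Management Library or GaianDB API, but the proxy idea above (evaluate policy, then filter the result set before it reaches the client) can be sketched as follows; the role names, policy table, and schema are invented for illustration:

```python
# Row-level policy filtering of a query result, in miniature (sqlite3, stdlib).
import sqlite3

# Hypothetical policy: which classification levels each role may see.
POLICY = {"contractor": {"unclassified"},
          "analyst": {"unclassified", "restricted"}}

def proxied_query(conn, role, sql):
    """Run a query, then drop rows whose 'level' column the role may not see."""
    allowed = POLICY.get(role, set())
    rows = conn.execute(sql).fetchall()
    return [r for r in rows if r[-1] in allowed]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE signatures (id INTEGER, sensor TEXT, level TEXT)")
conn.executemany("INSERT INTO signatures VALUES (?, ?, ?)",
                 [(1, "acoustic", "unclassified"),
                  (2, "seismic", "restricted")])
rows = proxied_query(conn, "contractor", "SELECT * FROM signatures")
```

    A production proxy would of course evaluate externally authored policies and could also project out forbidden columns, but the control point is the same: between the database and the client.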

  8. Students as Clients in a Professional/Client Relationship.

    Science.gov (United States)

    Bailey, Jeffrey J.

    2000-01-01

    Proposes the metaphor of professional/client rather than student-as-customer to characterize the relationship between professors and students. Uses examples of fitness trainer, management consultant, accounting service, and mountain guide to illustrate faculty and student roles. (SK)

  9. Risk taking among diabetic clients.

    Science.gov (United States)

    Joseph, D H; Schwartz-Barcott, D; Patterson, B

    1992-01-01

    Diabetic clients must make daily decisions about their health care needs. Observational and anecdotal evidence suggests that vast differences exist between the kinds of choices diabetic clients make and the kinds of chances they are willing to take. The purpose of this investigation was to develop a diabetic risk-assessment tool. This instrument, which is based on subjective expected utility theory, measures risk-prone and risk-averse behavior. Initial findings from a pilot study of 18 women clients who are on insulin indicate that patterns of risk behavior exist in the areas of exercise, skin care, and diet. PMID:1729123

  10. Client Mobile Software Design Principles for Mobile Learning Systems

    Directory of Open Access Journals (Sweden)

    Qing Tan

    2009-01-01

    Full Text Available In a client-server mobile learning system, client mobile software must run on the mobile phone to acquire, package, and send the student's interaction data via the mobile communications network to the connected mobile application server. The server receives and processes the client data in order to offer appropriate content and learning activities. To develop mobile learning systems, a number of very important issues must be addressed: mobile phones have scarce computing resources; they are heterogeneous devices running various mobile operating systems; they have limited user/device interaction capabilities and high data communications costs; and they must provide for device mobility and portability. In this paper we propose five principles for designing client mobile learning software. A location-based adaptive mobile learning system is presented as a proof of concept to demonstrate the applicability of these design principles.

  11. Web-Based Satellite Products Database for Meteorological and Climate Applications

    Science.gov (United States)

    Phan, Dung; Spangenberg, Douglas A.; Palikonda, Rabindra; Khaiyer, Mandana M.; Nordeen, Michele L.; Nguyen, Louis; Minnis, Patrick

    2004-01-01

    The need for ready access to satellite data and associated physical parameters such as cloud properties has been steadily growing. Air traffic management, weather forecasters, energy producers, and weather and climate researchers, among others, can utilize more satellite information than in the past. Thus, it is essential that such data are made available in near real-time and as archival products in an easy-access, user-friendly environment. A host of Internet web sites currently provide a variety of satellite products for various applications. Each site has a unique contribution with appeal to a particular segment of the public and scientific community. This is no less true for NASA Langley's Clouds and Radiation (NLCR) website (http://www-pm.larc.nasa.gov), which has been evolving over the past 10 years to support a variety of research projects. This website was originally developed to display cloud products derived from the Geostationary Operational Environmental Satellite (GOES) over the Southern Great Plains for the Atmospheric Radiation Measurement (ARM) Program. It has evolved into a site providing a comprehensive database of near real-time and historical satellite products used for meteorological, aviation, and climate studies. To encourage the user community to take advantage of the site, this paper summarizes the various products and projects supported by the website and discusses future options for new datasets.

  12. Database of open-framework aluminophosphate syntheses:introduction and application(Ⅰ)

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The database of open-framework aluminophosphate (AlPO) syntheses has been established, which includes about 1600 synthetic records. Data analysis has been done on the basis of the framework composition, structure dimension, pore ring, and organic template. This database will serve as useful guidance for the rational synthesis of microporous functional materials.

  13. Database of open-framework aluminophosphate syntheses:introduction and application (Ⅰ)

    Institute of Scientific and Technical Information of China (English)

    YAN Yan; LI JiYang; QI Miao; ZHANG Xiao; YU JiHong; XU RuRen

    2009-01-01

    The database of open-framework aluminophosphate (AlPO) syntheses has been established, which includes about 1600 synthetic records. Data analysis has been done on the basis of the framework composition, structure dimension, pore ring, and organic template. This database will serve as useful guidance for the rational synthesis of microporous functional materials.

  14. GenExp: an interactive web-based genomic DAS client with client-side data rendering.

    Directory of Open Access Journals (Sweden)

    Bernat Gel Moreno

    Full Text Available BACKGROUND: The Distributed Annotation System (DAS) offers a standard protocol for sharing and integrating annotations on biological sequences. There are more than 1000 DAS sources available and the number is steadily increasing. Clients are an essential part of the DAS system and integrate data from several independent sources in order to create a useful representation for the user. While web-based DAS clients exist, most of them do not have direct interaction capabilities such as dragging and zooming with the mouse. RESULTS: Here we present GenExp, a web-based and fully interactive visual DAS client. GenExp is a genome-oriented DAS client capable of creating informative representations of genomic data, zooming out from base level to complete chromosomes. It proposes a novel approach to genomic data rendering and uses the latest HTML5 web technologies to create the data representation inside the client browser. Thanks to client-side rendering, most position changes do not need a network request to the server, so responses to zooming and panning are almost immediate. In GenExp it is possible to explore the genome intuitively, moving it with the mouse just like in geographical map applications. Additionally, GenExp allows more than one data viewer at the same time and can save the current state of the application to revisit it later on. CONCLUSIONS: GenExp is a new interactive web-based client for DAS that addresses some of the shortcomings of the existing clients. It uses client-side data rendering techniques, resulting in easier genome browsing and exploration. GenExp is open source under the GPL license and it is freely available at http://gralggen.lsi.upc.edu/recerca/genexp.

  15. A fusion algorithm for joins based on collections in Odra-Object Database for Rapid Application development

    Directory of Open Access Journals (Sweden)

    Laika Satish

    2011-07-01

    Full Text Available In this paper we present the functionality of a database programming methodology currently under development called ODRA (Object Database for Rapid Application development), which works fully on object-oriented principles. Its database programming language is called SBQL (Stack-Based Query Language). We discuss some concepts in ODRA, e.g. how ODRA works, how the ODRA runtime environment operates, the interoperability of ODRA with .NET and Java, and a view of ODRA's working with web services and XML. Query optimization is currently under development in ODRA, so we present the prior work done in ODRA related to query optimization, and we also present a new fusion algorithm for how ODRA can deal with joins based on collections such as sets, lists, and arrays for query optimization.
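
    SBQL itself is not reproduced here; as a rough illustration of the kind of operation such a fusion algorithm targets, the following is a generic one-pass hash join over in-memory collections (the field names and data are invented):

```python
# Hash join over two collections (lists of dicts) on a common key.

def hash_join(left, right, key):
    """Build a hash index over `right`, then probe it once per `left` row."""
    index = {}
    for row in right:
        index.setdefault(row[key], []).append(row)
    # Merge each matching pair of rows into a single joined record.
    return [{**l, **r} for l in left for r in index.get(l[key], [])]

depts = [{"dept": 1, "name": "R&D"}, {"dept": 2, "name": "Sales"}]
emps = [{"dept": 1, "emp": "Ada"}, {"dept": 1, "emp": "Max"}]
joined = hash_join(emps, depts, "dept")
```

    An optimizer that fuses the index build and probe into one operator avoids materializing intermediate collections, which is the spirit of the join optimization discussed above.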

  16. PENGGUNAAN KONEKSI CORBA DENGAN PEMROGRAMAN MIDAS MULTI-TIER APPLICATION DALAM SISTEM RESERVASI HOTEL

    Directory of Open Access Journals (Sweden)

    Irwan Kristanto Julistiono

    2001-01-01

    Full Text Available This paper presents a multi-tier system using CORBA technology for a hotel reservation program, accessible both from a web browser and from a client program. Client software connects to the application server through a CORBA connection, and the application server connects to the SQL Server 7.0 database via ODBC. There are two types of client: a web client and a Delphi client. The web browser client application is built with Delphi ActiveX Form technology, in which the form is created like a regular form but integrates poorly with HTML. Beyond being easy to extend, a multi-tier application using CORBA generally has the advantage that the system can be designed with multiple database servers, multiple middle servers, and multiple clients, all of which can be integrated. The weakness of this approach is the complexity of CORBA, which makes it difficult to understand, while the multi-tier design itself needs a particular procedure to determine which server is chosen by the client.

  17. Nurse Interaction With Clients In Communication Therapeutic Study Analysis Of Symbolic Interactionism Hospital South Sulawesi

    Directory of Open Access Journals (Sweden)

    Hj.Indirawaty

    2015-08-01

    Full Text Available ABSTRACT This study aimed to describe briefly the social interaction applied by nurses toward clients while performing therapeutic communication at hospitals in South Sulawesi, within the frame of symbolic interactionism. Results were obtained on the system of nurse-client interaction patterned on therapeutic communication. In the pre-interaction stage, before interacting with the client, the nurse prepares by dressing appropriately, receiving the handover of nursing duties, and studying the status book of each client. In the introduction or orientation stage, the nurse visits each client and, at the first meeting, utters a greeting before asking about the client's condition; during the interaction the nurse uses verbal and non-verbal language and shows the client full hospitality and courtesy. In the working stage, the nurse evaluates and acts on the client's condition in accordance with the assigned task. In the termination stage, the nurse re-evaluates the client, summarizes the development of the client's condition, and reports to the doctor who handles the client. The analysis applies these four aspects using symbolic interactionism.

  18. Development of DQM software infrastructure: storing and reading the monitoring information from the histograms filled by online client applications into relational tables.

    CERN Document Server

    Andrzejczak, Adam

    2015-01-01

    In CMS, the online DQM stores the monitoring information from several heterogeneous data sources into histograms, which are later sent to the DQMGUI for visualization. A system for handling monitoring data is crucial for operating the detector and realizing whether or not it is undergoing failures; in particular, relational databases are currently the best option for hosting such data. In this context a new DQM plugin, DQMDatabaseWriter, was developed; it provides an interface which can be used in other DQM modules to write desired data into the relational database. In addition, a Python script provides the possibility to read and visualize already saved records.
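
    The DQMDatabaseWriter interface itself is not documented here, so the following is only a hedged sketch of the underlying idea: flattening a histogram into one relational row per bin and reading it back for visualization. The table and column names are invented:

```python
# Store histogram bins as relational rows and read them back (sqlite3, stdlib).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE histogram_bins (
    hist_name TEXT, bin_index INTEGER, bin_low REAL, content REAL)""")

def write_histogram(conn, name, edges, contents):
    """One INSERT per bin: (histogram name, bin index, lower edge, content)."""
    conn.executemany(
        "INSERT INTO histogram_bins VALUES (?, ?, ?, ?)",
        [(name, i, edges[i], c) for i, c in enumerate(contents)])

def read_histogram(conn, name):
    """Reassemble the bin contents in bin order."""
    cur = conn.execute(
        "SELECT content FROM histogram_bins WHERE hist_name=? ORDER BY bin_index",
        (name,))
    return [row[0] for row in cur]

write_histogram(conn, "pixel_occupancy", [0.0, 1.0, 2.0], [12.0, 7.0, 3.0])
contents = read_histogram(conn, "pixel_occupancy")
```

    Keying rows by histogram name (and, in practice, run/lumisection) lets a separate reader script query and plot any stored monitoring record.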

  19. Cloud Databases: A Paradigm Shift in Databases

    Directory of Open Access Journals (Sweden)

    Indu Arora

    2012-07-01

    Full Text Available Relational databases ruled the Information Technology (IT) industry for almost 40 years, but the last few years have seen sea changes in the way IT is used and viewed. Stand-alone applications have been replaced with web-based applications, dedicated servers with multiple distributed servers, and dedicated storage with network storage. Cloud computing has become a reality due to its lower cost, scalability and pay-as-you-go model. It is one of the biggest changes in IT since the rise of the World Wide Web. Cloud databases such as BigTable, Sherpa and SimpleDB are becoming popular. They address the limitations of existing relational databases related to scalability, ease of use and dynamic provisioning. Cloud databases are mainly used for data-intensive applications such as data warehousing, data mining and business intelligence. These applications are read-intensive, scalable and elastic in nature. Transactional data management applications such as banking, airline reservation, online e-commerce and supply chain management applications are write-intensive. Databases supporting such applications require ACID (Atomicity, Consistency, Isolation and Durability) properties, but these databases are difficult to deploy in the cloud. The goal of this paper is to review the state of the art in cloud databases and various architectures. It further assesses the challenges of developing cloud databases that meet user requirements and discusses popularly used cloud databases.
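
    The ACID requirement mentioned for transactional workloads can be shown in miniature with a stdlib sqlite3 example: an atomic transfer that rolls back entirely when a constraint fails. The schema and amounts are illustrative:

```python
# Atomicity demo: either both UPDATEs commit, or neither does.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts "
             "(id INTEGER PRIMARY KEY, balance REAL CHECK (balance >= 0))")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 50.0)])
conn.commit()

def transfer(conn, src, dst, amount):
    try:
        with conn:  # commits on success, rolls back on exception
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                         (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                         (amount, dst))
        return True
    except sqlite3.IntegrityError:  # CHECK constraint: overdraft rejected
        return False

ok = transfer(conn, 1, 2, 500.0)  # would overdraw account 1, so rolled back
balances = [r[0] for r in conn.execute("SELECT balance FROM accounts ORDER BY id")]
```

    Eventually consistent cloud stores relax exactly this guarantee, which is why write-intensive transactional applications are hard to move to them.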

  20. Assessment of a enhanced ResultSet component for accessing relational databases

    OpenAIRE

    Pereira, Óscar M.; Aguiar, Rui L.; Santos, Maribel Yasmina

    2010-01-01

    Call Level Interfaces (CLI) provide services aimed at easing the integration of database components and components from client applications. CLI support native SQL statements, thereby keeping the expressiveness and performance of SQL. Thus, they cannot be discarded as a valid option whenever SQL expressiveness and SQL performance are considered key requirements. Despite the aforementioned performance advantage, CLI do not comprise other important performance features, as c...

  1. Client - server programs analysis in the EPOCA environment

    Science.gov (United States)

    Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano

    1996-09-01

    Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer has first to ensure that the implementation behaves correctly, in particular that it is deadlock free. Second, he has to guarantee that the program meets predefined performance requirements. This paper addresses the issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.

  2. The South African National Vegetation Database: History, development, applications, problems and future

    Directory of Open Access Journals (Sweden)

    Leslie W. Powrie

    2012-01-01

    Full Text Available Southern Africa has been recognised as one of the most interesting and important areas of the world from an ecological and evolutionary point of view. The establishment and development of the National Vegetation Database (NVD of South Africa enabled South Africa to contribute to environmental planning and conservation management in this floristically unique region. In this paper, we aim to provide an update on the development of the NVD since it was last described, near its inception, more than a decade ago. The NVD was developed using the Turboveg software environment, and currently comprises 46 697 vegetation plots (relevés sharing 11 690 plant taxa and containing 968 943 species occurrence records. The NVD was primarily founded to serve vegetation classification and mapping goals but soon became recognised as an important tool in conservation assessment and target setting. The NVD has directly helped produce the National Vegetation Map, National Forest Type Classification, South African National Biodiversity Assessment and Forest Type Conservation Assessment. With further development of the NVD and more consistent handling of the legacy data (old data sets, the current limitations regarding certain types of application of the data should be significantly reduced. However, the use of the current NVD in multidisciplinary research has certainly not been fully explored. With the availability of new pools of well-trained vegetation surveyors, the NVD will continue to be purpose driven and serve the needs of biological survey in pursuit of sustainable use of the vegetation and flora resources of the southern African subcontinent.

  3. Encrypting Analytical Web Applications

    OpenAIRE

    Fuhry, Benny; Tighzert, Walter; Kerschbaum, Florian

    2016-01-01

    The software-as-a-service (SaaS) market is growing very fast, but still many clients are concerned about the confidentiality of their data in the cloud. Motivated hackers or malicious insiders could try to steal the clients’ data. Encryption is a potential solution, but supporting the necessary functionality also in existing applications is difficult. In this paper, we examine encrypting analytical web applications that perform extensive number processing operations in the database. Existing ...

  4. Research on Oracle Database Application Security Issues (Oracle数据库应用中安全问题研究)

    Institute of Scientific and Technical Information of China (English)

    姚树春

    2014-01-01

    The Oracle database is the most direct way for enterprises to share resources, and its security has become one of the key considerations for today's enterprises. Although database systems bring people many conveniences in handling data, they also introduce many security risks. To ensure that database system management is secure and reliable in application, this paper studies measures for addressing Oracle database application security issues.

  5. Can we build loyalty among initially dissatisfied clients? (Podemos fidelizar clientes inicialmente insatisfechos)

    Directory of Open Access Journals (Sweden)

    Jesús Cambra-Fierro

    2011-01-01

    Full Text Available The relational paradigm, dominant in marketing, advocates establishing and developing lasting relationships with clients. This requires knowing their needs and striving to satisfy them. Clients want to feel important, so firms should be concerned not only with selling but also with knowing their real index of satisfaction/dissatisfaction. From a logical point of view this should be the pattern of business behavior, as the works of Barroso (2008) and Coca (2008) indicate. But reality shows that this is not always so: although clients always wish to feel attended to, some firms seem to forget this basic premise and nevertheless obtain positive results. This work aims to analyze the possible contribution of service recovery processes to client/user loyalty. We take as reference the concept of service recovery processes and study the mobile telephony sector in Spain. Through descriptive statistics and the Partial Least Squares (PLS) technique, we conclude that firms behave in a manner opposite to what clients expect and are not really concerned with winning back their satisfaction. Nevertheless, users' opinions are very revealing and suggest that it is possible to turn an initially dissatisfied client into a loyal one.

  6. The ARAC client system: network-based access to ARAC

    International Nuclear Information System (INIS)

    The ARAC Client System allows users (such as emergency managers and first responders) with commonly available desktop and laptop computers to utilize the central ARAC system over the Internet or any other communications link using Internet protocols. Providing cost-effective, fast access to the central ARAC system greatly expands the availability of the ARAC capability. The ARAC Client system consists of (1) local client applications running on the remote user's computer, and (2) ''site servers'' that provide secure access to selected central ARAC system capabilities and run on a scalable number of dedicated workstations residing at the central facility. The remote client applications allow users to describe a real or potential chem-bio event, electronically send this information to the central ARAC system, which performs model calculations, and quickly receive and visualize the resulting graphical products. The site servers will support simultaneous access to ARAC capabilities by multiple users. The ARAC Client system is based on object-oriented client/server and distributed computing technologies using CORBA and Java, and consists of a large number of interacting components.

  7. Perspectives on a Big Data Application: What Database Engineers and IT Students Need to Know

    OpenAIRE

    E. Erturk; Jyoti, K.

    2015-01-01

    Cloud Computing and Big Data are important and related current trends in the world of information technology. They will have significant impact on the curricula of computer engineering and information systems at universities and higher education institutions. Learning about big data is useful for both working database professionals and students, in accordance with the increase in jobs requiring these skills. It is also important to address a broad gamut of database engineering skills, i.e. da...

  8. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. Discussion focuses on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, the nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy

  9. KALIMER database development

    International Nuclear Information System (INIS)

    KALIMER database is an advanced database for integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R and D programme. IOC is a linkage control system between sub-projects, used to share and integrate the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. And the reserved documents database is developed to manage the documents and reports produced since project accomplishment

  10. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    KALIMER database is an advanced database for integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R and D programme. IOC is a linkage control system between sub-projects, used to share and integrate the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. And the reserved documents database is developed to manage the documents and reports produced since project accomplishment.

  11. What Makes Underwriting and Non-Underwriting Clients of Brokerage Firms Receive Different Recommendations? An Application of Uplift Random Forest Model

    Directory of Open Access Journals (Sweden)

    Shaowen Hua

    2016-04-01

    Full Text Available I explore company characteristics which explain the difference in analysts’ recommendations for companies that were underwritten (affiliated) versus non-underwritten (unaffiliated) by analysts’ brokerage firms. Prior literature documents that analysts issue more optimistic recommendations to underwriting clients of their brokerage employers. Extant research uses regression models to find general associations between recommendations and financial qualities of companies, with or without an underwriting relationship. However, regression models cannot identify the qualities that cause the most difference in recommendations between affiliated and unaffiliated companies. I adopt the uplift random forest model, a popular technique in recent marketing and healthcare research, to identify the type of companies that earn analysts’ favor. I find that companies with stable past earnings, higher book-to-market ratios, smaller sizes, worsened earnings, and lower forward PE ratios are likely to receive higher recommendations if they are affiliated with analysts than if they are unaffiliated. With the uplift random forest model, I show that analysts pay more attention to price-related than to earnings-related metrics when they value affiliated versus unaffiliated companies. This paper contributes to the literature by introducing an effective predictive model to capital market research and shedding additional light on the usefulness of analysts’ reports.
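    The uplift idea behind the model can be shown at its simplest: per-segment uplift is the difference in response rates between the treated (affiliated) and control (unaffiliated) groups. A minimal sketch with invented toy records (a real uplift random forest fits many randomized trees over such splits rather than computing them directly):

    ```python
    from collections import defaultdict

    # Toy records: (segment, affiliated_with_analyst, received_high_recommendation).
    # Segments and outcomes are invented for illustration only.
    records = [
        ("small_cap", True, 1), ("small_cap", True, 1),
        ("small_cap", False, 0), ("small_cap", False, 1),
        ("large_cap", True, 1), ("large_cap", True, 0),
        ("large_cap", False, 1), ("large_cap", False, 1),
    ]

    def uplift_by_segment(rows):
        # counts[segment][treated] = [sum of outcomes, number of rows]
        counts = defaultdict(lambda: {True: [0, 0], False: [0, 0]})
        for segment, treated, outcome in rows:
            counts[segment][treated][0] += outcome
            counts[segment][treated][1] += 1
        # Uplift = P(high rec | affiliated) - P(high rec | unaffiliated).
        return {seg: c[True][0] / c[True][1] - c[False][0] / c[False][1]
                for seg, c in counts.items()}

    print(uplift_by_segment(records))  # {'small_cap': 0.5, 'large_cap': -0.5}
    ```

    Segments with large positive uplift are exactly the "types of companies that earn analysts' favor" the paper sets out to identify.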

  12. GRAPH DATABASES AND GRAPH VIZUALIZATION

    OpenAIRE

    Klančar, Jure

    2013-01-01

    The thesis presents graph databases. Graph databases are a part of NoSQL databases, which is why this thesis presents the basics of NoSQL databases as well. We have focused on the advantages of graph databases compared to relational databases. We have used one of the native graph databases (Neo4j) to present the processing of graph databases in more detail. To get more acquainted with graph databases and their principles, we developed a simple application that uses a Neo4j graph database to...
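    The core advantage the thesis compares, traversing relationships by following direct pointers instead of join tables, can be sketched with a plain adjacency map standing in for Neo4j's native graph storage; the social-network data is invented:

    ```python
    from collections import deque

    # A property-graph-style adjacency map (what a native graph database
    # stores directly); nodes and edges are invented for illustration.
    graph = {
        "alice": ["bob", "carol"],
        "bob": ["dave"],
        "carol": ["dave"],
        "dave": [],
    }

    def shortest_hops(start, goal):
        """BFS: each hop is a constant-time neighbour lookup, not a join."""
        seen, queue = {start}, deque([(start, 0)])
        while queue:
            node, dist = queue.popleft()
            if node == goal:
                return dist
            for nxt in graph[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, dist + 1))
        return None  # unreachable

    print(shortest_hops("alice", "dave"))  # 2
    ```

    In a relational schema the same query needs one self-join per hop, which is why variable-length path queries are the standard showcase for graph databases.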

  13. The applicability of public LCI databases in the framework of an integrated product policy in the area of electronics industry - a case study with the Swiss database ''ecoinvent''

    Energy Technology Data Exchange (ETDEWEB)

    Hischier, R.; Lehmann, M. [Swiss Federal Labs. for Materials Testing and Research, Technology and Society Lab, St. Gallen (Switzerland)

    2004-07-01

    Within the last couple of years, several initiatives for the creation of national Life-Cycle Inventory (LCI) databases have been taken - in Europe e.g. in Germany, in Denmark, in Sweden and in Switzerland. This presentation describes the content of such national LCI databases from the viewpoint of the electronics industry and shows its crucial importance in the framework of the application of the integrated product policy (IPP) to this sector. (orig.)

  14. Turkish Cloud-Radiation Database (CRD) and Its Application with CDR Bayesian Probability Algorithm

    Science.gov (United States)

    Oztopal, A.; Mugnai, A.; Casella, D.; Formenton, M.; Sano, P.; Sonmez, I.; Sen, Z.; Hsaf Team

    2010-12-01

    It is a very difficult task to determine ground rainfall amounts from the few Special Sensor Microwave Imager/Sounder (SSMI/S) channels. Although ground rainfall cannot be observed from space directly, knowledge of cloud physics helps to estimate the amount of ground rainfall. SSMI/S carries much information about the atmospheric structure, but it cannot provide cloud microphysical structural information. In such a situation the rainfall algorithm must incorporate, besides the SSMI/S data, cloud microphysical properties from an external data source. These properties can be obtained quite simply with the help of a Cloud Resolving Model (CRM). The microphysical properties, passed through a Radiative Transfer Model (RTM), then help to determine the SSMI/S brightness temperatures (TBs), which can be correlated to generate the Cloud-Radiation Database (CRD). SSMI/S satellite data and the CRD provide a common basis for the rainfall prediction procedure through the CRD Bayesian probability algorithm, which combines the two sets of data. To build the CRD used in this work, the University of Wisconsin Non-hydrostatic Modeling System (UW-NMS) cloud-resolving model, first developed by Prof. Gregory J. Tripoli, is employed; radar network data from the Turkish Meteorological Service are also used, and 14 simulations are realized in this study. Moreover, one case study is carried out using a 3x3 spatial filter, and the radar data and the result of the CRD Bayesian probability algorithm are compared with each other. For the rainfall event of 9 September 2009 at 03:40 GMT over a comparatively flat area, the retrieval values match far better, and hence the spatial rainfall occurrence extent and

  15. Literature Review and Database of Relations Between Salinity and Aquatic Biota: Applications to Bowdoin National Wildlife Refuge, Montana

    Science.gov (United States)

    Gleason, Robert A.; Tangen, Brian A.; Laubhan, Murray K.; Finocchiaro, Raymond G.; Stamm, John F.

    2009-01-01

    Long-term accumulation of salts in wetlands at Bowdoin National Wildlife Refuge (NWR), Mont., has raised concern among wetland managers that increasing salinity may threaten plant and invertebrate communities that provide important habitat and food resources for migratory waterfowl. Currently, the U.S. Fish and Wildlife Service (USFWS) is evaluating various water management strategies to help maintain suitable ranges of salinity to sustain plant and invertebrate resources of importance to wildlife. To support this evaluation, the USFWS requested that the U.S. Geological Survey (USGS) provide information on salinity ranges of water and soil for common plants and invertebrates on Bowdoin NWR lands. To address this need, we conducted a search of the literature on occurrences of plants and invertebrates in relation to salinity and pH of the water and soil. The compiled literature was used to (1) provide a general overview of salinity concepts, (2) document published tolerances and adaptations of biota to salinity, (3) develop databases that the USFWS can use to summarize the range of reported salinity values associated with plant and invertebrate taxa, and (4) perform database summaries that describe reported salinity ranges associated with plants and invertebrates at Bowdoin NWR. The purpose of this report is to synthesize information to facilitate a better understanding of the ecological relations between salinity and flora and fauna when developing wetland management strategies. A primary focus of this report is to provide information to help evaluate and address salinity issues at Bowdoin NWR; however, the accompanying databases, as well as concepts and information discussed, are applicable to other areas or refuges. The accompanying databases include salinity values reported for 411 plant taxa and 330 invertebrate taxa. 
The databases are available in Microsoft Excel version 2007 (http://pubs.usgs.gov/sir/2009/5098/downloads/databases_21april2009.xls) and contain
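    A minimal sketch of the kind of range summary such a salinity database supports; the taxa and tolerance ranges (in ppt) below are invented for illustration, not values from the USGS databases:

    ```python
    # Illustrative records in the shape of a taxa/salinity-range table.
    salinity_db = [
        {"taxon": "Typha latifolia", "min_ppt": 0.0, "max_ppt": 4.0},
        {"taxon": "Ruppia maritima", "min_ppt": 5.0, "max_ppt": 45.0},
        {"taxon": "Chironomus sp.", "min_ppt": 0.0, "max_ppt": 20.0},
    ]

    def taxa_tolerating(salinity_ppt):
        """Return taxa whose reported salinity range includes the given value."""
        return [r["taxon"] for r in salinity_db
                if r["min_ppt"] <= salinity_ppt <= r["max_ppt"]]

    print(taxa_tolerating(10.0))  # ['Ruppia maritima', 'Chironomus sp.']
    ```

    Queries of this form let a refuge manager ask which plants and invertebrates can be expected to persist under a proposed water-management salinity target.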

  16. Bringing the client back in

    DEFF Research Database (Denmark)

    Danneris, Sophie; Nielsen, Mathias Herup

    2016-01-01

    Categorising the ‘job readiness’ of the unemployed client is a task of utmost importance for active labour market policies. Scholarly attention on the topic has mostly focused on either questions of political legitimacy or questions of how categories are practically negotiated in meetings between… the well-known poststructuralist risk of reducing welfare clients to mere formable objects. Furthermore, the analysis presents a critical view of current categorisation practices, as it strongly and in great detail exemplifies what current government rhetoric fails to address.

  17. Application of Oracle Database to the Radiation Monitoring System%Oracle 数据库在园区辐射监测系统中的应用

    Institute of Scientific and Technical Information of China (English)

    雷蕾; 徐海霞; 韩利峰; 瞿叶玺; 洪鹏飞; 陈永忠; 李勇平

    2014-01-01

    The software architecture of the archiving and alarm subsystems of the online radiation monitoring system (MRP) of the thorium-based molten salt reactor (TMSR) project campus is briefly described, with emphasis on the application of the Oracle database. The system uses the EPICS software framework for its display, archiving, and alarm functions, realizing decentralized control and centralized management of radiation monitoring information. The Oracle database performs data acquisition and storage in the archiving and alarm modules, providing unified management of the system's background data. On this basis, fast queries of historical data, prompt localisation of alarms, and browsing of alarm history are implemented: through the CSS client GUI, users can browse history data easily, locate alarm sites quickly, and review the alarm history. The application indicates that the Oracle-based scheme meets the radiation monitoring system's requirements for data query speed and alarm localisation speed.
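    A rough sketch of the archive-and-alarm query pattern described above, with SQLite standing in for Oracle; the table, station names, and readings are invented:

    ```python
    import sqlite3

    # In-memory SQLite stands in for the Oracle archive; schema is invented.
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE archive (
        station TEXT, ts TEXT, dose_rate REAL, alarm INTEGER)""")
    readings = [
        ("MRP-01", "2014-01-01T00:00", 0.12, 0),
        ("MRP-01", "2014-01-01T00:10", 2.50, 1),
        ("MRP-02", "2014-01-01T00:10", 0.09, 0),
    ]
    con.executemany("INSERT INTO archive VALUES (?,?,?,?)", readings)

    # Alarm localisation: which stations raised alarms, newest first.
    rows = con.execute(
        "SELECT station, ts, dose_rate FROM archive "
        "WHERE alarm = 1 ORDER BY ts DESC").fetchall()
    print(rows)  # [('MRP-01', '2014-01-01T00:10', 2.5)]
    ```

    An index on `(alarm, ts)` would be the usual next step to keep alarm localisation fast as the archive grows.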

  18. Clients of SSA Net-Ready Data

    Science.gov (United States)

    Richmond, D.; Bicker, S.; Malloy, B.

    2014-09-01

    Multiple Net-Centric approaches have been developed to expose optical and radar sensor data. Client applications have been developed by NASIC and JMS to ingest and process these data. The data flows and formats used to integrate these Net-Centric approaches with JMS and NASIC will be presented. Examples of collected data and of the resulting SSA benefits will be discussed. Potential future improvements to increase the precision of the SOI processing algorithm will be addressed. Specifics regarding the process to gain access to the N-CSDS GEODSS sensor data in near real time will be identified.

  19. Teaching Three-Dimensional Structural Chemistry Using Crystal Structure Databases. 3. The Cambridge Structural Database System: Information Content and Access Software in Educational Applications

    Science.gov (United States)

    Battle, Gary M.; Allen, Frank H.; Ferrence, Gregory M.

    2011-01-01

    Parts 1 and 2 of this series described the educational value of experimental three-dimensional (3D) chemical structures determined by X-ray crystallography and retrieved from the crystallographic databases. In part 1, we described the information content of the Cambridge Structural Database (CSD) and discussed a representative teaching subset of…

  20. The construction and application of the AMSR-E global microwave emissivity database

    Science.gov (United States)

    Lijuan, Shi; Yubao, Qiu; Jingjing, Niu; Wenbo, Wu

    2014-03-01

    Land surface microwave emissivity is an important parameter for describing the characteristics of terrestrial microwave radiation, and is a necessary input for the inversion of various geophysical parameters. We use brightness temperatures from the Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E), together with synchronous land surface temperature and atmospheric temperature-humidity profile data obtained from MODIS, which flies aboard the same Aqua satellite as AMSR-E, to retrieve microwave emissivity under clear-sky conditions. After quality control, evaluation and design, the global AMSR-E microwave emissivity database under clear-sky conditions is established. The database covers 2002-2011 and includes different regions, different surface coverage, dual polarization, the 6.9, 10.65, 18.7, 23.8, 36.5 and 89 GHz channels, ascending and descending orbits, 25 km spatial resolution, a global 0.05 degree grid, and instantaneous and half-month averaged emissivity data. The database can provide underlying surface information for precipitation algorithms, water-vapor algorithms, and large-scale models (e.g., General Circulation Models, GCMs). It also provides underlying surface information for satellite simulators, and provides basic prior knowledge of land surface radiation for the design of future satellite sensors. The emissivity database, and the fast emissivity estimates obtained from it, can serve climate modelling, energy balance studies, data assimilation, geophysical model simulation, and the inversion and estimation of physical parameters under cloud cover conditions.

  1. Application of Knowledge Discovery in Databases Methodologies for Predictive Models for Pregnancy Adverse Events

    Science.gov (United States)

    Taft, Laritza M.

    2010-01-01

    In its report "To Err is Human", The Institute of Medicine recommended the implementation of internal and external voluntary and mandatory automatic reporting systems to increase detection of adverse events. Knowledge Discovery in Databases (KDD) allows the detection of patterns and trends that would be hidden or less detectable if analyzed by…

  2. The construction and application of the AMSR-E global microwave emissivity database

    International Nuclear Information System (INIS)

    Land surface microwave emissivity is an important parameter for describing the characteristics of terrestrial microwave radiation, and is a necessary input for the inversion of various geophysical parameters. We use brightness temperatures from the Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E), together with synchronous land surface temperature and atmospheric temperature-humidity profile data obtained from MODIS, which flies aboard the same Aqua satellite as AMSR-E, to retrieve microwave emissivity under clear-sky conditions. After quality control, evaluation and design, the global AMSR-E microwave emissivity database under clear-sky conditions is established. The database covers 2002–2011 and includes different regions, different surface coverage, dual polarization, the 6.9, 10.65, 18.7, 23.8, 36.5 and 89 GHz channels, ascending and descending orbits, 25 km spatial resolution, a global 0.05 degree grid, and instantaneous and half-month averaged emissivity data. The database can provide underlying surface information for precipitation algorithms, water-vapor algorithms, and large-scale models (e.g., General Circulation Models, GCMs). It also provides underlying surface information for satellite simulators, and provides basic prior knowledge of land surface radiation for the design of future satellite sensors. The emissivity database, and the fast emissivity estimates obtained from it, can serve climate modelling, energy balance studies, data assimilation, geophysical model simulation, and the inversion and estimation of physical parameters under cloud cover conditions

  3. Design of remote weather monitor system based on embedded web database

    International Nuclear Information System (INIS)

    The remote weather monitoring system is designed by employing embedded Web database technology with the S3C2410 microprocessor as its core. The monitoring system can simultaneously monitor multi-channel sensor signals, and can give a dynamic Web page display of various types of meteorological information on a remote computer. An elaborated introduction is given to the construction and application of the Web database under embedded Linux. Test results show that the client accesses the Web page via GPRS or the Internet, acquires data, and displays the values of various types of meteorological information in an intuitive graphical way. (authors)
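    The data path described, sensor samples stored in a small embedded database and rendered into a dynamic page per client request, can be sketched as follows; the sensor names and values are invented, and SQLite stands in for the board's embedded database:

    ```python
    import sqlite3

    # Embedded database holding the latest multi-channel sensor samples.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE weather (sensor TEXT, value REAL)")
    con.executemany("INSERT INTO weather VALUES (?, ?)",
                    [("temperature_C", 21.5), ("humidity_pct", 63.0)])

    def render_page():
        """Render a dynamic page from the current database contents."""
        rows = con.execute("SELECT sensor, value FROM weather").fetchall()
        cells = "".join(f"<tr><td>{s}</td><td>{v}</td></tr>" for s, v in rows)
        return f"<table>{cells}</table>"

    html = render_page()
    print(html)
    ```

    On the real device an embedded HTTP server would call something like `render_page` on each GPRS or Internet request, so the page always reflects the latest readings.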

  4. 动态WEB数据库应用探究%Application Research of dynamic WEB database

    Institute of Scientific and Technical Information of China (English)

    乔立龙

    2015-01-01

    Database technology is by now relatively mature and rigorously structured, but its flexibility is still limited. Combining databases with the Web can greatly expand the application domain of databases, and this combination is in fact a hot topic in current database research. The dynamic WEB database technology introduced in this paper is realized with middleware: the middleware connects the Web server and the database server. Middleware not only allows front-end users to access the data sources of heterogeneous back-end databases transparently, but also ensures the openness of the access interface.
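    The middleware role described, a single query interface that hides heterogeneous back ends from the front-end user, can be sketched as follows; the backend classes and data are invented for illustration:

    ```python
    import sqlite3

    class SQLBackend:
        """One back end: a relational database."""
        def __init__(self):
            self.con = sqlite3.connect(":memory:")
            self.con.execute("CREATE TABLE users (name TEXT)")
            self.con.execute("INSERT INTO users VALUES ('ada')")
        def fetch_users(self):
            return [r[0] for r in self.con.execute("SELECT name FROM users")]

    class DictBackend:
        """Another back end: a non-relational in-memory store."""
        data = {"users": ["grace"]}
        def fetch_users(self):
            return list(self.data["users"])

    class Middleware:
        """Clients see one open interface; the heterogeneity stays hidden."""
        def __init__(self, *backends):
            self.backends = backends
        def all_users(self):
            return sorted(u for b in self.backends for u in b.fetch_users())

    print(Middleware(SQLBackend(), DictBackend()).all_users())  # ['ada', 'grace']
    ```

    Adding a new back end only requires implementing the shared `fetch_users` interface, which is the "openness" property the abstract highlights.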

  5. Open client/server computing and middleware

    CERN Document Server

    Simon, Alan R

    2014-01-01

    Open Client/Server Computing and Middleware provides a tutorial-oriented overview of open client/server development environments and how client/server computing is being done. This book analyzes an in-depth set of case studies about two different open client/server development environments, Microsoft Windows and UNIX, describing the architectures, various product components, and how these environments interrelate. Topics include open systems and client/server computing, next-generation client/server architectures, principles of middleware, and an overview of ProtoGen+. The ViewPaint environment

  6. Knowledge discovery in databases of biomechanical variables: application to the sit to stand motor task

    Directory of Open Access Journals (Sweden)

    Benvenuti Francesco

    2004-10-01

    Full Text Available Abstract Background The interpretation of data obtained in a movement analysis laboratory is a crucial issue in clinical contexts. Collection of such data in large databases might encourage the use of modern data mining techniques to discover additional knowledge with automated methods. In order to maximise the size of the database, simple and low-cost experimental set-ups are preferable. The aim of this study was to extract knowledge inherent in the sit-to-stand task as performed by healthy adults, by searching for relationships among measured and estimated biomechanical quantities. An automated method was applied to a large amount of data stored in a database. The sit-to-stand motor task has already been shown to be adequate for determining the level of individual motor ability. Methods The technique of searching for association rules was chosen to discover patterns as part of a Knowledge Discovery in Databases (KDD) process applied to a sit-to-stand motor task observed with a simple experimental set-up and analysed by means of a minimum measured input model. Selected parameters and variables of a database containing data from 110 healthy adults, of both genders and a large range of ages, performing the task were considered in the analysis. Results A set of rules and definitions was found characterising the patterns shared by the investigated subjects. The time events of the task turned out to be highly interdependent, at least in their average values, showing a high level of repeatability in the timing of the performance of the task. Conclusions The distinctive patterns of the sit-to-stand task found in this study, together with those that could be found in similar studies focusing on subjects with pathologies, could be used as a reference for the functional evaluation of specific subjects performing the sit-to-stand motor task.
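    Association-rule search rests on two measures, support and confidence, which can be computed directly; the discretised "transactions" below are invented stand-ins for the biomechanical features of each trial:

    ```python
    # Toy transactions: discretised biomechanical features per sit-to-stand
    # trial. Feature labels are invented for illustration.
    trials = [
        {"fast_rise", "small_sway", "young"},
        {"fast_rise", "small_sway", "young"},
        {"slow_rise", "large_sway", "older"},
        {"fast_rise", "small_sway", "older"},
    ]

    def support(itemset):
        """Fraction of trials containing every item in the itemset."""
        return sum(itemset <= t for t in trials) / len(trials)

    def confidence(antecedent, consequent):
        """P(consequent | antecedent) estimated over the trials."""
        return support(antecedent | consequent) / support(antecedent)

    print(support({"fast_rise", "small_sway"}))       # 0.75
    print(confidence({"fast_rise"}, {"small_sway"}))  # 1.0
    ```

    A KDD pipeline such as Apriori enumerates candidate itemsets and keeps only rules whose support and confidence exceed chosen thresholds; the measures themselves are exactly the two ratios above.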

  7. Web database development

    OpenAIRE

    Tsardas, Nikolaos A.

    2001-01-01

    This thesis explores the concept of Web Database Development using Active Server Pages (ASP) and Java Server Pages (JSP). These are among the leading technologies in the web database development. The focus of this thesis was to analyze and compare the ASP and JSP technologies, exposing their capabilities, limitations, and differences between them. Specifically, issues related to back-end connectivity using Open Database Connectivity (ODBC) and Java Database Connectivity (JDBC), application ar...

  8. Client services for geriatric pets.

    Science.gov (United States)

    Hancock, G; Yates, J

    1989-01-01

    Some veterinarians have been reluctant to discuss the prospect of the death of a pet because of a sense of discomfort and a lack of understanding about how to respond to the client's grief reaction. It is essential to take the time for this important communication and help clients deal with fears about the process, any feelings of guilt and helplessness, and judgments about the medical aspects of a case. Clients must be encouraged to express grief over the loss of a pet, particularly a geriatric pet that has lived with them many years and to which they are deeply bonded. Veterinarians need to counsel clients about obtaining additional pets or another pet. The phrase "replacement pet" must be stricken from the veterinarian's vocabulary. One does not "replace" a deceased spouse, mother, father, or child. It is possible to have another child or find another spouse, but it is not possible to replace a person. Neither can a pet be "replaced," because each pet is a unique living being. It is disrespectful to the memory of deceased pets to belittle their uniqueness by suggesting that they can be replaced. Instead, the veterinarian has the capability and responsibility to help pet owners maintain fond and happy memories of an irreplaceable pet, while finding room in their hearts for a new pet to create happiness for the future. Once the grief is resolved, clients will be thankful for having had the privilege of sharing their life with an animal and experiencing the joy of the bond between two unique individuals. PMID:2646816

  9. A national, geographic database of CDC-funded HIV prevention services: development challenges and potential applications

    Directory of Open Access Journals (Sweden)

    Fogarty Kieran J

    2005-11-01

    Full Text Available Abstract Background From 2000–2002, the Centers for Disease Control and Prevention (CDC) funded a study that was designed to improve the information available to program planners about the geographic distribution of CDC-funded HIV prevention services provided by community-based organizations (CBOs). Program managers at CDC recognized the potential of a geographic information system (GIS) to organize and analyze information about HIV prevention services and they made GIS a critical component of the study design. The primary objective of this study was to construct a national, geographically referenced database of HIV prevention services provided by CDC-funded CBOs. We designed a survey instrument to collect information about the geographic service areas where CBOs provided HIV prevention services, then collected data from CBOs that received CDC funding for these services during fiscal year 2000. We developed a GIS database to link questionnaire responses with GIS map layers in a manner that would incorporate overlapping geographies, risk populations and prevention services. We collected geographic service area data in two formats: 1) geopolitical boundaries and 2) geographic distance. Results The survey response rate was 70.3%, i.e. 1,020 of 1,450 community-based organizations responded. The number of HIV prevention programs administered by each CBO ranged from 1 to 23. The survey provided information about 3,028 prevention programs, including descriptions of intervention types, risk populations, race and ethnicity, CBO location and geographic service area. We incorporated this information into a large GIS database, the HIV Prevention Services Database. The use of geopolitical boundaries provided more accurate results than geographic distance. The use of a reference map with the questionnaire improved the completeness, accuracy and precision of service area data. Conclusion The survey instrument design and database development procedures that we used

  10. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    Science.gov (United States)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

    The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Center (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
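    An empirical magnitude-conversion model of the kind these tools build can be as simple as a least-squares line fitted to paired magnitudes from two catalogues; the mb/Mw pairs below are invented for illustration, and real harmonization work would also model the scatter and scale-dependence:

    ```python
    # Closed-form ordinary least squares for an empirical mb -> Mw conversion.
    # The paired magnitudes are invented for illustration.
    pairs = [(4.0, 4.2), (4.5, 4.8), (5.0, 5.3), (5.5, 5.9), (6.0, 6.4)]

    n = len(pairs)
    sx = sum(x for x, _ in pairs)
    sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs)
    sxy = sum(x * y for x, y in pairs)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n

    def to_mw(mb):
        """Convert a native-scale magnitude into the target (Mw) scale."""
        return slope * mb + intercept

    print(round(to_mw(5.2), 2))  # 5.54
    ```

    Applying such fitted relations catalogue by catalogue, then merging the converted events, is the homogenization step the abstract describes.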

  11. Development of the Database for Environmental Sound Research and Application (DESRA): Design, Functionality, and Retrieval Considerations

    Directory of Open Access Journals (Sweden)

    Brian Gygi

    2010-01-01

    Full Text Available Theoretical and applied environmental sounds research is gaining prominence but progress has been hampered by the lack of a comprehensive, high quality, accessible database of environmental sounds. An ongoing project to develop such a resource is described, which is based upon experimental evidence as to the way we listen to sounds in the world. The database will include a large number of sounds produced by different sound sources, with a thorough background for each sound file, including experimentally obtained perceptual data. In this way DESRA can contain a wide variety of acoustic, contextual, semantic, and behavioral information related to an individual sound. It will be accessible on the Internet and will be useful to researchers, engineers, sound designers, and musicians.

  12. Abductive Equivalential Translation and its application to Natural Language Database Interfacing

    CERN Document Server

    Rayner, M

    1994-01-01

    The thesis describes a logical formalization of natural-language database interfacing. We assume the existence of a ``natural language engine'' capable of mediating between surface linguistic string and their representations as ``literal'' logical forms: the focus of interest will be the question of relating ``literal'' logical forms to representations in terms of primitives meaningful to the underlying database engine. We begin by describing the nature of the problem, and show how a variety of interface functionalities can be considered as instances of a type of formal inference task which we call ``Abductive Equivalential Translation'' (AET); functionalities which can be reduced to this form include answering questions, responding to commands, reasoning about the completeness of answers, answering meta-questions of type ``Do you know...'', and generating assertions and questions. In each case, a ``linguistic domain theory'' (LDT) $\\Gamma$ and an input formula $F$ are given, and the goal is to construct a fo...

  13. The Prototype Database for Management of Revision and Compilation of Nuclear & Radiation Safety Regulations and Standards%核与辐射安全法规标准制修订管理原型数据库系统

    Institute of Scientific and Technical Information of China (English)

    王文海; 张弛; 樊赟; 杨丽丽; 刘黎明; 李小丁

    2011-01-01

    The article describes the basic user requirements for a system managing the revision and compilation of nuclear and radiation safety regulations and standards, the data structure of the Prototype Database, the functions of the client software of the Prototype Database, and the application of the Prototype Database.

  14. Application of factorial kriging analysis to the FOREGS European topsoil geochemistry database.

    Science.gov (United States)

    Imrie, Claire E; Korre, Anna; Munoz-Melendez, Gabriela; Thornton, Iain; Durucan, Sevket

    2008-04-01

    Concern about increasing levels of trace elements in the environment has led to the development and implementation of a global programme to determine the current baseline levels of these chemicals in the Earth's surface. The FORum of European Geological Surveys (FOREGS) has recently published a geochemical database for Europe, while progress on similar databases is continuing in other major regions of the world. The FOREGS database comprises multimedia samples collected at a resolution of approximately 72x72 km from 26 European countries. This enables the investigation of the factors governing geochemical variation on a continental scale, potentially allowing contributions of natural processes to be appreciated prior to setting environmental quality standards. This paper investigates the variation in European topsoil geochemistry using factorial kriging analysis, which performs principal components analysis at different spatial scales. The results are interpreted with the aid of a GIS database. Four spatial scales were identified: a nugget component representing variation over a range less than the sampling density; a 'short' scale component with a range of 296 km; an 'intermediate' scale component (875 km); and a 'long' scale component (1750 km). The first three principal components (PCs) of the nugget covariance matrix explained 22.2% of the overall variance, representing local variation in geology, land use, weathering and organic matter content. The first two PCs of the short range structure explained 12.6% of the variance, representing variation according to the major structural divisions of Europe, and to carbonate content. The first PC of the intermediate structure explained 7.2% of the variance and was found to relate to glacial history and Quaternary deposition. Finally, the first three PCs of the long range structure explained 29.6% of the variance and represented variation due to mineralisation, soil texture, climate and possibly anthropogenic
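
The scale-by-scale percentages above come from principal components analysis of the covariance matrix at each spatial scale, i.e. from its eigenvalue spectrum. A minimal sketch of the explained-variance computation on a synthetic covariance matrix (generic PCA only, not factorial kriging, and not the FOREGS data):

```python
import numpy as np

def explained_variance(cov):
    """Fraction of total variance carried by each principal component:
    eigenvalues of the covariance matrix, sorted descending, normalized."""
    eigvals = np.linalg.eigvalsh(cov)[::-1]  # eigvalsh returns ascending order
    return eigvals / eigvals.sum()

# Synthetic 4-variable data set with unequal variances (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4)) @ np.diag([3.0, 2.0, 1.0, 0.5])
ratios = explained_variance(np.cov(X, rowvar=False))
```

Summing the leading entries of `ratios` gives figures analogous to "the first three PCs explained 22.2% of the overall variance".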

  16. Client Compliance with Homework Directives during Counseling.

    Science.gov (United States)

    Worthington, Everett L., Jr.

    1986-01-01

    Investigated compliance as a function of counselor, client, and therapy variables. Results indicated that variables associated with the conduct of counseling more strongly influenced compliance with homework than did either counselor or client variables. (Author/BL)

  17. ECG Database Applicable for Development and Testing of Pace Detection Algorithms

    Directory of Open Access Journals (Sweden)

    Irena Jekova

    2014-12-01

    This paper presents an ECG database, named 'PacedECGdb' (available at http://biomed.bas.bg/bioautomation/2014/vol_18.4/files/PacedECGdb.zip), which contains different arrhythmias generated by an HKP (Heidelberger Praxisklinik) simulator, combined with artificially superimposed pacing pulses that cover wide ranges of rising edge (from <10 µs to 100 µs) and total pulse duration (from 100 µs to 2 ms) and correspond to various pacemaker modes. It involves a total number of 1404 recordings - 780 representing 'pure' ECG with pacing pulses and 624 that comprise paced ECGs corrupted by tremor. The signals are recorded with 9.81 µV/LSB amplitude resolution at 128 kHz sampling rate in order to preserve the steep rising and trailing edges of the pace pulses. To the best of our knowledge, 'PacedECGdb' is the first publicly available paced ECG database. It could be used for the development and testing of methods for pace detection in the ECG. The existence of ECGs corrupted by tremor (the only physiological noise that could compromise methods for pacing pulse detection) is an advantage, since such signals could be applied to define the signal-to-noise level for correct operation of an algorithm, or to improve the noise immunity of a method under development. The open access of the database makes it suitable for comparative studies including different algorithms.
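
The quoted acquisition figures (9.81 µV/LSB, 128 kHz) determine how raw samples map to physical units and why such a high rate is needed for microsecond pacing edges. A small sketch with synthetic sample values (the array contents are invented, not taken from the database):

```python
import numpy as np

UV_PER_LSB = 9.81   # amplitude resolution quoted for PacedECGdb
FS_HZ = 128_000     # sampling rate quoted for PacedECGdb

raw = np.array([0, 10, 102, 10, 0])        # synthetic ADC counts, not real data
microvolts = raw * UV_PER_LSB              # counts -> microvolts
t_us = np.arange(raw.size) / FS_HZ * 1e6   # sample instants in microseconds

# One sample lasts 1e6/128000 = 7.8125 us, so even a 100 us pacing pulse
# spans ~12.8 samples -- enough to preserve its steep edges.
samples_per_100us = 100e-6 * FS_HZ
```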

  18. A public turbulence database cluster and applications to study Lagrangian evolution of velocity increments in turbulence

    CERN Document Server

    Li, Yi; Wan, Minping; Yang, Yunke; Meneveau, Charles; Burns, Randal; Chen, Shiyi; Szalay, Alexander; Eyink, Gregory

    2008-01-01

    A public database system archiving a direct numerical simulation (DNS) data set of isotropic, forced turbulence is described in this paper. The data set consists of the DNS output on $1024^3$ spatial points and 1024 time-samples spanning about one large-scale turn-over timescale. This complete $1024^4$ space-time history of turbulence is accessible to users remotely through an interface that is based on the Web-services model. Users may write and execute analysis programs on their host computers, while the programs make subroutine-like calls that request desired parts of the data over the network. The users are thus able to perform numerical experiments by accessing the 27 Terabytes of DNS data using regular platforms such as laptops. The architecture of the database is explained, as are some of the locally defined functions, such as differentiation and interpolation. Test calculations are performed to illustrate the usage of the system and to verify the accuracy of the methods. The database is then used to a...
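
The "subroutine-like calls that request desired parts of the data over the network" can be pictured as a thin client wrapper that serializes a query into an HTTP request. The endpoint and parameter names below are invented for illustration; they are not the database's actual Web-services interface:

```python
from urllib.parse import urlencode

BASE_URL = "http://turbulence.example.org/getdata"  # hypothetical endpoint

def build_velocity_request(time, points):
    """Serialize a 'get velocity at these space-time points' call into a
    query URL, mimicking a subroutine-like remote data request."""
    flat = ",".join(f"{coord:.4f}" for p in points for coord in p)
    query = urlencode({"function": "GetVelocity", "t": time, "points": flat})
    return f"{BASE_URL}?{query}"

url = build_velocity_request(0.1, [(0.0, 0.0, 0.0), (0.5, 0.5, 0.5)])
```

An analysis program would loop over such calls, fetching only the sub-volumes it needs instead of the full 27 Terabytes.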

  19. Application and Study of a Cross-Platform Mobile E-Business Payment Client

    Institute of Scientific and Technical Information of China (English)

    彭革刚; 李新宇; 宋鹰; 向黎生; 沈清; 李仁发

    2013-01-01

    With the development of the global information technology revolution and the growing popularity of smartphones, mobile communication capability has been further enhanced, providing powerful support for the development of mobile e-business. Mobile phone payment is the cornerstone of mobile e-business and plays a significant role in driving its all-round development. The mobile phone client is an important channel for payment services; compared with other channels such as websites, text messages and WAP, it makes fuller use of the portability of handsets and offers a better user experience and greater convenience. In view of the diversity of current smartphone operating systems, we present a cross-platform solution for a mobile e-business payment client, and describe the architecture of the client system as well as the key problems to be solved during its construction. Actual operation of the client shows that the system has good security and practicality, and can become a powerful tool for mobile phone payment in mobile e-business applications.

  20. Database design for a kindergarten Pastelka

    OpenAIRE

    Grombíř, Tomáš

    2010-01-01

    This bachelor thesis deals with the analysis and creation of a database for a kindergarten and the installation of the designed database into the MySQL database system. The functionality of the proposed database was verified through an application written in PHP.

  1. Client participation in the rehabilitation process

    OpenAIRE

    Wressle, Ewa

    2002-01-01

    This thesis evaluates the rehabilitation process with respect to client participation. The Swedish version of a client-centred structure, the Canadian Occupational Performance Measure (COPM), is evaluated from the perspectives of the clients, the occupational therapists and the members of a rehabilitation team. Data have been collected through diaries, the COPM, assessments of ability to perform activities of daily living, mobility, self-assessments of pain and health, interviews with clients...

  2. The Database Query Support Processor (QSP)

    Science.gov (United States)

    1993-01-01

    The number and diversity of databases available to users continues to increase dramatically. Currently, the trend is towards decentralized, client server architectures that (on the surface) are less expensive to acquire, operate, and maintain than information architectures based on centralized, monolithic mainframes. The database query support processor (QSP) effort evaluates the performance of a network level, heterogeneous database access capability. Air Force Material Command's Rome Laboratory has developed an approach, based on ANSI standard X3.138 - 1988, 'The Information Resource Dictionary System (IRDS)' to seamless access to heterogeneous databases based on extensions to data dictionary technology. To successfully query a decentralized information system, users must know what data are available from which source, or have the knowledge and system privileges necessary to find out this information. Privacy and security considerations prohibit free and open access to every information system in every network. Even in completely open systems, time required to locate relevant data (in systems of any appreciable size) would be better spent analyzing the data, assuming the original question was not forgotten. Extensions to data dictionary technology have the potential to more fully automate the search for and retrieval of relevant data in a decentralized environment. Substantial amounts of time and money could be saved by not having to teach users what data resides in which systems and how to access each of those systems. Information describing data and how to get it could be removed from the application and placed in a dedicated repository where it belongs. The result is simplified applications that are less brittle and less expensive to build and maintain. Software technology providing the required functionality is off the shelf. The key difficulty is in defining the metadata required to support the process. The database query support processor effort will provide

  3. The GEISA Spectroscopic Database as a Tool for Hyperspectral Earth's Tropospheric Remote Sensing Applications

    Science.gov (United States)

    Jacquinet-Husson, Nicole; Crépeau, Laurent; Capelle, Virginie; Scott, Noëlle; Armante, Raymond; Chédin, Alain

    2010-05-01

    Remote sensing of the terrestrial atmosphere has advanced significantly in recent years, and this has placed greater demands on the compilations in terms of accuracy, additional species, and spectral coverage. The successful performances of the new generation of hyperspectral Earth's atmospheric sounders like AIRS (Atmospheric Infrared Sounder - http://www-airs.jpl.nasa.gov/), in the USA, and IASI (Infrared Atmospheric Sounding Interferometer - http://earth-sciences.cnes.fr/IASI/) in Europe, which have a better vertical resolution and accuracy, compared to the previous satellite infrared vertical sounders, depend ultimately on the accuracy to which the spectroscopic parameters of the optically active gases are known, since they constitute an essential input to the forward radiative transfer models that are used to interpret their observations. In this context, the GEISA (1) (Gestion et Etude des Informations Spectroscopiques Atmosphériques: Management and Study of Atmospheric Spectroscopic Information) computer-accessible database, initiated in 1976, is continuously developed and maintained at LMD (Laboratoire de Météorologie Dynamique, France). The updated 2009 edition of GEISA (GEISA-09) is a system comprising three independent sub-databases devoted respectively to: line transition parameters, infrared and ultraviolet/visible absorption cross-sections, and microphysical and optical properties of atmospheric aerosols. In this edition, the contents of which will be summarized, 50 molecules are involved in the line transition parameters sub-database, including 111 isotopes, for a total of 3,807,997 entries, in the spectral range from 10^-6 to 35,877.031 cm^-1. Currently, GEISA is involved in activities related to the assessment of the capabilities of IASI through the GEISA/IASI database derived from GEISA (2). Since the Metop (http://www.eumetsat.int) launch (October 19th 2006), GEISA/IASI is the reference spectroscopic database for the validation of the level-1 IASI data

  4. Client Server design and implementation issues in the Accelerator Control System environment

    International Nuclear Information System (INIS)

    In distributed system communication software design, the Client Server model has been widely used. This paper addresses the design and implementation issues of such a model, particularly when used in Accelerator Control Systems. in designing the Client Server model one needs to decide how the services will be defined for a server, what types of messages the server will respond to, which data formats will be used for the network transactions and how the server will be located by the client. Special consideration needs to be given to error handling both on the server and client side. Since the server usually is located on a machine other than the client, easy and informative server diagnostic capability is required. The higher level abstraction provided by the Client Server model simplifies the application writing, however fine control over network parameters is essential to improve the performance. Above mentioned design issues and implementation trade-offs are discussed in this paper
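
The design issues listed above (service definition, message format, dispatch, error handling, locating the server) can be sketched in a few lines. A minimal example assuming a JSON-over-TCP wire format; the format and the service name are our own choices for illustration, not the paper's:

```python
import json
import socket
import threading

def serve_once(server_sock):
    """Accept one connection, decode the request, dispatch the named service,
    and reply with a result or a structured error (server-side error handling)."""
    conn, _ = server_sock.accept()
    with conn:
        request = json.loads(conn.recv(4096).decode())
        # The services a server responds to, defined in one place.
        services = {"read_magnet_current": lambda: {"value": 42.0, "unit": "A"}}
        handler = services.get(request.get("service"))
        if handler is None:
            reply = {"ok": False, "error": "unknown service"}
        else:
            reply = {"ok": True, "result": handler()}
        conn.sendall(json.dumps(reply).encode())

# The server binds an ephemeral port; the client "locates" it via (host, port).
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
client.sendall(json.dumps({"service": "read_magnet_current"}).encode())
response = json.loads(client.recv(4096).decode())
client.close()
server.close()
```

The structured `{"ok": ..., "error": ...}` reply is one way to give the client the informative diagnostics the paper calls for when the server runs on a remote machine.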

  5. The abandoned surface mining sites in the Czech Republic: mapping and creating a database with a GIS web application

    Science.gov (United States)

    Pokorný, Richard; Tereza Peterková, Marie

    2016-05-01

    Based on the vectorization of the 55-volume book series the Quarry Inventories of the Czechoslovak Republic/Czechoslovak Socialist Republic, published in the years 1932-1961, a new comprehensive database was built comprising 9958 surface mining sites of raw materials, which were active in the first half of the 20th century. The mapped area covers 40.9 % of the territory of the Czech Republic. For the purposes of visualization, a map application, the Quarry Inventories Online, was created to display the data.

  6. DFACS - DATABASE, FORMS AND APPLICATIONS FOR CABLING AND SYSTEMS, VERSION 3.30

    Science.gov (United States)

    Billitti, J. W.

    1994-01-01

    DFACS is an interactive multi-user computer-aided engineering tool for system level electrical integration and cabling engineering. The purpose of the program is to provide the engineering community with a centralized database for entering and accessing system functional definitions, subsystem and instrument-end circuit pinout details, and harnessing data. The primary objective is to provide an instantaneous single point of information interchange, thus avoiding error-prone, time-consuming, and costly multiple-path data shuttling. The DFACS program, which is centered around a single database, has built-in menus that provide easy data input and access for all involved system, subsystem, and cabling personnel. The DFACS program allows parallel design of circuit data sheets and harness drawings. It also recombines raw information to automatically generate various project documents and drawings including the Circuit Data Sheet Index, the Electrical Interface Circuits List, Assembly and Equipment Lists, Electrical Ground Tree, Connector List, Cable Tree, Cabling Electrical Interface and Harness Drawings, Circuit Data Sheets, and ECR List of Affected Interfaces/Assemblies. Real time automatic production of harness drawings and circuit data sheets from the same data reservoir ensures instant system and cabling engineering design harmony. DFACS also contains automatic wire routing procedures and extensive error checking routines designed to minimize the possibility of engineering error. DFACS is designed to run on DEC VAX series computers under VMS using Version 6.3/01 of INGRES QUEL/OSL, a relational database system which is available through Relational Technology, Inc. The program is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard media) or a TK50 tape cartridge. DFACS was developed in 1987 and last updated in 1990. DFACS is a copyrighted work with all copyright vested in NASA. 
DEC, VAX and VMS are trademarks of Digital Equipment Corporation

  7. A public turbulence database cluster and applications to study Lagrangian evolution of velocity increments in turbulence

    OpenAIRE

    Li, Yi; Perlman, Eric; Wan, Minping; Yang, Yunke; Meneveau, Charles; Burns, Randal; Chen, Shiyi; Szalay, Alexander; Eyink, Gregory

    2008-01-01

    A public database system archiving a direct numerical simulation (DNS) data set of isotropic, forced turbulence is described in this paper. The data set consists of the DNS output on $1024^3$ spatial points and 1024 time-samples spanning about one large-scale turn-over timescale. This complete $1024^4$ space-time history of turbulence is accessible to users remotely through an interface that is based on the Web-services model. Users may write and execute analysis programs on their host comput...

  8. An on-line scaling method for improving scalability of a database cluster

    Institute of Scientific and Technical Information of China (English)

    JANG Yong-ll; LEE Chung-ho; LEE Jae-dong; BAE Hae-young

    2004-01-01

    The explosive growth of the Internet and database applications has driven databases to be more scalable and available, and able to support on-line scaling without interrupting service. To support more clients' queries without downtime or degraded response time, more nodes have to be scaled up while the database is running. This paper presents an overview of a scalable and available database that satisfies the above characteristics, and proposes a novel on-line scaling method. Our method improves on the existing on-line scaling method for faster response time and higher throughput. It reduces unnecessary network use, i.e., we decrease the number of data copies by reusing the backup data. Also, our on-line scaling operation can be processed in parallel by selecting adequate nodes as new nodes. Our performance study shows that our method results in a significant reduction in data copy time.
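
The backup-reuse idea can be pictured with a toy model: if the joining node already stores backup replicas of some partitions, those partitions are promoted in place rather than streamed over the network. This is a simplification for illustration, not the paper's actual algorithm:

```python
# Toy cluster scale-up model. The joining node already holds backup replicas
# of some partitions (an assumption of this sketch, simplifying the paper's scheme).

def naive_copies(partitions_to_move):
    """Naive scale-up: stream every moved partition's data over the network."""
    return len(partitions_to_move)

def backup_reuse_copies(partitions_to_move, backups_on_new_node):
    """Backup-reuse scale-up: a partition whose backup copy already resides
    on the joining node is promoted in place, costing no network transfer."""
    return sum(1 for p in partitions_to_move if p not in backups_on_new_node)

moved = ["p1", "p2", "p3", "p4"]   # partitions reassigned to the new node
backups = {"p2", "p4"}             # replicas the new node already holds

saved = naive_copies(moved) - backup_reuse_copies(moved, backups)
```

Here two of the four reassignments need no data copy, which is the kind of reduction in data copy time the abstract reports.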

  9. Solution-Focused Counseling for Clients with Religious and Spiritual Concerns

    Science.gov (United States)

    Guterman, Jeffrey T.; Leite, Noelia

    2006-01-01

    Solution-focused counseling is presented as a framework for clients with religious and spiritual concerns. The theory of solution-focused counseling is described. Implications for using this model with religious and spiritual clients are considered. A case example is provided to illustrate the application of solution-focused counseling for a…

  10. Improving client-centred care and services : the role of front/back-office configurations

    NARCIS (Netherlands)

    Broekhuis, Manda; de Blok, C.; Meijboom, B.

    2009-01-01

    Improving client-centred care and services: the role of front/back-office configurations. This paper is a report of a study conducted to explore the application of designing front- and back-office work resulting in efficient client-centred care in healthcare organizations that supply home care, welfa

  11. Database design: Community discussion board

    OpenAIRE

    Klepetko, Radim

    2009-01-01

    The goal of this thesis is to design a database for a discussion board application that provides classic discussion board functionality plus web 2.0 features. The emphasis lies on a precise description of the application requirements, which are then used to design an optimal database model independent of the technological implementation (the chosen database system). At the end of the thesis the database design is tested using the MySQL database system.
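
The core of such a discussion-board model can be sketched as three related tables. The schema below uses Python's built-in sqlite3 for a self-contained demonstration (the thesis targets MySQL; the table and column names are our own illustration, not the thesis's design):

```python
import sqlite3

# Minimal discussion-board schema: users write posts into threads.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users   (id INTEGER PRIMARY KEY, name TEXT NOT NULL UNIQUE);
    CREATE TABLE threads (id INTEGER PRIMARY KEY, title TEXT NOT NULL);
    CREATE TABLE posts (
        id        INTEGER PRIMARY KEY,
        thread_id INTEGER NOT NULL REFERENCES threads(id),
        author_id INTEGER NOT NULL REFERENCES users(id),
        body      TEXT NOT NULL,
        posted_at TEXT DEFAULT CURRENT_TIMESTAMP
    );
""")
conn.execute("INSERT INTO users (name) VALUES ('alice')")
conn.execute("INSERT INTO threads (title) VALUES ('Welcome')")
conn.execute("INSERT INTO posts (thread_id, author_id, body) VALUES (1, 1, 'Hello!')")

# Typical read path: list posts with thread title and author name.
rows = conn.execute("""
    SELECT t.title, u.name, p.body
    FROM posts p JOIN threads t ON p.thread_id = t.id
                 JOIN users  u ON p.author_id = u.id
""").fetchall()
```

Keeping the model at this relational level is what makes it portable across database systems, as the thesis intends.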

  12. Automated detection of clustered microcalcifications on mammograms: CAD system application to MIAS database

    Energy Technology Data Exchange (ETDEWEB)

    Ibrahim, Norhayati; Fujita, Hiroshi; Hara, Takeshi [Department of Information Science, Faculty of Engineering, Gifu University, Yanagido, Gifu 501-11 (Japan); Endo, Tokiko [Department of Radiology, Nagoya National Hospital, Naka-ku, Nagoya 460 (Japan)

    1997-12-01

    To investigate the detection performance of our automated detection scheme for clustered microcalcifications on mammograms, we applied our computer-aided diagnosis (CAD) system to the database of the Mammographic Image Analysis Society (MIAS) in the UK. Forty-three mammograms from this database were used in this study. In our scheme, the breast regions were firstly extracted by determining the skinline. Histograms of the original images were used to extract the high-density area within the breast region as the segmentation from the fatty area around the skinline. Then the contrast correction technique was employed. Gradient vectors of the image density were calculated on the contrast corrected images. To extract the specific features of the pattern of the microcalcifications, triple-ring filter analysis was employed. A variable-ring filter was used for more accurate detection after the triple-ring filter. The features of the detected candidate areas were then characterized by feature analysis. The areas which satisfied the characteristics and specific terms were classified and displayed as clusters. As a result, the sensitivity was 95.8% with the false-positive rate at 1.8 clusters per image. This demonstrates that the automated detection of clustered microcalcifications in our CAD system is reliable as an aid to radiologists. (author)
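
The ring-filter analysis above examines image values in concentric rings around a candidate pixel. A simplified numpy sketch of that geometry only (building ring masks and comparing mean intensities), not the authors' triple-ring filter of gradient vectors:

```python
import numpy as np

def ring_mask(size, r_inner, r_outer):
    """Boolean mask selecting pixels whose distance from the patch centre
    lies in [r_inner, r_outer) -- the basic geometry of a ring filter."""
    c = (size - 1) / 2.0
    yy, xx = np.mgrid[0:size, 0:size]
    dist = np.hypot(yy - c, xx - c)
    return (dist >= r_inner) & (dist < r_outer)

def ring_means(patch, radii=((0, 2), (2, 4), (4, 6))):
    """Mean intensity in three concentric rings around the patch centre;
    a bright centre with darker surround suggests a calcification-like spot."""
    return [patch[ring_mask(patch.shape[0], a, b)].mean() for a, b in radii]

# Synthetic 13x13 patch: bright 'microcalcification' on a dark background.
patch = np.zeros((13, 13))
patch[5:8, 5:8] = 1.0
means = ring_means(patch)
```

A candidate is kept when the inner-ring statistic dominates the outer ones; the real scheme applies such tests to density gradients and follows up with feature analysis.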

  13. A Spatiotemporal Database to Track Human Scrub Typhus Using the VectorMap Application.

    Directory of Open Access Journals (Sweden)

    Daryl J Kelly

    2015-12-01

    Scrub typhus is a potentially fatal mite-borne febrile illness, primarily of the Asia-Pacific Rim. With an endemic area greater than 13 million km2 and millions of people at risk, scrub typhus remains an underreported, often misdiagnosed febrile illness. A comprehensive, updatable map of the true distribution of cases has been lacking, and therefore the true risk of disease within the very large endemic area remains unknown. The purpose of this study was to establish a database and map to track human scrub typhus. An online search using PubMed and the United States Armed Forces Pest Management Board Literature Retrieval System was performed to identify articles describing human scrub typhus cases both within and outside the traditionally accepted endemic regions. Using World Health Organization guidelines, stringent criteria were used to establish diagnoses for inclusion in the database. The preliminary screening of 181 scrub typhus publications yielded 145 publications that met the case criterion, 267 case records, and 13 serosurvey records that could be georeferenced, describing 13,739 probable or confirmed human cases in 28 countries. A map service has been established within VectorMap (www.vectormap.org) to explore the role that relative location of vectors, hosts, and the pathogen play in the transmission of mite-borne scrub typhus. The online display of scrub typhus cases in VectorMap illustrates their presence and provides an up-to-date geographic distribution of proven scrub typhus cases.

  14. A Spatiotemporal Database to Track Human Scrub Typhus Using the VectorMap Application

    Science.gov (United States)

    Kelly, Daryl J.; Foley, Desmond H.; Richards, Allen L.

    2015-01-01

    Scrub typhus is a potentially fatal mite-borne febrile illness, primarily of the Asia-Pacific Rim. With an endemic area greater than 13 million km2 and millions of people at risk, scrub typhus remains an underreported, often misdiagnosed febrile illness. A comprehensive, updatable map of the true distribution of cases has been lacking, and therefore the true risk of disease within the very large endemic area remains unknown. The purpose of this study was to establish a database and map to track human scrub typhus. An online search using PubMed and the United States Armed Forces Pest Management Board Literature Retrieval System was performed to identify articles describing human scrub typhus cases both within and outside the traditionally accepted endemic regions. Using World Health Organization guidelines, stringent criteria were used to establish diagnoses for inclusion in the database. The preliminary screening of 181 scrub typhus publications yielded 145 publications that met the case criterion, 267 case records, and 13 serosurvey records that could be georeferenced, describing 13,739 probable or confirmed human cases in 28 countries. A map service has been established within VectorMap (www.vectormap.org) to explore the role that relative location of vectors, hosts, and the pathogen play in the transmission of mite-borne scrub typhus. The online display of scrub typhus cases in VectorMap illustrates their presence and provides an up-to-date geographic distribution of proven scrub typhus cases. PMID:26678263

  15. Collaborating with Your Clients Using Social Media & Mobile Communications

    Science.gov (United States)

    Typhina, Eli; Bardon, Robert E.; Gharis, Laurie W.

    2015-01-01

    Many Extension educators are still learning how to effectively integrate social media into their programs. By using the right social media platforms and mobile applications to create engaged, online communities, Extension educators can collaborate with clients to produce and to share information expanding and enhancing their social media and…

  16. Systemic Power, Disciplinary Agency, and Developer–Business Client Relations

    DEFF Research Database (Denmark)

    Rowlands, Bruce; Kautz, Karlheinz

    2013-01-01

    This paper presents Hardy’s multi-dimensional model of power and illustrates its application to the field of IS. Findings from a case study of developer—business client power relations within a large financial institution are presented. Our findings indicate that from the developers’ perspective,...

  17. Hardened Client Platforms for Secure Internet Banking

    Science.gov (United States)

    Ronchi, C.; Zakhidov, S.

    We review the security of e-banking platforms with particular attention to the exploitable attack vectors of three main attack categories: Man-in-the-Middle, Man-in-the-PC and Man-in-the-Browser. It will be shown that the most serious threats come from combination attacks capable of hacking any transaction without the need to control the authentication process. Using this approach, the security of any authentication system can be bypassed, including those using SecureID Tokens, OTP Tokens, Biometric Sensors and Smart Cards. We will describe and compare two recently proposed e-banking platforms, the ZTIC and the USPD, both of which are based on the use of dedicated client devices, but with diverging approaches with respect to the need of hardening the Web client application. It will be shown that the use of a Hardened Browser (or H-Browser) component is critical to force attackers to employ complex and expensive techniques and to reduce the strength and variety of social engineering attacks down to physiological fraud levels.

  18. Construction and Application of the Capital Underlying Database Based on CHS

    Institute of Scientific and Technical Information of China (English)

    李荫荣

    2015-01-01

    Capital Symbol and Capital Library are modules of the Capital Harness System (CHS) software that support platform-based, generalized, data-driven management of automotive electrical system design. This paper presents how to create the underlying database for harness design in the Capital Symbol and Capital Library modules during application of the CHS software, a process consisting of two parts: graphics drawing and data creation. It serves as a guide to building the CHS underlying database, ensuring that subsequent automotive wiring harness design work can be completed efficiently and with high quality.

  19. Development Method of Wireless Application Based on Wireless Markup Language and Web Database Technology

    Institute of Scientific and Technical Information of China (English)

    ZHANG Li; SHAO Shi-huang; WANG Jian; YIN Mei-hua

    2002-01-01

    Wireless technology is an emerging delivery network, and development schemes for the wireless Internet are currently receiving wide attention. To enable international visitors to browse an education website anytime and anywhere from mobile handsets, communication methods between web applications and databases, such as CGI, ISAPI and JDBC, are analyzed, and a new approach based on Active Server Pages and Wireless Markup Language (ASP-WML) is presented. Dynamic refreshing of the wireless website's homepage and automatic querying of its main information have been realized. Finally, the wireless website of Dong Hua University is taken as an example to demonstrate the feasibility of the wireless website design described above.

  20. jSPyDB, an open source database-independent tool for data management

    International Nuclear Information System (INIS)

    Nowadays, the number of commercial tools available for accessing Databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web-based tool written using Python and Javascript. It relies on jQuery and python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install, and to configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for a better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not provide users the possibility to directly execute any SQL statement.
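
The three-layer split described above (browser client, Python backend, databases) hinges on the backend being the only component that touches the database and handing results to the client in a neutral format such as JSON. A minimal sketch using Python's built-in sqlite3 as a stand-in for "different database technologies" (the function name and demo table are our own, not part of jSPyDB):

```python
import json
import sqlite3

def fetch_as_json(conn, query):
    """Backend-side helper: run a read-only query and hand the rows to the
    client layer as JSON, so the browser never talks to the database directly."""
    conn.row_factory = sqlite3.Row
    return json.dumps([dict(r) for r in conn.execute(query)])

# Demo on an in-memory database (illustrative data only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO runs VALUES (?, ?)", [(1, "ok"), (2, "failed")])
payload = fetch_as_json(conn, "SELECT * FROM runs ORDER BY id")
```

Serving serialized rows rather than a live connection is also what lets the server pool connections and avoid the short, concurrent sessions the abstract warns about.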

  1. Client/server approach to image capturing

    Science.gov (United States)

    Tuijn, Chris; Stokes, Earle

    1998-01-01

    The diversity of the digital image capturing devices on the market today is quite astonishing and ranges from low-cost CCD scanners to digital cameras (for both action and stand-still scenes), mid-end CCD scanners for desktop publishing and pre-press applications and high-end CCD flatbed scanners and drum-scanners with photomultiplier technology. Each device and market segment has its own specific needs, which explains the diversity of the associated scanner applications. What all those applications have in common is the need to communicate with a particular device to import the digital images; after the import, additional image processing might be needed as well as color management operations. Although the specific requirements for all of these applications might differ considerably, a number of image capturing and color management facilities as well as other services are needed which can be shared. In this paper, we propose a client/server architecture for scanning and image editing applications which can be used as a common component for all these applications. One of the principal components of the scan server is the input capturing module. The specification of the input jobs is based on a generic input device model. Through this model we make abstraction of the specific scanner parameters and define the scan job definitions by a number of absolute parameters. As a result, scan job definitions will be less dependent on a particular scanner and have a more universal meaning. In this context, we also elaborate on the interaction of the generic parameters and the color characterization (i.e., the ICC profile). Other topics that are covered are the scheduling and parallel processing capabilities of the server, the image processing facilities, the interaction with the ICC engine, the communication facilities (both in-memory and over the network) and the different client architectures (stand-alone applications, TWAIN servers, plug-ins, OLE or Apple-event driven
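
The generic input device model can be illustrated with a small sketch: scan jobs are specified in absolute, device-independent parameters, and each driver maps them onto its own hardware settings. All class names, fields and limits below are hypothetical illustrations, not the paper's actual API.

```python
# Sketch of a generic input device model: the job is described in absolute
# units (dpi, millimetres); a per-device driver clamps and converts them.
from dataclasses import dataclass

@dataclass
class ScanJob:
    resolution_dpi: int     # absolute resolution, not a device step index
    width_mm: float
    height_mm: float

class FlatbedDriver:
    MAX_DPI = 1200          # invented hardware limit

    def to_device_settings(self, job: ScanJob) -> dict:
        # Clamp to the hardware limit and convert millimetres to pixels.
        dpi = min(job.resolution_dpi, self.MAX_DPI)
        px = lambda mm: round(mm / 25.4 * dpi)
        return {"dpi": dpi, "width_px": px(job.width_mm), "height_px": px(job.height_mm)}

job = ScanJob(resolution_dpi=300, width_mm=210.0, height_mm=297.0)  # A4 at 300 dpi
settings = FlatbedDriver().to_device_settings(job)
print(settings)  # {'dpi': 300, 'width_px': 2480, 'height_px': 3508}
```

The same `ScanJob` could be handed to a different driver with different limits, which is the sense in which the job definition is device-independent.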

  2. Heat pumps: Industrial applications. (Latest citations from the NTIS bibliographic database). Published Search

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    The bibliography contains citations concerning design, development, and applications of heat pumps for industrial processes. Included are thermal energy exchanges based on air-to-air, ground-coupled, air-to-water, and water-to-water systems. Specific applications include industrial process heat, drying, district heating, and waste processing plants. Other Published Searches in this series cover heat pump technology and economics, and heat pumps for residential and commercial applications. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  3. Rancang Bangun Keanggotaan Perpustakaan STT Telematika Telkom Menggunakan RFID Berbasis Java 2 Standard Edition Dengan Konsep Client Server

    Directory of Open Access Journals (Sweden)

    Yana Yuniarsyah

    2013-05-01

    Full Text Available RFID is a comparatively new technology that has not yet been widely applied. It can overcome several disadvantages of barcode technology. One application of RFID is the library membership card. The STT Telematika library previously used its membership card only for borrowing and returning transactions. Embedding RFID in the member card creates a multifunctional card: besides book borrowing and return transactions, the membership card can also be used for visitor attendance. Visitor attendance and library reports are distributed using a client-server design, which makes data management easier for librarians. The programming language used in the design of the library information system is Java 2 Standard Edition (J2SE), with NetBeans 7.0 as the IDE. Storage uses a MySQL database. The software was designed following the waterfall (linear sequential) model. The information system was modeled with the Unified Modeling Language (UML), including use case, activity and class diagrams, and the database was designed with an Entity Relationship Diagram (ERD). Testing covered user requirements, black-box testing of the program, and user testing. In this system, the RFID reader reads the information carried by an RFID tag, and the tag transmits its information to the reader. The success of the client-server concept is demonstrated by visitor attendance being recorded and reports displayed from the client, and by the server storing the visitor attendance data.
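
The client-server attendance flow might look like the following minimal sketch (in Python rather than the paper's Java, and with an invented wire protocol): the client forwards the member ID just read from an RFID tag, and the server records a timestamped visit and acknowledges.

```python
# Minimal client-server attendance sketch: protocol and names are
# illustrative, not taken from the paper.
import socket
import threading
from datetime import datetime

attendance = []  # server-side visit log: (member_id, timestamp)

def serve_once(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        member_id = conn.recv(64).decode().strip()
        attendance.append((member_id, datetime.now().isoformat()))
        conn.sendall(b"OK")

server = socket.socket()
server.bind(("127.0.0.1", 0))      # ephemeral port
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=serve_once, args=(server,))
t.start()

# Client side: a desk reader forwarding the tag ID it just scanned.
with socket.create_connection(("127.0.0.1", port)) as c:
    c.sendall(b"MEMBER-0042\n")
    reply = c.recv(16).decode()

t.join()
server.close()
print(reply, attendance[0][0])  # OK MEMBER-0042
```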

  4. An interactive end-user software application for a deep-sea photographic database

    Digital Repository Service at National Institute of Oceanography (India)

    Jaisankar, S.; Sharma, R.

    different sources of the system. Date and time is used as the key reference for merging the data from different sources. Techniques that are developed to encode information from photographs and area calculations are discussed. Interactive JAVA application...

  5. An extensive surface model database for population- related information: concept and application

    OpenAIRE

    I Bracken

    1993-01-01

    Information technology has had a substantial impact on methods of geographical study. Although this varies between different application fields, effective use of the new technology requires careful consideration of the underlying concepts of geographic data representation. Here, an improved data model is described by means of raster techniques to represent population-related data in the form of surfaces. This type of model is seen as having broad applicability to the representation of socioec...

  6. Beauty from the beast: Avoiding errors in responding to client questions.

    Science.gov (United States)

    Waehler, Charles A; Grandy, Natalie M

    2016-09-01

    Those rare moments when clients ask direct questions of their therapists likely represent a point when they are particularly open to new considerations, thereby representing an opportunity for substantial therapeutic gains. However, clinical errors abound in this area because clients' questions often engender apprehension in therapists, causing therapists to respond with too little or too much information or shutting down the discussion prematurely. These response types can damage the therapeutic relationship, the psychotherapy process, or both. We explore the nature of these clinical errors in response to client questions by providing examples from our own clinical work, suggesting potential reasons why clinicians may not make optimal use of client questions, and discussing how the mixed psychological literature further complicates the issue. We also present four guidelines designed to help therapists, trainers, and supervisors respond constructively to clinical questions in order to create constructive interactions. (PsycINFO Database Record PMID:27505454

  8. An Analysis of Database Replication Technologies with Regard to Deep Space Network Application Requirements

    Science.gov (United States)

    Connell, Andrea M.

    2011-01-01

    The Deep Space Network (DSN) has three communication facilities which handle telemetry, commands, and other data relating to spacecraft missions. The network requires these three sites to share data with each other and with the Jet Propulsion Laboratory for processing and distribution. Many database management systems have replication capabilities built in, which means that data updates made at one location will be automatically propagated to other locations. This project examines multiple replication solutions, looking for stability, automation, flexibility, performance, and cost. After comparing these features, Oracle Streams is chosen for closer analysis. Two Streams environments are configured - one with a Master/Slave architecture, in which a single server is the source for all data updates, and the second with a Multi-Master architecture, in which updates originating from any of the servers will be propagated to all of the others. These environments are tested for data type support, conflict resolution, performance, changes to the data structure, and behavior during and after network or server outages. Through this experimentation, it is determined which requirements of the DSN can be met by Oracle Streams and which cannot.
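
The two topologies under comparison can be modelled in toy form: in a Master/Slave setup only the master accepts writes and pushes them to the slaves, while in Multi-Master a write originating at any node is propagated to all the others. This is a conceptual sketch, not a depiction of Oracle Streams internals.

```python
# Toy model of the two replication topologies compared in the study.
class Node:
    def __init__(self, name):
        self.name = name
        self.data = {}

    def apply(self, key, value):
        self.data[key] = value

def master_slave_write(master, slaves, key, value):
    master.apply(key, value)
    for s in slaves:               # changes flow one way only
        s.apply(key, value)

def multi_master_write(origin, peers, key, value):
    origin.apply(key, value)
    for p in peers:                # any node's write reaches every other node
        if p is not origin:
            p.apply(key, value)

a, b, c = Node("A"), Node("B"), Node("C")
master_slave_write(a, [b, c], "cmd", "uplink-1")
multi_master_write(b, [a, b, c], "telemetry", "frame-7")
print(c.data)  # {'cmd': 'uplink-1', 'telemetry': 'frame-7'}
```

The Multi-Master variant is what forces the conflict-resolution question examined in the tests: two origins may write the same key concurrently.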

  9. Reference database of hypervariable genetic markers of Argentina: application for molecular anthropology and forensic casework.

    Science.gov (United States)

    Sala, A; Penacino, G; Carnese, R; Corach, D

    1999-06-01

    The population of Argentina is mostly composed of people of European ancestry. Aboriginal communities are at present very reduced in number and restricted to small geographically isolated patches. Three aboriginal communities, the Mapuche, Tehuelche and Wichi, were selected for short tandem repeat (STR) investigation. The metropolitan population of the city of Buenos Aires was analyzed, with both micro- and minisatellites. The minisatellite loci D1S7, D2S44, D4S139, D5S110, D8S358, D10S28, and D17S26 were typed on HaeIII-digested DNA obtained from unrelated individuals. D1S80 was typed by polymerase chain reaction (PCR). The autosomal STRs THO1, FABP, D6S366, CSF1PO, TPOX, F13A1, FES/FPS, vWA, MBPA/B, D16S539, D7S820, D13S317, and RENA4 and the sex chromosome STRs HPRTB, DYS385, DYS389I, DYS389II, DYS19, DYS390, DYS391, DYS392, DYS393 and YCAII were also investigated. As a by-product of our investigations, a reference database was created that is routinely used in forensic casework and paternity testing. STR allele frequency distributions are characterized by significant differences within and also between different populations. In contrast, the minisatellite bin distribution of the metropolitan population is not significantly different from other Caucasian populations.
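
The basic computation behind such a reference database can be sketched as follows: allele frequencies at a locus are estimated by counting alleles over sampled genotypes (two per individual for an autosomal STR). The genotype data below are invented for illustration.

```python
# Allele-frequency estimation for one autosomal STR locus; each sampled
# individual contributes two alleles. Data are invented for illustration.
from collections import Counter

genotypes = [(9, 9.3), (6, 9.3), (9.3, 9.3), (6, 7), (7, 9.3)]  # THO1-style alleles

alleles = Counter()
for a1, a2 in genotypes:
    alleles[a1] += 1
    alleles[a2] += 1

total = sum(alleles.values())  # 2N chromosomes for N individuals
freqs = {allele: count / total for allele, count in sorted(alleles.items())}
print(freqs)  # {6: 0.2, 7: 0.2, 9: 0.1, 9.3: 0.5}
```

Tables of such frequencies, compiled per population, are what forensic casework uses to weigh a profile match.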

  10. The Developmental Brain Disorders Database (DBDB): a curated neurogenetics knowledge base with clinical and research applications.

    Science.gov (United States)

    Mirzaa, Ghayda M; Millen, Kathleen J; Barkovich, A James; Dobyns, William B; Paciorkowski, Alex R

    2014-06-01

    The number of single genes associated with neurodevelopmental disorders has increased dramatically over the past decade. The identification of causative genes for these disorders is important to clinical outcome as it allows for accurate assessment of prognosis, genetic counseling, delineation of natural history, inclusion in clinical trials, and in some cases determines therapy. Clinicians face the challenge of correctly identifying neurodevelopmental phenotypes, recognizing syndromes, and prioritizing the best candidate genes for testing. However, there is no central repository of definitions for many phenotypes, leading to errors of diagnosis. Additionally, there is no system of levels of evidence linking genes to phenotypes, making it difficult for clinicians to know which genes are most strongly associated with a given condition. We have developed the Developmental Brain Disorders Database (DBDB: https://www.dbdb.urmc.rochester.edu/home), a publicly available, online-curated repository of genes, phenotypes, and syndromes associated with neurodevelopmental disorders. DBDB contains the first referenced ontology of developmental brain phenotypes, and uses a novel system of levels of evidence for gene-phenotype associations. It is intended to assist clinicians in arriving at the correct diagnosis, selecting the most appropriate genetic test for that phenotype, and improving the care of patients with developmental brain disorders. For researchers interested in the discovery of novel genes for developmental brain disorders, DBDB provides a well-curated source of important genes against which research sequencing results can be compared. Finally, DBDB allows novel observations about the landscape of the neurogenetics knowledge base.

  11. Management system development to establish an alumni database: application to a nuclear institution

    International Nuclear Information System (INIS)

    Following the professional evolution of alumni has been a long-standing aspiration. To address this, a system to collect data from graduate alumni was developed at IPEN - Nuclear and Energy Research Institute, with support from CNEN - National Nuclear Energy Commission. The system was introduced in 2006, during the celebration of 30 years of the Nuclear Technology Graduate Program of IPEN, held in association with the University of Sao Paulo - USP. The main purpose is to follow the career development of the alumni, mainly those not employed in any of the institutes linked to CNEN. The system allowed the creation of a database comprising information about academic degrees, professional status and the extent of the alumni's contribution to society. It also makes it possible to assess whether the knowledge obtained remained restricted to universities and research institutes or reached private companies. The system supports several statistics concerning not only the alumni but also the professors. In this work the first results of the data collection are presented, comprising more than 750 responses from a total of around 1340 alumni. The final goal is to upgrade the system to collect data, from the several institutes linked to CNEN, on both graduate and undergraduate alumni. (author)

  12. Constructing Database for Drugs and its Application to Biological Sample by HPTLC and GC/MS

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Y.C.; Park, S.W.; Lim, M.A.; Baeck, S.K.; Park, S.Y.; Lee, J.S.; Lee, J.S. [National Institute of Scientific investigation, Seoul (Korea); Lho, D.S. [Korea Institute of Science and Technology, Seoul (Korea)

    2000-04-01

    For the identification of unknown drugs in biological samples, we employed a rapid, sensitive and selective chromatographic method: high performance thin layer chromatography (HPTLC) with an automated TLC sampler and an ultraviolet (UV) scanner. We constructed an HPTLC database (DB) of two hundred five drugs from their Rf values and UV spectra (scanned at 200-360 nm), as well as a gas chromatography/mass spectrometry (GC/MS) DB of ninety-six drugs from their relative retention times (RRT) with respect to lidocaine and their mass spectra. After extracting drugs from biological samples by solid phase extraction (Clean Screen ZSDAU020), we screened them against the HPTLC and GC/MS DBs. Drugs extracted from biological samples showed good matching ratios against the HPTLC DB, and these drugs were confirmed by GC/MS. In conclusion, this DB system is thought to be a very useful method for the screening of unknown drugs in biological samples. (author). 9 refs., 2 tabs., 6 figs.
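
The screening step can be sketched as a tolerance match of an unknown's measured Rf value (HPTLC) and relative retention time (GC/MS) against database entries, requiring agreement of both techniques. All values and tolerances below are invented for illustration.

```python
# Tolerance matching of an unknown against two reference databases;
# entries, Rf values, RRTs and tolerances are invented for illustration.
HPTLC_DB = {"diazepam": 0.62, "caffeine": 0.28, "lidocaine": 0.55}
GCMS_RRT_DB = {"diazepam": 1.42, "caffeine": 0.81, "lidocaine": 1.00}  # RRT vs lidocaine

def screen(rf, rrt, rf_tol=0.03, rrt_tol=0.02):
    """Return drugs consistent with both measurements."""
    hits_rf = {d for d, v in HPTLC_DB.items() if abs(v - rf) <= rf_tol}
    hits_rrt = {d for d, v in GCMS_RRT_DB.items() if abs(v - rrt) <= rrt_tol}
    return sorted(hits_rf & hits_rrt)  # require agreement of both techniques

print(screen(rf=0.60, rrt=1.41))  # ['diazepam']
```

Requiring both techniques to agree is what makes a screening hit worth confirming by full mass-spectral comparison.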

  13. [Application characteristics and situation analysis of volatile oils in database of Chinese patent medicine].

    Science.gov (United States)

    Wang, Sai-Jun; Wu, Zhen-Feng; Yang, Ming; Wang, Ya-Qi; Hu, Peng-Yi; Jie, Xiao-Lu; Han, Fei; Wang, Fang

    2014-09-01

    Aromatic traditional Chinese medicines have a long history in China, with a wide range of varieties. Volatile oils are active ingredients extracted from aromatic herbal medicines; they usually contain tens or hundreds of components and show many biological activities. Therefore, volatile oils are often used in combined prescriptions and made into various efficient preparations for oral administration or external use. Based on the database of Newly Edited National Chinese Traditional Patent Medicines (the second edition), the authors selected 266 Chinese patent medicines containing volatile oils and established an information sheet covering items such as name, dosage, dosage form, specification and usage, and main functions. Subsequently, drawing on multidisciplinary knowledge of pharmaceutics, traditional Chinese pharmacology and the basic theory of traditional Chinese medicine, statistics were compiled on dosage forms and usage, varieties of volatile oils and main functions, together with a status analysis of volatile oils in terms of dosage form development, prescription development, drug instructions and quality control, in order to lay a foundation for further exploration of the market development of volatile oils and their future development orientation. PMID:25522633

  14. Web Technologies And Databases

    OpenAIRE

    Irina-Nicoleta Odoraba

    2011-01-01

    A database is a collection of many types of occurrences of logical records, together with the relationships between records and elementary data aggregates. A database management system (DBMS) is a set of programs for creating and operating a database. Theoretically, any relational DBMS can be used to store the data needed by a Web server. In practice, it has been observed that simple DBMSs such as FoxPro or Access are not suitable for Web sites that are used intensively. For large-scale Web applications...

  15. Nuclear Science References Database

    OpenAIRE

    PRITYCHENKO B.; Běták, E.; B. Singh; Totans, J.

    2013-01-01

    The Nuclear Science References (NSR) database together with its associated Web interface, is the world's only comprehensive source of easily accessible low- and intermediate-energy nuclear physics bibliographic information for more than 210,000 articles since the beginning of nuclear science. The weekly-updated NSR database provides essential support for nuclear data evaluation, compilation and research activities. The principles of the database and Web application development and maintenance...

  16. SISSY: An example of a multi-threaded, networked, object-oriented databased application

    International Nuclear Information System (INIS)

    The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third party software applications (NetCentral Station, Wingz, DataLink), all of which act together as a single application to monitor and analyze the PDSF

  17. Psychoanalytic psychotherapy with a client with bulimia nervosa.

    Science.gov (United States)

    Lunn, Susanne; Daniel, Sarah I F; Poulsen, Stig

    2016-06-01

    This case study presents the progress of one patient with bulimia nervosa who was originally very compromised in psychological domains that are the focus of analytic treatment, and includes in-session therapeutic process and a range of outcomes, for example, eating disorder symptoms, attachment status, and reflective functioning. Nested in a study showing more rapid behavioral improvement in subjects receiving cognitive behavior therapy than in subjects receiving psychoanalytic psychotherapy, the case highlights the importance of supplementing RCTs with single case studies and the need of adapting the therapeutic approach as well as the current therapeutic dialogue to the individual client. (PsycINFO Database Record PMID:27267505

  18. DataBase on Demand

    Science.gov (United States)

    Gaspar Aparicio, R.; Gomez, D.; Coterillo Coz, I.; Wojcik, D.

    2012-12-01

    At CERN a number of key database applications are running on user-managed MySQL database services. The database on demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the actual centralised Oracle based database services. The Database on Demand (DBoD) empowers the user to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, e.g., presently the open community version of MySQL and a single-instance Oracle database server. This article describes a technology approach to face this challenge, a service level agreement (the SLA that the project provides), and an evolution of possible scenarios.

  1. Microwave heating: Industrial applications. (Latest citations from the EI Compendex*plus database). Published Search

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    The bibliography contains citations concerning industrial uses and design of microwave heating equipment. Citations discuss applications in food processing, industrial heating, vulcanization, textile finishing, metallurgical sintering, ceramic manufacturing, paper industries, and curing of polymers. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  2. Design and implementation of a distributed large-scale spatial database system based on J2EE

    Science.gov (United States)

    Gong, Jianya; Chen, Nengcheng; Zhu, Xinyan; Zhang, Xia

    2003-03-01

    With the increasing maturity of distributed object technology, CORBA, .NET and EJB are widely used in the traditional IT field. However, the theory and practice of distributed spatial databases need further improvement, owing to the tension between large-scale spatial data and limited network bandwidth, and between short sessions and long transaction processing. The differences and trends among CORBA, .NET and EJB are discussed in detail; afterwards the concept, architecture and characteristics of a distributed large-scale seamless spatial database system based on J2EE are presented, comprising a GIS client application, a web server, a GIS application server and a spatial data server. Moreover, the design and implementation of the components are explained: the GIS client application based on JavaBeans, the GIS engine based on servlets, and the GIS application server based on GIS Enterprise JavaBeans (containing session beans and entity beans). Besides, experiments on the relation between spatial data volume and response time under different conditions were conducted, which prove that a distributed spatial database system based on J2EE can be used to manage, distribute and share large-scale spatial data on the Internet. Lastly, a distributed large-scale seamless image database on the Internet is presented.
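
A core operation such a seamless spatial database must serve can be sketched as a bounding-box query: the client asks for all features intersecting a view window, without knowing how the data are tiled or partitioned on the server. Feature names and coordinates below are illustrative.

```python
# Bounding-box intersection query, the basic primitive behind serving a
# "seamless" map view. Feature data are invented for illustration.
def intersects(bbox_a, bbox_b):
    """Axis-aligned bounding boxes given as (xmin, ymin, xmax, ymax)."""
    ax0, ay0, ax1, ay1 = bbox_a
    bx0, by0, bx1, by1 = bbox_b
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

features = {
    "road_17":  (114.0, 30.4, 114.6, 30.6),
    "lake_3":   (113.1, 29.8, 113.4, 30.0),
    "bridge_9": (114.2, 30.5, 114.3, 30.55),
}

def query(view_bbox):
    return sorted(fid for fid, bbox in features.items() if intersects(bbox, view_bbox))

print(query((114.1, 30.45, 114.5, 30.7)))  # ['bridge_9', 'road_17']
```

In a real deployment the linear scan would be replaced by a spatial index (e.g. an R-tree) on the spatial data server, but the client-facing contract is the same.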

  3. Materials Properties Database for Selection of High-Temperature Alloys and Concepts of Alloy Design for SOFC Applications

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Z Gary; Paxton, Dean M.; Weil, K. Scott; Stevenson, Jeffry W.; Singh, Prabhakar

    2002-11-24

    To serve as an interconnect / gas separator in an SOFC stack, an alloy should demonstrate the ability to provide (i) bulk and surface stability against oxidation and corrosion during prolonged exposure to the fuel cell environment, (ii) thermal expansion compatibility with the other stack components, (iii) chemical compatibility with adjacent stack components, (iv) high electrical conductivity of the surface reaction products, (v) mechanical reliability and durability at cell exposure conditions, (vi) good manufacturability, processability and fabricability, and (vii) cost effectiveness. As the first step of this approach, a composition and property database was compiled for high temperature alloys in order to assist in determining which alloys offer the most promise for SOFC interconnect applications in terms of oxidation and corrosion resistance. The high temperature alloys of interest included Ni-, Fe-, and Co-base superalloys.

  4. Briefing: The ICE intelligent client capability framework

    OpenAIRE

    Madter, N; Bower, DA

    2015-01-01

    Recent aspirations to transform the delivery of major capital programmes and projects in the public sector are focusing on the achievement of value for money, whole‐life asset management and sustainable procurement, embodied in the principles of the Intelligent Client. However, there is little support offered to those working in client functions to promote the development of the skills and behaviours that underpin effective client decision-making. In line with the re-launch Infrastructure UK'...

  5. Client Update: A Solution for Service Evolution

    OpenAIRE

    Ouederni, Meriem; Salaün, Gwen; Pimentel, Ernesto

    2011-01-01

    In service-based systems, service evolution might raise critical communication issues, since the client cannot be aware of the changes that have occurred on the black-box service side. In this paper, we propose an automated process to adapt the client to the changes that have occurred. Our approach relies on a compatibility-measuring method, and changes the client interface to ensure system compatibility. This solution is fully automated inside a prototype tool w

  6. Do client fees help or hurt?

    Science.gov (United States)

    Barnett, B

    1998-01-01

    This article discusses the impact of client fees for family planning (FP) services on cost recovery and level of user services in developing countries. The UN Population Fund reports that developing country governments currently pay 75% of the costs of FP programs. Donors contribute 15%, and clients pay 10%. Current pressures are on FP services to broaden and improve their scope, while user demand is increasing. Program managers should consider the program's need for funds and the clients' willingness to pay. Clients are willing to pay about 1% of their income for contraception. A study of sterilization acceptance in Mexico finds that the average monthly case load declined by 10% after the 1st price increase from $43 to $55 and declined by 58% after the 2nd price increase to $60. Fewer low-income clients requested sterilization. A CEMOPLAF study in Ecuador finds that in three price increase situations the number of clients seeking services declined, but the economic mix of clients remained about the same. The decline was 20% in the group with a 20% price increase and 26% in the 40% increase group. In setting fees, the first need is to determine unit costs. The Futures Group International recommends considering political, regulatory, and institutional constraints for charging fees; priorities for revenue use; protection for poor clients; and monitoring of money collection and expenditure. Management Sciences for Health emphasizes consideration of the reasons for collection of fees, client affordability, and client perception of quality issues. Sliding scales can be used to protect poor clients. Charging fees for laboratory services can subsidize poor clients. A Bangladesh program operated a restaurant and catering service in order to subsidize FP services. Colombia's PROFAMILIA sells medical and surgical services and a social marketing program in order to expand clinics. PMID:12293239

  7. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The
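
A minimal concrete instance of the relational model surveyed above, using Python's built-in sqlite3 module: two tables related by a key, queried with a join. The schema and data are invented for illustration.

```python
# Two related tables and a join with aggregation: the core relational
# operations in miniature. Schema and data are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE clients (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders  (id INTEGER PRIMARY KEY,
                          client_id INTEGER REFERENCES clients(id),
                          total REAL);
    INSERT INTO clients VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders  VALUES (10, 1, 250.0), (11, 1, 99.5), (12, 2, 40.0);
""")

rows = db.execute("""
    SELECT c.name, SUM(o.total)
    FROM clients c JOIN orders o ON o.client_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 349.5), ('Globex', 40.0)]
```

The join recovers the relationship between the two tables from the shared key alone, which is exactly the data-placement independence the relational model promises.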

  8. GRAD: On Graph Database Modeling

    OpenAIRE

    Ghrab, Amine; Romero, Oscar; Skhiri, Sabri; Vaisman, Alejandro; Zimányi, Esteban

    2016-01-01

    Graph databases have emerged as the fundamental technology underpinning trendy application domains where traditional databases are not well-equipped to handle complex graph data. However, current graph databases support basic graph structures and integrity constraints with no standard algebra. In this paper, we introduce GRAD, a native and generic graph database model. GRAD goes beyond traditional graph database models, which support simple graph structures and constraints. Instead, GRAD pres...

  9. Storing an OWL 2 Ontology in a Relational Database Structure

    OpenAIRE

    Gorskis, Henrihs; Borisov, Arkady

    2015-01-01

    This paper examines the possibility of storing OWL 2 based ontology information in a classical relational database and reviews some existing methods for ontology databases. In most cases a database is a fitting solution for storing and sharing information among systems, clients or agents. Similarly, in order to make domain ontology information more accessible to systems, in a comparable way, it can be stored and provided in a database form. As of today, there is no consensus on a specific ont...
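
The idea described above — keeping ontology information in a classical relational database so other systems can query it — can be illustrated with a minimal sketch. The triple-style `axiom` table, the class names, and the recursive query below are illustrative assumptions for this listing, not the schema proposed in the paper:

```python
import sqlite3

# In-memory database standing in for the ontology store.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE axiom (
        subject   TEXT NOT NULL,
        predicate TEXT NOT NULL,
        object    TEXT NOT NULL
    )
""")

# A few OWL-like statements flattened to subject/predicate/object rows.
axioms = [
    ("Dog",    "rdfs:subClassOf", "Animal"),
    ("Cat",    "rdfs:subClassOf", "Animal"),
    ("Animal", "rdfs:subClassOf", "LivingThing"),
]
conn.executemany("INSERT INTO axiom VALUES (?, ?, ?)", axioms)

def superclasses(cls):
    """Follow rdfs:subClassOf edges transitively via a recursive CTE."""
    rows = conn.execute("""
        WITH RECURSIVE up(c) AS (
            SELECT object FROM axiom
             WHERE subject = ? AND predicate = 'rdfs:subClassOf'
            UNION
            SELECT object FROM axiom JOIN up ON subject = up.c
             WHERE predicate = 'rdfs:subClassOf'
        )
        SELECT c FROM up
    """, (cls,)).fetchall()
    return {r[0] for r in rows}

print(sorted(superclasses("Dog")))  # → ['Animal', 'LivingThing']
```

A recursive common table expression is one way a plain relational engine can answer the transitive class-hierarchy queries an ontology client would otherwise need a reasoner for.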

  10. SISSY: An example of a multi-threaded, networked, object-oriented databased application

    International Nuclear Information System (INIS)

    The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third party software applications (NetCentral Station, Wingz, DataLink) all of which act together as a single application to monitor and analyze the PDSF

  11. Handling of network and database instabilities in CORAL

    International Nuclear Information System (INIS)

    The CORAL software is widely used by the LHC experiments for storing and accessing data using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several back-ends and deployment models, direct client access to Oracle servers being one of the most important use cases. Since 2010, several problems have been reported by the LHC experiments in their use of Oracle through CORAL, involving application errors, hangs or crashes after the network or the database servers became temporarily unavailable. CORAL already provided some level of handling of these instabilities, which are due to external causes and cannot be avoided, but this proved to be insufficient in some cases and to be itself the cause of other problems, such as the hangs and crashes mentioned before, in other cases. As a consequence, a major redesign of the CORAL plugins has been implemented, with the aim of making the software more robust against these database and network glitches. The new implementation ensures that CORAL automatically reconnects to Oracle databases in a transparent way whenever possible and gently terminates the application when this is not possible. Internally, this is done by resetting all relevant parameters of the underlying back-end technology (OCI, the Oracle Call Interface). This presentation reports on the status of this work at the time of the CHEP2012 conference, covering the design and implementation of these new features and the outlook for future developments in this area.
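
The reconnect-or-terminate policy described above can be sketched generically. The sketch below is not CORAL code: the connection factory, the `GlitchError` exception, and the retry budget are all illustrative assumptions standing in for OCI session handling.

```python
import time

class GlitchError(Exception):
    """Stands in for a transient network/database error (e.g. a dropped session)."""

def run_with_reconnect(make_connection, work, retries=3, delay=0.0):
    """Run `work(conn)`; on a transient error, rebuild the connection and retry.

    Mirrors the policy described above: reconnect transparently when possible,
    and terminate cleanly (here: re-raise) when the retry budget is exhausted.
    """
    conn = make_connection()
    for attempt in range(retries + 1):
        try:
            return work(conn)
        except GlitchError:
            if attempt == retries:
                raise                      # cannot recover: fail gently but definitely
            time.sleep(delay)              # back off, then reset the session state
            conn = make_connection()       # transparent reconnect

# Tiny demonstration: the first two calls hit a glitch, the third succeeds.
calls = {"n": 0}
def flaky(conn):
    calls["n"] += 1
    if calls["n"] < 3:
        raise GlitchError("server went away")
    return "query result"

print(run_with_reconnect(lambda: object(), flaky))  # → query result
```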

  12. Handling of network and database instabilities in CORAL

    Science.gov (United States)

    Trentadue, R.; Valassi, A.; Kalkhof, A.

    2012-12-01

    The CORAL software is widely used by the LHC experiments for storing and accessing data using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several back-ends and deployment models, direct client access to Oracle servers being one of the most important use cases. Since 2010, several problems have been reported by the LHC experiments in their use of Oracle through CORAL, involving application errors, hangs or crashes after the network or the database servers became temporarily unavailable. CORAL already provided some level of handling of these instabilities, which are due to external causes and cannot be avoided, but this proved to be insufficient in some cases and to be itself the cause of other problems, such as the hangs and crashes mentioned before, in other cases. As a consequence, a major redesign of the CORAL plugins has been implemented, with the aim of making the software more robust against these database and network glitches. The new implementation ensures that CORAL automatically reconnects to Oracle databases in a transparent way whenever possible and gently terminates the application when this is not possible. Internally, this is done by resetting all relevant parameters of the underlying back-end technology (OCI, the Oracle Call Interface). This presentation reports on the status of this work at the time of the CHEP2012 conference, covering the design and implementation of these new features and the outlook for future developments in this area.

  13. A virtual repository approach to clinical and utilization studies: application in mammography as alternative to a national database.

    OpenAIRE

    Ohno-Machado, L.; Boxwala, A. A.; Ehresman, J.; Smith, D N; Greenes, R. A.

    1997-01-01

    A national mammography database was proposed, based on a centralized architecture for collecting, monitoring, and auditing mammography data. We have developed an alternative architecture relying on Internet-based distributed queries to heterogeneous databases. This architecture creates a "virtual repository", or a federated database which is constructed dynamically, for each query and makes use of data available in legacy systems. It allows the construction of custom-tailored databases at ind...

  14. Irish National Food Ingredient Database: application for assessing patterns of additive usage in foods.

    Science.gov (United States)

    Gilsenan, M B; Lambe, J; Gibney, M J

    2002-12-01

    Patterns of food additive usage in the Irish food supply and changes in patterns of usage between 1995-97 and 1998-99 were assessed by means of an Irish National Food Ingredient Database (INFID). Of the 300 additives permitted for use according to the European Union food additives Directives, some 54% were recorded in foods in INFID. Colours, emulsifiers and acids were the most frequently used additive categories, representing 18, 13 and 12% of the total additives used, respectively. Colours were most commonly recorded in sauces (n = 182 brands, 26% of sauces), emulsifiers were most commonly recorded in biscuits (n = 181 brands, 47% of biscuits) and acids were most commonly recorded in sauces (304 brands, 43% of sauces). Carotenes (E160a), Annatto (E160b), mono- and diglycerides of fatty acids (E471) and citric acid (E330) were the most commonly used colour, emulsifier and acid, respectively. All diet soft drinks (n = 37), low-fat spreads (n = 25) and liver pâtés (n = 10) recorded the use of at least one additive. When expressed in terms of the number of brands that contain additives, sauces (n = 522, 73% of sauces), biscuits (n = 323, 84% of biscuits) and preserves (n = 321, 85% of preserves) were ranked highest. For most categories of additive (n = 24), there appeared to be a minimal change in qualitative additive usage between 1995-97 and 1998-99. However, there was a significant increase in the frequency of use of emulsifiers (p < 0.001), acids (p < 0.01), sweeteners (p < 0.05) and acidity regulators (p < 0.05), and a significant decrease in the frequency of use of antioxidants (p < 0.05) during the period 1998-99 compared with 1995-97. Despite changes in additive usage patterns, it appeared that changes in the types of brands on sale between both periods were more apparent than actual changes in qualitative ingredient formulations across brands, as some 17% of brands that were on sale in 1995-97 were no longer on sale in 1998-99.

  15. Developing a virtual reality application for training Nuclear Power Plant operators: Setting up a database containing dose rates in the refuelling plant

    International Nuclear Information System (INIS)

    Operators in Nuclear Power Plants can receive high doses during refuelling operations. A training programme for simulating refuelling operations will be useful in reducing the doses received by workers as well as minimising operation time. With this goal in mind, a virtual reality application is developed within the framework of the CIPRES project. The application requires doses, both instantaneous and accumulated, to be displayed at all times during operator training. Therefore, it is necessary to set up a database containing dose rates at every point in the refuelling plant. This database is based on radiological protection surveillance data measured in the plant during refuelling operations. Some interpolation routines have been used to estimate doses through the refuelling plant. Different assumptions have been adopted in order to perform the interpolation and obtain consistent data. In this paper, the procedures developed to set up the dose database for the virtual reality application are presented and analysed. (authors)
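
The kind of spatial interpolation mentioned above can be sketched with inverse-distance weighting, one common choice; the abstract does not specify the routines used, so the method, the survey points, and the dose values below are all assumptions for illustration:

```python
def idw_dose(x, y, samples, power=2.0):
    """Estimate a dose rate at (x, y) by inverse-distance weighting
    of surveyed points given as ((xi, yi), dose_i) pairs."""
    num = den = 0.0
    for (xi, yi), dose in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return dose                  # exactly on a measured point
        w = 1.0 / d2 ** (power / 2.0)    # weight falls off with distance^power
        num += w * dose
        den += w
    return num / den

# Hypothetical survey points in the refuelling plant (dose rate in uSv/h).
survey = [((0.0, 0.0), 10.0), ((10.0, 0.0), 2.0), ((0.0, 10.0), 4.0)]

print(round(idw_dose(0.0, 0.0, survey), 2))  # → 10.0 (on a measured point)
print(round(idw_dose(5.0, 0.0, survey), 2))  # → 5.82
```

The "different assumptions adopted to obtain consistent data" would, in a scheme like this, correspond to choices such as the distance exponent and how points behind shielding are excluded.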

  16. Developing a virtual reality application for training nuclear power plant operators: setting up a database containing dose rates in the refuelling plant.

    Science.gov (United States)

    Ródenas, J; Zarza, I; Burgos, M C; Felipe, A; Sánchez-Mayoral, M L

    2004-01-01

    Operators in Nuclear Power Plants can receive high doses during refuelling operations. A training programme for simulating refuelling operations will be useful in reducing the doses received by workers as well as minimising operation time. With this goal in mind, a virtual reality application is developed within the framework of the CIPRES project. The application requires doses, both instantaneous and accumulated, to be displayed at all times during operator training. Therefore, it is necessary to set up a database containing dose rates at every point in the refuelling plant. This database is based on radiological protection surveillance data measured in the plant during refuelling operations. Some interpolation routines have been used to estimate doses through the refuelling plant. Different assumptions have been adopted in order to perform the interpolation and obtain consistent data. In this paper, the procedures developed to set up the dose database for the virtual reality application are presented and analysed. PMID:15266073

  17. Fuzzy Modeling of Client Preference in Data-Rich Marketing Environments

    NARCIS (Netherlands)

    M. Setnes; U. Kaymak (Uzay)

    2000-01-01

    Advances in computational methods have led, in the world of financial services, to huge databases of client and market information. In the past decade, various computational intelligence (CI) techniques have been applied in mining this data for obtaining knowledge and in-depth informatio

  18. Can You Diagnose Me Now? A Proposal to Modify FDA's Regulation of Smartphone Mobile Health Applications with a Pre-Market Notification and Application Database System.

    Science.gov (United States)

    McInerney, Stephen

    2015-01-01

    Mobile applications provide limitless possibilities for the future of medical care. Yet these changes have also created concerns about patient safety. Under the Federal Food, Drug, and Cosmetic Act (FDCA), the Food and Drug Administration (FDA) has the authority to regulate a much broader spectrum of products beyond traditional medical devices like stethoscopes or pacemakers. The regulatory question is not if FDA has the statutory authority to regulate health-related software, but rather how it will exercise its regulatory authority. In September 2013, FDA published guidance on Mobile Medical Applications; in it, the Agency limited its oversight to a small subset of medical-related mobile applications, referred to as "mobile medical applications." For the guidance to be effective, FDA must continue to work directly with all actors--including innovators, doctors, and patients--as the market for mobile health applications continues to develop. This Article argues that FDA should adopt a two-step plan--a pre-market notification program and a mobile medical application database--to aid in the successful implementation of its 2013 guidance. By doing so, FDA will ensure that this burgeoning market can reach its fullest potential. PMID:26292476

  19. Client Server Model Based DAQ System for Real-Time Air Pollution Monitoring

    Directory of Open Access Journals (Sweden)

    Vetrivel. P

    2014-01-01

    Full Text Available The proposed system consists of a client-server model based data-acquisition unit. The embedded web server integrates the pollution server and a DAQ that collects air pollutant levels (CO, NO2, and SO2). The pollution server is designed with modern resource-constrained embedded systems in mind. In contrast, an application server is designed for the efficient execution of programs and scripts supporting the construction of various applications. While a pollution server mainly deals with sending HTML for display in a web browser on the client terminal, an application server provides access to server-side logic for pollutant levels to be used by client application programs. The embedded web server is an ARM MCB2300 board with Internet connectivity; this standalone device both gathers air pollutant levels and acts as the air pollution server, and is accessed by various clients.

  20. A qualitative meta-analysis examining clients' experiences of psychotherapy: A new agenda.

    Science.gov (United States)

    Levitt, Heidi M; Pomerville, Andrew; Surace, Francisco I

    2016-08-01

    This article argues that psychotherapy practitioners and researchers should be informed by the substantive body of qualitative evidence that has been gathered to represent clients' own experiences of therapy. The current meta-analysis examined qualitative research studies analyzing clients' experiences within adult individual psychotherapy that appeared in English-language journals. This omnibus review integrates research from across psychotherapy approaches and qualitative methods, focusing on the cross-cutting question of how clients experience therapy. It utilized an innovative method in which 67 studies were subjected to a grounded theory meta-analysis in order to develop a hierarchy of data and then 42 additional studies were added into this hierarchy using a content meta-analytic method-summing to 109 studies in total. Findings highlight the critical psychotherapy experiences for clients, based upon robust findings across these research studies. Process-focused principles for practice are generated that can enrich therapists' understanding of their clients in key clinical decision-making moments. Based upon these findings, an agenda is suggested in which research is directed toward heightening therapists' understanding of clients and recognizing them as agents of change within sessions, supporting the client as self-healer paradigm. This research aims to improve therapists' sensitivity to clients' experiences and thus can expand therapists' attunement and intentionality in shaping interventions in accordance with whichever theoretical orientation is in use. The article advocates for the full integration of the qualitative literature in psychotherapy research in which variables are conceptualized in reference to an understanding of clients' experiences in sessions. (PsycINFO Database Record PMID:27123862

  1. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  2. Community Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This excel spreadsheet is the result of merging at the port level of several of the in-house fisheries databases in combination with other demographic databases...

  3. Combinatorial Design of Some Database Application Technologies Based on B/S

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Evolving from the principal-subordinate structure of C/S to a flexible, multi-tiered distributed structure, i.e. the B/S architecture, so as to form a wide, distributed and orderly Internet/Intranet integrated management information system, is the worldwide trend in application software development. Advantages and disadvantages of the two modes, C/S and B/S, are compared. It is pointed out that at present a B/S mode alone cannot yet fully fulfil the demands of some complicated data processing, inform...

  4. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  5. The X-Files: Investigating Alien Performance in a Thin-client World

    OpenAIRE

    Gunther, Neil J.

    2000-01-01

    Many scientific applications use the X11 window environment; an open source windows GUI standard employing a client/server architecture. X11 promotes: distributed computing, thin-client functionality, cheap desktop displays, compatibility with heterogeneous servers, remote services and administration, and greater maturity than newer web technologies. This paper details the author's investigations into close encounters with alien performance in X11-based seismic applications running on a 200-n...

  6. Hydrogen Leak Detection Sensor Database

    Science.gov (United States)

    Baker, Barton D.

    2010-01-01

    This slide presentation reviews the characteristics of the Hydrogen Sensor database. The database is the result of NASA's continuing interest in and improvement of its ability to detect and assess gas leaks in space applications. The database specifics and a snapshot of an entry in the database are reviewed. Attempts were made to determine the applicability of each of the 65 sensors for ground and/or vehicle use.

  7. Conditioning Probabilistic Databases

    CERN Document Server

    Koch, Christoph

    2008-01-01

    Past research on probabilistic databases has studied the problem of answering queries on a static database. Application scenarios of probabilistic databases, however, often involve the conditioning of a database using additional information in the form of new evidence. The conditioning problem is thus to transform a probabilistic database of priors into a posterior probabilistic database which is materialized for subsequent query processing or further refinement. It turns out that the conditioning problem is closely related to the problem of computing exact tuple confidence values. It is known that exact confidence computation is an NP-hard problem. This has led researchers to consider approximation techniques for confidence computation. However, neither conditioning nor exact confidence computation can be solved using such techniques. In this paper we present efficient techniques for both problems. We study several problem decomposition methods and heuristics that are based on the most successful search techn...
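
Conditioning as described above — transforming prior tuple probabilities plus evidence into a posterior database — can be illustrated by brute-force enumeration of possible worlds for a tuple-independent table. Exact confidence computation is NP-hard in general, which is exactly what motivates the paper's decomposition techniques; the tiny three-tuple table below is an illustrative assumption:

```python
from itertools import product

# Tuple-independent probabilistic table: tuple -> prior marginal probability.
priors = {"a": 0.5, "b": 0.4, "c": 0.8}

def condition(priors, evidence):
    """Return posterior marginals given `evidence`, a predicate over worlds.

    A world is a dict tuple -> bool; its prior probability is the product
    of the independent tuple probabilities."""
    names = list(priors)
    post = {t: 0.0 for t in names}
    z = 0.0  # probability mass of worlds consistent with the evidence
    for bits in product([False, True], repeat=len(names)):
        world = dict(zip(names, bits))
        p = 1.0
        for t in names:
            p *= priors[t] if world[t] else 1.0 - priors[t]
        if evidence(world):
            z += p
            for t in names:
                if world[t]:
                    post[t] += p
    return {t: post[t] / z for t in names}

# Evidence: at least one of a, b is present in the database.
posterior = condition(priors, lambda w: w["a"] or w["b"])
print({t: round(p, 3) for t, p in posterior.items()})  # → {'a': 0.714, 'b': 0.571, 'c': 0.8}
```

Note how the tuple `c`, independent of the evidence, keeps its prior, while `a` and `b` are renormalized by the evidence mass P(a or b) = 0.7.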

  8. Using Solid State Drives as a Mid-Tier Cache in Enterprise Database OLTP Applications

    Science.gov (United States)

    Khessib, Badriddine M.; Vaid, Kushagra; Sankar, Sriram; Zhang, Chengliang

    When originally introduced, flash-based solid state drives (SSD) exhibited very high random read throughput with low sub-millisecond latencies. However, in addition to their steep prices, SSDs suffered from slow write rates and reliability concerns related to cell wear. For these reasons, they were relegated to a niche status in the consumer and personal computer market. Since then, several architectural enhancements have been introduced that led to a substantial increase in random write operations as well as a reasonable improvement in reliability. From a purely performance point of view, these high I/O rates and improved reliability make SSDs an ideal choice for enterprise On-Line Transaction Processing (OLTP) applications. However, from a price/performance point of view, the case for SSDs may not be clear. Enterprise-class SSD price per GB continues to be at least 10x higher than that of conventional magnetic hard disk drives (HDD), despite a considerable drop in flash chip prices.
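
The mid-tier role described above — a fast layer between DRAM and the HDD store — behaves like a cache, and can be sketched as a simple LRU; the capacities, page keys, and eviction policy below are illustrative assumptions, not a model of any particular SSD product or database engine:

```python
from collections import OrderedDict

class MidTierCache:
    """LRU cache standing in for an SSD tier in front of slow HDD pages."""

    def __init__(self, capacity, backing_store):
        self.capacity = capacity
        self.backing = backing_store      # e.g. dict page_id -> bytes (the "HDD")
        self.pages = OrderedDict()        # cached page_id -> bytes, in LRU order
        self.hits = self.misses = 0

    def read(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)     # refresh recency on a hit
            self.hits += 1
            return self.pages[page_id]
        self.misses += 1
        value = self.backing[page_id]           # slow path: fetch from "HDD"
        self.pages[page_id] = value
        if len(self.pages) > self.capacity:
            self.pages.popitem(last=False)      # evict least-recently-used page
        return value

hdd = {i: f"page-{i}" for i in range(100)}
cache = MidTierCache(capacity=2, backing_store=hdd)
for pid in [1, 2, 1, 3, 1]:                     # page 2 is evicted when 3 arrives
    cache.read(pid)
print(cache.hits, cache.misses)  # → 2 3
```

The price/performance question raised in the abstract then reduces to whether the hit rate on such a tier justifies its cost per GB relative to adding DRAM or more spindles.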

  9. Discover knowledge in databases: Mining of data and applications; Descubrir conocimiento en bases de datos: Mineria de datos y aplicaciones

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Martinez, Andres F [Instituto de Investigaciones Electricas, Temixco, Morelos (Mexico); Morales Manzanares, Eduardo [Instituto Tecnologico de Estudios Superiores de Monterrey (ITESM), Campus Cuernavaca, Morelos (Mexico)

    2000-07-01

    In recent years there has been enormous growth in the capacity to generate and store information, due to the increasing automation of processes in general and to advances in information storage capacity. Unfortunately, information analysis techniques have not shown an equivalent development, so there is a need for a new generation of computing techniques and tools that can assist decision makers in the automatic and intelligent analysis of large volumes of information. Finding useful knowledge among great amounts of data is the main objective of the area of knowledge discovery in databases. The present article aims to disseminate the process of knowledge discovery in databases in general and the concept of data mining in particular; to establish the relation between the process of knowledge discovery in databases and data mining; and to set out the characteristics and complexities of searching for useful patterns in data. The main data mining methods are also described, along with the application areas where these algorithms have had the greatest success.

  10. Open source database of images DEIMOS: extension for large-scale subjective image quality assessment

    Science.gov (United States)

    Vítek, Stanislav

    2014-09-01

    DEIMOS (Database of Images: Open Source) is an open-source database of images and video sequences for testing, verification and comparison of various image and/or video processing techniques such as compression, reconstruction and enhancement. This paper deals with an extension of the database that allows performing large-scale web-based subjective image quality assessment. The extension implements both an administrative and a client interface. The proposed system is aimed mainly at mobile communication devices and takes advantage of HTML5 technology, meaning that participants do not need to install any application and assessment can be performed using a web browser. The assessment campaign administrator can select images from the large database and then apply rules defined by various test procedure recommendations. The standard test procedures may be fully customized and saved as a template. Alternatively, the administrator can define a custom test using images from the pool and other components, such as evaluation forms and ongoing questionnaires. The image sequence is delivered to the online client, e.g. a smartphone or tablet, as a fully automated assessment sequence, or the viewer can decide on the timing of the assessment if required. Environmental data and viewing conditions (e.g. illumination, vibrations, GPS coordinates) may be collected and subsequently analyzed.

  11. A press database on natural risks and its application in the study of floods in Northeastern Spain

    Directory of Open Access Journals (Sweden)

    M. C. Llasat

    2009-12-01

    Full Text Available The aim of this work is to introduce a systematic press database on natural hazards and climate change in Catalonia (NE of Spain) and to analyze its potential application to social-impact studies. For this reason, a review of the concepts of risk, hazard, vulnerability and social perception is also included. This database has been built for the period 1982–2007 and contains all the news related to those issues published by the oldest still-active newspaper in Catalonia. Some parameters are registered for each article and for each event, including criteria that enable us to determine the importance accorded to it by the newspaper, and a compilation of information about it. This Access database allows each article to be classified on the basis of the seven defined topics and key words, and records summary information about the format and structure of the news item itself, the social impact of the event, and data about the magnitude or intensity of the event. The coverage given to this type of news has been assessed because of its influence on the construction of the social perception of natural risk and climate change, and as a potential source of information about them. The treatment accorded by the press to different risks is also considered. More than 14 000 press articles have been classified. Results show that the largest number of news items for the period 1982–2007 relates to forest fires and droughts, followed by floods and heavy rainfalls, although floods are the major risk in the region of study. Two flood events recorded in 2002 have been analyzed in order to show an example of the role of the press information as an indicator of risk perception.

  12. Client Verbal Response Category System: Preliminary Data.

    Science.gov (United States)

    Meier, Augustine; Boivin, Micheline

    1986-01-01

    The Client Verbal Response Category System classifies client responses into Temporal, Directional and Experiential categories. The categories with their subcategories are defined, interjudge reliability data is presented, and the instrument's utility in psychotherapy process research is demonstrated. Initial results indicate that the instrument is…

  13. Client Motivation and Rehabilitation Counseling Outcome.

    Science.gov (United States)

    Salomone, Paul R.

    This study investigates the relationship between client motivation or lack of motivation for vocational rehabilitation services, and rehabilitation outcome. Clients who had received services at a rehabilitation center during a two year period were rated on their level of motivation for rehabilitation services using the contents of diagnostic…

  14. Organizational and Client Commitment among Contracted Employees

    Science.gov (United States)

    Coyle-Shapiro, Jacqueline A-M.; Morrow, Paula C.

    2006-01-01

    This study examines affective commitment to employing and client organizations among long-term contracted employees, a new and growing employment classification. Drawing on organizational commitment and social exchange literatures, we propose two categories of antecedents of employee commitment to client organizations. We tested our hypotheses…

  15. YASGUI: Not Just Another SPARQL Client

    NARCIS (Netherlands)

    L. Rietveld; R. Hoekstra

    2013-01-01

    This paper introduces YASGUI, a user-friendly SPARQL client. We compare YASGUI with other SPARQL clients, and show the added value and ease of integrating Web APIs, services, and new technologies such as HTML5. Finally, we discuss some of the challenges we encountered in using these technologies for

  16. Indoor Location Fingerprinting with Heterogeneous Clients

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun

    2011-01-01

    Heterogeneous wireless clients measure signal strength differently. This is a fundamental problem for indoor location fingerprinting, and it has a high impact on the positioning accuracy. Mapping-based solutions have been presented that require manual and error-prone calibration for each new client...

  17. Client Contact versus Paperwork: A Student Perspective.

    Science.gov (United States)

    Strohmer, Douglas C.; And Others

    1979-01-01

    Surveys master's level rehabilitation counseling students and examines percentage of time students spend involved in client contact and paperwork during their internship. Time spent in client contact was nearly double that spent doing paperwork for this group. Data from a number of settings are discussed. (Author)

  18. Database as a service system for business database application hosting and its resource optimization technique

    Institute of Scientific and Technical Information of China (English)

    王卓昊; 王希诚

    2011-01-01

    Database as a Service (DBaaS) is becoming a research hotspot of cloud computing, and business database application hosting is a main application domain that raises requirements for data isolation, performance isolation and reliability guarantees. To satisfy these requirements, this paper proposes a virtual-machine-based database hosting method and a corresponding DBaaS system. Furthermore, for the key problem of optimally allocating resources (such as CPU and memory) to the virtual machines that host the database applications of different tenants, the paper formalizes a constraint programming problem and solves it with a greedy algorithm based on a performance model and a utility function. An application example from science and technology information services and the accompanying experiments show that the algorithm can dynamically optimize resource allocation according to each tenant's database load, improving system resource utilization while meeting each tenant's database performance demand.

  19. The Technology of Web Databases and Its Applications in Geophysical Exploration

    Institute of Scientific and Technical Information of China (English)

    孙旭; 鲍新毅; 李灿平; 刘飚

    2001-01-01

    The importance of developing Web database applications in geophysical exploration is discussed in the paper. The principles and features of a variety of Web database techniques, as well as their prospects, are described and analyzed. Finally, an example using real geophysical data from a particular region is given to illustrate the application of Web databases to geophysical exploration.

  20. Autism genetic database (AGD): a comprehensive database including autism susceptibility gene-CNVs integrated with known noncoding RNAs and fragile sites

    Directory of Open Access Journals (Sweden)

    Talebizadeh Zohreh

    2009-09-01

    Full Text Available Abstract Background Autism is a highly heritable complex neurodevelopmental disorder, therefore identifying its genetic basis has been challenging. To date, numerous susceptibility genes and chromosomal abnormalities have been reported in association with autism, but most discoveries either fail to be replicated or account for a small effect. Thus, in most cases the underlying causative genetic mechanisms are not fully understood. In the present work, the Autism Genetic Database (AGD) was developed as a literature-driven, web-based, and easy to access database designed with the aim of creating a comprehensive repository for all the currently reported genes and genomic copy number variations (CNVs) associated with autism in order to further facilitate the assessment of these autism susceptibility genetic factors. Description AGD is a relational database that organizes data resulting from exhaustive literature searches for reported susceptibility genes and CNVs associated with autism. Furthermore, genomic information about human fragile sites and noncoding RNAs was also downloaded and parsed from miRBase, snoRNA-LBME-db, piRNABank, and the MIT/ICBP siRNA database. A web client genome browser enables viewing of the features while a web client query tool provides access to more specific information for the features. When applicable, links to external databases including GenBank, PubMed, miRBase, snoRNA-LBME-db, piRNABank, and the MIT siRNA database are provided. Conclusion AGD comprises a comprehensive list of susceptibility genes and copy number variations reported to-date in association with autism, as well as all known human noncoding RNA genes and fragile sites. Such a unique and inclusive autism genetic database will facilitate the evaluation of autism susceptibility factors in relation to known human noncoding RNAs and fragile sites, impacting on human diseases. As a result, this new autism database offers a valuable tool for the research

  1. Failure database and tools for wind turbine availability and reliability analyses. The application of reliability data for selected wind turbines

    DEFF Research Database (Denmark)

    Kozine, Igor; Christensen, P.; Winther-Jensen, M.

    2000-01-01

    The objective of this project was to develop and establish a database for collecting reliability and reliability-related data, for assessing the reliability of wind turbine components and subsystems and wind turbines as a whole, as well as for assessing wind turbine availability while ranking the ...... similar safety systems. The database was established with the Microsoft Access Database Management System; the software for reliability and availability assessments was created with Visual Basic.

  2. rasdaman Array Database: current status

    Science.gov (United States)

    Merticariu, George; Toader, Alexandru

    2015-04-01

    defines request types for inserting, updating and deleting coverages. A web client, designed for both novice and experienced users, is also available for the service and its extensions. The client offers an intuitive interface that allows users to work with multi-dimensional coverages by abstracting the specifics of the standard definitions of the requests. The Web Coverage Processing Service defines a language for on-the-fly processing and filtering multi-dimensional raster coverages. rasdaman exposes this service through the WCS processing extension. Demonstrations are provided online via the Earthlook website (earthlook.org) which presents use-cases from a wide variety of application domains, using the rasdaman system as processing engine.

  3. Micro-ISIS Training Course: Data Processing Principles and Applications. (Databases for Bibliographic Records and Serials, Training Material and Training Institutions, Addresses, and Registration of Mail).

    Science.gov (United States)

    Janssens, D.; Jesse, A.

    This document contains a compilation of overhead projection sheets from an International Labour Office (ILO) Training Course in Micro-ISIS held in March 1988, augmented by explanatory text and applications to a wide variety of databases. It is designed both to provide a visual, didactic approach to Micro-ISIS, and to constitute a quick-reference…

  4. La Aplicacion de las Bases de Datos al Estudio Historico del Espanol (The Application of Databases to the Historical Study of Spanish).

    Science.gov (United States)

    Nadal, Gloria Claveria; Lancis, Carlos Sanchez

    1997-01-01

    Notes that the employment of databases to the study of the history of a language is a method that allows for substantial improvement in investigative quality. Illustrates this with the example of the application of this method to two studies of the history of Spanish developed in the Language and Information Seminary of the Independent University…

  5. The new ALICE DQM client: a web access to ROOT-based objects

    Science.gov (United States)

    von Haller, B.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Delort, C.; Dénes, E.; Diviá, R.; Fuchs, U.; Niedziela, J.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; Wegrzynek, A.

    2015-12-01

    A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) plays an essential role in the experiment operation by providing shifters with immediate feedback on the data being recorded in order to quickly identify and overcome problems. An immediate access to the DQM results is needed not only by shifters in the control room but also by detector experts worldwide. As a consequence, a new web application has been developed to dynamically display and manipulate the ROOT-based objects produced by the DQM system in a flexible and user friendly interface. The architecture and design of the tool, its main features and the technologies that were used, both on the server and the client side, are described. In particular, we detail how we took advantage of the most recent ROOT JavaScript I/O and web server library to give interactive access to ROOT objects stored in a database. We describe as well the use of modern web techniques and packages such as AJAX, DHTMLX and jQuery, which has been instrumental in the successful implementation of a reactive and efficient application. We finally present the resulting application and how code quality was ensured. We conclude with a roadmap for future technical and functional developments.

  6. Failing to diagnose and failing to treat an addicted client: Two potentially life-threatening clinical errors.

    Science.gov (United States)

    Liese, Bruce S; Reis, Daniel J

    2016-09-01

    Psychotherapists risk making 2 types of errors with clients who struggle with addictive behaviors: failure to diagnose addictive behaviors and failure to effectively treat addictive behaviors. Given the high prevalence of addictive behaviors in clinical populations, therapists are in a unique position to assist individuals with these problems. It is assumed that therapists possess general diagnostic and treatment skills, and yet many do not diagnose or do not treat addictive behaviors. Reasons for making these errors include prohibitive beliefs and limited knowledge about addictive behaviors. We offer specific recommendations to reduce these psychotherapy errors. These include: (a) more deliberate screening and diagnosis of addictive behaviors, (b) increased application of empirically supported addiction treatments, (c) required education and training in addictive behaviors, (d) modification of prohibitive attitudes about addressing addictive behaviors, and (e) increased attention paid to addictive behaviors by professional psychotherapy organizations. (PsycINFO Database Record) PMID:27631864

  8. Team-client Relationships And Extreme Programming

    Directory of Open Access Journals (Sweden)

    John Karn

    2008-01-01

    Full Text Available This paper describes a study that examined the relationship between software engineering teams who adhered to the extreme programming (XP) methodology and their project clients. The study involved observing teams working on projects for clients who had commissioned a piece of software to be used in the real world. Interviews were conducted during and at the end of the project to get client opinion on how the project had progressed. Of interest to the researchers were opinions on frequency of feedback, how the team captured requirements, whether or not the iterative approach of XP proved to be helpful, and the level of contextual and software engineering knowledge the client had at the start of the project. In theory, fidelity to XP should result in enhanced communication, reduce expectation gaps, and lead to greater client satisfaction. Our results suggest that this depends heavily on the communication skills of the team and of the client, the expectations of the client, and the nature of the project.

  9. Building a medical multimedia database system to integrate clinical information: an application of high-performance computing and communications technology.

    Science.gov (United States)

    Lowe, H J; Buchanan, B G; Cooper, G F; Vries, J K

    1995-01-01

    The rapid growth of diagnostic-imaging technologies over the past two decades has dramatically increased the amount of nontextual data generated in clinical medicine. The architecture of traditional, text-oriented, clinical information systems has made the integration of digitized clinical images with the patient record problematic. Systems for the classification, retrieval, and integration of clinical images are in their infancy. Recent advances in high-performance computing, imaging, and networking technology now make it technologically and economically feasible to develop an integrated, multimedia, electronic patient record. As part of The National Library of Medicine's Biomedical Applications of High-Performance Computing and Communications program, we plan to develop Image Engine, a prototype microcomputer-based system for the storage, retrieval, integration, and sharing of a wide range of clinically important digital images. Images stored in the Image Engine database will be indexed and organized using the Unified Medical Language System Metathesaurus and will be dynamically linked to data in a text-based, clinical information system. We will evaluate Image Engine by initially implementing it in three clinical domains (oncology, gastroenterology, and clinical pathology) at the University of Pittsburgh Medical Center.

  10. Statistical Analysis of Charpy Transition Temperature Shift in Reactor Pressure Vessel Steels: Application of Nuclear Materials Database(MatDB)

    International Nuclear Information System (INIS)

    The MDPortal contains various technical documents on the degradation and development of nuclear materials. In addition, the nuclear materials database (MatDB) was recently launched at KAERI. MatDB covers the mechanical properties of various nuclear structural materials used in components such as the reactor pressure vessel, steam generator, and primary and secondary piping. In this study, we briefly introduce MatDB and analyze the Charpy transition temperature shift in reactor pressure vessel steels of Korean nuclear power plants retrieved from MatDB, illustrating an application of the database to a real case of material degradation in NPPs. MatDB currently includes tensile, Charpy, fatigue, and J-R curve results; other properties such as creep, fracture toughness, and SCC degradation are to be added in the future. The data from MatDB were successfully applied to the TTS analysis of Korean RPV steels in surveillance tests

  12. Design and Application of an Integrated Database for a Large Remote Sensing Processing System

    Institute of Scientific and Technical Information of China (English)

    李军; 刘高焕; 迟耀斌; 朱重光

    2001-01-01

    In a large remote sensing image processing and application system, various background and thematic data must often be acquired in real time; this is a process of dynamic data integration. The integrated database is a framework for data integration built on top of various thematic databases. In practice, it is very difficult to integrate different kinds of data into one database managed by commercial GIS or image processing software such as ARC/INFO or ERDAS. In this paper, the author describes an integrated database management system, a framework based on several kinds of databases: an image database, vector spatial database, spatial entity spectrum characteristics database, spatial entity image sample database, control point (tics) database, documents database, models database, and product database. The querying and retrieving functions, which are basic to the integrated database management system, depend on metadata divided into three parts: database metadata, dataset metadata, and attribute field metadata. Finally, the author introduces the concept of a virtual database, a logical database built on other physical databases, and describes in detail its structure and application in a product-making system for a large remote sensing application.

  13. Threshold detection for the generalized Pareto distribution: Review of representative methods and application to the NOAA NCDC daily rainfall database

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonios; Puliga, Michelangelo; Deidda, Roberto

    2016-04-01

    In extreme excess modeling, one fits a generalized Pareto (GP) distribution to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches, such as nonparametric methods that are intended to locate the changing point between extreme and nonextreme regions of the data, graphical methods where one studies the dependence of GP-related metrics on the threshold level u, and Goodness-of-Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u that a GP distribution model is applicable. Here we review representative methods for GP threshold detection, discuss fundamental differences in their theoretical bases, and apply them to 1714 overcentennial daily rainfall records from the NOAA-NCDC database. We find that nonparametric methods are generally not reliable, while methods that are based on GP asymptotic properties lead to unrealistically high threshold and shape parameter estimates. The latter is justified by theoretical arguments, and it is especially the case in rainfall applications, where the shape parameter of the GP distribution is low; i.e., on the order of 0.1-0.2. Better performance is demonstrated by graphical methods and GoF metrics that rely on preasymptotic properties of the GP distribution. For daily rainfall, we find that GP threshold estimates range between 2 and 12 mm/d with a mean value of 6.5 mm/d, while the existence of quantization in the empirical records, as well as variations in their size, constitute the two most important factors that may significantly affect the accuracy of the obtained results.
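One of the graphical methods the survey covers is the mean-excess (mean residual life) plot: for a GP distribution the mean excess is linear in the threshold u, so one looks for the lowest u above which the plot becomes roughly linear. A minimal pure-Python sketch, with made-up daily rainfall values (mm/d) for illustration:

```python
# Mean-excess function for graphical GP threshold detection.
# The rainfall values below are invented for illustration only.

def mean_excess(sample, threshold):
    """Average exceedance above `threshold` for points that exceed it."""
    excesses = [x - threshold for x in sample if x > threshold]
    return sum(excesses) / len(excesses) if excesses else float("nan")

rain = [0.0, 1.2, 3.4, 6.5, 8.0, 12.1, 20.5, 33.0, 47.2, 60.1]
for u in (2.0, 6.5, 12.0):
    # In a real analysis one plots mean_excess against u and looks for
    # the onset of linearity; here we just print a few sample points.
    print(u, round(mean_excess(rain, u), 2))
```

In practice the candidate thresholds would span the observed range of the record, and the resulting plot would be inspected (or tested) for linearity.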

  14. Esourcing capability model for client organizations

    CERN Document Server

    Hefley, Bill

    2010-01-01

    The eSourcing Capability Model for Client Organizations (eSCM-CL) is the best practices model that enables client organizations to appraise and improve their capability to foster the development of more effective relationships and to better manage these relationships. This title helps readers successfully implement a full range of client-organization tasks, ranging from developing the organization's sourcing strategy, planning for sourcing and service provider selection, initiating an agreement with service providers, managing service delivery, and completing the agreement. The eSCM-CL has been

  15. Client-Oriented Approach: Forming the System of Management of the Bank Relations with Clients

    Directory of Open Access Journals (Sweden)

    Zavadska Diana V.

    2015-03-01

    Full Text Available The aim of the article is to develop the theoretical principles of forming the bank's relations with clients as part of implementing a client-oriented strategy. As a result of the conducted research, definitions of client orientation and of the mechanism and system of management are presented. The system of management of the bank's relations with clients, and the purpose and objectives of its formation, are substantiated. The hierarchy of subjects forming and managing the process of the bank's relations with clients is presented. The ways of implementing in practice the functions of the mechanism of managing relations with clients are revealed. It is shown that, to implement the client-oriented approach, the banking institution should have a comprehensive view of its clients' behavior, a detailed understanding of which will allow more accurate segmentation and the building of individualized partnership relations. Implementing the principle of comprehensive knowledge at every level of the client relationship, developing employee behavior techniques and special techniques for working with the most valuable clients, and using analytics and forecasting tools will make marketing campaigns better targeted and lead to minimization of additional costs, satisfaction of every client, loyalty, an increased market share, growth in sales volume, and increased profits for the banking institution.

  16. Analysis of the Characteristics and Applications of Computer Mobile Databases

    Institute of Scientific and Technical Information of China (English)

    乐瑞卿

    2011-01-01

    With rapid social and economic development, mobile computing technology has also advanced, and mobile databases are gradually coming into practical use; in embedded operating systems, the mobile database shows particular advantages.

  17. A Methodology and Tool for Investigation of Artifacts Left by the BitTorrent Client

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-05-01

    Full Text Available The BitTorrent client application is a popular utility for sharing large files over the Internet. Sometimes, this powerful utility is used to commit cybercrimes, like sharing of illegal material or illegal sharing of legal material. In order to help forensics investigators to fight against these cybercrimes, we carried out an investigation of the artifacts left by the BitTorrent client. We proposed a methodology to locate the artifacts that indicate the BitTorrent client activity performed. Additionally, we designed and implemented a tool that searches for the evidence left by the BitTorrent client application in a local computer running Windows. The tool looks for the four files holding the evidence. The files are as follows: *.torrent, dht.dat, resume.dat, and settings.dat. The tool decodes the files, extracts important information for the forensic investigator and converts it into XML format. The results are combined into a single result file.
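The four evidence files named above are bencoded, so the first step for any such tool is a bencode decoder. A minimal sketch of that decoding step (the sample metadata at the bottom is invented for illustration; a real *.torrent file carries many more keys):

```python
# Minimal bencode decoder -- the first step a forensic tool performs
# before extracting evidence from *.torrent, dht.dat, resume.dat, etc.

def bdecode(data: bytes):
    """Decode a bencoded byte string into Python objects."""
    value, _ = _decode(data, 0)
    return value

def _decode(data, i):
    c = data[i:i + 1]
    if c == b"i":                      # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c == b"l":                      # list: l<items>e
        i += 1
        items = []
        while data[i:i + 1] != b"e":
            item, i = _decode(data, i)
            items.append(item)
        return items, i + 1
    if c == b"d":                      # dictionary: d<key><value>...e
        i += 1
        result = {}
        while data[i:i + 1] != b"e":
            key, i = _decode(data, i)
            val, i = _decode(data, i)
            result[key] = val
        return result, i + 1
    # byte string: <length>:<bytes>
    colon = data.index(b":", i)
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length

if __name__ == "__main__":
    sample = b"d8:announce20:http://tracker/a.b.c4:name8:file.isoe"
    print(bdecode(sample))
```

From the decoded dictionaries, fields of forensic interest (tracker URLs, file names, timestamps) can then be pulled out and serialized to XML as the tool described here does.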

  18. GrayStarServer: Server-side spectrum synthesis with a browser-based client-side user interface

    CERN Document Server

    Short, C Ian

    2016-01-01

    I present GrayStarServer (GSS), a stellar atmospheric modeling and spectrum synthesis code of pedagogical accuracy that is accessible in any web browser on commonplace computational devices and that runs on a time-scale of a few seconds. The addition of spectrum synthesis annotated with line identifications extends the functionality and pedagogical applicability of GSS beyond that of its predecessor, GrayStar3 (GS3). The spectrum synthesis is based on a line list acquired from the NIST atomic spectra database, and the GSS post-processing and user interface (UI) client allows the user to inspect the plain text ASCII version of the line list, as well as to apply macroscopic broadening. Unlike GS3, GSS carries out the physical modeling on the server side in Java, and communicates with the JavaScript and HTML client via an asynchronous HTTP request. I also describe other improvements beyond GS3 such as more realistic modeling physics and use of the HTML element for higher quality plotting and rendering of result...

  19. A tandem repeats database for bacterial genomes: application to the genotyping of Yersinia pestis and Bacillus anthracis

    Directory of Open Access Journals (Sweden)

    Denoeud France

    2001-03-01

    Full Text Available Abstract Background Some pathogenic bacteria are genetically very homogeneous, making strain discrimination difficult. In the last few years, tandem repeats have been increasingly recognized as markers of choice for genotyping a number of pathogens. The rapid evolution of these structures appears to contribute to the phenotypic flexibility of pathogens. The availability of whole-genome sequences has opened the way to the systematic evaluation of tandem repeat diversity and application to epidemiological studies. Results This report presents a database (http://minisatellites.u-psud.fr) of tandem repeats from publicly available bacterial genomes which facilitates the identification and selection of tandem repeats. We illustrate the use of this database by the characterization of minisatellites from two important human pathogens, Yersinia pestis and Bacillus anthracis. In order to avoid simple sequence contingency loci which may be of limited value as epidemiological markers, and to provide genotyping tools amenable to ordinary agarose gel electrophoresis, only tandem repeats with repeat units at least 9 bp long were evaluated. Yersinia pestis contains 64 such minisatellites in which the unit is repeated at least 7 times. An additional collection of 12 loci with at least 6 units and a high internal conservation were also evaluated. Forty-nine are polymorphic among five Yersinia strains (twenty-five among three Y. pestis strains). Bacillus anthracis contains 30 comparable structures in which the unit is repeated at least 10 times. Half of these tandem repeats show polymorphism among the strains tested. Conclusions Analysis of the currently available bacterial genome sequences classifies Bacillus anthracis and Yersinia pestis as having an average (approximately 30 per Mb) density of tandem repeat arrays longer than 100 bp when compared to the other bacterial genomes analysed to date. In both cases, testing a fraction of these sequences for
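The kind of genome scan behind such a database can be illustrated with a naive exact tandem-repeat finder: slide a window over the sequence and count how many times each candidate unit repeats back to back. Real tools tolerate imperfect copies and use much smarter algorithms; this toy sketch (with invented thresholds and sequence) only shows the idea:

```python
# Naive exact tandem-repeat scan: report (position, unit, copies) for every
# run where a unit of at least `min_unit` bp repeats `min_copies`+ times.

def find_tandem_repeats(seq, min_unit=3, min_copies=3):
    hits = []
    n = len(seq)
    for unit_len in range(min_unit, n // min_copies + 1):
        i = 0
        while i + unit_len * min_copies <= n:
            unit = seq[i:i + unit_len]
            copies = 1
            while seq[i + copies * unit_len:i + (copies + 1) * unit_len] == unit:
                copies += 1
            if copies >= min_copies:
                hits.append((i, unit, copies))
                i += copies * unit_len   # skip past this array
            else:
                i += 1
    return hits

print(find_tandem_repeats("AAGCTGCTGCTGCTTT", min_unit=3, min_copies=3))
# → [(2, 'GCT', 4)]
```

The thresholds in the paper (units of at least 9 bp, at least 7 copies) would simply be passed as `min_unit=9, min_copies=7` over a whole genome.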

  20. Constraint Databases and Geographic Information Systems

    OpenAIRE

    Revesz, Peter

    2007-01-01

    Constraint databases and geographic information systems share many applications. However, constraint databases can go beyond geographic information systems in efficient spatial and spatiotemporal data handling methods and in advanced applications. This survey mainly describes ways that constraint databases go beyond geographic information systems. However, the survey points out that in some areas constraint databases can learn also from geographic information systems.

  1. Acceptance and Commitment Therapy for a Heterogeneous Group of Treatment-Resistant Clients: A Treatment Development Study

    Science.gov (United States)

    Clarke, Sue; Kingston, Jessica; Wilson, Kelly G.; Bolderston, Helen; Remington, Bob

    2012-01-01

    Acceptance and commitment therapy (ACT) has been shown to have broad applicability to different diagnostic groups, and there are theoretical reasons to consider its use with clients with chronic mental health problems. We report an innovative treatment development evaluation of ACT for a heterogeneous group of "treatment-resistant clients" (N =…

  2. Database Manager

    Science.gov (United States)

    Martin, Andrew

    2010-01-01

    It is normal practice today for organizations to store large quantities of records of related information as computer-based files or databases. Purposeful information is retrieved by performing queries on the data sets. The purpose of DATABASE MANAGER is to communicate to students the method by which the computer performs these queries. This…

  3. Conceptual considerations for CBM databases

    International Nuclear Information System (INIS)

    We consider a concept of databases for the CBM experiment. For this purpose, an analysis of the databases for large experiments at the LHC at CERN has been performed. Special features of the various DBMSs utilized in physics experiments, including relational and object-oriented DBMSs as the most applicable ones for the tasks of these experiments, were analyzed. A set of databases for the CBM experiment, DBMSs for their development, as well as use cases for the considered databases are suggested.

  4. Asynchronous data change notification between database server and accelerator controls system

    International Nuclear Information System (INIS)

    Database data change notification (DCN) is a commonly used feature, but not all database management systems (DBMSs) provide an explicit DCN mechanism. Even for those DBMSs which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work, which makes setting up DCN between a database server and interested clients tedious and time consuming. In accelerator control systems, there are many well-established client/server software architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMSs which provide database trigger functionality. By combining the database trigger mechanism, which is supported by major DBMSs, with server processes built on client/server architectures familiar in the accelerator controls community (such as EPICS, CDEV or ADO), the ADCN system becomes easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
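The database-side half of this scheme can be sketched with a trigger that records every change into a notification table, which a reflection-server process then reads and forwards to its clients. SQLite stands in here for Oracle/MS SQL Server, and the schema and names are invented for illustration:

```python
# Sketch of trigger-based change capture for ADCN: the trigger queues each
# update into change_log; a reflection server would drain that queue and
# push the rows to subscribed clients via the controls SET/GET API.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE setpoints (name TEXT PRIMARY KEY, value REAL);
    CREATE TABLE change_log (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        name TEXT, new_value REAL,
        changed_at TEXT DEFAULT CURRENT_TIMESTAMP
    );
    -- the trigger is the DBMS-side half of the DCN mechanism
    CREATE TRIGGER notify_update AFTER UPDATE ON setpoints
    BEGIN
        INSERT INTO change_log (name, new_value) VALUES (NEW.name, NEW.value);
    END;
""")
conn.execute("INSERT INTO setpoints VALUES ('magnet_current', 1.0)")
conn.execute("UPDATE setpoints SET value = 2.5 WHERE name = 'magnet_current'")

# The reflection server reads the queued notifications and forwards them.
pending = conn.execute("SELECT name, new_value FROM change_log").fetchall()
print(pending)   # one queued notification
```

The asynchrony comes from the reflection server, not the database: clients subscribe once through the controls API and receive pushed updates without polling the DBMS themselves.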

  5. Automated Client-side Sanitizer for Code Injection Attacks

    Directory of Open Access Journals (Sweden)

    Dnyaneshwar K. Patil

    2016-04-01

    Full Text Available Web applications are useful for various online services and are becoming ubiquitous in our daily lives. They are used for multiple purposes such as e-commerce, financial services, email, healthcare services and many other critical services. But the presence of vulnerabilities in a web application may become a serious threat to its security. A web application may contain different types of vulnerabilities; cross-site scripting is one type of code injection attack. According to the OWASP Top 10 vulnerability report, Cross-site Scripting (XSS) is among the top 5 vulnerabilities. This research work therefore aims to implement an effective solution for the prevention of cross-site scripting vulnerabilities. In this paper, we implement a novel client-side XSS sanitizer that protects web applications from XSS attacks. Our sanitizer is able to detect cross-site scripting vulnerabilities at the client side. It strengthens the web browser, because modern web browsers do not provide any specific notification, alert or indication of security holes or vulnerabilities and their presence in the web application.
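The core sanitizing step such a tool performs can be illustrated in a few lines: strip inline script blocks and on* event handlers, then HTML-escape whatever remains before it reaches the page. This toy sketch uses regular expressions for brevity; production sanitizers run a full HTML parser, and the example input is invented:

```python
# Toy XSS sanitizer: remove <script> blocks and on* handlers, then escape.
# Regex-based for illustration only -- real sanitizers parse the HTML.
import html
import re

SCRIPT_RE = re.compile(r"<\s*script[^>]*>.*?<\s*/\s*script\s*>",
                       re.IGNORECASE | re.DOTALL)
EVENT_RE = re.compile(r"\son\w+\s*=\s*(\"[^\"]*\"|'[^']*'|\S+)", re.IGNORECASE)

def sanitize(fragment: str) -> str:
    fragment = SCRIPT_RE.sub("", fragment)   # drop inline <script> blocks
    fragment = EVENT_RE.sub("", fragment)    # drop on* event handlers
    return html.escape(fragment)             # neutralize remaining markup

print(sanitize('<b onclick="evil()">hi</b><script>steal()</script>'))
```

A client-side implementation would apply the same transformation in the browser before untrusted input is inserted into the DOM.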

  6. Counselor Values and the Pregnant Adolescent Client.

    Science.gov (United States)

    Kennedy, Bebe C.; And Others

    1984-01-01

    Reviews options counselors can suggest to pregnant adolescents, including abortion, adoption, marriage, and single parenthood. Discusses the need for counselors to be aware of their own values and help the client explore her values. (JAC)

  7. Caring for Clients and Families With Anxiety

    Directory of Open Access Journals (Sweden)

    Noriko Yamamoto-Mitani

    2016-08-01

    Full Text Available This study elucidated Japanese home care nurses’ experiences of supporting clients and families with anxiety. We interviewed 10 registered nurses working in home care agencies and analyzed the data using grounded theory to derive categories pertaining to the nurses’ experiences of providing care. We conceptualized nurses’ approaches to caring for anxiety into three categories: First, they attempted to reach out for anxiety even when the client/family did not make it explicit; second, they tried to alter the outlook of the situation; and third, they created comfort in the lives of the client/family. The conceptualizations of nurses’ strategies to alleviate client/family anxiety may reflect Japanese/Eastern cultural characteristics in communication and their view of the person and social care system, but these conceptualizations may also inform the practice of Western nurses by increasing awareness of skills they may also have and use.

  8. Managing Client Values in Construction Design

    DEFF Research Database (Denmark)

    Thyssen, Mikael Hygum; Emmitt, Stephen; Bonke, Sten;

    2008-01-01

    In construction projects the client will comprise owner, end-users, and the wider society, representatives of which may have conflicting goals and values, and these may not be fully realized by the stakeholders themselves. It is therefore a great challenge to capture and manage the values of the multiple stakeholders that constitute the "client". However, seeing client satisfaction as the end-goal of construction, it is imperative to make client values explicit in the early project phase and to make sure that these values are reflected in all subsequent phases of design and construction. This paper describes the initial findings of a joint research project between academia and industry practitioners that seeks to develop a workshop method for capturing and managing client values within a lean framework, creating a state-of-the-art approach in construction design management.

  9. A virtual repository approach to clinical and utilization studies: application in mammography as alternative to a national database.

    Science.gov (United States)

    Ohno-Machado, L; Boxwala, A A; Ehresman, J; Smith, D N; Greenes, R A

    1997-01-01

    A national mammography database was proposed, based on a centralized architecture for collecting, monitoring, and auditing mammography data. We have developed an alternative architecture relying on Internet-based distributed queries to heterogeneous databases. This architecture creates a "virtual repository", or a federated database which is constructed dynamically for each query and makes use of data available in legacy systems. It allows the construction of custom-tailored databases at individual sites that can serve the dual purposes of providing data (a) to researchers through a common mammography repository and (b) to clinicians and administrators at participating institutions. We implemented this architecture in a prototype system at the Brigham and Women's Hospital to show its feasibility. Common queries are translated dynamically into database-specific queries, and the results are aggregated for immediate display or download by the user. Data reside in two different databases and consist of structured mammography reports, coded per the BIRADS Standardized Mammography Lexicon, as well as pathology results. We prospectively collected data on 213 patients, and showed that our system can perform distributed queries effectively. We also implemented graphical exploratory analysis tools to allow visualization of results. Our findings indicate that the architecture is not only feasible, but also flexible and scalable, constituting a good alternative to a national mammography database. PMID:9357650
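The "virtual repository" idea can be sketched minimally: one logical query is translated into per-site queries against independent databases and the rows are merged for the user. Two in-memory SQLite databases stand in for the heterogeneous site systems, and the schema, table, and column names are invented for illustration:

```python
# Federated-query sketch: run the same logical query against two independent
# site databases and aggregate the rows, as a "virtual repository" would.
import sqlite3

def make_site(rows):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE mammo (patient_id TEXT, birads INTEGER)")
    db.executemany("INSERT INTO mammo VALUES (?, ?)", rows)
    return db

site_a = make_site([("a1", 2), ("a2", 4)])
site_b = make_site([("b1", 5)])

def federated_query(sites, min_birads):
    """Translate one common query into per-site queries and merge results."""
    merged = []
    for db in sites:
        merged += db.execute(
            "SELECT patient_id, birads FROM mammo WHERE birads >= ?",
            (min_birads,)).fetchall()
    return merged

print(federated_query([site_a, site_b], 4))   # rows from both sites
```

In the real system each site would additionally map the common query onto its own local schema; here both sites happen to share one schema to keep the sketch short.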

  10. Measuring Money Mismanagement Among Dually Diagnosed Clients

    OpenAIRE

    Black, Ryan A.; Rounsaville, Bruce J.; Rosenheck, Robert A; Conrad, Kendon J.; Ball, Samuel A.; Rosen, Marc I.

    2008-01-01

    Clients dually diagnosed with psychiatric and substance abuse disorders may be adversely affected if they mismanage their Social Security or public support benefits. Assistance managing funds, including assignment of a representative payee, is available but there are no objective assessments of money mismanagement. In this study, a Structured Clinical Interview for Money Mismanagement was administered twice at one-week intervals to 46 clients receiving disability payments and was compared to ...

  11. Database computing in HEP

    International Nuclear Information System (INIS)

    The major SSC experiments are expected to produce up to 1 Petabyte of data per year each. Once the primary reconstruction is completed by farms of inexpensive processors, I/O becomes a major factor in further analysis of the data. We believe that the application of database techniques can significantly reduce the I/O performed in these analyses. We present examples of such I/O reductions in prototypes based on relational and object-oriented databases of CDF data samples.

  12. Clients' and therapists' stories about psychotherapy.

    Science.gov (United States)

    Adler, Jonathan M

    2013-12-01

    This article provides an overview of the emerging field of research on clients' stories about their experiences in psychotherapy. The theory of narrative identity suggests that individuals construct stories about their lives in order to provide the self with a sense of purpose and unity. Psychotherapy stories serve both psychological functions. Focusing on the theme of agency as a vehicle for operationalizing purpose and coherence as a way of operationalizing unity, this article will describe the existing scholarship connecting psychotherapy stories to clients' psychological well-being. Results from cross-sectional qualitative and quantitative studies as well as longitudinal research indicate a connection between the stories clients tell about therapy and their psychological well-being, both over the course of treatment and after it is over. In addition, a preliminary analysis of therapists' stories about their clients' treatment is presented. These analyses reveal that the way therapists recount a particular client's therapy does not impact the relationships between clients' narratives and their improvement. The article concludes with a discussion of how this body of scholarship might be fruitfully applied in the realm of clinical practice. PMID:22812587

  13. A Permutation Gigantic Issues in Mobile Real Time Distributed Database: Consistency & Security

    Directory of Open Access Journals (Sweden)

    Gyanendra Kr. Gupta

    2011-02-01

    Full Text Available Several forms of information system are broadly used in a variety of system models. With the rapid development of computer networks, information system users are increasingly concerned about data sharing in networks. In a conventional relational database, data consistency is controlled by a consistency-control mechanism: when a data object is locked in sharing mode, other transactions can only read it, but cannot update it. If the traditional consistency-control method is still used, the system's concurrency will be adversely affected. So there are many new requirements for consistency control and security in a Mobile Real Time Distributed Database (MRTDDB). The problem is not limited to one type of data (e.g. mobile or real-time databases). There are many aspects of data consistency problems in an MRTDDB, such as inconsistency between the characteristics and the type of data, and the inconsistency of topological relations after objects have been modified. In this paper, many cases of consistency are discussed. As mobile computing becomes popular and databases grow with information sharing, security is a big issue for researchers. Both consistency and security of data are a big challenge, because whenever the data is not consistent and secure, no operation on the data (e.g. a transaction) is productive. This becomes more and more crucial when transactions are used in non-traditional environments like mobile, distributed, real-time, and multimedia databases. In this paper we raise the different aspects and analyze the available solutions for consistency and security of databases. Traditional database security has focused primarily on creating user accounts and managing user rights to database objects. But the use of these databases in mobile and nomadic computing creates new prospects for research. The widespread use of databases over the web, heterogeneous client-server architectures, application servers, and networks creates a critical need to

  14. Consistency and Security in Mobile Real Time Distributed Database (MRTDDB): A Combinational Giant Challenge

    Science.gov (United States)

    Gupta, Gyanendra Kr.; Sharma, A. K.; Swaroop, Vishnu

    2010-11-01

    Many types of information system are widely used in various fields. With the rapid development of computer networks, information system users care more about data sharing in networks. In a traditional relational database, data consistency is controlled by a consistency-control mechanism: when a data object is locked in sharing mode, other transactions can only read it, but cannot update it. If the traditional consistency-control method is still used, the system's concurrency will be adversely affected. So there are many new requirements for consistency control and security in an MRTDDB. The problem is not limited to one type of data (e.g. mobile or real-time databases). There are many aspects of data consistency problems in an MRTDDB, such as inconsistency between the attributes and the type of data, and the inconsistency of topological relations after objects have been modified. In this paper, many cases of consistency are discussed. As mobile computing becomes popular and databases grow with information sharing, security is a big issue for researchers. Consistency and security of data are a big challenge, because whenever the data is not consistent and secure, no operation on the data (e.g. a transaction) is productive. This becomes more and more crucial when transactions are used in non-traditional environments like mobile, distributed, real-time, and multimedia databases. In this paper we raise the different aspects and analyze the available solutions for consistency and security of databases. Traditional database security has focused primarily on creating user accounts and managing user privileges to database objects. But the use of these databases in mobile and nomadic computing creates new opportunities for research. The widespread use of databases over the web, heterogeneous client-server architectures, application servers, and networks creates a critical need to amplify this focus. In this paper we also discuss an overview of the new and old
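
    The shared-mode locking rule restated in both abstracts (other transactions may read an object locked in sharing mode, but none may update it) can be modelled in a few lines. A toy illustration of that rule only, not actual MRTDDB code:

```python
# Toy lock table: obj -> ("S", reader_count) for shared, ("X", txn) for exclusive.
class LockManager:
    def __init__(self):
        self.locks = {}

    def acquire_shared(self, obj):
        """Readers may share an object unless a writer holds it exclusively."""
        mode, holders = self.locks.get(obj, ("S", 0))
        if mode == "X":
            return False
        self.locks[obj] = ("S", holders + 1)
        return True

    def acquire_exclusive(self, obj, txn):
        """An update needs sole access: any existing lock blocks it."""
        if obj in self.locks:
            return False
        self.locks[obj] = ("X", txn)
        return True

lm = LockManager()
ok_read1 = lm.acquire_shared("row1")           # first reader succeeds
ok_read2 = lm.acquire_shared("row1")           # concurrent reader succeeds
ok_write = lm.acquire_exclusive("row1", "t3")  # update blocked while shared
```

    The papers' point is that this classical scheme, applied unchanged, throttles concurrency in mobile real-time settings.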

  15. BDDR, a new CEA technological and operating reactor database

    Energy Technology Data Exchange (ETDEWEB)

    Soldevilla, M.; Salmons, S.; Espinosa, B. [CEA-Saclay, CEA/DEN/DANS/DM2S/SERMA, 91191 Gif-sur-Yvette (France); Clanet, M.; Boudin, X. [CEA-Bruyeres-le-Chatel, 91297 Arpajon (France)

    2013-07-01

    The new application BDDR (Reactor Database) has been developed at CEA in order to manage technological and operating data on nuclear reactors. This application is a knowledge-management tool which meets several internal needs: -) to facilitate scenario studies for any set of reactors, e.g. non-proliferation assessments; -) to make core physics studies easier, whatever the reactor design (PWR - Pressurized Water Reactor, BWR - Boiling Water Reactor, MAGNOX - Magnesium Oxide reactor, CANDU - CANada Deuterium Uranium, FBR - Fast Breeder Reactor, etc.); -) to preserve the technological data of all reactors (past and present; power-generating or experimental; naval propulsion, ...) in a unique repository. The application database contains location data and operating-history data, as well as a tree-like structure holding numerous technological data. These data address all kinds of reactor features and components. A few neutronics data are also included (neutron fluxes). The BDDR application is based on open-source technologies and a thin client/server architecture. The software architecture has been made flexible enough to allow for any change. (authors)

  16. On Sensor Network Database Design and Application of Parallel Computing

    Institute of Scientific and Technical Information of China (English)

    王娜娜

    2014-01-01

    Based on the characteristics and applications of sensor network database technology, and drawing on an actual sensor network database design process, this paper optimizes the query-processing functions of the sensor network database and discusses the design and application of parallel computing for sensor network databases. After the system was designed, its effectiveness in use improved considerably; further tuning can improve system performance, but more in-depth study is still needed.

  17. Ontology-guided distortion control for robust-lossless database watermarking: application to inpatient hospital stay records.

    Science.gov (United States)

    Franco-Contreras, J; Coatrieux, G; Cuppens-Boulahia, N; Cuppens, F; Roux, C

    2014-01-01

    In this paper, we propose a new semantic distortion control method for database watermarking. It is based on the identification, by means of an ontology, of the semantic links that exist between attribute values in tuples. Such database distortion control gives any watermarking scheme the capability to avoid incoherent records, and consequently ensures: i) the normal interpretation of watermarked data, i.e. a watermark that is semantically imperceptible; and ii) that an attacker cannot identify watermarked tuples. The solution we present herein successfully combines this semantic distortion control method with a robust lossless watermarking scheme. Experimental results conducted on a medical database of more than half a million inpatient hospital stay records also show a non-negligible performance gain in terms of robustness and database distortion.

  18. Dealer Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The dealer reporting databases contain the primary data reported by federally permitted seafood dealers in the northeast. Electronic reporting was implemented May...

  19. RDD Databases

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database was established to oversee documents issued in support of fishery research activities including experimental fishing permits (EFP), letters of...

  20. Pursuing Therapeugenic Consequences of Restricting Client Smoking during Counseling.

    Science.gov (United States)

    Schneider, Lawrence J.; Dearing, Nancy

    Theorists and therapists have become increasingly attentive to the role of interpersonal behaviors that facilitate or hinder the ability of the counselor to exert influence over the client during counseling. A study was conducted to examine the impact of a counselor's preference that clients not smoke, client stress levels, client sex, and…

  1. Image Reference Database in Teleradiology: Migrating to WWW

    Science.gov (United States)

    Pasqui, Valdo

    The paper presents a multimedia Image Reference Data Base (IRDB) used in teleradiology. The application was developed at the University of Florence in the framework of the European Community TELEMED Project. TELEMED's overall goals and IRDB requirements are outlined and the resulting architecture is described. IRDB is a multisite database containing radiological images, selected for their scientific interest, and their related information. The architecture consists of a set of IRDB Installations which are accessed from Viewing Stations (VS) located at different medical sites. The interaction between VS and IRDB Installations follows the client-server paradigm and uses an OSI level-7 protocol named Telemed Communication Language. After reviewing the Florence prototype implementation and experimentation, IRDB migration to the World Wide Web (WWW) is discussed. A possible scenario for implementing IRDB on the basis of the WWW model is depicted, in order to exploit the capabilities of WWW servers and browsers. Finally, the advantages of this conversion are outlined.

  2. National database

    DEFF Research Database (Denmark)

    Kristensen, Helen Grundtvig; Stjernø, Henrik

    1995-01-01

    Article about the national database for nursing research established at the Danish Institute for Health and Nursing Research. The aim of the database is to gather knowledge about research and development activities within nursing.

  3. Using virtual Lustre clients on the WAN for analysis of data from high energy physics experiments

    Science.gov (United States)

    Bourilkov, D.; Avery, P.; Cheng, M.; Fu, Y.; Kim, B.; Palencia, J.; Budden, R.; Benninger, K.; Shrum, D.; Wilgenbusch, J.

    2012-12-01

    We describe the work on creating system images of Lustre virtual clients in the ExTENCI project (Extending Science Through Enhanced National CyberInfrastructure), using several virtualization technologies (Xen, VMware, VirtualBox, KVM). These virtual machines can be built at several levels: from a basic Linux installation (we use Scientific Linux 5 as an example), adding a Lustre client with Kerberos authentication, up to complete clients including local or distributed (CernVM-FS based) installations of the full CERN and project-specific software stack for typical LHC experiments. The level, and size, of the images are determined by the users on demand. Various sites and individual users can just download and use them out of the box on Linux/UNIX, Windows, and Mac OS X based hosts. We compare the performance of virtual clients with that of real physical systems for typical high energy physics applications like Monte Carlo simulations or analysis of data stored in ROOT trees.

  4. A NOVEL REDIS SECURITY BEST PRACTICES FOR NOSQL DATABASES

    OpenAIRE

    Jeelani Ahmed

    2016-01-01

    Over the last decades the field of databases has evolved. Organizations are migrating from relational to non-relational databases due to the current trends of Big Data, big users, and cloud computing. Business data processing is the main market of relational databases, but it turns out to be harder to manage big users and big information in a cloud environment. To model data, relational databases use a rigid, schema-based approach and are designed to run on a single machin...

   5. Research on the Application of Unstructured Databases in a Network Environment

    Institute of Scientific and Technical Information of China (English)

    王颖; 李建敏

    2015-01-01

    Starting from unstructured database technology, this paper analyzes the current state of database applications in the network environment, discusses the construction of network databases and the analysis of unstructured data, and illustrates the discussion with a concrete application example.

  6. SHORT SURVEY ON GRAPHICAL DATABASE

    Directory of Open Access Journals (Sweden)

    Harsha R Vyavahare

    2015-08-01

    Full Text Available This paper explores the features of graph databases and data models. The popularity towards work with graph models and datasets has been increased in the recent decades .Graph database has a number of advantage over the relational database. This paper take a short review on the graph and hyper graph concepts from mathematics so that graph so that we can understand the existing difficulties in the implantation of graph model. From the Past few decades saw hundreds of research contributions their vast research in the DBS field with graph database. However, the research on the existence of general purpose DBS managements and mining that suits for variety of applications is still very much active. The review is done based on the Application of graph model techniques in the database within the framework of graph based approaches with the aim of implementation of different graphical database and tabular database

  7. On Simplifying Features in OpenStreetMap database

    Science.gov (United States)

    Qian, Xinlin; Tao, Kunwang; Wang, Liang

    2015-04-01

    Currently the visualization of OpenStreetMap data uses a tile server which stores map tiles that have been rendered from vector data in advance. However, tiled maps lack functionality such as data editing and customized styling. To enable such advanced functionality, client-side processing and rendering of geospatial data is needed. Considering the voluminous size of the OpenStreetMap data, simply sending the results of region queries on the OSM database to the client is prohibitive. To make the OSM data retrieved from the database suitable for the client to receive and render, it must be filtered and simplified at the server side to limit its volume. We propose a database extension for the OSM database that makes it possible to simplify geospatial objects such as ways and relations during data queries. Several auxiliary tables and PL/pgSQL functions are presented so that geospatial features can be simplified by omitting unimportant vertices. There are five components in the database extension: computation of vertex weights by polyline and polygon simplification algorithms; storage of vertex weights in auxiliary tables; filtering and selection of vertices using a specific threshold value during spatial queries; assembly of simplified geospatial objects from the filtered vertices; and updating of vertex weights after geospatial objects are edited. The database extension is implemented on an OSM APIDB using PL/pgSQL. The experimental database contains a subset of the OSM database: geographic data of the United Kingdom, which is about 100 million vertices and occupies roughly 100 GB of disk. JOSM is used to retrieve the data from the database using a revised data-access API and render the geospatial objects in real time. When serving simplified data to the client, the database allows the user to set a bound on the simplification error or on the response time of each data query. Experimental results show the effectiveness and efficiency of the proposed methods in building a
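
    The first two components above (weight each vertex with a simplification algorithm, then filter by threshold at query time) can be sketched in plain code. A hedged illustration of Douglas-Peucker-style weighting; the actual extension implements this in PL/pgSQL over auxiliary tables:

```python
# Assign each interior vertex the deviation it contributes; endpoints get
# infinite weight so they always survive. "Simplify" = keep heavy vertices.
def perpendicular_dist(p, a, b):
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def vertex_weights(pts):
    w = [float("inf")] * len(pts)
    def rec(i, j):
        if j <= i + 1:
            return
        # farthest interior vertex from the chord (i, j) gets that distance
        k, d = max(((m, perpendicular_dist(pts[m], pts[i], pts[j]))
                    for m in range(i + 1, j)), key=lambda t: t[1])
        w[k] = d
        rec(i, k)
        rec(k, j)
    rec(0, len(pts) - 1)
    return w

pts = [(0, 0), (1, 0.1), (2, 0), (3, 5), (4, 0)]
w = vertex_weights(pts)
simplified = [p for p, wt in zip(pts, w) if wt > 0.5]  # query-time filter
```

    In the extension, the `w` column would live in an auxiliary table so that a spatial query can apply the threshold without recomputing the weights.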

   8. Prototype for a Generic Thin-Client Remote Analysis Environment for CMS

    Institute of Scientific and Technical Information of China (English)

    C.D.Steenberg; J.J.Bunn; et al.

    2001-01-01

    The multi-tiered architecture of the highly-distributed CMS computing systems necessitates a flexible data distribution and analysis environment. We describe a prototype analysis environment which functions efficiently over wide area networks, using a server installed at the Caltech/UCSD Tier 2 prototype to analyze CMS data stored at various locations using a thin client. The analysis environment is based on existing HEP (Anaphe) and CMS (CARF, ORCA, IGUANA) software technology on the server, accessed from a variety of clients. A Java Analysis Studio (JAS, from SLAC) plug-in is being developed as a reference client. The server is operated as a "black box" on the proto-Tier2 system. ORCA Objectivity databases (e.g. an existing large CMS muon sample) are hosted on the master and slave nodes, and remote clients can request processing of queries across the server nodes and get the histogram results returned and rendered in the client. The server is implemented in pure C++ and uses XML-RPC as a language-neutral transport. This has several benefits, including much better scalability, better integration with CARF/ORCA, and, importantly, makes the work directly useful to other non-Java general-purpose analysis and presentation tools such as Hippodraw, Lizard, or ROOT.
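
    The XML-RPC transport choice is easy to demonstrate: the server registers a query function, and the thin client receives only the small histogram result rather than the raw data. A minimal sketch using Python's standard library; the method name, port handling, and binning are illustrative, not the CMS prototype's API:

```python
from threading import Thread
from xmlrpc.server import SimpleXMLRPCServer
import xmlrpc.client

# Server-side data and query: bin the values, return only the counts.
data = [0.5, 1.5, 1.7, 2.5, 2.6, 2.9]

def hist(nbins, lo, hi):
    counts = [0] * nbins
    width = (hi - lo) / nbins
    for v in data:
        if lo <= v < hi:
            counts[int((v - lo) / width)] += 1
    return counts

server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(hist, "hist")
Thread(target=server.handle_request, daemon=True).start()  # serve one call

# Thin client: asks for a histogram, never sees the raw sample.
port = server.server_address[1]
proxy = xmlrpc.client.ServerProxy("http://127.0.0.1:%d" % port)
counts = proxy.hist(3, 0.0, 3.0)
server.server_close()
```

    Because XML-RPC is language-neutral, the same server could equally be queried from C++, Java, or ROOT-based clients, which is the portability argument the abstract makes.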

  9. Comparative study on Authenticated Sub Graph Similarity Search in Outsourced Graph Database

    Directory of Open Access Journals (Sweden)

    N. D. Dhamale

    2015-11-01

    Full Text Available Today security is very important in database systems. Advanced database systems face a great challenge raised by the emergence of massive, complex structural data in bioinformatics, chem-informatics, and many other applications. Since exact matching is often too restrictive, similarity search of complex structures becomes a vital operation that must be supported efficiently. Subgraph similarity search is used in graph databases to retrieve graphs whose subgraphs are similar to a given query graph. It has proven successful in a wide range of applications including bioinformatics and chem-informatics. Due to the cost of providing efficient similarity search services on ever-increasing graph data, database outsourcing is an appealing solution for database owners. In this paper, we study authentication techniques that follow the popular filtering-and-verification framework, and an authentication-friendly metric index called GMTree. Specifically, the similarity search is transformed into a search in a graph metric space, and small verification objects (VOs) are derived to be transmitted to query clients. To further optimize GMTree, we study a sampling-based pivot-selection method and an authenticated version of MCS computation.
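
    The filtering step of the filtering-and-verification framework rests on the triangle inequality in the graph metric space: if a candidate's distance to a shared pivot differs from the query's by more than the threshold, the candidate cannot match and needs no expensive (e.g. MCS-based) verification. A generic sketch with made-up distances:

```python
# Triangle-inequality filter: |d(c, pivot) - d(q, pivot)| > threshold
# proves d(c, q) > threshold, so c can be pruned without verification.
def prune(candidates, dist_to_pivot, q_to_pivot, threshold):
    return [c for c in candidates
            if abs(dist_to_pivot[c] - q_to_pivot) <= threshold]

dist_to_pivot = {"g1": 2.0, "g2": 9.0, "g3": 4.5}   # precomputed at index time
survivors = prune(["g1", "g2", "g3"], dist_to_pivot,
                  q_to_pivot=3.0, threshold=2.0)
# only the survivors go on to exact (expensive) verification
```

    A metric index such as GMTree organizes many such pivots into a tree so whole subtrees can be pruned at once; the single-pivot filter above is only the core inequality.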

  10. Research and Application of Database Security

    Institute of Scientific and Technical Information of China (English)

    葛耀武; 张明玉; 方芳

    2014-01-01

    With the rapid development of computer and communication technology, the era of big data has arrived, and database technology provides solutions that meet this era's demands. As an aggregation of information, the database shoulders the task of storing and managing data, and its security is becoming increasingly important. This article discusses the definition of a database and the security threats faced by database systems, and discusses database-related security technologies.

  11. Analyses of client variables in a series of psychotherapy sessions with two child clients.

    Science.gov (United States)

    Mook, B

    1982-04-01

    Studied the process of child psychotherapy by means of analyses of client verbal behaviors. Audio-video recordings were made of nine intermittent psychotherapy sessions with 2 child clients, aged 8 and 12. A randomized mastertape of 4-minute segments was rated for self-exploration by means of the Carkhuff scale. Transcripts were categorized by means of an extended Snyder system and a preliminary set of grammatical variables. Transcripts then were minutized, and all client variables were intercorrelated and factor-analyzed. According to the research expectations, a high level of interrater reliability for the Carkhuff scale and high levels of interjudge agreement for the extended Snyder system were found. Analyses of the client variables demonstrated the nature of each client's verbal responding as well as their pattern of change across successive therapy sessions. The overall verbal response behavior of each client was summarized best through the factor analyses. Communalities and individual differences between the clients were discussed. Future directions for the study of client variables in child psychotherapy process research were suggested.

  12. Client engagement in home and community care services: The client and care coordinator perspective.

    Science.gov (United States)

    Kirst, Maritt; Elmi, Arij; Ray-Daniels, Mila; Foster, Jennifer

    2016-07-01

    A recent study of two Community Care Access Centres in Ontario was conducted to look at how clients can be involved in their own care while, at the same time, enhance their experience overall. This article describes that study and looks at ways of developing a new client engagement strategy moving forward. PMID:27270114

  13. Databases as an information service

    Science.gov (United States)

    Vincent, D. A.

    1983-01-01

    The relationship of databases to information services, and the range of information-services users and their needs for information, are explored and discussed. It is argued that for database information to be valuable to a broad range of users, it is essential to provide access methods that are relatively unstructured and natural to information-services users who are interested in the information contained in databases but are not willing to learn and use traditional structured query languages. Unless this ease of use of databases is considered in the design and application process, the potential benefits of using database systems may not be realized.

  14. Concierge: Personal database software for managing digital research resources

    Directory of Open Access Journals (Sweden)

    Hiroyuki Sakai

    2007-11-01

    Full Text Available This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature-management, electronic-laboratory-notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support, from the management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open-source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp).

  15. Handling of network and database instabilities in CORAL

    CERN Document Server

    Trentadue, R; Kalkhof, A

    2012-01-01

    The CORAL software is widely used by the LHC experiments for storing and accessing data using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several back-ends and deployment models, direct client access to Oracle servers being one of the most important use cases. Since 2010, several problems have been reported by the LHC experiments in their use of Oracle through CORAL, involving application errors, hangs or crashes after the network or the database servers became temporarily unavailable. CORAL already provided some level of handling of these instabilities, which are due to external causes and cannot be avoided, but this proved to be insufficient in some cases and to be itself the cause of other problems, such as the hangs and crashes mentioned before, in other cases. As a consequence, a major redesign of the CORAL plugins has been implemented, with the aim of making the software more robust against these database and network glitches. The new imple...
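
    The kind of robustness described above amounts to wrapping database calls in reconnect-and-retry logic, so a transient glitch no longer surfaces as an application error, hang, or crash. A schematic sketch; the exception type and back-off policy are assumptions, not CORAL's actual API:

```python
import time

class TransientGlitch(Exception):
    """Stands in for a dropped connection or temporarily unavailable server."""

def with_reconnect(call, retries=3, backoff=0.0):
    """Retry a database call after transient failures, then give up."""
    last = None
    for attempt in range(retries):
        try:
            return call()
        except TransientGlitch as exc:
            last = exc
            time.sleep(backoff * attempt)  # wait before reconnecting
    raise last

attempts = []
def flaky_query():
    attempts.append(1)
    if len(attempts) < 3:
        raise TransientGlitch("server temporarily unavailable")
    return "rows"

result = with_reconnect(flaky_query)  # succeeds on the third attempt
```

    The delicate part in a real plugin layer, which this sketch omits, is distinguishing retriable glitches from genuine errors and re-establishing session state after reconnecting.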

  16. Design of Web-Based Database Applications

    Institute of Scientific and Technical Information of China (English)

    马克

    2001-01-01

    By analyzing how Web databases publish information on the Internet/Intranet to attract more users, this article discusses the technique of accessing databases through the component objects of ASP (Active Server Pages) and ADO (ActiveX Data Objects), and explains that ASP techniques have good database compatibility.

  17. Improvement of the efficiency of artificial insemination services through the use of radioimmunoassay and a computer database application

    International Nuclear Information System (INIS)

    A study was conducted at several locations in four provinces of Indonesia to evaluate and increase the efficiency of artificial insemination (AI) services provided to cattle farmers and to improve the feeding and reproductive management practices. Radioimmunoassay (RIA) for progesterone measurement was used together with the computer program Artificial Insemination Database Application (AIDA) to monitor the success of AI and for the early diagnosis of non-pregnancy and reproductive disorders in dairy and beef cattle. Baseline surveys showed that the average calving to first service interval (CFSI) ranged from 121.3 ± 78.2 days in West Java to 203.5 ± 118.3 in West Sumatra, and the conception rate (CR) to first AI ranged from 27% in South Sulawesi to 44% in West Java. Supplementary feeding with urea-molasses multi-nutrient blocks (UMMB) combined with training of farmers on improved husbandry practices reduced the CFSI from 150.6 ± 66.3 days to 102.3 ± 36.5 days and increased the CR from 27% to 49% in South Sulawesi. Similar interventions in West Java reduced the CFSI from 121.3 ± 78.2 days to 112.1 ± 80.9 days and increased the CR from 34% to 37%. Results from measurement of progesterone in milk or blood samples collected on days 0, 10-12 and 22-24 after AI showed that 25% of the animals were non-cyclic or anovulatory, while 8.7% were pregnant at the time of AI. Investigation of cows with breeding problems using measurement of progesterone in combination with clinical examination revealed a range of problems, including true anoestrus, sub-oestrus or missed oestrus, persistent CL and luteal cysts. The ability to make an accurate diagnosis enabled the provision of appropriate advice or treatment for overcoming the problems. Anti-progesterone serum and 125I-Progesterone tracer for use in RIA were produced locally and were found to have acceptable characteristics. The tracer had good specific activity and stability for up to 12 weeks. The production of standards

  18. Research on Real-Time Database Applications in GIS-Based Systems

    Institute of Scientific and Technical Information of China (English)

    蔡宇

    2011-01-01

    Real-time databases (RTDB) are a product of the ever-increasing amount of information in society, and are the core of supervisory information systems (SIS). A GIS-based global positioning system must monitor the operation of a large number of vehicles. The focus of this article is to verify the feasibility of storing, in a real-time database, the real-time data sent by the wireless sensors carried by a large number of vehicles, and to compare the differences between real-time databases and relational databases in GIS applications.

  19. THE ROLE OF DATABASE MARKETING IN THE OPERATIONALIZATION OF THE SERVICES RELATIONSHIP MARKETING

    OpenAIRE

    DUMITRESCU Luigi; Mircea FUCIU

    2010-01-01

    Relationship marketing aims at building a durable relationship between the enterprise and the final client, identified at an individual level. The distinctive part of relationship marketing rests on two main concepts: individuality and the relationship. This paper presents the concepts of relationship marketing, database marketing, and geomarketing. We present the importance of implementing a marketing database in a service-providing enterprise and its implications, on the one hand, for the client...

  20. Telematics-based online client-server/client collaborative environment for radiotherapy planning simulations.

    Science.gov (United States)

    Kum, Oyeon

    2007-11-01

    Customized cancer radiation treatment planning for each patient is very useful for both a patient and a doctor because it provides the ability to deliver higher doses to a more accurately defined tumor and at the same time lower doses to organs at risk and normal tissues. This can be realized by building an accurate planning simulation system to provide better treatment strategies based on each patient's tomographic data such as CT, MRI, PET, or SPECT. In this study, we develop a real-time online client-server/client collaborative environment between the client (health care professionals or hospitals) and the server/client under a secure network using telematics (the integrated use of telecommunications and medical informatics). The implementation is based on a point-to-point communication scheme between client and server/client following the WYSIWIS (what you see is what I see) paradigm. After uploading the patient tomographic data, the client is able to collaborate with the server/client for treatment planning. Consequently, the level of health care services can be improved, specifically for small radiotherapy clinics in rural/remote-country areas that do not possess much experience or equipment such as a treatment planning simulator. The telematics service of the system can also be used to provide continued medical education in radiotherapy. Moreover, the system is easy to use. A client can use the system if s/he is familiar with the Windows(TM) operating system because it is designed and built based on a user-friendly concept. This system does not require the client to continue hardware and software maintenance and updates. These are performed automatically by the server.

  1. An Architecture For Shared Multi-User Client Rendering Of Massive Geodatasets

    Science.gov (United States)

    Al-Naser, A.; Brooke, J.; Rasheed, M.; Irving, D. H.

    2012-12-01

    We are developing a novel data-centric visualization architecture to allow interactive exploration of geophysical data. Our method allows multiple users to collaborate in a lightweight, loosely-coupled and highly scalable environment. We choose 3D seismic data for our case study. Existing visualization solutions for data exploration tasks are mainly application-centric rather than data-centric. They typically store large datasets on users' local machines for fast access. Additionally, data objects that are the focus of study, e.g. seismic surveys and interpreted geological features, are managed as objects that are independent of the primary data. Thus multi-user collaborations where different users visually share their geological interpretations are handled inefficiently, since objects from each interpretation are stored as independent discrete objects. Because these objects may be stored separately from the primary data, e.g. on local disks, ensuring a coherent multi-user view is difficult. Our visual analytic method places a central data structure built on a Massively Parallel Processing (MPP) relational database at the heart of the visualization architecture. This structure allows us to develop the following efficient methods for data retrieval and display: global hashing for spatial reference on all datasets; interpretation tagging, which accumulates user interpretations into the database; and multi-user concurrent access allowing parallel multi-threaded queries. In our data structure, data elements are indexed on their geolocations by a hashing algorithm. The hashing algorithm determines the location of the required row through hashing functions, without index construction or any storage complexity. This is unlike conventional indexing algorithms such as bitmapping or tree-based methods, where construction and storage (of the index table) complexity varies between O(n) and O(n log n), where n is the size of the dataset. Also, we replace the geometric objects formed as a
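The hashing-based spatial index described above computes a row location directly from coordinates instead of building an index structure. A minimal sketch of the idea, assuming a simple grid quantization (the paper's actual hashing functions are not published, so the cell size and scheme below are illustrative only):

```python
CELL = 50.0  # hypothetical grid-cell size in metres

def cell_key(x, y, z):
    # Hashing function: quantize coordinates to a grid cell. No index
    # structure is built or stored; the key is computed on demand (O(1)).
    return (int(x // CELL), int(y // CELL), int(z // CELL))

class GeoHashIndex:
    def __init__(self):
        self.buckets = {}  # cell key -> list of data elements

    def insert(self, x, y, z, value):
        self.buckets.setdefault(cell_key(x, y, z), []).append(value)

    def query(self, x, y, z):
        # Direct lookup of the cell containing the point; contrast with
        # a tree index, which must be built (O(n log n)) before querying.
        return self.buckets.get(cell_key(x, y, z), [])

idx = GeoHashIndex()
idx.insert(10.0, 20.0, 30.0, "trace-A")
idx.insert(12.0, 22.0, 33.0, "trace-B")   # same 50 m cell as trace-A
idx.insert(500.0, 20.0, 30.0, "trace-C")  # different cell
print(idx.query(11.0, 21.0, 31.0))  # ['trace-A', 'trace-B']
```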

  2. Database for foundry engineers – simulationDB – a modern database storing simulation results

    Directory of Open Access Journals (Sweden)

    P. Malinowski

    2010-11-01

    Full Text Available Purpose: The main aim of this paper is to build a specific database system for collecting, analysing and searching simulation results. Design/methodology/approach: The system was prepared using a client-server architecture; a GUI (Graphical User Interface) was then prepared. Findings: A new database system for foundries was developed. Practical implications: System development is in progress, and practical implementation will take place in an iron foundry next year. Originality/value: The original value of this paper is an innovative database system for storing and analysing simulation results.

  3. Biological Databases

    Directory of Open Access Journals (Sweden)

    Kaviena Baskaran

    2013-12-01

    Full Text Available Biology has entered a new era of distributing information through databases, and these database collections have become a primary means of publishing information. This data publishing is done through Internet gateways such as Gopher, where information resources are offered easily and affordably alongside powerful research tools. What matters most now is the development of high-quality, professionally operated electronic data-publishing sites. To enhance this service, appropriate editorial policies for electronic data publishing have been established, and the editors of articles shoulder the responsibility.

  4. Database Driven Web Systems for Education.

    Science.gov (United States)

    Garrison, Steve; Fenton, Ray

    1999-01-01

    Provides technical information on publishing to the Web. Demonstrates some new applications in database publishing. Discusses the difference between static and database-driven Web pages. Reviews failures and successes of a Web database system. Addresses the question of how to build a database-driven Web site, discussing connectivity software, Web…

  5. Application of DbUnit in Database Testing

    Institute of Scientific and Technical Information of China (English)

    胡银保

    2012-01-01

    In Test-Driven Development, DbUnit is a unit-testing framework designed specifically for database testing; it is an extension of JUnit. DbUnit testing lets programmers manage and control the database freely throughout the whole testing process, placing it in a known state. After clarifying the test mechanism and setting up the test environment, problems such as backing up the database, seeding it with test data sets for unit testing, and restoring it to its pre-test state are analysed and discussed, with the aim of establishing a sound testing process.
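DbUnit itself is Java and extends JUnit, but the pattern it embodies — seed the database into a known state before each test, verify, then restore the pre-test state — can be sketched in any language. Below is a hypothetical Python/sqlite3 analogue; the table and data are invented for illustration:

```python
import sqlite3

SEED_ROWS = [(1, "alice"), (2, "bob")]  # the known dataset seeded before each test

def fresh_db():
    # Put the database into a known state before every test,
    # mirroring DbUnit's dataset setup step.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
    db.executemany("INSERT INTO customer VALUES (?, ?)", SEED_ROWS)
    return db

def test_insert_adds_row():
    db = fresh_db()
    db.execute("INSERT INTO customer VALUES (3, 'carol')")
    assert db.execute("SELECT COUNT(*) FROM customer").fetchone()[0] == 3
    db.close()  # discarding the database restores the pre-test state

def test_seed_state_is_isolated():
    db = fresh_db()  # the previous test's insert must not leak in
    assert db.execute("SELECT COUNT(*) FROM customer").fetchone()[0] == 2
    db.close()

test_insert_adds_row()
test_seed_state_is_isolated()
print("ok")
```

In real DbUnit the seed data lives in an XML dataset and the setup/teardown is driven by the framework; the key property demonstrated here is the same — each test starts from an identical, known database state.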

  6. Flash Caching on the Storage Client

    OpenAIRE

    Holland, David A.; Angelino, Elaine Lee; Wald, Gideon; Seltzer, Margo I.

    2013-01-01

    Flash memory has recently become popular as a caching medium. Most uses to date are on the storage server side. We investigate a different structure: flash as a cache on the client side of a networked storage environment. We use trace-driven simulation to explore the design space. We consider a wide range of configurations and policies to determine the potential client-side caches might offer and how best to arrange them. Our results show that the flash cache writeback policy does not signifi...

  7. Mobile Database Applications in Ad Hoc Networks

    Institute of Scientific and Technical Information of China (English)

    范俊; 李晓宇

    2012-01-01

    Because the traditional mobile database model cannot adapt to the actual conditions of an Ad Hoc network, this paper improves the traditional model by adding a local server as an agent, forming a mobile database model composed of three kinds of nodes: mobile computers, local servers and a master server. Furthermore, two algorithms are proposed to solve the problems of transaction redo and of data synchronization between the local server and the master server, so that mobile computers can access the database efficiently and correctly. Experimental results show that the mobile database model achieves good stability.

  8. Database driven scheduling for batch systems

    International Nuclear Information System (INIS)

    Experiments at the Jefferson Laboratory will soon be generating data at the rate of 1 TB/day. In this paper, the authors present a database driven scheme that they are currently implementing in order to ensure the safe archival and subsequent reconstruction of this data. They use a client-server architecture implemented in Java to serve data between the experiments, the mass storage, and the processor farm
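A database-driven scheduling scheme of this kind typically reduces to a jobs table that workers poll and claim. A minimal hypothetical sketch with sqlite3 (the Jefferson Lab system is implemented in Java and is far richer; the table, columns and file names here are invented):

```python
import sqlite3

# A jobs table stands in for the scheduling database; clients claim
# the next pending data file to archive or reconstruct.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE job (
    id    INTEGER PRIMARY KEY,
    file  TEXT,
    state TEXT DEFAULT 'pending')""")
db.executemany("INSERT INTO job (file) VALUES (?)",
               [("run001.dat",), ("run002.dat",), ("run003.dat",)])

def claim_next(db):
    # Pick the oldest pending job and mark it running. In a multi-worker
    # deployment this select+update pair would run inside one transaction;
    # a single connection keeps the sketch simple.
    row = db.execute("SELECT id, file FROM job "
                     "WHERE state='pending' ORDER BY id LIMIT 1").fetchone()
    if row:
        db.execute("UPDATE job SET state='running' WHERE id=?", (row[0],))
        db.commit()
    return row

print(claim_next(db))  # (1, 'run001.dat')
print(claim_next(db))  # (2, 'run002.dat')
```

Driving the farm from a table like this is what makes the scheme "database driven": the queue survives restarts, and any client-server pair can inspect or extend it with plain SQL.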

  9. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon berries at ripening initiation

    Directory of Open Access Journals (Sweden)

    Smith Derek

    2009-01-01

    Full Text Available Abstract Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison' in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening

  10. The EarthServer project: Exploiting Identity Federations, Science Gateways and Social and Mobile Clients for Big Earth Data Analysis

    Science.gov (United States)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Messina, Antonio; Pappalardo, Marco; Passaro, Gianluca

    2013-04-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. Six Lighthouse Applications are being established in EarthServer, each of which poses distinct challenges on Earth Data Analytics: Cryospheric Science, Airborne Science, Atmospheric Science, Geology, Oceanography, and Planetary Science. Altogether, they cover all Earth Science domains; the Planetary Science use case has been added to challenge concepts and standards in non-standard environments. In addition, EarthLook (maintained by Jacobs University) showcases use of OGC standards in 1D through 5D use cases. In this contribution we will report on the first applications integrated in the EarthServer Science Gateway and on the clients for mobile appliances developed to access them. We will also show how federated and social identity services can allow Big Earth Data Providers to expose their data in a distributed environment keeping a strict and fine-grained control on user authentication and authorisation. 
The degree of fulfilment of the EarthServer implementation with the recommendations made in the recent TERENA Study on

  11. Cloud Databases: A Paradigm Shift in Databases

    OpenAIRE

    Indu Arora; Anu Gupta

    2012-01-01

    Relational databases ruled the Information Technology (IT) industry for almost 40 years. But last few years have seen sea changes in the way IT is being used and viewed. Stand alone applications have been replaced with web-based applications, dedicated servers with multiple distributed servers and dedicated storage with network storage. Cloud computing has become a reality due to its lesser cost, scalability and pay-as-you-go model. It is one of the biggest changes in IT after the rise of Wor...

  12. Towards the Interoperability of Web, Database, and Mass Storage Technologies for Petabyte Archives

    Science.gov (United States)

    Moore, Reagan; Marciano, Richard; Wan, Michael; Sherwin, Tom; Frost, Richard

    1996-01-01

    At the San Diego Supercomputer Center, a massive data analysis system (MDAS) is being developed to support data-intensive applications that manipulate terabyte-sized data sets. The objective is to support scientific application access to data whether it is located at a Web site, stored as an object in a database, and/or stored in an archival storage system. We are developing a suite of demonstration programs which illustrate how Web, database (DBMS), and archival storage (mass storage) technologies can be integrated. An application presentation interface is being designed that integrates data access to all of these sources. We have developed a data movement interface between the Illustra object-relational database and the NSL UniTree archival storage system running in production mode at the San Diego Supercomputer Center. With this interface, an Illustra client can transparently access data on UniTree under the control of the Illustra DBMS server. The current implementation is based on the creation of a new DBMS storage manager class and a set of library functions that allow the manipulation and migration of data stored as Illustra 'large objects'. We have extended this interface to allow a Web client application to control data movement between its local disk, the Web server, the Illustra DBMS server, and the UniTree mass storage environment. This paper describes some of the current approaches to successfully integrating these technologies. The framework is measured against a representative sample of environmental data extracted from the San Diego Bay Environmental Data Repository. Practical lessons are drawn and critical research areas are highlighted.

  13. From database to normbase

    NARCIS (Netherlands)

    Stamper, R.; Liu, K.; Kolkman, M.; Klarenberg, P.; Slooten, van F.; Ades, Y.; Slooten, van C.

    1991-01-01

    After the database concept, we are ready for the normbase concept. The object is to decouple organizational and technical knowledge that are now mixed inextricably together in the application programs we write today. The underlying principle is to find a way of specifying a social system as a system

  14. Database Technologies for RDF

    Science.gov (United States)

    Das, Souripriya; Srinivasan, Jagannathan

    Efficient and scalable support for RDF/OWL data storage, loading, inferencing and querying, in conjunction with already available support for enterprise level data and operations reliability requirements, can make databases suitable to act as enterprise-level RDF/OWL repository and hence become a viable platform for building semantic applications for the enterprise environments.

  15. Analysis of isotropic turbulence using a public database and the Web service model, and applications to study subgrid models

    Science.gov (United States)

    Meneveau, Charles; Yang, Yunke; Perlman, Eric; Wan, Minpin; Burns, Randal; Szalay, Alex; Chen, Shiyi; Eyink, Gregory

    2008-11-01

    A public database system archiving a direct numerical simulation (DNS) data set of isotropic, forced turbulence is used for studying basic turbulence dynamics. The data set consists of the DNS output on 1024-cubed spatial points and 1024 time-samples spanning about one large-scale turn-over timescale. This complete space-time history of turbulence is accessible to users remotely through an interface that is based on the Web-services model (see http://turbulence.pha.jhu.edu). Users may write and execute analysis programs on their host computers, while the programs make subroutine-like calls that request desired parts of the data over the network. The architecture of the database is briefly explained, as are some of the new functions such as Lagrangian particle tracking and spatial box-filtering. These tools are used to evaluate and compare subgrid stresses and models.
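The spatial box-filtering offered by the service amounts to a local average over a cube of grid points around each location. A pure-Python sketch on a tiny periodic grid (grid size, layout and function name are illustrative; the real service evaluates this server-side over the 1024-cubed data):

```python
def box_filter(field, n, width):
    # field: flat list indexed as field[x + n*(y + n*z)] on an
    # n^3 periodic grid; width: odd filter width in grid points.
    h = width // 2
    out = [0.0] * (n * n * n)
    for z in range(n):
        for y in range(n):
            for x in range(n):
                acc = 0.0
                # Sum over the width^3 cube centred on (x, y, z),
                # wrapping at the boundaries (periodic domain).
                for dz in range(-h, h + 1):
                    for dy in range(-h, h + 1):
                        for dx in range(-h, h + 1):
                            acc += field[(x + dx) % n
                                         + n * ((y + dy) % n
                                         + n * ((z + dz) % n))]
                out[x + n * (y + n * z)] = acc / width**3
    return out

n = 4
constant = [2.5] * (n * n * n)
filtered = box_filter(constant, n, 3)
print(filtered[0])  # a constant field is unchanged by the filter: 2.5
```

In large-eddy-simulation analyses this filtered field is what enters the subgrid stress, e.g. tau_ij = filter(u_i u_j) - filter(u_i) filter(u_j).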

  16. Application of Large-Scale Database-Based Online Modeling to Plant State Long-Term Estimation

    Science.gov (United States)

    Ogawa, Masatoshi; Ogai, Harutoshi

    Recently, attention has been drawn to local modeling techniques based on a new idea called “Just-In-Time (JIT) modeling”. To apply JIT modeling online to a large database, “Large-scale database-based Online Modeling (LOM)” has been proposed. LOM is a technique that makes the retrieval of neighboring data more efficient by using both “stepwise selection” and quantization. In order to predict the long-term state of the plant without using future data of manipulated variables, an Extended Sequential Prediction method of LOM (ESP-LOM) has been proposed. In this paper, the LOM and the ESP-LOM are introduced.

  17. Practical Client Puzzle from Repeated Squaring

    NARCIS (Netherlands)

    Jeckmans, A.

    2009-01-01

    Cryptographic puzzles have been proposed by Merkle [15] to relay secret information between parties over an insecure channel. Client puzzles, a type of cryptographic puzzle, have been proposed by Juels and Brainard [8] to defend a server against denial-of-service attacks. However, there is no general

  18. Finding Happiness for Ourselves and Our Clients.

    Science.gov (United States)

    Miller, Geri

    2001-01-01

    Reviews D. G. Myers' (2000) examination of the contributing factors of happiness: money, relationships, and religion. Discusses the implications of these factors for counseling with specific recommendations made for counselors regarding their own self-care and their work with their clients. (GCP)

  19. Borderline Clients: Practice Implications of Recent Research.

    Science.gov (United States)

    Johnson, Harriette C.

    1991-01-01

    Reviews current research on treatment of borderline clients with medication, individual counseling, and family interventions. Notes that recent studies indicate that borderline personality is heterogeneous condition in which different underlying disorders (affective, schizotypal, and neurological) may be present. Reviews effectiveness of various…

  20. Network Intrusion Detection System Based on Client Honeypot

    Institute of Scientific and Technical Information of China (English)

    忻俊

    2015-01-01

    With changing user demands and the rapid development of Web application technology, Web applications have become more open and more focused on sharing and interaction. This has made Web applications the mainstream of today's network applications, but also a new target for hackers. Hackers implant malicious code into websites, so Web-borne threats keep multiplying and the Web has become one of the major infection vectors for information security attacks. This paper introduces a method for detecting malicious web pages: the Client Honeypot. A Client Honeypot uses the client side to actively interact with a Web server in order to probe for and trap attacks, in contrast to the passive detection model of traditional intrusion detection systems. This study builds on and improves the open-source tool HoneyC to implement the detection of malicious web pages.

  1. Lead generation using pharmacophore mapping and three-dimensional database searching: application to muscarinic M(3) receptor antagonists.

    Science.gov (United States)

    Marriott, D P; Dougall, I G; Meghani, P; Liu, Y J; Flower, D R

    1999-08-26

    By using a pharmacophore model, a geometrical representation of the features necessary for molecules to show a particular biological activity, it is possible to search databases containing the 3D structures of molecules and identify novel compounds which may possess this activity. We describe our experiences of establishing a working 3D database system and its use in rational drug design. By using muscarinic M(3) receptor antagonists as an example, we show that it is possible to identify potent novel lead compounds using this approach. Pharmacophore generation based on the structures of known M(3) receptor antagonists, 3D database searching, and medium-throughput screening were used to identify candidate compounds. Three compounds were chosen to define the pharmacophore: a lung-selective M(3) antagonist patented by Pfizer and two Astra compounds which show affinity at the M(3) receptor. From these, a pharmacophore model was generated, using the program DISCO, and this was used subsequently to search a UNITY 3D database of proprietary compounds; 172 compounds were found to fit the pharmacophore. These compounds were then screened, and 1-[2-(2-(diethylamino)ethoxy)phenyl]-2-phenylethanone (pA(2) 6.67) was identified as the best hit, with N-[2-(piperidin-1-ylmethyl)cyclohexyl]-2-propoxybenzamide (pA(2) 4.83) and phenylcarbamic acid 2-(morpholin-4-ylmethyl)cyclohexyl ester (pA(2) 5.54) demonstrating lower activity. As well as its potency, 1-[2-(2-(diethylamino)ethoxy)phenyl]-2-phenylethanone is a simple structure with limited similarity to existing M(3) receptor antagonists.

  2. A Visual Database System for Image Analysis on Parallel Computers and its Application to the EOS Amazon Project

    Science.gov (United States)

    Shapiro, Linda G.; Tanimoto, Steven L.; Ahrens, James P.

    1996-01-01

    The goal of this task was to create a design and prototype implementation of a database environment that is particularly suited for handling the image, vision and scientific data associated with NASA's EOS Amazon project. The focus was on a data model and query facilities that are designed to execute efficiently on parallel computers. A key feature of the environment is an interface which allows a scientist to specify high-level directives about how query execution should occur.

  3. Application of SIG and OLAP technologies on IBGE databases as a decision support tool for the county administration

    Directory of Open Access Journals (Sweden)

    REGO, E. A.

    2008-06-01

    Full Text Available This paper shows the development of a Decision Support System for any Brazilian county, free of any cost to the researcher. To do so, it uses data warehouse, OLAP and GIS technologies together with the IBGE database to give the user a query-building tool, showing the results in map and/or table format in a very simple and efficient way.

  4. The Ideal Characteristics and Content of a Database and Its Useful Application in Improving the Effectiveness of Direct Marketing Campaigns

    Institute of Scientific and Technical Information of China (English)

    QIN Zhi-chao

    2013-01-01

    Direct marketing is now a well-known discipline and widely used in almost every industry all around the world. The mid to late 2000s saw a huge growth of direct marketing due to the development of technology and the increasing number of well-educated marketers (Tapp, 2008). According to the UK's Institute of Direct Marketing (as cited in Sargeant & West, 2001, p.7), direct marketing is "the planned recording, analysis and tracking of customer's direct response behaviour over time… in order to develop future marketing strategies for long term customer loyalty and to ensure continued business growth". As Tapp (2008) points out, the database is the core of direct marketing. So what is a database in the field of direct marketing? A definition is given by Tapp (2008, p.32): "A marketing database is a list of customers' and prospects' records that enables strategic analysis, and individual selections for communication and customer service support. The data is organized around the customer".

  5. Reengineering multi tiered enterprise business applications for performance enhancement and reciprocal or rectangular hyperbolic relation of variation of data transportation time with row pre-fetch size of relational database drivers

    CERN Document Server

    Sowmiyanarayanan, Sridhar

    2012-01-01

    Reengineering multi tiered enterprise business applications for performance enhancement and reciprocal or rectangular hyperbolic relation of variation of data transportation time with row pre-fetch size of relational database drivers

  6. Proposal for the Award of a Contract, without competitive Tendering, for the Provision of the ORACLE Database Management System Software together with Tools for Application Development and System Exploitation

    CERN Document Server

    1995-01-01

    Proposal for the Award of a Contract, without competitive Tendering, for the Provision of the ORACLE Database Management System Software together with Tools for Application Development and System Exploitation

  7. Prioritizing Project Performance Criteria within Client Perspective

    Directory of Open Access Journals (Sweden)

    Arazi Idrus

    2011-10-01

    Full Text Available Successful performance in a construction project helps to deliver good products to the client. At present, there is no standard approach used by clients to evaluate project performance as project success carries different definitions to different people. Some used the traditional project performance measures of cost, quality and time while others used additional non-traditional measures such as the environment, health and safety, level of technology and contractor planning. The purpose of this study is to identify and rank the actual criteria used by local clients in current practice to measure the performance of a construction project during construction as well as upon completion. The ranking is based on the relative importance of the criteria as perceived by project performance decision makers working for clients’ organizations within the Malaysian construction industry using their accumulated experience and judgment. The objective of this study was investigated through a postal questionnaire which covered a selected sample of the study. Data were analyzed using mean, variance, frequency and severity index analyses. The results of this paper show that Quality of finished project, Construction cost and Construction time were the three most important criteria considered crucial by the respondents for evaluating project performance from current practice in Malaysia. The paper provides supportive practical solution for project performance decision makers working for clients’ organizations within the Malaysian construction industry to enhance and improve their practices in measuring their clients’ project performance so that their clients would enjoy higher satisfaction levels from their projects. More so, the paper would serve as a guide to contractors by helping them to understand that Quality of finished project, Construction cost and Construction time are the criteria given high priority by clients in measuring the performance of a

  8. An architecture for mobile database management system

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to design a new kind of mobile database management system (DBMS) better suited to mobile computing than existing DBMSs, the essence of database systems in mobile computing is analyzed. The view is introduced that a mobile database is a kind of dynamic distributed database, and the concept of virtual servers, which translate the clients' mobility into the servers' mobility, is proposed. Based on these views, a versatile architecture for a mobile DBMS is presented. The architecture is composed of a virtual server and a local DBMS; the virtual server is the kernel of the architecture, and its functions are described. Finally, the server kernel of a mobile DBMS prototype is illustrated.
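The virtual-server idea — giving the mobile client one stable endpoint while the backing physical server changes as the client moves — can be sketched as a simple facade. All class and method names below are invented for illustration; they are not from the paper:

```python
class PhysicalServer:
    """A fixed DBMS node serving one coverage area."""
    def __init__(self, name):
        self.name = name
        self.store = {}

    def query(self, key):
        return self.store.get(key)

class VirtualServer:
    """Stable facade: client mobility becomes a server-side re-binding."""
    def __init__(self, backend):
        self.backend = backend

    def handoff(self, new_backend):
        # Migrate state to the new physical server, then re-bind.
        # The client keeps talking to the same VirtualServer object.
        new_backend.store.update(self.backend.store)
        self.backend = new_backend

    def query(self, key):
        return self.backend.query(key)

cell_a, cell_b = PhysicalServer("cell-A"), PhysicalServer("cell-B")
cell_a.store["odometer"] = 12345
vs = VirtualServer(cell_a)
print(vs.query("odometer"))  # 12345, served from cell-A
vs.handoff(cell_b)           # client moved into cell-B's coverage
print(vs.query("odometer"))  # 12345, now served from cell-B
```

The point of the pattern is that the mobile client's address book never changes; only the binding inside the virtual server does, which is exactly the "clients' mobility translated to servers' mobility" the abstract describes.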

  9. Distributing Delphi Database Applications Based on the Registry

    Institute of Scientific and Technical Information of China (English)

    陈明; 宋宝卫

    2001-01-01

    In this paper, the authors propose a method for distributing Delphi-based database applications. Its feature is that it avoids the traditional method of installing the application together with the BDE drivers: while the application is being installed, the method writes the BDE system configuration information by modifying the registry and copies the BDE configuration files.

  10. The CHIANTI atomic database

    CERN Document Server

    Young, Peter R; Landi, Enrico; Del Zanna, Giulio; Mason, Helen

    2015-01-01

    The CHIANTI atomic database was first released in 1996 and has had a huge impact on the analysis and modeling of emissions from astrophysical plasmas. The database has continued to be updated, with version 8 released in 2015. Atomic data for modeling the emissivities of 246 ions and neutrals are contained in CHIANTI, together with data for deriving the ionization fractions of all elements up to zinc. The different types of atomic data are summarized here and their formats discussed. Statistics on the impact of CHIANTI to the astrophysical community are given and examples of the diverse range of applications are presented.

  11. A Robust Client Verification in Cloud Enabled m-Commerce using Gaining Protocol

    Directory of Open Access Journals (Sweden)

    Chitra Kiran N.

    2011-11-01

    Full Text Available The proposed system highlights a novel approach: an exclusive verification process using a gain protocol for ensuring security between both parties (client and service provider) in an m-commerce application with a cloud-enabled service. The proposed system is based on the potential to verify clients with a trusted handheld device, depending on the set of frequent events and actions to be carried out. The framework of the proposed work was designed after collecting real-time data sets from an Android-enabled handset which, when subjected to the gain protocol, results in the detection of malicious behaviour by illegal clients in the network. The real-time experiment was performed with the applicable data sets gathered, which show the best results for identifying threats from the last two months of collected data.

  12. A Robust Client Verification in cloud enabled m-Commerce using Gaining Protocol

    CERN Document Server

    N., Chitra Kiran

    2012-01-01

    The proposed system presents a novel, exclusive verification process that uses a gain protocol to ensure security for both parties (client and service provider) in an m-commerce application with a cloud-enabled service. The system rests on the ability to verify clients via a trusted handheld device, based on a set of frequent events and actions to be carried out. The framework was designed after collecting real-time data sets from an Android-enabled handset; when these are subjected to the gain protocol, malicious behavior by illegitimate clients in the network is detected. A real-time experiment performed on the gathered data sets showed the best results for identifying threats in the last two months of collected data.

  13. A New Type of Distributed Application Server System Design Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ying-ying Chen

    2012-11-01

    Full Text Available At present, application server systems such as e-commerce platforms, instant messaging systems, and enterprise information systems can suffer dropped connections and data latency under too many concurrent requests, owing to limits in the application server architecture and overall system architecture; in serious cases the running server blocks entirely. The new type of application server system contains four parts: a client program, transfer servers, application servers, and databases. The application server is the core of the system, and its performance determines the system's performance. At the same time, the application servers and transfer servers can be designed as open web services and realized as a distributed architecture across a number of hardware servers, which can effectively handle highly concurrent client application requests.

  14. A Concurrency Control Method Based on Commitment Ordering in Mobile Databases

    CERN Document Server

    Karami, Ali

    2011-01-01

    Disconnection of mobile clients from the server, at an unpredictable time and for an unknown duration, caused by the mobility of the clients, is the most important challenge for concurrency control in mobile databases with a client-server model. Applying classic pessimistic concurrency control methods (such as 2PL) in a mobile database leads to long blocking periods and increased transaction waiting times. Because of their high transaction abort rates, conventional optimistic methods are not appropriate in mobile databases either. In this article, the OPCOT concurrency control algorithm is introduced, based on the optimistic concurrency control method. Reducing communication between mobile client and server, decreasing the blocking and deadlock rates of transactions, and increasing the degree of concurrency are the main motivations for using an optimistic method as the basis of the OPCOT algorithm. To reduce the transaction abort rate, a timestamp is assigned to transactions' operations at execution time. In order to check commitment o...
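    The optimistic approach the abstract builds on can be illustrated with a toy backward-validation scheme: a committing transaction is checked against the write sets of transactions that committed after it started, and aborts on overlap. This is a generic OCC sketch to show the idea, not the OPCOT algorithm itself.

```python
# Toy backward validation for optimistic concurrency control (OCC): a
# committing transaction aborts if its read set overlaps the write set of any
# transaction that committed after it started. Illustrative only; not OPCOT.

class Validator:
    def __init__(self):
        self.clock = 0
        self.committed = []          # list of (commit_ts, write_set)

    def start(self):
        """Return a start timestamp for a new transaction."""
        return self.clock

    def try_commit(self, start_ts, read_set, write_set):
        """Commit if no overlapping writer committed since start_ts."""
        for commit_ts, ws in self.committed:
            if commit_ts > start_ts and ws & read_set:
                return False         # conflict detected: abort
        self.clock += 1
        self.committed.append((self.clock, set(write_set)))
        return True

v = Validator()
t1 = v.start(); t2 = v.start()       # both start before any commit
assert v.try_commit(t1, {"x"}, {"x"})          # T1 writes x and commits
print(v.try_commit(t2, {"x"}, {"y"}))          # T2 read x before T1 committed -> False
```

    Validation at commit time is what lets a disconnected mobile client work without holding locks, at the price of possible aborts, which is the trade-off the paper's timestamp mechanism targets.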

  15. Primary Numbers Database for ATLAS Detector Description Parameters

    CERN Document Server

    Vaniachine, A; Malon, D; Nevski, P; Wenaus, T

    2003-01-01

    We present the design and the status of the database for detector description parameters in the ATLAS experiment. The ATLAS Primary Numbers are the parameters defining the detector geometry and digitization in simulations, as well as certain reconstruction parameters. Since the detailed ATLAS detector description needs more than 10,000 such parameters, a preferred solution is to have a single verified source for all these data. The database stores the data dictionary for each parameter collection object, providing schema evolution support for object-based retrieval of parameters. The same Primary Numbers are served to many different clients accessing the database: the ATLAS software framework Athena, the Geant3 heritage framework Atlsim, the Geant4 developers framework FADS/Goofy, the generator of XML output for detector description, and several end-user clients for interactive data navigation, including web-based browsers and ROOT. The choice of the MySQL database product for the implementation provides addition...

  16. Building strategies for tsunami scenarios databases to be used in a tsunami early warning decision support system: an application to western Iberia

    Science.gov (United States)

    Tinti, S.; Armigliato, A.; Pagnoni, G.; Zaniboni, F.

    2012-04-01

    One of the most challenging goals that the geo-scientific community has faced since the catastrophic tsunami that occurred in December 2004 in the Indian Ocean is to develop the so-called "next generation" Tsunami Early Warning Systems (TEWS). Indeed, "next generation" does not refer to the aim of a TEWS, which obviously remains to detect whether or not a tsunami has been generated by a given source and, if so, to send proper warnings and/or alerts in suitable time to all the countries and communities that can be affected by the tsunami. Instead, "next generation" identifies with the development of a Decision Support System (DSS) that, in general terms, relies on 1) an integrated set of seismic, geodetic and marine sensors whose objective is to detect and characterise the possible tsunamigenic sources and to monitor instrumentally the time and space evolution of the generated tsunami, 2) databases of pre-computed numerical tsunami scenarios to be suitably combined based on the information coming from the sensor environment and to be used to forecast the degree of exposure of different coastal places both in the near- and in the far-field, and 3) a proper overall (software) system architecture. The EU-FP7 TRIDEC Project aims at developing such a DSS and has selected two test areas in the Euro-Mediterranean region, namely the western Iberian margin and the eastern Mediterranean (Turkish coasts). In this study, we discuss the strategies being adopted in TRIDEC to build the databases of pre-computed tsunami scenarios and we show some applications to the western Iberian margin. In particular, two different databases are being populated, called the "Virtual Scenario Database" (VSDB) and the "Matching Scenario Database" (MSDB). The VSDB contains detailed simulations of a few selected earthquake-generated tsunamis. The cases provided by the members of the VSDB are computed "real events"; in other words, they represent the unknowns that the TRIDEC

  17. Evaluation of in-memory database TimesTen

    OpenAIRE

    Andras Simon, Endre; Potocky, Miroslav

    2013-01-01

    Project Specification: Oracle TimesTen In-Memory Database is a full-featured, memory-optimized, relational database with persistence and recoverability. For existing application data residing on the Oracle Database, TimesTen can serve as an in-memory cache database. This setup can provide a great performance increase and almost instant responsiveness for database-intensive applications. Cooperation between application and database support is needed to test integration, benefits and possibili...

  18. A Molecular Biology Database Digest

    OpenAIRE

    Bry, François; Kröger, Peer

    2000-01-01

    Computational Biology or Bioinformatics has been defined as the application of mathematical and Computer Science methods to solving problems in Molecular Biology that require large-scale data, computation, and analysis [18]. As expected, Molecular Biology databases play an essential role in Computational Biology research and development. This paper introduces current Molecular Biology databases, stressing data modeling, data acquisition, data retrieval, and the integration

  19. Content independence in multimedia databases

    NARCIS (Netherlands)

    Vries, A.P. de

    2001-01-01

    A database management system is a general-purpose software system that facilitates the processes of defining, constructing, and manipulating databases for various applications. This article investigates the role of data management in multimedia digital libraries, and its implications for the design

  20. Counselor Stress in Relation to Disabled and Minority Clients

    Science.gov (United States)

    Vander Kolk, Charles J.

    1977-01-01

    Physiological and self-reported reactions of counselors in training to five disabled clients and a minority client were examined. Implications for counselor practice, education, and in-service education are discussed. (Author)

  1. Voice and Communication Therapy for Transgender/Transsexual Clients

    Science.gov (United States)

    Voice and Communication Therapy for Clients Who Are Transgender and/or Transsexual: What does the speech-language pathologist do when working with clients who are transgender/transsexual? What organizations have more information?

  2. Client Centeredness and Health Reform: Key Issues for Occupational Therapy

    OpenAIRE

    Mroz, Tracy M.; Pitonyak, Jennifer S.; Fogelberg, Donald; Leland, Natalie E.

    2015-01-01

    Occupational therapy has the philosophical underpinnings to provide expanded and more effective client-centered care that emphasizes the active engagement of the client and recognizes the greater contexts of his or her life.

  3. Asymmetry of Responsiveness in Client-Centered Therapy

    Science.gov (United States)

    Shapiro, David A.

    1977-01-01

    Each utterance of a psychotherapy session conducted by Carl Rogers was transcribed on a separate card. Fifteen undergraduate subjects reconstituted client-therapist sequences more accurately than therapist-client sequences. (Author)

  4. Cryptanalysis of Some Client-to-Client Password-Authenticated Key Exchange Protocols

    Directory of Open Access Journals (Sweden)

    Tianjie Cao

    2009-06-01

    Full Text Available Client-to-Client Password-Authenticated Key Exchange (C2C-PAKE) protocols allow two clients to establish a common session key based on their passwords. In a secure C2C-PAKE protocol, no computationally bounded adversary can learn anything about the session keys shared between two clients; in particular, a participating server should not learn anything about session keys. Server-compromise impersonation resilience is another desirable security property for a C2C-PAKE protocol: compromising the password verifier of any client A should not enable an outside adversary to share a session key with A. Recently, Kwon and Lee proposed four C2C-PAKE protocols in the three-party setting, and Zhu et al. proposed a C2C-PAKE protocol in the cross-realm setting. All the proposed protocols are claimed to resist server compromise. However, in this paper we show that Kwon and Lee's protocols and Zhu et al.'s protocol are vulnerable to server compromise attacks, and that a malicious server can mount man-in-the-middle attacks and eavesdrop on the communication between the two clients.

  5. Client-to-client Password-Based Authenticated Key Establishment in a Cross-Realm Setting

    Directory of Open Access Journals (Sweden)

    Shuhua Wu

    2009-09-01

    Full Text Available The area of password-based authenticated key establishment protocols has been the subject of a vast amount of work in the last few years due to its practical aspects. Despite the attention given to it, most password-authenticated key establishment (PAKE) schemes in the literature consider authentication between a client and a server. Although some have been extended to three-party PAKE protocols, in which a trusted server mediates between two clients to allow mutual authentication, the cross-realm setting, as in the Kerberos system, has received less consideration. In this paper, we propose a provably secure password-authenticated key establishment protocol in a cross-realm setting, where two clients in different realms obtain a secret session key as well as mutual authentication, with the help of their respective servers. We deal with it using ideas similar to those used in the three-party protocol of M. Abdalla et al. In our protocol, each client first establishes a secure channel with its server, and the servers then securely distribute a fresh common session key to the two clients. One attractive feature is that our protocol can easily be extended to a more general scenario where a common key is established among more than two clients. Moreover, analysis shows that the proposed protocol has a per-user computational cost comparable to that of the underlying two-party encrypted key exchange.
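    The final step described in the abstract can be sketched as a toy message flow: once each client shares a secure channel with its own server (modeled here simply as a pre-established symmetric key), the servers hand one fresh session key to both clients. This mirrors only the key-distribution flow with stdlib HMAC primitives; it is not the paper's provably secure PAKE construction.

```python
# Toy cross-realm session-key distribution: clients A and B each hold a
# channel key with their realm's server; the servers wrap one fresh session
# key for each client. Demo-grade XOR "encryption" with an HMAC-derived
# keystream; NOT the paper's protocol and NOT production cryptography.
import hmac, hashlib, os

def wrap(channel_key, session_key):
    """'Encrypt' a 32-byte key by XOR with an HMAC-SHA256 keystream."""
    stream = hmac.new(channel_key, b"wrap", hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(session_key, stream))

def unwrap(channel_key, blob):
    return wrap(channel_key, blob)   # XOR is its own inverse

ka = os.urandom(32)          # A <-> server_A channel key
kb = os.urandom(32)          # B <-> server_B channel key
sk = os.urandom(32)          # fresh session key chosen server-side

blob_a, blob_b = wrap(ka, sk), wrap(kb, sk)   # sent to A and B respectively
assert unwrap(ka, blob_a) == unwrap(kb, blob_b) == sk
print("clients share the same session key")
```

    The point of the flow is that neither client ever sends its password across realms; each only talks to its own server, which is the structural property the cross-realm setting provides.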

  6. Construction and application of the BIM database based on IFC standard%基于IFC标准BIM数据库的构建与应用

    Institute of Scientific and Technical Information of China (English)

    李犁; 邓雪原

    2013-01-01

    Recently, the BIM concept, in which all professionals in the construction industry carry out collaborative work across domains and departments, has attracted increasing attention from research scholars and civil engineers around the world. The core of BIM technology is information sharing and exchange across the building life cycle. Addressing this key problem, this article reviews the state of BIM research and development at home and abroad, and identifies several common problems in current BIM development: information from multiple projects cannot be stored in a centralized database; building information is lost or corrupted when popular BIM software imports and exports IFC model files; application software based on BIM technology remains scarce; and so on. The paper argues that the realization of BIM technology should rest on a BIM database built on the IFC standard. It then discusses how to build the IFC-based BIM database, the database's application interface, its budgetary-estimate function, and the migration of structural model conversion to the BIM database. A few examples demonstrate the feasibility and reliability of the BIM database and the corresponding applications developed in this research. It is concluded that the IFC-based BIM database is the foundation for developing building collaboration platforms and the digital city.

  7. Database Vs Data Warehouse

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Data warehouse technology includes a set of concepts and methods that offer users useful information for decision making. The necessity to build a data warehouse arises from the need to improve the quality of information in the organization. Data coming from different sources, in a variety of forms both structured and unstructured, are filtered according to business rules and integrated into a single large data collection. Using informatics solutions, managers have understood that the data stored in operational systems, including databases, are an informational gold mine that must be exploited. Data warehouses have been developed to answer the increasing demand for complex analysis, which could not be properly satisfied with operational databases. The present paper emphasizes some of the criteria that application developers can use in order to choose between a database solution and a data warehouse solution.

  8. APPLICATION OF THE UNIFIED STATISTICAL MATERIAL DATABASE FOR DESIGN AND LIFE/RISK ASSESSMENT OF HIGH TEMPERATURE COMPONENTS

    Institute of Scientific and Technical Information of China (English)

    K.Fujiyama; T.Fujiwara; Y.Nakatani; K.Saito; A.Sakuma; Y.Akikuni; S.Hayashi; S.Matsumoto

    2004-01-01

    Statistical manipulation of material data was conducted for probabilistic life assessment and risk-based design and maintenance of high-temperature components of power plants. To obtain the statistical distribution of material properties, dominant parameters affecting material properties are introduced into the normalization of statistical variables. These parameters include hardness, chemical composition, characteristic microstructural features, and so on. Creep and fatigue properties are expressed by normalized parameters and unified statistical distributions are obtained. These probability distribution functions show good statistical agreement with the field database of steam turbine components. It was concluded that the unified statistical baseline approach is useful for the risk management of components in power plants.

  9. Cloud database development and management

    CERN Document Server

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere. However, one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both the theory and practice with step-by-step instructions and examples. This book helps readers to set up a cloud computing environment for teaching and learning database systems. The book will cover adequate conceptual content for students and IT professionals to gain necessary knowledge and hands-on skills to set up cloud based database systems.

  10. Involvement of the Client in Home Care Practice

    DEFF Research Database (Denmark)

    Glasdam, Stinne; Kjær, Lone; Præstegaard, Jeanette

    2011-01-01

    Background: Through the last 35 years, ‘client involvement’ has been a mantra within health policies, education curriculums and health care institutions, yet very little is known about how ‘client involvement’ is practiced in the meetings with clients and health professionals. Aim: To analyse...

  11. Client involvement in home care practice: a relational sociological perspective

    DEFF Research Database (Denmark)

    Glasdam, Stinne; Henriksen, Nina; Kjær, Lone;

    2012-01-01

    Client involvement’ has been a mantra within health policies, education curricula and healthcare institutions over many years, yet very little is known about how ‘client involvement’ is practised in home-care services. The aim of this article is to analyse ‘client involvement’ in practise seen f...

  12. Developing Individualized Behavior Change Goals with Clients: A Procedure.

    Science.gov (United States)

    Weigel, Richard G.; Uhlemann, Max R.

    This document reviews 10 specific and sequential steps which have emerged as being particularly effective in assisting clients in developing individualized behavior change goals in psychotherapy. The therapist and client typically work through these steps together near the beginning of treatment, but only after the client has had the opportunity…

  13. Attitudes of Social Work Students toward Clients with Basic Needs

    Science.gov (United States)

    Krumer-Nevo, Michal; Lev-Wiesel, Rachel

    2005-01-01

    This study examines the attitudes of 91 undergraduate social work students toward clients with basic needs in Israel. The results indicate that only about 1/3 of the students consider the treatment of clients with basic needs to be a part of the profession. In addition, a positive correlation was found between willingness to help clients with…

  14. Impact of Client Suicide on Practitioner Posttraumatic Growth

    Science.gov (United States)

    Munson, Joseph Simon

    2009-01-01

    Our purpose was to examine posttraumatic growth in clinicians after the suicide death of a client. An experience such as a client suicide could be an opportunity for growth or a danger for the practitioner to become traumatized. Thus, the clinician who works with clients who complete suicide may either suffer or experience a positive change from…

  15. An Occupational Performance Process Model: Fostering Client and Therapist Alliances.

    Science.gov (United States)

    Fearing, Virginia G.; And Others

    1997-01-01

    The seven stages of an occupational performance process model focus on client participation and client-centered practice. The model provides a systematic method for developing occupational therapy assessment and intervention that result in a collaborative approach to client-identified occupational performance issues. (SK)

  16. Counselor Beliefs and Perceived Knowledge Regarding Clients with Learning Disabilities

    Science.gov (United States)

    Bell, Tamekia R.

    2012-01-01

    Clients with learning disabilities constitute a cultural group that has not been extensively studied. The professional literature has found that counselors have reported the need for additional training in working with clients with disabilities. This study explored counselors' beliefs and perceived knowledge regarding counseling clients with…

  17. 32 CFR 776.33 - Client under a disability.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Client under a disability. 776.33 Section 776.33... of Professional Conduct § 776.33 Client under a disability. (a) Client under a disability: (1) When a... impaired, whether because of minority, mental disability, or for some other reason, the covered...

  18. Incorporating Perceived Importance of Service Elements into Client Satisfaction Measures

    Science.gov (United States)

    Hsieh, Chang-Ming

    2012-01-01

    Objective: The purpose of this study was to assess the need for incorporating perceived importance of service elements into client satisfaction measures. Method: A secondary analysis of client satisfaction data from 112 clients of an elderly case management setting was conducted. Results: This study found that the relationship between global…

  19. Research on Testing Client/Server Systems

    Institute of Scientific and Technical Information of China (English)

    兰景英

    2006-01-01

    Addressing the structure and characteristics of Client/Server systems, this paper analyses the difficulties of testing such systems and proposes testing methods covering three aspects: structural testing, functional testing, and system performance testing.

  20. What Business Students Should Know about Attorney-Client Privilege

    Science.gov (United States)

    Draba, Robert; Marshall, Brent

    2012-01-01

    The case law on attorney-client privilege is extensive and can be somewhat complex. Over seven hundred articles in Westlaw, for example, have the phrase "attorney-client privilege" in the title; in the last three years alone, there have been over 3700 federal cases in which the phrase "attorney-client privilege" appears at least once. However,…

  1. Counselor Interventions Preceding Client Laughter in Brief Therapy.

    Science.gov (United States)

    Falk, Dana R.; Hill, Clara E.

    1992-01-01

    Examined whether 6 categories of counselor humor and 4 categories of risk interventions preceded client laughter in 236 events from 8 cases of brief psychotherapy. Found most client laughter was mild and moderate, with only eight instances of strong laughter. Humorous interventions led to more client laughter than did interventions that encouraged…

  2. Can Knowledge of Client Birth Order Bias Clinical Judgment?

    Science.gov (United States)

    Stewart, Allan E.

    2004-01-01

    Clinicians (N = 308) responded to identical counseling vignettes of a male client that differed only in the client's stated birth order. Clinicians developed different impressions about the client and his family experiences that corresponded with the prototypical descriptions of persons from 1 of 4 birth orders (i.e., first, middle, youngest, and…

  3. Accommodating Extension Clients Who Face Language, Vision, or Hearing Challenges

    Science.gov (United States)

    Angima, Sam; Etuk, Lena; Maddy, Deborah

    2016-01-01

    A survey-based study explored approaches used by one land-grant university to meet the needs of Extension clients who face language, vision, or hearing challenges. In attempts to serve such clients, the greatest gaps existed for clients whose main language was Spanish, followed by those who had vision impairments and then those who had hearing…

  4. The CUTLASS database facilities

    International Nuclear Information System (INIS)

    The enhancement of the CUTLASS database management system to provide improved facilities for data handling is seen as a prerequisite to its effective use for future power station data processing and control applications. This particularly applies to the larger projects such as AGR data processing system refurbishments, and the data processing systems required for the new Coal Fired Reference Design stations. In anticipation of the need for improved data handling facilities in CUTLASS, the CEGB established a User Sub-Group in the early 1980s to define the database facilities required by users. Following the endorsement of the resulting specification and a detailed design study, the database facilities have been implemented as an integral part of the CUTLASS system. This paper provides an introduction to the range of CUTLASS Database facilities, and emphasises the role of Database as the central facility around which future Kit 1 and (particularly) Kit 6 CUTLASS based data processing and control systems will be designed and implemented. (author)

  5. TESS: a geometric hashing algorithm for deriving 3D coordinate templates for searching structural databases. Application to enzyme active sites.

    Science.gov (United States)

    Wallace, A C; Borkakoti, N; Thornton, J M

    1997-11-01

    It is well established that sequence templates such as those in the PROSITE and PRINTS databases are powerful tools for predicting the biological function and tertiary structure for newly derived protein sequences. The number of X-ray and NMR protein structures is increasing rapidly and it is apparent that a 3D equivalent of the sequence templates is needed. Here, we describe an algorithm called TESS that automatically derives 3D templates from structures deposited in the Brookhaven Protein Data Bank. While a new sequence can be searched for sequence patterns, a new structure can be scanned against these 3D templates to identify functional sites. As examples, 3D templates are derived for enzymes with an O-His-O "catalytic triad" and for the ribonucleases and lysozymes. When these 3D templates are applied to a large data set of nonidentical proteins, several interesting hits are located. This suggests that the development of a 3D template database may help to identify the function of new protein structures, if unknown, as well as to design proteins with specific functions.
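    The matching step behind such 3D templates can be sketched without the hashing machinery: a template is a set of labelled coordinates (e.g. catalytic atoms), and a candidate set of atoms from a new structure matches when all corresponding pairwise distances agree within a tolerance. Real TESS uses geometric hashing to make this search fast over whole databases; the brute-force equivalent below only illustrates the matching criterion.

```python
# Minimal 3D template matching in the spirit of TESS: compare all pairwise
# distances of a labelled template against a candidate atom triplet, within a
# tolerance. Coordinates and the 0.5 A tolerance are illustrative values.
from itertools import combinations
from math import dist

def matches(template, candidate, tol=0.5):
    """True if corresponding pairwise distances differ by less than tol."""
    return all(
        abs(dist(template[i], template[j]) - dist(candidate[i], candidate[j])) < tol
        for i, j in combinations(range(len(template)), 2)
    )

triad = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (0.0, 4.0, 0.0)]    # template atoms
hit   = [(1.0, 1.0, 1.0), (4.1, 1.0, 1.0), (1.0, 4.9, 1.0)]    # near-copy, shifted
miss  = [(0.0, 0.0, 0.0), (8.0, 0.0, 0.0), (0.0, 4.0, 0.0)]    # wrong geometry
print(matches(triad, hit), matches(triad, miss))               # -> True False
```

    Because only internal distances are compared, the test is automatically invariant to rotation and translation of the candidate structure, which is why distance-based templates transfer between superposed and unsuperposed coordinates.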

  6. Analyzing legacy U.S. Geological Survey geochemical databases using GIS: applications for a national mineral resource assessment

    Science.gov (United States)

    Yager, Douglas B.; Hofstra, Albert H.; Granitto, Matthew

    2012-01-01

    This report emphasizes geographic information system analysis and the display of data stored in the legacy U.S. Geological Survey National Geochemical Database for use in mineral resource investigations. Geochemical analyses of soils, stream sediments, and rocks that are archived in the National Geochemical Database provide an extensive data source for investigating geochemical anomalies. A study area in the Egan Range of east-central Nevada was used to develop a geographic information system analysis methodology for two different geochemical datasets involving detailed (Bureau of Land Management Wilderness) and reconnaissance-scale (National Uranium Resource Evaluation) investigations. ArcGIS was used to analyze and thematically map geochemical information at point locations. Watershed-boundary datasets served as a geographic reference to relate potentially anomalous sample sites with hydrologic unit codes at varying scales. The National Hydrography Dataset was analyzed with Hydrography Event Management and ArcGIS Utility Network Analyst tools to delineate potential sediment-sample provenance along a stream network. These tools can be used to track potential upstream-sediment-contributing areas to a sample site. This methodology identifies geochemically anomalous sample sites, watersheds, and streams that could help focus mineral resource investigations in the field.
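    The upstream-trace idea in the methodology can be sketched as a graph walk: with stream reaches stored as directed downstream edges, every reach that can contribute sediment to a sample site is found by a breadth-first walk on the reversed network. The NHD/ArcGIS Utility Network Analyst tools in the report perform this on real hydrography; reach IDs below are hypothetical.

```python
# Sketch: trace potential upstream sediment-contributing reaches for a sample
# site by breadth-first search on the reversed stream network. Reach IDs and
# connectivity are hypothetical stand-ins for NHD flowline data.
from collections import defaultdict, deque

def upstream(edges, site):
    """Return the set of all reaches draining to `site` (site excluded)."""
    rev = defaultdict(list)
    for a, b in edges:                 # edge (a, b) means a flows into b
        rev[b].append(a)
    seen, queue = set(), deque([site])
    while queue:
        node = queue.popleft()
        for up in rev[node]:
            if up not in seen:
                seen.add(up)
                queue.append(up)
    return seen

# Hypothetical network: reaches 1 and 2 join at 3; 3 and 4 join at site 5.
net = [(1, 3), (2, 3), (3, 5), (4, 5), (5, 6)]
print(sorted(upstream(net, 5)))        # -> [1, 2, 3, 4]
```

    Linking the traced reaches back to watershed-boundary hydrologic unit codes then bounds the land area that could have sourced an anomalous sediment sample.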

  7. Computer programme for control and maintenance and object oriented database: application to the realisation of an particle accelerator, the VIVITRON

    International Nuclear Information System (INIS)

    The command and control system for the Vivitron, a new-generation electrostatic particle accelerator, has been implemented on workstations and front-end computers using VME standards, all within a UNIX/VxWorks environment. This architecture is distributed over an Ethernet network. Measurements and commands for the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software, giving better performance and more functionality, is described. X11-based communication is used to transmit all the information necessary to display front-end computer parameters on the graphic screens. All other communication between processes uses the Remote Procedure Call (RPC) method. The design of the system is based largely on the object-oriented database O2, which integrates a full description of the equipment and the code necessary to manage it. This code is generated by the database. This innovation permits easy maintenance of the system and removes the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author)

  8. A computational framework for the statistical analysis of cardiac diffusion tensors: application to a small database of canine hearts.

    Science.gov (United States)

    Peyrat, Jean-Marc; Sermesant, Maxime; Pennec, Xavier; Delingette, Hervé; Xu, Chenyang; McVeigh, Elliot R; Ayache, Nicholas

    2007-11-01

    We propose a unified computational framework to build a statistical atlas of the cardiac fiber architecture from diffusion tensor magnetic resonance images (DT-MRIs). We apply this framework to a small database of nine ex vivo canine hearts. An average cardiac fiber architecture and a measure of its variability are computed using most recent advances in diffusion tensor statistics. This statistical analysis confirms the already established good stability of the fiber orientations and a higher variability of the laminar sheet orientations within a given species. The statistical comparison between the canine atlas and a standard human cardiac DT-MRI shows a better stability of the fiber orientations than their laminar sheet orientations between the two species. The proposed computational framework can be applied to larger databases of cardiac DT-MRIs from various species to better establish intraspecies and interspecies statistics on the anatomical structure of cardiac fibers. This information will be useful to guide the adjustment of average fiber models onto specific patients from in vivo anatomical imaging modalities.

  9. Securing Oracle Database from Search Engines Attack

    OpenAIRE

    N. M. A. Ayad; H. M. Klash; S. Sorour

    2012-01-01

    Database security has recently become a victim of misused search engines. An attack can be staged simply by searching for a URL containing the name of a vulnerable web page or application. Oracle ships several sample web applications along with its databases. The security holes in these applications allow a web user to exploit SQL injection to submit arbitrary SQL statements to the database. These applications are enabled by default, listen on port 7777, and are known to be vulnerable to...
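    The attack class in this abstract is SQL injection reached through web-facing sample applications. The standard defense is parameterized queries, where user input is bound as data rather than spliced into SQL text. A minimal sketch using stdlib `sqlite3` as a stand-in for Oracle; the principle is the same.

```python
# Demonstrate why string-built SQL is injectable and bound parameters are not.
# sqlite3 stands in for Oracle here; table and payload are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

payload = "' OR '1'='1"     # classic injection string

# Vulnerable: the payload becomes part of the SQL text and matches every row.
unsafe = conn.execute(
    f"SELECT secret FROM users WHERE name = '{payload}'"
).fetchall()

# Safe: the payload is bound as data via a placeholder and matches nothing.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (payload,)
).fetchall()
print(len(unsafe), len(safe))    # -> 1 0
```

    The same discipline, placeholders in every statement that touches user input, is what the vulnerable sample applications described above lack.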

  10. Geoscientific (GEO) database of the Andra Meuse / Haute-Marne research center

    International Nuclear Information System (INIS)

    Document available in extended abstract form only. The GEO database (geoscientific database of the Meuse/Haute-Marne Center) is a tool developed by Andra to gather, in a secure computer form, all data related to the acquisition of in situ and laboratory measurements made on solid and fluid samples. This database has three main functions: - Acquisition and management of data and computer files related to geological, geomechanical, hydrogeological and geochemical measurements on solid and fluid samples and in situ measurements (logging, on-sample measurements, geological logs, etc.). - Consultation by staff over Andra's intranet network, for selective viewing of data linked to a borehole and/or a sample and for making computations and graphs on sets of laboratory measurements related to a sample. - Physical management of fluid and solid samples stored in a 'core library', in order to localize a sample, follow its movement out of the 'core library' to an organization, and carry out regular inventories. The GEO database is a relational Oracle database. It is installed on a data server which stores the information and manages the users' transactions. Users can consult, download and exploit data from any computer connected to the Andra network or the Internet. Access rights are managed through a login/password. Four geoscientific applications are linked to the GEO database; they are: - The Geosciences portal: a web intranet application accessible from the Andra network. It does not require a particular installation on the client side and is accessible through an Internet navigator. A SQL Server Express database manages the users and access rights to the application. This application is used for the acquisition of hydrogeological and geochemical data collected in the field and on fluid samples, as well as data related to scientific work carried out at surface level or in drifts

  11. Application of Main Memory Database to Color Ring Back Tone Service

    Institute of Scientific and Technical Information of China (English)

    曹猗宣; 王晶

    2011-01-01

    Because of the I/O bottleneck, the conventional disk-resident database (DRDB) system has become increasingly unable to meet the needs of real-time, high-performance applications. Providing much better response time and transaction throughput than DRDBs, the main-memory database (MMDB) system is being used more and more widely in the telecom field. In this paper, after studying MMDB technology, we propose a modified database structure for the Color Ring Back Tone (CRBT) service, applying an MMDB to the CRBT service for the first time. Test results show that the performance of the CRBT application is effectively improved by the MMDB and that CPU utilization decreases, meaning the system can serve more users.
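The core idea of fronting a disk-resident store with a main-memory copy can be sketched as a write-through cache: reads are answered from memory when possible, and writes update both tiers so they stay consistent. The class and schema below are invented for illustration and are not the paper's actual CRBT database structure:

```python
import sqlite3

class RingToneCache:
    """Toy main-memory front end over a disk-resident store (write-through)."""

    def __init__(self, path=":memory:"):
        self.disk = sqlite3.connect(path)
        self.disk.execute(
            "CREATE TABLE IF NOT EXISTS tones (user TEXT PRIMARY KEY, tone TEXT)")
        self.mem = {}  # main-memory copy: answers reads without disk I/O

    def set_tone(self, user, tone):
        # write-through: update memory and disk together
        self.mem[user] = tone
        self.disk.execute("INSERT OR REPLACE INTO tones VALUES (?, ?)", (user, tone))

    def get_tone(self, user):
        if user in self.mem:            # fast path: no disk access at all
            return self.mem[user]
        row = self.disk.execute(
            "SELECT tone FROM tones WHERE user = ?", (user,)).fetchone()
        if row:
            self.mem[user] = row[0]     # warm the cache for later reads
        return row[0] if row else None

cache = RingToneCache()
cache.set_tone("user-001", "tone-42")
hit = cache.get_tone("user-001")        # served from memory
miss = cache.get_tone("user-999")       # not in memory or on disk
```

A production MMDB also handles transactions, recovery logging and eviction, which this sketch deliberately omits.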

  12. Client value models provide a framework for rational library planning (or, phrasing the answer in the form of a question).

    Science.gov (United States)

    Van Moorsel, Guillaume

    2005-01-01

    Libraries often do not know how clients value their product/service offerings. Yet at a time when the mounting costs of library support are increasingly difficult to justify to the parent institution, the library's ability to gauge the value of its offerings to clients has never been more critical. Client Value Models (CVMs) establish a common definition of value elements, or a "value vocabulary," for libraries and their clients, thereby providing a basis upon which to make rational planning decisions regarding product/service acquisition and development. The CVM concept is borrowed from business and industry, but its application is a natural fit in libraries. This article offers a theoretical consideration and practical illustration of CVM application in libraries.

  13. Behavior Analysis Of Malicious Web Pages Through Client Honeypot For Detection Of Drive-By-Download Malwares

    Directory of Open Access Journals (Sweden)

    Supinder Kaur

    2014-07-01

    Full Text Available Malware, also known as malicious software, spreads by exploiting client-side applications such as browsers and plug-ins. Attackers implant malware code in users' computers through web pages, which are therefore known as malicious web pages. In this paper, we present the usefulness of a controlled environment, in the form of client honeypots, for detecting malicious web pages: malicious content is first collected from web pages, and detailed analysis is then performed to validate and confirm that the pages are malicious. The first phase collects malicious infections through a high-interaction client honeypot; the second phase validates the malicious infections embedded in web pages through behavior-based analysis. Malware that infects client-side applications and drops payloads onto users' computers sometimes evades signature-based detection techniques, so there is a need to study the behavior of complete malicious web pages.

  14. A NOVEL APPROACH TO TRANSFORM CLASSICAL DATABASE TO USER FRIENDLY WEB DOCUMENT FOR MEDICAL APPLICATION IN PEER-TO-PEER ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    E. Anupriya,

    2010-11-01

    Full Text Available Due to the increasing popularity of XML (eXtensible Markup Language) as a common data standard for information interchange across the Web, XML is commonly used as an underlying data model for many applications to deal with the heterogeneity of data and nodes. This paper presents a novel approach to extract data from classical database systems such as relational database systems, convert it into XML documents, and exchange the XML documents among peer nodes in the network. Many hospitals have branches in different geographical locations. The chief doctors need to travel to different locations and give consultations. If the peer hospital nodes are connected in a peer-to-peer network, then the consultation can be provided from any peer hospital node, even in emergencies. The peer-to-peer network is implemented via the Byzantine-Resilient Secure Multicast Routing in Multihop Wireless Networks (BSMR) protocol, enhanced with security. We have also considered and emulated a system for one such medical application, in which the consultant can enter details such as the insulin dosage to be given, or any pre- or post-meal sugar measurements to be taken, for any inpatients in the peer hospitals. The duty doctor and nurses concerned can carry out the task based on these instructions at any peer hospital node. To make the information easily readable, it is presented in the browser as an XML document.
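The relational-to-XML conversion step described above can be sketched with the standard library: read a table's rows and column names, then emit one XML element per row. The table name and columns below are hypothetical; the paper's actual converter is not shown:

```python
import sqlite3
import xml.etree.ElementTree as ET

def table_to_xml(conn, table):
    """Render every row of `table` as an XML document (illustrative sketch)."""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]   # column names become element tags
    root = ET.Element(table)
    for row in cur.fetchall():
        rec = ET.SubElement(root, "record")
        for col, val in zip(cols, row):
            ET.SubElement(rec, col).text = str(val)
    return ET.tostring(root, encoding="unicode")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER, insulin_dose TEXT)")
conn.execute("INSERT INTO patients VALUES (7, '10 units pre-meal')")
xml_doc = table_to_xml(conn, "patients")
```

The resulting document can then be multicast among peer nodes and rendered directly by any browser.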

  15. Main Memory Database Applications in Distributed Systems

    Institute of Scientific and Technical Information of China (English)

    尤良

    2011-01-01

    Based on an analysis of main memory database architecture, data storage, data organization, and transaction management, this paper presents a case study of applying a main memory database in a distributed system. The proposed scheme addresses the data storage and access problems of systems with massive data volumes and high concurrency, and is especially suitable for large-scale Internet applications with complex data models and business rules.

  16. Perceptions towards IT Security in Online Banking: Croatian Clients vs. Clients of Bosnia and Herzegovina

    Directory of Open Access Journals (Sweden)

    Nedim Makarevic

    2016-01-01

    Full Text Available This study was completed to analyze and compare the perceptions of clients in Bosnia and Herzegovina and in Croatia about IT security in online banking, to provide insight into the similarities and differences of their viewpoints, and to create an important set of information for all subjects active in the banking industry. A survey based on six variables, with specific questions assigned to each variable, was prepared, and results for both countries were collected. The survey was completed in both Bosnia and Herzegovina and Croatia at high response rates: 207 respondents replied from Bosnia and Herzegovina and 203 from Croatia. Results were analyzed and presented using descriptive statistics. They indicated that Croatian e-banking users trust banks on the IT security of online banking much more than clients in Bosnia and Herzegovina do. It is important to mention that clients in Croatia perceive tangible features as highly significant, while Bosnian clients do not consider tangible features that important. This shows that Croatian clients are aware of potential security threats and know their share of responsibility when handling money online. On the other hand, results from Bosnia and Herzegovina indicated that Bosnian clients lack trust in online banking and lack awareness of the personal tangible aspects that can improve the security of their online banking experience. The main limitations of this study are its relatively small sample and rather generic approach; it may therefore be perceived as a pilot study for future researchers. The study's results may be of interest to marketers and managers of banks operating in Bosnia and Herzegovina and Croatia, who can learn more about their clients' perceptions of e-banking services.

  17. Spiritual Pain in Meals on Wheels’ Clients

    OpenAIRE

    Lisa Boss; Sandy Branson; Stanley Cron; Duck-Hee Kang

    2015-01-01

    Background: Meals on Wheels’ clients are at risk for spiritual pain due to advanced age, social isolation, and failing health. They are also prone to stress, depression, and loneliness, placing them at risk for adverse biological disruptions and health outcomes. The purpose of the study was to examine associations of spiritual pain with psychosocial factors (stress, depression, loneliness, religious coping) and salivary biomarkers of stress and inflammation (cortisol, IL-1β) in Meals on Wheel...

  18. A client/server approach to telemedicine.

    OpenAIRE

    Vaughan, B. J.; Torok, K. E.; Kelly, L. M.; Ewing, D J; Andrews, L. T.

    1995-01-01

    This paper describes the Medical College of Ohio's efforts in developing a client/server telemedicine system. Telemedicine vastly improves the ability of a medical center physician or specialist to interactively consult with a physician at a remote health care facility. The patient receives attention more quickly, he and his family do not need to travel long distances to obtain specialists' services, and the primary care physician can be involved in diagnosis and developing a treatment progra...

  19. Proposal and Implementation of SSH Client System Using Ajax

    Science.gov (United States)

    Kosuda, Yusuke; Sasaki, Ryoichi

    Technology called Ajax gives web applications the functionality and operability of desktop applications. In this study, we propose and implement a Secure Shell (SSH) client system using Ajax, independent of the OS or Java execution environment. In this system, SSH packets are generated in a web browser by using JavaScript, and a web server acts as a proxy in the communication with an SSH server to realize end-to-end SSH communication. We implemented a prototype program and confirmed by experiment that it runs on several web browsers and mobile phones. This system enables secure SSH communication from a PC at an Internet cafe or from any mobile phone. By measuring the processing performance, we verified satisfactory performance for emergency use, although the speed was unsatisfactory in some cases on mobile phones. The system proposed in this study will be effective in various fields of e-business.
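The proxy architecture above hinges on one property: the web server forwards SSH packet bytes unchanged, while the browser only encodes them for HTTP transport, so encryption stays end-to-end between browser and SSH server. A minimal sketch of that relay idea (the paper's client side is browser JavaScript; these function names and the toy packet are ours):

```python
import base64

def browser_to_wire(packet_bytes):
    """Browser side: SSH packet bytes are base64-encoded into an Ajax request body,
    since raw binary cannot travel safely in a text HTTP payload."""
    return base64.b64encode(packet_bytes).decode("ascii")

def proxy_relay(request_body):
    """Proxy side: decode the body and forward the raw bytes to the SSH server
    verbatim -- the proxy never inspects or decrypts them, preserving
    end-to-end SSH security."""
    return base64.b64decode(request_body)

# a made-up binary packet standing in for a real SSH_MSG payload
pkt = b"\x00\x00\x00\x0c\x0a\x15SSH-payload"
wire = browser_to_wire(pkt)    # what the Ajax POST would carry
relayed = proxy_relay(wire)    # what the proxy writes to the SSH socket
```

Because the relay is byte-for-byte, compromising the proxy reveals only ciphertext, which is the security argument the paper relies on.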

  20. Prospective validation of a comprehensive in silico hERG model and its applications to commercial compound and drug databases.

    Science.gov (United States)

    Doddareddy, Munikumar R; Klaasse, Elisabeth C; Shagufta; Ijzerman, Adriaan P; Bender, Andreas

    2010-05-01

    Ligand-based in silico hERG models were generated for 2 644 compounds using linear discriminant analysis (LDA) and support vector machines (SVM). As a result, the dataset used for the model generation is the largest publicly available (see Supporting Information). Extended connectivity fingerprints (ECFPs) and functional class fingerprints (FCFPs) were used to describe chemical space. All models showed area under curve (AUC) values ranging from 0.89 to 0.94 in a fivefold cross-validation, indicating high model consistency. Models correctly predicted 80 % of an additional, external test set; Y-scrambling was also performed to rule out chance correlation. Additionally, models based on patch clamp data and radioligand binding data were generated separately to analyze their predictive ability when compared to combined models. To experimentally validate the models, 50 of the predicted hERG blockers from the Chembridge database and ten of the predicted non-hERG blockers from an in-house compound library were selected for biological evaluation. Out of those 50 predicted hERG blockers, tested at a concentration of 10 microM, 18 compounds showed more than 50 % displacement of [(3)H]astemizole binding to cell membranes expressing the hERG channel. K(i) values of four of the selected binders were determined to be in the micromolar and high nanomolar range (K(i) (VH01)=2.0 microM, K(i) (VH06)=0.15 microM, K(i) (VH19)=1.1 microM and K(i) (VH47)=18 microM). Of these four compounds, VH01 and VH47 also showed a second, even higher affinity binding site with K(i) values of 7.4 nM and 36 nM, respectively. In the case of non-hERG blockers, all ten compounds tested were found to be inactive, showing less than 50 % displacement of [(3)H]astemizole binding at 10 microM. These experimentally validated models were then used to virtually screen commercial compound databases to evaluate whether they contain hERG blockers. 109 784 (23 %) of Chembridge, 133 175 (38 %) of Chemdiv, 111 737 (31
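The fingerprint-based models above compare molecules through their substructure bit vectors; a standard cheminformatics baseline for such comparisons is Tanimoto similarity over the sets of on-bits. A toy nearest-neighbour sketch under that assumption (the fingerprints here are invented; the paper's actual models are LDA and SVM classifiers over ECFP/FCFP fingerprints, not nearest neighbour):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto similarity between two fingerprints given as sets of on-bits."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def predict_blocker(query, labeled_fps):
    """Nearest-neighbour stand-in for a ligand-based classifier: label the
    query with the class of its most similar training fingerprint."""
    best_label, best_sim = None, -1.0
    for fp, label in labeled_fps:
        sim = tanimoto(query, fp)
        if sim > best_sim:
            best_label, best_sim = label, sim
    return best_label, best_sim

# hypothetical toy fingerprints (sets of hashed substructure bit positions)
train = [({1, 4, 9, 16}, "blocker"), ({2, 8, 32}, "non-blocker")]
label, sim = predict_blocker({1, 4, 9, 20}, train)
```

Real ECFP/FCFP fingerprints are produced by hashing circular atom environments (e.g. with RDKit), after which the same set-based similarity applies.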