WorldWideScience

Sample records for analysis browsing server

  1. NeuroTerrain – a client-server system for browsing 3D biomedical image data sets

    Directory of Open Access Journals (Sweden)

    Nissanov Jonathan

    2007-02-01

    Full Text Available Abstract Background Three dimensional biomedical image sets are becoming ubiquitous, along with the canonical atlases providing the necessary spatial context for analysis. To make full use of these 3D image sets, one must be able to present views for 2D display, either surface renderings or 2D cross-sections through the data. Typical display software is limited to presentations along one of the three orthogonal anatomical axes (coronal, horizontal, or sagittal). However, data sets precisely oriented along the major axes are rare. To make fullest use of these datasets, one must reasonably match the atlas' orientation; this involves resampling the atlas in planes matched to the data set. Traditionally, this requires that the atlas and browser reside on the user's desktop; unfortunately, in addition to being monolithic programs, these tools often require substantial local resources. In this article, we describe a network-capable, client-server framework to slice and visualize 3D atlases at off-axis angles, along with an open client architecture and development kit to support integration into complex data analysis environments. Results Here we describe the basic architecture of a client-server 3D visualization system, consisting of a thin Java client built on a development kit, and a computationally robust, high-performance server written in ANSI C++. The Java client components (NetOStat) support arbitrary-angle viewing and run on readily available desktop computers running Mac OS X, Windows XP, or Linux as a downloadable Java Application. Using the NeuroTerrain Software Development Kit (NT-SDK), sophisticated atlas browsing can be added to any Java-compatible application with as little as 50 lines of Java glue code, thus making it eminently reusable and much more accessible to programmers building more complex, biomedical data analysis tools. The NT-SDK separates the interactive GUI components from the server control and monitoring, so as to support
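
    To make the client-server idea concrete, here is a minimal Python sketch of a client requesting an arbitrary-angle slice from an atlas server. The wire format, field names, atlas identifier and port are hypothetical illustrations (the actual NT-SDK is a Java library and its protocol is not described in the abstract).

      import json
      import socket
      import struct

      # Hypothetical wire format: 4-byte big-endian length prefix followed by a JSON
      # payload describing the cutting plane; purely illustrative, not the NeuroTerrain protocol.

      def _read_exact(sock: socket.socket, n: int) -> bytes:
          buf = b""
          while len(buf) < n:
              chunk = sock.recv(n - len(buf))
              if not chunk:
                  raise ConnectionError("server closed the connection early")
              buf += chunk
          return buf

      def request_slice(host: str, port: int, origin, normal, size=(512, 512)) -> bytes:
          """Ask a (hypothetical) atlas server for a 2D slice at an arbitrary angle."""
          query = {
              "atlas": "mouse_brain",            # assumed atlas identifier
              "plane": {"origin": origin,         # point on the cutting plane (atlas coords)
                        "normal": normal},        # plane normal -> off-axis orientation
              "output": {"width": size[0], "height": size[1], "format": "raw8"},
          }
          payload = json.dumps(query).encode("utf-8")
          with socket.create_connection((host, port), timeout=30) as sock:
              sock.sendall(struct.pack(">I", len(payload)) + payload)
              (reply_len,) = struct.unpack(">I", _read_exact(sock, 4))
              return _read_exact(sock, reply_len)   # rendered slice as raw bytes

      if __name__ == "__main__":
          # Example: a slice tilted ~30 degrees off the coronal plane (values illustrative).
          img = request_slice("atlas.example.org", 9000,
                              origin=[0.0, 0.0, 0.0], normal=[0.0, 0.5, 0.866])
          print(f"received {len(img)} bytes of slice data")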

  2. Exploring the Concept of Browsing from the Literature Analysis

    Directory of Open Access Journals (Sweden)

    Shan-Ju L. Chang

    1997-12-01

    Full Text Available Browsing as a concept and an activity appears to be a fundamental part of human information behavior, which takes place in diverse contexts in our daily life. At the theoretical level, research on browsing can extend and develop theories of human information behavior. Practically, it can suggest better organization and representation of the information and material displayed, as well as more effective information seeking and retrieval. This thesis attempts to explore the browsing phenomenon as it appears in the library and information science literature and the end-user computing literature. Topics included for discussion are the definitions of the browsing concept, its potential consequences, and the typology and influential factors of browsing as identified from the literature analysis. [Article content in Chinese]

  3. MPEG-7 applications for video browsing and analysis

    Science.gov (United States)

    Divakaran, Ajay; Bober, Miroslaw; Asai, Kohtaro

    2001-11-01

    The soon to be released MPEG-7 standard provides a Multimedia Content Description Interface. In other words, it provides a rich set of tools to describe the content with a view to facilitating applications such as content based querying, browsing and searching of multimedia content. In this paper, we describe practical applications of MPEG-7 tools. We use descriptors of features such as color, shape and motion to both index and analyze the content. The aforementioned descriptors stem from our previous work and are currently in the draft international MPEG-7 standard. In our previous work, we have shown the efficacy of each of the descriptors individually. In this paper, we show how we combine color and motion to effectively browse video in our first application. In our second application, we show how we can combine shape and color to recognize objects in real time. We will present a demonstration of our system at the conference. We have already successfully demonstrated it to the Japanese press.
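
    As a rough illustration of combining colour and motion features to rank video segments for browsing, the Python sketch below computes a coarse colour histogram and a frame-difference motion-activity value. The functions, weights and the way the two features are mixed are simplified stand-ins invented here, not the MPEG-7 descriptors used in the paper.

      import numpy as np

      # Toy stand-ins for two MPEG-7-style descriptors: a coarse colour histogram and a
      # motion-activity value (mean absolute luminance difference between frames).

      def color_histogram(frame: np.ndarray, bins: int = 8) -> np.ndarray:
          """frame: H x W x 3 uint8 RGB image -> normalised joint colour histogram."""
          hist, _ = np.histogramdd(frame.reshape(-1, 3), bins=(bins,) * 3,
                                   range=((0, 256),) * 3)
          return hist.ravel() / hist.sum()

      def motion_activity(prev: np.ndarray, cur: np.ndarray) -> float:
          """Mean absolute luminance difference between consecutive frames."""
          to_luma = lambda f: f.astype(np.float32).mean(axis=2)
          return float(np.abs(to_luma(cur) - to_luma(prev)).mean())

      def segment_score(frames: list, query_hist: np.ndarray) -> float:
          """Combine colour similarity to a query with motion activity (ad-hoc weights)."""
          hist = color_histogram(frames[len(frames) // 2])
          color_sim = 1.0 - 0.5 * np.abs(hist - query_hist).sum()   # in [0, 1] for normalised hists
          activity = np.mean([motion_activity(a, b) for a, b in zip(frames, frames[1:])])
          return 0.7 * color_sim + 0.3 * (activity / 255.0)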

  4. Analysis of Users Web Browsing Behavior Using Markov chain Model

    Directory of Open Access Journals (Sweden)

    Diwakar Shukla

    2011-03-01

    Full Text Available In present days of growing information technology, many browsers are available for surfing and web mining, and a user may choose any of them to reach a desired website. Every browser has a pre-defined level of popularity and reputation in the market. This paper considers a setup of only two browsers in a computer system: a user prefers one of them and, if it fails, switches to the other. The user's behavior is modeled through a Markov chain and the transition probabilities are calculated. Quitting browsing is treated as a parameter of variation over the popularity. A graphical study is performed to explain the interrelationship between user-behavior parameters and browser market-popularity parameters. If a company's browser has the lowest failure rate and the lowest quitting probability, then that company enjoys better popularity and a larger user proportion.
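
    To make the modelling idea concrete, the sketch below sets up a toy two-browser Markov chain with failure and quitting probabilities and computes the expected number of visits to each browser before the user quits. All transition probabilities are invented for illustration, not taken from the paper.

      import numpy as np

      # States: 0 = browsing with browser A, 1 = browsing with browser B, 2 = quit.
      # With probability f_X browser X fails and the user switches to the other browser,
      # with probability q the user quits, otherwise the user stays with the same browser.
      f_a, f_b, q = 0.10, 0.20, 0.05      # illustrative values

      P = np.array([
          [1 - f_a - q, f_a,         q],   # from A
          [f_b,         1 - f_b - q, q],   # from B
          [0.0,         0.0,         1.0], # quit is absorbing
      ])

      # Expected number of visits to A and B before absorption (fundamental matrix).
      Q = P[:2, :2]                         # transient-to-transient block
      N = np.linalg.inv(np.eye(2) - Q)      # N[i, j] = expected visits to j starting in i
      start = np.array([0.5, 0.5])          # user initially picks either browser equally
      visits = start @ N
      print("expected visits before quitting:", dict(zip("AB", visits.round(2))))
      print("user-share proportions:", (visits / visits.sum()).round(3))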

  5. Research on Browsing Behavior in the Libraries: An Empirical Analysis of Consequences, Success and Influences

    Directory of Open Access Journals (Sweden)

    Shan-Ju L. Chang

    2000-12-01

    Full Text Available Browsing as an important part of human information behavior has been observed and investigated in the context of information seeking in the library in general and has assumed greater importance in human-machine interaction in particular. However, the nature and consequences of browsing are not well understood, and little is known of the success rate of such behavior. In this research, exploratory empirical case studies from three types of libraries were conducted, using questionnaires, observation logs, interviews, and computer search logs, to derive the empirical evidence to understand, from the user's point of view, what the consequences of browsing are, what constitutes successful browsing, and what factors influence the extent of browsing. Content analysis and statistical analysis were conducted to analyze and synthesize the data. The research results show: (1) There are nine categories of consequences of browsing, including accidental findings, modification of information need, finding the desirable information, learning, feeling relaxation/recreation, information gathering, keeping updated, satisfying curiosity, and not finding what is needed. (2) Four factors produce successful browsing: intention, the amount or quality of information, the utility of what is found, and help for solving a problem or making a judgment. (3) There are three types of reasons for unsuccessful browsing experiences: not finding what one wanted, inadequate volume or quality of information, and not finding anything useful or interesting. (4) There are three types of reasons for partial success: finding the intended object but not being happy with the quality or amount of information in it, not finding what one wanted but discovering new or potentially useful information, and not accomplishing one purpose but achieving another given multiple purposes. (5) The influential factors that affect the extent one engages in browsing include the browser's time, the scheme of information organization, proximity to

  6. Collaborative Browsing Based on WWW Proxy Server

    Institute of Scientific and Technical Information of China (English)

    王实; 高文; 杜建平; 李锦涛

    2002-01-01

    When a user accesses the Internet through a WWW proxy server, he does so with certain interests. The proxy server records his basic access information in its log. By mining the log, we can derive the user's interest in, and evaluation of, the web sites he visits; this interest and evaluation can be represented through his access time and access frequency for each site. If a user shows interest in certain web sites, other web sites accessed by users with the same interests can be recommended to him. The content of the web sites is not considered. This paper presents an approach to mining the proxy log, derives a user's evaluation of a web site, and employs a neighborhood-based collaborative filtering approach to provide recommendations.
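
    A minimal sketch of the idea above: build per-user interest scores for sites from visit frequency and dwell time taken from a proxy log, then recommend unvisited sites via neighbourhood-based collaborative filtering. The 0.5/0.5 weighting, cosine similarity and toy data are assumptions, not the paper's exact formulation.

      import numpy as np

      def interest_matrix(freq: np.ndarray, dwell: np.ndarray) -> np.ndarray:
          """Combine visit frequency and dwell time into a users x sites interest matrix."""
          norm = lambda m: m / (m.max(axis=1, keepdims=True) + 1e-9)
          return 0.5 * norm(freq) + 0.5 * norm(dwell)

      def recommend(scores: np.ndarray, user: int, k: int = 2, top_n: int = 3):
          """Recommend sites the user has not visited, weighted by the k most similar users."""
          sims = np.array([
              np.dot(scores[user], scores[v]) /
              (np.linalg.norm(scores[user]) * np.linalg.norm(scores[v]) + 1e-9)
              for v in range(len(scores))
          ])
          sims[user] = -1                                     # exclude the user themselves
          neighbours = sims.argsort()[::-1][:k]
          predicted = sims[neighbours] @ scores[neighbours]   # weighted neighbour scores
          predicted[scores[user] > 0] = -1                    # drop already-visited sites
          return predicted.argsort()[::-1][:top_n]

      # Example with 4 users and 5 sites (toy numbers: visits and seconds of dwell time).
      freq  = np.array([[5, 0, 2, 0, 1], [4, 1, 0, 0, 2], [0, 3, 0, 4, 0], [1, 0, 3, 0, 2]])
      dwell = np.array([[30, 0, 10, 0, 5], [25, 4, 0, 0, 9], [0, 20, 0, 35, 0], [3, 0, 15, 0, 8]])
      print("recommended site indices:", recommend(interest_matrix(freq, dwell), user=0))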

  7. Enabling Semantic Analysis of User Browsing Patterns in the Web of Data

    CERN Document Server

    Hoxha, Julia; Agarwal, Sudhir

    2012-01-01

    A useful step towards better interpretation and analysis of usage patterns is to formalize the semantics of the resources that users are accessing on the Web. We focus on this problem and present an approach for the semantic formalization of usage logs, which lays the basis for effective techniques of querying expressive usage patterns. We also present a query answering approach, which is useful to find in the logs expressive patterns of usage behavior via formulation of semantic and temporal-based constraints. We have processed over 30 thousand user browsing sessions extracted from usage logs of DBPedia and Semantic Web Dog Food. All these events are formalized semantically using respective domain ontologies and RDF representations of the Web resources being accessed. We show the effectiveness of our approach through experimental results, providing in this way an exploratory analysis of the way users browse the Web of Data.

  8. Advanced Techniques in Web Intelligence-2 Web User Browsing Behaviour and Preference Analysis

    CERN Document Server

    Palade, Vasile; Jain, Lakhmi

    2013-01-01

    This research volume focuses on analyzing the web user browsing behaviour and preferences in traditional web-based environments, social  networks and web 2.0 applications,  by using advanced  techniques in data acquisition, data processing, pattern extraction and  cognitive science for modeling the human actions.  The book is directed to  graduate students, researchers/scientists and engineers  interested in updating their knowledge with the recent trends in web user analysis, for developing the next generation of web-based systems and applications.

  9. Instant Microsoft SQL Server Analysis Services 2012 dimensions and cube

    CERN Document Server

    Acharya, Anurag

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. Written in a practical, friendly manner, this book will take you through the journey from installing SQL Server to developing your first cubes. "Microsoft SQL Server Analysis Services 2012 Dimensions and Cube Starter" is targeted at anyone who wants to get started with cube development in Microsoft SQL Server Analysis Services. Regardless of whether you are a SQL Server developer who knows nothing about cube development or SSAS or even OLAP, you

  10. Anonymous Web Browsing and Hosting

    Directory of Open Access Journals (Sweden)

    MANOJ KUMAR

    2013-02-01

    Full Text Available In today’s high-tech environment, every organization and individual computer user accesses web data over the Internet. Maintaining high confidentiality and security of that data requires secure web solutions. In this paper we describe dedicated anonymous web browsing solutions that make browsing faster and more secure. Web applications, which play an important role in transferring our secret information such as email, need more and more attention to security. This paper also describes how to choose safe web hosting solutions and which main functions provide more security for server data. Along with browser security, network security is also important; it can be implemented using cryptographic solutions, VPNs, and firewalls on the network. Hackers always try to steal our identity and data; they track our activities using network application software and perform harmful actions. So in this paper we also describe how to monitor them for security purposes.

  11. Spectral images browsing using principal component analysis and set partitioning in hierarchical tree

    Science.gov (United States)

    Ma, Long; Zhao, Deping

    2011-12-01

    Spectral imaging technology has been used mostly in remote sensing, but has recently been extended to new areas requiring high-fidelity color reproduction, such as telemedicine and e-commerce. These spectral imaging systems are important because they offer improved color reproduction quality not only for a standard observer under a particular illumination, but for any other individual exhibiting normal color vision capability under another illumination. A way of browsing such archives is therefore needed. In this paper, the authors present a new spectral image browsing architecture. The architecture for browsing is as follows: (1) The spectral domain of the spectral image is reduced with the PCA transform. As a result of the PCA transform, the eigenvectors and the eigenimages are obtained. (2) We quantize the eigenimages with the original bit depth of the spectral image (e.g. if the spectral image is originally 8-bit, the eigenimages are quantized to 8-bit), and use 32-bit floating-point numbers for the eigenvectors. (3) The first eigenimage is losslessly compressed with JPEG-LS; the other eigenimages are lossy-compressed with the wavelet-based SPIHT algorithm. For experimental evaluation, the following measures were used: PSNR as the measure of spectral accuracy, and ΔE for the evaluation of color reproducibility, where the standard illuminant D65 was used as the light source. To test the proposed method, we used the FOREST and CORAL spectral image databases, containing 12 and 10 spectral images, respectively. The images were acquired in the range of 403-696 nm; the size of the images was 128*128, the number of bands was 40 and the resolution was 8 bits per sample. Our experiments show the proposed compression method is suitable for browsing, i.e., for visual purposes.
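
    A minimal sketch of the spectral-dimension reduction in step (1) above: PCA across the 40 bands of a 128x128 cube, keeping a handful of eigenimages and checking reconstruction quality with PSNR. The random data, the component count and the omission of the JPEG-LS/SPIHT coding stage are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      cube = rng.integers(0, 256, size=(128, 128, 40)).astype(np.float64)  # stand-in spectral image

      pixels = cube.reshape(-1, 40)                 # one spectrum per pixel
      mean = pixels.mean(axis=0)
      centred = pixels - mean
      cov = centred.T @ centred / (len(centred) - 1)
      eigvals, eigvecs = np.linalg.eigh(cov)        # eigendecomposition (ascending order)
      order = eigvals.argsort()[::-1]
      basis = eigvecs[:, order[:5]]                 # keep the 5 leading eigenvectors

      eigenimages = (centred @ basis).reshape(128, 128, 5)   # what would then be compressed
      reconstructed = (eigenimages.reshape(-1, 5) @ basis.T + mean).reshape(cube.shape)

      mse = np.mean((cube - reconstructed) ** 2)
      psnr = 10 * np.log10(255.0 ** 2 / mse)
      print(f"spectral PSNR with 5 of 40 components: {psnr:.1f} dB")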

  12. Professional Microsoft SQL Server Analysis Services 2008 with MDX

    CERN Document Server

    Harinath, Sivakumar; Meenakshisundaram, Sethu

    2009-01-01

    When used with the MDX query language, SQL Server Analysis Services allows developers to build full-scale database applications to support such business functions as budgeting, forecasting, and market analysis. Shows readers how to build data warehouses and multi-dimensional databases, query databases, and use Analysis Services and other components of SQL Server to provide end-to-end solutions. Revised, updated, and enhanced, the book discusses new features such as improved integration with Office and Excel 2007; query performance enhancements; improvements to aggregation designer, dimension

  13. Instant SQL Server Analysis Services 2012 Cube Security

    CERN Document Server

    Jayanty, Satya SK

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. Instant Microsoft SQL Server Analysis Services 2012 Cube Security is a practical, hands-on guide that provides a number of clear, step-by-step exercises for getting started with cube security.This book is aimed at Database Administrators, Data Architects, and Systems Administrators who are managing the SQL Server data platform. It is also beneficial for analysis services developers who already have some experience with the technology, but who want to go into more detail on advanced

  14. Availability Analysis of Application Servers Using Software Rejuvenation and Virtualization

    Institute of Scientific and Technical Information of China (English)

    Thandar Thein; Jong Sou Park

    2009-01-01

    Demands on software reliability and availability have increased tremendously due to the nature of present-day applications. We focus on the software aspect of high availability for application servers, since server unavailability more often originates from software faults than from hardware faults. The software rejuvenation technique has been widely used to avoid the occurrence of unplanned failures, mainly due to the phenomena of software aging or transient failures. In this paper, we first present a new way of using virtual machine based software rejuvenation, named VMSR, to offer high availability for application server systems. Second, we model a single physical server that hosts multiple virtual machines (VMs) with the VMSR framework using stochastic modeling, and evaluate it through both numerical analysis and simulation with the SHARPE (Symbolic Hierarchical Automated Reliability and Performance Evaluator) tool. This VMSR model is very general and can capture application server characteristics, failure behavior, and performability measures. Our results demonstrate that the VMSR approach is a practical way to ensure uninterrupted availability and to optimize performance for aging applications.
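
    For readers unfamiliar with rejuvenation modelling, the sketch below solves a toy continuous-time Markov chain for a single ageing/rejuvenating/failing server and reports its steady-state availability. The states and rates are invented for illustration and are far simpler than the paper's VMSR/SHARPE model.

      import numpy as np

      # Toy CTMC: a server ages, is periodically rejuvenated, and occasionally fails.
      # Rates are per hour and purely illustrative.
      UP, AGED, REJUV, FAILED = range(4)
      Q = np.array([
          #  UP      AGED     REJUV    FAILED
          [-0.0100,  0.0100,  0.0000,  0.0000],   # healthy -> aged
          [ 0.0000, -0.0260,  0.0200,  0.0060],   # aged -> planned rejuvenation or failure
          [ 6.0000,  0.0000, -6.0000,  0.0000],   # rejuvenation completes (~10 min)
          [ 0.5000,  0.0000,  0.0000, -0.5000],   # repair after failure (~2 h)
      ])

      # Steady-state probabilities: solve pi Q = 0 with sum(pi) = 1.
      A = np.vstack([Q.T, np.ones(4)])
      b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)

      availability = pi[UP] + pi[AGED]          # the server still serves requests while aged
      print(f"steady-state availability: {availability:.5f}")
      print(f"expected downtime: {(1 - availability) * 8760:.1f} h/year")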

  15. Visual System for Browsing, Analysis, and Retrieval of Data (ViSBARD)

    Science.gov (United States)

    Roberts, Aaron; Boller, Ryan; Cornwell, Carl

    2012-01-01

    ViSBARD software provides a way of visualizing multiple vector and scalar quantities as measured by many spacecraft at once. The data are displayed three-dimensionally along the orbits that may be shown either as connected lines or as points. The data display allows the rapid determination of vector configurations, correlations among many measurements at multiple points, and global relationships. Things such as vector field rotations and dozens of simultaneous variables are very difficult to see in (complementary) panel plot representations. The current and next generations of space physics missions require a means to display from tens to hundreds of time series of data in such a way that the mind can comprehend them for the purposes of browsing data, retrieving them in directly useful form, and analyzing them in a global context. Sets of many spacecraft, each carrying many instruments yielding nearly continuous data at high time resolution, have become one of the most effective ways to make progress in understanding the extended, ionized (plasma) atmosphere of the Earth and the Sun. For large collections of data to be effective, they must be extremely readily accessible, with simple, comprehensible overviews of what is available. ViSBARD provides a means to answer these concerns. The ViSBARD package also acts as a remote repository browser; an interface to a Virtual Observatory. Therefore, data can be pulled directly into the application, as opposed to searching for it and downloading separately.

  16. Berkeley Phylogenomics Group web servers: resources for structural phylogenomic analysis.

    Science.gov (United States)

    Glanville, Jake Gunn; Kirshner, Dan; Krishnamurthy, Nandini; Sjölander, Kimmen

    2007-07-01

    Phylogenomic analysis addresses the limitations of function prediction based on annotation transfer, and has been shown to enable the highest accuracy in prediction of protein molecular function. The Berkeley Phylogenomics Group provides a series of web servers for phylogenomic analysis: classification of sequences to pre-computed families and subfamilies using the PhyloFacts Phylogenomic Encyclopedia, FlowerPower clustering of proteins sharing the same domain architecture, MUSCLE multiple sequence alignment, SATCHMO simultaneous alignment and tree construction and SCI-PHY subfamily identification. The PhyloBuilder web server provides an integrated phylogenomic pipeline starting with a user-supplied protein sequence, proceeding to homolog identification, multiple alignment, phylogenetic tree construction, subfamily identification and structure prediction. The Berkeley Phylogenomics Group resources are available at http://phylogenomics.berkeley.edu.

  17. SQL Server Analysis Services 2012 cube development cookbook

    CERN Document Server

    Dewald, Baya; Hughes, Steve

    2013-01-01

    A practical cookbook packed with recipes to help developers produce data cubes as quickly as possible by following step-by-step instructions, rather than explaining data mining concepts with SSAS. If you are a BI or ETL developer using SQL Server Analysis Services to build OLAP cubes, this book is ideal for you. Prior knowledge of relational databases and experience with Excel as well as SQL development is required.

  18. A user-friendly, dynamic web environment for remote data browsing and analysis of multiparametric geophysical data within the MULTIMO project

    Science.gov (United States)

    Carniel, Roberto; Di Cecca, Mauro; Jaquet, Olivier

    2006-05-01

    In the framework of the EU-funded project "Multi-disciplinary monitoring, modelling and forecasting of volcanic hazard" (MULTIMO), multiparametric data have been recorded at the MULTIMO station in Montserrat. Moreover, several other long time series, recorded at Montserrat and at other volcanoes, have been acquired in order to test stochastic and deterministic methodologies under development. Creating a general framework to handle data efficiently is a considerable task even for homogeneous data. In the case of heterogeneous data, this becomes a major issue. A need for a consistent way of browsing such a heterogeneous dataset in a user-friendly way therefore arose. Additionally, a framework for applying the calculation of the developed dynamical parameters on the data series was also needed in order to easily keep these parameters under control, e.g. for monitoring, research or forecasting purposes. The solution which we present is completely based on Open Source software, including the Linux operating system, MySQL database management system, Apache web server, Zope application server, Scilab math engine, Plone content management framework, and Unified Modelling Language. From the user point of view the main advantage is the possibility of browsing through datasets recorded on different volcanoes, with different instruments, with different sampling frequencies, stored in different formats, all via a consistent, user-friendly interface that transparently runs queries to the database, gets the data from the main storage units, generates the graphs and produces dynamically generated web pages to interact with the user. The involvement of third parties for continuing the development in the Open Source philosophy and/or extending the application fields is now sought.

  19. Concepts and Architecture of SQL Server Analysis Services

    Institute of Scientific and Technical Information of China (English)

    雷妍

    2007-01-01

    Microsoft SQL Server Analysis Services (SSAS) provides online analytical processing (OLAP) and data mining functionality for business intelligence applications. Analysis Services supports OLAP by letting you design, create, and manage multidimensional structures that contain data aggregated from other data sources, such as relational databases. For data mining applications, Analysis Services lets you design, create, and visualize data mining models constructed from other data sources by using a wide variety of industry-standard data mining algorithms.

  20. Reliability analysis of M/G/1 queues with general retrial times and server breakdowns

    Institute of Scientific and Technical Information of China (English)

    WANG Jinting

    2006-01-01

    This paper concerns reliability issues as well as the queueing analysis of M/G/1 retrial queues with general retrial times and a server subject to breakdowns and repairs. We assume that the server is unreliable and that customers who find the server busy or down are queued in the retrial orbit in accordance with a first-come-first-served discipline. Only the customer at the head of the orbit queue is allowed access to the server. The necessary and sufficient condition for the system to be stable is given. Using a supplementary variable method, we obtain the Laplace-Stieltjes transform of the reliability function of the server and a steady state solution for both queueing and reliability measures of interest. Some main reliability indexes, such as the availability, failure frequency, and the reliability function of the server, are obtained.

  1. Analysis of a multi-server queueing model of ABR

    Directory of Open Access Journals (Sweden)

    R. Núñez-Queija

    1998-01-01

    Full Text Available In this paper we present a queueing model for the performance analysis of Available Bit Rate (ABR) traffic in Asynchronous Transfer Mode (ATM) networks. We consider a multi-channel service station with two types of customers, denoted by high priority and low priority customers. In principle, high priority customers have preemptive priority over low priority customers, except on a fixed number of channels that are reserved for low priority traffic. The arrivals occur according to two independent Poisson processes, and service times are assumed to be exponentially distributed. Each high priority customer requires a single server, whereas low priority customers are served in processor sharing fashion. We derive the joint distribution of the numbers of customers (of both types) in the system in steady state. Numerical results illustrate the effect of high priority traffic on the service performance of low priority traffic.

  2. Cooperative Mobile Web Browsing

    Directory of Open Access Journals (Sweden)

    Zhang Q

    2009-01-01

    Full Text Available This paper advocates a novel approach for mobile web browsing based on cooperation among wireless devices within close proximity operating in a cellular environment. In the actual state of the art, mobile phones can access the web using different cellular technologies. However, the supported data rates are not sufficient to cope with the ever increasing traffic requirements resulting from advanced and rich content services. Extending the state of the art, higher data rates can only be achieved by increasing complexity, cost, and energy consumption of mobile phones. In contrast to the linear extension of current technology, we propose a novel architecture where mobile phones are grouped together in clusters, using a short-range communication such as Bluetooth, sharing, and accumulating their cellular capacity. The accumulated data rate resulting from collaborative interactions over short-range links can then be used for cooperative mobile web browsing. By implementing the cooperative web browsing on commercial mobile phones, it will be shown that better performance is achieved in terms of increased data rate and therefore reduced access times, resulting in a significantly enhanced web browsing user experience on mobile phones.

  3. SciServer Compute brings Analysis to Big Data in the Cloud

    Science.gov (United States)

    Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara

    2016-06-01

    SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally - but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts
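
    A minimal sketch of the notebook-style, server-side analysis described above, assuming the SciServer Python package's CasJobs module is available inside a SciServer Compute notebook. The function signature, database context name and table/column names below are assumptions to be checked against the current SciServer documentation.

      # Assumed to run inside a SciServer Compute Jupyter notebook.
      from SciServer import CasJobs       # assumed SciServer client library
      import matplotlib.pyplot as plt

      sql = """
      SELECT TOP 10000 p.ra, p.dec, p.r
      FROM PhotoObj AS p
      WHERE p.r BETWEEN 14 AND 18
      """
      # Run the query next to the data; the result is assumed to come back as a DataFrame.
      stars = CasJobs.executeQuery(sql=sql, context="DR14")

      plt.scatter(stars["ra"], stars["dec"], s=1)
      plt.xlabel("RA [deg]"); plt.ylabel("Dec [deg]")
      plt.savefig("sky_positions.png")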

  4. Performance analysis for queueing systems with close down periods and server under maintenance

    Science.gov (United States)

    Krishna Kumar, B.; Anbarasu, S.; Lakshmi, S. R. Anantha

    2015-01-01

    A single server queue subject to maintenance of the server and the close down period is considered. We obtain explicit expressions for the transient probabilities of the system size, the server under maintenance state and the close down period. The time-dependent performance measures of the system and the probability density function of the first-passage-time to reach the maintenance state are discussed. The corresponding steady state analysis and key performance measures of the system are also presented. Finally, the effect of various parameters on system performance measures is demonstrated by a numerical example.

  5. Analysis of free SSL/TLS Certificates and their implementation as Security Mechanism in Application Servers.

    Directory of Open Access Journals (Sweden)

    Mario E. Cueva Hurtado

    2017-02-01

    Full Text Available Security in the application layer (SSL) provides confidentiality, integrity, and authenticity of the data exchanged between two communicating applications. This article is the result of implementing free SSL/TLS certificates in application servers, determining the relevant characteristics that an SSL/TLS certificate must have and the Certifying Authority that generates it. A vulnerability analysis of the application servers is carried out, and an encrypted communication channel is established to protect against attacks such as man-in-the-middle and phishing and to maintain the integrity of the information transmitted between client and server.
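
    As a small companion to the certificate discussion above, the standard-library Python sketch below connects to a server, verifies its certificate chain and host name, and prints the issuer and expiry date. The host name is a placeholder, and the snippet does not cover installing free certificates on an application server.

      import socket
      import ssl
      from datetime import datetime

      def inspect_certificate(host: str, port: int = 443) -> None:
          """Connect over TLS, verify the chain and host name, and print certificate details."""
          context = ssl.create_default_context()          # verifies chain and host name
          with socket.create_connection((host, port), timeout=10) as raw:
              with context.wrap_socket(raw, server_hostname=host) as tls:
                  cert = tls.getpeercert()
                  print("TLS version :", tls.version())
                  print("issuer      :", dict(x[0] for x in cert["issuer"]))
                  print("subject     :", dict(x[0] for x in cert["subject"]))
                  expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
                  print("expires     :", expires.isoformat())

      if __name__ == "__main__":
          inspect_certificate("example.org")    # placeholder host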

  6. Mass scans from a proton transfer mass spectrometry analysis of air over Mediterranean shrubland browsed by horses.

    Science.gov (United States)

    Bartolome, Jordi; Penuelas, Josep; Filella, Iolanda; Llusia, Joan; Broncano, M Jose; Plaixats, Josefina

    2007-10-01

    Plants usually emit large amounts and varieties of volatiles after being damaged by herbivores. However, analytical methods for measuring herbivore-induced volatiles do not normally monitor the whole range of volatiles, and the response to large herbivores such as large mammals is much less studied than the response to other herbivores such as insects. In this paper we present the results of using a highly sensitive proton transfer reaction-mass spectrometry (PTR-MS) technique that allows simultaneous monitoring of leaf volatiles in the pptv range. The resulting mass scans in air over Mediterranean shrubland browsed by horses show 70 to 100% higher concentrations of the masses corresponding to mass fragments 57, 43 and 41 (mostly hexenals, acetone and acetic acid) than scans over control non-browsed shrubland. These compounds are biogeochemically active and they are significant components of the volatile organic carbon found in the atmosphere. They influence the performance of living organisms and the chemical and physical processes of Earth's atmosphere.

  7. maxdLoad2 and maxdBrowse: standards-compliant tools for microarray experimental annotation, data management and dissemination

    Directory of Open Access Journals (Sweden)

    Nashar Karim

    2005-11-01

    multiple interfaces to the contents of maxd databases. maxdBrowse emulates many of the browse and search features available in the maxdLoad2 application via a web-browser. This allows users who are not familiar with maxdLoad2 to browse and export microarray data from the database for their own analysis. The same browse and search features are also available via command-line and SOAP server interfaces. This both enables scripting of data export for use embedded in data repositories and analysis environments, and allows access to the maxd databases via web-service architectures. Conclusion maxdLoad2 http://www.bioinf.man.ac.uk/microarray/maxd/ and maxdBrowse http://dbk.ch.umist.ac.uk/maxdBrowse are portable and compatible with all common operating systems and major database servers. They provide a powerful, flexible package for annotation of microarray experiments and a convenient dissemination environment. They are available for download and open sourced under the Artistic License.

  8. Analysis of Marshal Server in VOCAL System

    Institute of Scientific and Technical Information of China (English)

    饶鹏

    2015-01-01

    VOCAL is an open-source VoIP system based on SIP. The Marshal server is the relay station for SIP messages in VOCAL, so analyzing the Marshal module helps in understanding how the VOCAL system handles SIP messages.

  9. AVAILABILITY ANALYSIS OF THE QUEUEING SYSTEM GI/PH/1 WITH SERVER BREAKDOWNS

    Institute of Scientific and Technical Information of China (English)

    YUAN Xueming; LI Wei

    2003-01-01

    In the existing literature on Repairable Queueing Systems (RQS), i.e., queueing systems with server breakdowns, it is almost always assumed that the interarrival times of successive customers are independent and identically exponentially distributed. In this paper, we deal with the more general GI/PH/1 system with exponential server uptime and phase-type repair time. Using matrix-analytic theory, we establish the equilibrium condition and the characteristics of the system, and derive the transient and stationary availability behavior of the system.

  10. Cooperative Mobile Web Browsing

    DEFF Research Database (Denmark)

    Perrucci, GP; Fitzek, FHP; Zhang, Qi

    2009-01-01

    This paper advocates a novel approach for mobile web browsing based on cooperation among wireless devices within close proximity operating in a cellular environment. In the actual state of the art, mobile phones can access the web using different cellular technologies. However, the supported data rates are not sufficient to cope with the ever increasing traffic requirements resulting from advanced and rich content services. Extending the state of the art, higher data rates can only be achieved by increasing complexity, cost, and energy consumption of mobile phones. In contrast to the linear extension of current technology, we propose a novel architecture where mobile phones are grouped together in clusters, using a short-range communication such as Bluetooth, sharing and accumulating their cellular capacity. The accumulated data rate resulting from collaborative interactions over short-range links can then be used for cooperative mobile web browsing.

  11. Microsoft® SQL Server® 2008 Analysis Services Step by Step

    CERN Document Server

    Cameron, Scott

    2009-01-01

    Teach yourself to use SQL Server 2008 Analysis Services for business intelligence-one step at a time. You'll start by building your understanding of the business intelligence platform enabled by SQL Server and the Microsoft Office System, highlighting the role of Analysis Services. Then, you'll create a simple multidimensional OLAP cube and progressively add features to help improve, secure, deploy, and maintain an Analysis Services database. You'll explore core Analysis Services 2008 features and capabilities, including dimension, cube, and aggregation design wizards; a new attribute relatio

  12. PlanetServer: Innovative approaches for the online analysis of hyperspectral satellite data from Mars

    Science.gov (United States)

    Oosthoek, J. H. P.; Flahaut, J.; Rossi, A. P.; Baumann, P.; Misev, D.; Campalani, P.; Unnithan, V.

    2014-06-01

    PlanetServer is a WebGIS system, currently under development, enabling the online analysis of Compact Reconnaissance Imaging Spectrometer (CRISM) hyperspectral data from Mars. It is part of the EarthServer project which builds infrastructure for online access and analysis of huge Earth Science datasets. Core functionality consists of the rasdaman Array Database Management System (DBMS) for storage, and the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) for data querying. Various WCPS queries have been designed to access spatial and spectral subsets of the CRISM data. The client WebGIS, consisting mainly of the OpenLayers javascript library, uses these queries to enable online spatial and spectral analysis. Currently the PlanetServer demonstration consists of two CRISM Full Resolution Target (FRT) observations, surrounding the NASA Curiosity rover landing site. A detailed analysis of one of these observations is performed in the Case Study section. The current PlanetServer functionality is described step by step, and is tested by focusing on detecting mineralogical evidence described in earlier Gale crater studies. Both the PlanetServer methodology and its possible use for mineralogical studies will be further discussed. Future work includes batch ingestion of CRISM data and further development of the WebGIS and analysis tools.
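
    To illustrate the kind of WCPS access described above, the sketch below posts a band-ratio query to a rasdaman-style endpoint from Python. The endpoint URL, coverage name, band identifiers and request parameter names are placeholders/assumptions; only the general "for ... return encode(...)" query shape follows the OGC WCPS language, and the exact band-access syntax depends on the deployment.

      import requests

      WCPS_ENDPOINT = "http://planetserver.example.org/rasdaman/ows"   # placeholder URL

      # A band-ratio style expression over a CRISM cube, encoded as CSV for plotting.
      query = """
      for c in ( FRT00003E12_07_IF166L_TRR3 )
      return encode( (float) c.band_233 / c.band_078, "csv" )
      """

      response = requests.post(WCPS_ENDPOINT,
                               data={"service": "WCS", "version": "2.0.1",
                                     "request": "ProcessCoverages", "query": query},
                               timeout=60)
      response.raise_for_status()
      print(response.text[:200])   # first values of the computed ratio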

  13. Software Aging Analysis of Web Server Using Neural Networks

    Directory of Open Access Journals (Sweden)

    G.Sumathi

    2012-05-01

    Full Text Available Software aging is a phenomenon that refers to progressive performance degradation, transient failures or even crashes in long-running software systems such as web servers. It mainly occurs due to the deterioration of operating system resources, fragmentation and numerical error accumulation. A primitive method to fight against software aging is software rejuvenation. Software rejuvenation is a proactive fault management technique aimed at cleaning up the system's internal state to prevent the occurrence of more severe crash failures in the future. It involves occasionally stopping the running software, cleaning its internal state and restarting it. An optimized schedule for performing software rejuvenation has to be derived in advance, because a long-running application cannot be taken down arbitrarily, as this may lead to wasted cost. This paper proposes a method to derive an accurate and optimized rejuvenation schedule for a web server (Apache) by using a Radial Basis Function (RBF) based Feed-Forward Neural Network, a variant of Artificial Neural Networks (ANN). Aging indicators are obtained through an experimental setup involving the Apache web server and clients, and act as input to the neural network model. This method is better than existing ones because the use of RBF leads to better accuracy and faster convergence.
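
    The sketch below implements a small radial-basis-function network from scratch (Gaussian hidden units, least-squares output weights) and fits it to a synthetic "memory usage" series standing in for the paper's Apache aging indicators. The centres, widths and the synthetic data are illustrative choices, not the paper's configuration.

      import numpy as np

      rng = np.random.default_rng(1)
      t = np.arange(500, dtype=float)
      memory = 0.05 * t + 5 * np.sin(t / 20) + rng.normal(0, 1, t.size)   # fake aging trend

      # Build (lagged window -> next value) training pairs.
      window = 10
      X = np.array([memory[i:i + window] for i in range(len(memory) - window)])
      y = memory[window:]

      # RBF layer: centres chosen as a random subset of the inputs (k-means also works).
      centres = X[rng.choice(len(X), size=30, replace=False)]
      width = np.mean([np.linalg.norm(a - b) for a in centres for b in centres]) + 1e-9

      def hidden(inputs: np.ndarray) -> np.ndarray:
          """Gaussian activations of each input against every centre."""
          d = np.linalg.norm(inputs[:, None, :] - centres[None, :, :], axis=2)
          return np.exp(-(d / width) ** 2)

      H = hidden(X)
      weights, *_ = np.linalg.lstsq(H, y, rcond=None)      # fit the linear output layer

      pred = hidden(X[-1:]) @ weights
      print(f"predicted next memory-usage sample: {pred[0]:.2f}")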

  14. A method for determining the onset year of intense browsing

    Science.gov (United States)

    Keigley, R.B.; Frisina, M.R.; Fager, C.

    2003-01-01

    A survey based on browsing related architectures indicated that browsing level had increased at the Mt. Haggin Wildlife Management Area. We describe a technique for determining the year in which the increase in browsing level occurred. The technique is based on the analysis of stems old enough to have experienced the early period of light browsing; the onset year of intense browsing was determined by using dendrochronology to date the formation of twig clusters produced by intense browsing. Stems from 20 Geyer willow (Salix geyeriana Anderss.) plants were analyzed from each of 6 study sites. Mean onset years at the 6 sites ranged from 1983.1 to 1988.4; the mean onset year for all 6 sites was 1985.4 ?? 0.5 SE (N = 120). The reconstructed history was used to evaluate the relationship between moose (Alces alces) number and browse trend. From 1976 to 2000, the winter trend census of moose increased from 7 to 56. The onset of intense browsing in 1985 occurred when 23 moose were counted.

  15. Distributed analysis with CRAB: The client-server architecture evolution and commissioning

    Energy Technology Data Exchange (ETDEWEB)

    Codispoti, G.; /INFN, Bologna /Bologna U.; Cinquilli, M.; /INFN, Perugia; Fanfani, A.; /Bologna U.; Fanzago, F.; /CERN /INFN, CNAF; Farina, F.; /CERN /INFN, Milan Bicocca; Lacaprara, S.; /INFN, Legnaro; Miccio, V.; /CERN /INFN, CNAF; Spiga, D.; /CERN /INFN, Perugia /Perugia U.; Vaandering, E.; /Fermilab

    2008-01-01

    CRAB (CMS Remote Analysis Builder) is the tool used by CMS to enable running physics analysis in a transparent manner over data distributed across many sites. It abstracts out the interaction with the underlying batch farms, grid infrastructure and CMS workload management tools, such that it is easily usable by non-experts. CRAB can be used as a direct interface to the computing system or can delegate the user task to a server. Major efforts have been dedicated to the client-server system development, allowing the user to deal only with a simple and intuitive interface and to delegate all the work to a server. The server takes care of handling the user's jobs during the whole lifetime of the user's task. In particular, it takes care of data and resource discovery, process tracking and output handling. It also provides services such as automatic resubmission in case of failures, notification to the user of the task status, and automatic blacklisting of sites showing evident problems beyond what is provided by existing grid infrastructure. The CRAB Server architecture and its deployment will be presented, as well as the current status and future development. In addition, the experience in using the system for initial detector commissioning activities and data analysis will be summarized.

  16. Test Program for the Performance Analysis of DNS64 Servers

    Directory of Open Access Journals (Sweden)

    Gábor Lencse

    2015-09-01

    Full Text Available In our earlier research papers, bash shell scripts using the host Linux command were applied for testing the performance and stability of different DNS64 server implementations. Because of their inefficiency, a small multi-threaded C/C++ program (named dns64perf) was written which can directly send DNS AAAA record queries. After an introduction to the essential theoretical background about the structure of DNS messages and TCP/IP socket interface programming, the design decisions and implementation details of our DNS64 performance test program are disclosed. The efficiency of dns64perf is compared to that of the old method using bash shell scripts. The result is convincing: dns64perf can send at least 95 times more DNS AAAA record queries per second. The source code of dns64perf is published under the GNU GPLv3 license to support the work of other researchers in the field of testing the performance of DNS64 servers.
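
    As a much-simplified, single-threaded analogue of such a test program, the sketch below sends AAAA queries through a resolver pointed at a DNS64 server and reports the answered-query rate. It assumes the dnspython package and uses placeholder server and host names; it is not the dns64perf implementation.

      import time
      import dns.exception
      import dns.resolver          # dnspython (assumed installed)

      def aaaa_query_rate(server: str, names: list) -> float:
          """Send AAAA queries to one DNS64 server and return answered queries per second."""
          resolver = dns.resolver.Resolver(configure=False)
          resolver.nameservers = [server]           # the DNS64 server under test
          start = time.perf_counter()
          answered = 0
          for name in names:
              try:
                  resolver.resolve(name, "AAAA", lifetime=1.0)
                  answered += 1
              except dns.exception.DNSException:
                  pass                               # count only successful answers
          return answered / (time.perf_counter() - start)

      if __name__ == "__main__":
          # Placeholder test names in the style used by DNS64 benchmarks.
          names = [f"host{i:05d}.dns64perf.test" for i in range(1000)]
          print(f"{aaaa_query_rate('192.0.2.53', names):.1f} answered AAAA queries/s")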

  17. Active browsing using similarity pyramids

    Science.gov (United States)

    Chen, Jau-Yuen; Bouman, Charles A.; Dalton, John C.

    1998-12-01

    In this paper, we describe a new approach to managing large image databases, which we call active browsing. Active browsing integrates relevance feedback into the browsing environment, so that users can modify the database's organization to suit the desired task. Our method is based on a similarity pyramid data structure, which hierarchically organizes the database, so that it can be efficiently browsed. At coarse levels, the similarity pyramid allows users to view the database as large clusters of similar images. Alternatively, users can 'zoom into' finer levels to view individual images. We discuss relevance feedback for the browsing process, and argue that it is fundamentally different from relevance feedback for more traditional search-by-query tasks. We propose two fundamental operations for active browsing: pruning and reorganization. Both of these operations depend on a user-defined relevance set, which represents the image or set of images desired by the user. We present statistical methods for accurately pruning the database, and we propose a new 'worm hole' distance metric for reorganizing the database, so that members of the relevance set are grouped together.

  18. ProteMiner-SSM: a web server for efficient analysis of similar protein tertiary substructures

    Science.gov (United States)

    Chang, Darby Tien-Hau; Chen, Chien-Yu; Chung, Wen-Chin; Oyang, Yen-Jen; Juan, Hsueh-Fen; Huang, Hsuan-Cheng

    2004-01-01

    Analysis of protein–ligand interactions is a fundamental issue in drug design. As the detailed and accurate analysis of protein–ligand interactions involves calculation of binding free energy based on thermodynamics and even quantum mechanics, which is highly expensive in terms of computing time, conformational and structural analysis of proteins and ligands has been widely employed as a screening process in computer-aided drug design. In this paper, a web server called ProteMiner-SSM designed for efficient analysis of similar protein tertiary substructures is presented. In one experiment reported in this paper, the web server has been exploited to obtain some clues about a biochemical hypothesis. The main distinction in the software design of the web server is the filtering process incorporated to expedite the analysis. The filtering process extracts the residues located in the caves of the protein tertiary structure for analysis and operates with O(n log n) time complexity, where n is the number of residues in the protein. In comparison, the α-hull algorithm, which is a widely used algorithm in computer graphics for identifying those instances that are on the contour of a three-dimensional object, features O(n²) time complexity. Experimental results show that the filtering process presented in this paper is able to speed up the analysis by a factor ranging from 3.15 to 9.37 times. The ProteMiner-SSM web server can be found at http://proteminer.csie.ntu.edu.tw/. There is a mirror site at http://p4.sbl.bc.sinica.edu.tw/proteminer/. PMID:15215355

  19. SAGExplore: a web server for unambiguous tag mapping in serial analysis of gene expression oriented to gene discovery and annotation.

    Science.gov (United States)

    Norambuena, Tomás; Malig, Rodrigo; Melo, Francisco

    2007-07-01

    We describe a web server for the accurate mapping of experimental tags in serial analysis of gene expression (SAGE). The core of the server relies on a database of genomic virtual tags built by a recently described method that attempts to reduce the amount of ambiguous assignments for those tags that are not unique in the genome. The method provides a complete annotation of potential virtual SAGE tags within a genome, along with an estimation of their confidence for experimental observation that ranks tags that present multiple matches in the genome. The output of the server consists of a table in HTML format that contains links to a graphic representation of the results and to some external servers and databases, facilitating the tasks of analysis of gene expression and gene discovery. Also, a table in tab delimited text format is produced, allowing the user to export the results into custom databases and software for further analysis. The current server version provides the most accurate and complete SAGE tag mapping source that is available for the yeast organism. In the near future, this server will also allow the accurate mapping of experimental SAGE-tags from other model organisms such as human, mouse, frog and fly. The server is freely available on the web at: http://dna.bio.puc.cl/SAGExplore.html.

  20. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    Directory of Open Access Journals (Sweden)

    Lum Karl

    2011-03-01

    countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  1. Analysis of Internet Users’ Browsing Behavior

    Institute of Scientific and Technical Information of China (English)

    张亮; 赵娜

    2016-01-01

    In recent years, web usage mining has become a new research hotspot in the field of data mining. Web usage mining discovers the characteristics and potential regularities of user access behavior from web logs, which record large amounts of information about network users' behavior. Using real operational data from a college homepage, this paper carries out a comprehensive mining analysis of the site's log files by means of web usage mining, analyzing users' interest in the information content. From users' page-access data, the level of interest in each page is estimated and then used to improve the content and layout of the site.
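
    As a toy version of the log analysis described above, the sketch below parses Apache-style access-log lines and ranks pages by a simple interest score. The log format, example lines and the hits-times-visitors weighting are illustrative assumptions, not the paper's measure.

      import re
      from collections import Counter, defaultdict

      LOG_RE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "GET (?P<path>\S+) [^"]+" (?P<status>\d{3})')

      def page_interest(log_lines):
          """Rank pages by hits weighted by distinct visitors (illustrative measure)."""
          hits = Counter()
          visitors = defaultdict(set)
          for line in log_lines:
              m = LOG_RE.match(line)
              if not m or m.group("status") != "200":
                  continue                       # ignore errors and non-matching lines
              hits[m.group("path")] += 1
              visitors[m.group("path")].add(m.group("ip"))
          return sorted(((p, hits[p] * (1 + len(visitors[p]))) for p in hits),
                        key=lambda kv: kv[1], reverse=True)

      sample = [
          '10.0.0.1 - - [01/Mar/2016:10:00:01 +0800] "GET /index.html HTTP/1.1" 200 5120',
          '10.0.0.2 - - [01/Mar/2016:10:00:09 +0800] "GET /news/1.html HTTP/1.1" 200 2048',
          '10.0.0.1 - - [01/Mar/2016:10:01:15 +0800] "GET /news/1.html HTTP/1.1" 200 2048',
      ]
      for path, score in page_interest(sample):
          print(path, score)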

  2. WebMGA: a customizable web server for fast metagenomic sequence analysis

    Directory of Open Access Journals (Sweden)

    Niu Beifang

    2011-09-01

    Full Text Available Abstract Background The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers are facing tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; metagenomic annotation also involves a wide range of computational tools, which are difficult for common users to install and maintain. The tools provided by the few available web servers are also limited and have various constraints such as login requirements, long waiting times, and the inability to configure pipelines. Results We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for tasks such as ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contaminations, taxonomic analysis, and functional annotation. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. Conclusions WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  3. PepServe: a web server for peptide analysis, clustering and visualization

    Science.gov (United States)

    Alexandridou, Anastasia; Dovrolis, Nikolas; Tsangaris, George Th.; Nikita, Konstantina; Spyrou, George

    2011-01-01

    Peptides, either as protein fragments or as naturally occurring entities, are characterized by their sequence and functional features. Researchers often need to manage massive peptide lists concerning protein identification, biomarker discovery, bioactivity, immune response or other functionalities. We present a web server that manages peptide lists in terms of feature analysis as well as interactive clustering and visualization of the given peptides. PepServe is a useful tool for understanding the peptide feature distribution among a group of peptides. The PepServe web application is freely available at http://bioserver-1.bioacademy.gr/Bioserver/PepServe/. PMID:21572105

  4. mtDNA-Server: next-generation sequencing data analysis of human mitochondrial DNA in the cloud.

    Science.gov (United States)

    Weissensteiner, Hansi; Forer, Lukas; Fuchsberger, Christian; Schöpf, Bernd; Kloss-Brandstätter, Anita; Specht, Günther; Kronenberg, Florian; Schönherr, Sebastian

    2016-07-08

    Next generation sequencing (NGS) allows investigating mitochondrial DNA (mtDNA) characteristics such as heteroplasmy (i.e. intra-individual sequence variation) to a higher level of detail. While several pipelines for analyzing heteroplasmies exist, issues in usability, accuracy of results and interpreting final data limit their usage. Here we present mtDNA-Server, a scalable web server for the analysis of mtDNA studies of any size with a special focus on usability as well as reliable identification and quantification of heteroplasmic variants. The mtDNA-Server workflow includes parallel read alignment, heteroplasmy detection, artefact or contamination identification, variant annotation as well as several quality control metrics, often neglected in current mtDNA NGS studies. All computational steps are parallelized with Hadoop MapReduce and executed graphically with Cloudgene. We validated the underlying heteroplasmy and contamination detection model by generating four artificial sample mix-ups on two different NGS devices. Our evaluation data shows that mtDNA-Server detects heteroplasmies and artificial recombinations down to the 1% level with perfect specificity and outperforms existing approaches regarding sensitivity. mtDNA-Server is currently able to analyze the 1000G Phase 3 data (n = 2,504) in less than 5 h and is freely accessible at https://mtdna-server.uibk.ac.at.

  5. mtDNA-Server: next-generation sequencing data analysis of human mitochondrial DNA in the cloud

    Science.gov (United States)

    Weissensteiner, Hansi; Forer, Lukas; Fuchsberger, Christian; Schöpf, Bernd; Kloss-Brandstätter, Anita; Specht, Günther; Kronenberg, Florian; Schönherr, Sebastian

    2016-01-01

    Next generation sequencing (NGS) allows investigating mitochondrial DNA (mtDNA) characteristics such as heteroplasmy (i.e. intra-individual sequence variation) to a higher level of detail. While several pipelines for analyzing heteroplasmies exist, issues in usability, accuracy of results and interpreting final data limit their usage. Here we present mtDNA-Server, a scalable web server for the analysis of mtDNA studies of any size with a special focus on usability as well as reliable identification and quantification of heteroplasmic variants. The mtDNA-Server workflow includes parallel read alignment, heteroplasmy detection, artefact or contamination identification, variant annotation as well as several quality control metrics, often neglected in current mtDNA NGS studies. All computational steps are parallelized with Hadoop MapReduce and executed graphically with Cloudgene. We validated the underlying heteroplasmy and contamination detection model by generating four artificial sample mix-ups on two different NGS devices. Our evaluation data shows that mtDNA-Server detects heteroplasmies and artificial recombinations down to the 1% level with perfect specificity and outperforms existing approaches regarding sensitivity. mtDNA-Server is currently able to analyze the 1000G Phase 3 data (n = 2,504) in less than 5 h and is freely accessible at https://mtdna-server.uibk.ac.at. PMID:27084948

  6. Social Browsing & Information Filtering in Social Media

    CERN Document Server

    Lerman, Kristina

    2007-01-01

    Social networks are a prominent feature of many social media sites, a new generation of Web sites that allow users to create and share content. Sites such as Digg, Flickr, and Del.icio.us allow users to designate others as "friends" or "contacts" and provide a single-click interface to track friends' activity. How are these social networks used? Unlike pure social networking sites (e.g., LinkedIn and Facebook), which allow users to articulate their online professional and personal relationships, social media sites are not, for the most part, aimed at helping users create or foster online relationships. Instead, we claim that social media users create social networks to express their tastes and interests, and use them to filter the vast stream of new submissions to find interesting content. Social networks, in fact, facilitate new ways of interacting with information: what we call social browsing. Through an extensive analysis of data from Digg and Flickr, we show that social browsing is one of the primary usa...

  7. A Comprehensive Availability Modeling and Analysis of a Virtualized Servers System Using Stochastic Reward Nets

    Directory of Open Access Journals (Sweden)

    Tuan Anh Nguyen

    2014-01-01

    Full Text Available It is important to assess availability of virtualized systems in IT business infrastructures. Previous work on availability modeling and analysis of the virtualized systems used a simplified configuration and assumption in which only one virtual machine (VM) runs on a virtual machine monitor (VMM) hosted on a physical server. In this paper, we show a comprehensive availability model using stochastic reward nets (SRN). The model takes into account (i) the detailed failures and recovery behaviors of multiple VMs, (ii) various other failure modes and corresponding recovery behaviors (e.g., hardware faults, failure and recovery due to Mandelbugs and aging-related bugs), and (iii) dependency between different subcomponents (e.g., between physical host failure and VMM, etc.) in a virtualized servers system. We also show numerical analysis on steady state availability, downtime in hours per year, transaction loss, and sensitivity analysis. This model provides a new finding on how to increase system availability by combining both software rejuvenations at VM and VMM in a wise manner.
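
    The availability figures mentioned above come from a stochastic reward net; as a much simpler, purely illustrative stand-in, the following Python sketch solves a four-state continuous-time Markov chain for steady-state availability. All failure and repair rates are assumptions, not values from the paper.

      # Minimal CTMC availability sketch (assumed rates, not the paper's SRN model).
      # States: 0 = all up, 1 = VM failed, 2 = VMM failed, 3 = host failed.
      # The system is considered available only in state 0.
      import numpy as np

      lam_vm, lam_vmm, lam_host = 1 / 500.0, 1 / 2000.0, 1 / 8000.0   # failure rates (1/h), assumed
      mu_vm, mu_vmm, mu_host = 1 / 0.5, 1 / 2.0, 1 / 24.0             # repair rates (1/h), assumed

      Q = np.array([
          [-(lam_vm + lam_vmm + lam_host), lam_vm,  lam_vmm,  lam_host],
          [mu_vm,  -mu_vm,    0.0,       0.0],
          [mu_vmm,  0.0,     -mu_vmm,    0.0],
          [mu_host, 0.0,      0.0,      -mu_host],
      ])

      # Solve pi Q = 0 with sum(pi) = 1 by replacing one balance equation with normalization.
      A = Q.T.copy()
      A[-1, :] = 1.0
      b = np.zeros(4); b[-1] = 1.0
      pi = np.linalg.solve(A, b)

      availability = pi[0]
      print(f"steady-state availability = {availability:.6f}")
      print(f"downtime = {(1 - availability) * 8760:.2f} hours/year")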

  8. Data access and analysis with distributed federated data servers in climateprediction.net

    Directory of Open Access Journals (Sweden)

    N. Massey

    2006-01-01

    Full Text Available climateprediction.net is a large public resource distributed scientific computing project. Members of the public download and run a full-scale climate model, donate their computing time to a large perturbed physics ensemble experiment to forecast the climate in the 21st century and submit their results back to the project. The amount of data generated is large, consisting of tens of thousands of individual runs each in the order of tens of megabytes. The overall dataset is, therefore, in the order of terabytes. Access and analysis of the data is further complicated by the reliance on donated, distributed, federated data servers. This paper will discuss the problems encountered when the data required for even a simple analysis is spread across several servers and how webservice technology can be used; how different user interfaces with varying levels of complexity and flexibility can be presented to the application scientists, how using existing web technologies such as HTTP, SOAP, XML, HTML and CGI can engender the reuse of code across interfaces; and how application scientists can be notified of their analysis' progress and results in an asynchronous architecture.

  9. Shrub control by browsing: Targeting adult plants

    Science.gov (United States)

    da Silveira Pontes, Laíse; Magda, Danièle; Gleizes, Benoît; Agreil, Cyril

    2016-01-01

    Reconciling the well known benefits of shrubs for forage with environmental goals, whilst preventing their dominance, is a major challenge in rangeland management. Browsing may be an economical solution for shrubby rangelands as herbivore browsing has been shown to control juvenile shrub growth. Less convincing results have been obtained for adult plants, and long-term experiments are required to investigate the cumulative effects on adult plants. We therefore assessed the impact of different levels of browsing intensity on key demographic parameters for a major dominant shrub species (broom, Cytisus scoparius), focusing on adult plants. We assigned individual broom plants to one of three age classes: 3-5 years (young adults); 5-7 years (adults); and 7-9 years (mature adults). These plants were then left untouched or had 50% or 90% of their total edible stem biomass removed in simulated low-intensity and high-intensity browsing treatments, respectively. Morphological, survival and fecundity data were collected over a period of four years. Browsing affected the morphology of individual plants, promoting changes in subsequent regrowth, and decreasing seed production. The heavily browsed plants were 17% shorter, 32% narrower, and their twigs were 28% shorter. Light browsing seemed to control the growth of young adult plants more effectively than that of older plants. Reproductive output was considerably lower than for control plants after light browsing, and almost 100% lower after heavy browsing. High-intensity browsing had a major effect on survival causing high levels of plant mortality. We conclude that suitable browsing practices could be used to modify adult shrub demography in the management of shrub dominance and forage value.

  10. Analysis of practical backoff protocols for contention resolution with multiple servers

    Energy Technology Data Exchange (ETDEWEB)

    Goldberg, L.A. [Univ. of Warwick, Coventry (United Kingdom); MacKenzie, P.D. [Sandia National Lab., Albuquerque, NM (United States)

    1996-12-31

    Backoff protocols are probably the most widely used protocols for contention resolution in multiple access channels. In this paper, we analyze the stochastic behavior of backoff protocols for contention resolution among a set of clients and servers, each server being a multiple access channel that deals with contention like an Ethernet channel. We use the standard model in which each client generates requests for a given server according to a Bernoulli distribution with a specified mean. The client-server request rate of a system is the maximum over all client-server pairs (i, j) of the sum of all request rates associated with either client i or server j. Our main result is that any superlinear polynomial backoff protocol is stable for any multiple-server system with a sub-unit client-server request rate. We confirm the practical relevance of our result by demonstrating experimentally that the average waiting time of requests is very small when such a system is run with reasonably few clients and reasonably small request rates such as those that occur in actual Ethernets. Our result is the first proof of stability for any backoff protocol for contention resolution with multiple servers. Our result is also the first proof that any weakly acknowledgment based protocol is stable for contention resolution with multiple servers and such high request rates. Two special cases of our result are of interest. Hastad, Leighton and Rogoff have shown that for a single-server system with a sub-unit client-server request rate any modified superlinear polynomial backoff protocol is stable. These modified backoff protocols are similar to standard backoff protocols but require more random bits to implement. The special case of our result in which there is only one server extends the result of Hastad, Leighton and Rogoff to standard (practical) backoff protocols. Finally, our result applies to dynamic routing in optical networks.
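
    A toy simulation can make the stability claim above concrete. The Python sketch below is not the paper's proof, just an illustrative experiment: clients generate Bernoulli requests for several servers and, after a collision, back off for a random time drawn from a window that grows as the cube of the collision count (a superlinear polynomial backoff). All parameters are invented.

      # Toy simulation (illustrative only) of polynomial backoff with multiple servers:
      # each client generates requests for each server with a small Bernoulli probability,
      # and after a collision waits a random delay drawn from a window growing as (b+1)**3.
      import random

      def simulate(n_clients=20, n_servers=4, p_request=0.01, steps=50_000, seed=1):
          random.seed(seed)
          backoff = {}        # (client, server) -> number of collisions so far
          wait = {}           # (client, server) -> steps to wait before retrying
          born = {}           # (client, server) -> step at which the request was generated
          pending = set()     # requests waiting to be transmitted
          waiting_time = []
          for t in range(steps):
              for c in range(n_clients):
                  for s in range(n_servers):
                      if (c, s) not in pending and random.random() < p_request:
                          pending.add((c, s)); backoff[(c, s)] = 0; wait[(c, s)] = 0; born[(c, s)] = t
              # requests whose backoff timer expired attempt transmission this step
              attempts = {}
              for (c, s) in pending:
                  if wait[(c, s)] <= 0:
                      attempts.setdefault(s, []).append((c, s))
                  else:
                      wait[(c, s)] -= 1
              for s, reqs in attempts.items():
                  if len(reqs) == 1:                       # success: channel s had one sender
                      req = reqs[0]
                      pending.remove(req)
                      waiting_time.append(t - born[req])
                  else:                                    # collision: superlinear polynomial backoff
                      for req in reqs:
                          backoff[req] += 1
                          wait[req] = random.randint(0, (backoff[req] + 1) ** 3)
          return sum(waiting_time) / len(waiting_time)

      print("average waiting time (steps):", round(simulate(), 2))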

  11. Secure Environment for Internet Browsing

    Directory of Open Access Journals (Sweden)

    Alexandru Tudor Gavrilescu

    2014-03-01

    Full Text Available The Internet is used by a large proportion of the population, but unfortunately users' education regarding the instruments available is poor, resulting in potential information fraud, especially in the financial field. In this article I have approached a few simple, yet very important and frequently occurring problems regarding a secure environment for Internet browsing, proposing solutions for each of them. The security methods are: anti-phishing; the prevention of SQL injection, through the verification of the data given as input in the Address Bar and in the password fields and blocking access in case of a potential threat; a virtual keyboard for preventing the recording of pressed keys (key loggers); and the backup of credentials in a local file encrypted to prevent unauthorized access, the decryption of which is performed with a unique encryption key owned by the user.
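
    The last measure above, an encrypted local backup of credentials, can be sketched as follows. The article does not name a cipher, so this example assumes the Python 'cryptography' package and its Fernet recipe; file name and key handling are illustrative only.

      # Minimal sketch of an encrypted local credential backup (illustrative; the article
      # does not specify the cipher). Uses the 'cryptography' package's Fernet recipe.
      import json
      from cryptography.fernet import Fernet

      def save_credentials(creds: dict, path: str, key: bytes) -> None:
          token = Fernet(key).encrypt(json.dumps(creds).encode("utf-8"))
          with open(path, "wb") as f:
              f.write(token)

      def load_credentials(path: str, key: bytes) -> dict:
          with open(path, "rb") as f:
              token = f.read()
          return json.loads(Fernet(key).decrypt(token).decode("utf-8"))

      if __name__ == "__main__":
          key = Fernet.generate_key()          # the user's unique key; store it securely
          save_credentials({"example.com": {"user": "alice", "password": "s3cret"}},
                           "credentials.enc", key)
          print(load_credentials("credentials.enc", key))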

  12. Indexing, Browsing, and Searching of Digital Video.

    Science.gov (United States)

    Smeaton, Alan F.

    2004-01-01

    Presents a literature review that covers the following topics related to indexing, browsing, and searching of digital video: video coding and standards; conventional approaches to accessing digital video; automatically structuring and indexing digital video; searching, browsing, and summarization; measurement and evaluation of the effectiveness of…

  13. THttpServer class in ROOT

    Science.gov (United States)

    Adamczewski-Musch, Joern; Linev, Sergey

    2015-12-01

    The new THttpServer class in ROOT implements an HTTP server for arbitrary ROOT applications. It is based on the embeddable Civetweb HTTP server and provides direct access to all objects registered with the server. Object data can be provided in different formats: binary, XML, GIF/PNG, and JSON. A generic user interface for THttpServer has been implemented with HTML/JavaScript based on the JavaScript ROOT development. With any modern web browser one can list, display, and monitor the objects available on the server. THttpServer is used in the Go4 framework to provide an HTTP interface to the online analysis.

  14. QlikView Server and Publisher

    CERN Document Server

    Redmond, Stephen

    2014-01-01

    This is a comprehensive guide with a step-by-step approach that enables you to host and manage servers using QlikView Server and QlikView Publisher. If you are a server administrator wanting to learn about how to deploy QlikView Server for server management, analysis and testing, and QlikView Publisher for publishing of business content, then this is the perfect book for you. No prior experience with QlikView is expected.

  15. Detecting DDoS Attacks Against DNS Servers Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Hongyuan Wang

    2013-07-01

    Full Text Available Domain Name System (DNS) service is the basic support of the Internet, and its security plays a vital role in the entire Internet. Because DNS requests and responses are mostly UDP-based, and because large numbers of open recursive DNS servers exist, DNS is vulnerable to distributed denial of service (DDoS) attacks. Through the analysis of several aspects of these attacks, a novel approach to detect DDoS attacks is proposed based on characteristics of attack traffic (CAT) time series. The CAT time series are transformed into a multidimensional vector series and a support vector machine (SVM) classifier is applied to identify the attacks. The experiment results show that our approach can identify the state features of the abnormal flow due to the DDoS attacking flows, and detect DDoS attacks accurately.
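
    The final classification step described above can be illustrated with scikit-learn. The feature vectors below are synthetic stand-ins for the paper's CAT-derived vectors, so the numbers mean nothing beyond showing how an SVM classifier separates normal from attack traffic.

      # Generic sketch of the final classification step: feature vectors derived from
      # traffic time series are labelled normal/attack and fed to an SVM.
      # The synthetic features below stand in for the paper's CAT-derived vectors.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      normal = rng.normal(loc=[100, 0.5, 0.1], scale=[10, 0.05, 0.02], size=(500, 3))
      attack = rng.normal(loc=[900, 0.9, 0.6], scale=[150, 0.05, 0.1], size=(500, 3))
      X = np.vstack([normal, attack])                 # e.g. query rate, UDP ratio, source entropy
      y = np.array([0] * 500 + [1] * 500)             # 0 = normal, 1 = DDoS

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      clf.fit(X_tr, y_tr)
      print("test accuracy:", round(clf.score(X_te, y_te), 3))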

  16. Detecting DDoS Attacks against Web Server Using Time Series Analysis

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Distributed Denial of Service (DDoS) attack is a major threat to the availability of Web service. The inherent presence of self-similarity in Web traffic motivates the applicability of time series analysis in the study of the burst feature of DDoS attack. This paper presents a method of detecting DDoS attacks against Web server by analyzing the abrupt change of time series data obtained from Web traffic. Time series data are specified in reference sliding window and test sliding window, and the abrupt change is modeled using Auto-Regressive (AR) process. By comparing two adjacent non-overlapping windows of the time series, the attack traffic could be detected at a time point. Combined with alarm correlation and location correlation, not only the presence of DDoS attack, but also its occurring time and location can be determined. The experimental results in a test environment are illustrated to justify our method.
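
    The sliding-window idea above can be sketched in a few lines of Python. The window lengths, the simple AR(1) fit and the alarm threshold are illustrative assumptions, not the paper's exact detector.

      # Sketch of sliding-window abrupt-change detection on a traffic time series:
      # fit a simple AR(1) model on a reference window, then flag a test window whose
      # prediction error is many times larger than the reference residual scale.
      import numpy as np

      def ar1_fit(x):
          """Least-squares AR(1) coefficient and residual std of series x."""
          a = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
          resid = x[1:] - a * x[:-1]
          return a, resid.std()

      def detect(series, ref_len=100, test_len=20, factor=5.0):
          alarms = []
          for t in range(ref_len, len(series) - test_len):
              a, sigma = ar1_fit(series[t - ref_len:t])
              test = series[t:t + test_len]
              err = np.abs(test[1:] - a * test[:-1]).max()
              if sigma > 0 and err > factor * sigma:
                  alarms.append(t)
          return alarms

      rng = np.random.default_rng(42)
      traffic = rng.poisson(lam=200, size=600).astype(float)
      traffic[400:] += 1500          # simulated flooding attack starting at t = 400
      print("first alarm at t =", detect(traffic)[0])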

  17. Disclosure-Protected Inference with Linked Microdata Using a Remote Analysis Server

    Directory of Open Access Journals (Sweden)

    Chipperfield James O.

    2014-03-01

    Full Text Available Large amounts of microdata are collected by data custodians in the form of censuses and administrative records. Often, data custodians will collect different information on the same individual. Many important questions can be answered by linking microdata collected by different data custodians. For this reason, there is very strong demand from analysts, within government, business, and universities, for linked microdata. However, many data custodians are legally obliged to ensure the risk of disclosing information about a person or organisation is acceptably low. Different authors have considered the problem of how to facilitate reliable statistical inference from analysis of linked microdata while ensuring that the risk of disclosure is acceptably low. This article considers the problem from the perspective of an Integrating Authority that, by definition, is trusted to link the microdata and to facilitate analysts’ access to the linked microdata via a remote server, which allows analysts to fit models and view the statistical output without being able to observe the underlying linked microdata. One disclosure risk that must be managed by an Integrating Authority is that one data custodian may use the microdata it supplied to the Integrating Authority and statistical output released from the remote server to disclose information about a person or organisation that was supplied by the other data custodian. This article considers analysis of only binary variables. The utility and disclosure risk of the proposed method are investigated both in a simulation and using a real example. This article shows that some popular protections against disclosure (dropping records, rounding regression coefficients or imposing restrictions on model selection can be ineffective in the above setting.

  18. DIANA-microT web server v5.0: service integration into miRNA functional analysis workflows.

    Science.gov (United States)

    Paraskevopoulou, Maria D; Georgakilas, Georgios; Kostoulas, Nikos; Vlachos, Ioannis S; Vergoulis, Thanasis; Reczko, Martin; Filippidis, Christos; Dalamagas, Theodore; Hatzigeorgiou, A G

    2013-07-01

    MicroRNAs (miRNAs) are small endogenous RNA molecules that regulate gene expression through mRNA degradation and/or translation repression, affecting many biological processes. DIANA-microT web server (http://www.microrna.gr/webServer) is dedicated to miRNA target prediction/functional analysis, and it is being widely used from the scientific community, since its initial launch in 2009. DIANA-microT v5.0, the new version of the microT server, has been significantly enhanced with an improved target prediction algorithm, DIANA-microT-CDS. It has been updated to incorporate miRBase version 18 and Ensembl version 69. The in silico-predicted miRNA-gene interactions in Homo sapiens, Mus musculus, Drosophila melanogaster and Caenorhabditis elegans exceed 11 million in total. The web server was completely redesigned, to host a series of sophisticated workflows, which can be used directly from the on-line web interface, enabling users without the necessary bioinformatics infrastructure to perform advanced multi-step functional miRNA analyses. For instance, one available pipeline performs miRNA target prediction using different thresholds and meta-analysis statistics, followed by pathway enrichment analysis. DIANA-microT web server v5.0 also supports a complete integration with the Taverna Workflow Management System (WMS), using the in-house developed DIANA-Taverna Plug-in. This plug-in provides ready-to-use modules for miRNA target prediction and functional analysis, which can be used to form advanced high-throughput analysis pipelines.

  19. Analysis of Windows Server 2012 for the platform to set up the LAN server%试析以Windows Server 2012为平台架设局域网服务器

    Institute of Scientific and Technical Information of China (English)

    丛佩丽

    2016-01-01

    With the development of network technology and information technology, paperless office has become a mainstream form of enterprise office, and it is especially important to build a local area network server. Taking the Windows Server 2012 network operating system as the platform, and from the perspective of a corporate network administrator, this paper elaborates how to build a DNS server, WWW server, DHCP (Dynamic Host Configuration Protocol) server and FTP (File Transfer Protocol) server to realize the construction of LAN servers.

  20. Browse quality and the Kenai moose population

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report covers the browse quality and the Kenai moose population. The quality of moose forage on the north western Kenai Peninsula was evaluated by determining...

  1. A Comprehensive Sensitivity Analysis of a Data Center Network with Server Virtualization for Business Continuity

    Directory of Open Access Journals (Sweden)

    Tuan Anh Nguyen

    2015-01-01

    Full Text Available Sensitivity assessment of availability for data center networks (DCNs) is of paramount importance in design and management of cloud computing based businesses. Previous work has presented a performance modeling and analysis of a fat-tree based DCN using queuing theory. In this paper, we present a comprehensive availability modeling and sensitivity analysis of a DCell-based DCN with server virtualization for business continuity using stochastic reward nets (SRN). We use SRN in modeling to capture complex behaviors and dependencies of the system in detail. The models take into account (i) two DCell configurations, respectively, composed of two and three physical hosts in a DCell0 unit, (ii) failure modes and corresponding recovery behaviors of hosts, switches, and VMs, and the VM live migration mechanism within and between DCell0s, and (iii) dependencies between subsystems (e.g., between a host and VMs and between switches and VMs in the same DCell0). The constructed SRN models are analyzed in detail with regard to various metrics of interest to investigate system's characteristics. A comprehensive sensitivity analysis of system availability is carried out in consideration of the major impacting parameters in order to observe the system's complicated behaviors and find the bottlenecks of system availability. The analysis results show the availability improvement, capability of fault tolerance, and business continuity of the DCNs complying with DCell network topology. This study provides a basis of designing and management of DCNs for business continuity.

  2. Empirical Analysis of Server Consolidation and Desktop Virtualization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2013-01-01

    Full Text Available The transition from physical servers to a virtual server infrastructure (VSI) and from desktop devices to a virtual desktop infrastructure (VDI) raises the crucial problems of server consolidation, virtualization performance, virtual machine density, total cost of ownership (TCO), and return on investment (ROI). Besides, how to appropriately choose a hypervisor for the desired server/desktop virtualization is really challenging, because a trade-off between virtualization performance and cost is a hard decision to make in the cloud. This paper introduces five hypervisors to establish the virtual environment and then gives a careful assessment based on the C/P ratio that is derived from a composite index, consolidation ratio, virtual machine density, TCO, and ROI. As a result, even though ESX Server obtains the highest ROI and lowest TCO in server virtualization and Hyper-V R2 gains the best performance in virtual machine management, both of them cost too much. Instead, the best choice is Proxmox Virtual Environment (Proxmox VE), because it not only saves much of the initial investment needed to own a virtual server/desktop infrastructure, but also obtains the lowest C/P ratio.

  3. DNA sequence chromatogram browsing using JAVA and CORBA.

    Science.gov (United States)

    Parsons, J D; Buehler, E; Hillier, L

    1999-03-01

    DNA sequence chromatograms (traces) are the primary data source for all large-scale genomic and expressed sequence tags (ESTs) sequencing projects. Access to the sequencing trace assists many later analyses, for example contig assembly and polymorphism detection, but obtaining and using traces is problematic. Traces are not collected and published centrally, they are much larger than the base calls derived from them, and viewing them requires the interactivity of a local graphical client with local data. To provide efficient global access to DNA traces, we developed a client/server system based on flexible Java components integrated into other applications including an applet for use in a WWW browser and a stand-alone trace viewer. Client/server interaction is facilitated by CORBA middleware which provides a well-defined interface, a naming service, and location independence. [The software is packaged as a Jar file available from the following URL: http://www.ebi.ac.uk/jparsons. Links to working examples of the trace viewers can be found at http://corba.ebi.ac.uk/EST. All the Washington University mouse EST traces are available for browsing at the same URL.

  4. Many-server queues with customer abandonment: Numerical analysis of their diffusion model

    Directory of Open Access Journals (Sweden)

    Shuangchi He

    2013-01-01

    Full Text Available We use a multidimensional diffusion process to approximate the dynamics of a queue served by many parallel servers. Waiting customers in this queue may abandon the system without service. To analyze the diffusion model, we develop a numerical algorithm for computing its stationary distribution. A crucial part of the algorithm is choosing an appropriate reference density. Using a conjecture on the tail behavior of the limit queue length process, we propose a systematic approach to constructing a reference density. With the proposed reference density, the algorithm is shown to converge quickly in numerical experiments. These experiments demonstrate that the diffusion model is a satisfactory approximation for many-server queues, sometimes for queues with as few as twenty servers.

  5. Design and Analysis of an Enhanced Patient-Server Mutual Authentication Protocol for Telecare Medical Information System.

    Science.gov (United States)

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Obaidat, Mohammad S

    2015-11-01

    In order to access a remote medical server, patients generally use a smart card to log in to the server. It has been observed that most user (patient) authentication protocols suffer from stolen smart card attacks, meaning that the attacker can mount several common attacks after extracting the smart card information. Recently, Lu et al. proposed a session key agreement protocol between the patient and the remote medical server and claimed that the protocol is secure against the relevant security attacks. However, this paper presents several security attacks on Lu et al.'s protocol, such as the identity trace attack, new smart card issue attack, patient impersonation attack and medical server impersonation attack. In order to fix the mentioned security pitfalls, including the stolen smart card attack, this paper proposes an efficient remote mutual authentication protocol using a smart card. We have then simulated the proposed protocol using the widely accepted AVISPA simulation tool, whose results confirm that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. Moreover, the rigorous security analysis proves that the proposed protocol provides strong protection against the relevant security attacks, including the stolen smart card attack. We compare the proposed scheme with several related schemes in terms of computation cost and communication cost as well as security functionalities. It has been observed that the proposed scheme is comparatively better than the related existing schemes.
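
    For readers unfamiliar with the building blocks of such schemes, the sketch below shows a generic HMAC-based challenge-response mutual authentication between a patient and a server sharing a secret; it is emphatically not the protocol proposed in the paper, only an illustration of the mutual-authentication idea, and the way the shared secret is obtained is an assumption.

      # Minimal challenge-response mutual authentication sketch (NOT the paper's protocol):
      # both sides share a secret and prove knowledge of it with HMACs over fresh nonces,
      # so neither password nor secret crosses the network.
      import hmac, hashlib, os

      SHARED_SECRET = os.urandom(32)      # assumed to be derived from smart card + password

      def prove(secret: bytes, *parts: bytes) -> bytes:
          return hmac.new(secret, b"|".join(parts), hashlib.sha256).digest()

      # 1. patient -> server: identity + nonce_p
      nonce_p = os.urandom(16)
      # 2. server -> patient: nonce_s + proof that the server knows the secret
      nonce_s = os.urandom(16)
      server_proof = prove(SHARED_SECRET, b"server", nonce_p, nonce_s)
      # 3. patient verifies the server, then answers with its own proof
      assert hmac.compare_digest(server_proof, prove(SHARED_SECRET, b"server", nonce_p, nonce_s))
      patient_proof = prove(SHARED_SECRET, b"patient", nonce_s, nonce_p)
      # 4. server verifies the patient; both sides can now derive a session key
      assert hmac.compare_digest(patient_proof, prove(SHARED_SECRET, b"patient", nonce_s, nonce_p))
      session_key = prove(SHARED_SECRET, b"session", nonce_p, nonce_s)
      print("mutual authentication succeeded, session key:", session_key.hex()[:16], "...")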

  6. Web Server Failure Analysis and Treatment Measures%Web服务器故障分析及处理措施

    Institute of Scientific and Technical Information of China (English)

    陈春晓

    2014-01-01

    Web服务器出现故障不仅会对网站的安全运行造成影响,还会影响到人们的正常使用,所以要及时维护和升级服务器,保证其正常运行。以B/S架构的PACS系统为例,其Web服务器采用的是WindowsIIS,客户端浏览器的使用需要WindowsIIS处于正常情况状态。如果WindowsIIS性能出现故障,就会影响PACS系统运行的可靠性。分析了几种故障,并提出解决方法,使Web服务器稳定运行。%Web server failure will not affect the safe operation of the site, but also affect people’s normal use, so in a timely manner to maintain and upgrade the server to ensure their normal operation. With B/S structure PACS system, for example, the Web server uses a WindowsIIS, the client browser is in use need WindowsIIS normal state. If WindowsIIS performance fails, it will affect the reliability of the PACS system operation. Analysis of several failures, and propose solutions to make the Web server and stable operation.

  7. A smartphone-based colorimetric reader coupled with a remote server for rapid on-site catechols analysis.

    Science.gov (United States)

    Wang, Yun; Li, Yuanyuan; Bao, Xu; Han, Juan; Xia, Jinchen; Tian, Xiaoyu; Ni, Liang

    2016-11-01

    The search of a practical method to analyze cis-diol-containing compounds outside laboratory settings remains a substantial scientific challenge. Herein, a smartphone-based colorimetric reader was coupled with a remote server for rapid on-site analysis of catechols. A smallest-scale 2×2 colorimetric sensor array composed of pH indicators and phenylboronic acid was configured. The array was able to distinguish 13 catechols at 6 serial concentrations, through simultaneous treatment via principal component analysis, hierarchical cluster analysis, and linear discriminant analysis. After both the discriminatory power of the array and the prediction ability of the partial least squares quantitative models were proved to be predominant, the smartphone was coupled to the remote server. All the ΔRGB data were uploaded to the remote server wherein linear discriminant analysis and partial least squares processing modules were established to provide qualitative discrimination and quantitative calculation, respectively, of the analytes in real time. The applicability of this novel method to a real-life scenario was confirmed by the on-site analysis of various catechols from a water sample of the Yangtze River; the feedback result in the smartphone showed the method was able to identify the catechols with 100% accuracy and predict the concentrations to within 0.706-2.240 standard deviation.
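
    The chemometric back end described above (LDA for qualitative discrimination, PLS for quantification) can be illustrated with scikit-learn on synthetic colour-response vectors; the signatures, noise levels and concentrations below are invented stand-ins for the paper's measurements.

      # Illustration of the chemometric back end: LDA for qualitative discrimination of
      # analytes and PLS regression for concentration prediction. The deltaRGB feature
      # vectors below are synthetic stand-ins, not the paper's data.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      n_features = 12                                      # 2x2 spots x RGB channels
      signatures = rng.uniform(-60, 60, (3, n_features))   # three hypothetical catechols

      # Qualitative part: classify which catechol produced a response pattern.
      X_cls = np.vstack([s + rng.normal(0, 5, (30, n_features)) for s in signatures])
      y_cls = np.repeat([0, 1, 2], 30)
      lda = LinearDiscriminantAnalysis().fit(X_cls, y_cls)
      print("LDA training accuracy:", lda.score(X_cls, y_cls))

      # Quantitative part: predict concentration for one analyte whose response
      # scales (approximately linearly) with concentration.
      conc = np.linspace(0.1, 2.0, 60)                     # mmol/L, synthetic
      X_reg = np.outer(conc, signatures[0]) + rng.normal(0, 3, (60, n_features))
      pls = PLSRegression(n_components=2).fit(X_reg, conc)
      print("PLS R^2:", round(pls.score(X_reg, conc), 3))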

  8. Abdominal aortic aneurysms: virtual imaging and analysis through a remote web server

    Energy Technology Data Exchange (ETDEWEB)

    Neri, Emanuele; Bargellini, Irene; Vignali, Claudio; Bartolozzi, Carlo [University of Pisa, Diagnostic and Interventional Radiology, Pisa (Italy); Rieger, Michael; Jaschke, Werner [University of Innsbruck, Diagnostic and Interventional Radiology, Innsbruck (Austria); Giachetti, Andrea; Tuveri, Massimiliano [Center for Research and Study, Sardinia (Italy)

    2005-02-01

    The study describes the application of a web-based software in the planning of the endovascular treatment of abdominal aortic aneurysms (AAA). The software has been developed in the framework of a 2-year research project called Aneurysm QUAntification Through an Internet Collaborative System (AQUATICS); it allows to manage remotely Virtual Reality Modeling Language (VRML) models of the abdominal aorta, derived from multirow computed tomography angiography (CTA) data sets, and to obtain measurements of diameters, angles and centerline lengths. To test the reliability of measurements, two radiologists performed a detailed analysis of multiple 3D models generated from a synthetic phantom, mimicking an AAA. The system was tested on 30 patients with AAA; CTA data sets were mailed and the time required for segmentation and measurement were collected for each case. The Bland-Altman plot analysis showed that the mean intra- and inter-observer differences in measures on phantoms were clinically acceptable. The mean time required for segmentation was 1 h (range 45-120 min). The mean time required for measurements on the web was 7 min (range 4-11 min). The AQUATICS web server may provide a rapid, standardized and accurate tool for the evaluation of AAA prior to the endovascular treatment. (orig.)

  9. FIDEA: a server for the functional interpretation of differential expression analysis.

    KAUST Repository

    D'Andrea, Daniel

    2013-06-10

    The results of differential expression analyses provide scientists with hundreds to thousands of differentially expressed genes that need to be interpreted in light of the biology of the specific system under study. This requires mapping the genes to functional classifications that can be, for example, the KEGG pathways or InterPro families they belong to, their GO Molecular Function, Biological Process or Cellular Component. A statistically significant overrepresentation of one or more category terms in the set of differentially expressed genes is an essential step for the interpretation of the biological significance of the results. Ideally, the analysis should be performed by scientists who are well acquainted with the biological problem, as they have a wealth of knowledge about the system and can, more easily than a bioinformatician, discover less obvious and, therefore, more interesting relationships. To allow experimentalists to explore their data in an easy and at the same time exhaustive fashion within a single tool and to test their hypothesis quickly and effortlessly, we developed FIDEA. The FIDEA server is located at http://www.biocomputing.it/fidea; it is free and open to all users, and there is no login requirement.
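
    The statistical core of such over-representation analyses is a one-sided hypergeometric (Fisher exact) test per category followed by multiple-testing correction. The sketch below, with invented gene counts, assumes scipy and statsmodels are available; it is a generic illustration, not FIDEA's implementation.

      # Category over-representation: a one-sided Fisher exact (hypergeometric) test per
      # category, followed by Benjamini-Hochberg correction. Counts are invented.
      from scipy.stats import fisher_exact
      from statsmodels.stats.multitest import multipletests

      background_size = 20000          # all annotated genes
      de_genes = 800                   # differentially expressed genes

      categories = {                   # category -> (genes in category, DE genes in category)
          "KEGG: apoptosis":       (140, 22),
          "GO: immune response":   (600, 55),
          "InterPro: kinase dom.": (900, 40),
      }

      pvals = []
      for name, (in_cat, de_in_cat) in categories.items():
          table = [[de_in_cat, de_genes - de_in_cat],
                   [in_cat - de_in_cat, background_size - de_genes - (in_cat - de_in_cat)]]
          _, p = fisher_exact(table, alternative="greater")
          pvals.append(p)

      rejected, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
      for (name, _), p, q, sig in zip(categories.items(), pvals, p_adj, rejected):
          print(f"{name:25s} p={p:.2e}  FDR={q:.2e}  enriched={sig}")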

  10. DeepBlue epigenomic data server: programmatic data retrieval and analysis of epigenome region sets.

    Science.gov (United States)

    Albrecht, Felipe; List, Markus; Bock, Christoph; Lengauer, Thomas

    2016-07-01

    Large amounts of epigenomic data are generated under the umbrella of the International Human Epigenome Consortium, which aims to establish 1000 reference epigenomes within the next few years. These data have the potential to unravel the complexity of epigenomic regulation. However, their effective use is hindered by the lack of flexible and easy-to-use methods for data retrieval. Extracting region sets of interest is a cumbersome task that involves several manual steps: identifying the relevant experiments, downloading the corresponding data files and filtering the region sets of interest. Here we present the DeepBlue Epigenomic Data Server, which streamlines epigenomic data analysis as well as software development. DeepBlue provides a comprehensive programmatic interface for finding, selecting, filtering, summarizing and downloading region sets. It contains data from four major epigenome projects, namely ENCODE, ROADMAP, BLUEPRINT and DEEP. DeepBlue comes with a user manual, examples and a well-documented application programming interface (API). The latter is accessed via the XML-RPC protocol supported by many programming languages. To demonstrate usage of the API and to enable convenient data retrieval for non-programmers, we offer an optional web interface. DeepBlue can be openly accessed at http://deepblue.mpi-inf.mpg.de.
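
    Because the API is exposed over XML-RPC, it can be reached from Python's standard library. In the sketch below the endpoint URL, the command name list_genomes and the anonymous user key are recalled from the DeepBlue documentation and should be treated as assumptions to verify against the current API reference.

      # Minimal XML-RPC access sketch. The endpoint URL, command name and anonymous key
      # are assumptions based on the DeepBlue documentation; check the API reference.
      import xmlrpc.client

      url = "http://deepblue.mpi-inf.mpg.de/xmlrpc"     # assumed endpoint
      user_key = "anonymous_key"                        # assumed public key for read access

      server = xmlrpc.client.ServerProxy(url, allow_none=True)

      status, genomes = server.list_genomes(user_key)   # assumed command name
      if status == "okay":
          for genome_id, name in genomes[:5]:
              print(genome_id, name)
      else:
          print("request failed:", genomes)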

  11. myPhyloDB: a local web server for the storage and analysis of metagenomic data.

    Science.gov (United States)

    Manter, Daniel K; Korsa, Matthew; Tebbe, Caleb; Delgado, Jorge A

    2016-01-01

    myPhyloDB v.1.1.2 is a user-friendly personal database with a browser-interface designed to facilitate the storage, processing, analysis, and distribution of microbial community populations (e.g. 16S metagenomics data). MyPhyloDB archives raw sequencing files, and allows for easy selection of project(s)/sample(s) of any combination from all available data in the database. The data processing capabilities of myPhyloDB are also flexible enough to allow the upload and storage of pre-processed data, or use the built-in Mothur pipeline to automate the processing of raw sequencing data. myPhyloDB provides several analytical (e.g. analysis of covariance, t-tests, linear regression, differential abundance (DESeq2), and principal coordinates analysis (PCoA)) and normalization (rarefaction, DESeq2, and proportion) tools for the comparative analysis of taxonomic abundance, species richness and species diversity for projects of various types (e.g. human-associated, human gut microbiome, air, soil, and water) for any taxonomic level(s) desired. Finally, since myPhyloDB is a local web-server, users can quickly distribute data between colleagues and end-users by simply granting others access to their personal myPhyloDB database. myPhyloDB is available at http://www.ars.usda.gov/services/software/download.htm?softwareid=472 and more information along with tutorials can be found on our website http://www.myphylodb.org. Database URL: http://www.myphylodb.org.

  12. Seeds, browse, and tooth wear: a sheep perspective.

    Science.gov (United States)

    Ramdarshan, Anusha; Blondel, Cécile; Brunetière, Noël; Francisco, Arthur; Gautier, Denis; Surault, Jérôme; Merceron, Gildas

    2016-08-01

    While grazing as a selective factor towards hypsodont dentition on mammals has gained a lot of attention, the importance of fruits and seeds as fallback resources for many browsing ungulates has caught much less attention. Controlled-food experiments, by reducing the dietary range, allow for a direct quantification of the effect of each type of items separately on enamel abrasion. We present the results of a dental microwear texture analysis on 40 ewes clustered into four different controlled diets: clover alone, and then three diets composed of clover together with either barley, corn, or chestnuts. Among the seed-eating groups, only the barley one shows higher complexity than the seed-free group. Canonical discriminant analysis is successful at correctly classifying the majority of clover- and seed-fed ewes. Although this study focuses on diets which all fall within a single dietary category (browse), the groups show variations in dental microwear textures in relation with the presence and the type of seeds. More than a matter of seed size and hardness, a high amount of kernels ingested per day is found to be correlated with high complexity on enamel molar facets. This highlights the high variability of the physical properties of the foods falling under the browsing umbrella.

  13. ProTSAV: A protein tertiary structure analysis and validation server.

    Science.gov (United States)

    Singh, Ankita; Kaushik, Rahul; Mishra, Avinash; Shanker, Asheesh; Jayaram, B

    2016-01-01

    Quality assessment of predicted model structures of proteins is as important as the protein tertiary structure prediction. A highly efficient quality assessment of predicted model structures directs further research on function. Here we present a new server ProTSAV, capable of evaluating predicted model structures based on some popular online servers and standalone tools. ProTSAV furnishes the user with a single quality score in case of individual protein structure along with a graphical representation and ranking in case of multiple protein structure assessment. The server is validated on ~64,446 protein structures including experimental structures from RCSB and predicted model structures for CASP targets and from public decoy sets. ProTSAV succeeds in predicting quality of protein structures with a specificity of 100% and a sensitivity of 98% on experimentally solved structures and achieves a specificity of 88% and a sensitivity of 91% on predicted protein structures of CASP11 targets under 2Å. The server overcomes the limitations of any single server/method and is seen to be robust in helping in quality assessment. ProTSAV is freely available at http://www.scfbio-iitd.res.in/software/proteomics/protsav.jsp.

  14. Giraffe browsing in response to plant traits

    Science.gov (United States)

    Mahenya, Obeid; Ndjamba, Johannes Kambinda; Mathisen, Karen Marie; Skarpe, Christina

    2016-08-01

    Intake rates by large herbivores are governed by among other things plant traits. We used Masai giraffe (Giraffa camelopardalis tippelskirchi Matschie) as study animals, testing whether they as very large browsers would follow the Jarman-Bell principle and maximize intake rate while tolerating low forage quality. We worked in Arusha National Park, Tanzania. We investigated how intake rate was determined by bite mass and bite rate, and show that bite mass and bite rate were determined by plant characteristics, governed by inherent plant traits, plant traits acquired from previous years' browsing, and season. We predicted that; (1) bite mass would be larger in trees without spines than with (2) bite mass would be larger in the wet season than in the dry, (3) bite rate would be higher in spinescent trees than in non-spinescent, (4) bite rate and/or bite mass would increase with previous years' browsing, (5) bite mass, bite rate or browsing time per tree would be highest for high trees with large, although still available canopies. Visual observations were used to collect data on tree attributes, number of bites taken and time of browsing. Sample size was 132 observed giraffe. We found that bite mass was larger in spineless than in spinescent trees and was larger in the wet season than in the dry. Bite rate, but not bite mass, increased with increasing browsing in previous years and was highest on two to three meter high trees and in spinescent trees. Intake rate followed bite mass more than bite rate and was higher in spineless than in spinescent trees, higher in the wet season than in the dry, and tended to increase with tree height. Giraffe did not prioritize the highest intake rate, but browsed much on Acacias giving a high quality diet but a low intake rate.

  15. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.

    Directory of Open Access Journals (Sweden)

    Mahdi Jalili

    Full Text Available Various disciplines are trying to solve one of the most noteworthy queries and broadly used concepts in biology, essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma separated value (CSV) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.
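
    The standalone CentiServer tool is the R package centiserve; purely to illustrate what a centrality calculation looks like, the Python sketch below computes a few classic indices with networkx on a small benchmark graph (it does not reproduce the 55 indices offered by the server).

      # Illustration only: computing a few classic centrality indices with networkx
      # (the CentiServer standalone tool itself is the R package 'centiserve').
      import networkx as nx

      G = nx.karate_club_graph()                 # a small benchmark social network

      centralities = {
          "degree":      nx.degree_centrality(G),
          "betweenness": nx.betweenness_centrality(G),
          "closeness":   nx.closeness_centrality(G),
          "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
      }

      for name, values in centralities.items():
          top = max(values, key=values.get)
          print(f"{name:12s} top node = {top:2d}  score = {values[top]:.3f}")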

  16. Mobile web browsing using the cloud

    CERN Document Server

    Zhao, Bo; Cao, Guohong

    2013-01-01

    This brief surveys existing techniques to address the problem of long delays and high power consumption for web browsing on smartphones, which can be due to the local computational limitation at the smartphone (e.g., running java scripts or flash objects) level. To address this issue, an architecture called Virtual-Machine based Proxy (VMP) is introduced, shifting the computing from smartphones to the VMP which may reside in the cloud. Mobile Web Browsing Using the Cloud illustrates the feasibility of deploying the proposed VMP system in 3G networks through a prototype using Xen virtual machin

  17. Colour appearance descriptors for image browsing and retrieval

    Science.gov (United States)

    Othman, Aniza; Martinez, Kirk

    2008-01-01

    In this paper, we focus on the development of whole-scene colour appearance descriptors for classification to be used in browsing applications. The descriptors can classify a whole-scene image into various categories of semantically-based colour appearance. Colour appearance is an important feature and has been extensively used in image-analysis, retrieval and classification. By using pre-existing global CIELAB colour histograms, firstly, we try to develop metrics for whole-scene colour appearance: "colour strength", "high/low lightness" and "multicoloured". Secondly we propose methods using these metrics either alone or combined to classify whole-scene images into five categories of appearance: strong, pastel, dark, pale and multicoloured. Experiments show positive results and that the global colour histogram is actually useful and can be used for whole-scene colour appearance classification. We have also conducted a small-scale human evaluation test on whole-scene colour appearance. The results show, with suitable threshold settings, the proposed methods can describe the whole-scene colour appearance of images close to human classification. The descriptors were tested on thousands of images from various scenes: paintings, natural scenes, objects, photographs and documents. The colour appearance classifications are being integrated into an image browsing system which allows them to also be used to refine browsing.

  18. Analysis of Server Log by Web Usage Mining for Website Improvement

    Directory of Open Access Journals (Sweden)

    Navin Kumar Tyagi

    2010-07-01

    Full Text Available Web server logs store clickstream data which can be useful for mining purposes. The data are stored as a result of users' accesses to a website. Web usage mining, an application of data mining, can be used to discover user access patterns from web log data. The obtained results are used in different applications like site modification, business intelligence, system improvement and personalization. In this study, we have analyzed the log files of the smart sync software web server to get information about visitors and top errors, which can be utilized by the system administrator and web designer to increase the effectiveness of the web site.
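
    A minimal version of this kind of log analysis is easy to sketch. The example below parses an access log in Common Log Format and counts visitors, error status codes and requested pages; the file name and the choice of fields are illustrative.

      # Sketch of extracting visitor counts and top error codes from an access log in
      # Common Log Format; the file name and the fields kept are illustrative.
      import re
      from collections import Counter

      LOG_LINE = re.compile(
          r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
          r'(?P<status>\d{3}) (?P<size>\S+)'
      )

      def analyze(path):
          visitors, errors, pages = Counter(), Counter(), Counter()
          with open(path, encoding="utf-8", errors="replace") as f:
              for line in f:
                  m = LOG_LINE.match(line)
                  if not m:
                      continue
                  visitors[m["ip"]] += 1
                  if m["status"].startswith(("4", "5")):
                      errors[m["status"]] += 1
                  parts = m["request"].split()
                  if len(parts) >= 2:
                      pages[parts[1]] += 1
          return visitors, errors, pages

      if __name__ == "__main__":
          visitors, errors, pages = analyze("access.log")
          print("top visitors:", visitors.most_common(5))
          print("top errors:  ", errors.most_common(5))
          print("top pages:   ", pages.most_common(5))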

  19. The RNAsnp web server

    DEFF Research Database (Denmark)

    Radhakrishnan, Sabarinathan; Tafer, Hakim; Seemann, Ernst Stefan;

    2013-01-01

    , are derived from extensive pre-computed tables of distributions of substitution effects as a function of gene length and GC content. Here, we present a web service that not only provides an interface for RNAsnp but also features a graphical output representation. In addition, the web server is connected...... to a local mirror of the UCSC genome browser database that enables the users to select the genomic sequences for analysis and visualize the results directly in the UCSC genome browser. The RNAsnp web server is freely available at: http://rth.dk/resources/rnasnp/....

  20. MUSTANG-MR structural sieving server: applications in protein structural analysis and crystallography.

    Directory of Open Access Journals (Sweden)

    Arun S Konagurthu

    Full Text Available BACKGROUND: A central tenet of structural biology is that related proteins of common function share structural similarity. This has key practical consequences for the derivation and analysis of protein structures, and is exploited by the process of "molecular sieving" whereby a common core is progressively distilled from a comparison of two or more protein structures. This paper reports a novel web server for "sieving" of protein structures, based on the multiple structural alignment program MUSTANG. METHODOLOGY/PRINCIPAL FINDINGS: "Sieved" models are generated from MUSTANG-generated multiple alignment and superpositions by iteratively filtering out noisy residue-residue correspondences, until the resultant correspondences in the models are optimally "superposable" under a threshold of RMSD. This residue-level sieving is also accompanied by iterative elimination of the poorly fitting structures from the input ensemble. Therefore, by varying the thresholds of RMSD and the cardinality of the ensemble, multiple sieved models are generated for a given multiple alignment and superposition from MUSTANG. To aid the identification of structurally conserved regions of functional importance in an ensemble of protein structures, Lesk-Hubbard graphs are generated, plotting the number of residue correspondences in a superposition as a function of its corresponding RMSD. The conserved "core" (typically the active site) shows a linear trend, which becomes exponential as divergent parts of the structure are included into the superposition. CONCLUSIONS: The application addresses two fundamental problems in structural biology: first, the identification of common substructures among structurally related proteins--an important problem in characterization and prediction of function; second, generation of sieved models with demonstrated uses in protein crystallographic structure determination using the technique of Molecular Replacement.

  1. Analysis of Markov-modulated infinite-server queues in the central-limit regime

    NARCIS (Netherlands)

    Blom, J.G.; Turck, K. de; Mandjes, M.R.H.

    2014-01-01

    This paper focuses on an infinite-server queue modulated by an independently evolving finite-state Markovian background process, with transition rate matrix $Q\equiv(q_{ij})_{i,j=1}^d$. Both arrival rates and service rates depend on the state of the background process. The main contributi

  2. Analysis of Multiserver Queueing System with Opportunistic Occupation and Reservation of Servers

    Directory of Open Access Journals (Sweden)

    Bin Sun

    2014-01-01

    Full Text Available We consider a multiserver queueing system with two input flows. Type-1 customers have preemptive priority and are lost during arrival only if all servers are occupied by type-1 customers. If all servers are occupied, but some provide service to type-2 customers, service of type-2 customer is terminated and type-1 customer occupies the server. If the number of busy servers is less than the threshold M during type-2 customer arrival epoch, this customer is accepted. Otherwise, it is lost or becomes a retrial customer. It will retry to obtain service. Type-2 customer whose service is terminated is lost or moves to the pool of retrial customers. The service time is exponentially distributed with the rate dependent on the customer’s type. Such queueing system is suitable for modeling cognitive radio. Type-1 customers are interpreted as requests generated by primary users. Type-2 customers are generated by secondary or cognitive users. The problem of optimal choice of the threshold M is the subject of this paper. Behavior of the system is described by the multidimensional Markov chain. Its generator, ergodicity condition, and stationary distribution are given. The system performance measures are obtained. The numerical results show the effectiveness of considered admission control.

  3. Medical Applications of Remote Electronic Browsing.

    Science.gov (United States)

    Chadwick, Joseph

    The purposes of this study are to identify and define viable remote browsing techniques and the requirements for an interactive medical information system that would permit the use of such techniques. The main emphasis is in the areas of: (1) remote viewing of page material; and (2) remote interrogation of fact banks with question-answering…

  4. Browsing a Database of Multimedia Learning Material.

    Science.gov (United States)

    Persico, Donatella; And Others

    1992-01-01

    Describes a project that addressed the problem of courseware reusability by developing a database structure suitable for organizing multimedia learning material in a given content domain. A prototype system that allows browsing a DBLM (Data Base of Learning Material) on earth science is described, and future plans are discussed. (five references)…

  5. A polling model with an autonomous server

    OpenAIRE

    2007-01-01

    Polling models are used as an analytical performance tool in several application areas. In these models, the focus often is on controlling the operation of the server as to optimize some performance measure. For several applications, controlling the server is not an issue as the server moves independently in the system. We present the analysis for such a polling model with a so-called autonomous server. In this model, the server remains for an exogenous random time at a queue, which also impl...

  6. APPLICATION OF THE SINGLE SERVER QUEUING SYSTEM ANALYSIS IN WOOD PRODUCTS INDUSTRY

    Directory of Open Access Journals (Sweden)

    Arif GÜRAY

    2001-01-01

    Full Text Available The aim of this study is to simulate the single-server queuing system (CNC) at a door-joinery facility. We simulated the system both by hand and by computer programming in the SIMAN language. From the obtained results, we aimed to provide some suggestions to the manager, because at the end of the study the simulation represented the real system under certain hypotheses. The results indicate that the simulated system will develop long queues in the future.
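
    The study itself uses SIMAN; as a rough stand-in, the Python sketch below simulates a single-server queue with exponential inter-arrival and service times and shows how the mean waiting time grows as utilisation approaches one. All rates are invented.

      # Rough stand-in for the SIMAN model: event-driven simulation of a single-server
      # queue with exponential inter-arrival and service times (rates are invented).
      import random

      def simulate_queue(arrival_rate=0.95, service_rate=1.0, n_customers=50_000, seed=7):
          random.seed(seed)
          t_arrival, server_free_at = 0.0, 0.0
          waits = []
          for _ in range(n_customers):
              t_arrival += random.expovariate(arrival_rate)
              start = max(t_arrival, server_free_at)        # wait if the server is busy
              waits.append(start - t_arrival)               # time spent in queue
              server_free_at = start + random.expovariate(service_rate)
          return sum(waits) / len(waits)

      for rho in (0.5, 0.8, 0.95):
          print(f"utilisation {rho:.2f}: mean wait = {simulate_queue(arrival_rate=rho):.2f}")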

  7. PRince: a web server for structural and physicochemical analysis of Protein-RNA interface

    Science.gov (United States)

    Barik, Amita; Mishra, Abhishek; Bahadur, Ranjit Prasad

    2012-01-01

    We have developed a web server, PRince, which analyzes the structural features and physicochemical properties of the protein–RNA interface. Users need to submit a PDB file containing the atomic coordinates of both the protein and the RNA molecules in complex form (in ‘.pdb’ format). They should also mention the chain identifiers of interacting protein and RNA molecules. The size of the protein–RNA interface is estimated by measuring the solvent accessible surface area buried in contact. For a given protein–RNA complex, PRince calculates structural, physicochemical and hydration properties of the interacting surfaces. All these parameters generated by the server are presented in a tabular format. The interacting surfaces can also be visualized with software plug-in like Jmol. In addition, the output files containing the list of the atomic coordinates of the interacting protein, RNA and interface water molecules can be downloaded. The parameters generated by PRince are novel, and users can correlate them with the experimentally determined biophysical and biochemical parameters for better understanding the specificity of the protein–RNA recognition process. This server will be continuously upgraded to include more parameters. PRince is publicly accessible and free for use. Available at http://www.facweb.iitkgp.ernet.in/~rbahadur/prince/home.html. PMID:22689640

  8. The PARIGA server for real time filtering and analysis of reciprocal BLAST results.

    Directory of Open Access Journals (Sweden)

    Massimiliano Orsini

    Full Text Available BLAST-based similarity searches are commonly used in several applications involving both nucleotide and protein sequences. These applications span from simple tasks such as mapping sequences over a database to more complex procedures such as clustering or annotation processes. When the amount of analysed data increases, manual inspection of BLAST results becomes a tedious procedure. Tools for parsing or filtering BLAST results for different purposes are then required. We describe here PARIGA (http://resources.bioinformatica.crs4.it/pariga/), a server that enables users to perform all-against-all BLAST searches on two sets of sequences selected by the user. Moreover, since it stores the two BLAST outputs in a python-serialized-objects database, results can be filtered according to several parameters in real-time fashion, without re-running the process and avoiding additional programming efforts. Results can be interrogated by the user using logical operations, for example to retrieve cases where two queries match the same targets, or when sequences from the two datasets are reciprocal best hits, or when a query matches a target in multiple regions. The Pariga web server is designed to be a helpful tool for managing the results of sequence similarity searches. The design and implementation of the server render all operations very fast and easy to use.
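
    One of the filters mentioned above, reciprocal best hits, can be reproduced directly from two tabular BLAST reports. The sketch below assumes the default 12-column tabular output (-outfmt 6) with the bit score in the last column; file names are placeholders.

      # Sketch of one PARIGA-style filter, reciprocal best hits, computed from two
      # tabular BLAST reports (-outfmt 6). File names are placeholders.
      def best_hits(blast_tab):
          """Return {query: (best_subject, bitscore)} from a tab-separated BLAST report."""
          best = {}
          with open(blast_tab) as f:
              for line in f:
                  cols = line.rstrip("\n").split("\t")
                  query, subject, bitscore = cols[0], cols[1], float(cols[11])
                  if query not in best or bitscore > best[query][1]:
                      best[query] = (subject, bitscore)
          return best

      def reciprocal_best_hits(a_vs_b, b_vs_a):
          ab, ba = best_hits(a_vs_b), best_hits(b_vs_a)
          return sorted((q, s) for q, (s, _) in ab.items()
                        if s in ba and ba[s][0] == q)

      if __name__ == "__main__":
          for query, subject in reciprocal_best_hits("setA_vs_setB.tab", "setB_vs_setA.tab"):
              print(query, subject)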

  9. Experimental Interfaces Involving Visual Grouping During Browsing

    Directory of Open Access Journals (Sweden)

    Stan Ruecker

    2006-11-01

    Full Text Available This paper provides a brief overview of a number of experimental interface design projects being carried out collaboratively by teams of researchers at the University of Alberta and elsewhere. One goal of this interface research is to explore the principles of rich-prospect browsing interfaces, which I have defined (Author 2003 as those where some meaningful representation of every item in a collection is combined with tools for manipulating the display. Often this manipulation is for the purposes of carrying out some portion of a research task: the interfaces lend themselves to exploratory and synthetic activities, such as knowledge discovery and hypothesis formulation. The projects summarized here begin with a browsing prototype originally designed for the task of pill identification (Given et al. 2005 but subsequently extended into a prototype for browsing conference delegates and other groups of people (Author et al. 2006. Another is a nuanced system based on the mandala (Cheypesh et al. 2006 intended for examining any collection that has been encoded with an XML schema, using combinations of attractors selected by the user from the available tags. Next is the set of specialized interfaces for the Orlando Project (Orlando Team 2006, intended to provide a set of discrete entry points into the deeply-encoded electronic history of women’s writing in the British Isles. Our project on tabular interfaces provides a variety of spaces designed to assist the user in using thesauri for multilingual query enhancement (Anvik et al. 2006. The final project described below is NORA (Unsworth 2004, which relies on the power of the D2K data-mining tools at the National Centre for Supercomputing Applications at the University of Illinois at Urbana-Champaign to give humanities scholars a workspace for exploring the system-identified features of common documents and further documents that have been recommended by the system. Each of these projects are

  10. Mining Fuzzy Weighted Browsing Patterns from Time Duration and with Linguistic Thresholds

    Directory of Open Access Journals (Sweden)

    Tzung P. Hong

    2008-01-01

    Full Text Available World-wide-web applications have grown very rapidly and have made a significant impact on computer systems. Among them, web browsing for useful information may be most commonly seen. Due to its tremendous amounts of use, efficient and effective web retrieval has become a very important research topic in this field. Techniques of web mining have thus been requested and developed to achieve this purpose. In this research, a new fuzzy weighted web-mining algorithm is proposed, which can process web-server logs to discover useful users' browsing behaviors from the time durations of the pages browsed. Since the time durations are numeric, fuzzy concepts are used here to process them and to form linguistic terms. Besides, different web pages may have different importance. The importance of web pages is evaluated by managers as linguistic terms, which are then transformed and averaged as fuzzy sets of weights. Each linguistic term is then weighted by the importance for its page. Only the linguistic term with the maximum cardinality for a page is chosen in later mining processes, thus reducing the time complexity. The minimum support is set as a linguistic term, which is more natural and understandable for human beings. An example is given to clearly illustrate the proposed approach.
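
    The fuzzification step described above can be sketched with triangular membership functions: a viewing duration is mapped to linguistic terms, the term with maximum membership is kept, and it is weighted by a manager-assigned page importance. The breakpoints and weights below are invented.

      # Sketch of the fuzzification step: converting a page-viewing duration (seconds)
      # into linguistic terms via triangular membership functions, then keeping the term
      # with maximum membership, weighted by a manager-assigned page importance.
      def triangular(x, a, b, c):
          """Triangular membership function with feet a, c and peak b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      TERMS = {                       # breakpoints in seconds (illustrative)
          "short":  (0, 10, 40),
          "middle": (20, 60, 100),
          "long":   (80, 140, 1e9),
      }

      PAGE_WEIGHT = {"/news": 0.6, "/products": 0.9}      # fuzzy importance, illustrative

      def fuzzify(duration, page):
          memberships = {term: triangular(duration, *abc) for term, abc in TERMS.items()}
          term = max(memberships, key=memberships.get)     # keep the max-membership term
          return term, memberships[term] * PAGE_WEIGHT.get(page, 0.5)

      print(fuzzify(35, "/products"))     # -> ('middle', weighted membership value)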

  11. Browsing the Internet: good-bye anonymity!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you consider browsing the Internet to be your private business? When visiting random web-sites, how far do you assume you are anonymous? Would it matter to you that Google or Facebook can profile your browsing behaviour in order to better target you with advertisements? Did you notice that you already get targeted ads when you are logged on to Google or Facebook even if you are visiting completely different websites? If this matters to you, note that browsing anonymously on the Internet is far from easy.   When you are connected to the Internet, you give away a variety of information: your PC's IP address, some browser settings like language or screen size, and, probably, your login information. So how private is private? You might argue that your current IP address has been picked from a pool of addresses and therefore regularly changes, so it does not necessarily always pinpoint you. On the other hand, with the dawn of IPv6 there is no need any more for shared IP addresses as the...

  12. MATLAB WEB SERVER AND ITS APPLICATION IN REMOTE COLLABORATIVE DESIGN OF MAGNETIC BEARING SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Keeping pace with the development of networking, MathWorks Inc. constructed the MATLAB Web Server environment, through which one can browse MATLAB calculations and plots directly over the Internet. The installation and use of the environment are introduced. A code built on the MATLAB platform, which performs modal analysis of a magnetic bearing system (MBS) supporting rotors with five degrees of freedom and considers the coupling of the thrust bearing with the radial bearings, is modified to work in the environment. The purpose is to realize a remote call of the code by users through the Internet for performance analysis of the system. Such an application is very important to the concurrent design of MBS and for the utilization of distributed knowledge-acquisition resources in collaborative design. The work on modification and realization is described and the results are discussed.

  13. An Authentication system of Web Services Based on Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    R. Joseph Manoj

    2014-01-01

    Full Text Available Authentication is a method which validates users’ identities prior to permitting them to access web services. To enhance the security of web services, providers follow a variety of authentication methods to restrict malicious users from accessing the services. This paper proposes a new authentication method which verifies a user’s identity by analyzing web server log files, which include the details of the requesting user’s IP address, username, password, date and time of request, status code, URL, etc., and checks for IP address spoofing using the ingress packet filtering method. This paper also analyses the resultant data and the performance of the proposed work.
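
    A rough sketch of the idea (Python; the log layout and the spoofing heuristic are assumptions for illustration, not the paper's implementation) is to build a per-user history of source IPs from earlier log entries and to flag a new request whose IP has never been seen for that username.

        # Sketch: flag requests whose source IP is unknown for the given username,
        # based on a history built from earlier web-server log entries.
        # The whitespace-separated log layout used here is an assumption.
        from collections import defaultdict

        history_log = [
            # ip            username  date        time      status  url
            "10.0.0.5       alice     2014-01-02  10:15:00  200     /orders",
            "10.0.0.5       alice     2014-01-03  09:01:12  200     /orders",
            "192.168.1.9    bob       2014-01-03  11:45:31  200     /reports",
        ]

        known_ips = defaultdict(set)
        for entry in history_log:
            ip, user, *_ = entry.split()
            known_ips[user].add(ip)

        def check_request(ip, user):
            """Return True if the (ip, user) pair is consistent with past behaviour."""
            if ip in known_ips[user]:
                return True
            # Unknown IP for this user: treat as possible spoofing, require re-authentication.
            return False

        print(check_request("10.0.0.5", "alice"))     # True  - seen before
        print(check_request("203.0.113.7", "alice"))  # False - never seen for alice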

  14. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part 1: Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    Full Text Available This is the first part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.
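
    As a concrete illustration of the data elements a web server log yields, the snippet below (Python; the example line follows the common NCSA combined log format, which may differ from any given server's configuration) pulls one entry apart into its fields.

        # Sketch: extract data elements from one line of an NCSA combined-format access log.
        import re

        line = ('192.0.2.44 - - [12/Mar/2007:14:03:22 -0330] '
                '"GET /catalogue/record/123 HTTP/1.1" 200 5120 '
                '"http://www.example.edu/search" "Mozilla/5.0"')

        pattern = re.compile(
            r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
            r'"(?P<method>\S+) (?P<path>\S+) \S+" '
            r'(?P<status>\d{3}) (?P<bytes>\S+) '
            r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
        )

        match = pattern.match(line)
        if match:
            for field, value in match.groupdict().items():
                print(f"{field:9s} {value}")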

  15. deepTools2: a next generation web server for deep-sequencing data analysis.

    Science.gov (United States)

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available.

  16. Caching Servers for ATLAS

    CERN Document Server

    Gardner, Robert; The ATLAS collaboration

    2017-01-01

    As many LHC Tier-3 and some Tier-2 centers look toward streamlining operations, they are considering autonomously managed storage elements as part of the solution. These storage elements are essentially file caching servers. They can operate as whole file or data block level caches. Several implementations exist. In this paper we explore using XRootD caching servers that can operate in either mode. They can also operate autonomously (i.e. demand driven), be centrally managed (i.e. a Rucio managed cache), or operate in both modes. We explore the pros and cons of various configurations as well as practical requirements for caching to be effective. While we focus on XRootD caches, the analysis should apply to other kinds of caches as well.

  17. Caching Servers for ATLAS

    CERN Document Server

    Gardner, Robert; The ATLAS collaboration

    2016-01-01

    As many Tier 3 and some Tier 2 centers look toward streamlining operations, they are considering autonomously managed storage elements as part of the solution. These storage elements are essentially file caching servers. They can operate as whole file or data block level caches. Several implementations exist. In this paper we explore using XRootD caching servers that can operate in either mode. They can also operate autonomously (i.e. demand driven), be centrally managed (i.e. a Rucio managed cache), or operate in both modes. We explore the pros and cons of various configurations as well as practical requirements for caching to be effective. While we focus on XRootD caches, the analysis should apply to other kinds of caches as well.

  18. Experimental Interfaces Involving Visual Grouping During Browsing

    Directory of Open Access Journals (Sweden)

    Dr Stan Ruecker

    2006-03-01

    Full Text Available This paper provides a brief overview of a number of experimental interface design projects being carried out collaboratively by teams of researchers at the University of Alberta and elsewhere. One goal of this interface research is to explore the principles of rich-prospect browsing interfaces, which I have defined (Ruecker 1) as those where some meaningful representation of every item in a collection is combined with tools for manipulating the display. Often this manipulation is for the purpose of carrying out some portion of a research task: the interfaces lend themselves to exploratory and synthetic activities, such as knowledge discovery and hypothesis formulation. The projects summarized here begin with a browsing prototype originally designed for the task of pill identification (Given et al.). This prototype was subsequently extended into a prototype for browsing conference delegates and other groups of people (Ruecker et al.). Another direction is represented by a nuanced system based on the mandala (Cheypesh et al.) intended for examining any collection that has been encoded with an XML schema. The Mandala Browser uses combinations of “magnetic axes” selected by the user from the available tags. Next is the set of specialized interfaces for the Orlando Project (Orlando Team), intended to provide a set of discrete entry points into the deeply-encoded electronic history of women’s writing in the British Isles. Our project on tabular interfaces provides a variety of spaces designed to assist the user in using thesauri for multilingual query enhancement (Anvik et al.). The final project described below is NORA (Unsworth), which relies on the power of the D2K data-mining tools at the National Centre for Supercomputing Applications at the University of Illinois at Urbana-Champaign. The goal of NORA is to give humanities scholars a workspace for exploring the system-identified features of common documents and further documents that have been recommended by the system.

  19. Efficient One-click Browsing of Large Trajectory Sets

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2014-01-01

    Traffic researchers, planners, and analysts want a simple way to query the large quantities of GPS trajectories collected from vehicles. In addition, users expect the results to be presented immediately even when querying very large transportation networks with huge trajectory data sets. This paper...... presents a novel query type called sheaf, where users can browse trajectory data sets using a single mouse click. Sheaves are very versatile and can be used for location-based advertising, travel-time analysis, intersection analysis, and reachability analysis (isochrones). A novel in-memory trajectory...... index compresses the data by a factor of 12.4 and enables execution of sheaf queries in 40 ms. This is up to 2 orders of magnitude faster than existing work. We demonstrate the simplicity, versatility, and efficiency of sheaf queries using a real-world trajectory set consisting of 2.7 million...
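
    The paper's index is considerably more elaborate, but the core of answering a one-click query from memory can be pictured with the sketch below (Python; the segment-keyed layout and the sample trips are assumptions for illustration only).

        # Sketch: an in-memory index keyed by road-segment id. A one-click query on a
        # segment returns every trajectory that traverses it, ready for travel-time
        # or reachability analysis. Layout and data are illustrative only.
        from collections import defaultdict

        # Each trajectory is a list of (segment_id, entry_time_in_seconds) pairs.
        trajectories = {
            "trip-1": [(101, 0), (102, 35), (103, 80)],
            "trip-2": [(102, 10), (103, 55)],
            "trip-3": [(101, 5), (104, 50)],
        }

        # Inverted index: segment id -> list of (trajectory id, entry time).
        segment_index = defaultdict(list)
        for trip_id, points in trajectories.items():
            for segment_id, t in points:
                segment_index[segment_id].append((trip_id, t))

        def sheaf(segment_id):
            """Return the trajectories passing through the clicked segment."""
            return segment_index.get(segment_id, [])

        print(sheaf(102))   # [('trip-1', 35), ('trip-2', 10)]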

  20. Analysis and Design of Web Server Based on Softbase

    Institute of Scientific and Technical Information of China (English)

    黄松英; 涂征

    2001-01-01

    This paper first analyses the general structure and functional characteristics of a Web server, and then designs a new Web server structure based on the secure Softbase database system.

  1. Interconnecting smartphone, image analysis server, and case report forms in clinical trials for automatic skin lesion tracking in clinical trials

    Science.gov (United States)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subject's medical data in controlled clinical trials is captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support integration of subject's image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to an communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system in clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images, which have been collected in the study so far, have been correctly identified and successfully integrated into the corresponding subject's eCRF. Using this system, manual steps for the study personnel are reduced, and, therefore, errors, latency and costs decreased. Our approach also increases data security and privacy.
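
    The bed-side capture-and-upload step can be pictured with a small sketch like the one below (Python with the requests library; the endpoint URL, field names, and subject code are hypothetical placeholders, not part of the published system).

        # Sketch: upload a lesion photograph together with the subject code read from
        # the colour reference card. Endpoint and field names are hypothetical.
        import requests

        SERVER_URL = "https://imageserver.example.org/api/upload"  # hypothetical endpoint

        def upload_photo(image_path, subject_code, visit):
            with open(image_path, "rb") as fh:
                response = requests.post(
                    SERVER_URL,
                    files={"image": fh},
                    data={"subject_code": subject_code, "visit": visit},
                    timeout=30,
                )
            response.raise_for_status()
            # Only after the server confirms storage would the local copy be released.
            return response.json()

        # Example call (arguments are made up):
        # upload_photo("wound_2016-03-01.jpg", subject_code="AB12-3456", visit="week-4")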

  2. Language Intent Models for Inferring User Browsing Behavior

    NARCIS (Netherlands)

    Tsagkias, M.; Blanco, R.

    2012-01-01

    Modeling user browsing behavior is an active research area with tangible real-world applications, e.g., organizations can adapt their online presence to their visitors browsing behavior with positive effects in user engagement, and revenue. We concentrate on online news agents, and present a semi-su

  3. Browsed twig environmental DNA: diagnostic PCR to identify ungulate species.

    Science.gov (United States)

    Nichols, Ruth V; Königsson, Helena; Danell, Kjell; Spong, Göran

    2012-11-01

    Ungulate browsing can have a strong effect on ecological processes by affecting plant community structure and composition, with cascading effects on nutrient cycling and animal communities. However, in the absence of direct observations of foraging, species-specific foraging behaviours are difficult to quantify. We therefore know relatively little about foraging competition and species-specific browsing patterns in systems with several browsers. However, during browsing, a small amount of saliva containing buccal cells is deposited at the bite site, providing a source of environmental DNA (eDNA) that can be used for species identification. Here, we describe extraction and PCR protocols for a browser species diagnostic kit. Species-specific primers for mitochondrial DNA were optimized and validated using twigs browsed by captive animals. A time series showed that about 50% of the samples will amplify up to 12 weeks after the browsing event and that some samples amplify up to 24 weeks after browsing (12.5%). Applied to samples of natural browsing from an area where moose (Alces alces), roe deer (Capreolus capreolus), fallow deer (Cervus dama) and red deer (Cervus elaphus) are sympatric, amplification success reached 75%. This method promises to greatly improve our understanding of multispecies browsing systems without the need for direct observations.

  4. Using Apollo to browse and edit genome annotations.

    Science.gov (United States)

    Misra, Sima; Harris, Nomi

    2006-01-01

    An annotation is any feature that can be tied to genomic sequence, such as an exon, transcript, promoter, or transposable element. As biological knowledge increases, annotations of different types need to be added and modified, and links to other sources of information need to be incorporated, to allow biologists to easily access all of the available sequence analysis data and design appropriate experiments. The Apollo genome browser and editor offers biologists these capabilities. Apollo can display many different types of computational evidence, such as alignments and similarities based on BLAST searches (UNITS 3.3 & 3.4), and enables biologists to utilize computational evidence to create and edit gene models and other genomic features, e.g., using experimental evidence to refine exon-intron structures predicted by gene prediction algorithms. This protocol describes simple ways to browse genome annotation data, as well as techniques for editing annotations and loading data from different sources.

  5. SHADE3 server

    DEFF Research Database (Denmark)

    Madsen, Anders Østergaard; Hoser, Anna Agnieszka

    2014-01-01

    A major update of the SHADE server (http://shade.ki.ku.dk) is presented. In addition to all of the previous options for estimating H-atom anisotropic displacement parameters (ADPs) that were offered by SHADE2, the newest version offers two new methods. The first method combines the original...... translation-libration-screw analysis with input from periodic ab initio calculations. The second method allows the user to input experimental information from spectroscopic measurements or from neutron diffraction experiments on related structures and utilize this information to evaluate ADPs of H atoms...

  6. Security System Analysis of the SQL Server Database

    Institute of Scientific and Technical Information of China (English)

    孟盛

    2011-01-01

    This paper describes SQL Server's security mechanisms and proposes measures for improving the SQL Server security system. It also studies strategies for acquiring database monitoring information and discusses the implementation of an SQL Server database security system.

  7. GeoServer cookbook

    CERN Document Server

    Iacovella, Stefano

    2014-01-01

    This book is ideal for GIS experts, developers, and system administrators who have had a first glance at GeoServer and who are eager to explore all its features in order to configure professional map servers. Basic knowledge of GIS and GeoServer is required.

  8. Mastering Lync Server 2010

    CERN Document Server

    Winters, Nathan

    2012-01-01

    An in-depth guide on the leading Unified Communications platform Microsoft Lync Server 2010 maximizes communication capabilities in the workplace like no other Unified Communications (UC) solution. Written by experts who know Lync Server inside and out, this comprehensive guide shows you step by step how to administer the newest and most robust version of Lync Server. Along with clear and detailed instructions, learning is aided by exercise problems and real-world examples of established Lync Server environments. You'll gain the skills you need to effectively deploy Lync Server 2010 and be on

  9. Analysis of Compute Vs Retrieve Intensive Web Applications and Its Impact On The Performance Of A Web Server

    Directory of Open Access Journals (Sweden)

    Syed Mutahar Aaqib

    2012-01-01

    Full Text Available The World Wide Web (WWW has undergone remarkable change over the past few years, placing substantially heavy load on Web servers. Today’s web servers host web applications that demand high computational resources. Also some applications require heavy database retrieval processing, making server load even more critical. In this paper, performance of Apache web server running compute and retrieve-intensive web workloads is analyzed. Workload files implemented in three dynamic web programming technologies: PERL, PHP and Java Servlets are used with MySQL acting as a data source. Measurements are performed with the intent to analyze the impact of application workloads on the overall performance of the web server and determine which web technology yields better performance on Windows and Linux platforms. Experimental results depict that for both compute and retrieve intensive applications, PHP exhibits better performance than PERL and Java Servlets. A multiple linear regression model was also developed to predict the web server performance and to validate the experimental results. This regression model showed that for compute and retrieve intensive web applications, PHP exhibits better performance than Perl and Java Servlets.
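
    The regression step can be reproduced in miniature as below (Python/NumPy; the predictor variables and toy measurements are invented for illustration and do not come from the paper).

        # Sketch: fit a multiple linear regression predicting mean response time from
        # request rate and result-set size. All numbers are made-up illustrations.
        import numpy as np

        # Columns: requests per second, rows returned per request.
        X = np.array([[50, 10], [100, 10], [150, 50], [200, 50], [250, 100], [300, 100]], float)
        # Observed mean response time in milliseconds.
        y = np.array([12.0, 18.5, 31.0, 40.2, 61.5, 70.1])

        # Add an intercept column and solve the least-squares problem.
        A = np.hstack([np.ones((X.shape[0], 1)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        intercept, b_rate, b_rows = coef

        print(f"time = {intercept:.2f} + {b_rate:.3f}*rate + {b_rows:.3f}*rows (ms)")
        print("predicted at 180 req/s, 75 rows:",
              round(intercept + b_rate * 180 + b_rows * 75, 1), "ms")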

  10. I/Browse: the Bellcore video library tool kit

    Science.gov (United States)

    England, Paul; Allen, Robert B.; Sullivan, Mark; Heybey, Andrew; Bianchi, Mike; Dailianas, Apostolos

    1996-03-01

    I/Browse: The Bellcore Video Library Toolkit is a set of tools for constructing and browsing libraries of digital video. The toolkit is designed to work with video libraries on local or network disks, CD-ROMs or a multimedia server. There are three main components: a preprocessor, a tagger, and a browser. Particular attention is focused on the tools and techniques we have developed to rapidly tag videos. The tagging system allows text fields, type information, and other resources to be associated with frames in the video. The tags are further organized into a hierarchical 'table of contents', which is suitable for browsing and searching.

  11. Jekyll or Hyde? Better browse securely

    CERN Multimedia

    Computer Security Team

    2013-01-01

    Surfing the web is like walking through London in 1886. Usually you meet nice Dr Jekyll, interact with him and everything is fine. But at other times, at night, you might encounter the malicious Mr Hyde. He just wants your money and your secrets, and wants to take advantage of you.   As in the novel by Stevenson, good and bad web pages can be very close together. Most web pages exist to provide information or a service. But one click away, one Google page down, there are malicious web pages that aim to steal your password, infect your computer, or lull you into disclosing personal information.    So remember: “STOP - THINK - CLICK!” should be the standard when browsing the Internet. If you are presented with a link that looks strange or contains gibberish (like http://211.268.156.277/.PayPal/cgi-bin/wbscrcmd_login.php), just ignore it! It is always better to type simple, comprehensible web addresses like “www.paypal.com” than...

  12. FARO server: Meta-analysis of gene expression by matching gene expression signatures to a compendium of public gene expression data

    DEFF Research Database (Denmark)

    Manijak, Mieszko P.; Nielsen, Henrik Bjørn

    2011-01-01

    BACKGROUND: Although, systematic analysis of gene annotation is a powerful tool for interpreting gene expression data, it sometimes is blurred by incomplete gene annotation, missing expression response of key genes and secondary gene expression responses. These shortcomings may be partially...... circumvented by instead matching gene expression signatures to signatures of other experiments. FINDINGS: To facilitate this we present the Functional Association Response by Overlap (FARO) server, that match input signatures to a compendium of 242 gene expression signatures, extracted from more than 1700...

  13. MISTIC: mutual information server to infer coevolution

    DEFF Research Database (Denmark)

    Simonetti, Franco L.; Teppa, Elin; Chernomoretz, Ariel

    2013-01-01

    MISTIC (mutual information server to infer coevolution) is a web server for graphical representation of the information contained within a MSA (multiple sequence alignment) and a complete analysis tool for Mutual Information networks in protein families. The server outputs a graphical visualization...... of several information-related quantities using a circos representation. This provides an integrated view of the MSA in terms of (i) the mutual information (MI) between residue pairs, (ii) sequence conservation and (iii) the residue cumulative and proximity MI scores. Further, an interactive interface...... containing all results can be downloaded. The server is available at http://mistic.leloir.org.ar. In summary, MISTIC allows for a comprehensive, compact, visually rich view of the information contained within an MSA in a manner unique to any other publicly available web server. In particular, the use...

  14. Network characteristics for server selection in online games

    Science.gov (United States)

    Claypool, Mark

    2008-01-01

    Online gameplay is impacted by the network characteristics of players connected to the same server. Unfortunately, the network characteristics of online game servers are not well-understood, particularly for groups that wish to play together on the same server. As a step towards a remedy, this paper presents analysis of an extensive set of measurements of game servers on the Internet. Over the course of many months, actual Internet game servers were queried simultaneously by twenty-five emulated game clients, with both servers and clients spread out on the Internet. The data provides statistics on the uptime and populations of game servers over a month-long period and an in-depth look at the suitability of game servers for multi-player server selection, concentrating on characteristics critical to playability--latency and fairness. Analysis finds most game servers have latencies suitable for third-person and omnipresent games, such as real-time strategy, sports and role-playing games, providing numerous server choices for game players. However, far fewer game servers have the low latencies required for first-person games, such as shooters or race games. In all cases, groups that wish to play together have a greatly reduced set of servers from which to choose because of inherent unfairness in server latencies, and server selection is particularly limited as the group size increases. These results hold across different game types and even across different generations of games. The data should be useful for game developers and network researchers that seek to improve game server selection, whether for single or multiple players.
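
    One way to make the group-selection problem concrete is the short sketch below (Python; the latency matrix is invented), which picks the server minimizing the worst latency seen by any group member and reports the unfairness (latency spread) of that choice.

        # Sketch: choose a game server for a group by minimizing the worst-case latency,
        # then report the spread (unfairness) among members. Latencies are invented.

        # latency_ms[server][player], as measured by each emulated client.
        latency_ms = {
            "server-east": {"p1": 35, "p2": 90, "p3": 60},
            "server-west": {"p1": 95, "p2": 30, "p3": 70},
            "server-mid":  {"p1": 55, "p2": 58, "p3": 52},
        }

        def pick_server(latencies):
            best_name, best_worst = None, float("inf")
            for name, per_player in latencies.items():
                worst = max(per_player.values())
                if worst < best_worst:
                    best_name, best_worst = name, worst
            return best_name, best_worst

        name, worst = pick_server(latency_ms)
        spread = worst - min(latency_ms[name].values())
        print(f"pick {name}: worst latency {worst} ms, unfairness {spread} ms")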

  15. Techniques of Turnovers’ Evolution and Structure Analysis Using SQL Server 2005

    Directory of Open Access Journals (Sweden)

    Alexandru Manole

    2007-07-01

    Full Text Available The turnovers’ evolution and structure analysis can provide much useful information for the construction of a viable set of policies for products, prices and the retail network. When the analysis deals with large quantities of raw data, one of the solutions that guarantees the rigorous treatment of the data is the use of a software system based on a data warehouse.

  16. Effect of Roadside Vegetation Cutting on Moose Browsing.

    Directory of Open Access Journals (Sweden)

    Amy L Tanner

    Full Text Available Moose (Alces americanus) vehicle collisions (MVCs) are an issue throughout the distribution of moose. Many mitigation strategies have been tested and implemented to reduce the number of MVCs, but there have been few empirical analyses of the effectiveness of roadside vegetation cutting. The goal of this study was to determine if roadside vegetation cutting attracted moose into roadside areas to browse on the vegetation regrowth. We hypothesized that moose would be attracted to roadside areas with cut vegetation. Consequently, we predicted that there would be higher levels of browsing in cut areas compared to uncut areas. To determine if moose were browsing more in cut or uncut areas, we measured the number of plants browsed by moose in paired treatment (cut on or after 2008) and control (not cut since at least 2008) sites, along with a suite of potential environmental covariates. Using a model selection approach, we fit generalized linear mixed-effects models to determine the most parsimonious set of environmental variables to explain variation in the proportion of moose browse among sites. In contrast to our hypothesis, our results show that the proportion of moose browse in the uncut control areas was significantly higher than in the cut treatment areas. The results of this study suggest that recently cut roadside areas (7 years or less based on our work) may create a less attractive foraging habitat for moose. The majority of the variance in the proportion of moose browse among sites was explained by treatment type and nested plot number within site identification (34.16%), with additional variance explained by traffic region (5.00%) and moose density (4.35%). Based on our study, we recommend that vegetation cutting be continued in roadside areas in Newfoundland as recently cut areas may be less attractive browsing sites for moose.

  17. Effect of Roadside Vegetation Cutting on Moose Browsing.

    Science.gov (United States)

    Tanner, Amy L; Leroux, Shawn J

    2015-01-01

    Moose (Alces americanus ) vehicle collisions (MVCs) are an issue throughout the distribution of moose. Many mitigation strategies have been tested and implemented to reduce the number of MVCs, but there have been few empirical analyses of the effectiveness of roadside vegetation cutting. The goal of this study was to determine if roadside vegetation cutting attracted moose into roadside areas to browse on the vegetation regrowth. We hypothesized that moose would be attracted to roadside areas with cut vegetation. Consequently, we predicted that there would be higher levels of browsing in cut areas compared to uncut areas. To determine if moose were browsing more in cut or uncut areas, we measured the number of plants browsed by moose in paired treatment (cut on or after 2008) and control (not cut since at least 2008) sites, along with a suite of potential environmental covariates. Using a model selection approach, we fit generalized linear mixed-effects models to determine the most parsimonious set of environmental variables to explain variation in the proportion of moose browse among sites. In contrast to our hypothesis, our results show that the proportion of moose browse in the uncut control areas was significantly higher than in the cut treatment areas. The results of this study suggest that recently cut roadside areas (7 years or less based on our work) may create a less attractive foraging habitat for moose. The majority of the variance in the proportion of moose browse among sites was explained by treatment type and nested plot number within site identification (34.16%), with additional variance explained by traffic region (5.00%) and moose density (4.35%). Based on our study, we recommend that vegetation cutting be continued in roadside areas in Newfoundland as recently cut areas may be less attractive browsing sites for moose.

  18. Browsing and Visualization of Linked Environmental Data

    Science.gov (United States)

    Nikolaou, Charalampos; Kyzirakos, Kostis; Bereta, Konstantina; Dogani, Kallirroi; Koubarakis, Manolis

    2014-05-01

    Linked environmental data has started to appear on the Web as environmental researchers make use of technologies such as ontologies, RDF, and SPARQL. Many of these datasets have an important geospatial and temporal dimension. The same is true also for the Web of data that is being rapidly populated not only with geospatial information, but also with temporal information. As the real-world entities represented in linked geospatial datasets evolve over time, the datasets themselves get updated and both the spatial and the temporal dimension of data become significant for users. For example, in the Earth Observation and Environment domains, data is constantly produced by satellite sensors and is associated with metadata containing, among others, temporal attributes, such as the time that an image was acquired. In addition, the acquisitions are considered to be valid for specific periods of time, for example until they get updated by new acquisitions. Satellite acquisitions might be utilized in applications such as the CORINE Land Cover programme operated by the European Environment Agency that makes available as a cartographic product the land cover of European areas. Periodically CORINE publishes the changes in the land cover of these areas in the form of changesets. Tools for exploiting the abundance of geospatial information have also started to emerge. However, these tools are designed for browsing a single data source, while in addition they cannot represent the temporal dimension. This is for two reasons: a) the lack of an implementation of a data model and a query language with temporal features covering the various semantics associated with the representation of time (e.g., valid and user-defined), and b) the lack of a standard temporal extension of RDF that would allow practitioners to utilize when publishing RDF data. Recently, we presented the temporal features of the data model stRDF, the query language stSPARQL, and their implementation in the geospatial

  19. Expert cube development with SQL server analysis services 2012 multidimensional models

    CERN Document Server

    Ferrari, Alberto; Russo, Marco

    2014-01-01

    An easy-to-follow guide full of hands-on examples of real-world Analysis Services cube development tasks. Each topic is explained and placed in context, and for the more inquisitive reader, there are also more in-depth details of the concepts used. If you are an Analysis Services cube designer wishing to learn more advanced topics and best practices for cube design, this book is for you. You are expected to have some prior experience with Analysis Services cube development.

  20. Silvicultural Attempts to Induce Browse Resistance in Conifer Seedlings

    Directory of Open Access Journals (Sweden)

    Bruce A. Kimball

    2011-01-01

    Full Text Available A multiyear study was conducted to determine if soil amendment combined with topical application of elemental sulfur could be employed to reduce deer browse damage to four conifer species. Fertilizer and sulfur were applied to conifer seedlings at seven sites near Corvallis, OR. Growth and browse damage data were collected for all seedlings over a period of 17 months. Additionally, foliar concentrations of monoterpenes and simple carbohydrates were assessed in western redcedar (Thuja plicata seedlings over a period of three years. Fertilization and sulfur treatments had a moderate impact on growth and no influence on browse damage or the chemical responses. Over the course of the study, browse damage diminished while foliar monoterpene concentrations increased in redcedar. It appears that silvicultural manipulation via sulfur application and/or soil amendment cannot accelerate or alter the ontogenetical changes that may naturally defend seedlings against mammalian herbivores. In a brief trial with captive deer, redcedar browse resistance was influenced by seedling maturation, but not monoterpene content. Other maturation effects may yield significant browse protection to young seedlings.

  1. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    textabstractWe present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates pept

  2. ArachnoServer: a database of protein toxins from spiders

    Directory of Open Access Journals (Sweden)

    Kaas Quentin

    2009-08-01

    Full Text Available Abstract Background Venomous animals incapacitate their prey using complex venoms that can contain hundreds of unique protein toxins. The realisation that many of these toxins may have pharmaceutical and insecticidal potential due to their remarkable potency and selectivity against target receptors has led to an explosion in the number of new toxins being discovered and characterised. From an evolutionary perspective, spiders are the most successful venomous animals and they maintain by far the largest pool of toxic peptides. However, at present, there are no databases dedicated to spider toxins and hence it is difficult to realise their full potential as drugs, insecticides, and pharmacological probes. Description We have developed ArachnoServer, a manually curated database that provides detailed information about proteinaceous toxins from spiders. Key features of ArachnoServer include a new molecular target ontology designed especially for venom toxins, the most up-to-date taxonomic information available, and a powerful advanced search interface. Toxin information can be browsed through dynamic trees, and each toxin has a dedicated page summarising all available information about its sequence, structure, and biological activity. ArachnoServer currently manages 567 protein sequences, 334 nucleic acid sequences, and 51 protein structures. Conclusion ArachnoServer provides a single source of high-quality information about proteinaceous spider toxins that will be an invaluable resource for pharmacologists, neuroscientists, toxinologists, medicinal chemists, ion channel scientists, clinicians, and structural biologists. ArachnoServer is available online at http://www.arachnoserver.org.

  3. myPhyloDB: a local web server for the storage and analysis of metagenomics data

    Science.gov (United States)

    myPhyloDB is a user-friendly personal database with a browser-interface designed to facilitate the storage, processing, analysis, and distribution of metagenomics data. MyPhyloDB archives raw sequencing files, and allows for easy selection of project(s)/sample(s) of any combination from all availab...

  4. Analysis of SQL Server Database Usage and Its Performance Optimization

    Institute of Scientific and Technical Information of China (English)

    冯艳

    2012-01-01

    SQL Server is a relational database management system produced by Microsoft. After a SQL Server database has been running for a few years, its performance and execution efficiency gradually decline because the amount of data it has to process grows. This paper discusses database usage techniques and performance optimization analysis so that the SQL Server database can run better and more efficiently.

  5. Crysalis: an integrated server for computational analysis and design of protein crystallization.

    Science.gov (United States)

    Wang, Huilin; Feng, Liubin; Zhang, Ziding; Webb, Geoffrey I; Lin, Donghai; Song, Jiangning

    2016-02-24

    The failure of multi-step experimental procedures to yield diffraction-quality crystals is a major bottleneck in protein structure determination. Accordingly, several bioinformatics methods have been successfully developed and employed to select crystallizable proteins. Unfortunately, the majority of existing in silico methods only allow the prediction of crystallization propensity, seldom enabling computational design of protein mutants that can be targeted for enhancing protein crystallizability. Here, we present Crysalis, an integrated crystallization analysis tool that builds on support-vector regression (SVR) models to facilitate computational protein crystallization prediction, analysis, and design. More specifically, the functionality of this new tool includes: (1) rapid selection of target crystallizable proteins at the proteome level, (2) identification of site non-optimality for protein crystallization and systematic analysis of all potential single-point mutations that might enhance protein crystallization propensity, and (3) annotation of target protein based on predicted structural properties. We applied the design mode of Crysalis to identify site non-optimality for protein crystallization on a proteome-scale, focusing on proteins currently classified as non-crystallizable. Our results revealed that site non-optimality is based on biases related to residues, predicted structures, physicochemical properties, and sequence loci, which provides in-depth understanding of the features influencing protein crystallization. Crysalis is freely available at http://nmrcen.xmu.edu.cn/crysalis/.
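
    The underlying modelling choice, support-vector regression over sequence-derived features, can be sketched as follows (Python with scikit-learn; the feature vectors and propensity scores are placeholders, not the tool's real training data or feature set).

        # Sketch: score crystallization propensity with support-vector regression.
        # Features and labels below are placeholders, not Crysalis training data.
        from sklearn.svm import SVR

        # Toy features per protein: [fraction hydrophobic, predicted disorder, length/1000].
        X_train = [
            [0.42, 0.10, 0.25],
            [0.35, 0.40, 0.80],
            [0.50, 0.05, 0.15],
            [0.30, 0.55, 1.20],
        ]
        y_train = [0.7, 0.2, 0.9, 0.1]   # crystallization propensity in [0, 1]

        model = SVR(kernel="rbf", C=1.0, epsilon=0.05).fit(X_train, y_train)

        candidate = [[0.45, 0.12, 0.30]]
        print("predicted propensity:", model.predict(candidate)[0])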

  6. Performance Analysis of a Server Cluster Service Using a Load Balancer and VirtualBox

    Directory of Open Access Journals (Sweden)

    Fajar Zuhroni

    2015-10-01

    Full Text Available Today, technology is growing very fast and people are becoming more familiar with the Internet. Web servers are required to respond more quickly in order to serve a large number of user requests. A single-server setup is unable to handle the web server traffic load as the number of users increases, causing the server to fail to serve requests. A server cluster with a load balancer is therefore used to divide the load evenly and so optimize web server performance. The server cluster system was designed using six virtual machines in the VirtualBox application, consisting of two load balancer servers, two application servers and two database servers. The system definition outlines the system requirements and the network topology, and the system design describes the hardware and software requirements. Analysis and testing were conducted to determine the performance of the designed system. The result of this research is a virtual server design that can serve a large number of user requests. Test results showed that the maximum number of connections served is 240 when all servers are up, 180 when one of the application servers is down and 220 when one of the database servers is down. The optimal number is 180 connections when all servers are up, 150 connections when one of the application servers is down and 160 connections when one of the database servers is down.
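
    The dispatching idea behind the load balancer can be sketched as below (Python; the backend names and health states are illustrative only): requests are handed out round-robin across the application servers that are currently up.

        # Sketch: round-robin dispatch over the application servers that are still up.
        # Backend names and health states are illustrative only.
        import itertools

        backends = ["app-server-1", "app-server-2"]
        healthy = {"app-server-1": True, "app-server-2": True}

        def dispatcher():
            for name in itertools.cycle(backends):
                if healthy.get(name):
                    yield name

        dispatch = dispatcher()
        for request_id in range(4):
            print(f"request {request_id} -> {next(dispatch)}")

        healthy["app-server-2"] = False           # simulate a failed application server
        print("after failure:", next(dispatch))   # traffic now goes only to app-server-1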

  7. Analysis of free geo-server software usability from the viewpoint of INSPIRE requirements

    Directory of Open Access Journals (Sweden)

    Tomasz  Grasza

    2014-06-01

    Full Text Available The paper presents selected server platforms based on free and open-source licenses that are coherent with the standards of the Open Geospatial Consortium. The presented programs are evaluated in the context of the INSPIRE Directive. The first part describes the requirements of the Directive; afterwards, the pros and cons of each platform in meeting these demands are presented. The article answers the question of whether free software can provide interoperable network services in accordance with the requirements of the INSPIRE Directive, while presenting application examples and practical tips on the use of the particular programs. Keywords: GIS, INSPIRE, free software, OGC, geoportal, network services, GeoServer, deegree, GeoNetwork

  8. A Skyline Plugin for Pathway-Centric Data Browsing

    Energy Technology Data Exchange (ETDEWEB)

    Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.; Wilkins, Christopher S.; Lichti, Cheryl F.; Payne, Samuel H.

    2016-08-16

    For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach SRM assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned in a set of protein interactions, networks and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven DIA data analysis, again utilizing the pathway view to help narrow down the set of proteins which will be investigated. The plugin is backed by the PNNL Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.

  9. A Skyline Plugin for Pathway-Centric Data Browsing

    Science.gov (United States)

    Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.; Wilkins, Christopher S.; Lichti, Cheryl F.; Payne, Samuel H.

    2016-08-01

    For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach selected reaction monitoring (SRM) and parallel reaction monitoring (PRM) assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned in a set of protein interactions, networks, and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven data-independent acquisition (DIA) data analysis, again utilizing the pathway view to help narrow down the set of proteins that will be investigated. The plugin is backed by the Pacific Northwest National Laboratory (PNNL) Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.

  10. A Skyline Plugin for Pathway-Centric Data Browsing

    Science.gov (United States)

    Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.; Wilkins, Christopher S.; Lichti, Cheryl F.; Payne, Samuel H.

    2016-11-01

    For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach selected reaction monitoring (SRM) and parallel reaction monitoring (PRM) assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned in a set of protein interactions, networks, and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven data-independent acquisition (DIA) data analysis, again utilizing the pathway view to help narrow down the set of proteins that will be investigated. The plugin is backed by the Pacific Northwest National Laboratory (PNNL) Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.

  11. Hipax Cluster PACS Server

    Directory of Open Access Journals (Sweden)

    Ramin Payrovi

    2007-08-01

    Full Text Available Best Performance: With our Hipax Cluster PACS Server solution we are introducing the parallel computing concept as an extremely fast software system to the PACS world. In contrast to common PACS servers, the Hipax Cluster PACS software is not restricted to one or two computers, but can be used on several servers controlling each other. Thus, the same services can be run simultaneously on different computers. The scalable system can also be expanded later without loss of performance by adding further processors or Hipax server units, for example if new clients or modalities are to be connected.
    Maximum Failure Security: The Cluster Server concept offers high failure security. If one of the server PCs breaks down, its services can temporarily be assumed by another Hipax server unit. If the overload of one of the server PCs is imminent, the services will be carried out by another Hipax unit (load balancing). To increase security, e.g. against fire, the individual Hipax servers can also be located separately. This concept offers maximum security, flexibility, performance, redundancy and scalability.
    The Hipax Cluster PACS Server is easy to administer using a web interface. In the case of a system failure (e.g. overloading or breakdown of a server PC), the system administrator receives a message via email and is thus enabled to solve the problem.
    Features:
    • Based on an SQL database
    • Different services running on separate PCs
    • The Hipax server units are coordinated and able to control each other
    • Exponentiates the power of a cluster server to the whole PACS (more processors)
    • Scalable to demand
    • Maximum performance
    • Load balancing for optimum efficiency
    • Maximum failure security because of exponentiated redundancy
    • Warning email automatically sent to the system administrator in the case of failure
    • Web interface for system configuration
    • Maintenance without shutting down the system

  12. Assessing browse trend at the landscape level Part 2: Monitoring

    Science.gov (United States)

    Keigley, R.B.; Frisina, M.R.; Fager, C.W.

    2002-01-01

    In Part 1, we assessed browse trend across a wide geographic area of Mt. Haggin Wildlife Management Area by conducting surveys of browsing-related architectures. Those data were qualitative. Below we describe the periodic collection of quantitative data from permanently marked locations; we refer to this phase of the trend assessment program as "monitoring." Trend was monitored by three methods: (1) repeat photography; (2) comparison of the height of live stems with the height of stems killed by browsing (LD Index); and (3) net annual stem growth rate (NAGRL3). The photography provides an assessment of trend from the comparison of photographs taken at intervals of a few years. The LD Index and NAGRL3 measurements provide an immediate assessment of trend.

  13. Log files analysis to assess the use and workload of a dynamic web server dedicated to end-stage renal disease.

    Science.gov (United States)

    Ben Said, Mohamed; Le Mignot, Loic; Richard, Jean Baptiste; Le Bihan, Christine; Toubiana, Laurent; Jais, Jean-Philippe; Landais, Paul

    2006-01-01

    A Multi-Source Information System (MSIS) has been designed for the Renal Epidemiology and Information Network (REIN) dedicated to End-Stage Renal Disease (ESRD). MSIS aims at providing reliable follow-up data for ESRD patients. It is based on an n-tier architecture, made up of a universal client and a dynamic Web server connected to a production database and to a data warehouse. MSIS has been operational since 2002 and has been progressively deployed in 9 regions in France. It includes 16,677 patients. We show that the analysis of MSIS web log files allows evaluating the use of the system and its workload from a public-health perspective.

  14. Linux Server Security

    CERN Document Server

    Bauer, Michael D

    2005-01-01

    Linux consistently appears high up in the list of popular Internet servers, whether it's for the Web, anonymous FTP, or general services such as DNS and delivering mail. But security is the foremost concern of anyone providing such a service. Any server experiences casual probe attempts dozens of times a day, and serious break-in attempts with some frequency as well. This highly regarded book, originally titled Building Secure Servers with Linux, combines practical advice with a firm knowledge of the technical tools needed to ensure security. The book focuses on the most common use of Linux--

  15. Learning Zimbra Server essentials

    CERN Document Server

    Kouka, Abdelmonam

    2013-01-01

    A standard tutorial approach which will guide the readers on all of the intricacies of the Zimbra Server.If you are any kind of Zimbra user, this book will be useful for you, from newbies to experts who would like to learn how to setup a Zimbra server. If you are an IT administrator or consultant who is exploring the idea of adopting, or have already adopted Zimbra as your mail server, then this book is for you. No prior knowledge of Zimbra is required.

  16. Quality control, analysis and secure sharing of Luminex® immunoassay data using the open source LabKey Server platform

    Science.gov (United States)

    2013-01-01

    Background Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists’ capacity to use these immunoassays to evaluate human clinical trials. Results The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose–response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Conclusions Unlike other tools tailored for

  17. Server-side Statistics Scripting in PHP

    Directory of Open Access Journals (Sweden)

    Jan de Leeuw

    1997-06-01

    Full Text Available On the UCLA Statistics WWW server there are a large number of demos and calculators that can be used in statistics teaching and research. Some of these demos require substantial amounts of computation, others mainly use graphics. These calculators and demos are implemented in various different ways, reflecting developments in WWW based computing. As usual, one of the main choices is between doing the work on the client-side (i.e. in the browser) or on the server-side (i.e. on our WWW server). Obviously, client-side computation puts fewer demands on the server. On the other hand, it requires that the client downloads Java applets, or installs plugins and/or helpers. If JavaScript is used, client-side computations will generally be slow. We also have to assume that the client is installed properly, and has the required capabilities. Requiring too much on the client-side has caused browsing machines such as Netscape Communicator to grow beyond all reasonable bounds, both in size and RAM requirements. Moreover, requiring Java and JavaScript rules out such excellent browsers as Lynx or Emacs W3. For server-side computing, we can configure the server and its resources ourselves, and we need not worry about browser capabilities and configuration. Nothing needs to be downloaded, except the usual HTML pages and graphics. In the same way as on the client side, there is a scripting solution, where code is interpreted, or an object-code solution using compiled code. For the server-side scripting, we use embedded languages, such as PHP/FI. The scripts in the HTML pages are interpreted by a CGI program, and the output of the CGI program is sent to the clients. Of course the CGI program is compiled, but the statistics procedures will usually be interpreted, because PHP/FI does not have the appropriate functions in its scripting language. This will tend to be slow, because embedded languages do not deal efficiently with loops and similar constructs. Thus a first

  18. Browse evaluation of tall shrubs based on direct measurement of a management objective

    Science.gov (United States)

    Keigley, R.B.; Frisina, M.R.; Kitchen, Stanley G.; Pendleton, Rosemary L.; Monaco, Thomas A.; Vernon, Jason

    2008-01-01

    The monitoring of Geyer willow was based on the following management objective: Browsing will prevent fewer than 50 percent of Geyer willow shrubs from growing taller than 3 m. Three questions were addressed: (1) Is browsing a potential factor? (2) If so, can young plants grow taller than 3 meters? (3) If not, is browsing the dominant factor? All shrubs were intensely browsed. With a post-browsing growth rate of 5.0 cm per yr, no shrub could grow 3 m tall. Analyses of stem growth rate excluded dominant roles for climate and plant vigor. Browsing and stem age were the dominant factors that limited growth to 3 m tall.
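
    The conclusion about the 3 m target follows directly from the reported growth rate; assuming the 5.0 cm per yr figure held constant, reaching the objective height would require

        t = \frac{300\ \text{cm}}{5.0\ \text{cm/yr}} = 60\ \text{yr}

    of uninterrupted growth at that rate, which supports the paper's conclusion that browsing and stem age, rather than climate or plant vigor, limited height growth.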

  19. Server Side Applications And Plugins Architecture For The Analysis Of Geospatial Information And The Management Of Water Resources

    Science.gov (United States)

    Pierleoni, Arnaldo; Casagrande, Luca; Bellezza, Michele; Casadei, Stefano

    2010-05-01

    The need for increasingly complex geospatial algorithms dedicated to the management of water resources, the fact that many of them require specific knowledge, and the need for dedicated computing machines have led to the necessity of centralizing and sharing all the server applications and plugins developed. For this purpose, a Web Processing Service (WPS) has been developed that makes available to users a range of geospatial analysis algorithms, geostatistics and remote sensing procedures, and that can be used simply by providing data and input parameters and downloading the results. The core of the system infrastructure is GRASS GIS, which acts as the computational engine, providing more than 350 forms of analysis and the opportunity to create new, ad hoc procedures. The WPS was implemented using PyWPS, software written in Python that is easily manageable and configurable. All these instruments are managed by a daemon named "Arcibald", created specifically to order the requests that come from users. It may happen that there are already ongoing processes, so the system will queue new ones, registering each request and running it only when the previous calculations have been completed. However, each geoprocess has an indicator of the resources necessary to run it, making it possible to run geoprocesses that do not require excessive computing time in parallel. This assessment is also made in relation to the size of the input file provided. The WPS standard defines methods for accessing and running geoprocesses regardless of the client used; however, a graphical client was developed specifically for this project to access the resources. The client was built as a plugin for the QGIS software, which provides the most common tools for viewing and consulting geographically referenced data. The tool was tested using the data taken during the bathymetric campaign at the
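
    The queueing behaviour attributed to the "Arcibald" daemon can be pictured with the sketch below (Python; the cost threshold, job names, and the decision to thread cheap jobs are assumptions for illustration, not the project's actual code).

        # Sketch: heavy geoprocesses are queued and executed one at a time, while jobs
        # with a small estimated cost are allowed to run immediately in parallel.
        # Threshold, job names and cost units are illustrative assumptions.
        import queue
        import threading

        heavy_jobs = queue.Queue()

        def worker():
            while True:
                name, func = heavy_jobs.get()
                print(f"running heavy job: {name}")
                func()
                heavy_jobs.task_done()

        threading.Thread(target=worker, daemon=True).start()

        def submit(name, func, estimated_cost):
            if estimated_cost < 10:                  # cheap: run at once, in parallel
                threading.Thread(target=func, name=name).start()
            else:                                    # heavy: queue behind earlier requests
                heavy_jobs.put((name, func))

        submit("viewshed-small", lambda: print("done: viewshed-small"), estimated_cost=2)
        submit("interpolation-large", lambda: print("done: interpolation-large"), estimated_cost=50)
        heavy_jobs.join()                            # wait for the queued heavy job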

  20. Browsing through rapid-fire imaging: requirements and industry initiatives

    Science.gov (United States)

    Wittenburg, Kent; Chiyoda, Carlos; Heinrichs, Michael; Lanning, Tom

    1999-12-01

    It is well established that humans possess cognitive abilities to process images extremely rapidly. At GTE Laboratories we have been experimenting with Web-based browsing interfaces that take advantage of this human facility. We have prototyped a number of browsing applications in different domains that offer the advantages of high interactivity and visual engagement. Our hypothesis, confirmed by user evaluations and a pilot experiment, is that many users will be drawn to interfaces that provide rapid presentation of images for browsing tasks in many contexts, among them online shopping, multimedia title selection, and people directories. In this paper we present our application prototypes using a system called PolyNav and discuss the imaging requirements for applications like these. We also raise the suggestion that if the Web industry at large standardized on an XML format for meta-content that included images, then the possibility exists that rapid-fire image browsing could become a standard part of the Web experience for content selection in a variety of domains.

  1. Graph-Based Methods for Discovery Browsing with Semantic Predications

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Fiszman, Marcelo; Miller, Christopher M;

    2011-01-01

    We present an extension to literature-based discovery that goes beyond making discoveries to a principled way of navigating through selected aspects of some biomedical domain. The method is a type of "discovery browsing" that guides the user through the research literature on a specified phenomen...

  2. A Storyboard-Based Interface for Mobile Video Browsing

    NARCIS (Netherlands)

    Hürst, Wolfgang; Hoet, Miklas; van de Werken, Rob

    2015-01-01

    We present an interface design for video browsing on mobile devices such as tablets that is based on storyboards and optimized with respect to content visualization and interaction design. In particular, we consider scientific results from our previous studies on mobile visualization (e.g., about op

  3. The Costs of Web Advertisements while Mobile Browsing

    NARCIS (Netherlands)

    Brande, van den Jeffrey; Pras, Aiko

    2012-01-01

    Tablet PCs, iPads and mobile phones all include facilities to browse the mobile Internet. The costs of mobile Internet access may become extraordinary, however, when the data limit is exceeded or when the user is roaming abroad without a roaming data plan. Since users may see advertisements as unwan

  4. Beyond Information Searching and Browsing: Acquiring Knowledge from Digital Library

    NARCIS (Netherlands)

    Feng, L.; Jeusfeld, M.A.; Hoppenbrouwers, J.

    2005-01-01

    Digital libraries (DLs) are a resource for answering complex questions. Up to now, such systems mainly support keyword-based searching and browsing. The mapping from a research question to keywords and the assessment whether an article is relevant for a research question is completely with the user.

  5. Sliders Versus Storyboards - Investigating Interaction Design for Mobile Video Browsing

    NARCIS (Netherlands)

    Hürst, Wolfgang; Hoet, Miklas

    2015-01-01

    We present a comparative study of two different interfaces for mobile video browsing on tablet devices following two basic concepts - storyboard designs representing a video’s content in a grid-like arrangement of static images extracted from the file, and slider interfaces enabling users to interac

  6. A Texture Thesaurus for Browsing Large Aerial Photographs.

    Science.gov (United States)

    Ma, Wei-Ying; Manjunath, B. S.

    1998-01-01

    Presents a texture-based image-retrieval system for browsing large-scale aerial photographs. System components include texture-feature extraction, image segmentation and grouping, learning-similarity measure, and a texture-thesaurus model for fast search and indexing. Testing has demonstrated the system's effectiveness in searching and selecting…

  7. Storing, Browsing, Querying, and Sharing Data: the THREDDS Data Repository (TDR)

    Science.gov (United States)

    Wilson, A.; Lindholm, D.; Baltzer, T.

    2005-12-01

    The Unidata Internet Data Distribution (IDD) network delivers gigabytes of data per day in near real time to sites across the U.S. and beyond. The THREDDS Data Server (TDS) supports public browsing of metadata and data access via OPeNDAP enabled URLs for datasets such as these. With such large quantities of data, sites generally employ a simple data management policy, keeping the data for a relatively short term on the order of hours to perhaps a week or two. In order to save interesting data in longer term storage and make it available for sharing, a user must move the data herself. In this case the user is responsible for determining where space is available, executing the data movement, generating any desired metadata, and setting access control to enable sharing. This task sequence is generally based on execution of a sequence of low level operating system specific commands with significant user involvement. The LEAD (Linked Environments for Atmospheric Discovery) project is building a cyberinfrastructure to support research and education in mesoscale meteorology. LEAD orchestrations require large, robust, and reliable storage with speedy access to stage data and store both intermediate and final results. These requirements suggest storage solutions that involve distributed storage, replication, and interfacing to archival storage systems such as mass storage systems and tape or removable disks. LEAD requirements also include metadata generation and access in order to support querying. In support of both THREDDS and LEAD requirements, Unidata is designing and prototyping the THREDDS Data Repository (TDR), a framework for a modular data repository to support distributed data storage and retrieval using a variety of back end storage media and interchangeable software components. The TDR interface will provide high level abstractions for long term storage, controlled, fast and reliable access, and data movement capabilities via a variety of technologies such as
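    As an illustration of the OPeNDAP-style access mentioned above (a sketch only; the URL and variable name are hypothetical), a TDS dataset can be opened remotely and subset without downloading the whole file:

```python
from netCDF4 import Dataset   # requires a netCDF4 build with OPeNDAP support

url = "https://example.org/thredds/dodsC/model/forecast.nc"   # hypothetical TDS endpoint
with Dataset(url) as ds:
    print(list(ds.variables))                        # discover available fields
    temp = ds.variables["air_temperature"][0, :, :]  # transfer only the requested slice
    print(temp.shape)
```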

  8. Research on Web Server Attack Log Analysis%Web服务器攻击日志分析研究

    Institute of Scientific and Technical Information of China (English)

    邓诗琪; 刘晓明; 武旭东; 雷敏

    2016-01-01

    The rapid development of Internet technology has changed people's lifestyle, and e-commerce has become one of the most popular web applications. Nowadays, malicious attacks on the web servers of e-commerce websites are more and more common. However, related attack records can be found by analyzing the access logs on the web servers of those e-commerce websites. The OWASP (Open Web Application Security Project) publishes each year the ten attack techniques that web applications suffer from the most, such as SQL injection, XSS attacks and DDoS attacks. These attacks have caused great harm to web servers: on the one hand, the e-commerce websites cannot provide normal service to users; on the other hand, data or user privacy may be leaked. This paper puts forward a solution for analyzing access logs on a web server through the classification of web access logs and the matching of attack patterns and characteristics. The system can find out attack sources and types, and then display the results in graphical form on a web page, which helps security administrators of e-commerce websites to detect attacks and improve the web server's ability to resist various attacks.%The rapid development of Internet technology has changed people's way of life, and e-commerce is one of the most widely used Internet applications of recent years. More and more web servers are deployed on the Internet to provide services, so attacks against e-commerce web servers keep increasing. Every year the OWASP organization publishes the ten attack techniques that web applications suffer from the most, among which the more harmful ones include SQL injection, XSS attacks and DDoS attacks. On the one hand these attacks prevent the e-commerce server from providing services; on the other hand they may cause the leakage of data and of users' personal privacy held on the e-commerce server, so the security protection of e-commerce servers is the most important part of web server security operation and maintenance. By analyzing and studying web server logs, attack events against a website can be detected, and the ways in which the web server is attacked can thereby be understood.
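    The classification-and-matching approach described above can be sketched as follows (a simplified illustration, not the paper's system; the signature patterns and log file name are hypothetical and far smaller than a real rule set):

```python
import re
from collections import Counter

# Hypothetical signatures for a few OWASP attack classes; real rule sets are much larger.
SIGNATURES = {
    "sql_injection":  re.compile(r"(union\s+select|or\s+1=1|sleep\()", re.I),
    "xss":            re.compile(r"(<script|javascript:|onerror=)", re.I),
    "path_traversal": re.compile(r"\.\./"),
}

# Apache/Nginx combined log format: client IP, then the request line in quotes.
LOG_LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)"')

def scan(path):
    hits, sources = Counter(), Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            m = LOG_LINE.match(line)
            if not m:
                continue
            for attack, pattern in SIGNATURES.items():
                if pattern.search(m.group("request")):
                    hits[attack] += 1
                    sources[m.group("ip")] += 1
    return hits, sources

if __name__ == "__main__":
    attacks, attackers = scan("access.log")      # hypothetical log file name
    print("attack types:", dict(attacks))
    print("top sources:", attackers.most_common(5))
```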

  9. Assessing browse trend at the landscape level Part 1: Preliminary steps and field survey

    Science.gov (United States)

    Keigley, R.B.; Frisina, M.R.; Fager, C.W.

    2002-01-01

    Woody plants are an important component of rangeland habitat, providing food and shelter for animals that range in size from moose to warblers to insects. Because of this importance, land managers are paying increased attention to browse trends. In this two-part article, we describe how browse trend is assessed at the Mt. Haggin Wildlife Management Area in southwestern Montana. Willows are currently heavily browsed, but there is evidence that browsing pressure was lower in the past. Heavily-browsed 14-inch-tall plants grow in close proximity to 16-foot-tall plants, the tallest stems of which are unbrowsed. The 16-foot-tall stems are older than the 14-inch-tall stems, and apparently grew through the browse zone when browsing pressure was lower than its current level. An increase in browsing pressure would be consistent with the increase in the moose population that occurred over the past 3 decades.

  10. Professional SQL Server 2005 administration

    CERN Document Server

    Knight, Brian; Snyder, Wayne; Armand, Jean-Claude; LoForte, Ross; Ji, Haidong

    2007-01-01

    SQL Server 2005 is the largest leap forward for SQL Server since its inception. With this update comes new features that will challenge even the most experienced SQL Server DBAs. Written by a team of some of the best SQL Server experts in the industry, this comprehensive tutorial shows you how to navigate the vastly changed landscape of the SQL Server administration. Drawing on their own first-hand experiences to offer you best practices, unique tips and tricks, and useful workarounds, the authors help you handle even the most difficult SQL Server 2005 administration issues, including blockin

  11. Performance Analysis of MTD64, our Tiny Multi-Threaded DNS64 Server Implementation: Proof of Concept

    Directory of Open Access Journals (Sweden)

    Gábor Lencse

    2016-07-01

    In this paper, the performance of MTD64 is measured and compared to that of the industry-standard BIND in order to check the correctness of the design concepts of MTD64, especially the decision to use a new thread for each request. For the performance measurements, our earlier proposed dns64perf program is enhanced as dns64perf2, which is also documented in this paper. We found that MTD64 seriously outperformed BIND, and hence our design principles may be useful for the design of a high-performance, production-class DNS64 server. As an additional test, we have also examined the effect of dynamic CPU frequency scaling on the performance of the implementations.
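    The thread-per-request principle can be illustrated with a short sketch (not MTD64 itself; it merely echoes UDP datagrams instead of synthesizing DNS64 answers, and the port number is arbitrary):

```python
import socketserver

class EchoHandler(socketserver.BaseRequestHandler):
    """Each incoming datagram is served in its own thread; a real DNS64 server
    would parse the query and synthesize an AAAA answer here."""
    def handle(self):
        data, sock = self.request              # for UDP, request is (payload, socket)
        sock.sendto(data, self.client_address)

if __name__ == "__main__":
    with socketserver.ThreadingUDPServer(("0.0.0.0", 5353), EchoHandler) as srv:
        srv.serve_forever()                    # one new thread per request
```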

  12. An integrated medical image database and retrieval system using a web application server.

    Science.gov (United States)

    Cao, Pengyu; Hashiba, Masao; Akazawa, Kouhei; Yamakawa, Tomoko; Matsuto, Takayuki

    2003-08-01

    We developed an Integrated Medical Image Database and Retrieval System (INIS) for easy access by medical staff. The INIS mainly consisted of four parts: specific servers to save medical images from multi-vendor modalities of CT, MRI, CR, ECG and endoscopy; an integrated image database (DB) server to save various kinds of images in DICOM format; a Web application server to connect clients to the integrated image DB; and Web browser terminals connected to an HIS system. The INIS provided a common screen design to retrieve CT, MRI, CR, endoscopic and ECG images, and radiological reports, which would allow doctors to retrieve radiological images and corresponding reports, or ECG images of a patient, simultaneously on one screen. Doctors working in internal medicine on average accessed information 492 times a month. Doctors working in cardiology and gastroenterology accessed information 308 times a month. Using the INIS, medical staff could browse all or parts of a patient's medical images and reports.

  13. Voice-controlled Internet Browsing for Motor-handicapped Users

    DEFF Research Database (Denmark)

    Brøndsted, Tom; Aaskoven, Erik

    2006-01-01

    The public-funded project "Indtal" ("Speak-it") has succeeded in developing a Danish voice-controlled utility for internet browsing targeting motor-handicapped users having difficulties using a standard keyboard and/or a standard mouse. The system has been designed and implemented in collaboration with an advisory board of motor-handicapped (potential) end-users and complies with a number of a priori defined design criteria: learnability and memorability rather than naturalness, minimal need for maintenance after release, support for "all" web standards (not just HTML conforming to certain "recommendations"), independence of the language on the websites being browsed, etc. These criteria have led to a primarily message-driven system interacting with an existing browser on the end users' systems

  14. A Video Browsing Tool for Content Management in Postproduction

    Directory of Open Access Journals (Sweden)

    Werner Bailer

    2010-01-01

    Full Text Available We propose an interactive video browsing tool for supporting content management and selection in postproduction. The approach is based on a process model for multimedia content abstraction. A software framework based on this process model, together with desktop and Web-based client applications, is presented. For evaluation, we apply two TRECVID-style fact-finding approaches (retrieval and question answering tasks) and a user survey to the evaluation of the video browsing tool. We analyze the correlation between the results of the different methods, whether different aspects can be evaluated independently with the survey, and whether a learning effect can be measured with the different methods, and we also compare the full-featured desktop and the limited Web-based user interface. The results show that the retrieval task correlates better with the user experience according to the survey. The survey rather measures the general user experience, while different aspects of the usability cannot be analyzed independently.

  15. Mobile Web Browsing Based On Content Preserving With Reduced Cost

    Directory of Open Access Journals (Sweden)

    Dr.N.Saravanaselvam

    2015-01-01

    Full Text Available The Internet has brought a drastic change to today's life. In particular, web browsing has become widespread on compact devices, which tempts people to bring their innovations and skills into this new world. With this in mind, it is necessary to concentrate more on the techniques by which web data are accessed and accounted for. Developed countries use a widely popular technique called flat-rate pricing, which is independent of data usage, whereas developing countries still follow the concept of "pay as you use", which leads to high usage bills. In an effort to resolve the problem of high usage bills, we propose a cost-effective technique that reduces data consumption in mobile web browsing, and hence the usage bills under usage-based pricing. The key idea of our approach is to leverage the data plan of the user to compute a cost quota for each web request and to use a network middle-box to automatically adapt any web page to the cost quota. We use a simple but effective content adaptation technique that decides which image or data best fits the mobile display with low cost and high-quality resolution. It also draws on data mining, which extracts the requested and required data; the mined data are filtered based on the content adaptation technique and fitted into the display effectively. A notable feature of this concept is that only the important web content requested by the user is displayed. A feedback process is involved to retrieve only the required data and to improve the best-fit resolution. With the proposed system, mobile web browsing becomes cheaper, and the approach provides a basis for future projects in the field of mobile browsing.
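    The cost-quota idea can be sketched roughly as follows (hypothetical numbers and function names, not the paper's implementation):

```python
def cost_quota(plan_bytes, requests_per_cycle):
    """Hypothetical per-request byte budget derived from the user's data plan."""
    return plan_bytes / requests_per_cycle

def pick_image_variant(variants, budget_bytes):
    """Choose the highest-resolution image that still fits the quota.
    `variants` is a list of (resolution, size_in_bytes) pairs."""
    affordable = [v for v in variants if v[1] <= budget_bytes]
    return max(affordable) if affordable else min(variants, key=lambda v: v[1])

# Example: a 1 GB plan spread over an assumed 2,000 page requests per billing cycle.
budget = cost_quota(plan_bytes=1_000_000_000, requests_per_cycle=2_000)
print(pick_image_variant([(480, 90_000), (720, 250_000), (1080, 700_000)], budget))
```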

  16. Compact Web browsing profiles for click-through rate prediction

    DEFF Research Database (Denmark)

    Fruergaard, Bjarne Ørum; Hansen, Lars Kai

    2014-01-01

    with varying degrees of sparsity in the representations. The decompositions that we consider are SVD, NMF, and IRM. To quantify the utility, we measure the performances of these representations when used as features in a sparse logistic regression model for click-through rate prediction. We recommend the IRM bipartite clustering features as they provide the most compact representation of browsing patterns and yield the best performance.
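    A minimal sketch of the SVD variant of this pipeline, using scikit-learn and synthetic data in place of real browsing logs, might look like this:

```python
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression

# Hypothetical user-by-site browsing matrix (sparse visit counts) and click labels.
rng = np.random.default_rng(0)
visits = sparse_random(1000, 500, density=0.01, format="csr", random_state=0)
clicks = rng.integers(0, 2, size=1000)

# Compress each browsing history into a dense low-rank profile (the SVD variant).
svd = TruncatedSVD(n_components=20, random_state=0)
profiles = svd.fit_transform(visits)

# Sparse (L1-penalised) logistic regression for click-through rate prediction.
model = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
model.fit(profiles, clicks)
print("training accuracy:", model.score(profiles, clicks))
```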

  17. Effects of simulated moose Alces alces browsing on the morphology of rowan Sorbus aucuparia

    Science.gov (United States)

    Jager, N.R.D.; Pastor, J.

    2010-01-01

    In much of northern Sweden moose Alces alces browse rowan Sorbus aucuparia heavily and commonly revisit previously browsed plants. Repeated browsing of rowan by moose has created some concern for its long-term survival in heavily browsed areas. We therefore measured how four years of simulated moose browsing at four population densities (0, 10, 30 and 50 moose/1,000 ha) changed plant height, crown width, available bite mass, the number of bites per plant and per plant forage biomass of rowan saplings. Increased biomass removal led to a significant decline in plant height (P moose relative to unbrowsed controls. Moose therefore stand to benefit from revisiting previously browsed plants, which may result in feeding loops between moose and previously browsed rowan saplings. © 2010 Wildlife Biology, NKV.

  18. DNS BIND Server Configuration

    Directory of Open Access Journals (Sweden)

    Radu MARSANU

    2011-01-01

    Full Text Available After a brief presentation of the DNS and BIND standard for Unix platforms, the paper presents an application whose principal objective is the configuration of a DNS BIND 9 server. The general objectives of the application are presented, followed by a description of the design details of the program.
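    Independently of the configuration tool itself, a newly configured BIND 9 instance can be smoke-tested from a script; the sketch below uses the dnspython library, and the server address and zone are hypothetical.

```python
import dns.resolver   # dnspython; pip install dnspython

# Point the resolver at the freshly configured BIND 9 instance (address is hypothetical).
resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["192.0.2.53"]

for rrtype in ("SOA", "NS", "A"):
    try:
        answer = resolver.resolve("example.org", rrtype)
        print(rrtype, [r.to_text() for r in answer])
    except Exception as exc:   # NXDOMAIN, timeout, etc.
        print(rrtype, "lookup failed:", exc)
```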

  19. WWDC SERVER SOFTWARE INVENTORY MANAGEMENT AND AUTOMATION

    Directory of Open Access Journals (Sweden)

    M . THANJAIVADIVEL

    2012-08-01

    Full Text Available Many organizations maintain a large data centre for their business operations, in which different administration teams work on a large set of servers and perform several tasks as needed. It is too complicated to handle this large number of servers manually in terms of maintaining their configuration, scheduled operations, administrative tasks, etc. Here we propose a new automated technology for server software configuration management, in which the entire server configuration is maintained by an SSCMDB. Before that, we study the system administration tasks, identify repetitive tasks, and automate them to some extent in order to be more efficient and to avoid mistakes. We then go through the different modules of the host list [a host list is a list of hosts used for matching, to examine whether a certain host is included in the list or not], so that we can save time and make better use of resources by eliminating time-consuming manual processes. The system also helps with root-cause analysis of issues such as unplanned interruptions that may lead to reduced quality. Most of the servers run on HP-UX, Linux and Solaris. The system is implemented using Perl, shell, CGI, JavaScript and MySQL.
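    The host list matching described above could be sketched as follows (in Python rather than the Perl/shell of the actual system; the file format and wildcard convention are assumptions):

```python
from fnmatch import fnmatch

def load_host_list(path):
    """Read one hostname or wildcard pattern per line, ignoring blanks and comments."""
    with open(path, encoding="utf-8") as fh:
        return [line.strip() for line in fh if line.strip() and not line.startswith("#")]

def host_in_list(hostname, patterns):
    """True if the host matches any entry, e.g. 'db01.example.com' vs 'db*.example.com'."""
    return any(fnmatch(hostname.lower(), p.lower()) for p in patterns)

# Example usage against a hypothetical inventory file:
# patterns = load_host_list("hostlist.txt")
# print(host_in_list("db01.example.com", patterns))
```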

  20. Installing and Testing a Server Operating System

    Directory of Open Access Journals (Sweden)

    Lorentz JÄNTSCHI

    2003-08-01

    Full Text Available The paper is based on the experience of the author with FreeBSD server operating system administration on three servers in use under the academicdirect.ro domain. The paper describes a set of installation, preparation, and administration aspects of a FreeBSD server. The first issue of the paper is the installation procedure of the FreeBSD operating system on the i386 computer architecture. Discussed problems are boot disk preparation and use, hard disk partitioning, and operating system installation using an existing network topology and an Internet connection. The second issue is the optimization procedure of the operating system, server services installation, and configuration. Discussed problems are kernel and services configuration, and system and services optimization. The third issue is about client-server applications. Using operating system utility calls, we present an original application which displays system information in a friendly web interface. An original program designed for molecular structure analysis was adapted for system performance comparisons, and it serves for a discussion of the computation speed of Pentium, Pentium II and Pentium III processors. The last issue of the paper discusses the installation and configuration aspects of a dial-in service on a UNIX-based operating system. The discussion includes serial ports, ppp and pppd service configuration, and the use of ppp and tun devices.

  1. Concept indexing and expansion for social multimedia websites based on semantic processing and graph analysis

    Science.gov (United States)

    Lin, Po-Chuan; Chen, Bo-Wei; Chang, Hangbae

    2016-07-01

    This study presents a human-centric technique for social video expansion based on semantic processing and graph analysis. The objective is to increase metadata of an online video and to explore related information, thereby facilitating user browsing activities. To analyze the semantic meaning of a video, shots and scenes are firstly extracted from the video on the server side. Subsequently, this study uses annotations along with ConceptNet to establish the underlying framework. Detailed metadata, including visual objects and audio events among the predefined categories, are indexed by using the proposed method. Furthermore, relevant online media associated with each category are also analyzed to enrich the existing content. With the above-mentioned information, users can easily browse and search the content according to the link analysis and its complementary knowledge. Experiments on a video dataset are conducted for evaluation. The results show that our system can achieve satisfactory performance, thereby demonstrating the feasibility of the proposed idea.

  2. Coupling News Sentiment with Web Browsing Data Improves Prediction of Intra-Day Price Dynamics.

    Directory of Open Access Journals (Sweden)

    Gabriele Ranco

    Full Text Available The new digital revolution of big data is deeply changing our capability of understanding society and forecasting the outcome of many social and economic systems. Unfortunately, information can be very heterogeneous in the importance, relevance, and surprise it conveys, affecting severely the predictive power of semantic and statistical methods. Here we show that the aggregation of web users' behavior can be elicited to overcome this problem in a hard to predict complex system, namely the financial market. Specifically, our in-sample analysis shows that the combined use of sentiment analysis of news and browsing activity of users of Yahoo! Finance greatly helps forecasting intra-day and daily price changes of a set of 100 highly capitalized US stocks traded in the period 2012-2013. Sentiment analysis or browsing activity when taken alone have very small or no predictive power. Conversely, when considering a news signal where in a given time interval we compute the average sentiment of the clicked news, weighted by the number of clicks, we show that for nearly 50% of the companies such signal Granger-causes hourly price returns. Our result indicates a "wisdom-of-the-crowd" effect that allows to exploit users' activity to identify and weigh properly the relevant and surprising news, enhancing considerably the forecasting power of the news sentiment.
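    As an illustration of the kind of test reported above (not the authors' code; the data here are synthetic stand-ins for the click-weighted sentiment signal and hourly returns), Granger causality can be checked with statsmodels:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic stand-ins: a sentiment signal and returns that depend on its previous value.
rng = np.random.default_rng(1)
sentiment = rng.normal(size=500)
returns = 0.3 * np.roll(sentiment, 1) + rng.normal(scale=0.5, size=500)

# grangercausalitytests expects a 2-column array ordered [effect, candidate cause].
data = pd.DataFrame({"returns": returns, "sentiment": sentiment})
results = grangercausalitytests(data[["returns", "sentiment"]], maxlag=3, verbose=False)
for lag, res in results.items():
    fstat, pvalue = res[0]["ssr_ftest"][:2]
    print(f"lag {lag}: F = {fstat:.2f}, p = {pvalue:.4f}")
```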

  3. Coupling News Sentiment with Web Browsing Data Improves Prediction of Intra-Day Price Dynamics.

    Science.gov (United States)

    Ranco, Gabriele; Bordino, Ilaria; Bormetti, Giacomo; Caldarelli, Guido; Lillo, Fabrizio; Treccani, Michele

    2016-01-01

    The new digital revolution of big data is deeply changing our capability of understanding society and forecasting the outcome of many social and economic systems. Unfortunately, information can be very heterogeneous in the importance, relevance, and surprise it conveys, affecting severely the predictive power of semantic and statistical methods. Here we show that the aggregation of web users' behavior can be elicited to overcome this problem in a hard to predict complex system, namely the financial market. Specifically, our in-sample analysis shows that the combined use of sentiment analysis of news and browsing activity of users of Yahoo! Finance greatly helps forecasting intra-day and daily price changes of a set of 100 highly capitalized US stocks traded in the period 2012-2013. Sentiment analysis or browsing activity when taken alone have very small or no predictive power. Conversely, when considering a news signal where in a given time interval we compute the average sentiment of the clicked news, weighted by the number of clicks, we show that for nearly 50% of the companies such signal Granger-causes hourly price returns. Our result indicates a "wisdom-of-the-crowd" effect that allows to exploit users' activity to identify and weigh properly the relevant and surprising news, enhancing considerably the forecasting power of the news sentiment.

  4. Beginning SQL Server 2008 Administration

    CERN Document Server

    Walters, R

    2009-01-01

    Beginning SQL Server 2008 Administration is essential for anyone wishing to learn about implementing and managing SQL Server 2008 database. From college students, to experienced database administrators from other platforms, to those already familiar with SQL Server and wanting to fill in some gaps of knowledge, this book will bring all readers up to speed on the enterprise platform Microsoft SQL Server 2008. * Clearly describes relational database concepts* Explains the SQL Server database engine and supporting tools* Shows various database maintenance scenarios What you'll learn* Understand c

  5. Windows Home Server users guide

    CERN Document Server

    Edney, Andrew

    2008-01-01

    Windows Home Server brings the idea of centralized storage, backup and computer management out of the enterprise and into the home. Windows Home Server is built for people with multiple computers at home and helps to synchronize them, keep them updated, stream media between them, and back them up centrally. Built on a similar foundation as the Microsoft server operating products, it's essentially Small Business Server for the home.This book details how to install, configure, and use Windows Home Server and explains how to connect to and manage different clients such as Windows XP, Windows Vist

  6. Microsoft SQL Server 2012 bible

    CERN Document Server

    Jorgensen, Adam; LeBlanc, Patrick; Cherry, Denny; Nelson, Aaron

    2012-01-01

    Harness the powerful new SQL Server 2012 Microsoft SQL Server 2012 is the most significant update to this product since 2005, and it may change how database administrators and developers perform many aspects of their jobs. If you're a database administrator or developer, Microsoft SQL Server 2012 Bible teaches you everything you need to take full advantage of this major release. This detailed guide not only covers all the new features of SQL Server 2012, it also shows you step by step how to develop top-notch SQL Server databases and new data connections and keep your databases performing at p

  7. Lexical Server of Polish Language

    Directory of Open Access Journals (Sweden)

    Marek Gajecki

    2001-01-01

    Full Text Available This paper presents the Lexical Server of Polish Language, a tool that aids natural language processing (NLP). The database of the server consists of dictionary units enriched with lexical information. The lexical server should be able to perform identification of word forms and generation of all inflected forms of a word. The server is dedicated to people who are looking for NLP algorithms or implement them. The algorithms can be implemented in different kinds of programming languages and different operating systems. There are some examples of problems where a lexical server can be useful: automatic text correction, text indexing, keyword extraction, and text profile building.

  8. Mastering Microsoft Exchange Server 2010

    CERN Document Server

    McBee, Jim

    2010-01-01

    A top-selling guide to Exchange Server-now fully updated for Exchange Server 2010. Keep your Microsoft messaging system up to date and protected with the very newest version, Exchange Server 2010, and this comprehensive guide. Whether you're upgrading from Exchange Server 2007 SP1 or earlier, installing for the first time, or migrating from another system, this step-by-step guide provides the hands-on instruction, practical application, and real-world advice you need.: Explains Microsoft Exchange Server 2010, the latest release of Microsoft's messaging system that protects against spam and vir

  9. CpGAVAS, an integrated web server for the annotation, visualization, analysis, and GenBank submission of completely sequenced chloroplast genome sequences

    Directory of Open Access Journals (Sweden)

    Liu Chang

    2012-12-01

    Full Text Available Abstract Background The complete sequences of chloroplast genomes provide a wealth of information regarding the evolutionary history of species. With the advance of next-generation sequencing technology, the number of completely sequenced chloroplast genomes is expected to increase exponentially, and powerful computational tools for annotating the genome sequences are urgently needed. Results We have developed a web server, CPGAVAS. The server accepts a complete chloroplast genome sequence as input. First, it predicts protein-coding and rRNA genes based on the identification and mapping of the most similar, full-length protein, cDNA and rRNA sequences by integrating results from the Blastx, Blastn, protein2genome and est2genome programs. Second, tRNA genes and inverted repeats (IR) are identified using tRNAscan, ARAGORN and vmatch, respectively. Third, it calculates the summary statistics for the annotated genome. Fourth, it generates a circular map ready for publication. Fifth, it can create a Sequin file for GenBank submission. Last, it allows the extraction of protein and mRNA sequences for a given list of genes and species. The annotation results in GFF3 format can be edited using any compatible annotation editing tool. The edited annotations can then be uploaded to CPGAVAS for update and re-analysis repeatedly. Using known chloroplast genome sequences as a test set, we show that CPGAVAS performs comparably to another application, DOGMA, while having several superior functionalities. Conclusions CPGAVAS allows the semi-automatic and complete annotation of a chloroplast genome sequence, and the visualization, editing and analysis of the annotation results. It will become an indispensable tool for researchers studying chloroplast genomes. The software is freely accessible from http://www.herbalgenomics.org/cpgavas.

  10. FocusStack and StimServer: a new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data.

    Science.gov (United States)

    Muir, Dylan R; Kampa, Björn M

    2014-01-01

    Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.

  11. FocusStack and StimServer: A new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data

    Directory of Open Access Journals (Sweden)

    Dylan Richard Muir

    2015-01-01

    Full Text Available Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.

  12. Liberate Mediacast Server

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The Mediacast server schedules the retrieval of HTML content from multiple sources, then organizes and broadcasts the content to all clients on the appropriate channels at the specified times. The content is displayed upon subscriber request, when triggered by an event, or automatically. These broadcasts are generally performed in-band, but Mediacast can also broadcast over the out-of-band network. This allows subscribers to display and interact with a wide variety of specialized content without significantly increasing network traffic.

  13. Getting started with SQL Server 2012 cube development

    CERN Document Server

    Lidberg, Simon

    2013-01-01

    As a practical tutorial for Analysis Services, get started with developing cubes. ""Getting Started with SQL Server 2012 Cube Development"" walks you through the basics, working with SSAS to build cubes and get them up and running.Written for SQL Server developers who have not previously worked with Analysis Services. It is assumed that you have experience with relational databases, but no prior knowledge of cube development is required. You need SQL Server 2012 in order to follow along with the exercises in this book.

  14. Huygens file server and storage architecture

    NARCIS (Netherlands)

    Bosch, Peter; Mullender, Sape; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage architectu

  15. Using eDNA to experimentally test ungulate browsing preferences.

    Science.gov (United States)

    Nichols, Ruth V; Cromsigt, Joris P G M; Spong, Göran

    2015-01-01

    Large herbivores may affect ecosystem processes and states, but such effects can be difficult to quantify, especially within multispecies assemblages. To better understand such processes and improve our predictive ability of systems undergoing change, herbivore diets can be studied using controlled feeding trials (or cafeteria tests). With some wildlife, such as large herbivores, it is impractical to empirically verify these findings, because it requires visually observing animals in forested environments, which can disturb them from their natural behaviors. Yet, in field-based cafeteria trials it is nearly impossible to differentiate selection between herbivore species that forage on similar plants and make very similar bite marks. However, during browsing ungulates leave saliva residue which includes some buccal cells and DNA that can be extracted for species identification. Here we used a newly developed eDNA-based method (biteDNA) to test the browsing preferences of four sympatric ungulate species in the wild. Overall, food preferences varied between species, but all species strongly preferred deciduous over coniferous species. Our method allows the study of plant-animal interactions in multispecies assemblages at very fine detail.

  16. Interactively Browsing NASA's EOS Imagery in Full Resolution

    Science.gov (United States)

    Boller, R. A.; Joshi, T.; Schmaltz, J. E.; Ilavajhala, S.; Davies, D.; Murphy, K. J.

    2012-12-01

    Worldview is a new tool designed to interactively browse full-resolution imagery from NASA's fleet of Earth Observing System (EOS) satellites. It is web-based and developed using open standards (JavaScript, CSS, HTML) for cross-platform compatibility. It addresses growing user demands for access to full-resolution imagery by providing a responsive, interactive interface with global coverage, no artificial boundaries, and views in geographic and polar projections. Currently tailored to the near real-time community, Worldview enables the rapid evaluation and comparison of imagery related to such application areas as fires, floods, and air quality. It is supported by the Global Imagery Browse Services (GIBS), a system that continuously ingests, mosaics, and serves approximately 21GB of imagery daily. This imagery spans over 50 data products that are available within three hours of observation from instruments aboard Terra, Aqua, and Aura. The GIBS image archive began in May 2012 and will have published approximately 4.4TB of imagery as of December 2012. Worldview facilitates rapid access to this archive and is supplemented by socioeconomic data layers from the Socioeconomic Data and Applications Center (SEDAC), including products such as population density and economic risk from cyclones. Future plans include the accessibility of additional products that cover the entire Terra/MODIS and Aqua/MODIS missions (>150TB) and the ability to download the underlying science data of the onscreen imagery.

  17. Advanced 3-D analysis, client-server systems, and cloud computing—Integration of cardiovascular imaging data into clinical workflows of transcatheter aortic valve replacement

    Science.gov (United States)

    Zimmermann, Mathis; Falkner, Juergen

    2013-01-01

    Degenerative aortic stenosis is highly prevalent in the aging populations of industrialized countries and is associated with poor prognosis. Surgical valve replacement has been the only established treatment with documented improvement of long-term outcome. However, many of the older patients with aortic stenosis (AS) are high-risk or ineligible for surgery. For these patients, transcatheter aortic valve replacement (TAVR) has emerged as a treatment alternative. The TAVR procedure is characterized by a lack of visualization of the operative field. Therefore, pre- and intra-procedural imaging is critical for patient selection, pre-procedural planning, and intra-operative decision-making. Incremental to conventional angiography and 2-D echocardiography, multidetector computed tomography (CT) has assumed an important role before TAVR. The analysis of 3-D CT data requires extensive post-processing during direct interaction with the dataset, using advance analysis software. Organization and storage of the data according to complex clinical workflows and sharing of image information have become a critical part of these novel treatment approaches. Optimally, the data are integrated into a comprehensive image data file accessible to multiple groups of practitioners across the hospital. This creates new challenges for data management requiring a complex IT infrastructure, spanning across multiple locations, but is increasingly achieved with client-server solutions and private cloud technology. This article describes the challenges and opportunities created by the increased amount of patient-specific imaging data in the context of TAVR. PMID:24282750

  18. Advanced 3-D analysis, client-server systems, and cloud computing-Integration of cardiovascular imaging data into clinical workflows of transcatheter aortic valve replacement.

    Science.gov (United States)

    Schoenhagen, Paul; Zimmermann, Mathis; Falkner, Juergen

    2013-06-01

    Degenerative aortic stenosis is highly prevalent in the aging populations of industrialized countries and is associated with poor prognosis. Surgical valve replacement has been the only established treatment with documented improvement of long-term outcome. However, many of the older patients with aortic stenosis (AS) are high-risk or ineligible for surgery. For these patients, transcatheter aortic valve replacement (TAVR) has emerged as a treatment alternative. The TAVR procedure is characterized by a lack of visualization of the operative field. Therefore, pre- and intra-procedural imaging is critical for patient selection, pre-procedural planning, and intra-operative decision-making. Incremental to conventional angiography and 2-D echocardiography, multidetector computed tomography (CT) has assumed an important role before TAVR. The analysis of 3-D CT data requires extensive post-processing during direct interaction with the dataset, using advance analysis software. Organization and storage of the data according to complex clinical workflows and sharing of image information have become a critical part of these novel treatment approaches. Optimally, the data are integrated into a comprehensive image data file accessible to multiple groups of practitioners across the hospital. This creates new challenges for data management requiring a complex IT infrastructure, spanning across multiple locations, but is increasingly achieved with client-server solutions and private cloud technology. This article describes the challenges and opportunities created by the increased amount of patient-specific imaging data in the context of TAVR.

  19. The UMLS Knowledge Source server.

    Science.gov (United States)

    McCray, A T; Razi, A

    1995-01-01

    The UMLS Knowledge Source server is an evolving tool for accessing information stored in the UMLS Knowledge Sources. The system architecture is based on the client-server paradigm wherein remote site users send their requests to a centrally managed server at the U.S. National Library of Medicine. The client programs can run on platforms supporting the TCP/IP communication protocol. Access to the system is provided through a command-line interface and through an Application Programming Interface.

  20. Mastering Microsoft Exchange Server 2013

    CERN Document Server

    Elfassy, David

    2013-01-01

    The bestselling guide to Exchange Server, fully updated for the newest version Microsoft Exchange Server 2013 is touted as a solution for lowering the total cost of ownership, whether deployed on-premises or in the cloud. Like the earlier editions, this comprehensive guide covers every aspect of installing, configuring, and managing this multifaceted collaboration system. It offers Windows systems administrators and consultants a complete tutorial and reference, ideal for anyone installing Exchange Server for the first time or those migrating from an earlier Exchange Server version.Microsoft

  1. Overview of Ontology Servers Research

    Directory of Open Access Journals (Sweden)

    Robert M. Colomb

    2007-06-01

    Full Text Available An ontology is increasingly becoming an essential tool for solving problems in many research areas. An ontology is a complex information object; it can contain millions of concepts in complex relationships. When we want to manage complex information objects, we generally turn to information systems technology. An information system intended to manage an ontology is called an ontology server. Ontology server technology is, at the time of writing, quite immature. Therefore, this paper reviews and compares the main ontology servers that have been reported in the literature. As a result, we point out several research questions related to server technology.

  2. Microsoft Windows Server Administration Essentials

    CERN Document Server

    Carpenter, Tom

    2011-01-01

    The core concepts and technologies you need to administer a Windows Server OS Administering a Windows operating system (OS) can be a difficult topic to grasp, particularly if you are new to the field of IT. This full-color resource serves as an approachable introduction to understanding how to install a server, the various roles of a server, and how server performance and maintenance impacts a network. With a special focus placed on the new Microsoft Technology Associate (MTA) certificate, the straightforward, easy-to-understand tone is ideal for anyone new to computer administration looking t

  3. SPEER-SERVER: a web server for prediction of protein specificity determining sites

    Science.gov (United States)

    Chakraborty, Abhijit; Mandloi, Sapan; Lanczycki, Christopher J.; Panchenko, Anna R.; Chakrabarti, Saikat

    2012-01-01

    Sites that show specific conservation patterns within subsets of proteins in a protein family are likely to be involved in the development of functional specificity. These sites, generally termed specificity determining sites (SDS), might play a crucial role in binding to a specific substrate or proteins. Identification of SDS through experimental techniques is a slow, difficult and tedious job. Hence, it is very important to develop efficient computational methods that can more expediently identify SDS. Herein, we present Specificity prediction using amino acids’ Properties, Entropy and Evolution Rate (SPEER)-SERVER, a web server that predicts SDS by analyzing quantitative measures of the conservation patterns of protein sites based on their physico-chemical properties and the heterogeneity of evolutionary changes between and within the protein subfamilies. This web server provides an improved representation of results, adds useful input and output options and integrates a wide range of analysis and data visualization tools when compared with the original standalone version of the SPEER algorithm. Extensive benchmarking finds that SPEER-SERVER exhibits sensitivity and precision performance that, on average, meets or exceeds that of other currently available methods. SPEER-SERVER is available at http://www.hpppi.iicb.res.in/ss/. PMID:22689646

  4. How happy is your web browsing? A model to quantify satisfaction of an Internet user searching for desired information

    Science.gov (United States)

    Banerji, Anirban; Magarkar, Aniket

    2012-09-01

    We feel happy when web browsing operations provide us with necessary information; otherwise, we feel bitter. How can this happiness (or bitterness) be measured? How does the profile of happiness grow and decay during the course of web browsing? We propose a probabilistic framework that models the evolution of user satisfaction, on top of his/her continuous frustration at not finding the required information. It is found that the cumulative satisfaction profile of a web-searching individual can be modeled effectively as the sum of a random number of random terms, where each term is a mutually independent random variable originating from a 'memoryless' Poisson flow. Evolution of satisfaction over the entire time interval of a user's browsing was modeled using auto-correlation analysis. A utilitarian marker, whose magnitude greater than unity describes happy web-searching operations, and an empirical limit that connects the user's satisfaction with his frustration level, are proposed as well. The presence of pertinent information in the very first page of a website and the magnitude of the decay parameter of user satisfaction (frustration, irritation etc.) are found to be two key aspects that dominate the web user's psychology. The proposed model employed different combinations of the decay parameter, searching time and number of helpful websites. The obtained results are found to match the results from three real-life case studies.
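    In the notation suggested by the abstract (a sketch of the model class, not the authors' exact equations), the cumulative satisfaction is a compound Poisson sum:

```latex
% Cumulative satisfaction after browsing for time t: a random number of
% independent satisfaction increments arriving as a memoryless Poisson flow.
S(t) = \sum_{i=1}^{N(t)} X_i, \qquad N(t) \sim \mathrm{Poisson}(\lambda t),
\qquad X_i \ \text{i.i.d. and independent of } N(t),
\qquad \mathbb{E}[S(t)] = \lambda t \, \mathbb{E}[X_1].
```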

  5. SQL Server Integration Services

    CERN Document Server

    Hamilton, Bill

    2007-01-01

    SQL Server 2005 Integration Services (SSIS) lets you build high-performance data integration solutions. SSIS solutions wrap sophisticated workflows around tasks that extract, transform, and load (ETL) data from and to a wide variety of data sources. This Short Cut begins with an overview of key SSIS concepts, capabilities, standard workflow and ETL elements, the development environment, execution, deployment, and migration from Data Transformation Services (DTS). Next, you'll see how to apply the concepts you've learned through hands-on examples of common integration scenarios. Once you've

  6. Sending servers to Morocco

    CERN Multimedia

    Joannah Caborn Wengler

    2012-01-01

    Did you know that computer centres are like people? They breathe air in and out like a person, they have to be kept at the right temperature, and they can even be organ donors. As part of a regular cycle of equipment renewal, the CERN Computer Centre has just donated 161 retired servers to universities in Morocco.   Prof. Abdeslam Hoummada and CERN DG Rolf Heuer seeing off the servers on the beginning of their journey to Morocco. “Many people don’t realise, but the Computer Centre is like a living thing. You don’t just install equipment and it runs forever. We’re continually replacing machines, broken parts and improving things like the cooling.” Wayne Salter, Leader of the IT Computing Facilities Group, watches over the Computer Centre a bit like a nurse monitoring a patient’s temperature, especially since new international recommendations for computer centre environmental conditions were released. “A new international s...

  7. Open access to sequence: Browsing the Pichia pastoris genome

    Directory of Open Access Journals (Sweden)

    Graf Alexandra

    2009-10-01

    Full Text Available Abstract The first genome sequences of the important yeast protein production host Pichia pastoris have been released into the public domain this spring. In order to provide the scientific community easy and versatile access to the sequence, two web-sites have been installed as a resource for genomic sequence, gene and protein information for P. pastoris: A GBrowse based genome browser was set up at http://www.pichiagenome.org and a genome portal with gene annotation and browsing functionality at http://bioinformatics.psb.ugent.be/webtools/bogas. Both websites are offering information on gene annotation and function, regulation and structure. In addition, a WiKi based platform allows all users to create additional information on genes, proteins, physiology and other items of P. pastoris research, so that the Pichia community can benefit from exchange of knowledge, data and materials.

  8. Analysis of a Server Access Failure Based on the LVS + Keepalived Architecture%一例基于LVS + Keepalived 架构的服务器访问故障分析

    Institute of Scientific and Technical Information of China (English)

    沈平; 潘志安; 袁瑛

    2013-01-01

    This article describes how an access failure affecting some users of a server under an LVS + Keepalived architecture was resolved. Through comparative analysis, tracing and troubleshooting, and checking the Keepalived configuration, the problem of some users suddenly experiencing access failures was solved; the author offers this troubleshooting approach for reference.%Under an LVS + Keepalived architecture, this article describes the resolution of a case in which some users of a server experienced access failures. Through comparative analysis, tracing and troubleshooting, and inspection of the Keepalived configuration, the problem of some users suddenly encountering access failures was solved; the author provides an approach to the solution for readers' reference.

  9. Eland browsing of Grewia occidentalis in semi-arid shrubland: the influence of bush clumps

    Directory of Open Access Journals (Sweden)

    L.H. Watson

    1999-01-01

    Full Text Available Grewia occidentalis plants in the study area generally occurred in bush clumps with other shrub species. Grewia occidentalis commonly occurred with Diospyros austro-africana, Rhus longispina and Rhus pollens (nurse shrubs), but seldom with Acacia karroo and Lycium cinereum (non-nurse shrubs). Eland browsed G. occidentalis plants at higher levels than other shrub species, but browsing was not evenly spread across all plants. Grewia occidentalis plants associated with nurse shrubs had lower levels of browsing than those growing alone and those growing with non-nurse shrubs, while G. occidentalis plants in the centre of nurse shrubs experienced the lowest levels of browsing. The latter group of plants also produced the most fruit. Eland browsing is considered an important factor determining the distribution of G. occidentalis plants in the study area, while the presence of nurse shrubs is considered essential for the establishment and maintenance of the G. occidentalis population in the study area.

  10. Topic Browsing for Research Papers with Hierarchical Latent Tree Analysis

    OpenAIRE

    Poon, Leonard K. M.; Zhang, Nevin L.

    2016-01-01

    Academic researchers often need to face with a large collection of research papers in the literature. This problem may be even worse for postgraduate students who are new to a field and may not know where to start. To address this problem, we have developed an online catalog of research papers where the papers have been automatically categorized by a topic model. The catalog contains 7719 papers from the proceedings of two artificial intelligence conferences from 2000 to 2015. Rather than the...

  11. Design and Implementation of a Computation Server for Optimization with Application to the Analysis of Critical Infrastructure

    Science.gov (United States)

    2013-06-01

    Visual Basic for Applications (VBA) is often used for building a user interface and automating the model runs. The choice of this application is... performance of decision support tools for Operations Analysis, there remains a persistent challenge in the deployment and use of sophisticated models by...

  12. Servers in SCADA applications

    Energy Technology Data Exchange (ETDEWEB)

    Marcuse, J.; Menz, B.; Payne, J.

    1995-12-31

    The rise of computerized data acquisition, storage and reporting systems has been driven by industry's demand for advanced troubleshooting aids and continual, measurable process and product quality improvements. As US companies entered into global competition they discovered ever stiffer customer requirements. These requirements were especially stiff in the auto industry, where the Japanese set a very high standard using SPC and other world-class manufacturing methods. The architecture of these SCADA (supervisory control and data acquisition) systems has gone through several evolutionary stages over the last few years. This paper examines this evolution from the mainframe computer architecture used in the 1970s, to the multi-tiered scheme of the 1980s, to the client-server architecture emerging today.

  13. Host Integration Server 2004

    Institute of Scientific and Technical Information of China (English)

    Paul Thurrott; 杨岩

    2005-01-01

    Microsoft's Host Integration Server (HIS) 2004 is a significant update to its IBM mainframe integration server, adding a number of important new features and improvements. Unlike most of Microsoft's interoperability products, HIS 2004 is designed for migration rather than pure integration; in practice it helps customers get more value out of their existing legacy platforms, in this case IBM mainframes and the iSeries (formerly AS/400) line of machines.

  14. Universal Fingerprinting Chip Server

    Science.gov (United States)

    Casique-Almazán, Janet; Larios-Serrato, Violeta; Olguín-Ruíz, Gabriela Edith; Sánchez-Vallejo, Carlos Javier; Maldonado-Rodríguez, Rogelio; Méndez-Tenorio, Alfonso

    2012-01-01

    The Virtual Hybridization approach predicts the most probable hybridization sites across a target nucleic acid of known sequence, including both perfect and mismatched pairings. Potential hybridization sites, having a user-defined minimum number of bases that are paired with the oligonucleotide probe, are first identified. Free energy values are then evaluated for each potential hybridization site; if a site has a calculated free energy equal to or more negative than a user-defined cut-off value, it is considered a site with a high probability of hybridization. The Universal Fingerprinting Chip Applications Server contains the software for visualizing predicted hybridization patterns, which yields a simulated hybridization fingerprint that can be compared with experimentally derived fingerprints or with a virtual fingerprint arising from a different sample. Availability http://bioinformatica.homelinux.org/UFCVH/ PMID:22829736
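    The two-step filtering idea described above (a minimum number of paired bases, then a free-energy cut-off) can be illustrated with a short sketch. The per-base energies, sequences and thresholds below are invented for illustration; a real implementation would use nearest-neighbour thermodynamic parameters rather than a flat per-match score.

```python
# Minimal sketch of the virtual-hybridization filtering idea: slide a probe
# along a target, keep candidate sites with enough paired bases, then apply
# a free-energy cut-off. Energies are illustrative, not real parameters.

MATCH_DG = -1.5      # toy stabilisation per paired base (kcal/mol, assumed)
MISMATCH_DG = 0.5    # toy penalty per mismatched base (kcal/mol, assumed)

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def hybridization_sites(target, probe, min_matches=12, dg_cutoff=-15.0):
    """Return (position, matches, dG) for probable hybridization sites."""
    sites = []
    window = len(probe)
    for pos in range(len(target) - window + 1):
        segment = target[pos:pos + window]
        matches = sum(1 for t, p in zip(segment, probe)
                      if COMPLEMENT.get(p) == t)
        if matches < min_matches:          # step 1: enough paired bases
            continue
        dg = matches * MATCH_DG + (window - matches) * MISMATCH_DG
        if dg <= dg_cutoff:                # step 2: free-energy cut-off
            sites.append((pos, matches, dg))
    return sites

# Made-up target that contains the perfect complement of the 20-mer probe.
probe = "TAGCTAGCATCGATCGGCTA"
target = "GGGG" + "ATCGATCGTAGCTAGCCGAT" + "CCCCAATTGGCC"
print(hybridization_sites(target, probe, min_matches=14, dg_cutoff=-10.0))
```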

  15. WMS Server 2.0

    Science.gov (United States)

    Plesea, Lucian; Wood, James F.

    2012-01-01

    This software is a simple, yet flexible server of raster map products, compliant with the Open Geospatial Consortium (OGC) Web Map Service (WMS) 1.1.1 protocol. The server is a full implementation of the OGC WMS 1.1.1 as a fastCGI client and using Geospatial Data Abstraction Library (GDAL) for data access. The server can operate in a proxy mode, where all or part of the WMS requests are done on a back server. The server has explicit support for a colocated tiled WMS, including rapid response of black (no-data) requests. It generates JPEG and PNG images, including 16-bit PNG. The GDAL back-end support allows great flexibility on the data access. The server is a port to a Linux/GDAL platform from the original IRIX/IL platform. It is simpler to configure and use, and depending on the storage format used, it has better performance than other available implementations. The WMS server 2.0 is a high-performance WMS implementation due to the fastCGI architecture. The use of GDAL data back end allows for great flexibility. The configuration is relatively simple, based on a single XML file. It provides scaling and cropping, as well as blending of multiple layers based on layer transparency.
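    As a concrete illustration of the protocol this server implements, the sketch below issues a minimal OGC WMS 1.1.1 GetMap request from Python. The endpoint URL and layer name are placeholders rather than values from the record; any WMS 1.1.1-compliant server accepts the same parameter set.

```python
# Minimal WMS 1.1.1 GetMap request, acting as a client of a server like the
# one described above. Endpoint and layer name are hypothetical.
import requests

WMS_ENDPOINT = "https://example.org/wms"    # placeholder server URL

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "global_mosaic",              # hypothetical layer name
    "STYLES": "",
    "SRS": "EPSG:4326",                     # 1.1.1 uses SRS (1.3.0 uses CRS)
    "BBOX": "-180,-90,180,90",              # minx,miny,maxx,maxy
    "WIDTH": 1024,
    "HEIGHT": 512,
    "FORMAT": "image/jpeg",
    "TRANSPARENT": "FALSE",
}

resp = requests.get(WMS_ENDPOINT, params=params, timeout=30)
resp.raise_for_status()
with open("map.jpg", "wb") as fh:
    fh.write(resp.content)
```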

  16. Understanding Academic Information Seeking Habits through Analysis of Web Server Log Files: The Case of the Teachers College Library Website

    Science.gov (United States)

    Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary

    2009-01-01

    Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)
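    The kind of transaction-log analysis described above can be prototyped in a few lines. The sketch below assumes an NCSA combined-format access log and a hypothetical file name; it tallies the most requested pages and the distribution of hits over the hours of the day.

```python
# Sketch of basic usage-pattern analysis over a web server access log.
# Assumes the common NCSA "combined" format; the file name is hypothetical.
import re
from collections import Counter

LOG_LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

page_hits = Counter()
hourly_hits = Counter()

with open("access.log", encoding="utf-8", errors="replace") as fh:
    for line in fh:
        m = LOG_LINE.match(line)
        if not m or m.group("status") != "200":
            continue
        page_hits[m.group("path")] += 1
        # timestamps look like 10/Oct/2009:13:55:36 -0400 -> take the hour
        hourly_hits[m.group("time").split(":")[1]] += 1

print("Top pages:", page_hits.most_common(10))
print("Hits per hour of day:", sorted(hourly_hits.items()))
```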

  17. Chemical composition and digestibility of some browse plant species collected from Algerian arid rangelands

    Energy Technology Data Exchange (ETDEWEB)

    Boufennara, S.; Lopez, S.; Boussebouna, H.; Bodas, R.; Bouazza, L.

    2012-11-01

    Many wild browse and bush species are undervalued mainly because of insufficient knowledge about their potential feeding value. The objective was to evaluate some nutritional attributes of various Algerian browse and shrub species (Atriplex halimus, Artemisia campestris, Artemisia herba-alba, Astragalus gombiformis, Calobota saharae, Retama raetam, Stipagrostis pungens, Lygeum spartum and Stipa tenacissima). Chemical composition, phenol and tannin concentrations, in vitro digestibility, in vitro gas production kinetics, an in vitro bio-assay for assessment of tannins using buffered rumen fluid, and in situ disappearance of the edible parts of the plants (leaves, thin twigs and flowers) were determined. In general, protein content in dicotyledon species was always greater than in monocotyledon grasses, the latter showing higher neutral and acid detergent fibre and lower lignin contents than dicots. The tannin concentrations varied considerably between species, but in general the plants investigated in this study had low tannin contents (except for Artemisia spp. and S. tenacissima). Monocots showed lower in vitro and in situ digestibilities, fermentation rate, cumulative gas production and extent of degradation than dicot species. The plants were clustered by principal components analysis into two groups: poor-quality grasses and the most digestible dicot species. Chemical composition (neutral detergent fibre and protein) and digestibility were the main influential variables determining the ranking. In conclusion, A. halimus, A. campestris, A. herba-alba and A. gombiformis can be considered of greater nutritional value than the highly fibrous and poorly digestible grasses (S. pungens, L. spartum and S. tenacissima), which should be considered emergency roughages. (Author) 46 refs.

  18. SQL Server 2014 development essentials

    CERN Document Server

    Masood-Al-Farooq, Basit A

    2014-01-01

    This book is an easy-to-follow, comprehensive guide that is full of hands-on examples, which you can follow to successfully design, build, and deploy mission-critical database applications with SQL Server 2014. If you are a database developer, architect, or administrator who wants to learn how to design, implement, and deliver a successful database solution with SQL Server 2014, then this book is for you. Existing users of Microsoft SQL Server will also benefit from this book as they will learn what's new in the latest version.

  19. Control of a heterogeneous two-server exponential queueing system

    Science.gov (United States)

    Larsen, R. L.; Agrawala, A. K.

    1983-01-01

    A dynamic control policy known as 'threshold queueing' is defined for scheduling customers from a Poisson source on a set of two exponential servers with dissimilar service rates. The slower server is invoked in response to instantaneous system loading as measured by the length of the queue of waiting customers. In a threshold queueing policy, a specific queue length is identified as a 'threshold,' beyond which the slower server is invoked. The slower server remains busy until it completes service on a customer and the queue length is less than its invocation threshold. Markov chain analysis is employed to analyze the performance of the threshold queueing policy and to develop optimality criteria. It is shown that probabilistic control is suboptimal to minimize the mean number of customers in the system. An approximation to the optimum policy is analyzed which is computationally simple and suffices for most operational applications.
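    A quick way to explore the threshold policy described above is a small event-driven simulation: Poisson arrivals, a fast and a slow exponential server, with the slow server engaged only while the waiting queue has reached the threshold. The rates, the threshold values and the rule that the fast server is always preferred are assumptions for illustration, not the paper's exact model.

```python
# Event-driven simulation of a "threshold queueing" policy with two
# heterogeneous exponential servers. All numeric parameters are illustrative.
import random

def simulate_threshold_queue(lam=0.9, mu_fast=1.0, mu_slow=0.4,
                             threshold=3, horizon=100_000, seed=1):
    """Time-average number of customers in the system under the policy."""
    rng = random.Random(seed)
    t, queue = 0.0, 0            # time, customers waiting (not in service)
    fast_busy = slow_busy = False
    area = 0.0                   # time-integral of number in system

    while t < horizon:
        rates, events = [lam], ["arrival"]
        if fast_busy:
            rates.append(mu_fast); events.append("fast_done")
        if slow_busy:
            rates.append(mu_slow); events.append("slow_done")
        dt = rng.expovariate(sum(rates))
        area += (queue + fast_busy + slow_busy) * dt
        t += dt
        event = rng.choices(events, weights=rates)[0]

        if event == "arrival":
            queue += 1
        elif event == "fast_done":
            fast_busy = False
        else:
            slow_busy = False

        # dispatch waiting customers: fast server first, slow server only
        # while the queue has reached its invocation threshold
        if queue and not fast_busy:
            queue -= 1; fast_busy = True
        if queue >= threshold and not slow_busy:
            queue -= 1; slow_busy = True

    return area / t

for T in (1, 2, 3, 5):
    print(f"threshold={T}: mean number in system ~ "
          f"{simulate_threshold_queue(threshold=T):.2f}")
```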

  20. Research on the Remote Data Collection Based SQL Server

    Institute of Scientific and Technical Information of China (English)

    QI Xiangyang; LIN Shuzhong; CUI Hui; WANG Jiangfeng; SUN Huilai

    2006-01-01

    The remote data collection system based on SQL Server was developed with Visual C++ and SQL Server database technology, adopting a client/server architecture. The system uses the ADO database access technology to implement the server's communication procedure, and the old data of the corresponding memory units in the database are updated in real time with new data gathered from the PLC through the serial port. The client uses network and database technology, through query procedures, to access the data in the database. In this way a large amount of relevant data on production line operation was obtained, and the goal of understanding the operating conditions of the production line was achieved through analysis of these data. The system has been successfully debugged in experiments.

  1. A Capacity Supply Model for Virtualized Servers

    Directory of Open Access Journals (Sweden)

    Alexander PINNOW

    2009-01-01

    Full Text Available This paper deals with determining the capacity supply for virtualized servers. First, a server is modeled as a queue based on a Markov chain. Then, the effect of server virtualization on the capacity supply will be analyzed with the distribution function of the server load.
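    A minimal sketch of the modelling idea, assuming the server is represented as an M/M/1/K birth-death Markov chain: the stationary probabilities give the distribution function of the server load, from which a capacity figure can be derived for a target quantile. All rates and thresholds below are illustrative values, not figures from the paper.

```python
# Toy capacity-supply calculation for a server modelled as an M/M/1/K queue.

def mm1k_distribution(lam, mu, K):
    """Stationary probabilities p[0..K] of an M/M/1/K queue."""
    rho = lam / mu
    weights = [rho ** n for n in range(K + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def capacity_for_quantile(lam, K, quantile=0.95, max_backlog=10):
    """Smallest service rate mu with P(backlog <= max_backlog) >= quantile."""
    mu = lam                       # start at rho = 1 and increase
    while True:
        p = mm1k_distribution(lam, mu, K)
        if sum(p[: max_backlog + 1]) >= quantile:
            return mu
        mu *= 1.05

lam, K = 80.0, 200                 # request rate (1/s) and buffer size, assumed
print("P(n=0..5):", [round(x, 4) for x in mm1k_distribution(lam, 100.0, K)[:6]])
print("required capacity (req/s):", round(capacity_for_quantile(lam, K), 1))
```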

  2. Windows Server 2012 R2 administrator cookbook

    CERN Document Server

    Krause, Jordan

    2015-01-01

    This book is intended for system administrators and IT professionals with experience in Windows Server 2008 or Windows Server 2012 environments who are looking to acquire the skills and knowledge necessary to manage and maintain the core infrastructure required for a Windows Server 2012 and Windows Server 2012 R2 environment.

  3. Intelligent Browsing System for E-Government Based on Domain Ontology Base%基于领域本体库的电子政务智能浏览系统

    Institute of Scientific and Technical Information of China (English)

    葛新; 董朝阳

    2012-01-01

    In order to enable users to use e-government information resources rapidly and intelligently, data from different sources and with different structures are integrated and processed, and an intelligent browsing system for querying and retrieving business-domain topics is provided. The architecture of the intelligent browsing system for e-government is presented, and its main functions, such as data discovery, data storage, data mining and analysis, the intelligent browsing server and the intelligent browsing client, are analyzed. The key technology of the system, namely intelligent browsing based on a domain ontology base, is studied in depth: the method for building the domain ontology instance base is given, the workflow of intelligent browsing is analyzed, and key activities such as the construction of the domain ontology, the reasoning mechanism and intelligent browsing itself are examined.

  4. Mac OS X Lion Server For Dummies

    CERN Document Server

    Rizzo, John

    2011-01-01

    The perfect guide to help administrators set up Apple's Mac OS X Lion Server With the overwhelming popularity of the iPhone and iPad, more Macs are appearing in corporate settings. The newest version of Mac Server is the ideal way to administer a Mac network. This friendly guide explains to both Windows and Mac administrators how to set up and configure the server, including services such as iCal Server, Podcast Producer, Wiki Server, Spotlight Server, iChat Server, File Sharing, Mail Services, and support for iPhone and iPad. It explains how to secure, administer, and troubleshoot the network

  5. PSSweb: protein structural statistics web server.

    Science.gov (United States)

    Gaillard, Thomas; Stote, Roland H; Dejaegere, Annick

    2016-07-01

    With the increasing number of protein structures available, there is a need for tools capable of automating the comparison of ensembles of structures, a common requirement in structural biology and bioinformatics. PSSweb is a web server for protein structural statistics. It takes as input an ensemble of PDB files of protein structures, performs a multiple sequence alignment and computes structural statistics for each position of the alignment. Different optional functionalities are proposed: structure superposition, Cartesian coordinate statistics, dihedral angle calculation and statistics, and a cluster analysis based on dihedral angles. An interactive report is generated, containing a summary of the results, tables, figures and 3D visualization of superposed structures. The server is available at http://pssweb.org.

  6. The metagenomics RAST server – a public resource for the automatic phylogenetic and functional analysis of metagenomes

    Directory of Open Access Journals (Sweden)

    Stevens R

    2008-09-01

    Full Text Available Abstract Background Random community genomes (metagenomes) are now commonly used to study microbes in different environments. Over the past few years, the major challenge associated with metagenomics shifted from generating to analyzing sequences. High-throughput, low-cost next-generation sequencing has provided access to metagenomics to a wide range of researchers. Results A high-throughput pipeline has been constructed to provide high-performance computing to all researchers interested in using metagenomics. The pipeline produces automated functional assignments of sequences in the metagenome by comparing both protein and nucleotide databases. Phylogenetic and functional summaries of the metagenomes are generated, and tools for comparative metagenomics are incorporated into the standard views. User access is controlled to ensure data privacy, but the collaborative environment underpinning the service provides a framework for sharing datasets between multiple users. In the metagenomics RAST, all users retain full control of their data, and everything is available for download in a variety of formats. Conclusion The open-source metagenomics RAST service provides a new paradigm for the annotation and analysis of metagenomes. With built-in support for multiple data sources and a back end that houses abstract data types, the metagenomics RAST is stable, extensible, and freely available to all researchers. This service has removed one of the primary bottlenecks in metagenome sequence analysis – the availability of high-performance computing for annotating the data. http://metagenomics.nmpdr.org

  7. Mastering Citrix XenServer

    CERN Document Server

    Reed, Martez

    2014-01-01

    If you are an administrator who is looking to gain a greater understanding of how to design and implement a virtualization solution based on Citrix® XenServer®, then this book is for you. The book will serve as an excellent resource for those who are already familiar with other virtualization platforms, such as Microsoft Hyper-V or VMware vSphere.The book assumes that you have a good working knowledge of servers, networking, and storage technologies.

  8. Building server capabilities in China

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi; Slepniov, Dmitrij; Wæhrens, Brian Vejrum;

    2012-01-01

    The purpose of this paper is to further our understanding of multinational companies building server capabilities in China. The paper is based on the cases of two western companies with operations in China. The findings highlight a number of common patterns in 1) the managerial challenges related to the development of server capabilities at offshore sites, and 2) the means by which these challenges can be handled.

  9. RAPID INDUCTION OF MULTIPLE TAXONOMIES FOR ENHANCED FACETED TEXT BROWSING

    Directory of Open Access Journals (Sweden)

    Lawrence Muchemi

    2016-07-01

    Full Text Available In this paper we present and compare two methodologies for rapidly inducing multiple subject-specific taxonomies from crawled data. The first method involves a sentence-level word co-occurrence frequency method for building the taxonomy, while the second involves the bootstrapping of a Word2Vec based algorithm with a directed crawler. We exploit the multilingual open-content directory of the World Wide Web, DMOZ, to seed the crawl, and the domain name to direct the crawl. This domain corpus is then input to our algorithm that can automatically induce taxonomies. The induced taxonomies provide hierarchical semantic dimensions for the purposes of faceted browsing. As part of an ongoing personal semantics project, we applied the resulting taxonomies to personal social media data (Twitter, Gmail, Facebook, Instagram, Flickr) with the objective of enhancing an individual's exploration of their personal information through faceted searching. We also perform a comprehensive corpus-based evaluation of the algorithms based on many datasets drawn from the fields of medicine (diseases) and leisure (hobbies) and show that the induced taxonomies are of high quality.
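    The sentence-level co-occurrence idea can be approximated with a classic subsumption heuristic: a term is placed above another when it appears in most of the sentences containing the other term, but not vice versa. The sketch below uses this simplified stand-in on a toy corpus; it is not the authors' algorithm, only an illustration of how co-occurrence counts can yield a hierarchy for faceted browsing.

```python
# Simplified taxonomy induction from sentence-level co-occurrence counts,
# using a subsumption heuristic. The corpus and threshold are toy values.
from collections import defaultdict
from itertools import combinations

sentences = [
    {"medicine", "disease", "diabetes"},
    {"medicine", "disease", "asthma"},
    {"medicine", "diabetes", "insulin"},
    {"disease", "asthma", "inhaler"},
    {"medicine", "disease"},
]

sent_count = defaultdict(int)          # sentences containing a term
pair_count = defaultdict(int)          # sentences containing both terms
for sent in sentences:
    for term in sent:
        sent_count[term] += 1
    for a, b in combinations(sorted(sent), 2):
        pair_count[(a, b)] += 1
        pair_count[(b, a)] += 1

def parent_of(term, threshold=0.7):
    """Most general co-occurring term that 'subsumes' this one, if any."""
    best = None
    for other in sent_count:
        if other == term:
            continue
        p_other_given_term = pair_count[(other, term)] / sent_count[term]
        p_term_given_other = pair_count[(term, other)] / sent_count[other]
        if p_other_given_term >= threshold and p_term_given_other < p_other_given_term:
            if best is None or sent_count[other] > sent_count[best]:
                best = other
    return best

for term in sorted(sent_count):
    print(f"{term:10s} -> {parent_of(term)}")
```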

  10. TBI server: a web server for predicting ion effects in RNA folding.

    Directory of Open Access Journals (Sweden)

    Yuhong Zhu

    Full Text Available Metal ions play a critical role in the stabilization of RNA structures. Therefore, accurate prediction of the ion effects in RNA folding can have a far-reaching impact on our understanding of RNA structure and function. Multivalent ions, especially Mg²⁺, are essential for RNA tertiary structure formation. These ions can possibly become strongly correlated in the close vicinity of the RNA surface. Most of the currently available software packages, which have widespread success in predicting ion effects in biomolecular systems, however, do not explicitly account for the ion correlation effect. Therefore, it is important to develop a software package/web server for the prediction of ion electrostatics in RNA folding that includes ion correlation effects. The TBI web server (http://rna.physics.missouri.edu/tbi_index.html) provides predictions for the total electrostatic free energy, the different free energy components, and the mean number and the most probable distributions of the bound ions. A novel feature of the TBI server is its ability to account for ion correlation and ion distribution fluctuation effects. By accounting for the ion correlation and fluctuation effects, the TBI server is a unique online tool for computing ion-mediated electrostatic properties for given RNA structures. The results can provide important data for in-depth analysis of ion effects in RNA folding, including the ion dependence of folding stability, ion uptake in the folding process, and the interplay between the different energetic components.

  11. Optimizing Parallel Access to the BaBar Database System Using CORBA Servers

    Institute of Scientific and Technical Information of China (English)

    JacekBecla; IgorGaponenko

    2001-01-01

    The BaBar Experiment collected around 20 TB of data during its first 6 months of running. Now, after 18 months, the data size exceeds 300 TB, and according to prognosis, this is a small fraction of the size of the data coming in the next few months. In order to keep up with the data, significant effort was put into tuning the database system. It led to great performance improvements, as well as to inevitable system expansion - 450 simultaneous processing nodes alone used for data reconstruction. It is believed that further growth beyond 600 nodes will happen soon. In such an environment, many complex operations are executed simultaneously on hundreds of machines, putting a huge load on data servers and increasing network traffic. Introducing two CORBA servers halved startup time and dramatically offloaded database servers: data servers as well as lock servers. The paper describes details of the design and implementation of the two servers recently introduced in the BaBar system: the conditions OID server and the Clustering Server. The first experience of using these servers is discussed. A discussion of a Collection Server for data analysis, currently being designed, is included.

  12. Location-aware gang graffiti acquisition and browsing on a mobile device

    Science.gov (United States)

    Parra, Albert; Boutin, Mireille; Delp, Edward J.

    2012-02-01

    In this paper we describe a mobile-based system that allows first responders to identify and track gang graffiti by combining the use of image analysis and location-based services. The gang graffiti image and metadata (geoposition, date and time), obtained automatically, are transferred to a server and uploaded to a database of graffiti images. The database can then be queried, with the matched results sent back to the mobile device, where the user can review the results and provide extra input to refine the information.
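    The capture-and-upload step described above might look like the following client-side sketch: the photo and its automatically collected metadata are posted to the server, which answers with candidate matches from the graffiti database. The endpoint URL, field names and response shape are all hypothetical.

```python
# Hypothetical sketch of uploading a graffiti photo plus metadata and
# reading back candidate matches. URL and field names are made up.
import json
import requests

metadata = {
    "latitude": 40.4259, "longitude": -86.9081,   # example geoposition
    "timestamp": "2012-02-15T14:32:00Z",
}

with open("graffiti.jpg", "rb") as image:
    resp = requests.post(
        "https://example.org/graffiti/upload",     # hypothetical endpoint
        files={"image": ("graffiti.jpg", image, "image/jpeg")},
        data={"metadata": json.dumps(metadata)},
        timeout=30,
    )

resp.raise_for_status()
for match in resp.json().get("matches", []):       # assumed response shape
    print(match["gang"], match["score"])
```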

  13. Determination of Browse Intake and Nutrient Digestibility of Grazing West African Dwarf (WAD Goats Fed Varying Levels of Gmelina arborea Leaves as Supplements in Delta State Nigeria

    Directory of Open Access Journals (Sweden)

    O. Okpara

    2014-04-01

    Full Text Available The research was carried out to assess the browse intake and nutrient digestibility of grazing West African Dwarf (WAD) goats fed varying levels of Gmelina arborea leaves as supplement. Gmelina arborea produces an appreciable amount of forage even at the peak of the dry season in the tropics, thereby ensuring an all-year-round supply of foliage and fodder. Thirty growing West African Dwarf (WAD) goats were used to determine the level of browse intake and nutrient digestibility by goats fed varying levels of Gmelina arborea leaves as supplement. The goats were randomly divided into five groups of six animals per group. Goats in Treatment A were fed commercial growers mash at 0.50 kg, Treatment B were fed 0.25 kg of Gmelina arborea, Treatment C were fed 0.50 kg of Gmelina arborea leaves, Treatment D 0.75 kg and Treatment E 1.00 kg of Gmelina arborea leaves. Data were collected for thirteen weeks on browse intake and nutrient digestibility. Chemical analysis showed significantly (p<0.05) [...] Pennisetum purpureum and growers mash. The values observed for growers mash for nitrogen free extract and ether extract were higher than those observed in Gmelina arborea leaves and elephant grass. Significant differences (p<0.05) existed for values recorded by the goats fed the different experimental diets. Goats fed Diet E had the highest (p<0.05) browse intake, higher than goats fed Diets D and C. Goats fed Diet B recorded the least significant (p<0.05) quantity of browse intake. Dry matter digestibility values as well as crude protein digestibility were appreciably high for all animals fed the experimental diets. The goats fed Diet E had the highest crude fibre digestibility, followed by goats fed Diet A and then Diets D, B and C respectively. Therefore, the study suggested 0.5 kg inclusion of Gmelina arborea leaves in the diet of grazing WAD goats as the optimum level for better performance.

  14. Server-side Filtering and Aggregation within a Distributed Environment

    Science.gov (United States)

    Currey, J. C.; Bartle, A.

    2015-12-01

    Intercalibration, validation, and data mining use cases require more efficient access to the massive volumes of observation data distributed across multiple agency data centers. The traditional paradigm of downloading large volumes of data to a centralized server or desktop computer for analysis is no longer viable. More analysis should be performed within the host data centers using server-side functions. Many comparative analysis tasks require far less than 1% of the available observation data. The Multi-Instrument Intercalibration (MIIC) Framework provides web services to find, match, filter, and aggregate multi-instrument observation data. Matching measurements from separate spacecraft in time, location, wavelength, and viewing geometry is a difficult task, especially when data are distributed across multiple agency data centers. Event prediction services identify near-coincident measurements with matched viewing geometries near orbit crossings using complex orbit propagation and spherical geometry calculations. The number and duration of event opportunities depend on orbit inclinations, altitude differences, and requested viewing conditions (e.g., day/night). Event observation information is passed to remote server-side functions to retrieve matched data. Data may be gridded, spatially convolved onto instantaneous fields of view, or spectrally resampled or convolved. Narrowband instruments are routinely compared to hyperspectral instruments such as AIRS and CrIS using relative spectral response (RSR) functions. Spectral convolution within server-side functions significantly reduces the amount of hyperspectral data needed by the client. This combination of intelligent selection and server-side processing significantly reduces network traffic and data to process on local servers. OPeNDAP is a mature networking middleware already deployed at many of the Earth science data centers. Custom OPeNDAP server-side functions that provide filtering, histogram analysis (1D
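    The spectral-convolution step mentioned above can be sketched in a few lines of NumPy: a hyperspectral spectrum is collapsed onto a narrowband channel by weighting it with the channel's relative spectral response. The spectrum and the Gaussian RSR below are synthetic; a real intercomparison would use the instrument's published RSR tables.

```python
# Sketch of convolving a hyperspectral spectrum with a relative spectral
# response (RSR) to simulate one narrowband channel. Data are synthetic.
import numpy as np

# Hyperspectral grid: radiance sampled every 1 nm from 600-700 nm (made up).
wavelengths = np.arange(600.0, 700.0, 1.0)
radiance = 0.8 + 0.1 * np.sin(wavelengths / 15.0)        # synthetic spectrum

# Narrowband channel: Gaussian RSR centred at 650 nm, 10 nm FWHM (assumed).
centre, fwhm = 650.0, 10.0
sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
rsr = np.exp(-0.5 * ((wavelengths - centre) / sigma) ** 2)

# Band-equivalent radiance = RSR-weighted mean of the hyperspectral radiance.
band_radiance = np.trapz(radiance * rsr, wavelengths) / np.trapz(rsr, wavelengths)
print(f"simulated narrowband radiance: {band_radiance:.4f}")
```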

  15. Windows Server 2012 vulnerabilities and security

    Directory of Open Access Journals (Sweden)

    Gabriel R. López

    2015-09-01

    Full Text Available This investigation analyses the history of the vulnerabilities of the base system Windows Server 2012, highlighting the most critical vulnerabilities reported every 4 months from its release until the date of the research. The vulnerabilities are organized by type based on the NIST classification. Next, given the official vulnerabilities of the system, the authors show how a critical vulnerability is treated by Microsoft in order to counter the security flaw. Then, the authors present the recommended security approaches for Windows Server 2012, which focus on the baseline software given by Microsoft; update, patch and change management; hardening practices; and the application of Active Directory Rights Management Services (AD RMS). AD RMS is considered an important feature since it is able to protect the system, even when it has been compromised, by using access lists at the document level. Finally, the investigation of the state of the art related to the security of Windows Server 2012 presents an analysis of solutions from third-party vendors offering security products for the base system studied here. The solution recommended by the authors features the security vendor Symantec, noting its successful features as well as characteristics that the authors consider may need to be improved in future versions of the security solution.

  16. The SDSS data archive server

    Energy Technology Data Exchange (ETDEWEB)

    Neilsen, Eric H., Jr.; /Fermilab

    2007-10-01

    data reduction pipeline is similar. Each pipeline deposits the results in a collection of files on disk. The Catalog Archive Server (CAS) provides an interface to a database of objects detected through the SDSS along with their properties and observational metadata. This serves the needs of most users, but some users require access to files produced by the pipelines. Some data, including the corrected frames (the pixel data itself corrected for instrumental signatures), the models for the point spread function, and an assortment of quality assurance plots, are not included in the database at all. Sometimes it is simply more convenient for a user to read data from existing files than to retrieve it using database queries. This is often the case, for example, when a user wants to download data for a significant fraction of the objects in the database. Users might need to perform analysis that requires more computing power than the CAS database servers can reasonably provide, and so need to download the data so that it can be analyzed with local resources. Users can derive observational parameters not measured by the standard SDSS pipeline from the corrected frames, metadata, and other data products, or simply use the output of tools with which they're familiar. The challenge in distributing these data lies not in the distribution method itself, but in providing tools and support that allow users to find the data they need and interpret it properly. After introducing the data itself, this article describes how the DAS uses ubiquitous and well understood technologies to manage and distribute the data. It then discusses how it addresses the more difficult problem of helping the public find and use the data it contains, despite the complexity of its content and organization.

  17. 2009 report: A multi-refuge program to evaluate the effect of ungulate browsing on habitat

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report describes the evaluation of ungulate browsing at six wildlife refuges in FWS Region 6 in 2009. The aim of evaluation was to determine if ungulate...

  18. Modeling Integration and Reuse of Heterogeneous Terminologies in Faceted Browsing Systems

    Science.gov (United States)

    Harris, Daniel R.

    2017-01-01

    We integrate heterogeneous terminologies into our category-theoretic model of faceted browsing and show that existing terminologies and vocabularies can be reused as facets in a cohesive, interactive system. Commonly found in online search engines and digital libraries, faceted browsing systems depend upon one or more taxonomies which outline the structure and content of the facets available for user interaction. Controlled vocabularies or terminologies are often externally curated and are available as a reusable resource across systems. We demonstrated previously that category theory can abstractly model faceted browsing in a way that supports the development of interfaces capable of reusing and integrating multiple models of faceted browsing. We extend this model by illustrating that terminologies can be reused and integrated as facets across systems with examples from the biomedical domain.

  19. A Qualitative and Quantitative Analysis of Multi-core CPU Power and Performance Impact on Server Virtualization for Enterprise Cloud Data Centers

    Directory of Open Access Journals (Sweden)

    S. Suresh

    2015-02-01

    Full Text Available Cloud computing is an on-demand service provisioning technique that uses virtualization as the underlying technology for managing and improving the utilization of data and computing center resources by server consolidation. Even though virtualization is a software technology, it has the effect of making hardware more important for achieving a high consolidation ratio. Performance and energy efficiency are among the most important issues for large-scale server systems in current and future cloud data centers. As improved performance is pushing the migration to multi-core processors, this study performs an analytic and simulation study of the impact of multi-core processors on server virtualization, for new levels of performance and energy efficiency in cloud data centers. In this regard, the study develops the above-described system model of a virtualized server cluster and validates it for CPU core impact on performance and power consumption in terms of mean response time (mean delay) vs. offered cloud load. Analytic and simulation results show that the multi-core virtualized model yields the best results (smallest mean delays) over the single fat CPU processor (faster clock speed) for the diverse cloud workloads. For a given application, multiple cores, by sharing the processing load, improve overall system performance for all varying workload conditions, whereas the fat single-CPU model is only best suited for lighter loads. In addition, multi-core processors do not consume more power or generate more heat than a single-core processor, which gives users more processing power without the drawbacks typically associated with such increases. Therefore, cloud data centers today rely almost exclusively on multi-core systems.
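    The "mean delay vs. offered load" behaviour discussed above can be sketched analytically with a plain M/M/c model of a c-core server, using the Erlang-C waiting probability. The per-core service rate and core counts below are illustrative, and the sketch ignores the virtualization overheads that the paper's fuller model accounts for.

```python
# Mean response time of a c-core server modelled as an M/M/c queue,
# evaluated at several utilization levels. Parameters are illustrative.
from math import factorial

def erlang_c(c, a):
    """Probability that an arriving request must wait in an M/M/c queue."""
    top = (a ** c / factorial(c)) * (c / (c - a))
    bottom = sum(a ** k / factorial(k) for k in range(c)) + top
    return top / bottom

def mean_response_time(lam, mu, c):
    """Mean response time (waiting plus service) of an M/M/c queue."""
    a = lam / mu                      # offered load in Erlangs, must be < c
    return erlang_c(c, a) / (c * mu - lam) + 1.0 / mu

mu = 100.0                            # requests/s one core can serve (assumed)
for cores in (1, 2, 4, 8):
    delays = [mean_response_time(u * cores * mu, mu, cores)
              for u in (0.5, 0.8, 0.95)]
    print(f"{cores} core(s): " + "  ".join(f"{d*1000:.1f} ms" for d in delays))
```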

  20. Spatiotemporal variation in deer browse and tolerance in a woodland herb.

    Science.gov (United States)

    Prendeville, Holly R; Steven, Janet C; Galloway, Laura F

    2015-02-01

    Herbivory can shape the dynamics of plant populations, including effects on survival and reproduction, and is in turn affected by environmental factors that vary in space and time. White-tailed deer are significant herbivores in North America that have been broadly documented to affect plant reproductive success. If variation in the frequency and impact of herbivory by deer correlates with a broad-scale latitudinal gradient, climatic effects may be important for shaping plant-herbivore interactions. Alternatively, a lack of broad-scale gradients would suggest local factors such as plant community composition and deer densities are affecting herbivory. To investigate broad-scale patterns of deer herbivory, we examined the frequency and reproductive consequences of deer browse over three years in 17 populations of Campanulastrum americanum spanning the latitudinal extent of its range. Even though deer are overabundant throughout the range of C. americanum, we found spatiotemporal variation in deer browse frequency (0-0.96, mean 0.46) and its effects on plant reproductive success. The four southernmost populations experienced high levels of herbivory, and were responsible for generating a negative relationship between latitude and herbivory. In general, patterns of variation in the frequency and impact of herbivory across the entire latitudinal gradient pointed to the importance of local rather than broad-scale factors. Within a population, deer consumed larger plants. Across many populations and years, average fitnesses of browsed and uneaten plants were similar, suggesting that plants are tolerant to browse. However, since large plants have greater reproductive success and are more likely to be browsed, tolerance may be influenced by plant size. When plant size was accounted for, most populations did not fully compensate for browsing. There was no relationship between browsing intensity and tolerance, suggesting that browsing may be too variable to consistently

  1. Anthelmintic activity of some Mediterranean browse plants against parasitic nematodes.

    Science.gov (United States)

    Manolaraki, F; Sotiraki, S; Stefanakis, A; Skampardonis, V; Volanis, M; Hoste, H

    2010-04-01

    The anthelmintic properties of tannin-rich plants are being explored as an alternative to chemical drugs. Most data have been acquired on legume forages, but only a few on browse plants. The present study aimed to (i) screen the in vitro effects of extracts from 7 Mediterranean plants on Haemonchus contortus, (ii) verify the role of tannins using an inhibitor, polyvinyl polypyrrolidone (PVPP) and (iii) verify the in vivo effects of extracts from 4 plants. Significant inhibition was shown in vitro using a larval migration inhibition (LMI) assay for all extracts except that from Olea europaea var. koroneiki. After adding PVPP, the LMI values were restored to control levels for all plants except Pistacia lentiscus and Ceratonia siliqua, confirming a role for tannins in the activity. In the in vivo experiment, 48 lambs were divided into 6 groups, depending on diet. On Day 0, groups G1-G5 received H. contortus and Trichostrongylus colubriformis larvae and G6 remained uninfected. The various diets were distributed from Days 14 to 45: P. lentiscus (G1), Quercus coccifera (G2), C. siliqua (G3), Onobrychis viciifolia (G4), or Medicago sativa for the 2 control groups (G5, G6). Egg excretion, packed cell volumes (PCVs) and inorganic phosphate were measured weekly throughout the entire experimental period. At slaughter, the worms were enumerated and their fecundity assessed. Consumption of the 4 browse plants did not provoke differences in pathophysiological measurements, but there were significant decreases in egg excretion, mainly explained by significant decreases in worm fecundity for both species, without any statistical difference in worm numbers.

  2. Foliar Nutritional Quality Explains Patchy Browsing Damage Caused by an Invasive Mammal.

    Directory of Open Access Journals (Sweden)

    Hannah R Windley

    Full Text Available Introduced herbivores frequently inflict significant, yet patchy damage on native ecosystems through selective browsing. However, there are few instances where the underlying cause of this patchy damage has been revealed. We aimed to determine if the nutritional quality of foliage could predict the browsing preferences of an invasive mammalian herbivore, the common brushtail possum (Trichosurus vulpecula), in a temperate forest in New Zealand. We quantified the spatial and temporal variation in four key aspects of the foliar chemistry (total nitrogen, available nitrogen, in vitro dry matter digestibility and tannin effect) of 275 trees representing five native tree species. Simultaneously, we assessed the severity of browsing damage caused by possums on those trees in order to relate selective browsing to foliar nutritional quality. We found significant spatial and temporal variation in nutritional quality among individuals of each tree species examined, as well as among tree species. There was a positive relationship between the available nitrogen concentration of foliage (a measure of in vitro digestible protein) and the severity of damage caused by browsing by possums. This study highlights the importance of nutritional quality, specifically, the foliar available nitrogen concentration of individual trees, in predicting the impact of an invasive mammal. Revealing the underlying cause of patchy browsing by an invasive mammal provides new insights for conservation of native forests and targeted control of invasive herbivores in forest ecosystems.

  3. Learning SQL Server Reporting Services 2012

    CERN Document Server

    Krishnaswamy, Jayaram

    2013-01-01

    The book is packed with clear instructions and plenty of screenshots, providing all the support and guidance you will need as you begin to generate reports with SQL Server 2012 Reporting Services.This book is for those who are new to SQL Server Reporting Services 2012 and aspiring to create and deploy cutting edge reports. This book is for report developers, report authors, ad-hoc report authors and model developers, and Report Server and SharePoint Server Integrated Report Server administrators. Minimal knowledge of SQL Server is assumed and SharePoint experience would be helpful.

  4. Open client/server computing and middleware

    CERN Document Server

    Simon, Alan R

    2014-01-01

    Open Client/Server Computing and Middleware provides a tutorial-oriented overview of open client/server development environments and how client/server computing is being done.This book analyzes an in-depth set of case studies about two different open client/server development environments-Microsoft Windows and UNIX, describing the architectures, various product components, and how these environments interrelate. Topics include the open systems and client/server computing, next-generation client/server architectures, principles of middleware, and overview of ProtoGen+. The ViewPaint environment

  5. Implementing Citrix XenServer Quickstarter

    CERN Document Server

    Ahmed, Gohar

    2013-01-01

    Implementing Citrix XenServer Quick Starter is a practical, hands-on guide that will help you get started with the Citrix XenServer Virtualization technology with easy-to-follow instructions. Implementing Citrix XenServer Quick Starter is for system administrators who have little to no knowledge of virtualization, and specifically Citrix XenServer Virtualization. If you're managing a lot of physical servers and are tired of installing, deploying, updating, and managing physical machines on a daily basis over and over again, then you should probably explore your option of XenServer Virtualization.

  6. Beginning Microsoft SQL Server 2012 Programming

    CERN Document Server

    Atkinson, Paul

    2012-01-01

    Get up to speed on the extensive changes to the newest release of Microsoft SQL Server. The 2012 release of Microsoft SQL Server changes how you develop applications for SQL Server. With this comprehensive resource, SQL Server authority Robert Vieira presents the fundamentals of database design and SQL concepts, and then shows you how to apply these concepts using the updated SQL Server. Publishing time and date with the 2012 release, Beginning Microsoft SQL Server 2012 Programming begins with a quick overview of database design basics and the SQL query language and then quickly proceeds to show

  7. Analysis of Commodity Parameters Browsing Preference in Consumer' s Online Shopping Decision- making Taking Digital Camera for Example%消费者在线购物决策中的商品参数浏览偏好分析——以数码相机为例

    Institute of Scientific and Technical Information of China (English)

    许应楠

    2012-01-01

    In order to provide better online shopping services, this article first analyzes the consumer's online decision-making process, and then examines consumers' commodity-parameter browsing preferences and their influencing factors. Taking the digital camera as an example, the author designs and organizes an experiment on consumers' browsing preferences for digital camera parameters. The experimental results show that consumers of different genders and knowledge backgrounds have different preferences for commodity parameters. Finally, based on the experimental results, suggestions are provided for the content design of future knowledge recommendation services.

  8. Is it true that Exchange Server 2010 can only be installed on Windows Server 2008 R2? Does Windows Server 2008 R2 support Exchange Server 2007?

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Both the 64-bit version of Windows Server 2008 SP2 and Windows Server 2008 R2 support Exchange Server 2010. Windows Server 2008 R2 does not support Exchange Server 2007 or Exchange Server 2007 SP1, and it is not expected to add support for Exchange Server 2007 SP2 either.

  9. The CGView Server: a comparative genomics tool for circular genomes.

    Science.gov (United States)

    Grant, Jason R; Stothard, Paul

    2008-07-01

    The CGView Server generates graphical maps of circular genomes that show sequence features, base composition plots, analysis results and sequence similarity plots. Sequences can be supplied in raw, FASTA, GenBank or EMBL format. Additional feature or analysis information can be submitted in the form of GFF (General Feature Format) files. The server uses BLAST to compare the primary sequence to up to three comparison genomes or sequence sets. The BLAST results and feature information are converted to a graphical map showing the entire sequence, or an expanded and more detailed view of a region of interest. Several options are included to control which types of features are displayed and how the features are drawn. The CGView Server can be used to visualize features associated with any bacterial, plasmid, chloroplast or mitochondrial genome, and can aid in the identification of conserved genome segments, instances of horizontal gene transfer, and differences in gene copy number. Because a collection of sequences can be used in place of a comparison genome, maps can also be used to visualize regions of a known genome covered by newly obtained sequence reads. The CGView Server can be accessed at http://stothard.afns.ualberta.ca/cgview_server/

  10. Professional Team Foundation Server 2010

    CERN Document Server

    Blankenship, Ed; Holliday, Grant; Keller, Brian

    2011-01-01

    Authoritative guide to TFS 2010 from a dream team of Microsoft insiders and MVPs! Microsoft Visual Studio Team Foundation Server (TFS) has evolved until it is now an essential tool for Microsoft's Application Lifecycle Management suite of productivity tools, enabling collaboration within and among software development teams. By 2011, TFS will replace Microsoft's leading source control system, VisualSourceSafe, resulting in an even greater demand for information about it. Professional Team Foundation Server 2010, written by an accomplished team of Microsoft insiders and Microsoft MVPs, provides

  11. GeoServer beginner's guide

    CERN Document Server

    Youngblood, Brian

    2013-01-01

    Step-by-step instructions are included and the needs of a beginner are totally satisfied by the book. The book consists of plenty of examples with accompanying screenshots and code for an easy learning curve. You are a web developer with knowledge of server side scripting, and have experience with installing applications on the server. You have a desire to want more than Google maps, by offering dynamically built maps on your site with your latest geospatial data stored in MySQL, PostGIS, MsSQL or Oracle. If this is the case, this book is meant for you.

  12. Professional Team Foundation Server 2012

    CERN Document Server

    Blankenship, Ed; Holliday, Grant; Keller, Brian

    2012-01-01

    A comprehensive guide to using Microsoft Team Foundation Server 2012. Team Foundation Server has become the leading Microsoft productivity tool for software management, and this book covers what developers need to know to use it effectively. Fully revised for the new features of TFS 2012, it provides developers and software project managers with step-by-step instructions and even assists those who are studying for the TFS 2012 certification exam. You'll find a broad overview of TFS, thorough coverage of core functions, a look at extensibility options, and more, written by Microsoft insiders

  13. Instant MDX queries for SQL Server 2012

    CERN Document Server

    Emond, Nicholas

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This short, focused guide is a great way to get started with writing MDX queries. New developers can use this book as a reference for how to use functions and the syntax of a query, as well as how to use Calculated Members and Named Sets. This book is great for new developers who want to learn the MDX query language from scratch and install SQL Server 2012 with Analysis Services

  14. Parallel Computing Using Web Servers and "Servlets".

    Science.gov (United States)

    Lo, Alfred; Bloor, Chris; Choi, Y. K.

    2000-01-01

    Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…

  15. A polling model with an autonomous server

    NARCIS (Netherlands)

    Haan, de Roland; Boucherie, Richard J.; Ommeren, van Jan-Kees C.W.

    2007-01-01

    Polling models are used as an analytical performance tool in several application areas. In these models, the focus often is on controlling the operation of the server as to optimize some performance measure. For several applications, controlling the server is not an issue as the server moves indepen

  16. Client-server password recovery

    NARCIS (Netherlands)

    Chmielewski, Ł.; Hoepman, J.H.; Rossum, P. van

    2009-01-01

    Human memory is not perfect - people constantly memorize new facts and forget old ones. One example is forgetting a password, a common problem raised at IT help desks. We present several protocols that allow a user to automatically recover a password from a server using partial knowledge of the pass

  17. Team Foundation Server 2013 customization

    CERN Document Server

    Beeming, Gordon

    2014-01-01

    This book utilizes a tutorial based approach, focused on the practical customization of key features of the Team Foundation Server for collaborative enterprise software projects.This practical guide is intended for those who want to extend TFS. This book is for intermediate users who have an understanding of TFS, and basic coding skills will be required for the more complex customizations.

  18. Client-Server Password Recovery

    NARCIS (Netherlands)

    Chmielewski, L.; Hoepman, J.H.; Rossum, P. van

    2009-01-01

    Human memory is not perfect – people constantly memorize new facts and forget old ones. One example is forgetting a password, a common problem raised at IT help desks. We present several protocols that allow a user to automatically recover a password from a server using partial knowledge of the pass

  19. Tackling action-based video abstraction of animated movies for video browsing

    Science.gov (United States)

    Ionescu, Bogdan; Ott, Laurent; Lambert, Patrick; Coquin, Didier; Pacureanu, Alexandra; Buzuloiu, Vasile

    2010-07-01

    We address the issue of producing automatic video abstracts in the context of the video indexing of animated movies. For a quick browse of a movie's visual content, we propose a storyboard-like summary, which follows the movie's events by retaining one key frame for each specific scene. To capture the shot's visual activity, we use histograms of cumulative interframe distances, and the key frames are selected according to the distribution of the histogram's modes. For a preview of the movie's exciting action parts, we propose a trailer-like video highlight, whose aim is to show only the most interesting parts of the movie. Our method is based on a relatively standard approach, i.e., highlighting action through the analysis of the movie's rhythm and visual activity information. To suit every type of movie content, including predominantly static movies or movies without exciting parts, the concept of action depends on the movie's average rhythm. The efficiency of our approach is confirmed through several end-user studies.
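    A simplified stand-in for the key-frame selection idea (not the authors' exact method) is sketched below: inter-frame histogram distances are accumulated over a shot, a histogram of the cumulative values is built, and the frame whose cumulative distance falls nearest the centre of the most populated bin is kept. The clip name, bin count and L1 histogram distance are assumptions; the sketch requires OpenCV and NumPy.

```python
# Simplified key-frame picker based on cumulative inter-frame distances.
import cv2
import numpy as np

def shot_key_frame(path, n_bins=16):
    cap = cv2.VideoCapture(path)
    prev_hist, cumulative, frames_cum = None, 0.0, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [64], [0, 256]).ravel()
        hist /= hist.sum() + 1e-9
        if prev_hist is not None:
            cumulative += float(np.abs(hist - prev_hist).sum())  # L1 distance
        frames_cum.append(cumulative)
        prev_hist = hist
    cap.release()
    if not frames_cum:
        return None
    counts, edges = np.histogram(frames_cum, bins=n_bins)
    mode_centre = 0.5 * (edges[counts.argmax()] + edges[counts.argmax() + 1])
    return int(np.argmin(np.abs(np.array(frames_cum) - mode_centre)))

print("key frame index:", shot_key_frame("shot_042.avi"))   # hypothetical clip
```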

  20. GPCR-GIA: a web-server for identifying G-protein coupled receptors and their families with grey incidence analysis.

    Science.gov (United States)

    Lin, Wei-Zhong; Xiao, Xuan; Chou, Kuo-Chen

    2009-11-01

    G-protein-coupled receptors (GPCRs) play fundamental roles in regulating various physiological processes as well as the activity of virtually all cells. Different GPCR families are responsible for different functions. With the avalanche of protein sequences generated in the postgenomic age, it is highly desired to develop an automated method to address the two problems: given the sequence of a query protein, can we identify whether it is a GPCR? If it is, what family class does it belong to? Here, a two-layer ensemble classifier called GPCR-GIA was proposed by introducing a novel scale called 'grey incident degree'. The overall success rate by GPCR-GIA in identifying GPCR and non-GPCR was about 95%, and that in identifying the GPCRs among their nine family classes was about 80%. These rates were obtained by the jackknife cross-validation tests on the stringent benchmark data sets where none of the proteins has ≥50% pairwise sequence identity to any other in the same class. Moreover, a user-friendly web-server was established at http://218.65.61.89:8080/bioinfo/GPCR-GIA. For the user's convenience, a step-by-step guide on how to use the GPCR-GIA web server is provided. Generally speaking, one can get the desired two-level results in around 10 s for a query protein sequence of 300-400 amino acids; the longer the sequence is, the more time is needed.

  1. Professional Microsoft SQL Server 2012 Administration

    CERN Document Server

    Jorgensen, Adam; LoForte, Ross; Knight, Brian

    2012-01-01

    An essential how-to guide for experienced DBAs on the most significant product release since 2005! Microsoft SQL Server 2012 will have major changes throughout the SQL Server and will impact how DBAs administer the database. With this book, a team of well-known SQL Server experts introduces the many new features of the most recent version of SQL Server and deciphers how these changes will affect the methods that administrators have been using for years. Loaded with unique tips, tricks, and workarounds for handling the most difficult SQL Server admin issues, this how-to guide deciphers topics s

  2. LassoProt: server to analyze biopolymers with lassos.

    Science.gov (United States)

    Dabrowski-Tumanski, Pawel; Niemyska, Wanda; Pasznik, Pawel; Sulkowska, Joanna I

    2016-07-01

    The LassoProt server, http://lassoprot.cent.uw.edu.pl/, enables analysis of biopolymers with entangled configurations called lassos. The server offers various ways of visualizing lasso configurations, as well as their time trajectories, with all the results and plots downloadable. Broad spectrum of applications makes LassoProt a useful tool for biologists, biophysicists, chemists, polymer physicists and mathematicians. The server and our methods have been validated on the whole PDB, and the results constitute the database of proteins with complex lassos, supported with basic biological data. This database can serve as a source of information about protein geometry and entanglement-function correlations, as a reference set in protein modeling, and for many other purposes.

  3. Design and Implementation of the Personalized Search Engine Based on the Improved Behavior of User Browsing

    Directory of Open Access Journals (Sweden)

    Wei-Chao Li

    2013-02-01

    Full Text Available An improved user profile based on user browsing behavior is proposed in this study. The profile takes into account the user's web page browsing behavior, the level of interest in keywords, and the user's short-term and long-term interests. This improved user profile is embedded in a personalized search engine system, whose basic framework and functional modules are described in detail in this study. A demonstration system, IUBPSES, is developed on the .NET platform. The results of the simulation experiments indicate that retrieval using the IUBPSES system, based on the improved user profile, outperforms the current mainstream search engines for information search. Directions for improvement and further research are proposed at the end.
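    A toy version of a browsing-behaviour profile that keeps both a short-term (current session) interest score and an exponentially decayed long-term score per keyword is sketched below. The dwell-time heuristic, decay factor and mixing weight are assumptions for illustration, not the parameters of the system described above.

```python
# Toy user profile combining short-term (session) and decayed long-term
# keyword interest scores derived from browsing behaviour.
from collections import defaultdict

class UserProfile:
    def __init__(self, decay=0.9, short_weight=0.6):
        self.long_term = defaultdict(float)
        self.short_term = defaultdict(float)
        self.decay = decay
        self.short_weight = short_weight

    def observe_page(self, keywords, dwell_seconds, bookmarked=False):
        """Update short-term interest from one browsed page."""
        weight = min(dwell_seconds / 60.0, 3.0) + (2.0 if bookmarked else 0.0)
        for kw in keywords:
            self.short_term[kw] += weight

    def end_session(self):
        """Fold the session's interests into the decayed long-term profile."""
        for kw, score in self.short_term.items():
            self.long_term[kw] = self.decay * self.long_term[kw] + score
        self.short_term.clear()

    def interest(self, keyword):
        return (self.short_weight * self.short_term[keyword]
                + (1 - self.short_weight) * self.long_term[keyword])

profile = UserProfile()
profile.observe_page({"camera", "lens"}, dwell_seconds=120, bookmarked=True)
profile.observe_page({"tripod"}, dwell_seconds=20)
print(sorted(((k, round(profile.interest(k), 2))
              for k in ("camera", "lens", "tripod")), key=lambda x: -x[1]))
```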

  4. Integrating Live Access Server into Google Earth

    Science.gov (United States)

    Li, J.; Schweitzer, R.; Hankin, S.; O'Brien, K.

    2006-12-01

    The Live Access Server (LAS) is a highly configurable Web server designed to provide flexible access to visualization and analysis products generated from geo-referenced scientific data sets. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of a set of modular components in a three-tiered architecture -- user interface, workflow orchestration and services to access data and generate scientific products. The LAS user interface (UI) helps the user make requests, preventing requests that are impossible or unreasonable. The UI communicates with the LAS Product Server (LPS, the workflow orchestration component) via an XML string with an HTTP GET. When a request is received by the LPS, business logic converts this request into a series of Web Service requests invoked via SOAP. The SOAP services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS packages these outputs into final products via Jakarta Velocity templates for delivery to the end user. Back-end services are most often a legacy application wrapped in a Java class. The Java wrapper classes are deployed as Web Services accessible via SOAP using the AxisServlet and a custom Web Services Deployment Descriptor file. Ferret is the default visualization application used by LAS, though other applications (e.g. Matlab, CDAT, and GrADS) can also be used. This application demonstrates how Keyhole Markup Language (KML) can be used to provide simple integration of LAS and Google Earth. KML makes access to "Virtual Globe" capabilities so simple that it can be added as an option to existing systems. This application is one such example. The ability to package an image in KML was added to the LAS as a new SOAP service. On the LAS user interface, users can select a Google Earth product in the same manner that any other LAS product is requested. The server will dynamically generate a KML file, which contains the 2D plot

  5. The Philosophy behind a (Danish) Voice-controlled Interface to Internet Browsing for motor-handicapped

    DEFF Research Database (Denmark)

    Brøndsted, Tom

    2005-01-01

    The public-funded project "Indtal" ("Speak-it") has succeeded in developing a Danish voice-controlled utility for internet browsing targeting motor-handicapped users who have difficulties using a standard keyboard and/or a standard mouse. The system is based on a number of a priori defined design criteria: learnability and memorability rather than naturalness, minimal need for maintenance after release, support for "all" web standards (not just HTML conforming to certain "recommendations"), independence of the language of the websites being browsed, allowance for multimodal control along...

  6. Non-stationary analysis of queueing delay behavior in the GI/M/1/N-type queue with server working vacations

    Science.gov (United States)

    Kempa, Wojciech M.

    2015-11-01

    A finite-buffer GI/M/1/N-type queueing model with single working vacations is considered. Every time the system becomes empty, the server initializes an exponentially distributed single working vacation period, during which jobs are processed at another (slower) rate. After the vacation period finishes, service continues at the normal (higher) rate. The next working vacation period starts at the next moment at which the queue empties, and so on. Systems of integral equations for the time-dependent queueing delay distributions, conditioned on the initial level of buffer saturation and related to each other, are built separately for systems beginning operation in the normal and working vacation modes. The solutions of the corresponding systems, written for Laplace transforms, are given explicitly using a linear algebraic approach.
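
    To fix ideas, the quantities involved can be written as follows; this notation is illustrative and not taken from the paper. Here v_n(t,x) denotes the tail of the queueing delay distribution at time t, conditioned on n jobs being present in the buffer at t = 0, and the systems of equations are solved for its Laplace transform in t:

% illustrative notation only, not the paper's
v_n(t,x) = \Pr\{\, V(t) > x \mid X(0) = n \,\}, \qquad 0 \le n \le N,

\widetilde{v}_n(s,x) = \int_0^{\infty} e^{-st}\, v_n(t,x)\, \mathrm{d}t, \qquad \operatorname{Re}(s) > 0 .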

  7. CERN servers go to Mexico

    CERN Multimedia

    Stefania Pandolfi

    2015-01-01

    On Wednesday, 26 August, 384 servers from the CERN Computing Centre were donated to the Faculty of Science in Physics and Mathematics (FCFM) and the Mesoamerican Centre for Theoretical Physics (MCTP) at the University of Chiapas, Mexico.   CERN’s Director-General, Rolf Heuer, met the Mexican representatives in an official ceremony in Building 133, where the servers were prepared for shipment. From left to right: Frédéric Hemmer, CERN IT Department Head; Raúl Heredia Acosta, Deputy Permanent Representative of Mexico to the United Nations and International Organizations in Geneva; Jorge Castro-Valle Kuehne, Ambassador of Mexico to the Swiss Confederation and the Principality of Liechtenstein; Rolf Heuer, CERN Director-General; Luis Roberto Flores Castillo, President of the Swiss Chapter of the Global Network of Qualified Mexicans Abroad; Virginia Romero Tellez, Coordinator of Institutional Relations of the Swiss Chapter of the Global Network of Qualified Me...

  8. Minimizing Thermal Stress for Data Center Servers through Thermal-Aware Relocation

    Science.gov (United States)

    Ling, T. C.; Hussain, S. A.

    2014-01-01

    A rise in inlet air temperature may lower the rate of heat dissipation from air-cooled computing servers. This introduces thermal stress to these servers. As a result, poorly cooled active servers start conducting heat to neighboring servers, giving rise to hotspot regions of thermal stress inside the data center. The physical hardware of these servers may then fail, causing performance loss, monetary loss, and higher energy consumption for the cooling mechanism. To minimize these situations, this paper profiles inlet temperature sensitivity (ITS) and defines the optimum location for each server to minimize the chances of creating a thermal hotspot and thermal stress. Based upon this novel ITS analysis, a thermal state monitoring and server relocation algorithm for data centers is proposed. The contribution of this paper is bringing the peak outlet temperatures of the relocated servers closer to the average outlet temperature by over 5 times, lowering the average peak outlet temperature by 3.5% and minimizing the thermal stress. PMID:24987743
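
    The relocation idea can be illustrated with a deliberately simplified sketch (this is not the paper's algorithm, and the ITS scores and temperatures below are made up): servers profiled as most sensitive to inlet temperature are assigned to the best-cooled rack positions.

def relocate(servers, slots):
    """servers: list of (server_id, its_score), higher score = more temperature sensitive
    slots: list of (slot_id, inlet_temp_c), lower temperature = better cooled"""
    by_sensitivity = sorted(servers, key=lambda s: s[1], reverse=True)
    by_coolness = sorted(slots, key=lambda s: s[1])
    # pair the most sensitive server with the coolest slot, and so on down the list
    return {srv: slot for (srv, _), (slot, _) in zip(by_sensitivity, by_coolness)}

if __name__ == "__main__":
    servers = [("srv-1", 0.9), ("srv-2", 0.3), ("srv-3", 0.6)]      # hypothetical ITS profile
    slots = [("rack-A1", 24.5), ("rack-B2", 21.0), ("rack-C3", 27.0)]
    print(relocate(servers, slots))   # srv-1 -> rack-B2 (coolest), srv-3 -> rack-A1, srv-2 -> rack-C3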

  9. Minimizing Thermal Stress for Data Center Servers through Thermal-Aware Relocation

    Directory of Open Access Journals (Sweden)

    Muhammad Tayyab Chaudhry

    2014-01-01

    Full Text Available A rise in inlet air temperature may lower the rate of heat dissipation from air-cooled computing servers. This introduces thermal stress to these servers. As a result, poorly cooled active servers start conducting heat to neighboring servers, giving rise to hotspot regions of thermal stress inside the data center. The physical hardware of these servers may then fail, causing performance loss, monetary loss, and higher energy consumption for the cooling mechanism. To minimize these situations, this paper profiles inlet temperature sensitivity (ITS) and defines the optimum location for each server to minimize the chances of creating a thermal hotspot and thermal stress. Based upon this novel ITS analysis, a thermal state monitoring and server relocation algorithm for data centers is proposed. The contribution of this paper is bringing the peak outlet temperatures of the relocated servers closer to the average outlet temperature by over 5 times, lowering the average peak outlet temperature by 3.5% and minimizing the thermal stress.

  10. PostgreSQL server programming

    CERN Document Server

    Krosing, Hannu

    2013-01-01

    This practical guide leads you through numerous aspects of working with PostgreSQL. Step-by-step examples allow you to easily set up and extend PostgreSQL. "PostgreSQL Server Programming" is for moderate to advanced PostgreSQL database professionals. To get the best understanding of this book, you should have general experience in writing SQL, a basic idea of query tuning, and some coding experience in a language of your choice.

  11. Running Servers around Zero Degrees

    OpenAIRE

    PervilÀ, Mikko; Kangasharju, Jussi

    2010-01-01

    Data centers are a major consumer of electricity and a significant fraction of their energy use is devoted to cooling the data center. Recent prototype deployments have investigated the possibility of using outside air for cooling and have shown large potential savings in energy consumption. In this paper, we push this idea to the extreme, by running servers outside in Finnish winter. Our results show that commercial, off-the-shelf computer equipment can tolerate extreme conditions such as ou...

  12. Preference of goats (Capra hircus L.) for tanniniferous browse species available in semi-arid areas in Ethiopia

    NARCIS (Netherlands)

    Mengistu, G.; Bezabih, M.; Hendriks, W.H.; Pellikaan, W.F.

    2016-01-01

    The objectives were to determine browse species preference of goats using dry matter intake (DMI) as a proxy, to compare preference when offered in combination with polyethylene glycol (PEG) and to establish relationships between browse species intake and chemical compositional data. Air-dried le

  13. Contextualization of topics - browsing through terms, authors, journals and cluster allocations

    NARCIS (Netherlands)

    Koopman, Rob; Wang, Shenghui; Scharnhorst, Andrea

    2015-01-01

    This paper builds on an innovative Information Retrieval tool, Ariadne. The tool has been developed as an interactive network visualization and browsing tool for large-scale bibliographic databases. It basically allows to gain insights into a topic by contextualizing a search query (Koopman et al.,

  14. Browsing and Querying in Online Documentation:A Study of User Interfaces and the Interaction Process

    DEFF Research Database (Denmark)

    Hertzum, Morten; Frøkjær, Erik

    1996-01-01

    A user interface study concerning the usage effectiveness of selected retrieval modes was conducted using an experimental text retrieval system, TeSS, giving access to online documentation of certain programming tools. Four modes of TeSS were compared: (1) browsing, (2) conventional boolean...

  15. Manipulating sheep browsing levels on coyote willow (Salix exigua) with supplements

    Science.gov (United States)

    Macronutrients and additives have been used to suppress or promote intake of upland tannin-containing browse species by livestock, but to our knowledge this technique has not been applied to sheep that feed on tannin-containing species in riparian areas. The objective of this study was to determine ...

  16. Browse species from Ethiopia: role in methane reduction and nematode control in goats

    NARCIS (Netherlands)

    Mengistu, Genet F.

    2017-01-01

    The aim of the research reported in this thesis was to evaluate browse species collected from Ethiopia for preference by goats, and for their in vitro anthelmintic and methane (CH4) reduction properties. During the conduct of the studies observations were made warranting a further aim, to compare in

  17. Establishment, survival, and growth of selected browse species in a ponderosa pine forest

    Science.gov (United States)

    Dietz, D.R.; Uresk, D.W.; Messner, H.E.; McEwen, L.C.

    1980-01-01

    Information is presented on establishment, survival, and growth of seven selected browse species in a ponderosa pine forest over a 10-year period. Methods of establishment included hand seeding and planting bare-root and containerized stock. Success of different methods differed with shrub species.

  18. Product features and task effects on experienced richness, control and engagement in voicemail browsing

    NARCIS (Netherlands)

    Rozendaal, M.C.; Keyson, D.V.; De Ridder, H.

    2008-01-01

    A recent focus is on creating engaging user experiences with digital products and services such as voicemail. This study aims to design towards increased levels of engagement in voicemail browsing by using the ‘Richness, Control and Engagement’ (RC & E) framework. This framework explains the levels

  19. Microsoft SQL Server 2012 with Hadoop

    CERN Document Server

    Sarkar, Debarchan

    2013-01-01

    This book will be a step-by-step tutorial, which practically teaches working with big data on SQL Server through sample examples in increasing complexity.Microsoft SQL Server 2012 with Hadoop is specifically targeted at readers who want to cross-pollinate their Hadoop skills with SQL Server 2012 business intelligence and data analytics. A basic understanding of traditional RDBMS technologies and query processing techniques is essential.

  20. System architecture for integrating semantic and iconic content for intelligent browsing of medical images

    Science.gov (United States)

    Tang, Lilian H.; Hanka, Rudolf; Ip, Horace H. S.

    1998-07-01

    Two sources of information play key roles in a collection of medical images such as computer tomographs, X-rays and histological slides: (1) textual descriptions relating to the image content and (2) visual features that can be seen on the image itself. The former are traditionally made by human specialists (e.g. histopathologists, radiographers, etc.) who interpret the image, and the latter are the inherent characteristics of images. This research program aims to study the architectural issues of a system which combines and interprets the information inherent in these two media to achieve automatic intelligent browsing of medical images. To give the research some practical significance, we applied the architecture to the design of the I-BROWSE system, which is being developed jointly by the City University of Hong Kong and the Clinical School of the University of Cambridge. I-BROWSE is aimed at supporting intelligent retrieval and browsing of histological images obtained along the gastrointestinal tract (GI tract). Within such an architecture, given a query image or a populated image, a set of low-level image feature measurements is obtained from a Visual Feature Detector, and with the help of knowledge bases and reasoning engines, the Semantic Analyzer derives, using a semantic feature generation and verification paradigm, the high-level attributes for the image and furthermore automatically generates textual annotations for it. If the input image is accompanied by annotations made by a human specialist, the system will also analyze, combine and verify these two levels of information, i.e., iconic and semantic contents. In the paper, we present the architectural issues and the strategies needed to support such an information fusion process as well as the potentials of intelligent browsing using this dual-content-based approach.

  1. Mastering Windows Server 2008 Networking Foundations

    CERN Document Server

    Minasi, Mark; Mueller, John Paul

    2011-01-01

    Find in-depth coverage of general networking concepts and basic instruction on Windows Server 2008 installation and management including active directory, DNS, Windows storage, and TCP/IP and IPv4 networking basics in Mastering Windows Server 2008 Networking Foundations. One of three new books by best-selling author Mark Minasi, this guide explains what servers do, how basic networking works (IP basics and DNS/WINS basics), and the fundamentals of the under-the-hood technologies that support staff must understand. Learn how to install Windows Server 2008 and build a simple network, security co

  2. UniTree Name Server internals

    Energy Technology Data Exchange (ETDEWEB)

    Mecozzi, D.; Minton, J.

    1996-01-01

    The UniTree Name Server (UNS) is one of several servers which make up the UniTree storage system. The Name Server is responsible for mapping names to capabilities. Names are generally human-readable ASCII strings of any length. Capabilities are unique 256-bit identifiers that point to files, directories, or symbolic links. The Name Server implements a UNIX-style hierarchical directory structure to facilitate name-to-capability mapping. The principal task of the Name Server is to manage the directories which make up the UniTree directory structure. The principal clients of the Name Server are the FTP Daemon, NFS and a few UniTree utility routines. However, the Name Server is a generalized server and will accept messages from any client. The purpose of this paper is to describe the internal workings of the UniTree Name Server. In cases where it seems appropriate, the motivation for a particular choice of algorithm as well as a description of the algorithm itself is given.
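
    As a toy illustration of the name-to-capability mapping described above (this is not UniTree code), the sketch below stores opaque 256-bit capabilities in a dictionary keyed by human-readable names:

import os

class ToyNameServer:
    def __init__(self):
        self.names = {}     # human-readable name -> 256-bit capability
        self.objects = {}   # capability -> object metadata

    def create(self, name, kind="file"):
        cap = os.urandom(32)               # 32 bytes = 256-bit unique identifier
        self.names[name] = cap
        self.objects[cap] = {"kind": kind, "name": name}
        return cap

    def resolve(self, name):
        return self.names[name]            # the name-to-capability mapping

ns = ToyNameServer()
cap = ns.create("/projects/unitree/readme.txt")
assert ns.resolve("/projects/unitree/readme.txt") == cap
print(cap.hex())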

  3. Windows Server 2012: New features and changes

    OpenAIRE

    2013-01-01

    The purpose of this thesis is to shed light on the changes in the Windows Server 2012 operating system compared with the older Windows Server 2008 R2 version. The work was started before the release of Windows Server 2012 by testing the Release Candidate version, and continued after the release with the trial version of Windows Server. The thesis nevertheless contains up-to-date information on Windows Server 2012. It first briefly covers the development history of Windows Server and then discusses the latest Windows Server as a product...

  4. Mastering Microsoft Windows Small Business Server 2008

    CERN Document Server

    Johnson, Steven

    2010-01-01

    A complete, winning approach to the number one small business solution. Do you have 75 or fewer users or devices on your small-business network? Find out how to integrate everything you need for your mini-enterprise with Microsoft's new Windows Server 2008 Small Business Server, a custom collection of server and management technologies designed to help small operations run smoothly without a giant IT department. This comprehensive guide shows you how to master all SBS components as well as handle integration with other Microsoft technologies.: Focuses on Windows Server 2008 Small Business Serv

  5. Microsoft Windows Server 2012 administration instant reference

    CERN Document Server

    Hester, Matthew

    2013-01-01

    Fast, accurate answers for common Windows Server questions Serving as a perfect companion to all Windows Server books, this reference provides you with quick and easily searchable solutions to day-to-day challenges of Microsoft's newest version of Windows Server. Using helpful design features such as thumb tabs, tables of contents, and special heading treatments, this resource boasts a smooth and seamless approach to finding information. Plus, quick-reference tables and lists provide additional on-the-spot answers. Covers such key topics as server roles and functionality, u

  6. Essential Mac OS X panther server administration integrating Mac OS X server into heterogeneous networks

    CERN Document Server

    Bartosh, Michael

    2004-01-01

    If you've ever wondered how to safely manipulate Mac OS X Panther Server's many underlying configuration files or needed to explain AFP permission mapping--this book's for you. From the command line to Apple's graphical tools, the book provides insight into this powerful server software. Topics covered include installation, deployment, server management, web application services, data gathering, and more

  7. Research on Key Technologies for Moving Media Resources to the Cloud (System Design for Media Server Based on Cloud Computing Technology and the Application Analysis)

    Institute of Scientific and Technical Information of China (English)

    李青; 唐哲红; 宋阿芳; 王发光

    2013-01-01

    This paper proposes multi-tenant resource scheduling and load-balancing techniques and workflows for moving a media resource pool to the cloud. It introduces application solutions for cloud media on the IMS service platform and the testing status of cloud media resource technology, and discusses cloud media business models, providing new ideas for promoting domestic cloud media server hardware and software vendors and the development of the related industry chain.

  8. TJ-II data retrieving by means of a client/server model

    Science.gov (United States)

    Vega, J.; Sánchez, E.; Crémy, C.; Portas, A.; Dulya, C. M.; Nilsson, J.

    1999-01-01

    The database of the TJ-II flexible heliac is centralized in a Unix server. This computer also commands the on-line processes related to data acquisition during TJ-II discharges: programming of measurement systems, connectivity with control systems, data visualization, and computations. The server has to provide access to the data so that signal analysis can be performed by local users or even from remote hosts. Data retrieving is accomplished by means of a client/server architecture in which two data servers are permanently running in the background of the Unix computer. One of them serves data requests from local clients and the other one sends data to remote clients. The communication protocol in both cases has been developed by using TCP/IP and Berkeley sockets. The client part consists of a set of routines (FORTRAN and C callable), which, in a transparent way, provide connectivity with the servers. This structure allows access to TJ-II data exactly in the same way from any computer, hiding not only specific aspects of the database, but hardware architecture of the server computer as well. In addition, the remote access makes it possible to distribute computations and to reduce the load on the Unix server from analysis and visualization tasks. At present, this software is running in four different environments: the Unix server itself, various types of Unix workstations, a CRAY J90 and a CRAY T3E. Finally, due to the fact that visualization is essential for TJ-II data analysis, a powerful and a very flexible visualization tool has been developed. It is a point and click application based on X Window/Motif. Data access is carried out through the client/server processes mentioned above and the software runs in the client computer.
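
    The client side of such a request/response exchange over TCP/IP sockets can be sketched as below. The host, port and wire format (a newline-terminated signal name answered by a length-prefixed byte stream) are assumptions made for illustration; the actual TJ-II protocol is internal to its data servers.

import socket
import struct

def fetch_signal(host, port, signal_name):
    # Open a TCP connection, send the signal name, and read a reply whose
    # length is assumed to be announced in a 4-byte big-endian header.
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(signal_name.encode("ascii") + b"\n")
        (nbytes,) = struct.unpack("!I", sock.recv(4))
        data = b""
        while len(data) < nbytes:
            chunk = sock.recv(min(65536, nbytes - len(data)))
            if not chunk:
                raise ConnectionError("server closed the connection early")
            data += chunk
    return data

# usage with a hypothetical host and port:
# raw = fetch_signal("tj2-data.example.org", 5000, "DENSITY_01")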

  9. Distributed control system for demand response by servers

    Science.gov (United States)

    Hall, Joseph Edward

    Within the broad topical designation of smart grid, research in demand response, or demand-side management, focuses on investigating possibilities for electrically powered devices to adapt their power consumption patterns to better match generation and more efficiently integrate intermittent renewable energy sources, especially wind. Devices such as battery chargers, heating and cooling systems, and computers can be controlled to change the time, duration, and magnitude of their power consumption while still meeting workload constraints such as deadlines and rate of throughput. This thesis presents a system by which a computer server, or multiple servers in a data center, can estimate the power imbalance on the electrical grid and use that information to dynamically change the power consumption as a service to the grid. Implementation on a testbed demonstrates the system with a hypothetical but realistic usage case scenario of an online video streaming service in which there are workloads with deadlines (high-priority) and workloads without deadlines (low-priority). The testbed is implemented with real servers, estimates the power imbalance from the grid frequency with real-time measurements of the live outlet, and uses a distributed, real-time algorithm to dynamically adjust the power consumption of the servers based on the frequency estimate and the throughput of video transcoder workloads. Analysis of the system explains and justifies multiple design choices, compares the significance of the system in relation to similar publications in the literature, and explores the potential impact of the system.
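
    The core control idea (estimate the grid power imbalance from the frequency deviation and throttle deferrable work accordingly) can be sketched as below. The nominal frequency, gain and clamping are illustrative choices, not the thesis implementation.

NOMINAL_HZ = 60.0   # nominal grid frequency (50.0 Hz in Europe)
GAIN = 2.0          # fraction of low-priority capacity shed per Hz of deficit

def low_priority_share(measured_hz, current_share):
    # Frequency below nominal signals a generation shortfall, so reduce the
    # share of capacity given to deferrable (low-priority) transcoding work.
    imbalance = measured_hz - NOMINAL_HZ
    return min(1.0, max(0.0, current_share + GAIN * imbalance))

share = 1.0
for f in [60.00, 59.98, 59.95, 59.97, 60.02]:   # simulated frequency samples
    share = low_priority_share(f, share)
    print(f"f = {f:.2f} Hz -> run low-priority work at {share:.2f} of capacity")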

  10. Calibration and validation of the ¹⁴C-labelled polyethylene glycol-binding assay for tannins in tropical browse

    Energy Technology Data Exchange (ETDEWEB)

    Mlambo, V. [Animal Production Unit, FAO/IAEA Agriculture and Biotechnology Laboratory, Seibersdorf (Austria)]. E-mail: vmlambo@agric.uniswa.sz; Makkar, H.P.S. [Animal Production and Health Section, Joint FAO/IAEA Division of Nuclear Techniques in Agriculture and Food, International Atomic Energy Agency, Vienna (Austria)

    2005-08-19

    This study evaluates the radiolabelled polyethylene glycol (PEG)-binding procedure [Silanikove, N., Shinder, D., Gilboa, N., Eyal, M., Nitsan, Z., 1996. Polyethylene glycol-binding to plant samples as an assay for the biological effects of tannins: predicting the negative effects of tannins in Mediterranean browse on rumen degradation. J. Agric. Food Chem. 44, 3230-3234] for tannin analysis, using 27 tropical browse plants. In this method, the amount of PEG bound to a plant sample is assumed to be a reflection of its tannin content. The method was modified to exclude the use of non-tanniniferous substrate for estimating non-specific binding (NSB) in tannin-containing substrates. Non-specific binding values varied widely (0.4-2.8 mg PEG/100 mg DM tannin-free substrate) when the tannin-free substrate was changed from wheat straw to either rye grass or maize shoots. We therefore propose a modified radiolabelled PEG-binding method to estimate the level of PEG-binding (PEGb) to tannin-containing foliage without using tannin-free substrate to correct for non-specific binding. In this approach, incremental levels of each tanniniferous substrate were used to generate PEGb values. The resultant linear response was analysed and tannin activity was expressed as the slope of the response curve (PEGbSlope) observed for each substrate. The slope takes into account the non-specific binding in each substrate, thus PEGbSlope does not require correction for NSB using tannin-free samples. This approach improved the correlation between PEGb and the ¹²⁵I-labelled bovine serum albumin precipitation assay. Relationships between the modified PEG-binding assay and radiolabelled bovine serum albumin assay, in vitro tannin bioassay and colorimetric assays are presented. (author)

  11. Measuring SIP proxy server performance

    CERN Document Server

    Subramanian, Sureshkumar V

    2013-01-01

    Internet Protocol (IP) telephony is an alternative to the traditional Public Switched Telephone Networks (PSTN), and the Session Initiation Protocol (SIP) is quickly becoming a popular signaling protocol for VoIP-based applications. SIP is a peer-to-peer multimedia signaling protocol standardized by the Internet Engineering Task Force (IETF), and it plays a vital role in providing IP telephony services through its use of the SIP Proxy Server (SPS), a software application that provides call routing services by parsing and forwarding all the incoming SIP packets in an IP telephony network.SIP Pr

  12. Observing Strategies Used by Children When Selecting Books to Browse, Read or Borrow

    Directory of Open Access Journals (Sweden)

    Syahranah A. Raqi

    2008-06-01

    Full Text Available This paper describes (1) the investigation undertaken to trace the strategies used by children when selecting books to borrow, use or browse in two children’s public libraries, and (2) the mapping of the information-seeking patterns adopted by the selected children. The sample comprised 43 children who used the Bayan Budiman Children’s Library, Petaling Jaya and the Kuala Lumpur Children’s Library. The children were randomly chosen, aged between 7 and 12, and comprised those who entered the library with the observed behaviour of selecting books to browse, use or borrow. Data were collected in two stages: (1) observing each child’s behavior from entering the library to the point of picking up a book to browse, read or borrow, for fifteen to twenty minutes, and (2) interviewing those selected with a semi-structured questionnaire. Belkin et al.’s (1993) information search strategy (ISS) dimensions were used to transcribe the children’s browsing and selecting behavior. Based on the observations and interviews, respondents’ behaviour was mapped to illustrate the children’s choosing process. The findings indicated that (1) browsing was the most popular method used when choosing a book, combined with various strategies such as looking for a book by an author or series, finding a book by subject, visually or physically scanning, and recognizing the physical composition of the book; (2) children based their selection on the storyline, illustrations, cover designs and typography of the books; and (3) the searching behaviour is likely to be non-linear in nature. The majority of the children faced no problems in choosing or locating a book as most are regular visitors. A few indicated being overwhelmed by the library’s large collection or facing initial confusion before they started to browse and interact with resources. Children used visual cues rather than textual information when searching for books, suggesting that children’s libraries need to be supported with

  13. Tandem queue with server slow-down

    NARCIS (Netherlands)

    D.I. Miretskiy; W.R.W. Scheinhardt; M.R.H. Mandjes

    2007-01-01

    We study how rare events happen in the standard two-node tandem Jackson queue and in a generalization, the socalled slow-down network, see [2]. In the latter model the service rate of the first server depends on the number of jobs in the second queue: the first server slows down if the amount of job

  14. ENHANCING THE IMPREGNABILITY OF LINUX SERVERS

    Directory of Open Access Journals (Sweden)

    Rama Koteswara Rao G

    2014-03-01

    Full Text Available The worldwide IT industry is experiencing a rapid shift towards Service Oriented Architecture (SOA). As a response to the current trend, all IT firms are adopting business models such as cloud-based services which rely on reliable and highly available server platforms. Linux servers are known to be highly secure. Network security thus becomes a major concern to all IT organizations offering cloud-based services. The fundamental form of attack on network security is Denial of Service. This paper focuses on fortifying Linux server defence mechanisms, resulting in an increase in the reliability and availability of services offered by Linux server platforms. To meet this emerging scenario, most organizations are adopting business models such as cloud computing that are dependent on reliable server platforms. Linux servers are well ahead of other server platforms in terms of security. This brings network security to the forefront of major concerns to an organization. The most common form of attack is a Denial of Service attack. This paper focuses on mechanisms to detect and immunize Linux servers against DoS.
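
    One widely used hardening step against connection-flood DoS (given here as a generic example, not as the paper's specific mechanism) is to rate-limit new TCP connections with iptables. The sketch below applies such rules from Python; it must run as root, and the port and thresholds are illustrative.

import subprocess

RULES = [
    # reject any source address holding more than 50 simultaneous connections to port 80
    ["iptables", "-A", "INPUT", "-p", "tcp", "--syn", "--dport", "80",
     "-m", "connlimit", "--connlimit-above", "50", "-j", "REJECT"],
    # accept new connections at up to 20 per minute overall, with a burst of 40 ...
    ["iptables", "-A", "INPUT", "-p", "tcp", "--syn", "--dport", "80",
     "-m", "limit", "--limit", "20/minute", "--limit-burst", "40", "-j", "ACCEPT"],
    # ... and drop whatever exceeds that rate
    ["iptables", "-A", "INPUT", "-p", "tcp", "--syn", "--dport", "80", "-j", "DROP"],
]

for rule in RULES:
    subprocess.run(rule, check=True)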

  15. What's New in Apache Web Server 2.2?

    CERN Document Server

    Bowen, Rich

    2007-01-01

    What's New in Apache Web Server 2.2? shows you all the new features you'll need to know to set up and administer the Apache 2.2 web server. Learn how to take advantage of its improved caching, proxying, authentication, and other improvements in your Web 2.0 applications.

  16. Building Mail Server on Distributed Computing System

    Institute of Scientific and Technical Information of China (English)

    Akihiro Shibata; Osamu Hamada; et al.

    2001-01-01

    Electronic mail has become an indispensable function in daily work, and server stability and performance are required. Using DCE and DFS, we have built a distributed electronic mail server; that is, servers such as SMTP and IMAP are distributed symmetrically and provide seamless access.

  17. The NASA Technical Report Server

    Science.gov (United States)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Paulson, Sharon S.; Binkley, Robert L.; Kellogg, Yvonne D.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto

    1995-01-01

    The National Aeronautics and Space Act of 1958 established NASA and charged it to "provide for the widest practicable and appropriate dissemination of information concerning its activities and the results thereof." The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS is comprised of several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the service. The NTRS is largely constructed with freely available software running on existing hardware. NTRS builds upon existing hardware and software, and the resulting additional exposure for the body of literature contained ensures that NASA's institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.

  18. CERN servers donated to Ghana

    CERN Multimedia

    CERN Bulletin

    2012-01-01

    Cutting-edge research requires a constantly high performance of the computing equipment. At the CERN Computing Centre, computers typically need to be replaced after about four years of use. However, while servers may be withdrawn from cutting-edge use, they are still good for other uses elsewhere. This week, 220 servers and 30 routers were donated to the Kwame Nkrumah University of Science and Technology (KNUST) in Ghana.   “KNUST will provide a good home for these computers. The university has also developed a plan for using them to develop scientific collaboration with CERN,” said John Ellis, a professor at King’s College London and a visiting professor in CERN’s Theory Group.  John Ellis was heavily involved in building the relationship with Ghana, which started in 2006 when a Ghanaian participated in the CERN openlab student programme. Since 2007 CERN has hosted Ghanaians especially from KNUST in the framework of the CERN Summer Student Progr...

  19. Improving consensus contact prediction via server correlation reduction

    Directory of Open Access Journals (Sweden)

    Xu Jinbo

    2009-05-01

    Full Text Available Abstract Background Protein inter-residue contacts play a crucial role in the determination and prediction of protein structures. Previous studies on contact prediction indicate that although template-based consensus methods outperform sequence-based methods on targets with typical templates, such consensus methods perform poorly on new fold targets. However, we find out that even for new fold targets, the models generated by threading programs can contain many true contacts. The challenge is how to identify them. Results In this paper, we develop an integer linear programming model for consensus contact prediction. In contrast to the simple majority voting method assuming that all the individual servers are equally important and independent, the newly developed method evaluates their correlation by using maximum likelihood estimation and extracts independent latent servers from them by using principal component analysis. An integer linear programming method is then applied to assign a weight to each latent server to maximize the difference between true contacts and false ones. The proposed method is tested on the CASP7 data set. If the top L/5 predicted contacts are evaluated where L is the protein size, the average accuracy is 73%, which is much higher than that of any previously reported study. Moreover, if only the 15 new fold CASP7 targets are considered, our method achieves an average accuracy of 37%, which is much better than that of the majority voting method, SVM-LOMETS, SVM-SEQ, and SAM-T06. These methods demonstrate an average accuracy of 13.0%, 10.8%, 25.8% and 21.2%, respectively. Conclusion Reducing server correlation and optimally combining independent latent servers show a significant improvement over the traditional consensus methods. This approach can hopefully provide a powerful tool for protein structure refinement and prediction use.
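
    Schematically, the weight-assignment step can be written as an optimization of the following form, where c_k(i,j) is 1 if latent server k predicts the residue contact (i,j) and 0 otherwise, and T and F are sets of known true and false contacts in the training data; this is only an illustrative form, not the paper's exact formulation or constraints.

% illustrative form only
\max_{w_1,\dots,w_K} \;\; \sum_{(i,j)\in\mathcal{T}} \sum_{k=1}^{K} w_k\, c_k(i,j)
 \;-\; \sum_{(i,j)\in\mathcal{F}} \sum_{k=1}^{K} w_k\, c_k(i,j),
\qquad w_k \in \mathbb{Z}_{\ge 0}.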

  20. Ontobee: A linked ontology data server to support ontology term dereferencing, linkage, query and integration.

    Science.gov (United States)

    Ong, Edison; Xiang, Zuoshuang; Zhao, Bin; Liu, Yue; Lin, Yu; Zheng, Jie; Mungall, Chris; Courtot, Mélanie; Ruttenberg, Alan; He, Yongqun

    2017-01-04

    Linked Data (LD) aims to achieve interconnected data by representing entities using Unified Resource Identifiers (URIs), and sharing information using Resource Description Frameworks (RDFs) and HTTP. Ontologies, which logically represent entities and relations in specific domains, are the basis of LD. Ontobee (http://www.ontobee.org/) is a linked ontology data server that stores ontology information using RDF triple store technology and supports query, visualization and linkage of ontology terms. Ontobee is also the default linked data server for publishing and browsing biomedical ontologies in the Open Biological Ontology (OBO) Foundry (http://obofoundry.org) library. Ontobee currently hosts more than 180 ontologies (including 131 OBO Foundry Library ontologies) with over four million terms. Ontobee provides a user-friendly web interface for querying and visualizing the details and hierarchy of a specific ontology term. Using the eXtensible Stylesheet Language Transformation (XSLT) technology, Ontobee is able to dereference a single ontology term URI, and then output RDF/eXtensible Markup Language (XML) for computer processing or display the HTML information on a web browser for human users. Statistics and detailed information are generated and displayed for each ontology listed in Ontobee. In addition, a SPARQL web interface is provided for custom advanced SPARQL queries of one or multiple ontologies.
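
    A custom SPARQL query can be sent to the Ontobee triple store from a script, for example as in the sketch below. The endpoint URL, result-format parameter and graph layout are assumptions made for illustration; consult the SPARQL page linked from http://www.ontobee.org/ for the current endpoint and query examples.

import json
import urllib.parse
import urllib.request

ENDPOINT = "http://sparql.hegroup.org/sparql/"   # assumed Ontobee SPARQL endpoint

QUERY = """
SELECT DISTINCT ?term ?label WHERE {
  ?term <http://www.w3.org/2000/01/rdf-schema#label> ?label .
  FILTER regex(str(?label), "vaccine", "i")
} LIMIT 10
"""

params = urllib.parse.urlencode({"query": QUERY,
                                 "format": "application/sparql-results+json"})
with urllib.request.urlopen(ENDPOINT + "?" + params, timeout=30) as resp:
    results = json.load(resp)

# print each matching term URI with its label
for row in results["results"]["bindings"]:
    print(row["term"]["value"], "-", row["label"]["value"])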

  1. Make Communication Between Browser and Server by Using Cookie Technology

    Institute of Scientific and Technical Information of China (English)

    刘利军; 侯关士; 杨宗煦

    2001-01-01

    Taking the handling of employee information under the Browser/Server communication model as an example, this paper introduces a technique that the authors often use in the development of commercial Web sites: the Cookie is used skillfully as an intermediary between the ASP and JavaScript technologies, so that the browser and the Web server can communicate dynamically.
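
    As a language-neutral sketch of the same round trip (the record itself uses ASP on the server side), the small Python standard-library server below plays the server role: it places an employee name in a cookie, and the client-side script embedded in the page reads it back through document.cookie. Names and values are illustrative.

from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<html><body>
<script>
  // client side: read the value the server placed in the cookie
  document.write("Cookie seen by the browser: " + document.cookie);
</script>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Set-Cookie", "employee=Alice; Path=/")  # server side writes the cookie
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()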

  2. Remote sensing information sciences research group: Browse in the EOS era

    Science.gov (United States)

    Estes, John E.; Star, Jeffrey L.

    1989-01-01

    The problem of science data browse was examined. Given the tremendous data volumes that are planned for future space missions, particularly the Earth Observing System in the late 1990's, the need for access to large spatial databases must be understood. Work was continued to refine the concept of data browse. Further, software was developed to provide a testbed of the concepts, both to locate possibly interesting data, as well as view a small portion of the data. Build II was placed on a minicomputer and a PC in the laboratory, and provided accounts for use in the testbed. Consideration of the testbed software as an element of in-house data management plans was begun.

  3. A Quaternary-Stage User Interest Model Based on User Browsing Behavior and Web Page Classification

    Institute of Scientific and Technical Information of China (English)

    Zongli Jiang; Hang Su

    2012-01-01

    The key to a personalized search engine lies in the user model. The traditional personalized model causes the results of a secondary search to be biased toward long-term interests; in addition, forgetting of long-term interests prevents effective recollection of user interests. This paper presents a quaternary-stage user interest model based on user browsing behavior and web page classification, which draws on the principles of the cache and the recycle bin in operating systems: an illuminating text stage and a recycle-bin interest stage are set up before and after the traditional interest model, respectively, to constitute the quaternary-stage user interest model. The model can better reflect user interests by using an adaptive natural weight and its calculation method, and by efficiently integrating user browsing behavior and web document content.
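
    The cache/recycle-bin analogy can be made concrete with a data-structure sketch such as the one below. Stage names, thresholds and the promotion/decay rules are illustrative only; the paper's adaptive weights and update formulas are not reproduced here.

class QuaternaryInterestModel:
    def __init__(self):
        self.illuminating = {}   # candidate terms, like a cache admission stage
        self.short_term = {}     # confirmed short-term interests
        self.long_term = {}      # stable long-term interests
        self.recycle_bin = {}    # "forgotten" interests kept so they can be recalled

    def observe(self, term, weight=1.0):
        # A browsed term re-enters from the recycle bin, reinforces an existing
        # stage, or starts out in the illuminating stage.
        if term in self.recycle_bin:
            self.short_term[term] = self.recycle_bin.pop(term) + weight
        elif term in self.long_term:
            self.long_term[term] += weight
        elif term in self.short_term:
            self.short_term[term] += weight
        else:
            self.illuminating[term] = self.illuminating.get(term, 0.0) + weight

    def promote_and_decay(self, promote_at=3.0, keep_at=5.0, decay=0.5):
        # Periodic maintenance: promote strong candidates, move strong short-term
        # interests to long-term, and send decayed interests to the recycle bin.
        for term, w in list(self.illuminating.items()):
            if w >= promote_at:
                self.short_term[term] = self.illuminating.pop(term)
        for term, w in list(self.short_term.items()):
            if w >= keep_at:
                self.long_term[term] = self.short_term.pop(term)
            else:
                self.short_term[term] = w - decay
                if self.short_term[term] <= 0:
                    self.recycle_bin[term] = 0.0
                    del self.short_term[term]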

  4. Change in Mesoherbivore Browsing Is Mediated by Elephant and Hillslope Position.

    Science.gov (United States)

    Lagendijk, D D Georgette; Thaker, Maria; de Boer, Willem F; Page, Bruce R; Prins, Herbert H T; Slotow, Rob

    2015-01-01

    Elephant are considered major drivers of ecosystems, but their effects within small-scale landscape features and on other herbivores still remain unclear. Elephant impact on vegetation has been widely studied in areas where elephant have been present for many years. We therefore examined the combined effect of short-term elephant presence (elephant presence did not affect woody species assemblages, but did affect height distribution, with greater sapling densities in elephant access areas. Overall tree and stem densities were also not affected by elephant. By contrast, slope position affected woody species assemblages, but not height distributions and densities. Variation in species assemblages was statistically best explained by levels of total cations, Zinc, sand and clay. Although elephant and mesoherbivore browsing intensities were unaffected by slope position, we found lower mesoherbivore browsing intensity on crests with high elephant browsing intensity. Thus, elephant appear to indirectly facilitate the survival of saplings, via the displacement of mesoherbivores, providing a window of opportunity for saplings to grow into taller trees. In the short-term, effects of elephant can be minor and in the opposite direction of expectation. In addition, such behavioural displacement promotes recruitment of saplings into larger height classes. The interaction between slope position and elephant effect found here is in contrast with other studies, and illustrates the importance of examining ecosystem complexity as a function of variation in species presence and topography. The absence of a direct effect of elephant on vegetation, but the presence of an effect on mesoherbivore browsing, is relevant for conservation areas especially where both herbivore groups are actively managed.

  5. Search the Audio, Browse the Video—A Generic Paradigm for Video Collections

    OpenAIRE

    Efrat Alon; Amir Arnon; Srinivasan Savitha

    2003-01-01

    The amount of digital video being shot, captured, and stored is growing at a rate faster than ever before. The large amount of stored video is not penetrable without efficient video indexing, retrieval, and browsing technology. Most prior work in the field can be roughly categorized into two classes. One class is based on image processing techniques, often called content-based image and video retrieval, in which video frames are indexed and searched for visual content. The other class is bas...

  6. Using Context Dependent Semantic Similarity to Browse Information Resources: an Application for the Industrial Design

    CERN Document Server

    Albertoni, Riccardo

    2010-01-01

    This paper deals with the semantic interpretation of information resources (e.g., images, videos, 3D models). We present a case study of an approach based on semantic and context dependent similarity applied to the industrial design. Different application contexts are considered and modelled to browse a repository of 3D digital objects according to different perspectives. The paper briefly summarises the basic concepts behind the semantic similarity approach and illustrates its application and results.

  7. Decreasing deer browsing pressure influenced understory vegetation dynamics over 30 years

    OpenAIRE

    Boulanger, Vincent; Baltzinger, Christophe; Saïd, Sonia; Ballon, Philippe; Picard, Jean-Francois; Dupouey, Jean-Luc

    2015-01-01

    Key message Thanks to the concomitant recordings of vegetation and deer browsing sampled first in 1976, then resurveyed in 2006, we show that forest plant communities shifted in response to deer population dynamics, stand management and eutrophication. Context and aims High deer populations alter forest understory dynamics worldwide. However, no study has attempted to rank the importance of deer herbivory relative to other environmental drivers. In the Arc-en-Barrois National Forest (...

  8. Effect of browsing on willow in the Steel Creek grazing allotment

    Science.gov (United States)

    Keigley, R.B.; Gale, Gil

    2000-01-01

    The Steel Creek drainage serves as both wildlife range (primarily moose and elk) and as a livestock grazing allotment. For some years there has been concern about the effect of browsing on willows. Dense clusters of twigs have formed at the end of branches; entire stems of some plants have died. As of 1996, the relative impacts attributable to each of the ungulate species had not been documented.

  9. Using the NCBI Map Viewer to browse genomic sequence data.

    Science.gov (United States)

    Wolfsberg, Tyra G

    2011-04-01

    This unit includes a basic protocol with an introduction to the Map Viewer, describing how to perform a simple text-based search of genome annotations to view the genomic context of a gene, navigate along a chromosome, zoom in and out, and change the displayed maps to hide and show information. It also describes some of NCBI's sequence-analysis tools, which are provided as links from the Map Viewer. The alternate protocols describe different ways to query the genome sequence, and also illustrate additional features of the Map Viewer. Alternate Protocol 1 shows how to perform and interpret the results of a BLAST search against the human genome. Alternate Protocol 2 demonstrates how to retrieve a list of all genes between two STS markers. Finally, Alternate Protocol 3 shows how to find all annotated members of a gene family.

  10. Finding, Browsing and Getting Data Easily Using SPDF Web Services

    Science.gov (United States)

    Candey, R.; Chimiak, R.; Harris, B.; Johnson, R.; Kovalick, T.; Lal, N.; Leckner, H.; Liu, M.; McGuire, R.; Papitashvili, N.; Roberts, A.

    2010-01-01

    The NASA GSFC Space Physics Data Facility (SPDF) provides heliophysics science-enabling information services for enhancing scientific research and enabling integration of these services into the Heliophysics Data Environment paradigm, via standards-based Simple Object Access Protocol (SOAP) and Representational State Transfer (REST) web services in addition to web browser, FTP, and OPeNDAP interfaces. We describe these interfaces and the philosophies behind these web services, and show how to call them from various languages, such as IDL and Perl. We are working towards a "one simple line to call" philosophy extolled in the recent VxO discussions. Combining data from many instruments and missions enables broad research analysis and correlation and coordination with other experiments and missions.

  11. Search the Audio, Browse the Video—A Generic Paradigm for Video Collections

    Science.gov (United States)

    Amir, Arnon; Srinivasan, Savitha; Efrat, Alon

    2003-12-01

    The amount of digital video being shot, captured, and stored is growing at a rate faster than ever before. The large amount of stored video is not penetrable without efficient video indexing, retrieval, and browsing technology. Most prior work in the field can be roughly categorized into two classes. One class is based on image processing techniques, often called content-based image and video retrieval, in which video frames are indexed and searched for visual content. The other class is based on spoken document retrieval, which relies on automatic speech recognition and text queries. Both approaches have major limitations. In the first approach, semantic queries pose a great challenge, while the second, speech-based approach, does not support efficient video browsing. This paper describes a system where speech is used for efficient searching and visual data for efficient browsing, a combination that takes advantage of both approaches. A fully automatic indexing and retrieval system has been developed and tested. Automated speech recognition and phonetic speech indexing support text-to-speech queries. New browsable views are generated from the original video. A special synchronized browser allows instantaneous, context-preserving switching from one view to another. The system was successfully used to produce searchable-browsable video proceedings for three local conferences.

  12. Search the Audio, Browse the Video—A Generic Paradigm for Video Collections

    Directory of Open Access Journals (Sweden)

    Efrat Alon

    2003-01-01

    Full Text Available The amount of digital video being shot, captured, and stored is growing at a rate faster than ever before. The large amount of stored video is not penetrable without efficient video indexing, retrieval, and browsing technology. Most prior work in the field can be roughly categorized into two classes. One class is based on image processing techniques, often called content-based image and video retrieval, in which video frames are indexed and searched for visual content. The other class is based on spoken document retrieval, which relies on automatic speech recognition and text queries. Both approaches have major limitations. In the first approach, semantic queries pose a great challenge, while the second, speech-based approach, does not support efficient video browsing. This paper describes a system where speech is used for efficient searching and visual data for efficient browsing, a combination that takes advantage of both approaches. A fully automatic indexing and retrieval system has been developed and tested. Automated speech recognition and phonetic speech indexing support text-to-speech queries. New browsable views are generated from the original video. A special synchronized browser allows instantaneous, context-preserving switching from one view to another. The system was successfully used to produce searchable-browsable video proceedings for three local conferences.

  13. Software for browsing sectioned images of a dog body and generating a 3D model.

    Science.gov (United States)

    Park, Jin Seo; Jung, Yong Wook

    2016-01-01

    The goals of this study were (1) to provide accessible and instructive browsing software for sectioned images and a portable document format (PDF) file that includes three-dimensional (3D) models of an entire dog body and (2) to develop techniques for segmentation and 3D modeling that would enable an investigator to perform these tasks without the aid of a computer engineer. To achieve these goals, relatively important or large structures in the sectioned images were outlined to generate segmented images. The sectioned and segmented images were then packaged into browsing software. In this software, structures in the sectioned images are shown in detail and in real color. After 3D models were made from the segmented images, the 3D models were exported into a PDF file. In this format, the 3D models could be manipulated freely. The browsing software and PDF file are available for study by students, for lectures by teachers, and for training of clinicians. These files will be helpful for anatomical study by and clinical training of veterinary students and clinicians. Furthermore, these techniques will be useful for researchers who study two-dimensional images and 3D models.

  14. Randomized assignment of jobs to servers in heterogeneous clusters of shared servers for low delay

    Directory of Open Access Journals (Sweden)

    Arpan Mukhopadhyay

    2016-11-01

    Full Text Available We consider the problem of assigning jobs to servers in a multi-server system consisting of N parallel processor-sharing servers, categorized into M (≪N) different types according to their processing capacities or speeds. Jobs of random sizes arrive at the system according to a Poisson process with rate Nλ. Upon each arrival, some servers of each type are sampled uniformly at random. The job is then assigned to one of the sampled servers based on their states. We propose two schemes, which differ in the metric for choosing the destination server for each arriving job. Our aim is to reduce the mean sojourn time of the jobs in the system. It is shown that the proposed schemes achieve the maximal stability region, without requiring knowledge of the system parameters. The performance of the system operating under the proposed schemes is analyzed in the limit as N→∞. This gives rise to a mean field limit. The mean field is shown to have a unique, globally asymptotically stable equilibrium point which approximates the stationary distribution of load at each server. Asymptotic independence among the servers is established using a notion of intra-type exchangeability which generalizes the usual notion of exchangeability. It is further shown that the tail distribution of server occupancies decays doubly exponentially for each server type. Numerical evidence shows that at high load the proposed schemes perform at least as well as other schemes that require more knowledge of the system parameters.
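
    A toy simulation of the sampling-based assignment described above is sketched below; the number of sampled servers, the choice of "state" metric (jobs in progress) and all parameters are illustrative, not the paper's two schemes.

import random

def assign(servers, d=2):
    # Sample d servers uniformly at random and route the job to the sampled
    # server currently holding the fewest jobs (its "state").
    sampled = random.sample(servers, d)
    target = min(sampled, key=lambda s: s["jobs"])
    target["jobs"] += 1
    return target["id"]

# heterogeneous pool: two server types with different speeds
servers = [{"id": i, "speed": 1.0 if i % 2 else 2.0, "jobs": 0} for i in range(100)]
for _ in range(500):
    assign(servers, d=2)
print("max occupancy:", max(s["jobs"] for s in servers))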

  15. Indian accent text-to-speech system for web browsing

    Indian Academy of Sciences (India)

    Aniruddha Sen; K Samudravijaya

    2002-02-01

    Incorporation of speech and Indian scripts can greatly enhance the accessibility of web information among common people. This paper describes a ‘web reader’ which ‘reads out’ the textual contents of a selected web page in Hindi or in English with Indian accent. The content of the page is downloaded and parsed into suitable textual form. It is then passed on to an indigenously developed text-to-speech system for Hindi/Indian English, to generate spoken output. The text-to-speech conversion is performed in three stages: text analysis, to establish pronunciation, phoneme to acoustic-phonetic parameter conversion and, lastly, parameter-to-speech conversion through a production model. Different types of voices are used to read special messages. The web reader detects the hypertext links in the web pages and gives the user the option to follow the link or continue perusing the current web page. The user can exercise the option either through a keyboard or via spoken commands. Future plans include refining the web parser, improvement of naturalness of synthetic speech and improving the robustness of the speech recognition system.

  16. Lively data: discover, browse and access ocean altimetry data on internet

    Directory of Open Access Journals (Sweden)

    V. Rosmorduc

    2006-01-01

    Full Text Available The Products and Services (P&S) department in the Space Oceanography Division at CLS (Collecte, Localisation, Satellites) is in charge of distributing and promoting altimetry and operational oceanography data. The department is thus involved in the Aviso satellite altimetry project (the French service which has distributed altimetry products since 1992), in the Mercator ocean operational forecasting system, and in the European Godae/Mersea ocean portal. Aiming at a standardisation and a common vision and management of all these ocean data, these projects led to the implementation of several Opendap/LAS Internet servers (Baudel et al., 2004). Some of the possibilities of the tools, as well as how-to information, will be highlighted, as they are in the "Lively data" section of the Aviso website (see http://www.aviso.oceanobs.com/html/donnees/las/). Moreover, with two years of experience we now have some feedback and analysis of how people – users, would-be users and students alike – are using this tool, some ideas for possible enhancements, etc.

  17. Effect of tropical browse leaves supplementation on rumen enzymes of sheep and goats fed Dichanthium annulatum grass-based diets.

    Science.gov (United States)

    Singh, Sultan; Kundu, S S

    2010-08-01

    In a switch-over experiment, eight male animals, four each of sheep and goats of local breeds with mean body weight of 26. 8 +/- 2.0 and 30.0 +/- 2.1 kg, were fed Dichanthium annulatum (DA) grass and four browse species viz. Helictris isora, Securengia virosa, Leucaena leucocephala (LL) and Hardwickia binnata (HB) in four feeding trials to assess their supplementary effect on activity of rumen enzymes. The sheep and goats were offered DA grass with individual browse in 75:25 and 50:50 proportions, respectively, for more than 3 months during each feeding trial, and rumen liquor samples were collected twice at 0 and 4 h post feeding after 60 and 90 days of feeding. Glutamate oxaloacetate transaminase (GOT), glutamate pyruvate transaminase (GPT) and glutamate dehydrogenase (GDH) enzymes were determined in the bacteria and protozoa fractions of rumen liquor, while cellulase enzyme activity was measured in mixed rumen liquor. LL and HB had the highest and lowest contents of CP, while fibre contents were lower in early than later browse leaves. Supplementation of browse leaves significantly (P goats on all DA grass-browse-supplemented diets except DA-HB (42.8 units/mg protein), where activity was significantly (P grass. Browse leaves significantly (P diet (144.8 microg sugar/mg protein). Goat exhibited higher activities of GOT and GPT than sheep in both bacteria and protozoa fraction of rumen liquor, while cellulase activity was similar between the animal species on the grass-browse leaves diets. Results indicate that browse leaves supplementation affect the enzyme activities of sheep and goats rumen, while the goats rumen liquor had higher activities of GOT, GPT and GDH enzyme than sheep.

  18. The Medicago truncatula gene expression atlas web server

    Directory of Open Access Journals (Sweden)

    Tang Yuhong

    2009-12-01

    Full Text Available Abstract Background Legumes (Leguminosae or Fabaceae) play a major role in agriculture. Transcriptomics studies in the model legume species, Medicago truncatula, are instrumental in helping to formulate hypotheses about the role of legume genes. With the rapid growth of publicly available Affymetrix GeneChip Medicago Genome Array data from a great range of tissues, cell types, growth conditions, and stress treatments, the legume research community desires an effective bioinformatics system to aid efforts to interpret the Medicago genome through functional genomics. We developed the Medicago truncatula Gene Expression Atlas (MtGEA) web server for this purpose. Description The Medicago truncatula Gene Expression Atlas (MtGEA) web server is a centralized platform for analyzing the Medicago transcriptome. Currently, the web server hosts gene expression data from 156 Affymetrix GeneChip® Medicago genome arrays in 64 different experiments, covering a broad range of developmental and environmental conditions. The server enables flexible, multifaceted analyses of transcript data and provides a range of additional information about genes, including different types of annotation and links to the genome sequence, which help users formulate hypotheses about gene function. Transcript data can be accessed using Affymetrix probe identification number, DNA sequence, gene name, functional description in natural language, GO and KEGG annotation terms, and InterPro domain number. Transcripts can also be discovered through co-expression or differential expression analysis. Flexible tools to select a subset of experiments and to visualize and compare expression profiles of multiple genes have been implemented. Data can be downloaded, in part or in full, in a tabular form compatible with common analytical and visualization software. The web server will be updated on a regular basis to incorporate new gene expression data and genome annotation, and is accessible

  19. Web server with ATMEGA 2560 microcontroller

    Science.gov (United States)

    Răduca, E.; Ungureanu-Anghel, D.; Nistor, L.; Haţiegan, C.; Drăghici, S.; Chioncel, C.; Spunei, E.; Lolea, R.

    2016-02-01

    This paper presents the design and building of a web server to command, control and monitor, at a distance, a variety of industrial or personal equipment and/or sensors. The server runs custom software, which can be written by users and works with many types of operating system. The authors implemented the web server on two boards, a microcontroller board and a network board. The source code was written in the open-source Arduino 1.0.5 environment.

  20. Windows Server 2012 ja Active Directory

    OpenAIRE

    2015-01-01

    The topic of this thesis was to explore the services included in Windows Server 2012 and to take a closer look at the basic use of Active Directory. The goal was to give the reader an understanding of the usage possibilities offered by Windows Server 2012 and of how Active Directory is used. The theoretical basis of the thesis consisted of the use of a virtual environment and the various services of Windows Server 2012. It covered, for example, the following concepts: virtualization, emulation, ...

  1. Getting started with SQL Server 2014 administration

    CERN Document Server

    Ellis, Gethyn

    2014-01-01

    This is an easy-to-follow, hands-on tutorial that includes real-world examples of SQL Server 2014's new features. Each chapter is explained in a step-by-step manner which guides you to implement the new technology. If you want to create a highly efficient database server then this book is for you. This book is for database professionals and system administrators who want to use the added features of SQL Server 2014 to create a hybrid environment, which is both highly available and allows you to get the best performance from your databases.

  2. Environment server. Digital field information archival technology

    Energy Technology Data Exchange (ETDEWEB)

    Kita, Nobuyuki; Kita, Yasuyo; Yang, Hai-quan [National Institute of Advanced Industrial Science and Technology, Intelligent Systems Research Institute, Tsukuba, Ibaraki (Japan)

    2002-01-01

    For the safe operation of nuclear power plants, it is important to store various information about the plants for a long period and to visualize the stored information as desired. A system called Environment Server was developed to realize this. In this paper, the general concepts of Environment Server are explained, and its partial implementation for archiving the image information gathered by inspection mobile robots into a virtual world and visualizing it is described. An extension of Environment Server for supporting attention sharing is also briefly introduced. (author)

  3. Application Server Aging Model and Multi-Level Rejuvenation Strategy Using Semi-Markov Process

    Institute of Scientific and Technical Information of China (English)

    ZHAO Tianhai; QI Yong; SHEN Junyi; HOU Di; ZHENG Xiaomei; LIU Liang

    2006-01-01

    Considering the dependency between application components and the application server platform, a rejuvenation strategy with two different levels of rejuvenation granularity is put forward in this paper, including application component rejuvenation and application server system rejuvenation. The availability and maintenance cost functions are obtained by establishing an application server aging model, and the boundary condition of the optimal rejuvenation time is analyzed. Theoretical analysis indicates that the two-level rejuvenation strategy is superior to the traditional single-level one. Finally, evaluation experiments are carried out and numerical results show that, compared with the traditional rejuvenation policy, the rejuvenation strategy proposed in this paper can further increase the availability of the application server and reduce maintenance cost.

  4. Exam 70-411 administering Windows Server 2012

    CERN Document Server

    Course, Microsoft Official Academic

    2014-01-01

    Microsoft Windows Server is a multi-purpose server designed to increase reliability and flexibility of  a network infrastructure. Windows Server is the paramount tool used by enterprises in their datacenter and desktop strategy. The most recent versions of Windows Server also provide both server and client virtualization. Its ubiquity in the enterprise results in the need for networking professionals who know how to plan, design, implement, operate, and troubleshoot networks relying on Windows Server. Microsoft Learning is preparing the next round of its Windows Server Certification program

  5. FalconStor iSCSI Storage Server for Window Storage Server 2003

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    FalconStor iSCSI Storage Server for Windows Storage Server 2003 is the first iSCSI target and storage management product on the market built on the Windows Storage Server 2003 platform. Over existing enterprise networks, it provides a comprehensive, highly available and cost-effective IP SAN storage system.

  6. DSP: a protein shape string and its profile prediction server.

    Science.gov (United States)

    Sun, Jiangming; Tang, Shengnan; Xiong, Wenwei; Cong, Peisheng; Li, Tonghua

    2012-07-01

    Many studies have demonstrated that the shape string is an extremely important structure representation, since it is more complete than the classical secondary structure. The shape string also provides detailed information in the regions denoted random coil. But few services are available for systematic analysis of protein shape strings. To fill this gap, we have developed an accurate shape string predictor based on two innovative technologies: a knowledge-driven sequence alignment and a sequence shape string profile method. The performance on blind test data demonstrates that the proposed method can be used for accurate prediction of protein shape strings. The DSP server provides both the predicted shape string and the sequence shape string profile for each query sequence. Using this information, users can compare protein structures or display protein evolution in shape string space. The DSP server is available at both http://cheminfo.tongji.edu.cn/dsp/ and its main mirror http://chemcenter.tongji.edu.cn/dsp/.

  7. Striping and Scheduling for Large Scale Multimedia Servers

    Institute of Scientific and Technical Information of China (English)

    Kyung-Oh Lee; Jun-Ho Park; Yoon-Young Park

    2004-01-01

    When designing a multimedia server, several things must be decided: which scheduling scheme to adopt, how to allocate multimedia objects on storage devices, and the round length with which the streams will be serviced. Several problems in the design of large-scale multimedia servers are addressed, with the following contributions: (1) a striping scheme is proposed that minimizes the number of seeks and hence maximizes performance; (2) a simple and efficient mechanism is presented to find the optimal striping unit size as well as the optimal round length, which exploits both the characteristics of VBR streams and the state of resources in the system; and (3) the characteristics and resource requirements of several scheduling schemes are investigated in order to obtain a clear indication as to which scheme shows the best performance in real-time multimedia servicing. Based on our analysis and experimental results, the CSCAN scheme outperforms the other schemes.
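
    The abstract does not spell out the CSCAN algorithm itself; as a reminder of how circular-SCAN ordering works (a generic textbook version, not necessarily the exact variant the authors evaluated), here is a minimal Python sketch.

```python
def cscan_order(head, requests):
    """Return pending track/block requests in circular-SCAN service order.

    The head sweeps in one direction (increasing positions), services every
    request it passes, then jumps back to the lowest pending position and
    continues sweeping in the same direction.
    """
    ahead = sorted(r for r in requests if r >= head)
    behind = sorted(r for r in requests if r < head)
    return ahead + behind

# Example: head at position 50, pending requests on either side of it.
print(cscan_order(50, [82, 17, 43, 140, 24, 16, 190]))
# -> [82, 140, 190, 16, 17, 24, 43]
```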

  8. Mining the SDSS SkyServer SQL queries log

    Science.gov (United States)

    Hirota, Vitor M.; Santos, Rafael; Raddick, Jordan; Thakar, Ani

    2016-05-01

    SkyServer, the Internet portal for the Sloan Digital Sky Survey (SDSS) astronomical catalog, provides a set of tools that allow data access for astronomers and for science education. One of SkyServer's data access interfaces allows users to enter ad-hoc SQL statements to query the catalog. SkyServer also presents some template queries that can be used as a basis for more complex queries. This interface has logged over 330 million queries submitted since 2001. It is expected that analysis of this data can be used to investigate usage patterns, identify potential new classes of queries, find similar queries, etc., and to shed some light on how users interact with the Sloan Digital Sky Survey data and how scientists have adopted the new paradigm of e-Science, which could in turn lead to enhancements of the user interfaces and experience in general. In this paper we review some approaches to SQL query mining, apply the traditional techniques used in the literature and present lessons learned, namely, that the general text mining approach for feature extraction and clustering does not seem to be adequate for this type of data, and, most importantly, we find that this type of analysis can result in very different queries being clustered together.
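
    As an illustration of the general text-mining approach mentioned above (TF-IDF features plus k-means clustering of query strings), the following Python sketch uses scikit-learn. The example queries and parameters are invented; this is not the authors' actual pipeline.

```python
# Cluster ad-hoc SQL query strings by lexical similarity.
# Example queries and the number of clusters are arbitrary choices.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

queries = [
    "SELECT objID, ra, dec FROM PhotoObj WHERE ra BETWEEN 180 AND 181",
    "SELECT TOP 10 * FROM SpecObj WHERE z > 0.3",
    "SELECT p.objID FROM PhotoObj p JOIN SpecObj s ON p.objID = s.bestObjID",
    "SELECT ra, dec FROM PhotoObj WHERE dec > 0",
]

vectorizer = TfidfVectorizer(lowercase=True, token_pattern=r"[A-Za-z_]+")
features = vectorizer.fit_transform(queries)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
for query, label in zip(queries, kmeans.labels_):
    print(label, query)
```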

  9. Mastering Windows Server 2012 R2

    CERN Document Server

    Minasi, Mark; Booth, Christian; Butler, Robert; McCabe, John; Panek, Robert; Rice, Michael; Roth, Stefan

    2013-01-01

    Check out the new Hyper-V, find new and easier ways to remotely connect back into the office, or learn all about Storage Spaces-these are just a few of the features in Windows Server 2012 R2 that are explained in this updated edition from Windows authority Mark Minasi and a team of Windows Server experts led by Kevin Greene. This book gets you up to speed on all of the new features and functions of Windows Server, and includes real-world scenarios to put them in perspective. If you're a system administrator upgrading to, migrating to, or managing Windows Server 2012 R2, find what you need to

  10. Geologic Hazards Science Center GIS Server

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Geologic Hazards Science Center (GHSC) in Golden, CO maintains a GIS server with services pertaining to various geologic hazard disciplines involving...

  11. Membangun Server Multicast Berbasis Streaming Menggunakan Centos

    Directory of Open Access Journals (Sweden)

    Irwan Susanto

    2009-11-01

    Full Text Available The development of IP-based technology contributes to the development of telecommunication and information technology. One application of IP-based technology is multicast streaming, as part of broadcasting. The streaming process is carried out by accessing the Telkom-2 broadcast through the AKATEL LAN network; the server then forwards it to clients using the multicast IP system. A multicast IP is a class D IP address, which is able to send data packages in real time. In a multicast system, the server sends only one data package to several clients at the same transmission speed. The Telkom-2 broadcast is first accessed before being sent as data packages. The server accesses the Telkom-2 broadcast using a parabolic antenna and a Hughes modem, then forwards it to clients through the AKATEL LAN network. Clients must connect to the server via the AKATEL LAN network and have VLC player already installed in order to be able to access the Telkom-2 broadcast.
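
    To illustrate the class D (multicast) addressing the abstract refers to, here is a bare-bones Python sketch of a sender that pushes one datagram to a multicast group. The group address, port, TTL and payload are arbitrary examples; the system described above streams video with VLC rather than raw sockets.

```python
# Minimal multicast sender: one UDP datagram to a class D group address.
import socket

GROUP, PORT = "239.255.0.1", 5004   # example class D (multicast) address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
# Restrict how far the datagram propagates (number of router hops).
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
sock.sendto(b"example payload", (GROUP, PORT))
sock.close()
```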

  12. Miniaturized Airborne Imaging Central Server System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a miniaturized airborne imaging central server system (MAICSS). MAICSS is designed as a high-performance-computer-based electronic backend that...

  13. Miniaturized Airborne Imaging Central Server System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a miniaturized airborne imaging central server system (MAICSS). MAICSS is designed as a high-performance computer-based electronic backend that...

  14. Preference of goats (Capra hircus L.) for tanniniferous browse species available in semi-arid areas in Ethiopia.

    Science.gov (United States)

    Mengistu, G; Bezabih, M; Hendriks, W H; Pellikaan, W F

    2016-11-29

    The objectives were to determine browse species preference of goats using dry matter intake (DMI) as a proxy, to compare preference when offered in combination with polyethylene glycol (PEG) and to establish relationships between browse species intake and chemical compositional data. Air-dried leaves of Acacia etbaica, Cadaba farinosa, Capparis tomentosa, Dichrostachys cinerea, Dodonaea angustifolia, Euclea racemosa, Maerua angolensis, Maytenus senegalensis, Rhus natalensis and Senna singueana were used. Two cafeteria trials, each lasting 10 days, were conducted using four local mature male goats of 2-2.5 years receiving a daily ration of grass hay (4% of body weight) and 200 g wheat bran. In trial 1, goats were offered 25 g of each browse species for a total of 30 min with intake, time spent on consumption and the number of visits to specific browse species recorded at 10-min intervals. In trial 2, the same procedure was followed except that 25 g of PEG 4000 was added to the daily wheat bran ration. Crude protein and neutral detergent fibre in browse species ranged from 69.0-245.5 to 159.8-560.6 g/kg dry matter (DM) respectively. Total phenols and total tannins contents ranged between 3.7-70.6 and 2.5-68.1 mg tannic acid equivalent/g DM, respectively, and condensed tannins 1.7-18.4 Abs550 nm /g DM. Preference indicators measured in the first 10 min of browse species intake differed significantly among browse species and with PEG (p < 0.0001). Principal components explained 69.9% of the total variation in browse species DMI. Despite the high tannin levels, D. cinerea, R. natalensis and A. etbaica were the most preferred species regardless of PEG presence. Tannin levels at the observed browse species DMI did not determine preference, instead, preference appeared to be based on hemicellulose. Determining browse species preference is essential to exploit them to improve nutrient utilization and control parasites in goats.

  15. Conversation Threads Hidden within Email Server Logs

    Science.gov (United States)

    Palus, Sebastian; Kazienko, Przemysław

    Email server logs contain records of all email exchanged through the server. Often we would like to analyse those emails not separately but as conversation threads, especially when we need to analyse a social network extracted from the email logs. Unfortunately, each mail is in a different record and those records are not tied to each other in any obvious way. In this paper a method for discussion thread extraction is proposed, together with experiments on two different data sets - Enron and WrUT.
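
    The paper's own extraction method is not spelled out in this abstract; purely as an illustration of one naive way to rebuild threads from flat log records (grouping by normalized subject line, which is an assumption and not the authors' algorithm), consider the following Python sketch.

```python
# Naive thread reconstruction: group messages whose subjects match after
# stripping reply/forward prefixes. Illustrative heuristic only.
import re
from collections import defaultdict

messages = [  # (sender, subject) pairs pulled from a server log (made up)
    ("alice@example.com", "Quarterly report"),
    ("bob@example.com", "RE: Quarterly report"),
    ("carol@example.com", "Fwd: Quarterly report"),
    ("dave@example.com", "Lunch on Friday?"),
]

def normalize(subject: str) -> str:
    # Remove any number of leading "Re:", "Fw:", "Fwd:" markers.
    return re.sub(r"^(\s*(re|fwd?)\s*:\s*)+", "", subject, flags=re.I).strip().lower()

threads = defaultdict(list)
for sender, subject in messages:
    threads[normalize(subject)].append(sender)

for topic, participants in threads.items():
    print(topic, "->", participants)
```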

  16. 桌面型Windows Server 2008

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Windows Server 2008 is a server operating system; used as a desktop it is not as convenient as Windows XP or Windows 7. This article shows how, with a few configuration changes that do not affect its performance as a server system, Windows Server 2008 can be turned into a more convenient desktop-style system.

  17. ON THE SINGLE SERVER RETRIAL QUEUE WITH PRIORITY SUBSCRIBERS AND SERVER BREAKDOWNS

    Institute of Scientific and Technical Information of China (English)

    Jinting WANG

    2008-01-01

    The author considers the reliability evaluation as well as the queueing analysis of M1, M2/G1, G2/1 retrial queues with two different types of primary customers arriving according to independent Poisson flows. In the case of blocking, the first type of customers can be queued, whereas the second type of customers must leave the service area but return after some random period of time to try their luck again. The author assumes that the server is unreliable and has a service-type dependent, exponentially distributed lifetime as well as a service-type dependent, generally distributed repair time. The necessary and sufficient condition for the system to be stable is investigated. Using a supplementary variable method, the author obtains a steady-state solution for queueing measures, and the transient as well as the steady-state solutions for reliability measures of interest.

  18. An efficient biometric and password-based remote user authentication using smart card for Telecare Medical Information Systems in multi-server environment.

    Science.gov (United States)

    Maitra, Tanmoy; Giri, Debasis

    2014-12-01

    Medical organizations have introduced the Telecare Medical Information System (TMIS) to provide a reliable facility by which a patient who is unable to visit a doctor during a critical or urgent period can communicate with a doctor through a medical server via the internet from home. An authentication mechanism is needed in TMIS to protect the secret information of both parties, namely a server and a patient. Recent research includes a patient's biometric information as well as a password to design a remote user authentication scheme that enhances the security level. In a single-server environment, one server is responsible for providing services to all the authorized remote patients. However, a problem arises if a patient wishes to access several branch servers: he/she needs to register with each branch server individually. In 2014, Chuang and Chen proposed a remote user authentication scheme for the multi-server environment. In this paper, we show that in their scheme a non-registered adversary can successfully log in to the system as a valid patient. To resist these weaknesses, we propose an authentication scheme for TMIS in a multi-server environment where patients register only once with a root telecare server called the registration center (RC) and then obtain services from all the telecare branch servers through their registered smart card. Security analysis and comparison show that our proposed scheme provides better security with low computational and communication cost.
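
    The scheme's actual messages are not given in this abstract; purely as a generic illustration of the kind of hashed verifier that smart-card schemes in this family compute at registration and recheck at login, here is a Python sketch. All names and the hash construction are assumptions, not the proposed protocol.

```python
# Generic illustration: a registration center derives a verifier from the
# patient's identity, password and a biometric template hash, and the same
# value is recomputed at login. NOT the scheme proposed in the paper.
import hashlib
import os

def h(*parts: bytes) -> bytes:
    digest = hashlib.sha256()
    for part in parts:
        digest.update(part)
    return digest.digest()

# Registration (performed once at the registration center RC)
salt = os.urandom(16)
identity, password = b"patient-001", b"correct horse battery staple"
biometric_template = b"placeholder biometric feature vector"
verifier = h(identity, salt, h(password, biometric_template))

# Login attempt: the smart card recomputes the verifier from fresh inputs.
candidate = h(identity, salt, h(password, biometric_template))
print("accepted" if candidate == verifier else "rejected")
```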

  19. The tissue micro-array data exchange specification: a web based experience browsing imported data

    Directory of Open Access Journals (Sweden)

    Ayers Leona W

    2005-08-01

    Full Text Available Abstract Background The AIDS and Cancer Specimen Resource (ACSR) is an HIV/AIDS tissue bank consortium sponsored by the National Cancer Institute (NCI) Division of Cancer Treatment and Diagnosis (DCTD). The ACSR offers approved researchers HIV-infected biologic samples and uninfected control tissues, including tissue cores in micro-arrays (TMA), accompanied by de-identified clinical data. Researchers interested in the type and quality of TMA tissue cores and the associated clinical data need an efficient method for viewing available TMA materials. Because each of the tissue samples within a TMA has separate data, including a core tissue digital image and clinical data, an organized, standard approach to producing, navigating and publishing such data is necessary. The Association for Pathology Informatics (API) extensible mark-up language (XML) TMA data exchange specification (TMA DES) proposed in April 2003 provides a common format for TMA data. Exporting TMA data into the proposed format offers an opportunity to implement the API TMA DES. Using our public BrowseTMA tool, we created a web site that organizes and cross-references TMA lists, digital "virtual slide" images, TMA DES export data, linked legends and clinical details for researchers. Microsoft Excel® and Microsoft Word® are used to convert tabular clinical data and produce an XML file in the TMA DES format. The BrowseTMA tool contains Extensible Stylesheet Language Transformation (XSLT) scripts that convert XML data into Hyper-Text Mark-up Language (HTML) web pages with hyperlinks automatically added to allow rapid navigation. Results Block lists, virtual slide images, legends, clinical details and exports have been placed on the ACSR web site for 14 blocks with 1623 cores of 2.0, 1.0 and 0.6 mm sizes. Our virtual microscope can be used to view and annotate these TMA images. Researchers can readily navigate from TMA block lists to TMA legends and to clinical details for a selected tissue core
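
    As a small sketch of the XSLT step described above (applying a stylesheet to a TMA DES XML export to produce navigable HTML), here is a Python example using the third-party lxml package. The file names are placeholders and the stylesheet itself is not reproduced here.

```python
# Apply an XSLT stylesheet to a TMA DES XML export to produce an HTML page.
# Requires the third-party lxml package; file names are placeholders.
from lxml import etree

xml_doc = etree.parse("tma_des_export.xml")    # TMA data exchange file
xslt_doc = etree.parse("tma_to_html.xslt")     # stylesheet with hyperlink rules
transform = etree.XSLT(xslt_doc)

html_doc = transform(xml_doc)
with open("tma_block_list.html", "wb") as out:
    out.write(etree.tostring(html_doc, pretty_print=True, method="html"))
```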

  20. Analyzing Remote Server Locations for Personal Data Transfers in Mobile Apps

    Directory of Open Access Journals (Sweden)

    Eskandari Mojtaba

    2017-01-01

    Full Text Available The prevalence of mobile devices and their capability to access high speed internet has transformed them into a portable pocket cloud interface. Being home to a wide range of users' personal data, mobile devices often use cloud servers for storage and processing. The sensitivity of a user's personal data demands an adequate level of protection at the back-end servers. In this regard, the European Union Data Protection regulations (e.g., article 25.1) impose restrictions on the locations of European users' personal data transfer. The matter of concern, however, is the enforcement of such regulations. The first step in this regard is to analyze mobile apps and identify the locations of the servers to which personal data is transferred. To this end, we design and implement an app analysis tool, PDTLoc (Personal Data Transfer Location Analyzer), to detect violation of the mentioned regulations. We analyze the 1,498 most popular apps in the EEA using PDTLoc to investigate the data recipient server locations. We found that 16.5% (242) of these apps transfer users' personal data to servers located at places outside Europe without being under the control of a data protection framework. Moreover, we inspect the privacy policies of the apps, revealing that 51% of these apps do not provide any privacy policy while almost all of them contact servers hosted outside Europe.
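
    The first step such an analysis performs, resolving the contacted host names and geolocating the resulting IP addresses, can be sketched in Python as follows. The host names are examples, and the geolocation step assumes the third-party geoip2 package with a local GeoLite2 country database; this is an assumption for illustration, not the tool's actual implementation.

```python
# Resolve host names extracted from an app and look up the country of each
# server IP. Assumes the third-party geoip2 package and a local GeoLite2
# country database file; host names below are examples only.
import socket
import geoip2.database
import geoip2.errors

hosts = ["api.example-analytics.com", "cdn.example-backend.net"]

with geoip2.database.Reader("GeoLite2-Country.mmdb") as reader:
    for host in hosts:
        try:
            ip = socket.gethostbyname(host)
            country = reader.country(ip).country.iso_code
            print(f"{host} -> {ip} ({country})")
        except (socket.gaierror, geoip2.errors.AddressNotFoundError):
            print(f"{host} -> unresolved")
```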

  1. PDS: A Performance Database Server

    Directory of Open Access Journals (Sweden)

    Michael W. Berry

    1994-01-01

    Full Text Available The process of gathering, archiving, and distributing computer benchmark data is a cumbersome task usually performed by computer users and vendors with little coordination. Most important, there is no publicly available central depository of performance data for all ranges of machines from personal computers to supercomputers. We present an Internet-accessible performance database server (PDS) that can be used to extract current benchmark data and literature. As an extension to the X-Windows-based user interface (Xnetlib) to the Netlib archival system, PDS provides an on-line catalog of public domain computer benchmarks such as the LINPACK benchmark, Perfect benchmarks, and the NAS parallel benchmarks. PDS does not reformat or present the benchmark data in any way that conflicts with the original methodology of any particular benchmark; it is thereby devoid of any subjective interpretations of machine performance. We believe that all branches (research laboratories, academia, and industry) of the general computing community can use this facility to archive performance metrics and make them readily available to the public. PDS can provide a more manageable approach to the development and support of a large dynamic database of published performance metrics.

  2. A New Technology of Upload Files On Browse/Server%一种基于B/S模式的实现文件上传的技术

    Institute of Scientific and Technical Information of China (English)

    张建中; 张亚平; 苏智星; 袁小一

    2002-01-01

    1. Introduction to PHP. PHP (Professional Hypertext Preprocessor) is a server-side, HTML-embedded scripting language; at the time of writing the latest officially released version was 4.04. Server-side scripting technologies fall into embedded and non-embedded kinds; PHP is embedded, as is ASP. It is a very powerful Internet/Intranet-oriented programming language that can be used to develop dynamic, interactive web applications and can run on many operating system platforms and many web servers, making it a truly cross-platform, cross-server development language.

  3. LINCS Canvas Browser: interactive web app to query, browse and interrogate LINCS L1000 gene expression signatures.

    Science.gov (United States)

    Duan, Qiaonan; Flynn, Corey; Niepel, Mario; Hafner, Marc; Muhlich, Jeremy L; Fernandez, Nicolas F; Rouillard, Andrew D; Tan, Christopher M; Chen, Edward Y; Golub, Todd R; Sorger, Peter K; Subramanian, Aravind; Ma'ayan, Avi

    2014-07-01

    For the Library of Integrated Network-based Cellular Signatures (LINCS) project, many gene expression signatures have been produced using the L1000 technology. The L1000 technology is a cost-effective method to profile gene expression at large scale. LINCS Canvas Browser (LCB) is an interactive HTML5 web-based software application that facilitates querying, browsing and interrogating many of the currently available LINCS L1000 data. LCB implements two compacted layered canvases, one to visualize clustered L1000 expression data, and the other to display enrichment analysis results using 30 different gene set libraries. Clicking on an experimental condition highlights gene sets enriched for the differentially expressed genes from the selected experiment. A search interface allows users to input gene lists and query them against over 100,000 conditions to find the top matching experiments. The tool integrates many resources, offering unprecedented potential for new discoveries in systems biology and systems pharmacology. The LCB application is available at http://www.maayanlab.net/LINCS/LCB. Customized versions will be made part of the http://lincscloud.org and http://lincs.hms.harvard.edu websites.

  4. Analysis of Web Proxy Logs

    Science.gov (United States)

    Fei, Bennie; Eloff, Jan; Olivier, Martin; Venter, Hein

    Network forensics involves capturing, recording and analysing network audit trails. A crucial part of network forensics is to gather evidence at the server level, proxy level and from other sources. A web proxy relays URL requests from clients to a server. Analysing web proxy logs can give unobtrusive insights to the browsing behavior of computer users and provide an overview of the Internet usage in an organisation. More importantly, in terms of network forensics, it can aid in detecting anomalous browsing behavior. This paper demonstrates the use of a self-organising map (SOM), a powerful data mining technique, in network forensics. In particular, it focuses on how a SOM can be used to analyse data gathered at the web proxy level.
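
    As a toy illustration of training a self-organising map on numeric features derived from proxy-log sessions, the following Python sketch uses the third-party minisom package. The feature vectors are invented and this is not the paper's experimental setup.

```python
# Train a small self-organising map on per-session features derived from
# web proxy logs (e.g. request count, MB transferred, distinct domains).
# Uses the third-party minisom package; the feature vectors are invented.
import numpy as np
from minisom import MiniSom

sessions = np.array([
    [120, 5.2, 14],   # [requests, MB transferred, distinct domains]
    [ 15, 0.3,  4],
    [300, 42.0, 80],  # an unusually heavy session
    [ 22, 0.5,  6],
], dtype=float)

# Normalise each feature to [0, 1] so no single feature dominates.
sessions = (sessions - sessions.min(axis=0)) / (np.ptp(sessions, axis=0) + 1e-9)

som = MiniSom(4, 4, input_len=3, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(sessions, 500)

for row in sessions:
    print("mapped to node", som.winner(row))
```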

  5. JAtlasView: a Java atlas-viewer for browsing biomedical 3D images and atlases

    Directory of Open Access Journals (Sweden)

    Scott Mark

    2005-03-01

    Full Text Available Abstract Background Many three-dimensional (3D) images are routinely collected in biomedical research, and a number of digital atlases with associated anatomical and other information have been published. A number of tools are available for viewing these data, ranging from commercial visualization packages to freely available, typically system-architecture dependent, solutions. Here we discuss an atlas viewer implemented to run on any workstation using the architecture-neutral Java programming language. Results We report the development of a freely available Java-based viewer for 3D image data, describe the structure and functionality of the viewer, and explain how automated tools can be developed to manage the Java Native Interface code. The viewer allows arbitrary re-sectioning of the data and interactive browsing through the volume. With appropriately formatted data, for example as provided for the Electronic Atlas of the Developing Human Brain, a 3D surface view and anatomical browsing are available. The interface is developed in Java, with Java3D providing the 3D rendering. For efficiency the image data is manipulated using the Woolz image-processing library, provided as a dynamically linked module for each machine architecture. Conclusion We conclude that Java provides an appropriate environment for efficient development of these tools, and techniques exist to allow computationally efficient image-processing libraries to be integrated relatively easily.

  6. SciServer: An Online Collaborative Environment for Big Data in Research and Education

    Science.gov (United States)

    Raddick, Jordan; Souter, Barbara; Lemson, Gerard; Taghizadeh-Popp, Manuchehr

    2017-01-01

    For the past year, SciServer Compute (http://compute.sciserver.org) has offered access to big data resources running within server-side Docker containers. Compute has allowed thousands of researchers to bring advanced analysis to big datasets like the Sloan Digital Sky Survey and others, while keeping the analysis close to the data for better performance and easier read/write access. SciServer Compute is just one part of the SciServer system being developed at Johns Hopkins University, which provides an easy-to-use collaborative research environment for astronomy and many other sciences.SciServer enables these collaborative research strategies using Jupyter notebooks, in which users can write their own Python and R scripts and execute them on the same server as the data. We have written special-purpose libraries for querying, reading, and writing data. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files.SciServer Compute’s virtual research environment has grown with the addition of task management and access control functions, allowing collaborators to share both data and analysis scripts securely across the world. These features also open up new possibilities for education, allowing instructors to share datasets with students and students to write analysis scripts to share with their instructors. We are leveraging these features into a new system called “SciServer Courseware,” which will allow instructors to share assignments with their students, allowing students to engage with big data in new ways.SciServer has also expanded to include more datasets beyond the Sloan Digital Sky Survey. A part of that growth has been the addition of the SkyQuery component, which allows for simple, fast

  7. Scaling the effects of moose browsing on forage distribution, from the geometry of plant canopies to landscapes

    Science.gov (United States)

    De Jager, N. R.; Pastor, J.; Hodgson, A.L.

    2009-01-01

    Landscape heterogeneity influences large herbivores by altering their feeding rates, but as herbivores attempt to maximize feeding rates they also create spatial heterogeneity by altering plant growth. Herbivore feeding rates thus provide a quantitative link between the causes and consequences of spatial heterogeneity in herbivore-dominated ecosystems. The fractal geometry of plant canopies determines both the density and mass of twigs available to foraging herbivores. These properties determine a threshold distance between plants (d*) that distinguishes the mechanisms regulating herbivore intake rates. When d* is greater than the actual distance between plants (d), intake is regulated by the rate of food processing in the mouth; when d* is smaller than d, travel velocity between plants also constrains intake. We examined these canopy properties in relation to moose browsing from 2001 to 2005 at Isle Royale National Park, Michigan, USA. For aspen saplings, fractal dimension of bite density, bite mass, and forage biomass responded quadratically to increasing moose browsing and were greatest at ~3-4 g·m-2·yr-1 consumption. For balsam fir, in contrast, these same measures declined steadily with increasing moose browsing. The different responses of plant canopies to increased browsing altered d* around plants. In summer, d* > d for aspen saplings at all prior consumption levels. Food processing therefore regulated summer moose feeding rates across our landscapes. In winter, changes in bite mass due to past browsing were sufficient to cause d* < d for aspen and balsam fir. Therefore, travel velocity and food processing jointly regulated intake rate during winter. Browsing-induced changes in the small-scale geometry of plant canopies can determine intake rate at larger spatial scales by changing d* relative to d and, hence, which mechanisms determine intake rate, essentially altering how herbivores sense the distribution of their food resources. © 2009 by the Ecological Society of America.

  8. Managing Data Persistence in Network Enabled Servers

    Directory of Open Access Journals (Sweden)

    Eddy Caron

    2005-01-01

    Full Text Available The GridRPC model [17] is an emerging standard promoted by the Global Grid Forum (GGF) that defines how to perform remote client-server computations on a distributed architecture. In this model, data are sent back to the client at the end of every computation. This implies unnecessary communications when computed data are needed by another server in further computations. Since communication time is sometimes the dominant cost of remote computations, this cost has to be lowered. Several tools instantiate the GridRPC model, such as NetSolve, developed at the University of Tennessee, Knoxville, USA, and DIET, developed at the LIP laboratory, ENS Lyon, France. They are usually called Network Enabled Servers (NES). In this paper, we present a discussion of the data management solutions chosen for these two NES (NetSolve and DIET) as well as experimental results.

  9. Energy-efficient server management; Energieeffizientes Servermanagement

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, B.

    2003-07-01

    This final report for the Swiss Federal Office of Energy (SFOE) presents the results of a project that aimed to develop an automatic shut-down system for the servers used in typical electronic data processing installations to be found in small and medium-sized enterprises. The purpose of shutting down these computers - the saving of energy - is discussed. The development of a shutdown unit on the basis of a web-server that automatically shuts down the servers connected to it and then interrupts their power supply is described. The functions of the unit, including pre-set times for switching on and off, remote operation via the Internet and its interaction with clients connected to it are discussed. Examples of the system's user interface are presented.

  10. A Single-server Discrete-time Retrial G-queue with Server Breakdowns and Repairs

    Institute of Scientific and Technical Information of China (English)

    Jin-ting Wang; Peng Zhang

    2009-01-01

    This paper concerns a discrete-time Geo/Geo/1 retrial queue with both positive and negative customers, where the server is subject to breakdowns and repairs due to negative arrivals. The arrival of a negative customer causes one positive customer to be killed if any is present, and simultaneously breaks the server down. The server is sent to repair immediately and after repair it is as good as new. The negative customer also causes the server breakdown if the server is found idle, but has no effect on the system if the server is under repair. We analyze the Markov chain underlying the queueing system and obtain its ergodicity condition. The generating functions of the number of customers in the orbit and in the system are also obtained, along with the marginal distributions of the orbit size when the server is idle, busy or down. Finally, we present some numerical examples to illustrate the influence of the parameters on several performance characteristics of the system.

  11. 全球油气地质信息共享系统%The Global Oil and Gas Geology Database Information Sharing System Based on ArcGIS Server

    Institute of Scientific and Technical Information of China (English)

    苏国辉; 申延平; 孙记红; 何书锋; 魏合龙

    2012-01-01

    To fully present the data produced by comprehensive global oil and gas geology research and to provide a window for external database information services, a global oil and gas geology information sharing system with a B/S structure was built on ArcGIS Server technology, the Visual Studio .NET development platform and the Oracle database management system. The system is a WebGIS application platform providing information release, map browsing, graphic query, spatial analysis, thematic mapping and other functions. Using graphics, text and charts, it presents the latest results on the basic geography, basic geology, petroleum geology, resource potential and investment environment of the study areas, gives the public and researchers an efficient way to access information, provides reliable information support for national oil and gas resource management and energy decision-making, and achieves the goal of society-wide sharing of important information. The article briefly introduces ArcGIS Server and focuses on the system framework, data classification and functional modules, with the aim of discussing the design and implementation of a network information sharing system based on ArcGIS Server. It is an important research subject to build a hydrocarbon resources information database and provide decision support services through information technology. The Strategic Research Center of Oil and Gas Resources organizes a global hydrocarbon geology research project, collects the latest global hydrocarbon resources information, and has established a global hydrocarbon geology database of wide scope, comprehensive content and strong timeliness. In order to provide database information services, the Global Hydrocarbon Geology Database Information Sharing System was established based on ArcGIS Server, the Visual Studio .NET development platform and the Oracle database management system. It fills a domestic gap in hydrocarbon geology information management and sharing systems and provides a way to achieve sharing and interoperability. The information sharing system is a WebGIS application platform providing information release, map browsing, graphic search, spatial analysis, thematic mapping and other functions. Through graphics, text and charts, it presents the latest research results on basic geography
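
    ArcGIS Server exposes map services through a REST endpoint; as a minimal Python sketch of querying one layer of such a service, consider the example below. The service URL, layer index and field names are hypothetical, not the project's actual service.

```python
# Query one layer of an ArcGIS Server map service over its REST API.
# The service URL, layer index and output fields are hypothetical examples.
import requests

LAYER_URL = ("https://example.org/arcgis/rest/services/"
             "GlobalOilGas/Basins/MapServer/0/query")

params = {
    "where": "1=1",             # return every feature
    "outFields": "NAME,AREA",   # hypothetical attribute fields
    "returnGeometry": "false",
    "f": "json",                # ask for a JSON response
}

response = requests.get(LAYER_URL, params=params, timeout=30)
response.raise_for_status()
for feature in response.json().get("features", []):
    print(feature["attributes"])
```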

  12. Windows Server® 2008 Inside Out

    CERN Document Server

    Stanek, William R

    2009-01-01

    Learn how to conquer Windows Server 2008-from the inside out! Designed for system administrators, this definitive resource features hundreds of timesaving solutions, expert insights, troubleshooting tips, and workarounds for administering Windows Server 2008-all in concise, fast-answer format. You will learn how to perform upgrades and migrations, automate deployments, implement security features, manage software updates and patches, administer users and accounts, manage Active Directory® directory services, and more. With INSIDE OUT, you'll discover the best and fastest ways to perform core a

  13. Weather station with a web server

    OpenAIRE

    Repinc, Matej

    2013-01-01

    In this diploma thesis we present the process of building a low-cost weather station using the Arduino prototyping platform, and describe its functionality. The weather station monitors the current temperature, air humidity and air pressure. The station has its own simple HTTP server that relays current data in two formats: JSON-encoded data and a simple HTML web page. The weather station can also send data to a pre-defined server used for data collection. We implemented a web site where data an...
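
    As a small client-side sketch of reading such a station's JSON output in Python, see below. The station address, query parameter and field names are assumptions; the firmware's actual JSON layout is not given in the abstract.

```python
# Poll a weather station's JSON endpoint and print the readings.
# The station address and JSON field names are assumptions for illustration.
import requests

STATION_URL = "http://192.168.1.50/"   # address of the station on the LAN

data = requests.get(STATION_URL, params={"format": "json"}, timeout=5).json()
print("temperature:", data.get("temperature_c"), "degC")
print("humidity:   ", data.get("humidity_pct"), "%")
print("pressure:   ", data.get("pressure_hpa"), "hPa")
```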

  14. Instant Hyper-v Server Virtualization starter

    CERN Document Server

    Eguibar, Vicente Rodriguez

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. The approach is a tutorial manner that guides users in an orderly way toward virtualization. This book is conceived for system administrators and advanced PC enthusiasts who want to venture into the virtualization world. Although this book starts from scratch, knowledge of server operating systems, LANs and networking has to be in place. Having a good background in server administration is desirable, including networking service

  15. The Giga View Multiprocessor Multidisk Image Server

    Directory of Open Access Journals (Sweden)

    B. A. Gennart

    1996-01-01

    Full Text Available Professionals in various fields such as medical imaging, biology, and civil engineering require rapid access to huge amounts of pixmap image data. Multimedia interfaces further increase the need for large image databases. To fulfill these requirements, the GigaView parallel image server architecture relies on arrays of intelligent disk nodes, each disk node being composed of one processor and one disk. This contribution reviews the design of the GigaView hardware and file system, compares it to other storage servers available on the market, and evaluates fields of applications for the architecture.

  16. Professional Microsoft SQL Server 2012 Integration Services

    CERN Document Server

    Knight, Brian; Moss, Jessica M; Davis, Mike; Rock, Chris

    2012-01-01

    An in-depth look at the radical changes in the newest release of SSIS. Microsoft SQL Server 2012 Integration Services (SSIS) builds on the revolutionary database product suite first introduced in 2005. With this crucial resource, you will explore how this newest release serves as a powerful tool for performing extraction, transformation, and load (ETL) operations. A team of SQL Server experts deciphers this complex topic and provides detailed coverage of the new features of the 2012 product release. In addition to technical updates and additions, the authors present you with a new set of SSIS b

  17. WebLogic Server 9.0

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    On August 9, BEA WebLogic Server 9.0 was fully launched. Wai Wong, BEA's Executive Vice President of Products, said: "WebLogic Server 9.0 will allow users to move toward SOA while continuing to achieve the core goals of improved efficiency, lower IT costs and zero downtime in complex, heterogeneous environments."

  18. Analysis of User Behavior and Push Server Based on Data Mining Technology%基于数据挖掘下的用户分析和推送

    Institute of Scientific and Technical Information of China (English)

    吴玮怡

    2015-01-01

    With the rapid development of the information age, every industry generates massive amounts of data every day, including all kinds of information about consumer behaviour. If this seemingly chaotic data can be analysed effectively and the needs and latent demands behind users can be predicted, operators can be helped to make better market decisions, turning existing users into loyal users while attracting more users. This paper carries out a concrete analysis of the user data of one enterprise, using data mining methods to analyse internal user information, understand user behaviour and predict users' underlying needs. In other words, if we can make an effective analysis of such huge data and predict the customers' hidden demands, it can efficiently help enterprises to make sensible decisions, converting existing users into loyal users and attracting even more. We take the user data of one enterprise as the worked example throughout.

  19. Instant Team Foundation Server 2012 and Project Server 2010 integration how-to

    CERN Document Server

    Gauvin, Gary P

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. Get the job done and learn as you go. A how-to book with practical recipes accompanied by rich screenshots for easy comprehension. Written in a very practical how-to style, the book takes the reader through the process of gaining a basic understanding of TFS and Project Server with practical tutorials and recipes. This book is for users who want to integrate TFS 2012 and Project Server 2010. Readers are expected to know some basic Windows Server commands and account management, a

  20. Concurrent web server designing based on prethreading technology under Linux system%基于Linux的预线程化并发Web服务器设计

    Institute of Scientific and Technical Information of China (English)

    兰红; 柳显涛; 李文琼

    2012-01-01

    Browsing web pages is the most common service on the Internet, and a web server is the software that lets users get or post web pages. The web server adopts the B/S (browser/server) model and follows the standard HTTP protocol. Under the Linux operating system, web servers designed purely on a per-process or per-thread mechanism have problems such as high CPU cost and low efficiency. This paper introduces a new web server that uses pre-threading technology, integrates a semaphore mechanism to schedule shared resources, and builds the server around the producer-consumer model. Besides providing the same basic functions as common web servers, the new server handles requests concurrently, giving better quality of service. The pre-threaded concurrent web server uses the semaphore mechanism and the producer-consumer model to share and schedule resources, and uses the threads' logical buffered streams to save address space, thereby reducing system overhead and improving the service efficiency of the web server and the utilization of system resources.
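
    A minimal Python sketch of the pre-threading plus producer-consumer pattern described above: a fixed pool of worker threads is created before any connection arrives and is fed accepted sockets through a bounded queue. This is a simplified illustration, not the authors' implementation.

```python
# Pre-threaded HTTP-style responder: the main thread (producer) accepts
# connections and puts them on a bounded queue; a fixed pool of worker
# threads (consumers), created before any request arrives, serves them.
import socket
import threading
import queue

POOL_SIZE, BACKLOG = 4, 16
connections: "queue.Queue[socket.socket]" = queue.Queue(maxsize=BACKLOG)

def worker() -> None:
    while True:
        conn = connections.get()          # blocks until a connection is queued
        with conn:
            conn.recv(4096)               # read (and ignore) the request
            conn.sendall(b"HTTP/1.0 200 OK\r\nContent-Length: 2\r\n\r\nok")
        connections.task_done()

# Pre-create the worker threads before accepting any connection.
for _ in range(POOL_SIZE):
    threading.Thread(target=worker, daemon=True).start()

with socket.create_server(("0.0.0.0", 8080)) as server:
    while True:
        client, _addr = server.accept()   # producer: hand sockets to workers
        connections.put(client)
```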

  1. Roe deer browsing effects on growth development of Turkey oak and chestnut coppices.

    Directory of Open Access Journals (Sweden)

    Andrea Cutini

    2010-12-01

    Full Text Available Over the last three decades wild ungulate populations in Italy increased to values ranging from 300% to 600%. As a consequence, in Italy as well as in other European countries, situations with high ungulate density, and hence negative effects on the stability and dynamics of ecosystems, are increasingly frequent. Starting from this evidence, we investigated the effects of a roe deer population on the vegetative regeneration of two different broadleaved tree species: Turkey oak (Quercus cerris L.) and chestnut (Castanea sativa Mill.) coppice stands. In Alpe di Catenaia (Apennines, Central Italy), after coppicing in 2002, we chose six experimental areas where fenced (P) and non-fenced (NP) plots were established. Measurements were performed at the beginning of the study period and in winter 2008 in both P and NP plots. Diameter and height of all sprouts were measured. Results showed a different impact of roe deer on the two species. After seven years chestnut did not show any significant browsing-related damage, while in Turkey oak there were marked differences between protected and non-protected areas: in NP plots roe deer browsing produced a significant reduction in basal area (58%) and volume (57%) compared to P plots. The results agree with previous studies and confirm (a) a selective browsing pressure on Turkey oak and (b) the lasting effect of the early impact after clear cutting, visible even seven years later. Based on the findings, we discuss the need for an integrated management of forest vegetation and forest fauna which should define the density of ungulates not only according to the theoretical carrying capacity of ecosystems, but also considering (i) the preservation of overall ecosystem functionality, (ii) forest structure development and (iii) the forest management type.

  2. Web server's reliability improvements using recurrent neural networks

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Rǎzvan-Daniel; Felea, Ioan

    2012-01-01

    In this paper we describe an interesting approach to error prediction illustrated by experimental results. The application consists of monitoring the activity for the web servers in order to collect the specific data. Predicting an error with severe consequences for the performance of a server (the...... usage, network usage and memory usage. We collect different data sets from monitoring the web server's activity and for each one we predict the server's reliability with the proposed recurrent neural network. © 2012 Taylor & Francis Group...

  3. Climate Data Service in the FP7 EarthServer Project

    Science.gov (United States)

    Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Grazia Veratelli, Maria

    2013-04-01

    EarthServer is a European Framework Programme project that aims at developing and demonstrating the usability of open standards (OGC and W3C) in the management of multi-source, any-size, multi-dimensional spatio-temporal data - in short: "Big Earth Data Analytics". In order to demonstrate the feasibility of the approach, six thematic Lighthouse Applications (Cryospheric Science, Airborne Science, Atmospheric/Climate Science, Geology, Oceanography, and Planetary Science), each with 100+ TB, are implemented. The scope of the Atmospheric/Climate lighthouse application (Climate Data Service) is to implement a system containing global to regional 2D/3D/4D datasets retrieved from satellite observations, numerical modelling and in-situ observations. Data contained in the Climate Data Service comprise atmospheric profiles of temperature/humidity, aerosol content, AOT, and cloud properties provided by entities such as the European Centre for Medium-Range Weather Forecasts (ECMWF), the Austrian meteorological service (Zentralanstalt für Meteorologie und Geodynamik - ZAMG), the Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA), and Sweden's Meteorological and Hydrological Institute (Sveriges Meteorologiska och Hydrologiska Institut - SMHI). Through an easy-to-use web application, the system lets users browse the loaded data, visualize their temporal evolution at a specific point with 2D graphs of a single field, compare different fields at the same point (e.g. temperatures from different models and satellite observations), and visualize maps of specific fields superimposed on high resolution background maps. All data access and display operations are performed by means of the OGC standard operations, namely WMS, WCS and WCPS. The EarthServer project has just started the second year of its 3-year development plan: at present the system contains subsets of the final database, with the scope of
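
    The data access operations mentioned above follow OGC key-value-pair conventions; as a sketch, here is a WCS 2.0 GetCoverage request built with Python. The endpoint and coverage identifier are placeholders, not the project's actual service.

```python
# Build an OGC WCS 2.0 GetCoverage request with spatial subsetting.
# Endpoint and coverage identifier are placeholders, not the real service.
import requests

ENDPOINT = "https://example.org/rasdaman/ows"   # hypothetical WCS endpoint

params = [
    ("service", "WCS"),
    ("version", "2.0.1"),
    ("request", "GetCoverage"),
    ("coverageId", "temperature_profile"),      # hypothetical coverage name
    ("subset", "Lat(40,45)"),                   # repeated 'subset' parameters
    ("subset", "Long(10,15)"),
    ("format", "application/netcdf"),
]

response = requests.get(ENDPOINT, params=params, timeout=60)
response.raise_for_status()
with open("subset.nc", "wb") as out:
    out.write(response.content)
```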

  4. Construction of a nuclear data server using TCP/IP

    Energy Technology Data Exchange (ETDEWEB)

    Kawano, Toshihiko; Sakai, Osamu [Kyushu Univ., Fukuoka (Japan)

    1997-03-01

    We construct a nuclear data server which provides data from the evaluated nuclear data library over the network by means of TCP/IP. The client is not necessarily a user but may be a computer program. Two examples with a prototype server program are demonstrated: the first is data transfer from the server to a user, and the second to a computer program. (author)

  5. Browsing through 3D representations of unstructured picture collections: an empirical study

    CERN Document Server

    Christmann, Olivier

    2007-01-01

    The paper presents a 3D interactive representation of fairly large picture collections which facilitates browsing through unstructured sets of icons or pictures. Implementation of this representation implies choosing between two visualization strategies: users may either manipulate the view (OV) or be immersed in it (IV). The paper first presents this representation, then describes an empirical study (17 participants) aimed at assessing the utility and usability of each view. Subjective judgements in questionnaires and debriefings were varied: 7 participants preferred the IV view, 4 the OV one, and 6 could not choose between the two. Visual acuity and visual exploration strategies seem to have exerted a greater influence on participants' preferences than task performance or feeling of immersion.

  6. A Web-Based System for Remote Data Browsing in HT-7 Tokamak

    Institute of Scientific and Technical Information of China (English)

    Cheng Ting; Luo Jiarong; Meng Yuedong; Wang Huazhong

    2005-01-01

    HT-7 is the first superconducting tokamak device for fusion research in China. Many experiments have been performed on the HT-7 tokamak since 1994 with numerous satisfactory results achieved in the fusion research field. As more and better communication is required with other fusion research laboratories, remote access to experimental data is becoming increasingly important in order to raise the degree of openness of experiments and to expand research results.The web-based remote data browsing system enables authorized users in geographically different locations to view and search for experimental data without having to install any utility software at their terminals. The three-tier software architecture and thin client technology are used to operate the system effectively. This paper describes the structure of the system and the realization of its functions, focusing on three main points: the communication between the participating tiers, the data structure of the system and the visualization of the raw data on web pages.

  7. The emotional responses of browsing Facebook: Happiness, envy, and the role of tie strength.

    Science.gov (United States)

    Lin, Ruoyun; Utz, Sonja

    2015-11-01

    On Facebook, users are exposed to posts from both strong and weak ties. Even though several studies have examined the emotional consequences of using Facebook, less attention has been paid to the role of tie strength. This paper aims to explore the emotional outcomes of reading a post on Facebook and examine the role of tie strength in predicting happiness and envy. Two studies - one correlational, based on a sample of 207 American participants and the other experimental, based on a sample of 194 German participants - were conducted in 2014. In Study 2, envy was further distinguished into benign and malicious envy. Based on a multi-method approach, the results showed that positive emotions are more prevalent than negative emotions while browsing Facebook. Moreover, tie strength is positively associated with the feeling of happiness and benign envy, whereas malicious envy is independent of tie strength after reading a (positive) post on Facebook.

  8. Early and long-term impacts of browsing by roe deer in oak coppiced woods along a gradient of population density

    Directory of Open Access Journals (Sweden)

    Francesco Chianucci

    2015-01-01

    Full Text Available Over the last few decades, wild ungulate populations have exhibited considerable geographic and demographic expansion in most European countries; roe deer is amongst the most widespread ungulate species. Increasing roe deer densities have led to a strong impact on forest regeneration; the problem has recently been recognized in coppice woods, a silvicultural system which is widespread in Italy, where it accounts for about 56% of the total national forested area. In this study we investigated the effect of roe deer browsing on the vegetative regeneration of Turkey oak a few years after coppicing, along a gradient of roe deer density. A browsing index revealed that browsing impact was high at any given roe deer density but increased at higher density, with the browsing rate ranging from 65% to 79%. We also analyzed the long-term impact of browsing six and eleven years after coppicing under a medium roe deer density. Results indicated that the early impacts are not ephemeral but produce prolonged effects through time, with an average reduction in volume of -57% and -41% six and eleven years after coppicing, respectively. Based on these results we propose integrating browsing monitoring with roe deer density estimation, to allow the identification of ungulate densities which are compatible with silvicultural and forest management objectives. The proposed browsing index can be regarded as an effective management tool on account of its simplicity and cost-effectiveness, and is therefore highly suitable for routine, large-scale monitoring of browsing impact.

  9. Contrasting responses of web-building spiders to deer browsing among habitats and feeding guilds.

    Science.gov (United States)

    Takada, Mayura; Baba, Yuki G; Yanagi, Yosuke; Terada, Saeko; Miyashita, Tadashi

    2008-08-01

    We examined web-building spider species richness and abundance in forests across a deer density gradient to determine the effects of sika deer browsing on spiders among habitats and feeding guilds. Deer decreased the abundance of web-building spiders in understory vegetation but increased their abundance in the litter layer. Deer seemed to affect web-building spiders in the understory vegetation by reducing the number of sites for webs because vegetation complexity was positively correlated with spider density and negatively correlated with deer density. In contrast, the presence of vegetation just above the litter layer decreased the spider density, and deer exerted a negative effect on this vegetation, possibly resulting in an indirect positive effect on spider density. The vegetation just above the litter layer may be unsuitable as a scaffold for building webs if it is too flexible to serve as a reliable web support, and may even hinder spiders from building webs on litter. Alternatively, the negative effect of this vegetation on spiders in the litter may be as a result of reduced local prey availability under the leaves because of the reduced accessibility of aerial insects. The response to deer browsing on web-building spiders that inhabit the understory vegetation varied with feeding guild. Deer tended to affect web-invading spiders, which inhabit the webs of other spiders and steal prey, more heavily than other web-building spiders, probably because of the accumulated effects of habitat fragmentation through the trophic levels. Thus, the treatment of a particular higher-order taxon as a homogeneous group could result in misleading conclusions about the effects of mammalian herbivores.

  10. Feeding characteristics reveal functional distinctions among browsing herbivorous fishes on coral reefs

    Science.gov (United States)

    Streit, Robert P.; Hoey, Andrew S.; Bellwood, David R.

    2015-12-01

    The removal of macroalgal biomass by fishes is a key process on coral reefs. Numerous studies have identified the fish species responsible for removing mature macroalgae, and have identified how this varies spatially, temporally, and among different algal types. None, however, have considered the behavioural and morphological traits of the browsing fishes and how this may influence the removal of macroalgal material. Using video observations of fish feeding on the brown macroalga Sargassum polycystum, we quantified the feeding behaviour and morphology of the four dominant browsing species on the Great Barrier Reef ( Kyphosus vaigiensis, Naso unicornis, Siganus canaliculatus, and Siganus doliatus). The greatest distinction between species was the algal material they targeted. K. vaigiensis and N. unicornis bit on the entire macroalgal thallus in approximately 90 % of bites. In contrast, Si. canaliculatus and Si. doliatus avoided biting the stalks, with 80-98 % of bites being on the macroalgal leaves only. This distinctive grouping into `entire thallus-biters' versus `leaf-biters' was not supported by size-standardized measures of biting morphology. Rather, species-specific adult body sizes, tooth shape, and feeding behaviour appear to underpin this functional distinction, with adults of the two larger fish species ( N. unicornis and K. vaigiensis) eating the entire macroalgal thallus, while the two smaller species ( Si. canaliculatus and Si. doliatus) bite only leaves. These findings caution against assumed homogeneity within this, and potentially other, functional groups on coral reefs. As functional redundancy within the macroalgal browsers is limited, the smaller `leaf-biting' species are unlikely to be able to compensate functionally for the loss of larger `entire thallus-biting' species.

  11. Self-medication with tannin-rich browse in goats infected with gastro-intestinal nematodes.

    Science.gov (United States)

    Amit, M; Cohen, I; Marcovics, A; Muklada, H; Glasser, T A; Ungar, E D; Landau, S Y

    2013-12-06

    Primates self-medicate to alleviate symptoms caused by gastro-intestinal nematodes (GIN) by consuming plants that contain secondary compounds. Would goats display the same dietary acumen? Circumstantial evidence suggests they could: goats in Mediterranean rangelands containing a shrub - Pistacia lentiscus - with known anthelmintic properties consume significant amounts of the shrub, particularly in the fall when the probability of being infected with GIN is greatest, even though its tannins impair protein metabolism and deter herbivory. In order to test rigorously the self-medication hypothesis in goats, we conducted a controlled study using 21 GIN-infected and 23 non-infected goats exposed to browse foliage from P. lentiscus, another browse species - Phillyrea latifolia - or hay during the build-up of infection. GIN-infected goats showed clear symptoms of infection, which were alleviated by P. lentiscus foliage; however, ingesting P. lentiscus had a detrimental effect on protein metabolism in the absence of disease. When given a choice between P. lentiscus and hay, infected goats of the Mamber breed showed a higher preference for P. lentiscus than their non-infected counterparts, in particular if they had been exposed to Phillyrea latifolia before. This was not found in Damascus goats. Damascus goats, which exhibit a higher propensity to consume P. lentiscus, may use it as a drug prophylactically, whereas Mamber goats, which are more reluctant to ingest it, select P. lentiscus foliage therapeutically. These results hint at subtle trade-offs between the roles of P. lentiscus as a food, a toxin and a medicine. This is the first evidence of self-medication in goats under controlled conditions. Endorsing the concept of self-medication could greatly modify the current paradigm of veterinary parasitology whereby man decides when and how to treat GIN-infected animals, and result in transferring this decision to the animals themselves.

  12. ADF/ADC Web Tools for Browsing and Visualizing Astronomical Catalogs and NASA Astrophysics Mission Metadata

    Science.gov (United States)

    Shaya, E.; Kargatis, V.; Blackwell, J.; Borne, K.; White, R. A.; Cheung, C.

    1998-05-01

    Several new web based services have been introduced this year by the Astrophysics Data Facility (ADF) at the NASA Goddard Space Flight Center. IMPReSS is a graphical interface to astrophysics databases that presents the user with the footprints of observations of space-based missions. It also aids astronomers in retrieving these data by sending requests to distributed data archives. The VIEWER is a reader of ADC astronomical catalogs and journal tables that allows subsetting of catalogs by column choices and range selection and provides database-like search capability within each table. With it, the user can easily find the table data most appropriate for their purposes and then download either the subset table or the original table. CATSEYE is a tool that plots output tables from the VIEWER (and soon AMASE), making exploring the datasets fast and easy. Having completed the basic functionality of these systems, we are enhancing the site to provide advanced functionality. These will include: market basket storage of tables and records of VIEWER output for IMPReSS and AstroBrowse queries, non-HTML table responses to AstroBrowse type queries, general column arithmetic, modularity to allow entrance into the sequence of web pages at any point, histogram plots, navigable maps, and overplotting of catalog objects on mission footprint maps. When completed, the ADF/ADC web facilities will provide astronomical tabled data and mission retrieval information in several hyperlinked environments geared for users at any level, from the school student to the typical astronomer to the expert datamining tools at state-of-the-art data centers.

  13. Freiburg RNA Tools: a web server integrating IntaRNA, ExpaRNA and LocARNA

    OpenAIRE

    Smith, Cameron; Heyne, Steffen; Richter, Andreas S.; Will, Sebastian; Backofen, Rolf

    2010-01-01

    The Freiburg RNA tools web server integrates three tools for the advanced analysis of RNA in a common web-based user interface. The tools IntaRNA, ExpaRNA and LocARNA support the prediction of RNA–RNA interaction, exact RNA matching and alignment of RNA, respectively. The Freiburg RNA tools web server and the software packages of the stand-alone tools are freely accessible at http://rna.informatik.uni-freiburg.de.

  14. Creating a Data Warehouse using SQL Server

    DEFF Research Database (Denmark)

    Sørensen, Jens Otto; Alnor, Karl

    1999-01-01

    In this paper we construct a Star Join Schema and show how this schema can be created using the basic tools delivered with SQL Server 7.0. Major objectives are to keep the operational database unchanged so that data loading can be done without disturbing the business logic of the operational dat...

  15. Mastering SQL Server 2014 data mining

    CERN Document Server

    Bassan, Amarpreet Singh

    2014-01-01

    If you are a developer who is working on data mining for large companies and would like to enhance your knowledge of SQL Server Data Mining Suite, this book is for you. Whether you are brand new to data mining or are a seasoned expert, you will be able to master the skills needed to build a data mining solution.

  16. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  17. Solarwinds Server & Application Monitor deployment and administration

    CERN Document Server

    Brant, Justin

    2013-01-01

    A concise and practical guide to using SolarWinds Server & Application Monitor. If you are an IT professional, ranging from an entry-level technician to a more advanced network or system administrator, who is new to network monitoring services and/or SolarWinds SAM, this book is ideal for you.

  18. A First Look at Windows Server 2012

    Institute of Scientific and Technical Information of China (English)

    毕晓日; 王洪波

    2013-01-01

    Although Windows Server 2012 has already been released, what conveniences will it actually bring to our work, and what are its highlights? Starting from the applications we use most often, this article lets you experience its new features.

  19. Microsoft Exchange Server PowerShell cookbook

    CERN Document Server

    Andersson, Jonas

    2015-01-01

    This book is for messaging professionals who want to build real-world scripts with Windows PowerShell 5 and the Exchange Management Shell. If you are a network or systems administrator responsible for managing and maintaining Exchange Server 2013, you will find this highly useful.

  20. Virtual Server Self-Service Provisioning @ CERN

    CERN Document Server

    Sucik, J

    2009-01-01

    The presentation for the Microsoft Windows Server Roundtable event explains why the CERN Virtual Infrastructure (CVI) solution is based on the Microsoft Hyper-V and System Center Virtual Machine Manager (SCVMM) products. Experiences, challenges, results and successes concerning the CVI service are also presented. The presentation also includes a brief overview of CERN and its IT infrastructure.

  1. Backing up Exchange Server 2007 on Windows Server 2008

    Institute of Scientific and Technical Information of China (English)

    Paul Robichaux; 新宇(译者)

    2009-01-01

    I really like Windows Server 2008. It makes major improvements over Windows Server 2003 in every respect. In fact, it is my favorite operating system; the ThinkPad I usually travel with runs Windows Server 2008, and it runs very well.

  2. Implementation of Buffer Analysis Based on ArcGIS Server and the J2EE Architecture

    Institute of Scientific and Technical Information of China (English)

    阮明; 罗年学

    2007-01-01

    This paper analyzes the architectures of ArcGIS Server and J2EE, as well as the limitations of ArcGIS Server's own buffer-analysis template, and discusses techniques and methods for implementing a buffer analysis system based on a three-tier B/S (browser/server) structure using JSF, Java and the ArcGIS Server ADF framework.

  3. Optimal Control of the D-Policy M/G/1 Queueing System with Server Breakdowns

    Directory of Open Access Journals (Sweden)

    Kuo-Hsiung Wang

    2008-01-01

    Full Text Available This study deals with a single server in the D-policy M/G/1 queueing system in which the server is turned off at the end of each complete period and is activated again only when the cumulative completion times of the customers in the system exceed a given level D. While the server is working, he is subject to breakdowns according to a Poisson process. When the server breaks down, he requires repair at a repair facility, where the repair time obeys a general distribution. We have demonstrated that the probability that the server is busy in the steady-state is equal to the traffic intensity. The total expected cost function per customer per unit time is constructed to determine the optimal operating D-policy at a minimum cost. We use the steady-state analytic results and apply an efficient Matlab computer program to calculate the optimal value of D. Based on three different service distributions: exponential, 3-stage Erlang and deterministic, we provide extensive numerical computation for illustration purposes. A sensitivity analysis is also conducted.
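
    The steady-state identity quoted above (the long-run busy probability equals the traffic intensity rho = lambda*E[S]) is easy to check numerically. The sketch below is not the paper's Matlab program; it is a minimal, assumption-laden discrete-event simulation of a D-policy M/G/1 queue with exponential service and no breakdowns, so the busy fraction it reports should approach rho for any threshold D.

```python
import random

def simulate_d_policy_mg1(lam, mean_service, D, horizon, seed=1):
    """Toy discrete-event simulation of an M/G/1 queue under a D-policy
    (exponential service, no breakdowns).  The server is switched off when
    the system empties and switched on again only once the summed service
    requirements of the waiting customers exceed D."""
    rng = random.Random(seed)
    t, busy_time, area = 0.0, 0.0, 0.0
    next_arrival = rng.expovariate(lam)
    service_end = float("inf")
    waiting = []                    # service requirements not yet started
    in_service = False
    server_on = False

    while t < horizon:
        t_next = min(next_arrival, service_end)
        area += (len(waiting) + in_service) * (t_next - t)
        busy_time += (t_next - t) if in_service else 0.0
        t = t_next
        if t == next_arrival:       # arrival event
            waiting.append(rng.expovariate(1.0 / mean_service))
            next_arrival = t + rng.expovariate(lam)
            if not server_on and sum(waiting) > D:
                server_on = True    # D-policy activation
        else:                       # service completion event
            in_service = False
            service_end = float("inf")
            if not waiting:
                server_on = False   # system empty: switch the server off
        if server_on and waiting and not in_service:
            in_service = True
            service_end = t + waiting.pop(0)
    return busy_time / t, area / t  # (busy fraction, mean number in system)

busy, avg_n = simulate_d_policy_mg1(lam=0.5, mean_service=1.0, D=3.0, horizon=2e5)
# busy is expected to be close to rho = 0.5 regardless of D; avg_n grows with D.
```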

  4. ANALYSIS OF MULTI-SERVER SINGLE QUEUE

    Directory of Open Access Journals (Sweden)

    Emmanuel John Ekpenyong

    2011-04-01

    Full Text Available Queuing properties such as the expected total service time and its variance, together with performance measures such as the expected number of phases in the system, the expected number of phases in the queue, the expected number of customers in the queue, the expected waiting time in the queue and in the system, and the number of customers in the system, have been derived for an M/Ek/s : (∞/FCFS) queuing model with k identical service stages in series, each with the same mean service time. Numerical illustrations are also used to illustrate the results.
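
    For the single-server special case (s = 1) the measures listed above reduce to the textbook Pollaczek-Khinchine formulas for Erlang-k service. The snippet below shows only that special case, not the multi-server derivation of the paper.

```python
def m_ek_1_measures(lam, mu, k):
    """Standard M/E_k/1 results (single-server special case).
    lam: arrival rate, mu: service rate (mean service time 1/mu),
    k: number of identical exponential stages in series."""
    rho = lam / mu                                  # traffic intensity, needs rho < 1
    Lq = (1 + k) / (2 * k) * rho**2 / (1 - rho)     # expected number waiting
    Wq = Lq / lam                                   # expected wait in queue (Little's law)
    W = Wq + 1 / mu                                 # expected time in system
    L = lam * W                                     # expected number in system
    return {"Lq": Lq, "Wq": Wq, "W": W, "L": L}

# Example: lam = 4/h, mu = 5/h, k = 3 stages -> rho = 0.8, Lq ~ 2.13 customers.
```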

  5. ANALYSIS OF MULTI-SERVER SINGLE QUEUE

    OpenAIRE

    2011-01-01

    Queuing properties such as the expected total service time and its variance, together with performance measures such as the expected number of phases in the system, the expected number of phases in the queue, the expected number of customers in the queue, the expected waiting time in the queue and in the system, and the number of customers in the system, have been derived for an M/Ek/s : (∞/FCFS) queuing model with k identical service stages in series, each with the same mean service time. Also, numerical illustrati...

  6. Access to Library Collections: Summary of a Documentary and Opinion Survey on the Direct Shelf Approach and Browsing

    Science.gov (United States)

    Hyman, Richard J.

    1971-01-01

    The validity of the direct shelf approach as a concept for organizing library materials, with special reference to its component, "browsing," is investigated by this survey. Findings implied policy recommendations for library management and library school curricula. (33 references) (Author/NH)

  7. Web-Browsing Competencies of Pre-Service Adult Facilitators: Implications for Curriculum Transformation and Distance Learning

    Science.gov (United States)

    Theresa, Ofoegbu; Ugwu, Agboeze Matthias; Ihebuzoaju, Anyanwu Joy; Uche, Asogwa

    2013-01-01

    The study investigated the Web-browsing competencies of pre-service adult facilitators in the southeast geopolitical zone of Nigeria. Survey design was adopted for the study. The population consists of all pre-service adult facilitators in all the federal universities in the southeast geopolitical zone of Nigeria. Accidental sampling technique was…

  8. Design and Implementation VOIP Service on Open IMS and Asterisk Servers Interconnected through Enum Server

    CERN Document Server

    Munadi, Rendy; Mulyana, Asep; M, R Rumani; 10.5121/ijngn.2010.2201

    2010-01-01

    Asterisk and Open IMS both use the SIP signalling protocol, which makes it possible to connect them. To facilitate this interconnection, an Enum server, which translates numbering addresses such as PSTN numbers (E.164) into URI addresses (Uniform Resource Identifiers), can be used. In this research, we interconnect the Open IMS and Asterisk servers through an Enum server. We then analyze server performance and the PDD (Post Dial Delay) values produced by the system. The experiment showed that, for a call from an Open IMS user to an analog Asterisk telephone (FXS) with an arrival rate of 30 calls/s at each server, the maximum PDD value is 493.656 ms. Open IMS can serve at most 30 calls/s on a 1.55 GHz processor, while Asterisk, on a 3.0 GHz processor, can serve up to 55 calls/s. The Enum server, on a 1.15 GHz processor, can serve a maximum of 8156 queries/s.
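
    The ENUM step that glues the two servers together is a standardized address translation (RFC 6116): the E.164 digits are reversed, dot-separated and suffixed with e164.arpa, and the resulting domain is queried for NAPTR records pointing at a URI. A minimal sketch of that mapping (not taken from the paper's implementation):

```python
def e164_to_enum_domain(number: str, suffix: str = "e164.arpa") -> str:
    """Build the ENUM domain used for the DNS NAPTR lookup of an E.164 number."""
    digits = [c for c in number if c.isdigit()]
    return ".".join(reversed(digits)) + "." + suffix

# e164_to_enum_domain("+442079460123") -> "3.2.1.0.6.4.9.7.0.2.4.4.e164.arpa"
```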

  9. Effects of tree species richness and composition on moose winter browsing damage and foraging selectivity: an experimental study.

    Science.gov (United States)

    Milligan, Harriet T; Koricheva, Julia

    2013-07-01

    The optimal foraging theory, the nutrient balance hypothesis, and the plant association theories predict that foraging decisions and resulting tree damage by large mammalian browsers may be influenced by the species richness and species composition of forest stands. This may lead to either associational susceptibility (increased damage on a focal plant in a mixed stand) or associational resistance (reduced damage in a mixed stand). Better understanding of the mechanisms and the relative importance of tree species richness and composition effects on foraging by mammalian browsers is needed to support sustainable management of forests and mammal populations. However, existing knowledge of forest diversity effects on foraging by large mammalian browsers comes largely from observational studies while experimental evidence is limited. We analysed winter browsing by moose (Alces alces L.) in a long-term, large-scale experiment in Finland, which represents a tree species richness gradient from monocultures to 2-, 3- and 5-species mixtures composed of Scots pine (Pinus sylvestris L.), Norway spruce (Picea abies L.), Siberian larch (Larix sibirica Ledeb.), silver birch (Betula pendula Roth.) and black alder (Alnus glutinosa L.). The intensity of browsing per plot increased with tree species richness while browsing selectivity decreased with tree species being targeted more equally in species-rich mixtures. Tree species composition of a plot was also an important determinant of intensity of browsing. The greatest browsing occurred in plots containing preferred species (pine and birch) while intermediate preference species (larch and alder) experienced associational susceptibility when growing with pine and birch compared with their monocultures or mixtures without pine and birch. In contrast, we found no evidence of associational resistance; the presence of a least preferred species (spruce) in a mixture had no significant effect on moose browsing on other tree species. We

  10. Securing SQL server protecting your database from attackers

    CERN Document Server

    Cherry, Denny

    2015-01-01

    SQL server is the most widely-used database platform in the world, and a large percentage of these databases are not properly secured, exposing sensitive customer and business data to attack. In Securing SQL Server, Third Edition, you will learn about the potential attack vectors that can be used to break into SQL server databases as well as how to protect databases from these attacks. In this book, Denny Cherry - a Microsoft SQL MVP and one of the biggest names in SQL server - will teach you how to properly secure an SQL server database from internal and external threats using best practic

  11. Experience with Server Self Service Center (S3C)

    CERN Document Server

    Sucik, J; CERN. Geneva. IT Department

    2010-01-01

    CERN has a successful experience with running Server Self Service Center (S3C) for virtual server provisioning which is based on Microsoft® Virtual Server 2005. With the introduction of Windows Server 2008 and its built-in hypervisor based virtualization (Hyper-V) there are new possibilities for the expansion of the current service. This paper describes the architecture of the redesigned virtual Server Self Service based on Hyper-V which provides dynamically scalable virtualized resources on demand as needed and outlines the possible implications on the future use of virtual machines at CERN.

  12. Experience with Server Self Service Center (S3C)

    CERN Multimedia

    Sucik, J

    2009-01-01

    CERN has a successful experience with running Server Self Service Center (S3C) for virtual server provisioning which is based on Microsoft® Virtual Server 2005. With the introduction of Windows Server 2008 and its built-in hypervisor based virtualization (Hyper-V) there are new possibilities for the expansion of the current service. This paper describes the architecture of the redesigned virtual Server Self Service based on Hyper-V which provides dynamically scalable virtualized resources on demand as needed and outlines the possible implications on the future use of virtual machines at CERN.

  13. Securing SQL Server Protecting Your Database from Attackers

    CERN Document Server

    Cherry, Denny

    2012-01-01

    Written by Denny Cherry, a Microsoft MVP for the SQL Server product, a Microsoft Certified Master for SQL Server 2008, and one of the biggest names in SQL Server today, Securing SQL Server, Second Edition explores the potential attack vectors someone can use to break into your SQL Server database as well as how to protect your database from these attacks. In this book, you will learn how to properly secure your database from both internal and external threats using best practices and specific tricks the author uses in his role as an independent consultant while working on some of the largest

  14. GeneBee-net: Internet-based server for analyzing biopolymers

    Energy Technology Data Exchange (ETDEWEB)

    Brodsky, L.I.; Ivanov, V.V.; Nikolaev, V.K. [Small Scientific Manufacturing Enterprise, Moscow (Russian Federation)] [and others]

    1995-08-01

    This work describes a network server for searching databanks of biopolymer structures and performing other biocomputing procedures; it is available via direct Internet connection. Basic server procedures are dedicated to homology (similarity) search of sequence and 3D structure of proteins. The homologies found could be used to build multiple alignments, predict protein and RNA secondary structure, and construct phylogenetic trees. In addition to traditional methods of sequence similarity search, the authors propose "non-matrix" (correlational) search. An analogous approach is used to identify regions of similar tertiary structure of proteins. Algorithm concepts and usage examples are presented for new methods. Service logic is based upon interaction of a client program and server procedures. The client program allows the compilation of queries and the processing of results of an analysis.

  15. The State of the Art in Locally Distributed Web-Server Systems

    Directory of Open Access Journals (Sweden)

    M. Ezhilvendan

    2013-01-01

    Full Text Available The paper describes novel algorithms for a load balancer that allocates work to a cluster of SIP servers, presenting several load balancing algorithms for distributing Session Initiation Protocol (SIP) requests to the cluster. The approach supports three techniques: CJSQ, TJSQ and TLWL. It combines knowledge of SIP, recognition of the variability in call length, and dynamic estimates of back-end server load for different SIP transactions. The load balancer improves both throughput and response time. SIP is a protocol of growing importance, with uses in VoIP, IPTV, audio conferencing and instant messaging. We present a detailed analysis of occupancy to show how our algorithms significantly reduce response time.
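
    The abstract does not spell the algorithms out, so the sketch below is only an illustration of the Transaction-Least-Work-Left idea: each back-end SIP server is tracked by an estimate of its outstanding transaction work, transactions of the same call stay on the same server, and new calls go to the server with the least estimated work. The transaction weights and bookkeeping here are hypothetical, not the published algorithm.

```python
class TlwlDispatcher:
    """Illustrative TLWL-style SIP dispatcher (simplified assumptions)."""
    WEIGHTS = {"INVITE": 1.75, "BYE": 1.0}     # hypothetical relative costs

    def __init__(self, servers):
        self.work = {s: 0.0 for s in servers}  # estimated outstanding work
        self.call_server = {}                  # Call-ID -> assigned server
        self.open_txn = {}                     # (Call-ID, method) -> weight

    def dispatch(self, call_id, method):
        # Session affinity: later transactions of a call follow its first one.
        server = self.call_server.get(call_id)
        if server is None:
            server = min(self.work, key=self.work.get)
        self.call_server[call_id] = server
        w = self.WEIGHTS.get(method, 1.0)
        self.work[server] += w
        self.open_txn[(call_id, method)] = w
        return server

    def transaction_done(self, call_id, method):
        w = self.open_txn.pop((call_id, method), 0.0)
        server = self.call_server.get(call_id)
        if server is not None:
            self.work[server] -= w
```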

  16. An ECG storage and retrieval system embedded in client server HIS utilizing object-oriented DB.

    Science.gov (United States)

    Wang, C; Ohe, K; Sakurai, T; Nagase, T; Kaihara, S

    1996-02-01

    In the University of Tokyo Hospital, the improved client server HIS has been applied to clinical practice and physicians can order prescription, laboratory examination, ECG examination and radiographic examination, etc. directly by themselves and read results of these examinations, except medical signal waves, schema and image, on UNIX workstations. Recently, we designed and developed an ECG storage and retrieval system embedded in the client server HIS utilizing object-oriented database to take the first step in dealing with digitized signal, schema and image data and show waves, graphics, and images directly to physicians by the client server HIS. The system was developed based on object-oriented analysis and design, and implemented with object-oriented database management system (OODMS) and C++ programming language. In this paper, we describe the ECG data model, functions of the storage and retrieval system, features of user interface and the result of its implementation in the HIS.

  17. 3USS: a web server for detecting alternative 3'UTRs from RNA-seq experiments.

    KAUST Repository

    Le Pera, Loredana

    2015-01-22

    Protein-coding genes with multiple alternative polyadenylation sites can generate mRNA 3'UTR sequences of different lengths, thereby causing the loss or gain of regulatory elements, which can affect stability, localization and translation efficiency. 3USS is a web-server developed with the aim of giving experimentalists the possibility to automatically identify alternative 3'UTRs (shorter or longer with respect to a reference transcriptome), an option that is not available in standard RNA-seq data analysis procedures. The tool reports as putative novel the 3'UTRs not annotated in available databases. Furthermore, if data from two related samples are uploaded, common and specific alternative 3'UTRs are identified and reported by the server. 3USS is freely available at http://www.biocomputing.it/3uss_server.

  18. Data decomposition of Monte Carlo particle transport simulations via tally servers

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K., E-mail: paul.k.romano@gmail.com [Massachusetts Institute of Technology, Department of Nuclear Science and Engineering, 77 Massachusetts Ave., Cambridge, MA 02139 (United States); Siegel, Andrew R., E-mail: siegala@mcs.anl.gov [Argonne National Laboratory, Theory and Computing Sciences, 9700 S Cass Ave., Argonne, IL 60439 (United States); Forget, Benoit, E-mail: bforget@mit.edu [Massachusetts Institute of Technology, Department of Nuclear Science and Engineering, 77 Massachusetts Ave., Cambridge, MA 02139 (United States); Smith, Kord, E-mail: kord@mit.edu [Massachusetts Institute of Technology, Department of Nuclear Science and Engineering, 77 Massachusetts Ave., Cambridge, MA 02139 (United States)

    2013-11-01

    An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations.
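
    The tracker/tally-server split described above can be sketched in a few MPI ranks. The toy below is not OpenMC's implementation (the server:tracker ratio, tally size and score values are made up): it splits the communicator into tally servers that own disjoint slices of the tally array and trackers that ship their score batches to the owning server instead of accumulating them on-node.

```python
# Run with, e.g.:  mpirun -n 8 python tally_servers_sketch.py   (filename is illustrative)
from collections import defaultdict
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
N_SERVERS = max(1, size // 4)            # hypothetical server:tracker ratio
TALLY_SIZE = 10_000                      # global number of tally bins
BLOCK = -(-TALLY_SIZE // N_SERVERS)      # ceil: bins owned by each server
is_server = rank < N_SERVERS
n_trackers = size - N_SERVERS

if is_server:
    # Tally server: accumulate contributions for the slice of bins it owns.
    local = defaultdict(float)
    for _ in range(n_trackers):          # exactly one batch per tracker in this toy
        for idx, score in comm.recv(source=MPI.ANY_SOURCE, tag=0):
            local[idx] += score
else:
    # Tracking processor: pretend a particle batch produced a few scores,
    # then route each score to the server owning that tally bin.
    scores = [((rank * 7) % TALLY_SIZE, 0.37), ((rank * 13) % TALLY_SIZE, 0.12)]
    outgoing = {s: [] for s in range(N_SERVERS)}
    for idx, score in scores:
        outgoing[idx // BLOCK].append((idx, score))
    for server, batch in outgoing.items():
        comm.send(batch, dest=server, tag=0)   # empty batches keep message counts fixed
```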

  19. Effectiveness of fencing and hunting to control Lama guanicoe browsing damage: Implications for Nothofagus pumilio regeneration in harvested forests.

    Science.gov (United States)

    Martínez Pastur, Guillermo; Soler, Rosina; Ivancich, Horacio; Lencinas, María V; Bahamonde, Héctor; Peri, Pablo L

    2016-03-01

    Browsing damage by native ungulates is often considered one of the reasons for regeneration failure in Nothofagus pumilio silvicultural systems. Fencing and hunting in forests at the regeneration phase have been proposed to mitigate browsing effects. This study aims to determine the effectiveness of these control methods in harvested forests, evaluating browsing damage to regeneration as well as climate-related constraints (freezing or desiccation). Forest structure and regeneration plots were established in two exclosures against native ungulates (Lama guanicoe), built with wire fences, in the Chilean portion of Tierra del Fuego island, where tree regeneration density, growth, abiotic damage and quality (multi-stems and base/stem deformation) were assessed. Exclosures did not influence regeneration density (at the initial stage with 1.3 m high). However, sapling height at 10 years old was significantly lower outside (40-50 cm) than inside the exclosures (80-100 cm), and annual height growth also increased, probably as an effect of hunting. Likewise, quality was better inside the exclosures. Alongside browsing, abiotic conditions negatively influenced sapling quality in the regeneration phase (20%-28% of all seedlings), particularly for taller plants (such as those inside the exclosures). This highlights the importance of considering climatic factors when analysing browsing effects. For best results, control of guanaco in recently harvested areas by fencing should be applied in combination with a reduction of guanaco density through continuous hunting. The benefits of mitigation actions (fencing and hunting) on regeneration growth may shorten the regeneration phase period in shelterwood cutting forests (30-50% less time), but incremental costs must be analysed in the framework of management planning by means of long-term studies.

  20. Client/server approach to image capturing

    Science.gov (United States)

    Tuijn, Chris; Stokes, Earle

    1998-01-01

    The diversity of the digital image capturing devices on the market today is quite astonishing and ranges from low-cost CCD scanners to digital cameras (for both action and stand-still scenes), mid-end CCD scanners for desktop publishing and pre- press applications and high-end CCD flatbed scanners and drum- scanners with photo multiplier technology. Each device and market segment has its own specific needs which explains the diversity of the associated scanner applications. What all those applications have in common is the need to communicate with a particular device to import the digital images; after the import, additional image processing might be needed as well as color management operations. Although the specific requirements for all of these applications might differ considerably, a number of image capturing and color management facilities as well as other services are needed which can be shared. In this paper, we propose a client/server architecture for scanning and image editing applications which can be used as a common component for all these applications. One of the principal components of the scan server is the input capturing module. The specification of the input jobs is based on a generic input device model. Through this model we make abstraction of the specific scanner parameters and define the scan job definitions by a number of absolute parameters. As a result, scan job definitions will be less dependent on a particular scanner and have a more universal meaning. In this context, we also elaborate on the interaction of the generic parameters and the color characterization (i.e., the ICC profile). Other topics that are covered are the scheduling and parallel processing capabilities of the server, the image processing facilities, the interaction with the ICC engine, the communication facilities (both in-memory and over the network) and the different client architectures (stand-alone applications, TWAIN servers, plug-ins, OLE or Apple-event driven

  1. Browsing the sky through the ASI Science Data Centre Data Explorer Tool

    CERN Document Server

    D'Elia, V; Verrecchia, F; Gendre, B; Giommi, P

    2010-01-01

    We present here the Data Explorer tool developed at the ASI Science Data Center (ASDC). This tool is designed to provide an efficient and user-friendly way to display information residing in several catalogs stored in the ASDC servers, to cross-correlate this information and to download/analyze data via our scientific tools and/or external services. Our database includes GRB catalogs (such as Swift and Beppo-SAX), which can be queried through the Data Explorer. The GRB fields can be viewed in multiwavelength and the data can be analyzed or retrieved.

  2. Energy Servers Deliver Clean, Affordable Power

    Science.gov (United States)

    2010-01-01

    K.R. Sridhar developed a fuel cell device for Ames Research Center that could use solar power to split water into oxygen for breathing and hydrogen for fuel on Mars. Sridhar saw the potential of the technology, when reversed, to create clean energy on Earth. He founded Bloom Energy, of Sunnyvale, California, to advance the technology. Today, the Bloom Energy Server is providing cost-effective, environmentally friendly energy to a host of companies such as eBay, Google, and The Coca-Cola Company. Bloom's NASA-derived Energy Servers generate energy that is about 67-percent cleaner than a typical coal-fired power plant when using fossil fuels and 100-percent cleaner with renewable fuels.

  3. Berkeley PHOG: PhyloFacts orthology group prediction web server.

    Science.gov (United States)

    Datta, Ruchira S; Meacham, Christopher; Samad, Bushra; Neyer, Christoph; Sjölander, Kimmen

    2009-07-01

    Ortholog detection is essential in functional annotation of genomes, with applications to phylogenetic tree construction, prediction of protein-protein interaction and other bioinformatics tasks. We present here the PHOG web server employing a novel algorithm to identify orthologs based on phylogenetic analysis. Results on a benchmark dataset from the TreeFam-A manually curated orthology database show that PHOG provides a combination of high recall and precision competitive with both InParanoid and OrthoMCL, and allows users to target different taxonomic distances and precision levels through the use of tree-distance thresholds. For instance, OrthoMCL-DB achieved 76% recall and 66% precision on this dataset; at a slightly higher precision (68%) PHOG achieves 10% higher recall (86%). InParanoid achieved 87% recall at 24% precision on this dataset, while a PHOG variant designed for high recall achieves 88% recall at 61% precision, increasing precision by 37% over InParanoid. PHOG is based on pre-computed trees in the PhyloFacts resource, and contains over 366 K orthology groups with a minimum of three species. Predicted orthologs are linked to GO annotations, pathway information and biological literature. The PHOG web server is available at http://phylofacts.berkeley.edu/orthologs/.

  4. Proving the correctness of client/server software

    Indian Academy of Sciences (India)

    Eyad Alkassar; Sebastian Bogan; Wolfgang J Paul

    2009-02-01

    Remote procedure calls (RPCs) lie at the heart of any client/server software. Thus, formal specification and verification of RPC mechanisms is a prerequisite for the verification of any such software. In this paper, we present a mathematical specification of an RPC mechanism and we outline how to prove the correctness of an implementation — say written in C — of this mechanism at the code level. We define a formal model of user processes running concurrently under a simple operating system, which provides inter-process communication and portmapper system calls. A simple theory of non-interference permits us to use conventional sequential program analysis between system calls (within the concurrent model). An RPC mechanism is specified and the correctness proof for server implementations, using this mechanism, is outlined. To the best of our knowledge this is the first treatment of the correctness of an entire RPC mechanism at the code level.

  5. Reporting with Microsoft SQL Server 2012

    CERN Document Server

    Serra, James

    2014-01-01

    This is a step-by-step tutorial that deals with the Microsoft SQL Server 2012 reporting tools: SSRS and Power View. If you are a BI developer, consultant, or architect who wishes to learn how to use SSRS and Power View, and wants to understand the best use for each tool, then this book will get you up and running quickly. No prior experience is required with either tool!

  6. Descriptors of server capabilities in China

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi; Slepniov, Dmitrij; Wæhrens, Brian Vejrum;

    2013-01-01

    China, with the huge market potential it possesses, is an important issue for subsidiaries of western multinational companies. The objective of this paper is therefore to strengthen researchers’ and practitioners’ perspectives on what the descriptors of server capabilities are. The descriptors… are relevant for determining subsidiary roles and as an indication of the capabilities required. These descriptors are identified through an extensive literature review and validated by case studies of two Danish multinational companies' subsidiaries operating in China. They provided the empirical basis

  7. SQL Server 2012 reporting services blueprints

    CERN Document Server

    Ribunal, Marlon

    2013-01-01

    Follow the fictional John Kirkland through a series of real-world reporting challenges based on actual business conditions. Use his detailed blueprints to develop your own reports for every requirement.This book is for report developers, data analysts, and database administrators struggling to master the complex world of effective reporting in SQL Server 2012. Knowledge of how data sources and data sets work will greatly help readers to speed through the tutorials.

  8. Preprint server seeks way to halt plagiarists

    CERN Multimedia

    Giles, J

    2003-01-01

    "An unusual case of plagiarism has struck ArXiv, the popular physics preprint server at Cornell University in Ithaca, New York, resulting in the withdrawal of 22 papers...The plagiarism case traces its origins to June 2002, when Yasushi Watanabe, a high-energy physicist at the Tokyo Insitute of Technology, was contacted by Ramy Noboulsi, who said he was a mathematical physicist" (1 page)

  9. Energy Efficiency in Small Server Rooms: Field Surveys and Findings

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, Iris Hoi; Greenberg, Steve; Mahdavi, Roozbeh; Brown, Richard; Tschudi, William

    2014-08-11

    Fifty-seven percent of US servers are housed in server closets, server rooms, and localized data centers, in what are commonly referred to as small server rooms, which comprise 99 percent of all server spaces in the US. While many mid-tier and enterprise-class data centers are owned by large corporations that consider energy efficiency a goal to minimize business operating costs, small server rooms typically are not similarly motivated. They are characterized by decentralized ownership and management and come in many configurations, which creates a unique set of efficiency challenges. To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions, and selected four of them for detailed assessments. The four rooms had Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy saving opportunities ranged from no- to low-cost measures such as raising cooling set points and better airflow management, to more involved but cost-effective measures including server consolidation and virtualization, and dedicated cooling with economizers. We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers, rather than server rooms. Backup power requirement, and IT and cooling efficiency should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation, and the implementation of energy efficiency measures in small server rooms.
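
    For context, Power Usage Effectiveness is simply the ratio of total facility energy to the energy delivered to the IT equipment. The helper below reproduces the arithmetic behind the 1.5-2.1 values reported for the four assessed rooms; the example numbers are illustrative, not measurements from the survey.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.
    A value of 1.0 would mean every kWh reaches the IT load; higher values
    mean more overhead for cooling, power distribution, etc."""
    return total_facility_kwh / it_equipment_kwh

print(pue(21.0, 10.0))   # 2.1, the upper end of the surveyed rooms
print(pue(15.0, 10.0))   # 1.5, the lower end
```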

  10. The use of interactive graphical maps for browsing medical/health Internet information resources

    Directory of Open Access Journals (Sweden)

    Boulos Maged

    2003-01-01

    Full Text Available Abstract As online information portals accumulate metadata descriptions of Web resources, it becomes necessary to develop effective ways for visualising and navigating the resultant huge metadata repositories as well as the different semantic relationships and attributes of described Web resources. Graphical maps provide a good method to visualise, understand and navigate a world that, like the Web, is too large and complex to be seen directly. Several examples of maps designed as a navigational aid for Web resources are presented in this review with an emphasis on maps of medical and health-related resources. The latter include HealthCyberMap maps http://healthcybermap.semanticweb.org/, which can be classified as conceptual information space maps, and the very abstract and geometric Visual Net maps of PubMed http://map.net (for demos). Information resources can also be organised and navigated based on their geographic attributes. Some of the maps presented in this review use a Kohonen Self-Organising Map algorithm, and only HealthCyberMap uses a Geographic Information System to classify Web resource data and render the maps. Maps based on familiar metaphors taken from users' everyday life are much easier to understand. Associative and pictorial map icons that enable instant recognition and comprehension are preferred to geometric ones and are key to successful maps for browsing medical/health Internet information resources.

  11. Your browsing behavior for a Big Mac: Economics of Personal Information Online

    CERN Document Server

    Carrascal, Juan Pablo; Erramilli, Vijay; Cherubini, Mauro; de Oliveira, Rodrigo

    2011-01-01

    Most online services (Google, Facebook etc.) operate by providing a service to users for free, and in return they collect and monetize personal information (PI) of the users. This operational model is inherently economic, as the "good" being traded and monetized is PI. This model is coming under increased scrutiny as online services are moving to capture more PI of users, raising serious privacy concerns. However, little is known on how users valuate different types of PI while being online, as well as the perceptions of users with regards to exploitation of their PI by online service providers. In this paper, we study how users valuate different types of PI while being online, while capturing the context by relying on Experience Sampling. We were able to extract the monetary value that 168 participants put on different pieces of PI. We find that users value their PI related to their offline identities more (3 times) than their browsing behavior. Users also value information pertaining to financial transactio...

  12. Brown world forests: increased ungulate browsing keeps temperate trees in recruitment bottlenecks in resource hotspots.

    Science.gov (United States)

    Churski, Marcin; Bubnicki, Jakub W; Jędrzejewska, Bogumiła; Kuijper, Dries P J; Cromsigt, Joris P G M

    2017-04-01

    Plant biomass consumers (mammalian herbivory and fire) are increasingly seen as major drivers of ecosystem structure and function but the prevailing paradigm in temperate forest ecology is still that their dynamics are mainly bottom-up resource-controlled. Using conceptual advances from savanna ecology, particularly the demographic bottleneck model, we present a novel view on temperate forest dynamics that integrates consumer and resource control. We used a fully factorial experiment, with varying levels of ungulate herbivory and resource (light) availability, to investigate how these factors shape recruitment of five temperate tree species. We ran simulations to project how inter- and intraspecific differences in height increment under the different experimental scenarios influence long-term recruitment of tree species. Strong herbivore-driven demographic bottlenecks occurred in our temperate forest system, and bottlenecks were as strong under resource-rich as under resource-poor conditions. Increased browsing by herbivores in resource-rich patches strongly counteracted the increased escape strength of saplings in these patches. This finding is a crucial extension of the demographic bottleneck model which assumes that increased resource availability allows plants to more easily escape consumer-driven bottlenecks. Our study demonstrates that a more dynamic understanding of consumer-resource interactions is necessary, where consumers and plants both respond to resource availability.

  13. An Attention-Information-Based Spatial Adaptation Framework for Browsing Videos via Mobile Devices

    Directory of Open Access Journals (Sweden)

    Li Houqiang

    2007-01-01

    Full Text Available With the growing popularity of personal digital assistant devices and smart phones, more and more consumers are becoming quite enthusiastic to appreciate videos via mobile devices. However, limited display size of the mobile devices has been imposing significant barriers for users to enjoy browsing high-resolution videos. In this paper, we present an attention-information-based spatial adaptation framework to address this problem. The whole framework includes two major parts: video content generation and video adaptation system. During video compression, the attention information in video sequences will be detected using an attention model and embedded into bitstreams with the proposed supplement-enhanced information (SEI) structure. Furthermore, we also develop an innovative scheme to adaptively adjust quantization parameters in order to simultaneously improve the quality of overall encoding and the quality of transcoding the attention areas. When the high-resolution bitstream is transmitted to mobile users, a fast transcoding algorithm we developed earlier will be applied to generate a new bitstream for attention areas in frames. The new low-resolution bitstream containing mostly attention information, instead of the high-resolution one, will be sent to users for display on the mobile devices. Experimental results show that the proposed spatial adaptation scheme is able to improve both subjective and objective video qualities.

  14. GAPforAPE: an augmented browsing system to improve Web 2.0 accessibility

    Science.gov (United States)

    Mirri, Silvia; Salomoni, Paola; Prandi, Catia; Muratori, Ludovico Antonio

    2012-09-01

    The Web 2.0 evolution has spread more interactive technologies which affected accessibility for users who navigate the Web by using assistive technologies. In particular, the partial download of new data, the continuous refreshing, and the massive use of scripting can represent significant barriers especially for people with visual impairments, who enjoy the Web by means of screen readers. On the other hand, such technologies can be an opportunity, because they can provide a new means of transcoding Web content, making the Web more accessible. In this article we present GAPforAPE, an augmented browsing system (based on Web browsers extensions) which offers a user's profiling system and transcodes Web content according to constrains declared by users: the same Web page is provided to any user, but GAPforAPE computes adequate customizations, by exploiting scripting technologies which usually affect Web pages accessibility. GAPforAPE imitates screen readers behavior: it applies a specific set of transcoding scripts devoted to a given Web site, when available, and a default set of transcoding operations otherwise. The continuous and quick evolution of the Web has shown that a crowdsourcing system is a desirable solution, letting the transcoding scripts evolve in the same way.

  15. The RAST Server: Rapid Annotations using Subsystems Technology

    Directory of Open Access Journals (Sweden)

    Overbeek Ross A

    2008-02-01

    Full Text Available Abstract Background The number of prokaryotic genome sequences becoming available is growing steadily and is growing faster than our ability to accurately annotate them. Description We describe a fully automated service for annotating bacterial and archaeal genomes. The service identifies protein-encoding, rRNA and tRNA genes, assigns functions to the genes, predicts which subsystems are represented in the genome, uses this information to reconstruct the metabolic network and makes the output easily downloadable for the user. In addition, the annotated genome can be browsed in an environment that supports comparative analysis with the annotated genomes maintained in the SEED environment. The service normally makes the annotated genome available within 12–24 hours of submission, but ultimately the quality of such a service will be judged in terms of accuracy, consistency, and completeness of the produced annotations. We summarize our attempts to address these issues and discuss plans for incrementally enhancing the service. Conclusion By providing accurate, rapid annotation freely to the community we have created an important community resource. The service has now been utilized by over 120 external users annotating over 350 distinct genomes.

  16. PERFORMANCE EVALUATION OF DIRECT PROCESSOR ACCESS FOR NON DEDICATED SERVER

    Directory of Open Access Journals (Sweden)

    P. S. BALAMURUGAN

    2010-10-01

    Full Text Available The objective of the paper is to design a co-processor for a desktop machine that enables the machine to act as a non-dedicated server, such that the co-processor acts as the server processor while the multi-core processor acts as the desktop processor. By implementing this methodology, a client machine can be made to act as both a non-dedicated server and a client machine. This type of machine can be used in autonomy networks. The design leads to a cost-effective machine that can act in parallel as a non-dedicated server and a client machine, or can be switched to act as either client or server.

  17. A CONFERENCE CONTROL MODEL BETWEEN A WEB SERVER AND A TELECOM APPLICATION SERVER

    Institute of Scientific and Technical Information of China (English)

    Wang Kaixi; Yang Fangchun

    2008-01-01

    The paper proposes a conference control model between a web server and a telecom application server, referred to as the Conference Directed Graph (CDG), and describes an asynchronous communication mechanism between them. The Corba Interface Definition Language (IDL) interfaces are defined, and a message sequence chart is illustrated. This web conference control model provides conference users with a new approach to manage and control a conference and the participants. The performance of the system prototype is analyzed and verified in the 863 project named "The Multimedia and Mobile Services Enabled Soft-switch System".

  18. Observations on the seasonal browsing and grazing behaviour of camels (Camelus dromedarius in southern Darfur-Sudan

    Directory of Open Access Journals (Sweden)

    Alia S. A. Amin,

    2011-04-01

    Full Text Available Observations of camel behaviour during browsing and grazing were recorded in the dry and green seasons in southern Darfur (latitudes 8º 30' and 13º 30' North), using apparently healthy free-ranging camels during the months of March – May (dry season) and August – September (green season). A total of 210 indigenous Arabian camels of different ages was used in this study. Camels were observed to be selective browsers rather than grazers during both the dry and green seasons; they were also able to consume whatever plants were available to fulfil their needs during the dry season. Camels did not stay long on a single plant species, but were observed to take several mouthfuls and move on to another plant of the same or a different species, browsing young green stems or branches, with or without thorns, together with leaves, young growing shoots, flowers and fruits during the green season; during the dry season, however, they were observed to concentrate on certain evergreen trees and bushes, together with dry grasses when found, in the dry wadi beds. Camels are selective feeders not only with regard to plant species but also with regard to the parts of the plants they eat. On the natural range they browse and graze at any time of the day, but they tend to avoid feeding during the hottest period of the day and adopt positions. Camels prefer to feed on bushes and trees due to their anatomical adaptations. These findings indicate that camels are able to adapt themselves to seasonal pasture fluctuations without affecting the trees they browse, because of their selectivity in choosing some parts rather than the entire plant.

  19. Analysis of search and browsing behavior of young users on the web

    NARCIS (Netherlands)

    Duarte Torres, Sergio; Weber, Ingmar; Hiemstra, Djoerd; Najork, M.

    2014-01-01

    The Internet is increasingly used by young children for all kinds of purposes. Nonetheless, there are not many resources especially designed for children on the Internet and most of the content online is designed for grown up users. This situation is problematic if we consider the large differences

  20. New Features of Windows Server 2008

    Institute of Scientific and Technical Information of China (English)

    胡岚

    2009-01-01

    Windows Server 2008 is fundamentally improved over previous versions, with technical breakthroughs and user-oriented design especially in virtualization, Server Core, AD DS, manageability and security. This article describes the new features and functions of Windows Server 2008.

  1. Design of a distributed CORBA based image processing server.

    Science.gov (United States)

    Giess, C; Evers, H; Heid, V; Meinzer, H P

    2000-01-01

    This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way with this server. Data exchange and conversion is done automatically inside the server, hiding these tasks from the user. The different image processing tools are visible as one large collection of algorithms and due to the use of CORBA are accessible via intra-/internet.

  2. Swiss EMBnet node web server.

    Science.gov (United States)

    Falquet, Laurent; Bordoli, Lorenza; Ioannidis, Vassilios; Pagni, Marco; Jongeneel, C Victor

    2003-07-01

    EMBnet is a consortium of collaborating bioinformatics groups located mainly within Europe (http://www.embnet.org). Each member country is represented by a 'node', a group responsible for the maintenance of local services for their users (e.g. education, training, software, database distribution, technical support, helpdesk). Among these services a web portal with links and access to locally developed and maintained software is essential and different for each node. Our web portal targets biomedical scientists in Switzerland and elsewhere, offering them access to a collection of important sequence analysis tools mirrored from other sites or developed locally. We describe here the Swiss EMBnet node web site (http://www.ch.embnet.org), which presents a number of original services not available anywhere else.

  3. Browsing preference and ecological carrying capacity of sambar deer (Cervus unicolor brookei) on secondary vegetation in forest plantation.

    Science.gov (United States)

    Ismail, Dahlan; Jiwan, Dawend

    2015-02-01

    The browsing preference and ecological carrying capacity (ECC) of sambar deer (Cervus unicolor brookei) in acacia plantations for management and conservation of the ecosystem were investigated at Sabal Forest Reserve in Sarawak, Malaysia. The identification of the species browsed by the sambar deer was based on an observation of the plant parts consumed. ECC estimation was based on body weight (BW) and the physiological stages of animals browsed in six fenced 4-ha paddocks. Sambar deer were found foraging on only 29 out of 42 species of secondary vegetation in the acacia plantation. The remaining species are too high for the deer to reach. The planted species, Shorea macrophylla, is not palatable to the deer. This augurs well for the integration of sambar deer into shorea plantations. The most frequently exploited plants were Ficus spp. Sambar deer preferred woody species to non-woody species and are browser animals. With the vegetation producing metabolizable energy of 19,000 to 27,000 MJ/ha, the ECC was 5 head/ha to 5.25 head/ha. Given its contribution to the conservation of wildlife and its capacity to sustain the ecosystem, the sambar deer integrated farming system offers a promising strategy for the future of tropical forestry management.
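
    The carrying-capacity figure quoted above is essentially an energy budget: available metabolizable energy (ME) per hectare divided by the requirement per animal. The snippet below is only a back-of-envelope illustration of that division; the per-head requirement is a hypothetical value, not a figure from the study.

```python
def ecc_head_per_ha(me_supply_mj_per_ha: float, me_need_mj_per_head: float) -> float:
    """Ecological carrying capacity = forage energy supply / per-animal demand."""
    return me_supply_mj_per_ha / me_need_mj_per_head

# With a hypothetical requirement of ~3,800 MJ per animal over the period:
print(ecc_head_per_ha(19_000, 3_800))   # -> 5.0 head/ha (lower end of supply)
# The study's 5-5.25 head/ha values follow from its own measured requirements.
```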

  4. Securing SQL Server Protecting Your Database from Attackers

    CERN Document Server

    Cherry, Denny

    2011-01-01

    There is a lot at stake for administrators taking care of servers, since they house sensitive data like credit cards, social security numbers, medical records, and much more. In Securing SQL Server you will learn about the potential attack vectors that can be used to break into your SQL Server database, and how to protect yourself from these attacks. Written by a Microsoft SQL Server MVP, you will learn how to properly secure your database, from both internal and external threats. Best practices and specific tricks employed by the author will also be revealed. Learn expert techniques to protec

  5. DYNAMIC REQUEST DISPATCHING ALGORITHM FOR WEB SERVER CLUSTER

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The overall increase in traffic on the WWW causes a disproportionate increase in client requests to popular web sites. Site administrators constantly face the requirement to improve server capacity. A web server cluster is a popular solution. It uses a group of independent servers that are managed as a single system for higher availability, easier manageability and greater scalability. Many web sites have adopted this solution. Request dispatching [1-2] is one of the core technologies used by parallel web server clusters...

  6. Thermal-aware relocation of servers in green data centers

    Institute of Scientific and Technical Information of China (English)

    Muhammad Tayyab CHAUDHRY; T C LING; S A HUSSAIN; Xin-zhu LU

    2015-01-01

    Rise in inlet air temperature increases the corresponding outlet air temperature from the server. As an added effect of rise in inlet air temperature, some active servers may start exhaling intensely hot air to form a hotspot. Increase in hot air temperature and occasional hotspots are an added burden on the cooling mechanism and result in energy wastage in data centers. The increase in inlet air temperature may also result in failure of server hardware. Identifying and comparing the thermal sensitivity to inlet air temperature for various servers helps in the thermal-aware arrangement and location switching of servers to minimize the cooling energy wastage. The peak outlet temperature among the relocated servers can be lowered and even be homogenized to reduce the cooling load and chances of hotspots. Based upon mutual comparison of inlet temperature sensitivity of heterogeneous servers, this paper presents a proactive approach for thermal-aware relocation of data center servers. The experimental results show that each relocation operation has a cooling energy saving of as much as 2.1 kW·h and lowers the chances of hotspots by over 77%. Thus, the thermal-aware relocation of servers helps in the establishment of green data centers.
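
    One way to read the relocation idea in code: if each server's outlet-temperature sensitivity to inlet temperature has been measured, a greedy assignment that places the most sensitive machines at the coolest inlet positions lowers and homogenizes peak outlet temperatures. This is only a sketch of the concept, with made-up server names and temperatures, not the algorithm evaluated in the paper.

```python
def thermal_aware_relocation(sensitivity, inlet_temp):
    """Greedy sketch: most inlet-sensitive servers go to the coolest positions.
    sensitivity: server -> outlet rise per degree of inlet rise (assumed measured)
    inlet_temp:  rack position -> inlet air temperature (deg C)
    Returns a mapping server -> position."""
    servers = sorted(sensitivity, key=sensitivity.get, reverse=True)
    positions = sorted(inlet_temp, key=inlet_temp.get)
    return dict(zip(servers, positions))

plan = thermal_aware_relocation(
    {"srv-a": 1.4, "srv-b": 0.9, "srv-c": 1.1},
    {"rack1-top": 27.0, "rack1-mid": 24.5, "rack2-bot": 22.0},
)
# srv-a (most sensitive) -> rack2-bot (coolest inlet), and so on.
```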

  7. A Request Distribution Algorithm for Web Server Cluster

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2011-12-01

    Full Text Available With the explosive increase in the workloads of web-based applications, web server clusters face challenges in request response time. Request distribution among the servers in a web server cluster is the key to addressing this challenge, especially under heavy workloads. In this paper, we propose a new request distribution algorithm named llac (least load active cache) for the load-balancing switch in a web server cluster. The goal of llac is to improve the cache hit rate and reduce response time. Packets are parsed at the IP level, and back-end servers are notified to cache hot files using link change technology, without changing URL information or modifying the service program. This avoids switching overhead between user mode and kernel mode. The load-balancing switch creates a connection directly with the selected server, avoiding connection migration overhead. The policy estimates the current composite load of each server and selects the server with the least load to serve the request. It also improves the resource utilization of web servers. Experimental results show that llac achieves better performance for web applications than wrr (weighted round robin), a popular request distribution policy.
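
    A minimal sketch of the "select the least-loaded back-end" step described above. The composite load metric here (active connections only) is a placeholder; the paper combines several load indicators, and the server names are invented:

        import java.util.HashMap;
        import java.util.Map;

        // Picks the back-end with the smallest current load and accounts for
        // the new request on it; load is approximated by active connections.
        public class LeastLoadSelector {
            private final Map<String, Integer> activeConnections = new HashMap<>();

            public void register(String server) { activeConnections.put(server, 0); }

            public synchronized String select() {
                String best = null;
                for (Map.Entry<String, Integer> e : activeConnections.entrySet()) {
                    if (best == null || e.getValue() < activeConnections.get(best)) {
                        best = e.getKey();
                    }
                }
                activeConnections.merge(best, 1, Integer::sum);
                return best;
            }

            public synchronized void requestFinished(String server) {
                activeConnections.merge(server, -1, Integer::sum);
            }

            public static void main(String[] args) {
                LeastLoadSelector s = new LeastLoadSelector();
                s.register("web-1");
                s.register("web-2");
                System.out.println(s.select());
                System.out.println(s.select());
            }
        }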

  8. Server-Aided Two-Party Computation with Simultaneous Corruption

    DEFF Research Database (Denmark)

    Cascudo Pueyo, Ignacio; Damgård, Ivan Bjerre; Ranellucci, Samuel

    We consider secure two-party computation in the client-server model where there are two adversaries that operate separately but simultaneously, each of them corrupting one of the parties and a restricted subset of servers that they interact with. We model security via the local universal composability...

  9. Foundations of SQL Server 2008 R2 Business Intelligence

    CERN Document Server

    Fouche, Guy

    2011-01-01

    Foundations of SQL Server 2008 R2 Business Intelligence introduces the entire exciting gamut of business intelligence tools included with SQL Server 2008. Microsoft has designed SQL Server 2008 to be more than just a database. It's a complete business intelligence (BI) platform. The database is at its core, and surrounding the core are tools for data mining, modeling, reporting, analyzing, charting, and integration with other enterprise-level software packages. SQL Server 2008 puts an incredible amount of BI functionality at your disposal. But how do you take advantage of it? That's what this

  10. Pro SQL Server 2012 relational database design and implementation

    CERN Document Server

    Davidson, Louis

    2012-01-01

    Learn effective and scalable database design techniques in a SQL Server environment. Pro SQL Server 2012 Relational Database Design and Implementation covers everything from design logic that business users will understand, all the way to the physical implementation of design in a SQL Server database. Grounded in best practices and a solid understanding of the underlying theory, Louis Davidson shows how to "get it right" in SQL Server database design and lay a solid groundwork for the future use of valuable business data. Gives a solid foundation in best practices and relational theory Covers

  11. MocServer: What & Where in a few milliseconds

    CERN Document Server

    Fernique, Pierre; Oberto, Anais; Pineau, Francois-Xavier

    2016-01-01

    The MocServer is an astronomical service dedicated to the manipulation of data set coverages. This server brings together about 15 000 spatial footprints associated with catalogs, databases and pixel surveys. Thanks to the Multi-Order Coverage map coding method (MOC1.0 IVOA standard), the MocServer is able to provide in a few milliseconds the list of data set identifiers intersecting any polygon on the sky. The MOC server was deployed in June 2015 by the Centre de Donnees astronomiques de Strasbourg. It is operational and already in use by Aladin Desktop and Aladin Lite prototype versions.

  12. 2004 progress report : Effects of ungulate browsing on post-fire recovery of riparian cottonwoods : Implications for management of riparian forests, Seedskadee National Wildlife Refuge, Wyoming

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Browsing pressure by ungulates may limit natural establishment of native cottonwood and willow stands, and fires, which have become more frequent on riparian lands...

  13. The Event Browser: An Intuitive Approach to Browsing BaBar Object Databases

    Institute of Scientific and Technical Information of China (English)

    Adeyemi Adesanya

    2001-01-01

    Providing efficient access to more than 300 TB of experiment data is the responsibility of the BaBar Databases Group. Unlike generic tools, the Event Browser presents users with an abstraction of the BaBar data model. Multithreaded CORBA servers perform database operations using small transactions in an effort to avoid lock contention issues and provide adequate response times. The GUI client is implemented in Java and can be easily deployed throughout the community in the form of a web applet. The browser allows users to examine collections of related physics events and identify associations between the collections and the physical files in which they reside, helping administrators distribute data to other sites worldwide. This paper discusses the various aspects of the Event Browser including requirements, design challenges and key features of the current implementation.

  14. BPhyOG: An interactive server for genome-wide inference of bacterial phylogenies based on overlapping genes

    Directory of Open Access Journals (Sweden)

    Lin Kui

    2007-07-01

    Full Text Available Abstract Background Overlapping genes (OGs) in bacterial genomes are pairs of adjacent genes whose coding sequences overlap partly or entirely. With the rapid accumulation of sequence data, many OGs in bacterial genomes have now been identified. Indeed, these might prove a consistent feature across all microbial genomes. Our previous work suggests that OGs can be considered as robust markers at the whole-genome level for the construction of phylogenies. An online, interactive web server for inferring phylogenies is needed for biologists to analyze phylogenetic relationships among a set of bacterial genomes of interest. Description BPhyOG is an online interactive server for reconstructing the phylogenies of completely sequenced bacterial genomes on the basis of their shared overlapping genes. It provides two tree-reconstruction methods: Neighbor Joining (NJ) and the Unweighted Pair-Group Method using Arithmetic averages (UPGMA). Users can apply the desired method to generate phylogenetic trees, which are based on an evolutionary distance matrix for the selected genomes. The distance between two genomes is defined by the normalized number of their shared OG pairs. BPhyOG also allows users to browse the OGs that were used to infer the phylogenetic relationships. It provides detailed annotation for each OG pair and the features of the component genes through hyperlinks. Users can also retrieve each of the homologous OG pairs that have been determined among 177 genomes. It is a useful tool for analyzing the tree of life and overlapping genes from a genomic standpoint. Conclusion BPhyOG is a useful interactive web server for genome-wide inference of any potential evolutionary relationship among the genomes selected by users. It currently includes 177 completely sequenced bacterial genomes containing 79,855 OG pairs, the annotation and homologous OG pairs of which are integrated comprehensively. The reliability of phylogenies complemented by
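
    A sketch of building a genome distance matrix from shared overlapping-gene (OG) pairs, in the spirit of the abstract. The exact normalization used by BPhyOG is not spelled out here; the code takes the distance as 1 minus the shared pairs divided by the smaller of the two genomes' OG counts, which is one plausible choice, and the genome names and OG identifiers are invented. The resulting matrix would then be fed to NJ or UPGMA tree building, which is not shown:

        import java.util.HashMap;
        import java.util.HashSet;
        import java.util.Map;
        import java.util.Set;

        // Pairwise genome distance from shared overlapping-gene pairs.
        public class SharedOgDistance {

            static double distance(Set<String> ogsA, Set<String> ogsB) {
                Set<String> shared = new HashSet<>(ogsA);
                shared.retainAll(ogsB);
                double norm = Math.min(ogsA.size(), ogsB.size());
                return 1.0 - shared.size() / norm;
            }

            public static void main(String[] args) {
                Map<String, Set<String>> genomes = new HashMap<>();
                genomes.put("genomeA", Set.of("og1", "og2", "og3"));
                genomes.put("genomeB", Set.of("og2", "og3", "og4"));
                genomes.put("genomeC", Set.of("og5"));

                for (String a : genomes.keySet()) {
                    for (String b : genomes.keySet()) {
                        System.out.printf("d(%s,%s) = %.2f%n", a, b,
                                distance(genomes.get(a), genomes.get(b)));
                    }
                }
            }
        }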

  15. MARSIS data and simulation exploited using array databases: PlanetServer/EarthServer for sounding radars

    Science.gov (United States)

    Cantini, Federico; Pio Rossi, Angelo; Orosei, Roberto; Baumann, Peter; Misev, Dimitar; Oosthoek, Jelmer; Beccati, Alan; Campalani, Piero; Unnithan, Vikram

    2014-05-01

    MARSIS is an orbital synthetic aperture radar for both ionosphere and subsurface sounding on board ESA's Mars Express (Picardi et al. 2005). It transmits electromagnetic pulses centered at 1.8, 3, 4 or 5 MHz that penetrate below the surface and are reflected by compositional and/or structural discontinuities in the subsurface of Mars. MARSIS data are available as a collection of single-orbit data files. The availability of tools for more effective access to such data would greatly ease data analysis and exploitation by the community of users. For this purpose, we are developing a database built on the raster database management system RasDaMan (e.g. Baumann et al., 1994), to be populated with MARSIS data and integrated in the PlanetServer/EarthServer (e.g. Oosthoek et al., 2013; Rossi et al., this meeting) project. The data (and related metadata) are stored in the db for each frequency used by the MARSIS radar. The capability of retrieving data belonging to a certain orbit or to multiple orbits on the basis of latitude/longitude boundaries is a key requirement of the db design, allowing, besides the "classical" radargram representation of the data, and in areas with sufficiently high orbit density, 3D data extraction, subsetting and analysis of subsurface structures. Moreover, the use of the OGC WCPS (Web Coverage Processing Service) standard can allow calculations on database query results for multiple echoes and/or subsets of a certain data product. Because of the low directivity of its dipole antenna, MARSIS receives echoes from portions of the surface of Mars that are distant from nadir and can be mistakenly interpreted as subsurface echoes. For this reason, methods have been developed to simulate surface echoes (e.g. Nouvel et al., 2004), to reveal the true origin of an echo through comparison with instrument data. These simulations are usually time-consuming, and so far have been performed either on a case-by-case basis or in some simplified form. A code for

  16. Determination of Browse Intake and Nutrient Digestibility of Grazing West African Dwarf (WAD) Goats Fed Varying Levels of Gmelina arborea Leaves as Supplements in Delta State Nigeria

    OpenAIRE

    O. Okpara; P.O. Akporhuarho; G.O. Okagbare

    2014-01-01

    The research was carried out to assess the browse intake and nutrient digestibility of grazing West African Dwarf (WAD) goats fed varying levels of Gmelina arborea leaves as a supplement. Gmelina arborea produces an appreciable amount of forage even at the peak of the dry season in the tropics, thereby ensuring a year-round supply of foliage and fodder. Thirty growing West African Dwarf (WAD) goats were used to determine the level of browse intake and nutrient digestibility by goats fed varying levels of Gm...

  17. Optimization environments and the NEOS server

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, W.; More, J.J. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.

    1997-03-01

    The authors are interested in the development of problem-solving environments that simplify the formulation of optimization problems, and the access to computational resources. Once the problem has been formulated, the first step in solving an optimization problem in a typical computational environment is to identify and obtain the appropriate piece of optimization software. Once the software has been installed and tested in the local environment, the user must read the documentation and write code to define the optimization problem in the manner required by the software. Typically, Fortran or C code must be written to define the problem, compute function values and derivatives, and specify sparsity patterns. Finally, the user must debug, compile, link, and execute the code. The Network-Enabled Optimization System (NEOS) is an Internet-based service for optimization providing information, software, and problem-solving services for optimization. The main components of NEOS are the NEOS Guide and the NEOS Server. The current version of the NEOS Server is described in Section 2. The authors emphasize nonlinear optimization problems, but NEOS does handle linear and nonlinearly constrained optimization problems, and solvers for optimization problems subject to integer variables are being added. In Section 4 the authors begin to explore possible extensions to the NEOS Server by discussing the addition of solvers for global optimization problems. Section 5 discusses how a remote procedure call (RPC) interface to NEOS addresses some of the limitations of NEOS in the areas of security and usability. The detailed implementation of such an interface raises a number of questions, such as exactly how the RPC is implemented, what security or authentication approaches are used, and what techniques are used to improve the efficiency of the communication. They outline some of the issues in network computing that arise from the emerging style of computing used by NEOS.

  18. Individual variation of isotopic niches in grazing and browsing desert ungulates.

    Science.gov (United States)

    Lehmann, D; Mfune, J K E; Gewers, E; Brain, C; Voigt, C C

    2015-09-01

    Ungulates often adjust their diet when food availability varies over time. However, it is poorly understood when and to what extent individuals change their diet and, if they do so, if all individuals of a population occupy distinct or similar dietary niches. In the arid Namibian Kunene Region, we studied temporal variations of individual niches in grazing gemsbok (Oryx gazella gazella) and predominantly browsing springbok (Antidorcas marsupialis). We used variation in stable C and N isotope ratios of tail hair increments as proxies to estimate individual isotopic dietary niches and their temporal plasticity. Isotopic dietary niches of populations of the two species were mutually exclusive, but similar in breadth. Isotopic niche breadth of gemsbok was better explained by within-individual variation than by between-individual variation of stable isotope ratios, indicating that gemsbok individuals were facultative specialists in using isotopically distinct local food resources. In contrast, inter- and intra-individual variations contributed similarly to the isotopic niche breadth of the springbok population, suggesting a higher degree of individual isotopic segregation in a more generalist ungulate. In both species, between-individual variation was neither explained by changes in plant primary productivity, sex, geographical position nor by group size. Within species, individual dietary niches overlapped partially, suggesting that both populations included individuals with distinct isotopic dietary niches. Our study provides the first evidence for isotopic dietary niche segregation in individuals of two distinct desert ungulates. Similar, yet isotopically distinct dietary niches of individuals may facilitate partitioning of food resources and thus individual survival in desert ecosystems.

  19. Client Server Model Based DAQ System for Real-Time Air Pollution Monitoring

    Directory of Open Access Journals (Sweden)

    Vetrivel. P

    2014-01-01

    Full Text Available The proposed system consists of a client-server model based data-acquisition unit. The embedded web server integrates the pollution server and the DAQ that collects air pollutant levels (CO, NO2, and SO2). The pollution server is designed with modern resource-constrained embedded systems in mind. In contrast, an application server is designed for the efficient execution of programs and scripts supporting the construction of various applications. While the pollution server mainly deals with sending HTML for display in a web browser on the client terminal, an application server provides access to server-side logic for pollutant levels to be used by client application programs. The embedded web server is an ARM MCB2300 board with internet connectivity; this standalone device both gathers air pollutant levels and acts as the air pollution server. The embedded web server is accessed by various clients.

  20. A Fault-Tolerant Architecture for Parlay Application Server

    Institute of Scientific and Technical Information of China (English)

    LI Yong-ping; CHEN Jun-liang

    2004-01-01

    As the value-added service providing system in Next-Generation Networks (NGN), Application Servers (AS) are required to provide carrier-class reliability. To increase the reliability of the AS, fault-tolerant technology is often adopted. This paper proposes a fault-tolerant architecture protecting the AS against single-point faults. The result of the analysis shows that the architecture has good reliability and is easily extendable. Such an advantage is attributed to a special fault-tolerant design, which differs from others in that two Service Logic Program (SLP) instances not only provide backups for each other, but also share the service traffic.
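
    A hedged sketch of the active-active pairing described above: two SLP instances share the traffic in normal operation, and either one absorbs the full load when its peer fails. Failure detection here is just a boolean flag standing in for a real heartbeat mechanism, and all names are illustrative:

        // Two SLP instances back each other up while sharing traffic.
        public class SlpPair {

            static class SlpInstance {
                final String name;
                volatile boolean alive = true;
                SlpInstance(String name) { this.name = name; }
                void handle(String request) { System.out.println(name + " handles " + request); }
            }

            private final SlpInstance a = new SlpInstance("SLP-1");
            private final SlpInstance b = new SlpInstance("SLP-2");
            private boolean toggle;

            public synchronized void dispatch(String request) {
                SlpInstance primary = toggle ? a : b;
                SlpInstance backup  = toggle ? b : a;
                toggle = !toggle;                       // share traffic while both are up
                if (primary.alive) {
                    primary.handle(request);
                } else if (backup.alive) {
                    backup.handle(request);             // peer takes over on a single-point fault
                } else {
                    throw new IllegalStateException("both SLP instances are down");
                }
            }

            public static void main(String[] args) {
                SlpPair pair = new SlpPair();
                pair.dispatch("call-setup-1");
                pair.dispatch("call-setup-2");
                pair.a.alive = false;                   // simulate a single-point fault
                pair.dispatch("call-setup-3");          // SLP-2 now takes all traffic
            }
        }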

  1. HS06 Benchmark for an ARM Server

    Science.gov (United States)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  2. HS06 Benchmark for an ARM Server

    CERN Document Server

    Kluth, Stefan

    2013-01-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  3. Professional Microsoft SQL Server 2012 Reporting Services

    CERN Document Server

    Turley, Paul; Silva, Thiago; Withee, Ken; Paisley, Grant

    2012-01-01

    A must-have guide for the latest updates to the new release of Reporting Services SQL Server Reporting Services allows you to create reports and business intelligence (BI) solutions. With this updated resource, a team of experts shows you how Reporting Services makes reporting faster, easier and more powerful than ever in web, desktop, and portal solutions. New coverage discusses the new reporting tool called Crescent, BI semantic model's impact on report design and creation, semantic model design, and more. You'll explore the major enhancements to Report Builder and benefit from best practice

  4. Getting started with Microsoft Lync server 2013

    CERN Document Server

    Volpe, Fabrizio

    2013-01-01

    This book has a practical approach with a lot of step-by-step guides and explanations as to where and why we're doing the various operations.Getting Started with Microsoft Lync Server 2013 is a starting point for system administrators, IT pros, unified communication technicians, and decision makers in companies or in the consultancy business. For people who have never managed Lync (or a U.C. product), the book will guide you through the basic concepts and mistakes. If you are already managing a Lync deployment you will find important explanations and ideas put together in a single text. If you

  5. Implementing VMware vCenter Server

    CERN Document Server

    Kuminsky, Konstantin

    2013-01-01

    This book is a practical, hands-on guide that will help you learn everything you need to know to administer your environment with VMware vCenter Server. Throughout the book, there are best practices and useful tips and tricks which can be used for day-to-day tasks.If you are an administrator or a technician starting with VMware, with little or no knowledge of virtualization products, this book is ideal for you. Even if you are an IT professional looking to expand your existing environment, you will be able to use this book to help you improve the management of these environments. IT managers w

  6. Winter browse selection by white-tailed deer and implications for bottomland forest restoration in the Upper Mississippi River Valley, USA

    Science.gov (United States)

    Cogger, Benjamin J.; De Jager, Nathan R.; Thomsen, Meredith; Adams, Carrie Reinhardt

    2014-01-01

    White-tailed deer (Odocoileus virginianus) forage selectively, modifying upland forest species composition and in some cases shifting ecosystems to alternative stable states. Few studies, however, have investigated plant selection by deer in bottomland forests. Herbaceous invasive species are common in wetlands and their expansion could be promoted if deer avoid them and preferentially feed on native woody species. We surveyed plant species composition and winter deer browsing in 14 floodplain forest restoration sites along the Upper Mississippi River and tributaries. Tree seedling density declined rapidly with increasing cover of invasive Phalaris arundinacea, averaging less than 1 per m2 in all sites in which the grass was present. Deer browsed ∼46% of available tree seedling stems (branches) at mainland restorations, compared to ∼3% at island sites. Across all tree species, the number of browsed stems increased linearly with the number available and responded unimodally to tree height. Maximum browsing rates were observed on trees with high stem abundances (>10 per plant) and of heights between 50 and 150 cm. Deer preferred Ulmus americana and Acer saccharinum, and avoided Fraxinus pennsylvanica, Acer negundo, and Quercus spp. at mainland sites, and did not browse Phalaris arundinacea if present. Depending on plant growth responses to herbivory and the competitive effects of unbrowsed species, our results suggest that selective foraging could promote the expansion of invasive species and/or alter tree species composition in bottomland forest restorations. Islands may, however, serve as refuges from browsing on a regional scale.

  7. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    Science.gov (United States)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

    Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014, Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367) tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), the use of WCPS is applied to diverse datasets on the Moon, as well as Mercury. Other Solar System Bodies are going to be progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as well as derived imagery colour combination products, dynamically generated and accessed also through the OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the Open Source NASA WorldWind (e.g. Hogan, 2011) virtual globe as visualisation engine, and the array database Rasdaman Community Edition as core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible on http://planetserver.eu. All its code base is going to be available on GitHub, on

  8. Configuring a Web Server in the Windows Server 2003 Environment

    Institute of Scientific and Technical Information of China (English)

    贾燕茹

    2008-01-01

    Windows Server 2003 can host a web server in two ways: with the built-in IIS or with third-party software. The Windows Server 2003 family also includes a Web Edition dedicated to web-service-based applications and web interfaces. Taking the IIS 6.0 bundled with Windows Server 2003 as an example, this article discusses the process of setting up a web server.

  9. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    Science.gov (United States)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web-application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back- end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google

  10. Client-Server Connection Status Monitoring Using Ajax Push Technology

    Science.gov (United States)

    Lamongie, Julien R.

    2008-01-01

    This paper describes how simple client-server connection status monitoring can be implemented using Ajax (Asynchronous JavaScript and XML), JSF (Java Server Faces) and ICEfaces technologies. This functionality is required for NASA LCS (Launch Control System) displays used in the firing room for the Constellation project. Two separate implementations based on two distinct approaches are detailed and analyzed.

  11. How to Configure Oracle Enterprise Manager on Windows 2000 Server

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Oracle Enterprise Manager is a system management tool, which provides an integrated solution for centrally managing your heterogeneous server environment. Enterprise Manager combines a graphical Console, Oracle Management Servers, Oracle Intelligent Agents, common services, and tools to provide an integrated, comprehensive systems management platform for managing Oracle products, and is comprised of such as Data

  12. Dynamic Web Pages: Performance Impact on Web Servers.

    Science.gov (United States)

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)
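
    The study builds a multivariate linear regression model to predict Web server performance, but the abstract does not list the predictors. A generic hedged form of such a model, with illustrative predictor names (request rate, script execution time, response size), is:

        T_{\text{response}} = \beta_0 + \beta_1 x_{\text{rate}} + \beta_2 x_{\text{script}} + \beta_3 x_{\text{size}} + \varepsilon

    where the coefficients \beta_i would be fitted from measured CGI, FastCGI, or Servlet runs and \varepsilon is the residual error term.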

  13. Enhanced networked server management with random remote backups

    Science.gov (United States)

    Kim, Song-Kyoo

    2003-08-01

    In this paper, the model is focused on available server management in network environments. The (remote) backup servers are hooked up by VPN (Virtual Private Network) and replace broken main servers immediately. A virtual private network (VPN) is a way to use a public network infrastructure and hooks up long-distance servers within a single network infrastructure. The servers can be represented as "machines", and the system then deals with unreliable main machines and random auxiliary spare (remote backup) machines. When the system performs mandatory routine maintenance, auxiliary machines are used for backups during idle periods. Unlike other existing models, the availability of auxiliary machines changes for each activation in this enhanced model. Analytically tractable results are obtained by using several mathematical techniques, and the results are demonstrated in the framework of optimized networked server allocation problems.

  14. Network Congestion Control in 4G Technology Through Iterative Server

    Directory of Open Access Journals (Sweden)

    Khaleel Ahmad

    2012-07-01

    Full Text Available During the last few decades, mobile communication has developed rapidly. The increasing dependency of people on telecommunication resources is pushing current technological developments in the mobile world even further. Real-time multimedia applications, such as live TV, live movies, video conferencing, VoIP and online gaming, are exciting applications central to the success of 4G. In today's Internet these applications are not subject to congestion control, so the growing popularity of these applications may endanger the stability of the Internet. In this paper, we propose a novel model to solve the network congestion problem through an iterative server. In this model, when a client sends a request to the server, the server generates an individual iterative server for the requesting client. After completing the request, the iterative server is automatically destroyed.
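
    A minimal sketch of the per-client "iterative server" idea, assuming a plain TCP socket and a throwaway echo-style protocol: the listening server creates a short-lived handler for each request, and the handler goes away as soon as the request is answered. The port number and protocol are placeholders:

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.io.PrintWriter;
        import java.net.ServerSocket;
        import java.net.Socket;

        // One dedicated, short-lived handler per client request.
        public class PerClientHandlerServer {
            public static void main(String[] args) throws Exception {
                try (ServerSocket listener = new ServerSocket(9090)) {
                    while (true) {
                        Socket client = listener.accept();
                        // The handler terminates (is "destroyed") right after
                        // the reply is sent.
                        new Thread(() -> serveOnce(client)).start();
                    }
                }
            }

            private static void serveOnce(Socket client) {
                try (client;
                     BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    String request = in.readLine();
                    out.println("served: " + request);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }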

  15. Creating a Model HTTP Server Program Using Java

    CERN Document Server

    Veerasamy, Bala Dhandayuthapani

    2010-01-01

    An HTTP server is a computer program that serves webpage content to clients. A webpage is a document or resource of information that is suitable for the World Wide Web and can be accessed through a web browser and displayed on a computer screen. This information is usually in HTML format, and may provide navigation to other webpages via hypertext links. Webpages may be retrieved from a local computer or from a remote HTTP server. Webpages are requested and served from HTTP servers using the Hypertext Transfer Protocol (HTTP). Webpages may consist of files of static or dynamic text stored within the HTTP server's file system. Client-side scripting can make webpages more responsive to user input once in the client browser. This paper encompasses the creation of an HTTP server program using the Java language, with basic support for HTML and JavaScript.
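
    A minimal HTTP server sketch in the spirit of the paper, using the JDK's built-in com.sun.net.httpserver package rather than the author's own socket code; the port and page content are placeholders:

        import com.sun.net.httpserver.HttpServer;
        import java.io.OutputStream;
        import java.net.InetSocketAddress;
        import java.nio.charset.StandardCharsets;

        // Serves a small static HTML page (with a line of JavaScript) over HTTP.
        public class ModelHttpServer {
            public static void main(String[] args) throws Exception {
                HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
                server.createContext("/", exchange -> {
                    String page = "<html><body><h1>Hello</h1>"
                            + "<script>document.title = 'served by ModelHttpServer';</script>"
                            + "</body></html>";
                    byte[] body = page.getBytes(StandardCharsets.UTF_8);
                    exchange.getResponseHeaders().add("Content-Type", "text/html; charset=utf-8");
                    exchange.sendResponseHeaders(200, body.length);
                    try (OutputStream os = exchange.getResponseBody()) {
                        os.write(body);
                    }
                });
                server.start();
                System.out.println("Listening on http://localhost:8080/");
            }
        }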

  16. What New Features Does Server Core Have in Windows Server 2008 R2?

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Server Core is a popular feature of Windows Server 2008: an installation option with a smaller resource footprint and less maintenance. Server Core supports many useful roles and features, but lacks some key ones (such as .NET Framework support, which means it has no PowerShell). Windows Server 2008 R2 addresses many of these leftover issues. The main new features of Server Core are the following:

  17. Impacts of white-tailed deer on red trillium (Trillium recurvatum): defining a threshold for deer browsing pressure at the Indiana Dunes National Lakeshore

    Science.gov (United States)

    Pavlovic, Noel B.; Leicht-Young, Stacey A.; Grundel, Ralph

    2014-01-01

    Overabundant white-tailed deer (Odocoileus virginianus) have been a concern for land managers in eastern North America because of their impacts on native forest ecosystems. Managers have sought native plant species to serve as phytoindicators of deer impacts to supplement deer surveys. We analyzed experimental data about red trillium (Trillium recurvatum), large flowered trillium (T. grandiflorum), nodding trillium (T. cernuum), and declined trillium (T. flexipes) growth in paired exclosure (fenced) plots and control (unfenced) plots from 2002 to 2010 at the Indiana Dunes National Lakeshore. The latter two species lacked replication, so statistical analysis was not possible. All red trillium plants were surveyed for height-to-leaf, effects of browsing, and presence of flowers. Data from individuals in 2009 demonstrated a sigmoidal relationship between height-to-leaf and probability of flowering. The relationship on moraine soils was shifted to taller plants compared to those on sand substrates, with respectively 50 percent flowering at 18 and 16 cm and 33 percent flowering at 16 and 14 cm height-to-leaf. On a plot basis, the proportion of plants flowering was influenced by height to leaf, duration of protection, and deviation in rainfall. The proportion of plants flowering increased ninefold in exclosures (28 percent) compared to control plots (3 percent) over the 8 years of protection. The mean height-to-leaf was a function of the interaction between treatment and duration, as well as red trillium density. Changes in height-to-leaf in control plots from year to year were significantly influenced by an interaction between change in deer density and change in snowfall depth. There was a significant negative correlation between change in deer density and snowfall depth. Plants in the exclosures increased in height at a rate of 1.5 cm yr−1 whereas control plants decreased in height by 0.9 cm yr−1. In all, 78 percent of the control plots lacked flowering

  18. Oceanotron, Scalable Server for Marine Observations

    Science.gov (United States)

    Loubrieu, T.; Bregent, S.; Blower, J. D.; Griffiths, G.

    2013-12-01

    Ifremer, the French marine institute, is deeply involved in data management for different ocean in-situ observation programs (ARGO, OceanSites, GOSUD, ...) and other European programs aiming at networking ocean in-situ observation data repositories (myOcean, seaDataNet, Emodnet). To capitalize on the effort of implementing advanced data dissemination services (visualization, download with subsetting) for these programs and, generally speaking, for water-column observation repositories, Ifremer decided to develop the oceanotron server (2010). Knowing the diversity of data repository formats (RDBMS, netCDF, ODV, ...) and the temperamental nature of the standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OpeNDAP, ...), the server is designed to manage plugins: - StorageUnits: which enable reading specific data repository formats (netCDF/OceanSites, RDBMS schema, ODV binary format). - FrontDesks: which get external requests and send results for interoperable protocols (OGC/WMS, OGC/SOS, OpenDAP). In between, a third type of plugin may be inserted: - TransformationUnits: which enable ocean-business-related transformation of the features (for example conversion of vertical coordinates from pressure in dB to meters under the sea surface). The server is released under an open-source license so that partners can develop their own plugins. Within the MyOcean project, the University of Reading has plugged in a WMS implementation as an oceanotron frontdesk. The modules are connected together by sharing the same information model for marine observations (or sampling features: vertical profiles, point series and trajectories), dataset metadata and queries. The shared information model is based on the OGC/Observation & Measurement and Unidata/Common Data Model initiatives. The model is implemented in java (http://www.ifremer.fr/isi/oceanotron/javadoc/). This inner-interoperability level enables to capitalize ocean business expertise in software development without being indentured to
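
    A hedged sketch of the plugin contract implied by the abstract: storage units read native repositories into a shared observation model, optional transformation units rework the features, and front desks expose them through an interoperable protocol. The interface and class names below are illustrative, not the actual oceanotron API:

        import java.util.List;

        // Toy version of the storage / transformation / front-desk plugin split.
        public class OceanotronPluginSketch {

            // Shared information model for sampling features (profiles, time
            // series, trajectories) -- reduced here to an id plus values.
            record Feature(String id, List<Double> values) {}

            interface StorageUnit {                       // e.g. netCDF/OceanSites, RDBMS, ODV readers
                List<Feature> read(String query);
            }

            interface TransformationUnit {                // e.g. pressure-to-depth conversion
                Feature transform(Feature f);
            }

            interface FrontDesk {                         // e.g. OGC/WMS, OGC/SOS, OPeNDAP endpoints
                String respond(List<Feature> features);
            }

            public static void main(String[] args) {
                StorageUnit storage = q -> List.of(new Feature("profile-1", List.of(10.2, 10.0, 9.7)));
                TransformationUnit toDepth = f -> new Feature(f.id() + "-depth", f.values());
                FrontDesk frontDesk = fs -> "served " + fs.size() + " feature(s)";

                List<Feature> features = storage.read("platform=buoy-42").stream()
                        .map(toDepth::transform)
                        .toList();
                System.out.println(frontDesk.respond(features));
            }
        }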

  19. Outdoor Urban Propagation Experiment of a Handset MIMO Antenna with a Human Phantom located in a Browsing Stance

    DEFF Research Database (Denmark)

    Yamamoto, Atsushi; Hayashi, Toshiteru; Ogawa, Koichi;

    2007-01-01

    Outdoor radio propagation experiments are presented at 2.4 GHz, using a handset MIMO antenna with two monopoles and two planar inverted-F antennas (PIFAs), adjacent to a human phantom in a browsing stance. The propagation test was performed in an urban area of a city, which resulted in non-line-of-sight (NLOS) situations. In our investigation, the 4-by-4 MIMO and SISO channel capacities for the reception signals were evaluated. These measurements show that the handset MIMO antenna, close to the human operator, is capable of MIMO reception.

  20. Secure thin client architecture for DICOM image analysis

    Science.gov (United States)

    Mogatala, Harsha V. R.; Gallet, Jacqueline

    2005-04-01

    This paper presents a concept of Secure Thin Client (STC) Architecture for Digital Imaging and Communications in Medicine (DICOM) image analysis over the Internet. STC Architecture provides in-depth analysis and design of customized reports for DICOM images using drag-and-drop and data warehouse technology. Using a personal computer and a common set of browsing software, STC can be used for analyzing and reporting detailed patient information, type of examinations, date, Computed Tomography (CT) dose index, and other relevant information stored within the image header files as well as in the hospital databases. STC Architecture is a three-tier architecture. The First Tier consists of a drag-and-drop web based interface and web server, which provides customized analysis and reporting ability to the users. The Second Tier consists of an online analytical processing (OLAP) server and database system, which serves fast, real-time, aggregated multi-dimensional data using OLAP technology. The Third Tier consists of a smart algorithm based software program which extracts DICOM tags from CT images in this particular application, irrespective of CT vendor, and transfers these tags into a secure database system. This architecture provides the Winnipeg Regional Health Authority (WRHA) with quality indicators for CT examinations in the hospitals. It also provides health care professionals with an analytical tool to optimize radiation dose and image quality parameters. The information is provided to the user by way of a secure socket layer (SSL) and role based security criteria over the Internet. Although this particular application has been developed for WRHA, this paper also discusses the effort to extend the Architecture to other hospitals in the region. Any DICOM tag from any imaging modality could be tracked with this software.

  1. A Web Server for MACCS Magnetometer Data

    Science.gov (United States)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  2. Microsoft SQL Server 2012 Master Data Services

    OpenAIRE

    Puhakka, Jani

    2014-01-01

    This engineering thesis examined the functions of Microsoft SQL Server 2012 Master Data Services for managing master data. The goal was to form an understanding of the system's functionality and how it can be put to use. The work first introduced the concept of master data and its purpose. A Master Data Services environment was then installed on a virtual machine and the available management tools were explored. Next, the work went through the Master Data Services ...

  3. Record Recommendations for the CERN Document Server

    CERN Document Server

    AUTHOR|(CDS)2096025; Marian, Ludmila

    CERN Document Server (CDS) is the institutional repository of the European Organization for Nuclear Research (CERN). It hosts all the research material produced at CERN, as well as multimedia and administrative documents. It currently has more than 1.5 million records grouped in more than 1000 collections. Its underlying platform is Invenio, an open source digital library system created at CERN. As the size of CDS increases, discovering useful and interesting records becomes more challenging. Therefore, the goal of this work is to create a system that supports the user in the discovery of related interesting records. To achieve this, a set of recommended records are displayed on the record page. These recommended records are based on the analyzed behavior (page views and downloads) of other users. This work will describe the methods and algorithms used for creating, implementing, and the integration with the underlying software platform, Invenio. A very important decision factor when designing a recomme...
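
    A minimal sketch of the behaviour-based recommendation idea described above: records viewed or downloaded by the same users are counted as related, and the records most often co-viewed with the current one are recommended. The user sessions and record ids below are invented sample data, and the actual CDS algorithm may differ:

        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;
        import java.util.stream.Collectors;

        // Recommends records by co-view counts within user sessions.
        public class CoViewRecommender {

            public static void main(String[] args) {
                // Each list is the set of record ids one user viewed or downloaded.
                List<List<String>> sessions = List.of(
                        List.of("rec-1", "rec-2", "rec-3"),
                        List.of("rec-1", "rec-2"),
                        List.of("rec-2", "rec-4"));

                String current = "rec-2";
                Map<String, Integer> coViews = new HashMap<>();
                for (List<String> session : sessions) {
                    if (!session.contains(current)) continue;
                    for (String other : session) {
                        if (!other.equals(current)) coViews.merge(other, 1, Integer::sum);
                    }
                }

                // Highest co-view count first.
                List<String> recommended = coViews.entrySet().stream()
                        .sorted(Map.Entry.<String, Integer>comparingByValue().reversed())
                        .map(Map.Entry::getKey)
                        .collect(Collectors.toList());
                System.out.println("Recommended for " + current + ": " + recommended);
            }
        }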

  4. Optimal Multi-Server Allocation to Parallel Queues With Independent Random Queue-Server Connectivity

    CERN Document Server

    Al-Zubaidy, Hussein; Viniotis, Yannis

    2011-01-01

    We investigate an optimal scheduling problem in a discrete-time system of L parallel queues that are served by K identical, randomly connected servers. Each queue may be connected to a subset of the K servers during any given time slot. This model has been widely used in studies of emerging 3G/4G wireless systems. We introduce the class of Most Balancing (MB) policies and provide their mathematical characterization. We prove that MB policies are optimal; we define optimality as minimization, in the stochastic ordering sense, of a range of cost functions of the queue lengths, including the process of the total number of packets in the system. We use stochastic coupling arguments for our proof. We introduce the Least Connected Server First/Longest Connected Queue (LCSF/LCQ) policy as an easy-to-implement approximation of MB policies. We conduct a simulation study to compare the performance of several policies. The simulation results show that: (a) in all cases, LCSF/LCQ approximations to the MB policies outperform the o...
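
    A hedged sketch of one time slot of the LCSF/LCQ heuristic named in the abstract: servers are taken in order of how few queues they are connected to, and each server serves the longest queue it is connected to. The connectivity matrix and queue lengths are invented example data:

        import java.util.ArrayList;
        import java.util.Comparator;
        import java.util.List;

        // One slot of Least Connected Server First / Longest Connected Queue.
        public class LcsfLcqScheduler {

            public static void main(String[] args) {
                int L = 4, K = 3;
                int[] queueLength = {5, 2, 7, 1};
                boolean[][] connected = {          // connected[k][l]: server k can serve queue l this slot
                    {true, false, true, false},
                    {true, true, false, false},
                    {false, false, true, true},
                };

                // Least Connected Server First.
                List<Integer> servers = new ArrayList<>();
                for (int k = 0; k < K; k++) servers.add(k);
                servers.sort(Comparator.comparingInt(k -> countConnections(connected[k])));

                for (int k : servers) {
                    // Longest Connected Queue for this server.
                    int best = -1;
                    for (int l = 0; l < L; l++) {
                        if (connected[k][l] && (best < 0 || queueLength[l] > queueLength[best])) best = l;
                    }
                    if (best >= 0 && queueLength[best] > 0) {
                        queueLength[best]--;            // serve one packet
                        System.out.println("server " + k + " serves queue " + best);
                    }
                }
            }

            private static int countConnections(boolean[] row) {
                int n = 0;
                for (boolean b : row) if (b) n++;
                return n;
            }
        }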

  5. Analysis of Mobile Application Structure and System Construction Based on WebServer

    Institute of Scientific and Technical Information of China (English)

    王永平

    2014-01-01

    Mobile learning enables learning at any time and any place through portable mobile computing devices. In building the mobile application system, the SSH integration framework is used to develop the Web Server programs, separating presentation logic from control logic so that they are handled by the presentation layer and the business logic layer respectively. This reduces the coupling of the overall system architecture and yields a software structure that is clear, more extensible and more maintainable.

  6. Research and Analysis of a View-Based Security Model for the SQL Server Database

    Institute of Scientific and Technical Information of China (English)

    陈增祥

    2012-01-01

    With the development of information technology and the market, data management is no longer just about storing and managing data; it has evolved into the various data management modes that users require. The security features of a database system are mainly concerned with the data itself, including data security, data integrity, fault recovery and so on. This article analyzes the database security model from the perspective of views, focusing on the security model of SQL Server database views.

  7. An Intra-Server Interconnect Fabric for Heterogeneous Computing

    Institute of Scientific and Technical Information of China (English)

    曹政; 刘小丽; 李强; 刘小兵; 王展; 安学军

    2014-01-01

    With the increasing diversity of application needs and computing units, servers with heterogeneous processors are more and more widespread. However, conventional SMP/ccNUMA server architecture introduces a communication bottleneck between heterogeneous processors and only uses heterogeneous processors as coprocessors, which limits the efficiency and flexibility of using heterogeneous processors. To solve this problem, this paper proposes an intra-server interconnect fabric that supports both intra-server peer-to-peer interconnection and I/O resource sharing among heterogeneous processors. By connecting processors and I/O devices with the proposed fabric, heterogeneous processors can perform direct communication with each other and run in stand-alone mode with shared intra-server resources. We design the proposed fabric by extending the de-facto system I/O bus protocol PCIe (Peripheral Computer Interconnect Express) and implement it with a single chip, cZodiac. By making full use of PCIe's original advantages, the interconnection and the I/O sharing mechanism are lightweight and efficient. Evaluations carried out on both the FPGA (Field Programmable Gate Array) prototype and the cycle-accurate simulator demonstrate that our design is feasible and scalable. In addition, our design is suitable not only for the heterogeneous server but also for the high-density server.

  8. Amino acid sequences used for clustering (Multi FASTA format) - Gclust Server | LSDB Archive [Life Science Database Archive metadata]

    Lifescience Database Archive (English)

    Full Text Available Gclust Server. Data name: Amino acid sequences used for clustering (Multi FASTA format). Description of data contents: Amino acid sequences of predicted proteins and their annotation for 95 organism species, in FASTA format. File: ...5.fa.zip, file size: 161 MB.

  9. The Implementation Strategy and Technique of COM Server

    Institute of Scientific and Technical Information of China (English)

    郑涛; 张德贤

    2001-01-01

    Based on a systematic analysis of the software structure and threading models of the COM server, this paper presents the concrete techniques for building COM servers with MFC and ATL respectively, and finally studies the implementation of several major COM threading models, thus offering support for developing practical COM-based applications with currently available techniques.

  10. Web Instant Messaging System Based on Server Push Technology and PHP

    Institute of Scientific and Technical Information of China (English)

    王振兴; 黄静

    2012-01-01

    Applying the HTTP protocol on the Web side, this work implements a browser-independent, easily portable, high-performance instant web chat system. The system uses the ajax long-polling model of server push technology to build the HTTP communication layer, builds the server-side programs on the open-source LAMP stack, and stores the instant chat content in an XML file system; the front end uses the JavaScript jQuery framework to implement browser-independent ajax client code. The system provides text and emoticon chat modes and is easy to integrate into social networking sites. Project practice shows that the Web instant chat system based on PHP and server push technology is highly stable and has practical value.

  11. The Method of Integrating Windows Server 2003 and Windows NT Server 4.0

    Institute of Scientific and Technical Information of China (English)

    陈征

    2004-01-01

    Through an analysis of the new features of Windows Server 2003, Microsoft's new-generation network operating system, two integration schemes are proposed: a domain trust approach and an added-server approach. They integrate Windows Server 2003 with the Windows NT Server 4.0 network environment of university computer labs, so that each system can play to its own strengths while working securely and stably.

  12. Microsoft SQL Server Reporting Services Recipes for Designing Expert Reports

    CERN Document Server

    Turley, Paul

    2010-01-01

    Learn to design more effective and sophisticated business reports. While most users of SQL Server Reporting Services are now comfortable designing and building simple reports, business today demands increasingly complex reporting. In this book, top Reporting Services design experts have contributed step-by-step recipes for creating various types of reports. Written by well-known SQL Server Reporting Services experts, this book gives you the tools to meet your clients' needs: SQL Server Reporting Services enables you to create a wide variety of reports; This guide helps you customize reports fo

  13. Mac OS X Snow Leopard Server For Dummies

    CERN Document Server

    Rizzo, John

    2009-01-01

    Making Everything Easier! Mac OS X Snow Leopard Server For Dummies. Learn to: set up and configure a Mac network with Snow Leopard Server; administer, secure, and troubleshoot the network; incorporate a Mac subnet into a Windows Active Directory domain; take advantage of Unix power and security. John Rizzo. Want to set up and administer a network even if you don't have an IT department? Read on! Like everything Mac, Snow Leopard Server was designed to be easy to set up and use. Still, there are so many options and features that this book will save you heaps of time and effort. It wa

  14. IBM WebSphere Application Server 8.0 Administration Guide

    CERN Document Server

    Robinson, Steve

    2011-01-01

    IBM WebSphere Application Server 8.0 Administration Guide is a highly practical, example-driven tutorial. You will be introduced to WebSphere Application Server 8.0, and guided through configuration, deployment, and tuning for optimum performance. If you are an administrator who wants to get up and running with IBM WebSphere Application Server 8.0, then this book is not to be missed. Experience with WebSphere and Java would be an advantage, but is not essential.

  15. Profit-Aware Server Allocation for Green Internet Services

    CERN Document Server

    Mazzucco, Michele; Dikaiakos, Marios

    2011-01-01

    A server farm is examined, where a number of servers are used to offer a service to impatient customers. Every completed request generates a certain amount of profit, running servers consume electricity for power and cooling, while waiting customers might leave the system before receiving service if they experience excessive delays. A dynamic allocation policy aiming at satisfying the conflicting goals of maximizing the quality of users' experience while minimizing the cost for the provider is introduced and evaluated. The results of several experiments are described, showing that the proposed scheme performs well under different traffic conditions.

  16. Microsoft Windows Server 2008 R2 Administration Instant Reference

    CERN Document Server

    Hester, Matthew

    2010-01-01

    Windows Server 2008 R2 Administration Instant Reference provides quick referencing for the day-to-day tasks of administrating Microsoft's newest version of Windows Server. This book uses design features such as thumb tabs, secondary and tertiary tables of contents, and special heading treatments to provide quick and easy lookup, as well as quick-reference tables and lists to provide answers on the spot. Covering the essentials of day-to-day tasks Windows Server administrators perform, key topics include: Hyper-V 2.0; DirectAccess; LiveMigration; Automation; Core Active Directory administration

  17. Expert T-SQL window functions in SQL Server

    CERN Document Server

    Kellenberger, Kathi

    2015-01-01

    Expert T-SQL Window Functions in SQL Server takes you from any level of knowledge of windowing functions and turns you into an expert who can use these powerful functions to solve many T-SQL queries. Replace slow cursors and self-joins with queries that are easy to write and fantastically better performing, all through the magic of window functions. First introduced in SQL Server 2005, window functions came into full blossom with SQL Server 2012. They truly are one of the most notable developments in SQL in a decade, and every developer and DBA can benefit from their expressive power in sol

  18. Instant migration from Windows Server 2008 and 2008 R2 to 2012 how-to

    CERN Document Server

    Sivarajan, Santhosh

    2013-01-01

    Presented in a hands-on reference manual style, with real-world scenarios to lead you through each process. This book is intended for Windows server administrators who are performing migrations from their existing Windows Server 2008 / 2008 R2 environment to Windows Server 2012. The reader must be familiar with Windows Server 2008.

  19. The challenge of stabilizing control for queueing systems with unobservable server states

    NARCIS (Netherlands)

    Nazarathy, Y.; Taimre, T.; Asanjarani, A.; Kuhn, J.; Patch, B.; Vuorinen, A.

    2016-01-01

    We address the problem of stabilizing control for complex queueing systems where servers follow unobservable Markovian environments. The controller needs to assign servers to queues without full information about the servers' states. A control challenge is to devise a policy that matches servers to

  20. Effect of Tannin and Species Variation on In vitro Digestibility, Gas, and Methane Production of Tropical Browse Plants.

    Science.gov (United States)

    Gemeda, B S; Hassen, A

    2015-02-01

    Nineteen tanniferous browse plants were collected from South Africa to investigate their digestibility, gas production (GP) characteristics and methane production. Fresh samples were collected, dried in a forced oven, ground and analyzed for nutrient composition. In vitro GP and in vitro organic matter digestibility (IVOMD) were determined using rumen fluid collected, strained and anaerobically prepared. A semi-automated system was used to measure GP by incubating the sample in a shaking incubator at 39°C. There was significant (p < 0.05) variation among species in methane production. Methane production was positively correlated with NDF, ADF, cellulose and hemi-cellulose. Tannin decreased GP, IVOMD, total volatile fatty acid and methane production. The observed low methanogenic potential and substantial ammonia generation of some of the browses might be potentially useful as rumen manipulating agents. However, a systematic evaluation is needed to determine optimum levels of supplementation in a mixed diet in order to attain a maximal depressing effect on enteric CH4 production with a minimal detrimental effect on rumen fermentation of poor quality roughage based diets.

  1. Dry matter and digesta particle size gradients along the goat digestive tract on grass and browse diets.

    Science.gov (United States)

    Clauss, M; Fritz, J; Tschuor, A; Braun, U; Hummel, J; Codron, D

    2017-02-01

    Physical properties of the digesta vary along the ruminant digestive tract. They also vary within the forestomach, leading to varying degrees of rumen contents stratification in 'moose-type' (browsing) and 'cattle-type' (intermediate and grazing) ruminants. We investigated the dry matter concentration (DM) and the mean digesta particle size (MPS) within the forestomach and along the digestive tract in 10 goats fed grass hay or dried browse after a standardized 12-h fast, euthanasia and freezing in the natural position. In all animals, irrespective of diet, DM showed a peak in the omasum and an increase from caecum via colon towards the faeces and a decrease in MPS between the reticulum and the omasum. Both patterns are typical for ruminants in general. In the forestomach, there was little systematic difference between more cranial and more caudal locations ('horizontal stratification'), with the possible exception of large particle segregation in the dorsal rumen blindsac on the grass diet. In contrast, the typical (vertical) contents stratification was evident for DM (with drier contents dorsally) and, to a lower degree, for MPS (with larger particles dorsally). Although evident in both groups, this stratification was more pronounced on the grass diet. The results support the interpretation that differences in rumen contents stratification between ruminants are mainly an effect of species-specific physiology, but can be enhanced due to the diet consumed.

  2. Research on Java-Based Web Server Performance Testing Tools

    Institute of Scientific and Technical Information of China (English)

    贺蕴彬

    2013-01-01

    The continuing growth of the national economy has driven the spread of computer technology, and the number of computer users in China has risen sharply. This greatly increases the load carried by Web servers and leads to response delays during use, so Web server performance testing tools need to be studied in order to improve how Web servers are used. This paper first gives an overview of Java, then describes the analysis and design of a Web server, and finally introduces Web server performance testing tools.
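
    The record does not give the tool's implementation; purely as an illustration of what such a performance test measures, the sketch below fires a batch of concurrent HTTP GET requests at a server and reports latency statistics. The target URL, request count and concurrency level are placeholder values.

```python
# Minimal load-test sketch (illustrative only): issue N concurrent GET requests
# against a Web server and summarise the observed latencies.
import time
import statistics
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def timed_get(url: str) -> float:
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()                      # drain the body so the full response is timed
    return time.perf_counter() - start

def load_test(url: str, requests: int = 100, concurrency: int = 10) -> None:
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed_get, [url] * requests))
    print(f"requests: {requests}, concurrency: {concurrency}")
    print(f"mean latency: {statistics.mean(latencies) * 1000:.1f} ms")
    print(f"95th percentile: {latencies[int(0.95 * len(latencies)) - 1] * 1000:.1f} ms")

if __name__ == "__main__":
    load_test("http://localhost:8080/", requests=100, concurrency=10)   # placeholder URL
```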

  3. Base-on Cloud Computing A new type of distributed application server system design

    Directory of Open Access Journals (Sweden)

    Ying-ying Chen

    2012-11-01

    Full Text Available Current application server systems, such as e-commerce platforms, instant messaging systems and enterprise information systems, can suffer dropped connections and data latency when faced with too many concurrent requests, or because of limitations in the application server and system architecture; in serious cases the server blocks entirely. The new type of application server system contains four parts: a client program, transfer servers, application servers and databases. The application server is the core of the system, and its performance determines the system's performance. The application servers and transfer servers can also be exposed as open web services and deployed as a distributed architecture across a number of hardware servers, which can effectively handle highly concurrent client requests.

  4. Robust biometrics based authentication and key agreement scheme for multi-server environments using smart cards.

    Science.gov (United States)

    Lu, Yanrong; Li, Lixiang; Yang, Xing; Yang, Yixian

    2015-01-01

    Biometrics-based authentication schemes using smart cards have attracted much attention in multi-server environments. Several schemes of this type were proposed in the past. However, many of them were found to have design flaws. This paper concentrates on the security weaknesses of the three-factor authentication scheme by Mishra et al. After careful analysis, we find that their scheme does not really resist replay attacks and fails to provide an efficient password change phase. We further propose an improvement of Mishra et al.'s scheme with the purpose of preventing the security threats of their scheme. We demonstrate that the proposed scheme provides strong authentication against several attacks, including the attacks shown against the original scheme. In addition, we compare the performance and functionality with other multi-server authenticated key agreement schemes.

  5. Robust Biometrics Based Authentication and Key Agreement Scheme for Multi-Server Environments Using Smart Cards

    Science.gov (United States)

    Lu, Yanrong; Li, Lixiang; Yang, Xing; Yang, Yixian

    2015-01-01

    Biometrics-based authentication schemes using smart cards have attracted much attention in multi-server environments. Several schemes of this type were proposed in the past. However, many of them were found to have design flaws. This paper concentrates on the security weaknesses of the three-factor authentication scheme by Mishra et al. After careful analysis, we find that their scheme does not really resist replay attacks and fails to provide an efficient password change phase. We further propose an improvement of Mishra et al.'s scheme with the purpose of preventing the security threats of their scheme. We demonstrate that the proposed scheme provides strong authentication against several attacks, including the attacks shown against the original scheme. In addition, we compare the performance and functionality with other multi-server authenticated key agreement schemes. PMID:25978373

  6. NeisseriaBase: a specialised Neisseria genomic resource and analysis platform.

    Science.gov (United States)

    Zheng, Wenning; Mutha, Naresh V R; Heydari, Hamed; Dutta, Avirup; Siow, Cheuk Chuen; Jakubovics, Nicholas S; Wee, Wei Yee; Tan, Shi Yang; Ang, Mia Yang; Wong, Guat Jah; Choo, Siew Woh

    2016-01-01

    Background. The gram-negative Neisseria is associated with two of the most potent human epidemic diseases: meningococcal meningitis and gonorrhoea. In both cases, disease is caused by bacteria colonizing human mucosal membrane surfaces. Overall, the genus shows great diversity and genetic variation, mainly due to its ability to acquire and incorporate genetic material from a diverse range of sources through horizontal gene transfer. Although a number of databases exist for the Neisseria genomes, they are mostly focused on the pathogenic species. In this study we present the freely available NeisseriaBase, a database dedicated to the genus Neisseria encompassing the complete and draft genomes of 15 pathogenic and commensal Neisseria species. Methods. The genomic data were retrieved from the National Center for Biotechnology Information (NCBI), annotated using the RAST server and then stored in a MySQL database. The protein-coding genes were further analyzed to obtain information such as GC content (%), predicted hydrophobicity and molecular weight (Da) using in-house Perl scripts. The web application was developed following the secure four-tier web application architecture: (1) client workstation, (2) web server, (3) application server, and (4) database server. The web interface was constructed using PHP, JavaScript, jQuery, AJAX and CSS, utilizing the model-view-controller (MVC) framework. The in-house bioinformatics tools implemented in NeisseriaBase were developed using Python, Perl, BioPerl and R. Results. Currently, NeisseriaBase houses 603,500 Coding Sequences (CDSs), 16,071 RNAs and 13,119 tRNA genes from 227 Neisseria genomes. The database is equipped with interactive web interfaces. Incorporation of the JBrowse genome browser in the database enables fast and smooth browsing of Neisseria genomes. NeisseriaBase includes the standard BLAST program to facilitate homology searching, and for Virulence Factor
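
    The per-gene statistics mentioned above (GC content, molecular weight) are computed in NeisseriaBase by in-house Perl scripts that are not shown in the record; the snippet below is an independent, simplified illustration of the same calculations. The example sequence and the average mass per base pair are assumptions.

```python
# Simplified illustration of per-CDS statistics (not NeisseriaBase's own code).
AVG_DA_PER_BP = 650.0  # rough average mass of one double-stranded DNA base pair, in Daltons

def gc_content(seq: str) -> float:
    """GC content of a nucleotide sequence, as a percentage."""
    seq = seq.upper()
    return 100.0 * sum(base in "GC" for base in seq) / len(seq) if seq else 0.0

def approx_molecular_weight(seq: str) -> float:
    """Very rough double-stranded molecular weight estimate in Daltons."""
    return len(seq) * AVG_DA_PER_BP

cds = "ATGGCGTTAACCGGGTAA"   # toy coding sequence
print(f"GC content: {gc_content(cds):.1f}%, approx. MW: {approx_molecular_weight(cds):.0f} Da")
```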

  7. NeisseriaBase: a specialised Neisseria genomic resource and analysis platform

    Directory of Open Access Journals (Sweden)

    Wenning Zheng

    2016-03-01

    Full Text Available Background. The gram-negative Neisseria is associated with two of the most potent human epidemic diseases: meningococcal meningitis and gonorrhoea. In both cases, disease is caused by bacteria colonizing human mucosal membrane surfaces. Overall, the genus shows great diversity and genetic variation, mainly due to its ability to acquire and incorporate genetic material from a diverse range of sources through horizontal gene transfer. Although a number of databases exist for the Neisseria genomes, they are mostly focused on the pathogenic species. In this study we present the freely available NeisseriaBase, a database dedicated to the genus Neisseria encompassing the complete and draft genomes of 15 pathogenic and commensal Neisseria species. Methods. The genomic data were retrieved from the National Center for Biotechnology Information (NCBI), annotated using the RAST server and then stored in a MySQL database. The protein-coding genes were further analyzed to obtain information such as GC content (%), predicted hydrophobicity and molecular weight (Da) using in-house Perl scripts. The web application was developed following the secure four-tier web application architecture: (1) client workstation, (2) web server, (3) application server, and (4) database server. The web interface was constructed using PHP, JavaScript, jQuery, AJAX and CSS, utilizing the model-view-controller (MVC) framework. The in-house bioinformatics tools implemented in NeisseriaBase were developed using Python, Perl, BioPerl and R. Results. Currently, NeisseriaBase houses 603,500 Coding Sequences (CDSs), 16,071 RNAs and 13,119 tRNA genes from 227 Neisseria genomes. The database is equipped with interactive web interfaces. Incorporation of the JBrowse genome browser in the database enables fast and smooth browsing of Neisseria genomes. NeisseriaBase includes the standard BLAST program to facilitate homology searching, and for Virulence

  8. APPLICATION OF CLUSTER-MODEL PARALLEL PROCESSING AS A WEB SERVER

    Directory of Open Access Journals (Sweden)

    Maman Somantri

    2009-06-01

    Full Text Available Parallel processing is a way of increasing data processing speed by carrying out more than one processing task at the same time. One form of parallel processing is the cluster model. Cluster-model parallel processing is used here to serve Web content by building a clustered Web server. The Web server cluster uses Linux Virtual Server (LVS) technology, which can be implemented with NAT, IP tunneling or direct routing and offers four scheduling algorithms. In this study, LVS with NAT is used to build the Web server cluster, together with the Network File System and a Network Block Device used as storage media on the network. In testing the cluster system, the network was first tested to determine system performance, and the cluster was then tested on Web workloads using the WebBench software and benchmark scripts.

  9. LHCb: Fabric Management with Diskless Servers and Quattor on LHCb

    CERN Multimedia

    Schweitzer, P; Brarda, L; Neufeld, N

    2011-01-01

    Large scientific experiments nowadays very often use large computer farms to process the events acquired from the detectors. In LHCb a small sysadmin team manages the 1400 servers of the LHCb Event Filter Farm, but also a wide variety of control servers for the detector electronics and infrastructure computers: file servers, gateways, DNS, DHCP and others. This variety of servers could not be handled without a solid fabric management system. We chose the Quattor toolkit for this task. We will present our use of this toolkit, with an emphasis on how we handle our diskless nodes (Event Filter Farm nodes and computers embedded in the acquisition electronics cards). We will show our current tests of replacing the standard (RedHat/Scientific Linux) way of handling diskless nodes with fusion filesystems, and how this improves fabric management.

  10. DYNAMIC REQUEST DISPATCHING ALGORITHM FOR WEB SERVER CLUSTER

    Institute of Scientific and Technical Information of China (English)

    Yang Zhenjiang; Zhang Deyun; Sun Qindong; Sun Qing

    2006-01-01

    Distributed architectures support increased load on popular web sites by dispatching client requests transparently among multiple servers in a cluster. The Packet Single-Rewriting technique and the client-address hashing algorithm of ONE-IP, which preserve application session affinity, are analysed, and an improved request dispatching algorithm is proposed that is simple, effective and supports dynamic load balancing. In this algorithm the dispatcher decides which server node will process a request by applying a hash function to the client IP address and comparing the result with each server's assigned identifier subset; it adjusts the size of the subset according to the performance and current load of each server, so as to use all servers' resources effectively. Simulation shows that the improved algorithm performs better than the original one.
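
    The record describes the algorithm only in outline; the sketch below is a hedged illustration of the idea, assuming a fixed hash-bucket space in which each back-end server owns a block of buckets sized in proportion to a weight that a real dispatcher would derive from measured load. All names and weights are placeholders.

```python
# Hedged sketch of hash-based request dispatching with session affinity: the client IP
# is hashed into a fixed bucket space, each back-end server owns a block of buckets
# sized proportionally to its weight, and the dispatcher resizes the blocks whenever
# the weights change.
import hashlib

BUCKETS = 1024

def build_bucket_map(weights):
    """Assign each hash bucket to a server, proportionally to the server's weight."""
    total = sum(weights.values())
    servers = list(weights.items())
    bucket_map, assigned = [], 0
    for i, (server, w) in enumerate(servers):
        count = BUCKETS - assigned if i == len(servers) - 1 else round(BUCKETS * w / total)
        bucket_map.extend([server] * count)
        assigned += count
    return bucket_map

def dispatch(client_ip, bucket_map):
    """Map a client IP to the server that owns its hash bucket."""
    digest = hashlib.md5(client_ip.encode()).digest()
    return bucket_map[int.from_bytes(digest[:4], "big") % BUCKETS]

# The same client IP keeps hitting the same server until the weights are rebalanced.
bucket_map = build_bucket_map({"web1": 1.0, "web2": 2.0, "web3": 1.0})
print(dispatch("192.0.2.17", bucket_map))
```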

  11. Improvements to the National Transport Code Collaboration Data Server

    Science.gov (United States)

    Alexander, David A.

    2001-10-01

    The data server of the National Transport Code Collaboration Project provides a universal network interface to interpolated or raw transport data accessible through a universal set of names. Data can be acquired from a local copy of the International Multi-Tokamak (ITER) profile database as well as from TRANSP trees of MDS Plus data systems on the net. Data are provided to the user's network client via a CORBA interface, thus providing stateful data server instances, which have the advantage of remembering the desired interpolation, data set, etc. This paper will review the status of the data server and discuss recent improvements, such as its modularization and the addition of hdf5 and MDS Plus data file writing capability.

  12. License - Gclust Server | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)


  13. A Web-Based Airborne Remote Sensing Telemetry Server Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A Web-based Airborne Remote Sensing Telemetry Server (WARSTS) is proposed to integrate UAV telemetry and web-technology into an innovative communication, command,...

  14. Comparing Server Energy Use and Efficiency Using Small Sample Sizes

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Henry C.; Qin, Yong; Price, Phillip N.

    2014-11-01

    This report documents a demonstration that compared the energy consumption and efficiency of a limited sample size of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used. However, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications: three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom, outside of the commodity component constraints, provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify the server efficiency for three different brands. The efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry-standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results, it could not be concluded that one brand is more efficient than the other brands. The test results show that the power consumption variability caused by the key components as a
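
    As a worked illustration of the efficiency metric defined above (average compute rate divided by average power draw), the short sketch below computes it from a few paired benchmark samples; the numbers are invented, not measurements from the report.

```python
# Efficiency = average compute rate / average power draw, from paired benchmark samples.
samples = [
    # (computations per second, watts) at different load levels -- illustrative values
    (1200.0, 180.0),
    (2400.0, 250.0),
    (3600.0, 310.0),
]

mean_rate = sum(rate for rate, _ in samples) / len(samples)
mean_power = sum(power for _, power in samples) / len(samples)
efficiency = mean_rate / mean_power   # computations per second per watt

print(f"efficiency: {efficiency:.2f} computations/s per W")
```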

  15. Ecological role of reindeer summer browsing in the mountain birch (Betula pubescens ssp. czerepanovii) forests: effects on plant defense, litter decomposition, and soil nutrient cycling.

    Science.gov (United States)

    Stark, Sari; Julkunen-Tiitto, Riitta; Kumpula, Jouko

    2007-03-01

    Mammalian herbivores commonly alter the concentrations of secondary compounds in plants and, by this mechanism, have indirect effects on litter decomposition and soil carbon and nutrient cycling. In northernmost Fennoscandia, the subarctic mountain birch (Betula pubescens ssp. czerepanovii) forests are important pasture for the semidomestic reindeer (Rangifer tarandus). In the summer ranges, mountain birches are intensively browsed, whereas in the winter ranges, reindeer feed on ground lichens, and the mountain birches remain intact. We analyzed the effect of summer browsing on the concentrations of secondary substances, litter decomposition, and soil nutrient pools in areas that had been separated as summer or winter ranges for at least 20 years, and we predicted that summer browsing may reduce levels of secondary compounds in the mountain birch and, by this mechanism, have an indirect effect on the decomposition of mountain birch leaf litter and soil nutrient cycling. The effect of browsing on the concentration of secondary substances in the mountain birch leaves varied between different years and management districts, but in some cases, the concentration of condensed tannins was lower in the summer than in the winter ranges. In a reciprocal litter decomposition trial, both litter origin and emplacement significantly affected the litter decomposition rate. Decomposition rates were faster for the litter originating from and placed into the summer range. Soil inorganic nitrogen (N) concentrations were higher in the summer than in the winter ranges, which indicates that reindeer summer browsing may enhance the soil nutrient cycling. There was a tight inverse relationship between soil N and foliar tannin concentrations in the winter range but not in the summer range. This suggests that in these strongly nutrient-limited ecosystems, soil N availability regulates the patterns of resource allocation to condensed tannins in the absence but not in the presence of browsing.

  16. Limit Theorems For Closed Queuing Networks With Excess Of Servers

    OpenAIRE

    Tsitsiashvili, G.

    2013-01-01

    In this paper limit theorems for closed queueing networks with an excess of servers are formulated and proved. The first theorem is a variant of the central limit theorem and is proved using classical results of V.I. Romanovskiy for discrete Markov chains. The second theorem establishes convergence to a chi-square distribution. These theorems rest mainly on the assumption of an excess of servers in the queueing nodes.

  17. An efficient solution to Web server based automatic packaging

    Institute of Scientific and Technical Information of China (English)

    JIANG Jian-ping; DING Xing-miao

    2004-01-01

    Manually packaging the content of a web server for downloading is inefficient. The paper introduces a Web server agent that automates packaging: it self-organizes the content of the pages and frees the site administrator from the heavy burden of manually packaging the web pages downloaded by web browsers. An example illustrates the details of the solution, which is implemented on the J2EE architecture.
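
    The paper's agent is implemented on J2EE and is not reproduced here; as a minimal illustration of the underlying idea (bundling a site's content automatically instead of by hand), the sketch below walks a document root and packages every file into a single zip archive. The directory and archive names are placeholders.

```python
# Minimal sketch of automatic content packaging: bundle a document root into one archive.
import zipfile
from pathlib import Path

def package_site(doc_root: str, archive: str) -> None:
    root = Path(doc_root)
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in root.rglob("*"):
            if path.is_file():
                zf.write(path, path.relative_to(root))  # keep paths relative to the root

# Example (placeholder paths):
# package_site("/var/www/html", "site-package.zip")
```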

  18. EarthServer: an Intercontinental Collaboration on Petascale Datacubes

    Science.gov (United States)

    Baumann, P.; Rossi, A. P.

    2015-12-01

    With the unprecedented increase of orbital sensor, in-situ measurement, and simulation data there is a rich, yet not leveraged potential for getting insights from dissecting datasets and rejoining them with other datasets. Obviously, the goal is to allow users to "ask any question, any time", thereby enabling them to "build their own product on the go". One of the most influential initiatives in Big Geo Data is EarthServer, which has demonstrated new directions for flexible, scalable EO services based on innovative NewSQL technology. Researchers from Europe, the US and recently Australia have teamed up to rigorously materialize the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently from whatever efficient data structuring a server network may perform internally, users will always see just a few datacubes they can slice and dice. EarthServer has established client and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman, enables direct interaction, including 3-D visualization, what-if scenarios, common EO data processing, and general analytics. Services exclusively rely on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS) including the Web Coverage Processing Service (WCPS). Conversely, EarthServer has significantly shaped and advanced the OGC Big Geo Data standards landscape based on the experience gained. Phase 1 of EarthServer has advanced scalable array database technology into 100+ TB services; in phase 2, Petabyte datacubes will be built in Europe and Australia to perform ad-hoc querying and merging. Standing between EarthServer phase 1 (from 2011 through 2014) and phase 2 (from 2015 through 2018) we present the main results and outline the impact on the international standards landscape; effectively, the Big Geo Data standards established through initiative of
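
    In EarthServer the slicing and dicing happens server-side through OGC WCS/WCPS queries against the rasdaman array engine; the NumPy sketch below is only a local, illustrative stand-in for the same datacube operations on a toy (time, lat, lon) array.

```python
# Illustrative datacube "slice and dice" on a toy 3-D array; not EarthServer/rasdaman code.
import numpy as np

cube = np.random.rand(365, 180, 360)          # one year of daily global grids (toy data)

time_series = cube[:, 52, 13]                 # all time steps at one grid cell
spatial_slice = cube[200, 40:80, 300:340]     # one day over a lat/lon window
monthly_mean = cube[:31].mean(axis=0)         # aggregate over the first month

print(time_series.shape, spatial_slice.shape, monthly_mean.shape)
```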

  19. Sirocco Storage Server v. pre-alpha 0.1

    Energy Technology Data Exchange (ETDEWEB)

    2015-12-18

    Sirocco is a parallel storage system under development, designed for write-intensive workloads on large-scale HPC platforms. It implements a key-value object store on top of a set of loosely federated storage servers that cooperate to ensure data integrity and performance. It includes support for a range of different types of storage transactions. This software release constitutes a conformant storage server, along with the client-side libraries to access the storage over a network.

  20. The web server of IBM's Bioinformatics and Pattern Discovery group

    OpenAIRE

    Huynh, Tien; Rigoutsos, Isidore; Parida, Laxmi; Platt, Daniel,; Shibuya, Tetsuo

    2003-01-01

    We herein present and discuss the services and content which are available on the web server of IBM's Bioinformatics and Pattern Discovery group. The server is operational around the clock and provides access to a variety of methods that have been published by the group's members and collaborators. The available tools correspond to applications ranging from the discovery of patterns in streams of events and the computation of multiple sequence alignments, to the discovery of genes in nucleic ...

  1. AN ANALYTICAL STUDY AND SYNTHESIS ON WEB SERVER SECURITY

    Directory of Open Access Journals (Sweden)

    Jyoti Pandey

    2015-10-01

    Full Text Available Web servers are a common means of accessing Internet-based applications, yet current approaches to securing them are often neither efficient nor robust enough to protect the servers and their applications from attackers. Several approaches exist for examining the minimum security requirements of a system; among them, the protection profile offers a systematic approach. We therefore derive the Web security components that make up a secure Web server from a Web Server Protection Profile. A component-based framework and an open source solution are then studied, and we believe that a system built this way, once implemented and deployed, will function reliably and effectively. This work aims at establishing the provable soundness of construction and the feasibility of component-based solutions for a secure Web server. The paper gives a theoretical treatment of profiling a web server that brings together the three basic security models: system security, transmission security and access control.

  2. Can I add Windows Server 2008 R2 to a Windows Server 2008 cluster?

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    No. You cannot mix Windows Server 2008 and Windows Server 2008 R2 nodes in the same cluster. You need to create a new Windows Server 2008 R2 cluster and then migrate the resources from the Windows Server 2008 cluster.

  3. Web Server Security on Open Source Environments

    Science.gov (United States)

    Gkoutzelis, Dimitrios X.; Sardis, Manolis S.

    Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks. Until recently this kind of defense was a privilege of the few, and under-budgeted, low-cost solutions left defenders vulnerable to the rise of innovative attack methods. Luckily, the digital revolution of the past decade left its mark, changing the way we face security forever: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could never have imagined fifteen years ago. Online security of large corporations, military and government bodies is more and more handled by open source applications, thus driving the technological trend of the 21st century of adopting open solutions to E-Commerce and privacy issues. This paper describes substantial security precautions for facing privacy and authentication issues in a totally open source web environment. Our goal is to state the best-known problems in data handling and propose the most appealing techniques to face these challenges through an open solution.

  4. Tiled WMS/KML Server V2

    Science.gov (United States)

    Plesea, Lucian

    2012-01-01

    This software is a higher-performance implementation of tiled WMS, with integral support for KML and time-varying data. The software is compliant with the Open Geospatial Consortium WMS standard, and supports KML natively as a WMS return type, including support for the time attribute. Regionated KML wrappers are generated that match the existing tiled WMS dataset. PNG and JPEG formats are supported, and the software is implemented as an Apache 2.0 module that supports a threading execution model capable of sustaining very high request rates. The module intercepts and responds to WMS requests that match certain patterns and returns the existing tiles. If a KML format that matches an existing pyramid and tile dataset is requested, regionated KML is generated and returned to the requesting application. In addition, KML requests that do not match the existing tile datasets generate a KML response that includes the corresponding JPEG WMS request, effectively adding KML support to a backing WMS server.
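
    For orientation, the sketch below builds the kind of standard WMS 1.1.1 GetMap request such a tiled server answers, with an optional TIME parameter for the time-varying layers mentioned above. The host, layer name and bounding box are illustrative placeholders, not values taken from this software.

```python
# Illustrative construction of a tiled WMS GetMap request (placeholder host/layer/bbox).
from urllib.parse import urlencode

def getmap_url(base: str, layer: str, bbox: tuple, size: int = 512, time: str = "") -> str:
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size,
        "HEIGHT": size,
        "FORMAT": "image/jpeg",
    }
    if time:                     # time-varying layers take an optional TIME parameter
        params["TIME"] = time
    return f"{base}?{urlencode(params)}"

print(getmap_url("http://example.org/wms", "global_mosaic",
                 (-180, -90, 180, 90), time="2012-01-01"))
```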

  5. Constructing a Campus VPN Based on Windows Server 2003

    Institute of Scientific and Technical Information of China (English)

    黄英铭

    2006-01-01

    In recent years VPN technology has gradually matured. Building an Internet-based campus VPN makes secure networked office work possible. This article briefly introduces the key VPN technologies, analyses the VPN features of Windows Server 2003, and proposes a campus VPN solution based on the Windows Server 2003 software platform.

  6. Downsizer - A Graphical User Interface-Based Application for Browsing, Acquiring, and Formatting Time-Series Data for Hydrologic Modeling

    Science.gov (United States)

    Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.

    2009-01-01

    The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.

  7. Open Polar Server (OPS—An Open Source Infrastructure for the Cryosphere Community

    Directory of Open Access Journals (Sweden)

    Weibo Liu

    2016-03-01

    Full Text Available The Center for Remote Sensing of Ice Sheets (CReSIS) at the University of Kansas has collected approximately 1000 terabytes (TB) of radar depth sounding data over the Arctic and Antarctic ice sheets since 1993 in an effort to map the thickness of the ice sheets and ultimately understand the impacts of climate change and sea level rise. In addition to data collection, the storage, management, and public distribution of the dataset are also primary roles of the CReSIS. The Open Polar Server (OPS) project developed a free and open source infrastructure to store, manage, analyze, and distribute the data collected by CReSIS in an effort to replace its current data storage and distribution approach. The OPS infrastructure includes a spatial database management system (DBMS), map and web server, JavaScript geoportal, and MATLAB application programming interface (API) for the inclusion of data created by the cryosphere community. Open source software including GeoServer, PostgreSQL, PostGIS, OpenLayers, ExtJS, GeoEXT and others are used to build a system that modernizes the CReSIS data distribution for the entire cryosphere community and creates a flexible platform for future development. Usability analysis demonstrates the OPS infrastructure provides an improved end user experience. In addition, interpolating glacier topography is provided as an application example of the system.

  8. A Study on Partnering Mechanism in B to B EC Server for Global Supply Chain Management

    Science.gov (United States)

    Kaihara, Toshiya

    B to B Electronic Commerce (EC) technology is developing rapidly and is regarded as an information infrastructure for global business. As the number and diversity of EC participants grows in this agile environment, the complexity of purchasing from a vast and dynamic array of goods and services needs to be hidden from the end user. Putting the complexity into the EC system instead means providing a flexible auction server that enables commerce between different business units. A market mechanism can solve the product distribution problem in the auction server by allocating the scheduled resources according to market prices. In this paper, we propose a partnering mechanism for B to B EC based on market-oriented programming that mediates among various, unspecified companies in a trade, and we demonstrate the applicability of economic analysis to this framework after constructing a primitive EC server. The proposed mechanism facilitates sophisticated B to B EC and yields a Pareto optimal solution for all the participating business units in the coming agile era.

  9. Serving database information using a flexible server in a three tier architecture

    Energy Technology Data Exchange (ETDEWEB)

    Lee Lueking et al.

    2003-08-11

    The D0 experiment at Fermilab relies on a central Oracle database for storing all detector calibration information. Access to this data is needed by hundreds of physics applications distributed worldwide. In order to meet the demands of these applications from scarce resources, we have created a distributed system that isolates the user applications from the database facilities. This system, known as the Database Application Network (DAN) operates as the middle tier in a three tier architecture. A DAN server employs a hierarchical caching scheme and database connection management facility that limits access to the database resource. The modular design allows for caching strategies and database access components to be determined by runtime configuration. To solve scalability problems, a proxy database component allows for DAN servers to be arranged in a hierarchy. Also included is an event based monitoring system that is currently being used to collect statistics for performance analysis and problem diagnosis. DAN servers are currently implemented as a Python multithreaded program using CORBA for network communications and interface specification. The requirement details, design, and implementation of DAN are discussed along with operational experience and future plans.
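
    DAN's actual middle tier uses CORBA, a hierarchical cache and runtime-configurable access components; the sketch below is only a minimal illustration of the general pattern the record describes, namely a caching layer that answers repeated lookups itself and throttles how many concurrent requests reach the scarce database resource. The fetch function and key format are invented for the example.

```python
# Minimal middle-tier caching sketch: serve repeats from a cache and limit concurrent
# access to the underlying database (illustrative only, not the DAN implementation).
import threading

class MiddleTierCache:
    def __init__(self, fetch_from_db, max_db_connections: int = 4):
        self._fetch = fetch_from_db
        self._cache = {}
        self._lock = threading.Lock()
        self._db_slots = threading.Semaphore(max_db_connections)

    def get(self, key):
        with self._lock:
            if key in self._cache:
                return self._cache[key]          # served from cache, no database access
        with self._db_slots:                     # cap concurrent database connections
            value = self._fetch(key)
        with self._lock:
            self._cache[key] = value
        return value

# Usage: wrap an (expensive) calibration lookup with the caching tier.
cache = MiddleTierCache(lambda key: f"calibration-data-for-{key}")
print(cache.get("run-12345"))
print(cache.get("run-12345"))   # second call is answered from the cache
```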

  10. ConoServer: updated content, knowledge, and discovery tools in the conopeptide database.

    Science.gov (United States)

    Kaas, Quentin; Yu, Rilei; Jin, Ai-Hua; Dutertre, Sébastien; Craik, David J

    2012-01-01

    ConoServer (http://www.conoserver.org) is a database specializing in the sequences and structures of conopeptides, which are toxins expressed by marine cone snails. Cone snails are carnivorous gastropods, which hunt their prey using a cocktail of toxins that potently subvert nervous system function. The ability of these toxins to specifically target receptors, channels and transporters of the nervous system has attracted considerable interest for their use in physiological research and as drug leads. Since the founding publication on ConoServer in 2008, the number of entries in the database has nearly doubled, the interface has been redesigned and new annotations have been added, including a more detailed description of cone snail species, biological activity measurements and information regarding the identification of each sequence. Automatically updated statistics on classification schemes, three-dimensional structures, conopeptide-bearing species and endoplasmic reticulum signal sequence conservation trends provide a convenient overview of current knowledge on conopeptides. Transcriptomics and proteomics have begun generating massive numbers of new conopeptide sequences, and two dedicated tools have recently been implemented in ConoServer to standardize the analysis of conopeptide precursor sequences and to help in the identification by mass spectrometry of toxins whose sequences were predicted at the nucleic acid level.

  11. OrthoSelect: a web server for selecting orthologous gene alignments from EST sequences.

    Science.gov (United States)

    Schreiber, Fabian; Wörheide, Gert; Morgenstern, Burkhard

    2009-07-01

    In the absence of whole genome sequences for many organisms, the use of expressed sequence tags (EST) offers an affordable approach for researchers conducting phylogenetic analyses to gain insight about the evolutionary history of organisms. Reliable alignments for phylogenomic analyses are based on orthologous gene sequences from different taxa. So far, researchers have not sufficiently tackled the problem of the completely automated construction of such datasets. Existing software tools are either semi-automated, covering only part of the necessary data processing, or implemented as a pipeline, requiring the installation and configuration of a cascade of external tools, which may be time-consuming and hard to manage. To simplify data set construction for phylogenomic studies, we set up a web server that uses our recently developed OrthoSelect approach. To the best of our knowledge, our web server is the first web-based EST analysis pipeline that allows the detection of orthologous gene sequences in EST libraries and outputs orthologous gene alignments. Additionally, OrthoSelect provides the user with an extensive results section that lists and visualizes all important results, such as annotations, data matrices for each gene/taxon and orthologous gene alignments. The web server is available at http://orthoselect.gobics.de.

  12. A More Complete Model for TCP Connections Established between One Server and Many Receivers

    Institute of Scientific and Technical Information of China (English)

    LINYu; CHENGShiduan; WUHaitao; WANGChonggang

    2003-01-01

    Different from previous TCP (Transmission Control Protocol) modeling works, this paper presents a more complete analytical model of multiple TCP connections established between a busy server and multiple receivers under two distinct cases: the case where there is sufficient bandwidth and the case where there is a bandwidth bottleneck link between the server and the receivers. In the former case the server becomes the bottleneck of the whole system, and TCP behaviour differs from previously presented models. In the latter case, multiple TCP connections share the bandwidth of the bottleneck link. Based on an analysis of the working flows in the system and an M/G/1 queueing model, the RTT and long-term TCP throughput formulae are derived in terms of the number of TCP connections, the packet loss rate, and the end-to-end delay. The effect of the maximum window size is also investigated. Simulation results confirm that the new model is more accurate than the previous one.
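
    The paper's own RTT and throughput formulae are not reproduced in this record. Purely as a point of reference for how loss rate and round-trip time enter such models, the sketch below evaluates the well-known Mathis et al. square-root approximation for the long-term throughput of a single TCP connection; it is not the model proposed in the paper.

```python
# Mathis et al. approximation: throughput ~ (MSS / RTT) * sqrt(3/2) / sqrt(p).
import math

def tcp_throughput_bps(mss_bytes: int, rtt_s: float, loss_rate: float) -> float:
    """Long-term throughput estimate in bits per second for one TCP connection."""
    return (mss_bytes * 8 / rtt_s) * math.sqrt(1.5) / math.sqrt(loss_rate)

# Example: 1460-byte segments, 50 ms RTT, 1% packet loss.
print(f"{tcp_throughput_bps(1460, 0.05, 0.01) / 1e6:.2f} Mbit/s")
```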

  13. Improving Performance on WWW using Intelligent Predictive Caching for Web Proxy Servers

    Directory of Open Access Journals (Sweden)

    J. B. Patil

    2011-01-01

    Full Text Available Web proxy caching is used to improve the performance of the Web infrastructure. It aims to reduce network traffic, server load, and user-perceived retrieval delays. The heart of a caching system is its page replacement policy, which needs to make good replacement decisions when its cache is full and a new document needs to be stored. The latest and most popular replacement policies like GDSF and GDSF# use the file size, access frequency, and age in the decision process. The effectiveness of any replacement policy can be evaluated using two metrics: hit ratio (HR) and byte hit ratio (BHR). There is always a trade-off between HR and BHR. In this paper, using three different Web proxy server logs, we use trace-driven analysis to evaluate the effects of different replacement policies on the performance of a Web proxy server. We propose a modification of the GDSF# policy, IPGDSF#. Our simulation results show that our proposed replacement policy IPGDSF# performs better than several policies proposed in the literature in terms of hit rate as well as byte hit rate.
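
    The record names GDSF and GDSF# but does not spell out the key computation; the sketch below illustrates the classic Greedy-Dual-Size-Frequency idea those policies build on (key = aging term + frequency * cost / size, evict the smallest key). It is not the IPGDSF# policy proposed in the paper.

```python
# Illustrative GDSF-style cache: evict the entry with the smallest key and age the cache.
class GDSFCache:
    def __init__(self, capacity_bytes: int):
        self.capacity = capacity_bytes
        self.used = 0
        self.L = 0.0                       # inflation ("age") factor
        self.entries = {}                  # url -> {"size", "freq", "key"}

    def _key(self, freq: int, size: int, cost: float = 1.0) -> float:
        return self.L + freq * cost / size

    def access(self, url: str, size: int) -> None:
        if url in self.entries:            # hit: bump frequency and recompute key
            e = self.entries[url]
            e["freq"] += 1
            e["key"] = self._key(e["freq"], e["size"])
            return
        while self.used + size > self.capacity and self.entries:
            victim = min(self.entries, key=lambda u: self.entries[u]["key"])
            self.L = self.entries[victim]["key"]       # raise the aging term
            self.used -= self.entries[victim]["size"]
            del self.entries[victim]
        self.entries[url] = {"size": size, "freq": 1, "key": self._key(1, size)}
        self.used += size

cache = GDSFCache(capacity_bytes=10_000)
cache.access("/index.html", 2_000)
cache.access("/big-image.jpg", 9_000)      # forces eviction of the lowest-key object
print(list(cache.entries))
```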

  14. A Robust Mechanism For Defending Distributed Denial Of Service Attacks On Web Servers

    Directory of Open Access Journals (Sweden)

    Jaydip Sen

    2011-03-01

    Full Text Available Distributed Denial of Service (DDoS) attacks have emerged as a popular means of causing mass targeted service disruptions, often for extended periods of time. The relative ease and low costs of launching such attacks, supplemented by the current inadequate state of any viable defense mechanism, have made them one of the top threats to the Internet community today. Since the increasing popularity of web-based applications has led to several critical services being provided over the Internet, it is imperative to monitor the network traffic so as to prevent malicious attackers from depleting the resources of the network and denying services to legitimate users. This paper first presents a brief discussion on some of the important types of DDoS attacks that currently exist and some existing mechanisms to combat these attacks. It then points out the major drawbacks of the currently existing defense mechanisms and proposes a new mechanism for protecting a web server against a DDoS attack. In the proposed mechanism, incoming traffic to the server is continuously monitored and any abnormal rise in the inbound traffic is immediately detected. The detection algorithm is based on a statistical analysis of the inbound traffic on the server and a robust hypothesis testing framework. While the detection process is on, the sessions from the legitimate sources are not disrupted and the load on the server is restored to the normal level by blocking the traffic from the attacking sources. To cater to different scenarios, the detection algorithm has various modules with varying levels of computational and memory overhead for their execution. While the approximate modules are fast in detection and involve less overhead, they provide a lower level of detection accuracy. The accurate modules employ complex detection logic and hence involve more overhead for their execution. However, they have very high detection accuracy. Simulations carried out on the proposed mechanism have produced
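
    The paper's statistical detector and hypothesis-testing framework are not given in the record; the sketch below is a much simpler stand-in for the same idea, flagging an abnormal rise in the per-interval request count with a z-score against a sliding baseline. The window size, threshold and traffic numbers are illustrative assumptions.

```python
# Simple inbound-traffic anomaly detector: z-score of the current interval's request
# count against a sliding baseline of recent "normal" intervals (illustrative only).
from collections import deque
from statistics import mean, stdev

class RateAnomalyDetector:
    def __init__(self, window: int = 60, z_threshold: float = 4.0):
        self.baseline = deque(maxlen=window)   # recent per-interval request counts
        self.z_threshold = z_threshold

    def observe(self, requests_this_interval: int) -> bool:
        """Return True if this interval's traffic is abnormally high."""
        anomalous = False
        if len(self.baseline) >= 10:
            mu, sigma = mean(self.baseline), stdev(self.baseline)
            if sigma > 0 and (requests_this_interval - mu) / sigma > self.z_threshold:
                anomalous = True
        if not anomalous:                      # only let normal traffic shape the baseline
            self.baseline.append(requests_this_interval)
        return anomalous

detector = RateAnomalyDetector()
normal = [100, 104, 97, 101, 99, 103, 98, 102, 100, 96, 105, 99]
for count in normal + [950]:                   # a sudden spike in the last interval
    if detector.observe(count):
        print(f"possible DDoS: {count} requests in the last interval")
```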

  15. GeoServer: the Open Source geospatial server, what is new in version 2.3.0

    Directory of Open Access Journals (Sweden)

    Simone Giannecchini

    2013-04-01

    Full Text Available GeoServer is an Open Source geospatial server developed with Java Enterprise technology for managing, sharing and editing geospatial data according to the OGC and ISO Technical Committee 211 standards. It provides the basic functionality for creating spatial data infrastructures (SDI) and is designed for interoperability, publishing data from any major spatial data source using open standards: it is the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) and Web Coverage Service (WCS) standards, as well as a high-performance, certified-compliant Web Map Service (WMS). GeoServer forms a core component of the Geospatial Web.

  16. Evaluation of the Intel Westmere-EP server processor

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A; CERN. Geneva. IT Department

    2010-01-01

    In this paper we report on a set of benchmark results recently obtained by CERN openlab when comparing the 6-core “Westmere-EP” processor with Intel’s previous generation of the same microarchitecture, the “Nehalem-EP”. The former is produced in a new 32nm process, the latter in 45nm. Both platforms are dual-socket servers. Multiple benchmarks were used to get a good understanding of the performance of the new processor. We used both industry-standard benchmarks, such as SPEC2006, and specific High Energy Physics benchmarks, representing both simulation of physics detectors and data analysis of physics events. Before summarizing the results we must stress the fact that benchmarking of modern processors is a very complex affair. One has to control (at least) the following features: processor frequency, overclocking via Turbo mode, the number of physical cores in use, the use of logical cores via Simultaneous Multi-Threading (SMT), the cache sizes available, the memory configuration installed, as well...

  17. Evaluation of the Intel Sandy Bridge-EP server processor

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A; CERN. Geneva. IT Department

    2012-01-01

    In this paper we report on a set of benchmark results recently obtained by CERN openlab when comparing an 8-core “Sandy Bridge-EP” processor with Intel’s previous microarchitecture, the “Westmere-EP”. The Intel marketing names for these processors are “Xeon E5-2600 processor series” and “Xeon 5600 processor series”, respectively. Both processors are produced in a 32nm process, and both platforms are dual-socket servers. Multiple benchmarks were used to get a good understanding of the performance of the new processor. We used both industry-standard benchmarks, such as SPEC2006, and specific High Energy Physics benchmarks, representing both simulation of physics detectors and data analysis of physics events. Before summarizing the results we must stress the fact that benchmarking of modern processors is a very complex affair. One has to control (at least) the following features: processor frequency, overclocking via Turbo mode, the number of physical cores in use, the use of logical cores ...

  18. Partial replacement of an artificial nectar diet with native browse for feather-tail gliders (Acrobates pygmaeus) in captivity.

    Science.gov (United States)

    Herrmann, Ella A; Herrin, Kimberly Vinette; Gleen, Wendy; Davies, Paul; Stapley, Rodd; Stebbings, Vanessa; Wiszniewski, Joanna; Spindler, Rebecca; Faichney, Graham J; Chaves, Alexandre V

    2013-01-01

    Captive-bred feather-tail gliders (Acrobates pygmaeus) housed at Taronga Zoo have had a long history of eye cholesterol plaques that may be associated with a largely sugar-based diet such as artificial nectar. The gliders also have prolonged periods of reduced activity when they are not visible in exhibits. This may be due to the ad libitum supply of an energy-rich feed and a reduced need to forage. This study examined behavioral and physiological changes associated with supplementing the high sugar-based diet with two species of native browse. The experiment was conducted over two consecutive periods of 3 weeks and consisted of two treatment groups: one group was offered the artificial nectar only, while the other group was offered the artificial nectar supplemented with a variety of native flowers. Live weight was recorded weekly. There was no change (P > 0.10) in artificial nectar intake with the supplementation of native browse in the diet. Blood metabolites (cholesterol, triglycerides, glucose) tested for the two groups had no differences (P > 0.10) between treatments. Upon examination, there were no signs of tooth decay or cholesterol plaques in any animal throughout the experiment. Feed intake and behavior were recorded via sensor cameras. There was an increase in foraging behavior in gliders offered native flowers compared to gliders fed the artificial nectar alone. In conclusion, supplementing to provide a more native diet to A. pygmaeus enhanced their natural foraging behavior, suggesting that it may result in long-term improvements in their health.

  19. Design and implementation of streaming media server cluster based on FFMpeg.

    Science.gov (United States)

    Zhao, Hong; Zhou, Chun-long; Jin, Bao-zhao

    2015-01-01

    Poor performance and network congestion are commonly observed in single-server streaming media systems. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations, and balance among the servers is maintained by a dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data. The experimental results show that the server cluster system significantly alleviates network congestion and improves performance in comparison with a single-server system.
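
    The record does not give the load-balancing algorithm itself; the sketch below is a hedged illustration of the active-feedback idea it describes, with each server reporting its load and new users assigned to the least-loaded server in their region. Server names, regions and load values are placeholders.

```python
# Feedback-driven assignment sketch: pick the least-loaded server in the user's region,
# falling back to the globally least-loaded server (illustrative, not the paper's code).
class FeedbackBalancer:
    def __init__(self):
        self.servers = {}                  # name -> {"region": str, "load": float}

    def report(self, name: str, region: str, load: float) -> None:
        """Called by each streaming server's periodic load report."""
        self.servers[name] = {"region": region, "load": load}

    def assign(self, user_region: str) -> str:
        """Prefer servers in the user's region; otherwise use the global least-loaded one."""
        local = {n: s for n, s in self.servers.items() if s["region"] == user_region}
        pool = local or self.servers
        return min(pool, key=lambda n: pool[n]["load"])

balancer = FeedbackBalancer()
balancer.report("media-eu-1", "eu", load=0.72)
balancer.report("media-eu-2", "eu", load=0.35)
balancer.report("media-us-1", "us", load=0.10)
print(balancer.assign("eu"))   # -> media-eu-2
```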

  20. METAGENassist: a comprehensive web server for comparative metagenomics.

    Science.gov (United States)

    Arndt, David; Xia, Jianguo; Liu, Yifeng; Zhou, You; Guo, An Chi; Cruz, Joseph A; Sinelnikov, Igor; Budwill, Karen; Nesbø, Camilla L; Wishart, David S

    2012-07-01

    With recent improvements in DNA sequencing and sample extraction techniques, the quantity and quality of metagenomic data are now growing exponentially. This abundance of richly annotated metagenomic data and bacterial census information has spawned a new branch of microbiology called comparative metagenomics. Comparative metagenomics involves the comparison of bacterial populations between different environmental samples, different culture conditions or different microbial hosts. However, in order to do comparative metagenomics, one typically requires a sophisticated knowledge of multivariate statistics and/or advanced software programming skills. To make comparative metagenomics more accessible to microbiologists, we have developed a freely accessible, easy-to-use web server for comparative metagenomic analysis called METAGENassist. Users can upload their bacterial census data from a wide variety of common formats, using either amplified 16S rRNA data or shotgun metagenomic data. Metadata concerning environmental, culture, or host conditions can also be uploaded. During the data upload process, METAGENassist also performs an automated taxonomic-to-phenotypic mapping. Phenotypic information covering nearly 20 functional categories such as GC content, genome size, oxygen requirements, energy sources and preferred temperature range is automatically generated from the taxonomic input data. Using this phenotypically enriched data, users can then perform a variety of multivariate and univariate data analyses including fold change analysis, t-tests, PCA, PLS-DA, clustering and classification. To facilitate data processing, users are guided through a step-by-step analysis workflow using a variety of menus, information hyperlinks and check boxes. METAGENassist also generates colorful, publication quality tables and graphs that can be downloaded and used directly in the preparation of scientific papers. METAGENassist is available at http://www.metagenassist.ca.