WorldWideScience

Sample records for server log analysis

  1. Analysis of Web Server Log Files: Website of Information Management Department of Hacettepe University

    Directory of Open Access Journals (Sweden)

    Mandana Mir Moftakhari

    2015-09-01

    Full Text Available Over the last decade, the importance of analysing information management system logs has grown, because the results of log analysis have proved helpful in improving information system design and the interface and architecture of websites. Log file analysis is one of the best ways to understand the information-searching processes of online searchers: their needs, interests, knowledge, and prejudices. Data collected in the transaction logs of web search engines help designers, researchers and website managers to understand the complex interactions of users’ goals and behaviours and to increase the efficiency and effectiveness of websites. Before starting any analysis, it should be verified that the website’s log files contain enough information; otherwise the analyst cannot create a complete report. In this study we evaluate the website of the Information Management Department of Hacettepe University by analysing its server log files. The results show that the log files provided by the web server do not contain an adequate amount of information. The reports we created offer some insight into users’ behaviour and needs, but they are not sufficient for making well-founded decisions about the content and hyperlink structure of the website. The study also shows that creating an extended log file is essential for the website. Finally, we believe these results can help to improve, redesign and build a better website.
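
The first step in any such study is parsing raw access-log lines into structured fields. A minimal sketch in Python, assuming the widely used Apache Combined Log Format (the record above does not name the server software or log format):

```python
import re

# Apache Combined Log Format -- an assumption; the study does not
# specify its server software or log format.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of fields from one access-log line, or None if it does not match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

sample = ('66.249.66.1 - - [12/May/2015:10:02:31 +0300] '
          '"GET /index.html HTTP/1.1" 200 5120 '
          '"http://www.google.com/" "Mozilla/5.0"')
record = parse_line(sample)
```

Once lines are parsed this way, per-field counts (status codes, paths, user agents) feed directly into the kind of usage reports the study describes.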

  2. Using Web Server Logs in Evaluating Instructional Web Sites.

    Science.gov (United States)

    Ingram, Albert L.

    2000-01-01

    Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…

  3. Conversation Threads Hidden within Email Server Logs

    Science.gov (United States)

    Palus, Sebastian; Kazienko, Przemysław

    Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but as conversation threads, especially when we need to analyze a social network extracted from the email logs. Unfortunately, each email is a separate record, and those records are not tied to each other in any obvious way. In this paper, a method for extracting discussion threads is proposed, together with experiments on two different data sets: Enron and WrUT.
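
One common way to tie separate email records into threads, sketched below as a general illustration rather than the authors' algorithm, is to union messages linked by In-Reply-To headers and, as a fallback, by normalized subject lines. Field names are illustrative assumptions:

```python
import re
from collections import defaultdict

def normalize_subject(subject):
    # Strip reply/forward prefixes such as "Re:", "RE:", "Fwd:".
    return re.sub(r'^(\s*(re|fwd?)\s*:\s*)+', '', subject, flags=re.I).strip().lower()

def build_threads(messages):
    """messages: list of dicts with 'id', 'in_reply_to' (or None), 'subject'.
    Returns a list of threads, each a list of message ids."""
    parent = {}  # union-find parent pointers over message ids

    def find(x):
        while parent.get(x, x) != x:
            x = parent[x]
        return x

    by_id = {m['id']: m for m in messages}
    for m in messages:
        parent.setdefault(m['id'], m['id'])
        ref = m.get('in_reply_to')
        if ref and ref in by_id:
            parent[find(m['id'])] = find(ref)

    # Fallback: merge messages sharing a normalized subject.
    by_subject = defaultdict(list)
    for m in messages:
        by_subject[normalize_subject(m['subject'])].append(m['id'])
    for ids in by_subject.values():
        for other in ids[1:]:
            parent[find(other)] = find(ids[0])

    threads = defaultdict(list)
    for m in messages:
        threads[find(m['id'])].append(m['id'])
    return list(threads.values())
```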

  4. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part 1: Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    Full Text Available This is the first part of a two-part article that surveys data sources likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.

  5. Understanding Academic Information Seeking Habits through Analysis of Web Server Log Files: The Case of the Teachers College Library Website

    Science.gov (United States)

    Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary

    2009-01-01

    Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)

  6. Mining the SDSS SkyServer SQL queries log

    Science.gov (United States)

    Hirota, Vitor M.; Santos, Rafael; Raddick, Jordan; Thakar, Ani

    2016-05-01

    SkyServer, the Internet portal for the Sloan Digital Sky Survey (SDSS) astronomical catalog, provides a set of tools that allow data access for astronomers and science education. One of SkyServer's data access interfaces allows users to enter ad-hoc SQL statements to query the catalog. SkyServer also presents some template queries that can be used as a basis for more complex queries. This interface has logged over 330 million queries submitted since 2001. Analysis of this data can be used to investigate usage patterns, identify potential new classes of queries, and find similar queries, and can shed some light on how users interact with the Sloan Digital Sky Survey data and how scientists have adopted the new paradigm of e-Science, which could in turn lead to enhancements of the user interfaces and the user experience in general. In this paper we review some approaches to SQL query mining, apply the traditional techniques used in the literature, and present lessons learned: namely, that the general text mining approach of feature extraction and clustering does not seem adequate for this type of data and, most importantly, that this type of analysis can result in very different queries being clustered together.
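
A minimal illustration of the usual first preprocessing step before clustering SQL logs is to mask literals so that syntactically different but structurally identical queries collapse to one template. This is a simplified sketch; the paper's actual feature extraction is more elaborate:

```python
import re

def normalize_sql(query):
    """Reduce a SQL query to a template by masking literals and whitespace."""
    q = query.lower()
    q = re.sub(r"'[^']*'", "'?'", q)        # mask string literals
    q = re.sub(r'\b\d+\.?\d*\b', '?', q)    # mask numeric literals
    q = re.sub(r'\s+', ' ', q).strip()      # collapse whitespace
    return q

# Illustrative queries in the style of SkyServer traffic.
queries = [
    "SELECT ra, dec FROM PhotoObj WHERE objid = 1237648720693755918",
    "select ra, dec from PhotoObj where objid = 1237648721234567890",
    "SELECT TOP 10 * FROM SpecObj WHERE z > 0.3",
]
templates = {normalize_sql(q) for q in queries}
```

The first two queries collapse to the same template, so the three raw queries yield two distinct templates for downstream clustering.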

  7. Using Web Server Logs to Track Users through the Electronic Forest

    Science.gov (United States)

    Coombs, Karen A.

    2005-01-01

    This article analyzes server logs, providing helpful information in making decisions about Web-based services. The author indicates, as a result of analyzing server logs, several interesting things about the users' behavior were learned. The resulting findings are discussed in this article. Certain pages of the author's Web site, for instance, are…

  8. http Log Analysis

    DEFF Research Database (Denmark)

    Bøving, Kristian Billeskov; Simonsen, Jesper

    2004-01-01

    This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log...... analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs, which serve to test...... hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data...

  9. Log-Less Metadata Management on Metadata Server for Parallel File Systems

    Directory of Open Access Journals (Sweden)

    Jianwei Liao

    2014-01-01

    Full Text Available This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the metadata requests it has sent and the server has handled, so that the MDS does not need to log metadata changes to nonvolatile storage to achieve a highly available metadata service, while also improving metadata-processing performance. Because the client file system keeps these backup requests in its memory, the overhead of handling them is much smaller than the overhead incurred by a metadata server that adopts logging or journaling to provide a highly available metadata service. The experimental results show that the proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server crashes or unexpectedly enters a nonoperational state.
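
The client-side backup idea can be reduced to a tiny sketch: the client keeps the requests the server has acknowledged, so a restarted (log-less) metadata server can rebuild its state by replaying them. Class and method names are illustrative assumptions, not the paper's implementation:

```python
class MetadataServer:
    """A log-less MDS: namespace state is volatile, with no journal."""
    def __init__(self):
        self.namespace = {}          # path -> attributes

    def handle(self, request):
        op, path, attrs = request
        if op == 'create':
            self.namespace[path] = attrs
        elif op == 'remove':
            self.namespace.pop(path, None)
        return 'ok'

class Client:
    def __init__(self, mds):
        self.mds = mds
        self.backup = []             # acknowledged requests kept in client memory

    def send(self, request):
        # Back up the request only once the server has handled it.
        if self.mds.handle(request) == 'ok':
            self.backup.append(request)

    def recover(self, new_mds):
        # Replay the cached requests against a fresh server instance.
        for request in self.backup:
            new_mds.handle(request)
        self.mds = new_mds
```

After a simulated MDS crash, calling `recover` on a fresh server reproduces the pre-crash namespace from the client-held backups alone.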

  10. Using ‘search transitions’ to study searchers’ investment of effort: experiences with client and server side logging

    OpenAIRE

    Pharo, Nils; Nordlie, Ragnar

    2013-01-01

    We are investigating the value of using the concept of the ‘search transition’ for studying effort invested in information search processes. In this paper we present findings from a comparative study of data collected from client-side and server-side logging. The purpose is to see which factors of effort can be captured by the two logging methods. The data stem from studies of searchers’ interaction with an XML information retrieval system. The searchers’ interaction was simultaneously logged by a scree...

  11. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...

  12. Detection of attack-targeted scans from the Apache HTTP Server access logs

    Directory of Open Access Journals (Sweden)

    Merve Baş Seyyar

    2018-01-01

    Full Text Available A web application can be visited for different purposes. A web site may be visited by a regular user as a normal (natural) visit, viewed by crawlers, bots, spiders, etc. for indexing purposes, or exploratively scanned by malicious users prior to an attack. An attack-targeted web scan can be viewed as a phase of a potential attack, and detecting it can catch more attacks than traditional detection methods. In this work, we propose a method to detect attack-oriented scans and to distinguish them from other types of visits. In this context, we use the access log files of Apache (or IIS) web servers and try to determine attack situations through examination of past data. In addition to web scan detection, we add a rule set to detect SQL injection and XSS attacks. Our approach has been applied to sample data sets, and the results have been analyzed in terms of performance measures to compare our method with other commonly used detection techniques. Furthermore, various tests have been run on log samples from real systems. Lastly, several suggestions for further development are also discussed.
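
A rule set of the kind described can be sketched as a few regular-expression signatures applied to URL-decoded request paths from the access log. The signatures below are deliberately tiny illustrations; production rule sets (e.g. the OWASP Core Rule Set) are far larger:

```python
import re
from urllib.parse import unquote

# Illustrative attack signatures -- toy examples, not a complete rule set.
RULES = {
    'sqli': re.compile(r"(union\s+select|or\s+1\s*=\s*1|'\s*--)", re.I),
    'xss':  re.compile(r"(<script\b|javascript:|onerror\s*=)", re.I),
}

def classify_request(path):
    """Return the names of attack signatures matching a decoded request path."""
    decoded = unquote(path)
    return [name for name, pattern in RULES.items() if pattern.search(decoded)]
```

Running `classify_request` over the parsed request paths of a log flags candidate SQL injection and XSS attempts for closer inspection.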

  13. Pro SQL Server 2008 Analysis Services

    CERN Document Server

    Janus, Philo B

    2009-01-01

    Every business has reams of business data locked away in databases, business systems, and spreadsheets. While you may be able to build some reports by pulling a few of these repositories together, actually performing any kind of analysis on the data that runs your business can range from problematic to impossible. Pro SQL Server 2008 Analysis Services will show you how to pull that data together and present it for reporting and analysis in a way that makes the data accessible to business users, instead of relying on the IT department every time someone needs a different report. * Acc

  14. NRSAS: Nuclear Receptor Structure Analysis Servers.

    NARCIS (Netherlands)

    Bettler, E.J.M.; Krause, R.; Horn, F.; Vriend, G.

    2003-01-01

    We present a coherent series of servers that can perform a large number of structure analyses on nuclear hormone receptors. These servers are part of the NucleaRDB project, which provides a powerful information system for nuclear hormone receptors. The computations performed by the servers include

  15. Clustering of users of digital libraries through log file analysis

    Directory of Open Access Journals (Sweden)

    Juan Antonio Martínez-Comeche

    2017-09-01

    Full Text Available This study analyzes how users perform information retrieval tasks when submitting queries to the Hispanic Digital Library. Clusters of users are differentiated based on their distinct information behavior. The study used the log files collected by the server over a year, and different possible clustering algorithms are compared. The k-means algorithm is found to be a suitable clustering method for the analysis of large log files from digital libraries. In the case of the Hispanic Digital Library, the results show three clusters of users, and the characteristic information behavior of each group is described.
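
The k-means step can be sketched in plain Python on toy per-user features derived from logs. The feature set here (queries per session, mean query length) is an illustrative assumption, not the study's actual variables:

```python
import random

def kmeans(points, k, iterations=50, seed=0):
    """Plain-Python k-means over tuples of numeric features."""
    rnd = random.Random(seed)
    centroids = rnd.sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest centroid (squared Euclidean distance).
            i = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[i].append(p)
        # Recompute centroids as per-dimension means; keep old centroid if a cluster empties.
        centroids = [
            tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Toy user feature vectors: (queries per session, mean query length).
users = [(2, 5), (3, 6), (2, 4), (20, 30), (22, 28), (21, 29)]
centroids, clusters = kmeans(users, k=2)
```

On this well-separated toy data the algorithm recovers the two obvious groups of three users each.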

  16. Instant Microsoft SQL Server Analysis Services 2012 dimensions and cube

    CERN Document Server

    Acharya, Anurag

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. Written in a practical, friendly manner, this book will take you through the journey from installing SQL Server to developing your first cubes. "Microsoft SQL Server Analysis Services 2012 Dimensions and Cube Starter" is targeted at anyone who wants to get started with cube development in Microsoft SQL Server Analysis Services. Regardless of whether you are a SQL Server developer who knows nothing about cube development or SSAS or even OLAP, you

  17. Analyzing Web Server Logs to Improve a Site's Usage. The Systems Librarian

    Science.gov (United States)

    Breeding, Marshall

    2005-01-01

    This column describes ways to streamline and optimize how a Web site works in order to improve both its usability and its visibility. The author explains how to analyze logs and other system data to measure the effectiveness of the Web site design and search engine.

  18. LHCb Online Log Analysis and Maintenance System

    CERN Document Server

    Garnier, J-C

    2011-01-01

    History has shown many times that computer logs are the only information an administrator has about an incident, whether caused by a malfunction or an attack. Due to the huge amount of logs produced by large-scale IT infrastructures such as LHCb Online, critical information may be overlooked or simply drowned in a sea of other messages. This clearly demonstrates the need for an automatic system for long-term maintenance and real-time analysis of the logs. We have constructed a low-cost, fault-tolerant centralized logging system which is able to do in-depth analysis and cross-correlation of every log. This system is capable of handling O(10000) different log sources and numerous formats, while keeping the overhead as low as possible. It provides log gathering and management, offline analysis and online analysis. We call offline analysis the procedure of analyzing old logs for critical information, while online analysis refers to the procedure of early alerting and reacting. ...

  19. A semantic perspective on query log analysis

    NARCIS (Netherlands)

    Hofmann, K.; de Rijke, M.; Huurnink, B.; Meij, E.

    2009-01-01

    We present our views on the CLEF log file analysis task. We argue for a task definition that focuses on the semantic enrichment of query logs. In addition, we discuss how additional information about the context in which queries are being made could further our understanding of users’ information

  20. Instant SQL Server Analysis Services 2012 Cube Security

    CERN Document Server

    Jayanty, Satya SK

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. Instant Microsoft SQL Server Analysis Services 2012 Cube Security is a practical, hands-on guide that provides a number of clear, step-by-step exercises for getting started with cube security.This book is aimed at Database Administrators, Data Architects, and Systems Administrators who are managing the SQL Server data platform. It is also beneficial for analysis services developers who already have some experience with the technology, but who want to go into more detail on advanced

  1. Genonets server - a web server for the construction, analysis and visualization of genotype networks.

    Science.gov (United States)

    Khalid, Fahad; Aguilar-Rodríguez, José; Wagner, Andreas; Payne, Joshua L

    2016-07-08

    A genotype network is a graph in which vertices represent genotypes that have the same phenotype. Edges connect vertices if their corresponding genotypes differ in a single small mutation. Genotype networks are used to study the organization of genotype spaces. They have shed light on the relationship between robustness and evolvability in biological systems as different as RNA macromolecules and transcriptional regulatory circuits. Despite the importance of genotype networks, no tool exists for their automatic construction, analysis and visualization. Here we fill this gap by presenting the Genonets Server, a tool that provides the following features: (i) the construction of genotype networks for categorical and univariate phenotypes from DNA, RNA, amino acid or binary sequences; (ii) analyses of genotype network topology and how it relates to robustness and evolvability, as well as analyses of genotype network topography and how it relates to the navigability of a genotype network via mutation and natural selection; (iii) multiple interactive visualizations that facilitate exploratory research and education. The Genonets Server is freely available at http://ieu-genonets.uzh.ch. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Root cause analysis with enriched process logs

    NARCIS (Netherlands)

    Suriadi, S.; Ouyang, C.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the field of process mining, the use of event logs for the purpose of root cause analysis is increasingly studied. In such an analysis, the availability of attributes/features that may explain the root cause of some phenomenon is crucial. Currently, the process of obtaining these attributes from

  3. SQL Server Analysis Services 2012 cube development cookbook

    CERN Document Server

    Dewald, Baya; Hughes, Steve

    2013-01-01

    A practical cookbook packed with recipes to help developers produce data cubes as quickly as possible by following step-by-step instructions, rather than explaining data mining concepts with SSAS. If you are a BI or ETL developer using SQL Server Analysis Services to build OLAP cubes, this book is ideal for you. Prior knowledge of relational databases and experience with Excel as well as SQL development is required.

  4. Barcode server: a visualization-based genome analysis system.

    Directory of Open Access Journals (Sweden)

    Fenglou Mao

    Full Text Available We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a) identification of horizontally transferred genes, (b) identification of genomic islands with special properties and (c) binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a) calculation of the k-mer based barcode image for a provided DNA sequence; (b) detection of sequence fragments in a given genome with distinct barcodes from those of the majority of the genome; (c) clustering of provided DNA sequences into groups having similar barcodes; and (d) homology-based search using Blast against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode.

  5. Requirements-Driven Log Analysis Extended Abstract

    Science.gov (United States)

    Havelund, Klaus

    2012-01-01

    Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than the full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing, since it does not address the important input generation problem. However, it offers a solution which testing teams might accept, since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
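
The idea of checking a recorded log against a formal specification can be illustrated with a tiny monitor for a response property: every 'request' event must eventually be followed by a matching 'grant'. The event structure and the property are illustrative assumptions, not the paper's specification language:

```python
def check_response(log, trigger='request', response='grant'):
    """Each event is a dict with 'name' and 'id'. Return the ids that were
    triggered but never answered before the log ended (the violations)."""
    pending = set()
    for event in log:
        if event['name'] == trigger:
            pending.add(event['id'])
        elif event['name'] == response:
            pending.discard(event['id'])
    return pending

# A toy log: request 2 is never granted.
log = [
    {'name': 'request', 'id': 1},
    {'name': 'request', 'id': 2},
    {'name': 'grant', 'id': 1},
]
violations = check_response(log)
```

Because the monitor only reads the log, it can run entirely offline, which is exactly what lets a single analyst apply it without changing the testing team's process.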

  6. COMAN: a web server for comprehensive metatranscriptomics analysis.

    Science.gov (United States)

    Ni, Yueqiong; Li, Jun; Panagiotou, Gianni

    2016-08-11

    Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with less expertise in bioinformatics to answer microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.

  7. Analysis of RIA standard curve by log-logistic and cubic log-logit models

    International Nuclear Information System (INIS)

    Yamada, Hideo; Kuroda, Akira; Yatabe, Tami; Inaba, Taeko; Chiba, Kazuo

    1981-01-01

    In order to improve goodness-of-fit in RIA standard curve analysis, programs for computing log-logistic and cubic log-logit models were written in BASIC on a P-6060 personal computer (Olivetti). The iterative least squares method with Taylor series expansion was applied for non-linear estimation of the logistic and log-logistic models. Here ''log-logistic'' represents Y = (a - d)/(1 + (log(X)/c)^b) + d. As weights, either 1, 1/var(Y) or 1/σ^2 were used in the logistic or log-logistic models, and either Y^2(1 - Y)^2, Y^2(1 - Y)^2/var(Y), or Y^2(1 - Y)^2/σ^2 were used in the quadratic or cubic log-logit models. The term var(Y) represents the square of the pure error, and σ^2 represents the estimated variance calculated using the equation log(σ^2 + 1) = log(A) + J log(y). As indicators of goodness-of-fit, MSL/S_e^2, CMD% and WRV (see text) were used. Better regression was obtained for alpha-fetoprotein with the log-logistic model than with the logistic model. The cortisol standard curve was much better fitted with cubic log-logit than with quadratic log-logit. The predicted precision of the AFP standard curve was below 5% with log-logistic analysis instead of 8% with logistic analysis. The predicted precision obtained using cubic log-logit was about five times lower than that with quadratic log-logit. The importance of selecting good models in RIA data processing is stressed, in conjunction with the intrinsic precision of the radioimmunoassay system indicated by the predicted precision. (author)
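
The four-parameter log-logistic curve from the abstract can be evaluated directly; the parameter values below are made up for illustration, not fitted to any assay data:

```python
from math import e, log

def log_logistic(x, a, b, c, d):
    """Four-parameter log-logistic curve: Y = (a - d)/(1 + (log(X)/c)**b) + d."""
    return (a - d) / (1.0 + (log(x) / c) ** b) + d

# At log(X)/c == 1 the denominator is 2, so the curve sits midway
# between a and d: (a - d)/2 + d.
y_mid = log_logistic(e ** 2.0, a=100.0, b=3.0, c=2.0, d=5.0)
```

Fitting a, b, c and d by iterative least squares, as the paper does, would wrap this function in a non-linear optimizer with the weighting schemes listed above.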

  8. COGNAT: a web server for comparative analysis of genomic neighborhoods.

    Science.gov (United States)

    Klimchuk, Olesya I; Konovalov, Kirill A; Perekhvatov, Vadim V; Skulachev, Konstantin V; Dibrova, Daria V; Mulkidjanian, Armen Y

    2017-11-22

    In prokaryotic genomes, functionally coupled genes can be organized in conserved gene clusters enabling their coordinated regulation. Such clusters could contain one or several operons, which are groups of co-transcribed genes. Those genes that evolved from a common ancestral gene by speciation (i.e. orthologs) are expected to have similar genomic neighborhoods in different organisms, whereas those copies of the gene that are responsible for dissimilar functions (i.e. paralogs) could be found in dissimilar genomic contexts. Comparative analysis of genomic neighborhoods facilitates the prediction of co-regulated genes and helps to discern different functions in large protein families. Building on the attribution of gene sequences to clusters of orthologous groups of proteins (COGs), we set out to provide a method, and a corresponding web server, for the visualization and comparative analysis of the genomic neighborhoods of evolutionarily related genes. Here we introduce the COmparative Gene Neighborhoods Analysis Tool (COGNAT), a web server for comparative analysis of genomic neighborhoods. The tool is based on the COG database, as well as the Pfam protein families database. As an example, we show the utility of COGNAT in identifying a new type of membrane protein complex that is formed by paralog(s) of one of the membrane subunits of the NADH:quinone oxidoreductase of type 1 (COG1009) and a cytoplasmic protein of unknown function (COG3002). This article was reviewed by Drs. Igor Zhulin, Uri Gophna and Igor Rogozin.

  9. minepath.org: a free interactive pathway analysis web server.

    Science.gov (United States)

    Koumakis, Lefteris; Roussos, Panos; Potamias, George

    2017-07-03

    MinePath ( www.minepath.org ) is a web-based platform that elaborates on, and radically extends, the identification of differentially expressed sub-paths in molecular pathways. Besides the network topology, the underlying MinePath algorithmic processes exploit exact gene-gene molecular relationships (e.g. activation, inhibition) and are able to identify differentially expressed pathway parts. Each pathway is decomposed into all its constituent sub-paths, which in turn are matched with corresponding gene expression profiles. The highly ranked, phenotype-inclined sub-paths are kept. Apart from the pathway analysis algorithm, the fundamental innovation of the MinePath web server concerns its advanced visualization and interactive capabilities. To our knowledge, this is the first pathway analysis server that introduces and offers visualization of the underlying and active pathway regulatory mechanisms instead of genes. Other features include live interaction, immediate visualization of functional sub-paths per phenotype, and dynamically linked annotations for the engaged genes and molecular relations. The user can download not only the results but also the corresponding web viewer framework of the performed analysis. This feature provides the flexibility to immediately publish results without publishing source/expression data, and to get all the functionality of a web-based pathway analysis viewer. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. WAMI: a web server for the analysis of minisatellite maps

    Directory of Open Access Journals (Sweden)

    El-Kalioby Mohamed

    2010-06-01

    Full Text Available Abstract Background: Minisatellites are genomic loci composed of tandem arrays of short repetitive DNA segments. A minisatellite map is a sequence of symbols that represents the tandem repeat array such that the set of symbols is in one-to-one correspondence with the set of distinct repeats. Due to variations in repeat type and organization as well as copy number, minisatellite maps have been widely used in forensic and population studies. In either domain, researchers need to compare the maps to each other, to build phylogenetic trees, to spot structural variations, and to study duplication dynamics. Efficient algorithms for these tasks are required to carry them out reliably and in reasonable time. Results: In this paper we present WAMI, a web server for the analysis of minisatellite maps. It performs the above-mentioned computational tasks using efficient algorithms that take the model of map evolution into account. The WAMI interface is easy to use and the results of each analysis task are visualized. Conclusions: To the best of our knowledge, WAMI is the first server providing all these computational facilities to the minisatellite community. The WAMI web interface and the source code of the underlying programs are available at http://www.nubios.nileu.edu.eg/tools/wami.

  11. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    Science.gov (United States)

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…

  12. Analysis of a multi-server queueing model of ABR

    Directory of Open Access Journals (Sweden)

    R. Núñez-Queija

    1998-01-01

    Full Text Available In this paper we present a queueing model for the performance analysis of Available Bit Rate (ABR) traffic in Asynchronous Transfer Mode (ATM) networks. We consider a multi-channel service station with two types of customers, denoted by high priority and low priority customers. In principle, high priority customers have preemptive priority over low priority customers, except on a fixed number of channels that are reserved for low priority traffic. The arrivals occur according to two independent Poisson processes, and service times are assumed to be exponentially distributed. Each high priority customer requires a single server, whereas low priority customers are served in processor sharing fashion. We derive the joint distribution of the numbers of customers (of both types in the system in steady state. Numerical results illustrate the effect of high priority traffic on the service performance of low priority traffic.
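
The paper derives the joint queue-length distribution of its priority model exactly. As a simpler, standard illustration of multi-server queueing analysis (a swapped-in textbook technique, not the paper's model), the Erlang-C probability that an arriving customer must wait in an M/M/c queue can be computed as follows:

```python
from math import factorial

def erlang_c(c, offered_load):
    """P(wait) for an M/M/c queue.
    offered_load = lambda/mu; stability requires offered_load < c."""
    rho = offered_load / c
    numerator = (offered_load ** c / factorial(c)) / (1 - rho)
    denominator = sum(offered_load ** k / factorial(k) for k in range(c)) + numerator
    return numerator / denominator

# Two servers with one Erlang of offered load.
p_wait = erlang_c(c=2, offered_load=1.0)
```

For c = 2 and an offered load of 1 Erlang this evaluates to 1/3, the classical M/M/2 waiting probability.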

  13. Log Analysis Using Splunk Hadoop Connect

    Science.gov (United States)

    2017-06-01

    Running a logging service puts a performance tax on the system and may cause the degradation of performance. More thorough logging will cause a…

  14. Comparative analysis of nuclear magnetic resonance well logging and nuclear magnetic resonance mud logging

    International Nuclear Information System (INIS)

    Yuan Zugui

    2008-01-01

    The hydrogen atoms in oil and water are able to resonate and generate signals in a magnetic field, which is used by NMR (nuclear magnetic resonance) technology in petroleum engineering to research and evaluate rock characteristics. NMR well logging is used to measure the physical property parameters of the strata in the well bore, whereas NMR mud logging is used to analyze (while drilling) the physical property parameters of cores, cuttings and sidewall coring samples on the surface (drilling site). Based on a comparative analysis of the porosity and permeability parameters obtained by NMR well logging and those from analysis of the cores, cuttings and sidewall coring samples by NMR mud logging at the same depths in 13 wells, the two methods show some differences, but their overall trends agree relatively well. (authors)

  15. SDSS Log Viewer: visual exploratory analysis of large-volume SQL log data

    Science.gov (United States)

    Zhang, Jian; Chen, Chaomei; Vogeley, Michael S.; Pan, Danny; Thakar, Ani; Raddick, Jordan

    2012-01-01

    User-generated Structured Query Language (SQL) queries are a rich source of information for database analysts, information scientists, and the end users of databases. In this study a group of scientists in astronomy and computer and information scientists work together to analyze a large volume of SQL log data generated by users of the Sloan Digital Sky Survey (SDSS) data archive in order to better understand users' data seeking behavior. While statistical analysis of such logs is useful at aggregated levels, efficiently exploring specific patterns of queries is often a challenging task due to the typically large volume of the data, multivariate features, and data requirements specified in SQL queries. To enable and facilitate effective and efficient exploration of the SDSS log data, we designed an interactive visualization tool, called the SDSS Log Viewer, which integrates time series visualization, text visualization, and dynamic query techniques. We describe two analysis scenarios of visual exploration of SDSS log data, including understanding unusually high daily query traffic and modeling the types of data seeking behaviors of massive query generators. The two scenarios demonstrate that the SDSS Log Viewer provides a novel and potentially valuable approach to support these targeted tasks.
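One building block of such statistical log analysis can be sketched as follows. This is a hedged illustration, not part of the SDSS Log Viewer itself: collapsing logged queries into templates by masking literals makes the "massive query generators" mentioned above visible as a single heavily used template. The sample queries are hypothetical.

```python
import re
from collections import Counter

# Hedged sketch (not the SDSS Log Viewer): normalize SQL queries into
# templates by masking literals, then count template frequencies.

def sql_template(query):
    q = query.strip().lower()
    q = re.sub(r"'[^']*'", "'?'", q)        # mask string literals
    q = re.sub(r"\b\d+(\.\d+)?\b", "?", q)  # mask numeric literals
    q = re.sub(r"\s+", " ", q)              # normalize whitespace
    return q

log = [
    "SELECT ra, dec FROM PhotoObj WHERE objid = 1237654",
    "select ra, dec from photoobj where objid = 1237999",
    "SELECT z FROM SpecObj WHERE z > 0.1",
]
templates = Counter(sql_template(q) for q in log)
print(templates.most_common(1)[0])
# ('select ra, dec from photoobj where objid = ?', 2)
```

Aggregating at the template level is what makes daily traffic spikes attributable to a handful of query patterns rather than thousands of distinct strings.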

  16. Analysis of the asymmetrical shortest two-server queueing model

    NARCIS (Netherlands)

    J.W. Cohen

    1995-01-01

    This study presents the analytic solution for the asymmetrical two-server queueing model with arriving customers joining the shorter queue for the case with Poisson arrivals and negative exponentially distributed service times. The bivariate generating function of the stationary joint…

  17. SciServer Compute brings Analysis to Big Data in the Cloud

    Science.gov (United States)

    Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara

    2016-06-01

    SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally, but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts

  18. Analysis of free SSL/TLS Certificates and their implementation as Security Mechanism in Application Servers.

    Directory of Open Access Journals (Sweden)

    Mario E. Cueva Hurtado

    2017-02-01

    Full Text Available Security in the application layer (SSL) provides the confidentiality, integrity, and authenticity of the data between two applications that communicate with each other. This article is the result of having implemented free SSL/TLS certificates in application servers, determining the relevant characteristics that an SSL/TLS certificate must have and how the Certificate Authority generates it. A vulnerability analysis of application servers is developed, and an encrypted communications channel is established to protect against attacks such as man-in-the-middle and phishing, maintaining the integrity of the information transmitted between client and server.
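One small piece of such a certificate analysis can be sketched with Python's standard `ssl` module. The date strings below are hypothetical, in the format returned by `SSLSocket.getpeercert()`:

```python
import ssl
from datetime import datetime, timezone

# Hedged sketch of one check from a certificate analysis: given the
# 'notAfter' field of a certificate (as returned by getpeercert()),
# compute how many days remain before expiry. Free certificates such as
# Let's Encrypt are short-lived, so automating this check matters.

def days_until_expiry(not_after, now=None):
    """not_after: e.g. 'Jun 1 12:00:00 2030 GMT' (getpeercert() format)."""
    expires = ssl.cert_time_to_seconds(not_after)  # parses GMT cert dates
    now = now if now is not None else datetime.now(timezone.utc).timestamp()
    return (expires - now) / 86400.0

# A certificate expiring exactly one day after a fixed reference instant:
ref = ssl.cert_time_to_seconds("Jun 1 12:00:00 2030 GMT")
print(round(days_until_expiry("Jun 2 12:00:00 2030 GMT", now=ref)))  # 1
```

In a live check one would obtain `not_after` from `getpeercert()["notAfter"]` after a TLS handshake; that network step is omitted here.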

  19. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian

    2008-01-01

    Glahn, C. (2008). Cross-system log file analysis for hypothesis testing. Presented at Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological issues. 4th TENCompetence Open Workshop. April, 10, 2008, Madrid, Spain.

  20. Comparison of approaches for mobile document image analysis using server supported smartphones

    Science.gov (United States)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcome these limitations is performing resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource consuming process is the Optical Character Recognition (OCR) process, which is used to extract text in mobile phone captured images. In this study, our goal is to compare the in-phone and the remote server processing approaches for mobile document image analysis in order to explore their trade-offs. For the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. On the other hand, in the remote-server approach, the core OCR process runs on the remote server and other processes run on the mobile phone. Results of the experiments show that the remote server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote server approach overall outperforms the in-phone approach in terms of selected speed and correct recognition metrics, if the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote server approach performs better than the in-phone approach in terms of speed and acceptable correct recognition metrics.

  1. Test Program for the Performance Analysis of DNS64 Servers

    Directory of Open Access Journals (Sweden)

    Gábor Lencse

    2015-09-01

    Full Text Available In our earlier research papers, bash shell scripts using the host Linux command were applied for testing the performance and stability of different DNS64 server implementations. Because of their inefficiency, a small multi-threaded C/C++ program (named dns64perf was written which can directly send DNS AAAA record queries. After the introduction to the essential theoretical background about the structure of DNS messages and TCP/IP socket interface programming, the design decisions and implementation details of our DNS64 performance test program are disclosed. The efficiency of dns64perf is compared to that of the old method using bash shell scripts. The result is convincing: dns64perf can send at least 95 times more DNS AAAA record queries per second. The source code of dns64perf is published under the GNU GPLv3 license to support the work of other researchers in the field of testing the performance of DNS64 servers.
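The message-construction step described above can be sketched as follows. This is a hedged Python illustration of the kind of DNS AAAA query dns64perf builds (the real tool is a multi-threaded C/C++ program); the resolver address in the comment is a documentation placeholder:

```python
import struct

# Hedged sketch: a minimal DNS query for an AAAA record, built by hand
# from the DNS header and question section. QTYPE 28 = AAAA, QCLASS 1 = IN.

def build_aaaa_query(hostname, txid=0x1234):
    # Header: ID, flags (only RD set), QDCOUNT=1, ANCOUNT/NSCOUNT/ARCOUNT=0
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # Question: length-prefixed labels, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 28, 1)  # QTYPE=AAAA, QCLASS=IN
    return header + question

query = build_aaaa_query("example.com")
# To actually send it over the socket interface, as dns64perf does:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(query, ("192.0.2.1", 53))   # 192.0.2.1: placeholder resolver
print(len(query))  # 12-byte header + 13-byte QNAME + 4 bytes = 29
```

Sending many such packets per second, with distinct transaction IDs and query names, is the essence of the load-generation approach the abstract describes.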

  2. Development of interpretation models for PFN uranium log analysis

    International Nuclear Information System (INIS)

    Barnard, R.W.

    1980-11-01

    This report presents the models for interpretation of borehole logs for the PFN (Prompt Fission Neutron) uranium logging system. Two models have been developed, the counts-ratio model and the counts/dieaway model. Both are empirically developed, but can be related to the theoretical bases for PFN analysis. The models try to correct for the effects of external factors (such as probe or formation parameters) in the calculation of uranium grade. The theoretical bases and calculational techniques for estimating uranium concentration from raw PFN data and other parameters are discussed. Examples and discussions of borehole logs are included.

  3. Microsoft® SQL Server® 2008 Analysis Services Step by Step

    CERN Document Server

    Cameron, Scott

    2009-01-01

    Teach yourself to use SQL Server 2008 Analysis Services for business intelligence-one step at a time. You'll start by building your understanding of the business intelligence platform enabled by SQL Server and the Microsoft Office System, highlighting the role of Analysis Services. Then, you'll create a simple multidimensional OLAP cube and progressively add features to help improve, secure, deploy, and maintain an Analysis Services database. You'll explore core Analysis Services 2008 features and capabilities, including dimension, cube, and aggregation design wizards; a new attribute relatio

  4. PlanetServer: Innovative approaches for the online analysis of hyperspectral satellite data from Mars

    Science.gov (United States)

    Oosthoek, J. H. P.; Flahaut, J.; Rossi, A. P.; Baumann, P.; Misev, D.; Campalani, P.; Unnithan, V.

    2014-06-01

    PlanetServer is a WebGIS system, currently under development, enabling the online analysis of Compact Reconnaissance Imaging Spectrometer (CRISM) hyperspectral data from Mars. It is part of the EarthServer project which builds infrastructure for online access and analysis of huge Earth Science datasets. Core functionality consists of the rasdaman Array Database Management System (DBMS) for storage, and the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) for data querying. Various WCPS queries have been designed to access spatial and spectral subsets of the CRISM data. The client WebGIS, consisting mainly of the OpenLayers javascript library, uses these queries to enable online spatial and spectral analysis. Currently the PlanetServer demonstration consists of two CRISM Full Resolution Target (FRT) observations, surrounding the NASA Curiosity rover landing site. A detailed analysis of one of these observations is performed in the Case Study section. The current PlanetServer functionality is described step by step, and is tested by focusing on detecting mineralogical evidence described in earlier Gale crater studies. Both the PlanetServer methodology and its possible use for mineralogical studies will be further discussed. Future work includes batch ingestion of CRISM data and further development of the WebGIS and analysis tools.

  5. Data pre-processing for web log mining: Case study of commercial bank website usage analysis

    Directory of Open Access Journals (Sweden)

    Jozef Kapusta

    2013-01-01

    Full Text Available We use data cleaning, integration, reduction and data conversion methods in the pre-processing level of data analysis. Data processing techniques improve the overall quality of the patterns mined. The paper describes the use of standard pre-processing methods for preparing data of a commercial bank website in the form of the log file obtained from the web server. Data cleaning, as the simplest step of data pre-processing, is non-trivial as the analysed content is highly specific. We had to deal with the problem of frequent changes of the content and even frequent changes of the structure. Regular changes in the structure make use of the sitemap impossible. We present approaches for dealing with this problem. We were able to create the sitemap dynamically just based on the content of the log file. In this case study, we also examined just one part of the website instead of the standard analysis of an entire website, as we did not have access to all log files for security reasons. As a result, the traditional practices had to be adapted for this special case. Analysing just a small fraction of the website resulted in short session times for regular visitors. We were not able to use recommended methods to determine the optimal value of the session time. Therefore, we propose new methods based on outlier identification to raise the accuracy of the session length.
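The sessionization step that this case study adapts can be sketched as follows. This is a simplified illustration with hypothetical visitor identifiers and timestamps; the 30-minute cutoff is exactly the kind of parameter the paper argues may need adjusting when only one section of a site is analysed:

```python
from datetime import datetime, timedelta

# Hedged sketch: requests from the same visitor are split into sessions
# whenever the gap between consecutive requests exceeds a cutoff.

def sessionize(requests, cutoff=timedelta(minutes=30)):
    """requests: list of (visitor_id, datetime), assumed time-sorted."""
    sessions = {}  # visitor_id -> list of sessions (each a list of times)
    for visitor, ts in requests:
        visitor_sessions = sessions.setdefault(visitor, [])
        if visitor_sessions and ts - visitor_sessions[-1][-1] <= cutoff:
            visitor_sessions[-1].append(ts)  # gap within cutoff: same session
        else:
            visitor_sessions.append([ts])    # gap too large: start new session
    return sessions

t0 = datetime(2013, 1, 1, 10, 0)
reqs = [("10.0.0.5", t0),
        ("10.0.0.5", t0 + timedelta(minutes=5)),
        ("10.0.0.5", t0 + timedelta(minutes=50))]  # > 30 min after previous
print(len(sessionize(reqs)["10.0.0.5"]))  # 2 sessions
```

The paper's outlier-based proposal amounts to choosing `cutoff` from the observed gap distribution rather than fixing it a priori.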

  6. Automatic Data Logging and Quality Analysis System for Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yong-Yi Fanjiang

    2017-01-01

    Full Text Available The testing phase of mobile device products includes two important test projects that must be completed before shipment: the field trial and the beta user trial. During the field trial, the product is certified based on its integration and stability with the local operator's system, and, during the beta user trial, the product is certified by multiple users regarding its daily use, where the goal is to detect and solve problems early. In the traditional approach to reporting problems, testers must log into a web site, fill out a problem form, and then go through a browser or FTP to upload logs; however, this is inconvenient, and problems are reported slowly. Therefore, we propose an "automatic logging analysis system" (ALAS) to construct a convenient test environment and, using a record analysis (log parser) program, automate the parsing of log files and have issues automatically sent to the database by the system. Finally, the mean time between failures (MTBF) is used to establish measurement indicators for the beta user trial.
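The MTBF indicator mentioned above can be sketched in a few lines. This is a minimal illustration with hypothetical numbers, not ALAS's actual implementation:

```python
# Hedged sketch: MTBF is the total observed operating time divided by the
# number of failures extracted from the parsed logs.

def mtbf(total_hours, failure_count):
    """Mean time between failures, in hours."""
    if failure_count == 0:
        return float("inf")  # no failures observed in the window
    return total_hours / failure_count

# 30 days of beta trial (720 h) in which the log parser found 6 failures:
print(mtbf(720, 6))  # 120.0 hours between failures on average
```

Tracking this value per build lets a beta trial quantify whether stability is improving from one release candidate to the next.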

  7. A web server for analysis, comparison and prediction of protein ligand binding sites.

    Science.gov (United States)

    Singh, Harinder; Srivastava, Hemant Kumar; Raghava, Gajendra P S

    2016-03-25

    One of the major challenges in the field of systems biology is to understand the interaction between a wide range of proteins and ligands. In the past, methods have been developed for predicting binding sites in a protein for a limited number of ligands. In order to address this problem, we developed a web server named 'LPIcom' to facilitate users in understanding protein-ligand interaction. Analysis, comparison and prediction modules are available in the 'LPIcom' server to predict protein-ligand interacting residues for 824 ligands. Each ligand must have at least 30 protein binding sites in PDB. The analysis module of the server can identify residues preferred in interaction and the binding motif for a given ligand; for example, the residues glycine, lysine and arginine are preferred in ATP binding sites. The comparison module of the server allows comparing protein-binding sites of multiple ligands to understand the similarity between ligands based on their binding sites. This module indicates that the ATP, ADP and GTP ligands are in the same cluster and thus their binding sites or interacting residues exhibit a high level of similarity. A propensity-based prediction module has been developed for predicting ligand-interacting residues in a protein for more than 800 ligands. In addition, a number of web-based tools have been integrated to facilitate users in creating web logos and performing two-sample comparisons between ligand-interacting and non-interacting residues. In summary, this manuscript presents a web server for the analysis of ligand-interacting residues. This server is available for public use from URL http://crdd.osdd.net/raghava/lpicom.
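A propensity score of the kind such a prediction module builds on can be sketched as follows. This is a hedged toy illustration; LPIcom's exact formulation is not reproduced here:

```python
from collections import Counter

# Hedged sketch: a residue's propensity measures how over-represented it is
# in a ligand's binding sites relative to its background frequency.

def propensities(binding_residues, all_residues):
    bind = Counter(binding_residues)
    background = Counter(all_residues)
    n_bind, n_all = len(binding_residues), len(all_residues)
    return {aa: (bind[aa] / n_bind) / (background[aa] / n_all)
            for aa in bind}

# Toy data: glycine (G) makes up half the binding-site residues but only a
# quarter of all residues, giving it a propensity of 2.0.
p = propensities("GGKR", "GGKRAALL")
print(round(p["G"], 2))  # 2.0
```

Values above 1.0 mark residue types preferred in binding sites, which is the observation behind the glycine/lysine/arginine example for ATP in the abstract.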

  8. Geophysical well logging operations and log analysis in Geothermal Well Desert Peak No. B-23-1

    Energy Technology Data Exchange (ETDEWEB)

    Sethi, D.K.; Fertl, W.H.

    1980-03-01

    Geothermal Well Desert Peak No. B-23-1 was logged by Dresser Atlas during April/May 1979 to a total depth of 2939 m (9642 ft). A temperature of 209°C (408°F) was observed on the maximum thermometer run with one of the logging tools. Borehole tools rated to a maximum temperature of 204.4°C (400°F) were utilized for logging except for the Densilog tool, which was from the other set of borehole instruments, rated to a still higher temperature, i.e., 260°C (500°F). The quality of the logs recorded and the environmental effects on the log response have been considered. The log response in the unusual lithologies of igneous and metamorphic formations encountered in this well could be correlated with the drill cutting data. An empirical, statistical log interpretation approach has made it possible to obtain meaningful information on the rocks penetrated. Various crossplots/histograms of the corrected log data have been generated on the computer. These are found to provide good resolution between the lithological units in the rock sequence. The crossplotting techniques and the statistical approach were combined with the drill cutting descriptions in order to arrive at the lithological characteristics. The results of log analysis and recommendations for logging of future wells have been included.

  9. Bayesian analysis of log Gaussian Cox processes for disease mapping

    DEFF Research Database (Denmark)

    Benes, Viktor; Bodlák, Karel; Møller, Jesper

    We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis, and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence of the risk on the covariates. Instead of using the common area-level approaches we consider a Bayesian analysis for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods…
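In its standard textbook form (the paper's exact parametrization may differ), a log Gaussian Cox process with covariates has a random intensity of the form

```latex
% Standard log Gaussian Cox process with covariates (textbook form;
% the record's exact parametrization may differ).
\[
  \Lambda(s) \;=\; \exp\bigl\{\, z(s)^{\top}\beta + \Psi(s) \,\bigr\},
\]
% where $z(s)$ collects the covariates (here vegetation and altitude) at
% location $s$, $\beta$ are regression coefficients, and $\Psi$ is a
% zero-mean Gaussian random field, e.g. with exponential covariance
\[
  \operatorname{Cov}\bigl(\Psi(s),\Psi(t)\bigr)
  \;=\; \sigma^{2}\exp\bigl(-\lVert s-t\rVert/\alpha\bigr).
\]
```

Conditional on $\Lambda$, the infection locations form a Poisson process; the Markov chain Monte Carlo computation the abstract mentions targets the posterior of the discretized field $\Psi$ and the parameters $(\beta, \sigma^{2}, \alpha)$.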

  10. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian; Specht, Marcus; Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Hernández-Leo, Davinia; Stefanov, Krassen; Lemmers, Ruud; Koper, Rob

    2008-01-01

    Glahn, C., Specht, M., Schoonenboom, J., Sligte, H., Moghnieh, A., Hernández-Leo, D. Stefanov, K., Lemmers, R., & Koper, R. (2008). Cross-system log file analysis for hypothesis testing. In H. Sligte & R. Koper (Eds.), Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for

  11. Android application and REST server system for quasar spectrum presentation and analysis

    Science.gov (United States)

    Wasiewicz, P.; Pietralik, K.; Hryniewicz, K.

    2017-08-01

    This paper describes the implementation of a system consisting of a mobile application and a RESTful-architecture server intended for the analysis and presentation of quasar spectra. It also describes quasars' characteristics and significance to the scientific community, the source used for acquiring astronomical objects' spectral data, and the software solutions employed, and presents aspects of Cloud Computing and various possible deployment configurations.

  12. CalFitter: a web server for analysis of protein thermal denaturation data.

    Science.gov (United States)

    Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri

    2018-05-14

    Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to a limited number of relevant experimental tools. Widely-used experimental techniques, such as calorimetry or spectroscopy, critically depend on a proper data analysis. Currently, there are only separate data analysis tools available for each type of experiment with a limited model selection. To address this problem, we have developed the CalFitter web server to be a unified platform for comprehensive data fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. CalFitter web server is available free at https://loschmidt.chemi.muni.cz/calfitter/.
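The simplest entry in such a model menu, reversible two-state unfolding, can be sketched as follows. This is a minimal illustration under the linear Gibbs-Helmholtz approximation, with the heat-capacity term neglected for brevity; it is not CalFitter's actual code:

```python
import math

# Hedged sketch of reversible two-state unfolding N <-> U: the unfolded
# fraction follows from the free energy of unfolding, approximated here as
# linear in temperature (delta-Cp neglected).
R = 8.314  # gas constant, J/(mol*K)

def fraction_unfolded(T, Tm, dH):
    """T, Tm in kelvin; dH (van 't Hoff enthalpy at Tm) in J/mol."""
    dG = dH * (1.0 - T / Tm)     # linear Gibbs-Helmholtz approximation
    K = math.exp(-dG / (R * T))  # unfolding equilibrium constant
    return K / (1.0 + K)

# At the melting temperature the protein is half unfolded by definition:
print(fraction_unfolded(T=333.15, Tm=333.15, dH=400e3))  # 0.5
```

Fitting `Tm` and `dH` to a measured melting curve is the kind of task the server performs globally across combined datasets, with its other eleven models covering intermediates and the irreversible transitions the abstract highlights.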

  13. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current WebGIS open-source tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the reasons for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the obtained results compare positively with previous literature on hyperspectral analysis and visualization, we suggest using the PlanetServer approach for such investigations.

  14. Analysis of logging data from nuclear borehole tools

    International Nuclear Information System (INIS)

    Hovgaard, J.; Oelgaard, P.L.

    1989-12-01

    The processing procedure for logging data from a borehole of the Stenlille project of Dansk Naturgas A/S has been analysed. The tools considered in the analysis were an integral, natural-gamma tool, a neutron porosity tool, a gamma density tool and a caliper tool. It is believed that in most cases the processing procedure used by the logging company in the interpretation of the raw data is fully understood. An exception is the epithermal part of the neutron porosity tool where all data needed for an interpretation were not available. The analysis has shown that some parts of the interpretation procedure may not be consistent with the physical principle of the tools. (author)

  15. Analysis of a multi-server queueing model of ABR

    NARCIS (Netherlands)

    R. Núñez Queija (Rudesindo); O.J. Boxma (Onno)

    1996-01-01

    In this paper we present a queueing model for the performance analysis of ABR traffic in ATM networks. We consider a multi-channel service station with two types of customers, the first having preemptive priority over the second. The arrivals occur according to two independent Poisson…

  16. Audit and trace log management consolidation and analysis

    CERN Document Server

    Maier, Phillip Q

    2006-01-01

    As regulation and legislation evolve, the critical need for cost-effective and efficient IT audit and monitoring solutions will continue to grow. Audit and Trace Log Management: Consolidation and Analysis offers a comprehensive introduction and explanation of requirements and problem definition, and also delivers a multidimensional solution set with broad applicability across a wide range of organizations. It provides a wealth of information in the form of process walkthroughs. These include problem determination, requirements gathering, scope definition, risk assessment, compliance objectives,

  17. LOG FILE ANALYSIS AND CREATION OF MORE INTELLIGENT WEB SITES

    Directory of Open Access Journals (Sweden)

    Mislav Šimunić

    2012-07-01

    Full Text Available To enable the successful performance of any company or business system, both in the world and in the Republic of Croatia, among the many problems relating to its operations, and particularly to the maximum utilization and efficiency of the Internet as a medium for running business (especially in terms of marketing), one should make the best possible use of present-day global trends and the advantages of sophisticated technologies and approaches to running a business. Bearing in mind the daily increasing competition and a more demanding market, this paper makes a scientific and practical contribution to the continuous analysis of the demand market and adaptation to it by analyzing log files and by retroactively acting on the web site. A log file is a carrier of numerous data and indicators that should be used in the best possible way to improve the entire business operations of a company. However, this is not always simple and easy. Web sites differ in size, purpose, and the technology used for designing them. For this very reason, the analytic frameworks should be such that they can cover any web site, and at the same time leave some space for analyzing and investigating the specific characteristics of each web site, and provide for its dynamics by analyzing the log file records. Those considerations were the basis for this paper.

  18. Windows server cookbook for Windows server 2003 and Windows 2000

    CERN Document Server

    Allen, Robbie

    2005-01-01

    This practical reference guide offers hundreds of useful tasks for managing Windows 2000 and Windows Server 2003, Microsoft's latest server. These concise, on-the-job solutions to common problems are certain to save you many hours of time searching through Microsoft documentation. Topics include files, event logs, security, DHCP, DNS, backup/restore, and more

  19. Cyber-T web server: differential analysis of high-throughput data.

    Science.gov (United States)

    Kayala, Matthew A; Baldi, Pierre

    2012-07-01

    The Bayesian regularization method for high-throughput differential analysis, described in Baldi and Long (A Bayesian framework for the analysis of microarray expression data: regularized t-test and statistical inferences of gene changes. Bioinformatics 2001: 17: 509-519) and implemented in the Cyber-T web server, is one of the most widely validated. Cyber-T implements a t-test using a Bayesian framework to compute a regularized variance of the measurements associated with each probe under each condition. This regularized estimate is derived by flexibly combining the empirical measurements with a prior, or background, derived from pooling measurements associated with probes in the same neighborhood. This approach flexibly addresses problems associated with low replication levels and technology biases, not only for DNA microarrays, but also for other technologies, such as protein arrays, quantitative mass spectrometry and next-generation sequencing (RNA-seq). Here we present an update to the Cyber-T web server, incorporating several useful new additions and improvements. Several preprocessing data normalization options including logarithmic and Variance Stabilizing Normalization (VSN) transforms are included. To augment two-sample t-tests, a one-way analysis of variance is implemented. Several methods for multiple tests correction, including standard frequentist methods and a probabilistic mixture model treatment, are available. Diagnostic plots allow visual assessment of the results. The web server provides comprehensive documentation and example data sets. The Cyber-T web server, with R source code and data sets, is publicly available at http://cybert.ics.uci.edu/.
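The regularized variance at the core of this method can be sketched directly from the formula in Baldi and Long (2001); the replicate values and prior settings below are hypothetical:

```python
import statistics

# Hedged sketch of the Baldi & Long (2001) regularized variance used by
# Cyber-T: the per-probe sample variance is shrunk toward a background
# variance sigma0^2 pooled from probes in the same neighborhood, with nu0
# pseudo-observations controlling the strength of the prior.

def regularized_variance(values, sigma0_sq, nu0):
    n = len(values)
    s_sq = statistics.variance(values)  # empirical variance of the probe
    return (nu0 * sigma0_sq + (n - 1) * s_sq) / (nu0 + n - 2)

# Two replicates only: the empirical variance is unreliable, so the
# regularized estimate leans heavily on the pooled background.
print(regularized_variance([1.0, 1.2], sigma0_sq=0.05, nu0=10))
```

Plugging this variance into the usual t-statistic in place of the raw sample variance is what makes the test stable at the low replication levels the abstract emphasizes.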

  20. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    Science.gov (United States)

    2011-01-01

    organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  1. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    Directory of Open Access Journals (Sweden)

    Lum Karl

    2011-03-01

    countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  2. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    Science.gov (United States)

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  3. Financial and Economic Analysis of Reduced Impact Logging

    Science.gov (United States)

    Tom Holmes

    2016-01-01

    Concern regarding extensive damage to tropical forests resulting from logging increased dramatically after World War II when mechanized logging systems developed in industrialized countries were deployed in the tropics. As a consequence, tropical foresters began developing logging procedures that were more environmentally benign, and by the 1990s, these practices began...

  4. WebMGA: a customizable web server for fast metagenomic sequence analysis.

    Science.gov (United States)

    Wu, Sitao; Zhu, Zhengwei; Fu, Liming; Niu, Beifang; Li, Weizhong

    2011-09-07

The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers face tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming, and metagenomic annotation involves a wide range of computational tools that are difficult for ordinary users to install and maintain. The tools provided by the few available web servers are also limited and impose various constraints, such as login requirements, long waiting times, and the inability to configure pipelines. We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for tasks such as ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contamination, taxonomic analysis, and functional annotation. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  5. WebMGA: a customizable web server for fast metagenomic sequence analysis

    Directory of Open Access Journals (Sweden)

    Niu Beifang

    2011-09-01

Full Text Available Abstract Background The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers face tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming, and metagenomic annotation involves a wide range of computational tools that are difficult for ordinary users to install and maintain. The tools provided by the few available web servers are also limited and impose various constraints, such as login requirements, long waiting times, and the inability to configure pipelines. Results We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for tasks such as ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contamination, taxonomic analysis, and functional annotation. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. Conclusions WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  6. Log-Normality and Multifractal Analysis of Flame Surface Statistics

    Science.gov (United States)

    Saha, Abhishek; Chaudhuri, Swetaprovo; Law, Chung K.

    2013-11-01

The turbulent flame surface is typically highly wrinkled and folded at a multitude of scales controlled by various flame properties. It is useful if the information contained in this complex geometry can be projected onto a simpler regular geometry for the use of spectral, wavelet or multifractal analyses. Here we investigate local flame surface statistics of a turbulent flame expanding under constant pressure. First, the statistics of the local length ratio are experimentally obtained from high-speed Mie scattering images. For a spherically expanding flame, the length ratio on the measurement plane, within predefined equiangular sectors, is defined as the ratio of the actual flame length to the length of a circular arc of radius equal to the average radius of the flame. Assuming an isotropic distribution of such flame segments, we convolute suitable forms of the length-ratio probability distribution functions (pdfs) to arrive at corresponding area-ratio pdfs. Both pdfs are found to be near log-normally distributed and show self-similar behavior with increasing radius. The near log-normality and rather intermittent behavior of the flame-length ratio suggest similarity with dissipation-rate quantities, which motivates multifractal analysis. Currently at Indian Institute of Science, India.
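The log-normality reported in the abstract above is what one expects when a quantity is built up multiplicatively. As an illustration only (the distribution and its parameters below are invented, not the paper's data), a minimal Python sketch checks that the logarithm of a multiplicative quantity has near-zero skewness:

```python
import math
import random
import statistics

def log_skewness(samples):
    """Sample skewness of log(x); a value near zero is consistent with log-normality."""
    logs = [math.log(x) for x in samples]
    m, s = statistics.fmean(logs), statistics.stdev(logs)
    return sum(((v - m) / s) ** 3 for v in logs) / len(logs)

random.seed(1)
# A product of many independent positive factors tends toward log-normal
# (central limit theorem applied on the log scale).
samples = [math.prod(random.uniform(0.8, 1.25) for _ in range(50))
           for _ in range(2000)]
print(abs(log_skewness(samples)) < 0.2)
```

The same skewness check could be applied to measured length-ratio samples as a quick screen before fitting a log-normal pdf.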

  7. Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.

    Science.gov (United States)

    Markiewicz, Tomasz

    2011-03-30

Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view the most important part is the image processing toolbox, which offers many built-in functions, including mathematical morphology, as well as implementations of many artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, including in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via internet communication. The internet platform can be realized with Java Servlet Pages (JSP), using Tomcat as the servlet container. In the presented software implementation we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with the set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JSP. The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides an image and case information (diagnosis, staining, image parameters, etc.). When analysis is initialized, the input data and image are sent to the servlet on Tomcat. When analysis is done, the client obtains the graphical results as an image with the recognized cells marked, along with the quantitative output. Additionally, the results are stored in a server

  8. BioconductorBuntu: a Linux distribution that implements a web-based DNA microarray analysis server.

    Science.gov (United States)

    Geeleher, Paul; Morris, Dermot; Hinde, John P; Golden, Aaron

    2009-06-01

BioconductorBuntu is a custom distribution of Ubuntu Linux that automatically installs a server-side microarray processing environment, providing a user-friendly web-based GUI to many of the tools developed by the Bioconductor Project, accessible locally or across a network. System installation is via booting off a CD image or by using a Debian package provided to upgrade an existing Ubuntu installation. In its current version, several microarray analysis pipelines are supported, including oligonucleotide and dual- or single-dye experiments, with post-processing by Gene Set Enrichment Analysis. BioconductorBuntu is designed to be extensible through server-side integration of further relevant Bioconductor modules as required, facilitated by its straightforward underlying Python-based infrastructure. BioconductorBuntu offers an ideal environment for the development of processing procedures to facilitate the analysis of next-generation sequencing datasets. BioconductorBuntu is available for download under a Creative Commons license, along with additional documentation and a tutorial, from http://bioinf.nuigalway.ie.

  9. Towards an entropy-based analysis of log variability

    DEFF Research Database (Denmark)

    Back, Christoffer Olling; Debois, Søren; Slaats, Tijs

    2017-01-01

    the development of hybrid miners: given a (sub-)log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through...... experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used....

  10. Towards an Entropy-based Analysis of Log Variability

    DEFF Research Database (Denmark)

    Back, Christoffer Olling; Debois, Søren; Slaats, Tijs

    2018-01-01

    the development of hybrid miners: given a log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through...... experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used....
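The two abstracts above do not specify which entropy measures the authors consider, so as an illustration only, one natural candidate is the Shannon entropy of the trace-variant distribution of an event log, sketched here in Python (the toy log is invented):

```python
from collections import Counter
from math import log2

def variant_entropy(traces):
    """Shannon entropy (in bits) of the trace-variant distribution of an event log.
    Each trace is a sequence of activity labels; identical sequences form one variant."""
    counts = Counter(tuple(t) for t in traces)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Toy event log: four traces, two distinct variants (3x "abc", 1x "acb")
event_log = [["a", "b", "c"], ["a", "b", "c"], ["a", "c", "b"], ["a", "b", "c"]]
print(round(variant_entropy(event_log), 3))  # → 0.811
```

A highly structured log (few variants, low entropy) would plausibly favor imperative mining, while a high-entropy log with many variants would favor declarative mining; the actual indicator thresholds would come from the papers' experiments, not from this sketch.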

  11. MotifNet: a web-server for network motif analysis.

    Science.gov (United States)

    Smoly, Ilan Y; Lerman, Eugene; Ziv-Ukelson, Michal; Yeger-Lotem, Esti

    2017-06-15

    Network motifs are small topological patterns that recur in a network significantly more often than expected by chance. Their identification emerged as a powerful approach for uncovering the design principles underlying complex networks. However, available tools for network motif analysis typically require download and execution of computationally intensive software on a local computer. We present MotifNet, the first open-access web-server for network motif analysis. MotifNet allows researchers to analyze integrated networks, where nodes and edges may be labeled, and to search for motifs of up to eight nodes. The output motifs are presented graphically and the user can interactively filter them by their significance, number of instances, node and edge labels, and node identities, and view their instances. MotifNet also allows the user to distinguish between motifs that are centered on specific nodes and motifs that recur in distinct parts of the network. MotifNet is freely available at http://netbio.bgu.ac.il/motifnet . The website was implemented using ReactJs and supports all major browsers. The server interface was implemented in Python with data stored on a MySQL database. estiyl@bgu.ac.il or michaluz@cs.bgu.ac.il. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
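As a rough illustration of the kind of counting MotifNet performs server-side (the brute-force search and example graph below are ours, not the server's algorithm), the classic three-node feed-forward loop can be counted as:

```python
from itertools import permutations

def count_ffl(edges):
    """Count feed-forward loops (a->b, b->c, and the shortcut a->c) in a directed graph."""
    eset = set(edges)
    nodes = {n for e in edges for n in e}
    return sum(1 for a, b, c in permutations(nodes, 3)
               if (a, b) in eset and (b, c) in eset and (a, c) in eset)

edges = [("x", "y"), ("y", "z"), ("x", "z"), ("z", "w")]
print(count_ffl(edges))  # → 1  (the x->y->z triangle with shortcut x->z)
```

Significance testing, which is what makes a recurring pattern a motif, would additionally compare this count against randomized networks with the same degree sequence; MotifNet handles that (and labeled nodes/edges, and motifs up to eight nodes) on the server.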

  12. Cased-hole log analysis and reservoir performance monitoring

    CERN Document Server

    Bateman, Richard M

    2015-01-01

This book addresses vital issues, such as the evaluation of shale gas reservoirs and their production. Topics include the cased-hole logging environment; reservoir fluid properties; flow regimes; temperature, noise, cement bond, and pulsed neutron logging; and casing inspection. Production logging charts and tables are included in the appendices. The work serves as a comprehensive reference for production engineers with upstream E&P companies, well logging service company employees, university students, and petroleum industry training professionals. This book also:
· Provides methods of conveying production logging tools along horizontal well segments as well as measurements of formation electrical resistivity through casing
· Covers new information on fluid flow characteristics in inclined pipe and provides new and improved nuclear tool measurements in cased wells
· Includes updates on cased-hole wireline formation testing

  13. PIQMIe: a web server for semi-quantitative proteomics data management and analysis.

    Science.gov (United States)

    Kuzniar, Arnold; Kanaar, Roland

    2014-07-01

We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, which aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry-based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical form, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through a RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. A performance analysis of advanced I/O architectures for PC-based network file servers

    Science.gov (United States)

    Huynh, K. D.; Khoshgoftaar, T. M.

    1994-12-01

    In the personal computing and workstation environments, more and more I/O adapters are becoming complete functional subsystems that are intelligent enough to handle I/O operations on their own without much intervention from the host processor. The IBM Subsystem Control Block (SCB) architecture has been defined to enhance the potential of these intelligent adapters by defining services and conventions that deliver command information and data to and from the adapters. In recent years, a new storage architecture, the Redundant Array of Independent Disks (RAID), has been quickly gaining acceptance in the world of computing. In this paper, we would like to discuss critical system design issues that are important to the performance of a network file server. We then present a performance analysis of the SCB architecture and disk array technology in typical network file server environments based on personal computers (PCs). One of the key issues investigated in this paper is whether a disk array can outperform a group of disks (of same type, same data capacity, and same cost) operating independently, not in parallel as in a disk array.
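The question the abstract above investigates, whether a disk array can outperform the same disks operating independently, can be illustrated with an idealized M/M/1 queueing sketch. The arrival and service rates below are invented, and striping/parity overheads are ignored, so this shows only the statistical-multiplexing effect, not the paper's measured results:

```python
def mm1_response(lam, mu):
    """Mean response time of an M/M/1 queue: W = 1 / (mu - lam); requires lam < mu."""
    assert lam < mu, "queue must be stable"
    return 1.0 / (mu - lam)

n = 4                  # number of disks
lam_per_disk = 60.0    # request arrivals per second per disk (invented)
mu = 100.0             # service completions per second per disk (invented)

# Each disk serving its own stream vs. an idealized n-wide array
# pooling all arrivals into one server that is n times faster.
independent = mm1_response(lam_per_disk, mu)
striped = mm1_response(n * lam_per_disk, n * mu)
print(independent, striped)  # → 0.025 0.00625
```

Under these idealized assumptions the pooled array responds faster at equal utilization, which is one reason a disk array can beat a group of equally priced independent disks; real arrays trade some of this gain back for striping and redundancy overheads, which is exactly what the paper's analysis quantifies.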

  15. Nuclear cross section library for oil well logging analysis

    International Nuclear Information System (INIS)

    Kodeli, I.; Kitsos, S.; Aldama, D.L.; Zefran, B.

    2003-01-01

As part of the IRTMBA (Improved Radiation Transport Modelling for Borehole Applications) Project of the EU Community's 5th Programme, a special-purpose multigroup cross section library to be used in deterministic (as well as Monte Carlo) oil well logging particle transport calculations was prepared. This library is expected to improve the prediction of the neutron and gamma spectra at the detector positions of the logging tool, and their use for the interpretation of neutron logging measurements was studied. Preparation and testing of this library are described. (author)

  16. Handbook on nuclear data for borehole logging and mineral analysis

    International Nuclear Information System (INIS)

    1993-01-01

In nuclear geophysics, an extension of the nuclear data available for reactor and shielding calculations is required. In general, the problems and the methods of attack are the same, but in nuclear geophysics the environment is earth materials, with virtually all the natural elements in the Periodic Table involved, although not at the same time. In addition, the geometrical configurations encountered in nuclear geophysics are very different from those associated with reactor and shielding design, and they can impose a different demand on the required accuracy of the nuclear data and on the dependence on the calculational approach. Borehole logging is a very good example, since an experimental investigation aimed at varying only one parameter (e.g. moisture content) whilst keeping all the others constant in a geologically complex system that effectively exhibits 'infinite geometry' for neutrons and γ rays is virtually impossible. An increasingly important area of nuclear geophysics is the on-line analysis of natural materials such as coal (e.g. C, H, O, Al, Si, Ca, Fe, Cl, S, N), the raw materials of the cement industry (S, Na, K, Al, Si, Ca, Fe, Mn, Ti, P, Mg, F, O), and mined ores of Fe, Al, Mn, Cu, Ni, Ag and Au, amongst others. Refs, figs and tabs

  17. Make Log Yield Analysis Part of Your Daily Routine

    Science.gov (United States)

    Jan Wiedenbeck; Jeff Palmer; Robert Mayer

    2006-01-01

    You haven't been conducting regular log yield studies because you don't have extra people to assign to the task. Besides, you've been around sawmills your whole life and have an innate sense of how your logs are yielding relative to the price you paid for them. Right? At the USDA Forest Service's hardwood marketing and utilization research lab in...

  18. Local regularity analysis of strata heterogeneities from sonic logs

    Directory of Open Access Journals (Sweden)

    S. Gaci

    2010-09-01

    Full Text Available Borehole logs provide geological information about the rocks crossed by the wells. Several properties of rocks can be interpreted in terms of lithology, type and quantity of the fluid filling the pores and fractures.

Here, the logs are assumed to be nonhomogeneous Brownian motions (nhBms), which are generalized fractional Brownian motions (fBms) indexed by depth-dependent Hurst parameters H(z). Three techniques, the local wavelet approach (LWA), the average-local wavelet approach (ALWA), and the Peltier Algorithm (PA), are suggested to estimate the Hurst functions (or regularity profiles) from the logs.

    First, two synthetic sonic logs with different parameters, shaped by the successive random additions (SRA) algorithm, are used to demonstrate the potential of the proposed methods. The obtained Hurst functions are close to the theoretical Hurst functions. Besides, the transitions between the modeled layers are marked by discontinuities in the Hurst values. It is also shown that PA leads to the best Hurst value estimations.

    Second, we investigate the multifractional property of sonic log data recorded at two scientific deep boreholes: the pilot hole VB and the ultra-deep main hole HB, drilled for the German Continental Deep Drilling Program (KTB). All the regularity profiles independently obtained for the logs provide a clear correlation with lithology, and from each regularity profile we derive a similar segmentation in terms of lithological units. The lithological discontinuities (strata bounds and fault contacts) are located at the local extrema of the Hurst functions. Moreover, the regularity profiles are compared with the KTB estimated porosity logs, showing a significant relation between the local extrema of the Hurst functions and the fluid-filled fractures. The Hurst function may then constitute a tool to characterize underground heterogeneities.
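The wavelet-based estimators named above (LWA, ALWA, PA) are not reproduced here, but the underlying idea, that increment variances of an fBm-type signal scale as s^(2H) with lag s, can be illustrated with a crude aggregated-variance estimator. This is a sketch of the principle only, applied to a synthetic ordinary Brownian motion rather than a sonic log:

```python
import math
import random

def hurst_variance(x, scales=(1, 2, 4, 8, 16)):
    """Crude global Hurst estimate from the scaling Var[x(t+s) - x(t)] ~ s^(2H):
    fit log(variance) against log(scale) by least squares and halve the slope."""
    log_s, log_v = [], []
    for s in scales:
        inc = [x[i + s] - x[i] for i in range(len(x) - s)]
        m = sum(inc) / len(inc)
        var = sum((v - m) ** 2 for v in inc) / len(inc)
        log_s.append(math.log(s))
        log_v.append(math.log(var))
    n = len(scales)
    sx, sy = sum(log_s), sum(log_v)
    sxx = sum(a * a for a in log_s)
    sxy = sum(a * b for a, b in zip(log_s, log_v))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope / 2.0

random.seed(0)
bm = [0.0]
for _ in range(4000):
    bm.append(bm[-1] + random.gauss(0, 1))
print(round(hurst_variance(bm), 2))  # theory: H = 0.5 for ordinary Brownian motion
```

A depth-dependent profile H(z), as in the paper, would come from applying such an estimator in a sliding window along the log rather than globally.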

  19. COGcollator: a web server for analysis of distant relationships between homologous protein families.

    Science.gov (United States)

    Dibrova, Daria V; Konovalov, Kirill A; Perekhvatov, Vadim V; Skulachev, Konstantin V; Mulkidjanian, Armen Y

    2017-11-29

    The Clusters of Orthologous Groups (COGs) of proteins systematize evolutionary related proteins into specific groups with similar functions. However, the available databases do not provide means to assess the extent of similarity between the COGs. We intended to provide a method for identification and visualization of evolutionary relationships between the COGs, as well as a respective web server. Here we introduce the COGcollator, a web tool for identification of evolutionarily related COGs and their further analysis. We demonstrate the utility of this tool by identifying the COGs that contain distant homologs of (i) the catalytic subunit of bacterial rotary membrane ATP synthases and (ii) the DNA/RNA helicases of the superfamily 1. This article was reviewed by Drs. Igor N. Berezovsky, Igor Zhulin and Yuri Wolf.

  20. Disclosure-Protected Inference with Linked Microdata Using a Remote Analysis Server

    Directory of Open Access Journals (Sweden)

    Chipperfield James O.

    2014-03-01

Full Text Available Large amounts of microdata are collected by data custodians in the form of censuses and administrative records. Often, data custodians will collect different information on the same individual. Many important questions can be answered by linking microdata collected by different data custodians. For this reason, there is very strong demand from analysts, within government, business, and universities, for linked microdata. However, many data custodians are legally obliged to ensure the risk of disclosing information about a person or organisation is acceptably low. Different authors have considered the problem of how to facilitate reliable statistical inference from analysis of linked microdata while ensuring that the risk of disclosure is acceptably low. This article considers the problem from the perspective of an Integrating Authority that, by definition, is trusted to link the microdata and to facilitate analysts’ access to the linked microdata via a remote server, which allows analysts to fit models and view the statistical output without being able to observe the underlying linked microdata. One disclosure risk that must be managed by an Integrating Authority is that one data custodian may use the microdata it supplied to the Integrating Authority and statistical output released from the remote server to disclose information about a person or organisation that was supplied by the other data custodian. This article considers analysis of only binary variables. The utility and disclosure risk of the proposed method are investigated both in a simulation and using a real example. This article shows that some popular protections against disclosure (dropping records, rounding regression coefficients or imposing restrictions on model selection) can be ineffective in the above setting.

  1. Testing and analysis of internal hardwood log defect prediction models

    Science.gov (United States)

    R. Edward Thomas

    2011-01-01

    The severity and location of internal defects determine the quality and value of lumber sawn from hardwood logs. Models have been developed to predict the size and position of internal defects based on external defect indicator measurements. These models were shown to predict approximately 80% of all internal knots based on external knot indicators. However, the size...

  2. Acoustic measurements on trees and logs: a review and analysis

    Science.gov (United States)

    Xiping Wang

    2013-01-01

    Acoustic technologies have been well established as material evaluation tools in the past several decades, and their use has become widely accepted in the forest products industry for online quality control and products grading. Recent research developments on acoustic sensing technology offer further opportunities to evaluate standing trees and logs for general wood...

  3. Empirical Analysis of Server Consolidation and Desktop Virtualization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2013-01-01

Full Text Available The transition from physical servers to virtual server infrastructure (VSI) and from desktop devices to virtual desktop infrastructure (VDI) raises the crucial problems of server consolidation, virtualization performance, virtual machine density, total cost of ownership (TCO), and return on investment (ROI). Besides, how to appropriately choose a hypervisor for the desired server/desktop virtualization is really challenging, because the trade-off between virtualization performance and cost is a hard decision to make in the cloud. This paper introduces five hypervisors to establish the virtual environment and then gives a careful assessment based on a C/P ratio derived from a composite index, consolidation ratio, virtual machine density, TCO, and ROI. As a result, even though the ESX server obtains the highest ROI and lowest TCO in server virtualization and Hyper-V R2 gains the best virtual machine management performance, both of them cost too much. Instead, the best choice is Proxmox Virtual Environment (Proxmox VE), because it not only greatly reduces the initial investment needed to own a virtual server/desktop infrastructure, but also obtains the lowest C/P ratio.
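The abstract above does not give the exact C/P formula, so the sketch below only illustrates the idea of ranking hypervisors by a cost-to-performance ratio; the cost and performance figures are invented for illustration, not the paper's measurements:

```python
def cp_ratio(cost, perf_index):
    """Hypothetical cost-to-performance ratio: lower is better."""
    return cost / perf_index

# Illustrative (cost, composite performance index) pairs -- invented numbers.
hypervisors = {
    "ESX": (12000.0, 95.0),
    "Hyper-V R2": (10000.0, 97.0),
    "Proxmox VE": (1500.0, 85.0),
}

best = min(hypervisors, key=lambda h: cp_ratio(*hypervisors[h]))
print(best)  # → Proxmox VE
```

The point of such a ratio is exactly the trade-off the paper describes: a hypervisor with slightly lower raw performance can still win once acquisition cost is folded in.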

  4. A Comprehensive Sensitivity Analysis of a Data Center Network with Server Virtualization for Business Continuity

    Directory of Open Access Journals (Sweden)

    Tuan Anh Nguyen

    2015-01-01

Full Text Available Sensitivity assessment of availability for data center networks (DCNs) is of paramount importance in the design and management of cloud computing based businesses. Previous work has presented a performance modeling and analysis of a fat-tree based DCN using queuing theory. In this paper, we present a comprehensive availability modeling and sensitivity analysis of a DCell-based DCN with server virtualization for business continuity using stochastic reward nets (SRN). We use SRN in modeling to capture complex behaviors and dependencies of the system in detail. The models take into account (i) two DCell configurations, composed of two and three physical hosts in a DCell0 unit, respectively; (ii) failure modes and corresponding recovery behaviors of hosts, switches, and VMs, and the VM live migration mechanism within and between DCell0s; and (iii) dependencies between subsystems (e.g., between a host and VMs, and between switches and VMs in the same DCell0). The constructed SRN models are analyzed in detail with regard to various metrics of interest to investigate the system's characteristics. A comprehensive sensitivity analysis of system availability is carried out in consideration of the major impacting parameters in order to observe the system's complicated behaviors and find the bottlenecks of system availability. The analysis results show the availability improvement, capability of fault tolerance, and business continuity of DCNs complying with the DCell network topology. This study provides a basis for the design and management of DCNs for business continuity.
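The stochastic reward nets used in the paper above are not reproduced here; a far simpler steady-state series/parallel availability calculation (with hypothetical MTBF/MTTR figures) illustrates how component redundancy enters such models:

```python
def availability(mtbf, mttr):
    """Steady-state availability of one component: A = MTBF / (MTBF + MTTR)."""
    return mtbf / (mtbf + mttr)

def series(*avails):
    """All components must be up (e.g., a switch feeding the hosts)."""
    out = 1.0
    for a in avails:
        out *= a
    return out

def parallel(*avails):
    """Up if at least one replica is up (e.g., redundant hosts)."""
    down = 1.0
    for a in avails:
        down *= 1.0 - a
    return 1.0 - down

# Hypothetical figures, in hours (not the paper's parameters):
host = availability(2000.0, 4.0)
switch = availability(10000.0, 2.0)

# One switch in series with two redundant hosts.
system = series(switch, parallel(host, host))
print(round(system, 6))
```

In this toy topology the switch dominates the unavailability, which is the kind of bottleneck the paper's sensitivity analysis is designed to expose in the much richer DCell model.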

  5. DIANA-microT web server v5.0: service integration into miRNA functional analysis workflows.

    Science.gov (United States)

    Paraskevopoulou, Maria D; Georgakilas, Georgios; Kostoulas, Nikos; Vlachos, Ioannis S; Vergoulis, Thanasis; Reczko, Martin; Filippidis, Christos; Dalamagas, Theodore; Hatzigeorgiou, A G

    2013-07-01

MicroRNAs (miRNAs) are small endogenous RNA molecules that regulate gene expression through mRNA degradation and/or translation repression, affecting many biological processes. The DIANA-microT web server (http://www.microrna.gr/webServer) is dedicated to miRNA target prediction/functional analysis, and it has been widely used by the scientific community since its initial launch in 2009. DIANA-microT v5.0, the new version of the microT server, has been significantly enhanced with an improved target prediction algorithm, DIANA-microT-CDS. It has been updated to incorporate miRBase version 18 and Ensembl version 69. The in silico-predicted miRNA-gene interactions in Homo sapiens, Mus musculus, Drosophila melanogaster and Caenorhabditis elegans exceed 11 million in total. The web server was completely redesigned to host a series of sophisticated workflows, which can be used directly from the online web interface, enabling users without the necessary bioinformatics infrastructure to perform advanced multi-step functional miRNA analyses. For instance, one available pipeline performs miRNA target prediction using different thresholds and meta-analysis statistics, followed by pathway enrichment analysis. DIANA-microT web server v5.0 also supports a complete integration with the Taverna Workflow Management System (WMS), using the in-house developed DIANA-Taverna Plug-in. This plug-in provides ready-to-use modules for miRNA target prediction and functional analysis, which can be used to form advanced high-throughput analysis pipelines.

  6. Basic Static Code Analysis Untuk Mendeteksi Backdoor Shell Pada Web Server

    Directory of Open Access Journals (Sweden)

    Nelly Indriani Widiastuti

    2017-05-01

Full Text Available Accessing a computer system without permission is a crime committed by entering or infiltrating a computer network system without the knowledge of the system's owner. Such attacks aim to spy on or steal important and confidential information. In practice, attackers plant backdoor shell files in locations that are difficult for the system owner to find. Some existing tools are terminal-based and search for files by previously registered names; as a result, when a new type of backdoor shell infects the system, these tools cannot detect its presence. For that reason, in this study backdoor shells on a web server are detected using basic static code analysis. Files are processed in two main stages: string matching and taint analysis. During taint analysis, the system estimates the probability that each signature indicates a backdoor, to compensate for an incomplete backdoor dictionary. Tests on 3964 files achieved an accuracy level 75% higher than that of the php shell detector application.
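The two-stage approach described above (string matching, then a probabilistic score that tolerates an incomplete signature dictionary) can be sketched as follows; the signatures, weights, and independence assumption are illustrative, not the paper's actual dictionary or scoring model:

```python
import re

# Illustrative signature dictionary: regex -> assumed probability that a match
# indicates a backdoor. A real detector would use a far larger, curated list.
SIGNATURES = {
    r"\beval\s*\(\s*base64_decode": 0.9,
    r"\bshell_exec\s*\(": 0.6,
    r"\$_(GET|POST|REQUEST)\s*\[": 0.3,
}

def backdoor_score(source):
    """Combine per-signature probabilities, naively assuming independence:
    P(backdoor) = 1 - prod(1 - p_i) over the signatures that match."""
    p_clean = 1.0
    for pattern, p in SIGNATURES.items():
        if re.search(pattern, source):
            p_clean *= 1.0 - p
    return 1.0 - p_clean

php = '<?php eval(base64_decode($_POST["x"])); ?>'
print(round(backdoor_score(php), 3))  # → 0.93
```

Scoring rather than exact name matching is what lets such a detector flag a previously unseen shell file, which is the shortcoming of name-based tools that the study addresses.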

  7. The Analysis of a Link Between a Remote Local Area Network and Its Server Resources

    National Research Council Canada - National Science Library

    Beaver, Theresa

    2005-01-01

    ... paramount. One way to provide this support is to create a Local Area Network (LAN) in which the workstations are positioned at the deployed location while the servers are maintained at a Main Operating Base (MOB...

  8. Analysis of Java Distributed Architectures in Designing and Implementing a Client/Server Database System

    National Research Council Canada - National Science Library

    Akin, Ramis

    1998-01-01

    .... Information is scattered throughout organizations and must be easily accessible. A new solution is needed for effective and efficient management of data in today's distributed client/server environment...

  9. Deep Recurrent Model for Server Load and Performance Prediction in Data Center

    Directory of Open Access Journals (Sweden)

    Zheng Huang

    2017-01-01

    Full Text Available Recurrent neural networks (RNN) have been widely applied to many sequential tagging tasks such as natural language processing (NLP) and time series analysis, and it has been proved that RNNs work well in those areas. In this paper, we propose using an RNN with long short-term memory (LSTM) units for server load and performance prediction. Classical methods for performance prediction focus on building a relation between performance and the time domain, which requires many unrealistic hypotheses. Our model is built on events (user requests), which are the root cause of server performance. We predict the performance of the servers using RNN-LSTM by analyzing the server logs in the data center, which contain users' access sequences. Previous work on workload prediction could not generate a detailed simulated workload, which is useful for testing the working condition of servers. Our method provides a new way to reproduce user request sequences to solve this problem using RNN-LSTM. Experimental results show that our models perform well in generating load and predicting performance on a data set logged by an online service. We ran experiments with the nginx web server and the mysql database server, and our methods can be easily applied to other servers in the data center.
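The model in this record consumes user request sequences taken from server access logs. As a minimal illustration of that preprocessing step (not the authors' pipeline; the log lines are hypothetical), the sketch below aggregates nginx-style access-log entries into a per-minute request-count series of the kind a sequence model such as an RNN-LSTM could consume:

```python
import re
from collections import Counter
from datetime import datetime

# Hypothetical nginx-style access-log lines; the timestamp format is the
# standard %d/%b/%Y:%H:%M:%S %z used by nginx/Apache combined logs.
LOG_LINES = [
    '10.0.0.1 - - [12/Mar/2017:10:01:02 +0000] "GET / HTTP/1.1" 200 512',
    '10.0.0.2 - - [12/Mar/2017:10:01:45 +0000] "GET /db HTTP/1.1" 200 128',
    '10.0.0.1 - - [12/Mar/2017:10:02:10 +0000] "POST /login HTTP/1.1" 302 0',
]

TS_RE = re.compile(r'\[([^\]]+)\]')

def requests_per_minute(lines):
    """Aggregate raw access-log lines into a per-minute request-count series."""
    counts = Counter()
    for line in lines:
        m = TS_RE.search(line)
        if not m:
            continue
        ts = datetime.strptime(m.group(1), '%d/%b/%Y:%H:%M:%S %z')
        counts[ts.replace(second=0)] += 1
    # Return counts ordered by time, ready to feed a sequence model.
    return [counts[t] for t in sorted(counts)]

print(requests_per_minute(LOG_LINES))  # [2, 1]
```

The resulting integer series is exactly the sort of "load sequence" a recurrent model would be trained to continue.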

  10. Log analysis in the shallow oil sands of the San Joaquin Valley, California

    International Nuclear Information System (INIS)

    Vohs, J.B.

    1976-01-01

    Many fields in the San Joaquin Valley of California produce oil from a depth of 2,500 ft or less. During the period of primary production in these fields, evaluation of potential pay intervals from logs was restricted to examination of ES logs and correlation. With the introduction of secondary and tertiary recovery techniques the need for more and better answers, more quickly available, became apparent. However, several log-analysis problems had to be resolved. Formation evaluation using well logs was complicated by the shaliness of the sand intervals, the low and variable salinity of the formation waters, and the presence of low-pressure-gas (depleted) zones in many of the shallow sands. Solutions to these problems have required more modern logging programs and interpretation techniques. Logs available for the evaluation of these sands are the dual induction-laterolog, the compensated formation density log, the compensated neutron log, and the microlaterolog or proximity log. With this suite of logs it is possible to determine the shale content, porosity, saturation in the flushed zone, and water saturation of the sand, and to locate the low-pressure-gas sands and depleted zones. In cases where freshwater and oil are interlayered, it is possible to tell which sands contain oil and which contain only water. Because a quick interpretation is required, wellsite techniques are called for. These will be described

  11. Abdominal aortic aneurysms: virtual imaging and analysis through a remote web server

    International Nuclear Information System (INIS)

    Neri, Emanuele; Bargellini, Irene; Vignali, Claudio; Bartolozzi, Carlo; Rieger, Michael; Jaschke, Werner; Giachetti, Andrea; Tuveri, Massimiliano

    2005-01-01

    The study describes the application of web-based software in the planning of the endovascular treatment of abdominal aortic aneurysms (AAA). The software was developed in the framework of a 2-year research project called Aneurysm QUAntification Through an Internet Collaborative System (AQUATICS); it allows users to remotely manage Virtual Reality Modeling Language (VRML) models of the abdominal aorta, derived from multirow computed tomography angiography (CTA) data sets, and to obtain measurements of diameters, angles and centerline lengths. To test the reliability of the measurements, two radiologists performed a detailed analysis of multiple 3D models generated from a synthetic phantom mimicking an AAA. The system was tested on 30 patients with AAA; CTA data sets were mailed, and the time required for segmentation and measurement was collected for each case. The Bland-Altman plot analysis showed that the mean intra- and inter-observer differences in measurements on phantoms were clinically acceptable. The mean time required for segmentation was 1 h (range 45-120 min). The mean time required for measurements on the web was 7 min (range 4-11 min). The AQUATICS web server may provide a rapid, standardized and accurate tool for the evaluation of AAA prior to endovascular treatment. (orig.)

  12. Analysis of artificial fireplace logs by high temperature gas chromatography.

    Science.gov (United States)

    Kuk, Raymond J

    2002-11-01

    High temperature gas chromatography is used to analyze the wax of artificial fireplace logs (firelogs). Firelogs from several different manufacturers are studied and compared. This study shows that the wax within a single firelog is homogeneous and that the wax is also uniform throughout a multi-firelog package. Different brands are shown to have different wax compositions. Firelogs of the same brand, but purchased in different locations, also have different wax compositions. With this information it may be possible to associate an unknown firelog sample to a known sample, but a definitive statement of the origin cannot be made.

  13. Current trends in nuclear borehole logging techniques for elemental analysis

    International Nuclear Information System (INIS)

    1988-06-01

    This report is the result of a consultants' meeting organized by the IAEA and held in Ottawa, Canada, 2-6 November 1987 in order to assess the present technical status of nuclear borehole logging techniques, to find out the well established applications and the development trends. It contains a summary report giving a comprehensive overview of the techniques and applications and a collection of research papers describing work done in industrial institutes. A separate abstract was prepared for each of these 9 papers. Refs, figs and tabs

  14. Communication Base Station Log Analysis Based on Hierarchical Clustering

    Directory of Open Access Journals (Sweden)

    Zhang Shao-Hua

    2017-01-01

    Full Text Available Communication base stations generate massive amounts of data every day, and these base station logs hold important value for the mining of business circles. This paper uses data mining technology and a hierarchical clustering algorithm to group base stations by the scope of their business circle, based on the data recorded at each station. By analyzing the data of different business circles through feature extraction and comparing the characteristics of the different business circle categories, operators can choose a suitable area for commercial marketing.
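The abstract names hierarchical clustering but gives no algorithmic detail. A minimal single-linkage sketch on hypothetical 2-D base-station coordinates illustrates the idea of merging the closest groups until a target number of clusters remains:

```python
# Minimal single-linkage hierarchical (agglomerative) clustering on 2-D
# base-station coordinates (hypothetical data); merges the closest pair of
# clusters repeatedly until `k` clusters remain.
from math import dist

def single_linkage(points, k):
    clusters = [[p] for p in points]
    while len(clusters) > k:
        # Find the pair of clusters with the smallest inter-point distance.
        best = (float("inf"), 0, 1)
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(a, b) for a in clusters[i] for b in clusters[j])
                if d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

stations = [(0, 0), (0, 1), (5, 5), (5, 6), (20, 20)]
groups = single_linkage(stations, 3)
print(sorted(len(c) for c in groups))  # [1, 2, 2]
```

Real base-station features would of course be richer than coordinates, but the merge loop is the same.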

  15. QlikView Server and Publisher

    CERN Document Server

    Redmond, Stephen

    2014-01-01

    This is a comprehensive guide with a step-by-step approach that enables you to host and manage servers using QlikView Server and QlikView Publisher. If you are a server administrator wanting to learn how to deploy QlikView Server for server management, analysis and testing, and QlikView Publisher for publishing business content, then this is the perfect book for you. No prior experience with QlikView is expected.

  16. Design and Analysis of an Enhanced Patient-Server Mutual Authentication Protocol for Telecare Medical Information System.

    Science.gov (United States)

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Obaidat, Mohammad S

    2015-11-01

    In order to access a remote medical server, patients generally use a smart card to log in to the server. It has been observed that most user (patient) authentication protocols suffer from the stolen smart card attack, meaning that an attacker can mount several common attacks after extracting the smart card information. Recently, Lu et al. proposed a session key agreement protocol between the patient and the remote medical server and claimed that the protocol is secure against the relevant security attacks. However, this paper presents several security attacks on Lu et al.'s protocol, namely an identity trace attack, a new smart card issue attack, a patient impersonation attack and a medical server impersonation attack. In order to fix the mentioned security pitfalls, including the stolen smart card attack, this paper proposes an efficient remote mutual authentication protocol using a smart card. We have simulated the proposed protocol using the widely accepted AVISPA simulation tool, whose results make certain that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. Moreover, a rigorous security analysis proves that the proposed protocol provides strong protection against the relevant security attacks, including the stolen smart card attack. We compare the proposed scheme with several related schemes in terms of computation cost and communication cost, as well as security functionalities. It has been observed that the proposed scheme is comparatively better than the related existing schemes.
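The paper's protocol itself is not reproduced in the abstract. As a generic illustration of the mutual-authentication idea only (this is not Amin et al.'s scheme, and the key is hypothetical), the sketch below shows a shared-secret HMAC challenge-response in which each party proves knowledge of the secret without sending it, and each verifies the other:

```python
# Toy challenge-response mutual authentication using HMAC-SHA256.
# NOT the protocol proposed in the paper -- just the mutuality idea:
# the server challenges the patient, then the patient challenges the server.
import hmac, hashlib, os

SECRET = b"shared-patient-server-key"  # hypothetical pre-shared secret

def prove(secret, challenge):
    return hmac.new(secret, challenge, hashlib.sha256).digest()

# Server challenges patient ...
server_nonce = os.urandom(16)
patient_response = prove(SECRET, server_nonce)
server_ok = hmac.compare_digest(patient_response, prove(SECRET, server_nonce))

# ... and the patient challenges the server in turn (mutuality).
patient_nonce = os.urandom(16)
server_response = prove(SECRET, patient_nonce)
patient_ok = hmac.compare_digest(server_response, prove(SECRET, patient_nonce))

print(server_ok and patient_ok)  # True
```

Fresh random nonces are what defeat simple replay; a real protocol additionally needs identity binding, key agreement, and resistance to card extraction, which is exactly what the paper analyzes.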

  17. DeepBlue epigenomic data server: programmatic data retrieval and analysis of epigenome region sets.

    Science.gov (United States)

    Albrecht, Felipe; List, Markus; Bock, Christoph; Lengauer, Thomas

    2016-07-08

    Large amounts of epigenomic data are generated under the umbrella of the International Human Epigenome Consortium, which aims to establish 1000 reference epigenomes within the next few years. These data have the potential to unravel the complexity of epigenomic regulation. However, their effective use is hindered by the lack of flexible and easy-to-use methods for data retrieval. Extracting region sets of interest is a cumbersome task that involves several manual steps: identifying the relevant experiments, downloading the corresponding data files and filtering the region sets of interest. Here we present the DeepBlue Epigenomic Data Server, which streamlines epigenomic data analysis as well as software development. DeepBlue provides a comprehensive programmatic interface for finding, selecting, filtering, summarizing and downloading region sets. It contains data from four major epigenome projects, namely ENCODE, ROADMAP, BLUEPRINT and DEEP. DeepBlue comes with a user manual, examples and a well-documented application programming interface (API). The latter is accessed via the XML-RPC protocol supported by many programming languages. To demonstrate usage of the API and to enable convenient data retrieval for non-programmers, we offer an optional web interface. DeepBlue can be openly accessed at http://deepblue.mpi-inf.mpg.de. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  18. FIDEA: a server for the functional interpretation of differential expression analysis.

    KAUST Repository

    D'Andrea, Daniel

    2013-06-10

    The results of differential expression analyses provide scientists with hundreds to thousands of differentially expressed genes that need to be interpreted in light of the biology of the specific system under study. This requires mapping the genes to functional classifications that can be, for example, the KEGG pathways or InterPro families they belong to, their GO Molecular Function, Biological Process or Cellular Component. A statistically significant overrepresentation of one or more category terms in the set of differentially expressed genes is an essential step for the interpretation of the biological significance of the results. Ideally, the analysis should be performed by scientists who are well acquainted with the biological problem, as they have a wealth of knowledge about the system and can, more easily than a bioinformatician, discover less obvious and, therefore, more interesting relationships. To allow experimentalists to explore their data in an easy and at the same time exhaustive fashion within a single tool and to test their hypothesis quickly and effortlessly, we developed FIDEA. The FIDEA server is located at http://www.biocomputing.it/fidea; it is free and open to all users, and there is no login requirement.
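The abstract does not name FIDEA's exact statistic, so the sketch below assumes the common choice for testing "statistically significant overrepresentation": a one-sided hypergeometric test, giving the p-value of seeing at least k genes of a category among n differentially expressed genes when the background holds N genes, K of which carry the category term:

```python
# One-sided hypergeometric (Fisher-style) overrepresentation p-value.
# Assumption: this is the standard enrichment statistic, not necessarily
# the one FIDEA implements.
from math import comb

def hypergeom_pvalue(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# 1000 background genes, 50 annotated with a pathway; 20 DE genes, 8 of
# which fall in the pathway (expected by chance: only 1).
p = hypergeom_pvalue(1000, 50, 20, 8)
print(f"p = {p:.3e}")  # well below 0.05 -> the pathway is overrepresented
```

Tools then correct such p-values for the many categories tested (e.g. Benjamini-Hochberg), which this sketch omits.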

  19. myPhyloDB: a local web server for the storage and analysis of metagenomic data.

    Science.gov (United States)

    Manter, Daniel K; Korsa, Matthew; Tebbe, Caleb; Delgado, Jorge A

    2016-01-01

    myPhyloDB v.1.1.2 is a user-friendly personal database with a browser interface designed to facilitate the storage, processing, analysis, and distribution of microbial community populations (e.g. 16S metagenomics data). MyPhyloDB archives raw sequencing files and allows for easy selection of any combination of projects/samples from all available data in the database. The data processing capabilities of myPhyloDB are also flexible enough to allow the upload and storage of pre-processed data, or use of the built-in Mothur pipeline to automate the processing of raw sequencing data. myPhyloDB provides several analytical (e.g. analysis of covariance, t-tests, linear regression, differential abundance (DESeq2), and principal coordinates analysis (PCoA)) and normalization (rarefaction, DESeq2, and proportion) tools for the comparative analysis of taxonomic abundance, species richness and species diversity for projects of various types (e.g. human-associated, human gut microbiome, air, soil, and water) at any taxonomic level(s) desired. Finally, since myPhyloDB is a local web server, users can quickly distribute data between colleagues and end-users by simply granting others access to their personal myPhyloDB database. myPhyloDB is available at http://www.ars.usda.gov/services/software/download.htm?softwareid=472, and more information along with tutorials can be found on our website, http://www.myphylodb.org. Database URL: http://www.myphylodb.org. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the United States.
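Two of the community metrics mentioned, species richness and (Shannon) diversity, can be sketched on hypothetical OTU counts; this illustrates the standard definitions, not myPhyloDB's implementation:

```python
# Species richness = number of taxa observed; Shannon diversity
# H' = -sum p_i * ln(p_i) over taxa with nonzero counts.
from math import log

def richness(counts):
    return sum(1 for c in counts if c > 0)

def shannon(counts):
    n = sum(counts)
    return -sum((c / n) * log(c / n) for c in counts if c > 0)

sample = [10, 10, 10, 0]          # hypothetical OTU counts for one sample
print(richness(sample))           # 3
print(round(shannon(sample), 4))  # 1.0986 (= ln 3, maximal for 3 even taxa)
```

Evenly distributed counts maximize H' for a given richness, which is why diversity and richness are reported together.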

  20. CPU Server

    CERN Multimedia

    The CERN computer centre has hundreds of racks like these. They are over a million times more powerful than our first computer in the 1960's. This tray is a 'dual-core' server. This means it effectively has two CPUs in it (eg. two of your home computers minimised to fit into a single box). Also note the copper cooling fins, to help dissipate the heat.

  1. Analysis of the Macroscopic Behavior of Server Systems in the Internet Environment

    Directory of Open Access Journals (Sweden)

    Yusuke Tanimura

    2017-11-01

    Full Text Available Elasticity is one of the key features of cloud-hosted services built on virtualization technology. To utilize the elasticity of cloud environments, administrators should accurately capture the operational status of server systems, which changes constantly according to irregularly arriving service requests. However, it is difficult to detect in advance, and thus avoid, that operating services are falling into an undesirable state. In this paper, we focus on the management of server systems, including cloud systems, and propose a new method for detecting signs of undesirable scenarios before the system becomes overloaded as a result of various causes. In this method, a measure that utilizes the fluctuation of the macroscopic operational state observed in the server system is introduced. The proposed measure has the property of increasing drastically before the server system enters an undesirable state. Using the proposed measure, we realize a function that detects when the server system is falling into an overload scenario, and we demonstrate its effectiveness through experiments.
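The abstract does not specify the fluctuation measure, so the sketch below uses a stand-in with the same qualitative property: the coefficient of variation of response times over a sliding window, which grows as the operational state becomes erratic ahead of overload:

```python
# Sliding-window fluctuation measure (a stand-in, not the paper's metric):
# coefficient of variation = std / mean over the last `window` samples.
from statistics import pstdev, mean

def fluctuation(series, window=5):
    out = []
    for i in range(window, len(series) + 1):
        w = series[i - window:i]
        out.append(pstdev(w) / mean(w))
    return out

calm = [100, 101, 99, 100, 102, 100, 101]    # stable latencies (ms)
stressed = [100, 120, 90, 200, 60, 400, 30]  # wildly fluctuating latencies
print(max(fluctuation(calm)) < max(fluctuation(stressed)))  # True
```

An alerting function would then fire when the measure crosses a threshold calibrated on normal operation, before mean latency itself degrades.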

  2. Query Log Analysis of an Electronic Health Record Search Engine

    Science.gov (United States)

    Yang, Lei; Mei, Qiaozhu; Zheng, Kai; Hanauer, David A.

    2011-01-01

    We analyzed a longitudinal collection of query logs from a full-text search engine designed to facilitate information retrieval in electronic health records (EHR). The collection, 202,905 queries and 35,928 user sessions recorded over a course of 4 years, represents the information-seeking behavior of 533 medical professionals, including frontline practitioners, coding personnel, patient safety officers, and biomedical researchers for patient data stored in EHR systems. In this paper, we present descriptive statistics of the queries, a categorization of information needs manifested through the queries, as well as temporal patterns of the users’ information-seeking behavior. The results suggest that information needs in the medical domain are substantially more sophisticated than those that general-purpose web search engines need to accommodate. Therefore, we envision that there exists a significant challenge, along with significant opportunities, to provide intelligent query recommendations to facilitate information retrieval in EHR. PMID:22195150

  3. Comparing two digital consumer health television services using transaction log analysis

    Directory of Open Access Journals (Sweden)

    Paul Huntington

    2002-09-01

    Full Text Available Use is an important characteristic in determining the success or otherwise of any digital information service, and in making comparisons between services. The source of most use data is the server logs that record user activity on a real-time and continuous basis. There is much demand from sponsors, channel owners and marketing departments for this information. The authors evaluate the performance of use metrics, including reach, in order to make comparisons between two services and discuss the methodological problems associated with making such comparisons. The two services were: Living Health, managed by Flextech and distributed by Telewest, and NHS Direct Digital, managed by Communicopia Data and distributed by Kingston Interactive Television. The data were collected over the period August 2001 to February 2002. During this period, the two sites were visited by approximately 20 000 people who recorded more than three-quarters of a million page views.

  4. Logistics analysis to Improve Deployability (LOG-AID): Field Experiment/Results

    National Research Council Canada - National Science Library

    Evers, Kenneth

    2000-01-01

    .... Under sponsorship of the Air Force Research Laboratory Logistics Readiness Branch (AFRL/HESR), the Synergy team analyzed the current wing-level deployment process as part of the Logistics Analysis to Improve Deployability (LOG-AID) program...

  5. Analysis of Multiserver Queueing System with Opportunistic Occupation and Reservation of Servers

    Directory of Open Access Journals (Sweden)

    Bin Sun

    2014-01-01

    Full Text Available We consider a multiserver queueing system with two input flows. Type-1 customers have preemptive priority and are lost during arrival only if all servers are occupied by type-1 customers. If all servers are occupied, but some provide service to type-2 customers, service of type-2 customer is terminated and type-1 customer occupies the server. If the number of busy servers is less than the threshold M during type-2 customer arrival epoch, this customer is accepted. Otherwise, it is lost or becomes a retrial customer. It will retry to obtain service. Type-2 customer whose service is terminated is lost or moves to the pool of retrial customers. The service time is exponentially distributed with the rate dependent on the customer’s type. Such queueing system is suitable for modeling cognitive radio. Type-1 customers are interpreted as requests generated by primary users. Type-2 customers are generated by secondary or cognitive users. The problem of optimal choice of the threshold M is the subject of this paper. Behavior of the system is described by the multidimensional Markov chain. Its generator, ergodicity condition, and stationary distribution are given. The system performance measures are obtained. The numerical results show the effectiveness of considered admission control.
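As a much-simplified special case of the model above (ignoring priorities, retrials and the threshold M), type-1 traffic alone sees an M/M/c/c loss system, whose blocking probability follows the classical Erlang-B recursion; the full model requires the multidimensional Markov chain described in the paper:

```python
# Erlang-B recursion for an M/M/c/c loss system:
#   B(0) = 1,  B(c) = a*B(c-1) / (c + a*B(c-1)),  a = lambda/mu.
# This gives the probability that an arriving customer finds all c servers
# busy and is lost.
def erlang_b(servers, offered_load):
    b = 1.0
    for c in range(1, servers + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# Blocking for 5 servers as the offered load grows -- monotone increasing.
for a in (1.0, 3.0, 5.0):
    print(round(erlang_b(5, a), 4))
```

Choosing the admission threshold M in the paper trades exactly this kind of type-1 blocking against type-2 loss and retrial load.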

  6. Technical clearance for mental disorders among the servers in the city hall of Manaus: a preliminary analysis

    Directory of Open Access Journals (Sweden)

    Vívian Silva Lima Marangoni

    2017-01-01

    Full Text Available This study aimed to investigate the occurrence of work absenteeism due to mental disorders among the civil servants of the Prefeitura Municipal de Manaus - PMM registered by the Junta Médica do Município in the period from January to December 2011. The documentary analysis revealed highly significant data that converge with numerous studies indicating the nature of work as a risk factor for mental illness. The data show that mental disorders have been a major cause of absence from work activities, especially among servants in the areas of health and education, representing 10% of the total leave permits issued in 2011. These findings may support future studies that focus on the health promotion and quality of life of these professionals, which remains a major challenge for policy makers

  7. Windows Terminal Servers Orchestration

    Science.gov (United States)

    Bukowiec, Sebastian; Gaspar, Ricardo; Smith, Tim

    2017-10-01

    Windows Terminal Servers provide application gateways for various parts of the CERN accelerator complex, used by hundreds of CERN users every day. The combination of new tools such as Puppet, HAProxy and the Microsoft System Center suite enables automation of provisioning workflows to provide a terminal server infrastructure that can scale up and down in an automated manner. The orchestration not only reduces the time and effort necessary to deploy new instances, but also facilitates operations such as patching, analysis and recreation of compromised nodes, as well as catering for workload peaks.

  8. The RNAsnp web server

    DEFF Research Database (Denmark)

    Radhakrishnan, Sabarinathan; Tafer, Hakim; Seemann, Ernst Stefan

    2013-01-01

    , are derived from extensive pre-computed tables of distributions of substitution effects as a function of gene length and GC content. Here, we present a web service that not only provides an interface for RNAsnp but also features a graphical output representation. In addition, the web server is connected...... to a local mirror of the UCSC genome browser database that enables the users to select the genomic sequences for analysis and visualize the results directly in the UCSC genome browser. The RNAsnp web server is freely available at: http://rth.dk/resources/rnasnp/....

  9. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that is saved as log files. A data-mining approach is helpful for analyzing them. This article presents the steps necessary to create an ‘analyzing instrument’ based on the open source software Waikato Environment for Knowledge Analysis (Weka) [1]. As an example, a system log file created by a Windows-based operating system is used as the input file.
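Weka expects datasets in its ARFF format, so a practical first step when feeding it log data is converting parsed log records into ARFF; the relation and field names below are hypothetical:

```python
# Build a minimal ARFF document (the text format Weka reads) from parsed
# log records. The attribute names and values are illustrative only.
def to_arff(relation, attributes, rows):
    lines = [f"@RELATION {relation}", ""]
    for name, typ in attributes:
        lines.append(f"@ATTRIBUTE {name} {typ}")
    lines += ["", "@DATA"]
    for row in rows:
        lines.append(",".join(str(v) for v in row))
    return "\n".join(lines)

# Hypothetical per-level event counts extracted from a Windows system log.
events = [("Error", 13), ("Warning", 2), ("Information", 40)]
arff = to_arff("syslog",
               [("level", "{Error,Warning,Information}"),
                ("count", "NUMERIC")],
               events)
print(arff.splitlines()[0])  # @RELATION syslog
```

The resulting text can be saved with a `.arff` extension and opened directly in the Weka Explorer.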

  10. Web application for recording learners’ mouse trajectories and retrieving their study logs for data analysis

    Directory of Open Access Journals (Sweden)

    Yoshinori Miyazaki

    2012-03-01

    Full Text Available With the accelerated implementation of e-learning systems in educational institutions, it has become possible to record learners’ study logs in recent years. It must be admitted that little research has been conducted upon the analysis of the study logs that are obtained. In addition, there is no software that traces the mouse movements of learners during their learning processes, which the authors believe would enable teachers to better understand their students’ behaviors. The objective of this study is to develop a Web application that records students’ study logs, including their mouse trajectories, and to devise an IR tool that can summarize such diversified data. The results of an experiment are also scrutinized to provide an analysis of the relationship between learners’ activities and their study logs.

  11. ANALYSIS OF MULTI-SERVER QUEUEING SYSTEM WITH PREEMPTIVE PRIORITY AND REPEATED CALLS

    Directory of Open Access Journals (Sweden)

    S. A. Dudin

    2015-01-01

    Full Text Available A multi-server retrial queueing system with no buffer and two types of customers is analyzed as a model of a cognitive radio system. Customers of type 1 have preemptive priority. Customers of both types arrive according to Markovian Arrival Processes. Service times have an exponential distribution with a parameter depending on the customer type. Type 2 customers are admitted for service only if the number of busy servers is less than a predefined threshold. Rejected type 2 customers retry for service. The existence condition for the stationary mode of system operation is derived, and formulas for computing the key performance measures of the system are presented.

  12. APPLICATION OF THE SINGLE SERVER QUEUING SYSTEM ANALYSIS IN WOOD PRODUCTS INDUSTRY

    Directory of Open Access Journals (Sweden)

    Arif GÜRAY

    2001-01-01

    Full Text Available The aim of this study was to simulate the single-server queuing system (a CNC machine) at a door-joinery facility. We simulated the system both by hand and with a computer program written in the SIMAN language. From the results obtained, we aimed to provide some suggestions to the manager, since the simulation reproduced the real system under certain hypotheses. The simulation indicates that the system will develop long queues in the future.
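The SIMAN model itself is not shown in the abstract. A comparable single-server (here M/M/1, an assumption) discrete-event simulation takes only a few lines, and it reproduces the qualitative finding that queues grow long as utilization approaches one:

```python
# Minimal M/M/1 discrete-event simulation of a single-server queue
# (a stand-in for the SIMAN model; rates and seed are hypothetical).
import random

def mm1_mean_wait(arrival_rate, service_rate, n_jobs, seed=42):
    rng = random.Random(seed)
    t = 0.0            # arrival clock
    server_free = 0.0  # time at which the server next becomes idle
    total_wait = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(arrival_rate)       # next arrival
        start = max(t, server_free)              # wait if server is busy
        total_wait += start - t
        server_free = start + rng.expovariate(service_rate)
    return total_wait / n_jobs

# A server at 95% utilization queues far longer than one at 50%.
print(mm1_mean_wait(0.95, 1.0, 20000) > mm1_mean_wait(0.5, 1.0, 20000))  # True
```

Queueing theory predicts the same: the mean wait ρ/(μ−λ) blows up as the arrival rate λ approaches the service rate μ.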

  13. The analysis of MS Project Server platform for managing of human resources in multiproject environment

    OpenAIRE

    Kadunc, Boštjan

    2009-01-01

    Effective project management requires qualified project managers, who must have complete control over their projects, so they can perform fluently. For easier and more effective project management, we can use various software solutions. In my graduation thesis I focused on working with resources - detecting and resolving their over-allocation. For their regulation I have used Microsoft Office Project Professional 2007 in connection with Microsoft Office Project Server 2007. My goal wa...

  14. The PARIGA server for real time filtering and analysis of reciprocal BLAST results.

    Directory of Open Access Journals (Sweden)

    Massimiliano Orsini

    Full Text Available BLAST-based similarity searches are commonly used in several applications involving both nucleotide and protein sequences. These applications span from simple tasks, such as mapping sequences onto a database, to more complex procedures such as clustering or annotation processes. When the amount of analysed data increases, manual inspection of BLAST results becomes a tedious procedure. Tools for parsing or filtering BLAST results for different purposes are then required. We describe here PARIGA (http://resources.bioinformatica.crs4.it/pariga/), a server that enables users to perform all-against-all BLAST searches on two sets of sequences selected by the user. Moreover, since it stores the two BLAST outputs in a database of python-serialized objects, results can be filtered according to several parameters in real time, without re-running the process and avoiding additional programming efforts. Results can be interrogated by the user using logical operations, for example to retrieve cases where two queries match the same targets, where sequences from the two datasets are reciprocal best hits, or where a query matches a target in multiple regions. The PARIGA web server is designed to be a helpful tool for managing the results of sequence similarity searches. The design and implementation of the server renders all operations very fast and easy to use.
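One of the filters described, reciprocal best hits, can be sketched given best-hit maps parsed from the two all-against-all BLAST outputs; the gene identifiers below are hypothetical:

```python
# Reciprocal-best-hit filter: keep (a, b) pairs where a's best hit in set B
# is b AND b's best hit in set A is a. Inputs are best-hit dictionaries as
# one might parse them from two BLAST tabular outputs.
def reciprocal_best_hits(best_a_to_b, best_b_to_a):
    return {(a, b) for a, b in best_a_to_b.items()
            if best_b_to_a.get(b) == a}

best_a_to_b = {"geneA1": "geneB7", "geneA2": "geneB3", "geneA3": "geneB3"}
best_b_to_a = {"geneB7": "geneA1", "geneB3": "geneA3"}

print(sorted(reciprocal_best_hits(best_a_to_b, best_b_to_a)))
# [('geneA1', 'geneB7'), ('geneA3', 'geneB3')]
```

Note geneA2 is dropped: its best hit geneB3 prefers geneA3, so the relationship is not reciprocal, which is exactly why RBH is a common orthology heuristic.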

  15. PRince: a web server for structural and physicochemical analysis of protein-RNA interface.

    Science.gov (United States)

    Barik, Amita; Mishra, Abhishek; Bahadur, Ranjit Prasad

    2012-07-01

    We have developed a web server, PRince, which analyzes the structural features and physicochemical properties of the protein-RNA interface. Users need to submit a PDB file containing the atomic coordinates of both the protein and the RNA molecules in complex form (in '.pdb' format). They should also mention the chain identifiers of interacting protein and RNA molecules. The size of the protein-RNA interface is estimated by measuring the solvent accessible surface area buried in contact. For a given protein-RNA complex, PRince calculates structural, physicochemical and hydration properties of the interacting surfaces. All these parameters generated by the server are presented in a tabular format. The interacting surfaces can also be visualized with software plug-in like Jmol. In addition, the output files containing the list of the atomic coordinates of the interacting protein, RNA and interface water molecules can be downloaded. The parameters generated by PRince are novel, and users can correlate them with the experimentally determined biophysical and biochemical parameters for better understanding the specificity of the protein-RNA recognition process. This server will be continuously upgraded to include more parameters. PRince is publicly accessible and free for use. Available at http://www.facweb.iitkgp.ernet.in/~rbahadur/prince/home.html.

  16. A polylogarithmic competitive algorithm for the k-server problem

    NARCIS (Netherlands)

    Bansal, N.; Buchbinder, N.; Madry, A.; Naor, J.

    2011-01-01

    We give the first polylogarithmic-competitive randomized online algorithm for the $k$-server problem on an arbitrary finite metric space. In particular, our algorithm achieves a competitive ratio of O(log^3 n log^2 k log log n) for any metric space on n points. Our algorithm improves upon the

  17. Lithology and mineralogy recognition from geochemical logging tool data using multivariate statistical analysis.

    Science.gov (United States)

    Konaté, Ahmed Amara; Ma, Huolin; Pan, Heping; Qin, Zhen; Ahmed, Hafizullah Abba; Dembele, N'dji Dit Jacques

    2017-10-01

    The availability of a deep well that penetrates deep into the Ultra High Pressure (UHP) metamorphic rocks is unusual and consequently offers a unique chance to study the metamorphic rocks. One such borehole is located in the southern part of Donghai County in the Sulu UHP metamorphic belt of Eastern China, from the Chinese Continental Scientific Drilling Main hole. This study reports the results obtained from the analysis of oxide log data. A geochemical logging tool provides in situ, gamma ray spectroscopy measurements of major and trace elements in the borehole. Dry weight percent oxide concentration logs obtained for this study were SiO2, K2O, TiO2, H2O, CO2, Na2O, Fe2O3, FeO, CaO, MnO, MgO, P2O5 and Al2O3. Cross plot and Principal Component Analysis methods were applied for lithology characterization and mineralogy description respectively. Cross plot analysis allows lithological variations to be characterized. Principal Component Analysis shows that the oxide logs can be summarized by two components related to the feldspar and hydrous minerals. This study has shown that geochemical logging tool data is accurate and adequate to be tremendously useful in UHP metamorphic rocks analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
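The Principal Component Analysis step can be illustrated without external libraries: power iteration on the covariance matrix of a tiny, hypothetical two-column oxide table recovers the leading component (the paper's actual data has thirteen oxide logs):

```python
# Leading principal component via power iteration on the covariance matrix.
# The "oxide" values below are made up purely for illustration.
def leading_pc(data, iters=200):
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(r[i] * r[j] for r in centered) / n for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]   # converges to the top eigenvector
    return v

# Two strongly correlated columns: the first PC loads on both about equally.
rows = [[1.0, 1.1], [2.0, 2.1], [3.0, 2.9], [4.0, 4.2]]
pc = leading_pc(rows)
print(abs(pc[0]) > 0.6 and abs(pc[1]) > 0.6)  # True
```

With correlated oxide logs, a couple of such components typically capture most of the variance, which is how the study reduces thirteen logs to two mineralogically meaningful axes.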

  18. Analytic Provenance Datasets: A Data Repository of Human Analysis Activity and Interaction Logs

    OpenAIRE

    Mohseni, Sina; Pachuilo, Andrew; Nirjhar, Ehsanul Haque; Linder, Rhema; Pena, Alyssa; Ragan, Eric D.

    2018-01-01

    We present an analytic provenance data repository that can be used to study human analysis activity, thought processes, and software interaction with visual analysis tools during exploratory data analysis. We conducted a series of user studies involving exploratory data analysis scenarios with textual and cyber security data. Interaction logs, think-alouds, videos, and all coded data from these studies are available online for research purposes. Analysis sessions are segmented into multiple sub-task ...

  19. A Computer Program for Flow-Log Analysis of Single Holes (FLASH)

    Science.gov (United States)

    Day-Lewis, F. D.; Johnson, C.D.; Paillet, Frederick L.; Halford, K.J.

    2011-01-01

    A new computer program, FLASH (Flow-Log Analysis of Single Holes), is presented for the analysis of borehole vertical flow logs. The code is based on an analytical solution for steady-state multilayer radial flow to a borehole. The code includes options for (1) discrete fractures and (2) multilayer aquifers. Given vertical flow profiles collected under both ambient and stressed (pumping or injection) conditions, the user can estimate fracture (or layer) transmissivities and far-field hydraulic heads. FLASH is coded in Microsoft Excel with Visual Basic for Applications routines. The code supports manual and automated model calibration. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
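
    The underlying idea is a Thiem-type steady-state radial-flow balance per fracture zone: two flow measurements at two borehole water levels determine both the zone's transmissivity and its far-field head. The sketch below, simplified to a single zone and using entirely hypothetical numbers, illustrates the algebra (FLASH itself is an Excel/VBA tool; this is not its code).

```python
import math

# Thiem-type steady-state radial flow into a single fracture zone:
#     q = 2*pi*T*(H - h_w) / ln(R / r_w)
# measured once at ambient and once at pumped borehole water level.
# All numbers are hypothetical.
R, r_w = 100.0, 0.076             # far-field and borehole radii (m)
hw_amb, hw_pump = 10.00, 8.50     # borehole water levels (m)
q_amb, q_pump = 0.2e-4, 3.2e-4    # zone inflows from the flow logs (m^3/s)

beta = (q_amb - q_pump) / (hw_pump - hw_amb)   # beta = 2*pi*T / ln(R/r_w)
H = hw_amb + q_amb / beta                      # far-field head of the zone (m)
T = beta * math.log(R / r_w) / (2 * math.pi)   # zone transmissivity (m^2/s)
```

    With several zones, the same two-condition balance is written per zone and calibrated jointly, which is where automated calibration earns its keep.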

  20. deepTools2: a next generation web server for deep-sequencing data analysis.

    Science.gov (United States)

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. An improved method of studying user-system interaction by combining transaction log analysis and protocol analysis

    Directory of Open Access Journals (Sweden)

    Jillian R. Griffiths

    2002-01-01

    Full Text Available The paper reports a novel approach to studying user-system interaction that captures a complete record of the searcher's actions, the system responses, and synchronised talk-aloud comments from the searcher. The data is recorded unobtrusively and is available for later analysis. The approach is set in context by a discussion of transaction logging and protocol analysis, and examples of the search logging in operation are presented.

  2. EarthServer - an FP7 project to enable the web delivery and analysis of 3D/4D models

    Science.gov (United States)

    Laxton, John; Sen, Marcus; Passmore, James

    2013-04-01

    EarthServer aims at open access and ad-hoc analytics on big Earth Science data, based on the OGC geoservice standards Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS). The WCS model defines "coverages" as a unifying paradigm for multi-dimensional raster data, point clouds, meshes, etc., thereby addressing a wide range of Earth Science data including 3D/4D models. WCPS allows declarative SQL-style queries on coverages. The project is developing a pilot implementing these standards, and will also investigate the use of GeoSciML to describe coverages. Integration of WCPS with XQuery will in turn allow coverages to be queried in combination with their metadata and GeoSciML description. The unified service will support navigation, extraction, aggregation, and ad-hoc analysis on coverage data from SQL. Clients will range from mobile devices to high-end immersive virtual reality, and will enable 3D model visualisation using web browser technology coupled with developing web standards. EarthServer is establishing open-source client and server technology intended to be scalable to Petabyte/Exabyte volumes, based on distributed processing, supercomputing, and cloud virtualization. Implementation will be based on the existing rasdaman server technology. Services using rasdaman technology are being installed to serve the atmospheric, oceanographic, geological, cryospheric, planetary and general earth observation communities. The geology service (http://earthserver.bgs.ac.uk/) is being provided by BGS and at present includes satellite imagery, superficial thickness data, onshore DTMs and 3D models for the Glasgow area. It is intended to extend the available data sets to include 3D voxel models. Use of the WCPS standard allows queries to be constructed against single or multiple coverages. For example, on a single coverage, data for a particular area can be selected, or data within a particular range of pixel values.
Queries on multiple surfaces can be
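
    A WCPS request of the kind described is a short declarative expression. The coverage name, axis labels, and bounds below are hypothetical, as is the endpoint path in the comment; they only illustrate the shape of such a query.

```python
# A WCPS query string of the SQL-style form described (coverage name,
# axis labels, and bounds are hypothetical).
query = (
    'for c in (glasgow_dtm) '
    'return encode(c[Lat(55.80:55.90), Long(-4.35:-4.20)], "image/tiff")'
)
# It would be submitted to the service's OWS endpoint, e.g. an HTTP POST of
# {"query": query} to an address such as http://earthserver.bgs.ac.uk/rasdaman/ows
```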

  3. Catching errors with patient-specific pretreatment machine log file analysis.

    Science.gov (United States)

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis, clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements with the treatment plan, to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file analysis QA checks were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
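
    The core of such a log-file QA check is a per-axis comparison of planned against delivered parameters. A minimal sketch, with invented axis names, values, and tolerance (not the Dynalog QA program's actual logic):

```python
# Per-axis plan-vs-log comparison (axis names, values, and the tolerance
# are invented for illustration).
TOLERANCE = 1.0  # mm for leaves; a real check would use per-axis tolerances

planned   = {"leaf_01": 12.0, "leaf_02": -5.5, "gantry": 180.0}
delivered = {"leaf_01": 13.4, "leaf_02": -5.6, "gantry": 180.0}

discrepancies = {
    axis: abs(delivered[axis] - planned[axis])
    for axis in planned
    if abs(delivered[axis] - planned[axis]) > TOLERANCE
}
needs_review = bool(discrepancies)  # True -> investigate before treating
```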

  4. Non-destructive analysis and detection of internal characteristics of spruce logs through X computerized tomography

    International Nuclear Information System (INIS)

    Longuetaud, F.

    2005-10-01

    Computerized tomography allows direct access to internal features of scanned logs on the basis of density and moisture content variations. The objective of this work is to assess the feasibility of automatic detection of internal characteristics with the final aim of conducting scientific analyses. The database is constituted by CT images of 24 spruces obtained with a medical CT scanner. The studied trees are representative of several social statuses and come from four stands located in North-Eastern France, which are themselves representative of several age, density and fertility classes. The automatic processing steps developed are the following. First, pith detection in logs, dealing with the problems of knot presence and ring eccentricity; the accuracy of the localisation was less than one mm. Secondly, detection of the sapwood/heartwood limit in logs, dealing with the problem of knot presence (the main source of difficulty); the error on the diameter was 1.8 mm, which corresponds to a relative error of 1.3 per cent. Thirdly, detection of the whorl locations and comparison with an optical method. Fourthly, detection of individualized knots. This process allows knots to be counted and located in a log (longitudinal position and azimuth); however, validation of the method and extraction of branch diameter and inclination are still to be developed. An application of this work was a variability analysis of the sapwood content in the trunk: at the within-tree level, the sapwood width was found to be constant under the living crown; at the between-tree level, a strong correlation was found with the amount of living branches. A great number of analyses are possible from our results, among others: architectural analysis with pith tracking and apex death occurrence; analysis of radial variations of the heartwood shape; and analysis of the knot distribution in logs. (author)

  5. An efficient biometric and password-based remote user authentication using smart card for Telecare Medical Information Systems in multi-server environment.

    Science.gov (United States)

    Maitra, Tanmoy; Giri, Debasis

    2014-12-01

    The medical organizations have introduced the Telecare Medical Information System (TMIS) to provide a reliable facility by which a patient who is unable to go to a doctor in a critical or urgent period can communicate with a doctor through a medical server via the internet from home. An authentication mechanism is needed in TMIS to hide the secret information of both parties, namely a server and a patient. Recent research includes a patient's biometric information as well as a password in the design of remote user authentication schemes to enhance the security level. In a single-server environment, one server is responsible for providing services to all the authorized remote patients. However, a problem arises if a patient wishes to access several branch servers: he/she needs to register with the branch servers individually. In 2014, Chuang and Chen proposed a remote user authentication scheme for the multi-server environment. In this paper, we show that in their scheme a non-registered adversary can successfully log in to the system as a valid patient. To resist these weaknesses, we propose an authentication scheme for TMIS in the multi-server environment in which patients register once with a root telecare server, called the registration center (RC), to get services from all the telecare branch servers through their registered smart card. Security analysis and comparison show that our proposed scheme provides better security with low computational and communication cost.
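
    The single-registration idea can be illustrated with a toy challenge-response sketch: the registration center derives each card secret from a master key shared with the branch servers, so any branch server can verify a patient without a separate registration. This is only a schematic illustration, not the scheme proposed in the paper (which additionally uses biometrics and defends against a wider range of attacks):

```python
import hashlib, hmac, os

# Toy single-registration scheme: the registration center (RC) derives a
# card secret from a master key it shares with all branch servers, so any
# branch can verify a patient who registered only once at the RC.
MASTER_KEY = os.urandom(32)  # held by the RC and the branch servers

def issue_card(patient_id: str) -> bytes:
    """RC: derive the smart-card secret at registration time."""
    return hmac.new(MASTER_KEY, patient_id.encode(), hashlib.sha256).digest()

def login(card_secret: bytes, nonce: bytes) -> bytes:
    """Patient side: answer a branch server's challenge."""
    return hmac.new(card_secret, nonce, hashlib.sha256).digest()

def verify(patient_id: str, nonce: bytes, proof: bytes) -> bool:
    """Branch server: re-derive the card secret and check the response."""
    expected = hmac.new(issue_card(patient_id), nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, proof)
```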

  6. Formation evaluation in Devonian shale through application of new core and log analysis methods

    International Nuclear Information System (INIS)

    Luffel, D.L.; Guidry, F.K.

    1990-01-01

    In the Devonian shale of the Appalachian Basin all porosity in excess of about 2.5 percent is generally occupied by free hydrocarbons, which is mostly gas, based on results of new core and log analysis methods. In this study, sponsored by the Gas Research Institute, reservoir porosities averaged about 5 percent and free gas content averaged about 2 percent by bulk volume, based on analyses of 519 feet of conventional core in four wells. In this source-rich Devonian shale, which also provides the reservoir storage, the rock everywhere appears to be at connate, or irreducible, water saturation corresponding to two or three percent of bulk volume. This became evident when applying the new core and log analysis methods, along with a new plotting method relating bulk volume of pore fluids to porosity. This plotting method has proved to be a valuable tool: it provides useful insight on the fluid distribution present in the reservoir, it provides a clear idea of the porosity required to store free hydrocarbons, it leads to a method of linking formation factor to porosity, and it provides a good quality control method to monitor core and log analysis results. In the Devonian shale an important part of the formation evaluation is to determine the amount of kerogen, since this appears as hydrocarbon-filled porosity to conventional logs. In this study Total Organic Carbon and pyrolysis analyses were made on 93 core samples from four wells. Based on these data a new method was used to derive volumetric kerogen and free oil content, and kerogen was found to range up to 26 percent by volume. A good correlation was subsequently developed to derive kerogen from the uranium response of the spectral gamma ray log. Another important result of this study is the measurement of formation water salinity directly on core samples. Results of 50 measurements in the four study wells ranged from 19,000 to 220,000 ppm NaCl.

  7. psRNATarget: a plant small RNA target analysis server (2017 release).

    Science.gov (United States)

    Dai, Xinbin; Zhuang, Zhaohong; Zhao, Patrick Xuechun

    2018-04-30

    Plant regulatory small RNAs (sRNAs), which include most microRNAs (miRNAs) and a subset of small interfering RNAs (siRNAs), such as the phased siRNAs (phasiRNAs), play important roles in regulating gene expression. Although generated from genetically distinct biogenesis pathways, these regulatory sRNAs share the same mechanisms for post-transcriptional gene silencing and translational inhibition. psRNATarget was developed to identify plant sRNA targets by (i) analyzing complementary matching between the sRNA sequence and target mRNA sequence using a predefined scoring schema and (ii) evaluating target site accessibility. This update enhances its analytical performance by developing a new scoring schema that is capable of discovering miRNA-mRNA interactions at higher 'recall rates' without significantly increasing total prediction output. The scoring procedure is customizable for the users to search both canonical and non-canonical targets. This update also enables transmitting and analyzing 'big' data empowered by (a) the implementation of multi-threading chunked file uploading, which can be paused and resumed, using HTML5 APIs and (b) the allocation of significantly more computing nodes to its back-end Linux cluster. The updated psRNATarget server has clear, compelling and user-friendly interfaces that enhance user experiences and present data clearly and concisely. The psRNATarget is freely available at http://plantgrn.noble.org/psRNATarget/.
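
    A complementarity scoring schema of the kind described penalizes mismatches and G:U wobbles position by position, weighting a seed-like region more heavily. The penalty values, the seed window, and the example miRNA below are a deliberately simplified illustration, not psRNATarget's actual schema:

```python
# Simplified position-weighted complementarity scoring (penalties and the
# seed window are illustrative; psRNATarget's schema differs in detail).
PAIR   = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G")}
WOBBLE = {("G", "U"), ("U", "G")}

def score(srna: str, target: str) -> float:
    """Lower score = better predicted target site."""
    total = 0.0
    # the sRNA 5'->3' pairs with the target site read 3'->5'
    for i, (a, b) in enumerate(zip(srna, reversed(target))):
        if (a, b) in PAIR:
            penalty = 0.0
        elif (a, b) in WOBBLE:
            penalty = 0.5
        else:
            penalty = 1.0
        if 2 <= i + 1 <= 13:   # seed-like region is weighted double
            penalty *= 2.0
        total += penalty
    return total

comp = {"A": "U", "U": "A", "G": "C", "C": "G"}
srna = "UGACAGAAGAGAGUGAGCAC"                   # miR156 as an example sequence
perfect_site = "".join(comp[b] for b in srna)[::-1]
```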

  8. An adversarial queueing model for online server routing

    NARCIS (Netherlands)

    Bonifaci, V.

    2007-01-01

    In an online server routing problem, a vehicle or server moves in a network in order to process incoming requests at the nodes. Online server routing problems have been thoroughly studied using competitive analysis. We propose a new model for online server routing, based on adversarial queueing theory.

  9. The Feasibility of Using Cluster Analysis to Examine Log Data from Educational Video Games. CRESST Report 790

    Science.gov (United States)

    Kerr, Deirdre; Chung, Gregory K. W. K.; Iseli, Markus R.

    2011-01-01

    Analyzing log data from educational video games has proven to be a challenging endeavor. In this paper, we examine the feasibility of using cluster analysis to extract information from the log files that is interpretable in both the context of the game and the context of the subject area. If cluster analysis can be used to identify patterns of…
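
    As a toy illustration of clustering log data, the sketch below runs k-means over two invented per-attempt features (moves used and time taken) and separates fast, efficient attempts from slow, move-heavy ones. Feature names and values are hypothetical, not drawn from the report:

```python
import math, random

# Toy k-means over invented per-attempt features from game logs:
# (moves_used, time_seconds) for six puzzle attempts.
random.seed(0)
attempts = [(3, 20), (4, 25), (3, 22), (14, 90), (15, 95), (13, 88)]

def kmeans(points, k, iters=20):
    centers = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[nearest].append(p)
        centers = [
            tuple(sum(vals) / len(g) for vals in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

centers, groups = kmeans(attempts, 2)
```

    Interpreting the resulting clusters (e.g. "efficient solvers" vs "strugglers") in both the game context and the subject-area context is the harder step the report examines.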

  10. pocketZebra: a web-server for automated selection and classification of subfamily-specific binding sites by bioinformatic analysis of diverse protein families.

    Science.gov (United States)

    Suplatov, Dmitry; Kirilin, Eugeny; Arbatsky, Mikhail; Takhaveev, Vakil; Svedas, Vytas

    2014-07-01

    The new web-server pocketZebra implements the power of bioinformatics and geometry-based structural approaches to identify and rank subfamily-specific binding sites in proteins by functional significance, and select particular positions in the structure that determine selective accommodation of ligands. A new scoring function has been developed to annotate binding sites by the presence of the subfamily-specific positions in diverse protein families. The pocketZebra web-server has multiple input modes to meet the needs of users with different experience in bioinformatics. The server provides on-site visualization of the results as well as an off-line version of the output in annotated text format and as PyMol sessions ready for structural analysis. pocketZebra can be used to study structure-function relationships and regulation in large protein superfamilies, classify functionally important binding sites and annotate proteins with unknown function. The server can be used to engineer ligand-binding sites and allosteric regulation of enzymes, or implemented in a drug discovery process to search for potential molecular targets and novel selective inhibitors/effectors. The server, documentation and examples are freely available at http://biokinet.belozersky.msu.ru/pocketzebra and there are no login requirements. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. SU-F-R-12: Prediction of TrueBeam Hardware Issues Using Trajectory Log Analysis

    Energy Technology Data Exchange (ETDEWEB)

    DiCostanzo, D; Ayan, A; Woollard, J; Gupta, N [The Ohio State University, Columbus, OH (United States)

    2016-06-15

    Purpose: To predict potential failures of hardware within the Varian TrueBeam linear accelerator in order to proactively replace parts and decrease machine downtime. Methods: Machine downtime is a problem for all radiation oncology departments and vendors. Most often it is the result of unexpected equipment failure, and it is increased by a lack of in-house clinical engineering support. Preventative maintenance attempts to assuage downtime, but often is ineffective at preemptively preventing many failure modes such as MLC motor failures, the need to tighten a gantry chain, or the replacement of a jaw motor, among other things. To attempt to alleviate downtime, software was developed in-house that determines the maximum value of each axis enumerated in the TrueBeam trajectory log files. After patient treatments, this data is stored in a SQL database. Microsoft Power BI is used to plot the average maximum error of each day of each machine as a function of time. The results are then correlated with actual faults that occurred at the machine with the help of Varian service engineers. Results: Over the course of six months, 76,312 trajectory logs have been written into the database and plotted in Power BI. Throughout the course of analysis MLC motors have been replaced on three machines due to the early warning of the trajectory log analysis. The service engineers have also been alerted to possible gantry issues on one occasion due to the aforementioned analysis. Conclusion: Analyzing the trajectory log data is a viable and effective early warning system for potential failures of the TrueBeam linear accelerator. With further analysis and tightening of the tolerance values used to determine a possible imminent failure, it should be possible to pinpoint future issues more thoroughly and for more axes of motion.
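
    The early-warning logic reduces to tracking a per-day average of each axis's maximum error and flagging drift above an alert level. A minimal sketch with invented axis names, thresholds, and readings (not the in-house software):

```python
import statistics

# Invented alert levels and per-delivery maximum tracking errors, grouped by
# day; an axis is flagged when its daily mean drifts above its alert level.
ALERT = {"mlc_leaf_mm": 0.35, "gantry_deg": 0.15}

daily_max_errors = {
    "mlc_leaf_mm": {"2016-03-01": [0.21, 0.22], "2016-03-02": [0.33, 0.41]},
    "gantry_deg":  {"2016-03-01": [0.05, 0.06], "2016-03-02": [0.05, 0.07]},
}

alerts = [
    (axis, day)
    for axis, by_day in daily_max_errors.items()
    for day, maxima in by_day.items()
    if statistics.mean(maxima) > ALERT[axis]
]
```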

  12. SU-F-R-12: Prediction of TrueBeam Hardware Issues Using Trajectory Log Analysis

    International Nuclear Information System (INIS)

    DiCostanzo, D; Ayan, A; Woollard, J; Gupta, N

    2016-01-01

    Purpose: To predict potential failures of hardware within the Varian TrueBeam linear accelerator in order to proactively replace parts and decrease machine downtime. Methods: Machine downtime is a problem for all radiation oncology departments and vendors. Most often it is the result of unexpected equipment failure, and it is increased by a lack of in-house clinical engineering support. Preventative maintenance attempts to assuage downtime, but often is ineffective at preemptively preventing many failure modes such as MLC motor failures, the need to tighten a gantry chain, or the replacement of a jaw motor, among other things. To attempt to alleviate downtime, software was developed in-house that determines the maximum value of each axis enumerated in the TrueBeam trajectory log files. After patient treatments, this data is stored in a SQL database. Microsoft Power BI is used to plot the average maximum error of each day of each machine as a function of time. The results are then correlated with actual faults that occurred at the machine with the help of Varian service engineers. Results: Over the course of six months, 76,312 trajectory logs have been written into the database and plotted in Power BI. Throughout the course of analysis MLC motors have been replaced on three machines due to the early warning of the trajectory log analysis. The service engineers have also been alerted to possible gantry issues on one occasion due to the aforementioned analysis. Conclusion: Analyzing the trajectory log data is a viable and effective early warning system for potential failures of the TrueBeam linear accelerator. With further analysis and tightening of the tolerance values used to determine a possible imminent failure, it should be possible to pinpoint future issues more thoroughly and for more axes of motion.

  13. Log Usage Analysis: What it Discloses about Use, Information Seeking and Trustworthiness

    Directory of Open Access Journals (Sweden)

    David Nicholas

    2014-06-01

    Full Text Available The Trust and Authority in Scholarly Communications in the Light of the Digital Transition research project was a study which investigated the behaviours and attitudes of academic researchers as producers and consumers of scholarly information resources in respect to how they determine authority and trustworthiness. The research questions for the study arose out of CIBER's studies of the virtual scholar. This paper focuses on elements of this study, mainly an analysis of a scholarly publisher's usage logs, which was undertaken at the start of the project in order to build an evidence base that would help calibrate the main methodological tools used by the project: interviews and questionnaire. The specific purpose of the log study was to identify and assess the digital usage behaviours that potentially raise trustworthiness and authority questions. Results from the self-report part of the study were additionally used to explain the logs. The main findings were that: (1) logs provide a good indicator of use and information-seeking behaviour, albeit in respect to just a part of the information-seeking journey; (2) the 'lite' form of information-seeking behaviour observed in the logs is a sign of users trying to make up their minds, in the face of a tsunami of information, as to what is relevant and to be trusted; (3) Google and Google Scholar are the discovery platforms of choice for academic researchers, which partly points to the fact that they are influenced in what they use and read by ease of access; (4) usage is not a suitable proxy for quality. The paper also provides contextual data from CIBER's previous studies.

  14. Interconnecting smartphone, image analysis server, and case report forms in clinical trials for automatic skin lesion tracking

    Science.gov (United States)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subject's medical data in controlled clinical trials is captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support integration of subject's image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to a communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system in clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images, which have been collected in the study so far, have been correctly identified and successfully integrated into the corresponding subject's eCRF. Using this system, manual steps for the study personnel are reduced, and, therefore, errors, latency, and costs are decreased. Our approach also increases data security and privacy.

  15. ANDRILL Borehole AND-1B: Well Log Analysis of Lithofacies and Glacimarine Cycles.

    Science.gov (United States)

    Jackolski, C. L.; Williams, T.; Powell, R. D.; Jarrard, R.; Morin, R. H.; Talarico, F. M.; Niessen, F.; Kuhn, G.

    2008-12-01

    During the 2006-2007 austral summer, the Antarctic geological drilling program ANDRILL recovered cores of sedimentary rock from a 1285-m-deep borehole below the McMurdo Ice Shelf. Well logging instruments were deployed to a depth of 1017 mbsf after core recovery. This study focuses on two intervals of the AND-1B borehole: upper HQ (238-343 mbsf; Pliocene) and NQ (698-1017 mbsf; upper Miocene), which were logged with natural gamma ray, induction resistivity and magnetic susceptibility tools. To understand how the well logs fit into a more complete physical properties data set, we performed factor and cluster analyses on a suite of well logs and core logs in the upper HQ and NQ intervals. In both intervals, factor analysis groups resistivity and core P-velocity into a factor that we interpret as being inversely proportional to porosity. It also groups natural gamma and potassium (from the XRF core scanner) into a factor that we interpret as a particle-size or lithology index. An additional factor in the NQ interval, influenced by clast number and magnetic susceptibility, distinguishes subglacial diamictites from other lithofacies. The factors in each interval (2 in HQ, 3 in NQ) are used as input to cluster analysis. The results are log data objectively organized into clusters, or electrofacies. We compare these electrofacies to the lithofacies, well logs and unconformity-bounded glacimarine cycles of AND-1B. Patterns in the glacimarine cycles are observed in the well logs and electrofacies. In the NQ glacimarine sediments, an electrofacies pattern is produced between subglacial diamictites at the bottom of each sequence and the glacial retreat facies above. Subglacial diamictites have higher values for the additional NQ factor, corresponding to clast number and magnetic susceptibility, than the muds and sands that form the retreat facies. Differences in the porosity factor are not observed in any electrofacies pattern in the NQ interval, but subtle patterns in the

  16. GeoServer cookbook

    CERN Document Server

    Iacovella, Stefano

    2014-01-01

    This book is ideal for GIS experts, developers, and system administrators who have had a first glance at GeoServer and who are eager to explore all its features in order to configure professional map servers. Basic knowledge of GIS and GeoServer is required.

  17. An Entry Point for Formal Methods: Specification and Analysis of Event Logs

    Directory of Open Access Journals (Sweden)

    Howard Barringer

    2010-03-01

    Full Text Available Formal specification languages have long languished, due to the grave scalability problems faced by complete verification methods. Runtime verification promises to use formal specifications to automate part of the more scalable art of testing, but has not been widely applied to real systems, and often falters due to the cost and complexity of instrumentation for online monitoring. In this paper we discuss work in progress to apply an event-based specification system to the logging mechanism of the Mars Science Laboratory mission at JPL. By focusing on log analysis, we exploit the "instrumentation" already implemented and required for communicating with the spacecraft. We argue that this work both shows a practical method for using formal specifications in testing and opens interesting research avenues, including a challenging specification learning problem.

  18. Mastering Lync Server 2010

    CERN Document Server

    Winters, Nathan

    2012-01-01

    An in-depth guide on the leading Unified Communications platform Microsoft Lync Server 2010 maximizes communication capabilities in the workplace like no other Unified Communications (UC) solution. Written by experts who know Lync Server inside and out, this comprehensive guide shows you step by step how to administer the newest and most robust version of Lync Server. Along with clear and detailed instructions, learning is aided by exercise problems and real-world examples of established Lync Server environments. You'll gain the skills you need to effectively deploy Lync Server 2010 and be on

  19. Impact of logging on a mangrove swamp in South Mexico: cost / benefit analysis

    Directory of Open Access Journals (Sweden)

    Cristian Tovilla Hernández

    2001-06-01

    Full Text Available Environmental changes caused by logging in a mangrove swamp were studied in Barra de Tecoanapa, Guerrero, Mexico. The original forest included Rhizophora mangle, Laguncularia racemosa, Avicennia germinans and halophytic vegetation, and produced wood (164.03 m3/ha) and organic matter (3.9 g/m2/day). A total of 3.5 tons of wood per year were harvested from this area. Later, an average of 2,555 kg of maize per planting cycle were obtained (market value of 88 USD). Succession when the area was abandoned included strictly facultative and glycophyte halophytes (16 families; Cyperaceae and Poaceae were the best represented). After logging, temperatures increased 13 °C in the soil and 11 °C in the air, whereas salinity reached 52 psu in the dry season. These changes modified soil color, and sand content increased from 42.6 to 63.4%. Logging was deleterious to species, habitat, biogeochemical and biological cycles, organic matter production, seeds, young plants, genetic exchange, conservation of soil and its fertility, coastal protection, and aesthetic value; 3,000 m2 had eroded as the river advanced towards the deforested area (the cost/benefit analysis showed a ratio of 246:1). There was long-term economic loss for the community, and only 30% of the site has recovered after five years.

  20. Identifying Plant Part Composition of Forest Logging Residue Using Infrared Spectral Data and Linear Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    Gifty E. Acquah

    2016-08-01

    Full Text Available As new markets, technologies and economies evolve in the low-carbon bioeconomy, forest logging residue, a largely untapped renewable resource, will play a vital role. The feedstock can however be variable depending on plant species and plant part component. This heterogeneity can influence the physical, chemical and thermochemical properties of the material, and thus the final yield and quality of products. Although it is challenging to control the compositional variability of a batch of feedstock, it is feasible to monitor this heterogeneity and make the necessary changes in process parameters. Such a system will be a first step towards optimization, quality assurance and cost-effectiveness of processes in the emerging biofuel/chemical industry. The objective of this study was therefore to qualitatively classify forest logging residue made up of different plant parts using both near infrared spectroscopy (NIRS) and Fourier transform infrared spectroscopy (FTIRS), together with linear discriminant analysis (LDA). Forest logging residue harvested from several Pinus taeda (loblolly pine) plantations in Alabama, USA, was classified into three plant part components: clean wood, wood and bark, and slash (i.e., limbs and foliage). Five-fold cross-validated linear discriminant functions had classification accuracies of over 96% for both NIRS and FTIRS based models. An extra factor/principal component (PC) was however needed to achieve this in FTIRS modeling. Analysis of factor loadings of both NIR and FTIR spectra showed that the statistically different amounts of cellulose in the three plant part components of logging residue contributed to their initial separation. This study demonstrated that NIR or FTIR spectroscopy coupled with PCA and LDA has the potential to be used as a high-throughput tool in classifying the plant part makeup of a batch of forest logging residue feedstock. Thus, NIR/FTIR could be employed as a tool to rapidly probe/monitor the variability
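
The spectra-to-class workflow the abstract describes can be sketched with a minimal LDA (class means plus pooled within-class covariance). Everything below is simulated: the "spectra" and class-mean shapes are illustrative stand-ins, not the study's NIR/FTIR measurements or its cross-validation protocol.

```python
import numpy as np

# Synthetic "spectra" for three residue classes (clean wood,
# wood and bark, slash). Class means and noise level are assumptions.
rng = np.random.default_rng(0)
n_bands, n_per_class = 50, 40
class_means = [np.sin(np.linspace(0, k, n_bands)) for k in (2, 4, 6)]
X = np.vstack([m + 0.3 * rng.standard_normal((n_per_class, n_bands))
               for m in class_means])
y = np.repeat([0, 1, 2], n_per_class)

# Fit LDA: per-class means and a pooled within-class covariance.
means = np.array([X[y == k].mean(axis=0) for k in range(3)])
Sw = sum(np.cov(X[y == k].T, bias=True) for k in range(3)) / 3
Sw += 1e-3 * np.eye(n_bands)          # small ridge term for invertibility
Sw_inv = np.linalg.inv(Sw)

def predict(x):
    # Linear discriminant: x·Σ⁻¹μ_k − ½ μ_kᵀΣ⁻¹μ_k (equal priors assumed)
    scores = [x @ Sw_inv @ m - 0.5 * m @ Sw_inv @ m for m in means]
    return int(np.argmax(scores))

acc = np.mean([predict(x) == t for x, t in zip(X, y)])
print(f"training accuracy: {acc:.2f}")
```

In practice the spectra would first be reduced to a handful of principal components (as in the study) before fitting the discriminant functions.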

  1. Techniques of Turnovers’ Evolution and Structure Analysis Using SQL Server 2005

    Directory of Open Access Journals (Sweden)

    Alexandru Manole

    2007-07-01

    Full Text Available The turnovers’ evolution and structure analysis can provide much useful information for the construction of a viable set of policies for products, prices and retail networks. When the analysis deals with large quantities of raw data, one of the solutions that guarantees rigorous treatment of the data is the use of a software system based on a data warehouse.

  2. A tandem queue with delayed server release

    OpenAIRE

    Nawijn, W.M.

    1997-01-01

    We consider a tandem queue with two stations. The first station is an s-server queue with Poisson arrivals and exponential service times. After terminating his service in the first station, a customer enters the second station to require service at an exponential single server, while in the meantime he is blocking his server in station 1 until he completes service in station 2, whereupon the server in station 1 is released. An analysis of the generating function of the simultaneous probability di...

  3. Expert cube development with SQL server analysis services 2012 multidimensional models

    CERN Document Server

    Ferrari, Alberto; Russo, Marco

    2014-01-01

    An easy-to-follow guide full of hands-on examples of real-world Analysis Services cube development tasks. Each topic is explained and placed in context, and for the more inquisitive reader, there are also more in-depth details of the concepts used. If you are an Analysis Services cube designer wishing to learn more advanced topics and best practices for cube design, this book is for you. You are expected to have some prior experience with Analysis Services cube development.

  4. Microbial Diagnostic Array Workstation (MDAW): a web server for diagnostic array data storage, sharing and analysis

    Directory of Open Access Journals (Sweden)

    Chang Yung-Fu

    2008-09-01

    Full Text Available Abstract Background Microarrays are becoming a very popular tool for microbial detection and diagnostics. Although these diagnostic arrays are much simpler when compared to traditional transcriptome arrays, due to the high-throughput nature of the arrays, the data analysis requirements still form a bottleneck for the widespread use of these diagnostic arrays. Hence we developed a new online data sharing and analysis environment customised for diagnostic arrays. Methods The Microbial Diagnostic Array Workstation (MDAW) is a database-driven application designed in MS Access, with a front end designed in ASP.NET. Conclusion MDAW is a new resource that is customised for the data analysis requirements of microbial diagnostic arrays.

  5. Earth System Model Development and Analysis using FRE-Curator and Live Access Servers: On-demand analysis of climate model output with data provenance.

    Science.gov (United States)

    Radhakrishnan, A.; Balaji, V.; Schweitzer, R.; Nikonov, S.; O'Brien, K.; Vahlenkamp, H.; Burger, E. F.

    2016-12-01

    There are distinct phases in the development cycle of an Earth system model. During the model development phase, scientists make changes to code and parameters and require rapid access to results for evaluation. During the production phase, scientists may make an ensemble of runs with different settings and produce large quantities of output that must be further analyzed and quality controlled for scientific papers and submission to international projects such as the Climate Model Intercomparison Project (CMIP). During this phase, provenance is a key concern: being able to track back from outputs to inputs. We will discuss one of the paths taken at GFDL in delivering tools across this lifecycle, offering on-demand analysis of data by integrating the use of GFDL's in-house FRE-Curator, Unidata's THREDDS and NOAA PMEL's Live Access Servers (LAS). Experience over this lifecycle suggests that a major difficulty in developing analysis capabilities is only partially the scientific content; much of the effort is devoted to answering the questions "where is the data?" and "how do I get to it?". "FRE-Curator" is the name of a database-centric paradigm used at NOAA GFDL to ingest information about the model runs into an RDBMS (Curator database). The components of FRE-Curator are integrated into the Flexible Runtime Environment workflow and can be invoked during climate model simulation. The front end to FRE-Curator, known as the Model Development Database Interface (MDBI), provides in-house web-based access to GFDL experiments: metadata, analysis output and more. In order to provide on-demand visualization, MDBI uses Live Access Servers, a highly configurable web server designed to provide flexible access to geo-referenced scientific data that makes use of OPeNDAP. Model output saved in GFDL's tape archive, the size of the database and experiments, and continuous model development initiatives with more dynamic configurations add complexity and challenges in providing an on

  6. A comparison of two methods of logMAR visual acuity data scoring for statistical analysis

    Directory of Open Access Journals (Sweden)

    O. A. Oduntan

    2009-12-01

    Full Text Available The purpose of this study was to compare two methods of logMAR visual acuity (VA) scoring. The two methods are referred to as letter scoring (method 1) and line scoring (method 2). The two methods were applied to VA data obtained from one hundred and forty (N=140) children with oculocutaneous albinism. Descriptive, correlation and regression statistics were then used to analyze the data. Also, where applicable, the Bland and Altman analysis was used to compare sets of data from the two methods. The right and left eye data were included in the study, but because the findings were similar in both eyes, only the results for the right eyes are presented in this paper. For method 1, the mean unaided VA (mean UAOD1) = 0.39 ± 0.15 logMAR and the mean aided VA (mean ADOD1) = 0.50 ± 0.16 logMAR. For method 2, the mean unaided VA (mean UAOD2) = 0.71 ± 0.15 logMAR, while the mean aided VA (mean ADOD2) = 0.60 ± 0.16 logMAR. The range and mean values of the improvement in VA for both methods were the same. The unaided VAs (UAOD1, UAOD2) and aided VAs (ADOD1, ADOD2) for methods 1 and 2 correlated negatively (unaided, r = –1, p<0.05; aided, r = –1, p<0.05). The improvements in VA (differences between the unaided and aided VA values, DOD1 and DOD2) were positively correlated (r = +1, p<0.05). The Bland and Altman analyses showed that the VA improvements (unaided – aided VA values, DOD1 and DOD2) were similar for the two methods. Findings indicated that only the improvement in VA could be compared when different scoring methods are used. Therefore the scoring method used in any VA research project should be stated in the publication so that appropriate comparisons can be made by other researchers.
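
The two scoring conventions compared in the study can be sketched as follows, assuming a standard ETDRS-style chart: 5 letters per line, lines spaced 0.1 log units apart, each letter worth 0.02 log units, and a top line at 1.0 logMAR. These chart parameters are assumptions for illustration, not details taken from the paper.

```python
# Method 1 (letter scoring): credit every letter read correctly.
def letter_score(letters_correct, top_line=1.0):
    # Reading all 5 letters of the top line yields exactly `top_line`.
    return round(top_line + 0.1 - 0.02 * letters_correct, 2)

# Method 2 (line scoring): score the last line read with a majority correct.
def line_score(letters_per_line, top_line=1.0):
    score = top_line + 0.1          # worse than the top line if none read
    for i, n in enumerate(letters_per_line):
        if n >= 3:                  # majority of the 5 letters on line i
            score = top_line - 0.1 * i
    return round(score, 2)

# A reader who completes the first 5 lines and 3 letters of the sixth:
print(letter_score(28))                  # 1.1 - 0.56 = 0.54
print(line_score([5, 5, 5, 5, 5, 3]))    # sixth line majority -> 0.5
```

The example shows why the two methods diverge: letter scoring rewards the 3 extra letters on the partially read line, while line scoring rounds to whole lines.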

  7. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    textabstractWe present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates

  8. In situ analysis of coal from single electrode resistance, self-potential and gamma-ray logs

    International Nuclear Information System (INIS)

    Kayal, J.R.

    1981-01-01

    Single electrode resistance, self-potential and gamma-ray logging have been carried out in the North Karanpura, West Bokaro and Jharia coalfields of the Gondwana basin in Eastern India. Correlation of these geophysical logs is found to be very useful in locating the coal beds and determining their accurate depths, thicknesses and approximate quality. Coal seams have been detected as very highly resistive formations compared to the sandstone/shale which are interbedded in the coal basin. High or low self-potential values are obtained against the coal beds depending on the borehole fluid conditions. Burnt coals (Jhama) are characterised as highly conductive beds. Gamma-ray logs have been effectively used along with electrical logs for correlation and identification of coal seams. Further analysis of the gamma-ray log data yields a linear relationship with the ash content of the coal. (author)
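
The linear gamma-ray-vs-ash relationship the abstract reports amounts to a least-squares fit. The sketch below fits such a line; the (gamma, ash%) pairs are invented for illustration only, not data from the Gondwana boreholes.

```python
# Hypothetical calibration pairs: gamma-ray count rate vs ash content.
gamma = [30.0, 45.0, 60.0, 75.0, 90.0]   # count rate (units assumed)
ash   = [12.0, 18.0, 24.0, 30.0, 36.0]   # ash content of coal (%)

# Closed-form ordinary least squares for a single predictor.
n = len(gamma)
mx, my = sum(gamma) / n, sum(ash) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(gamma, ash))
         / sum((x - mx) ** 2 for x in gamma))
intercept = my - slope * mx
print(f"ash% ~= {slope:.3f} * gamma + {intercept:.3f}")
```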

  9. myPhyloDB: a local web server for the storage and analysis of metagenomics data

    Science.gov (United States)

    myPhyloDB is a user-friendly personal database with a browser-interface designed to facilitate the storage, processing, analysis, and distribution of metagenomics data. MyPhyloDB archives raw sequencing files, and allows for easy selection of project(s)/sample(s) of any combination from all availab...

  10. Nebula--a web-server for advanced ChIP-seq data analysis.

    Science.gov (United States)

    Boeva, Valentina; Lermine, Alban; Barette, Camille; Guillouf, Christel; Barillot, Emmanuel

    2012-10-01

    ChIP-seq consists of chromatin immunoprecipitation and deep sequencing of the extracted DNA fragments. It is the technique of choice for accurate characterization of the binding sites of transcription factors and other DNA-associated proteins. We present a web service, Nebula, which allows inexperienced users to perform a complete bioinformatics analysis of ChIP-seq data. Nebula was designed for both bioinformaticians and biologists. It is based on the Galaxy open source framework. Galaxy already includes a large number of functionalities for mapping reads and peak calling. We added the following to Galaxy: (i) peak calling with FindPeaks and a module for immunoprecipitation quality control, (ii) de novo motif discovery with ChIPMunk, (iii) calculation of the density and the cumulative distribution of peak locations relative to gene transcription start sites, (iv) annotation of peaks with genomic features and (v) annotation of genes with peak information. Nebula generates the graphs and the enrichment statistics at each step of the process. During Steps 3-5, Nebula optionally repeats the analysis on a control dataset and compares these results with those from the main dataset. Nebula can also incorporate gene expression (or gene modulation) data during these steps. In summary, Nebula is an innovative web service that provides an advanced ChIP-seq analysis pipeline with ready-to-publish results. Nebula is available at http://nebula.curie.fr/. Supplementary data are available at Bioinformatics online.
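
The core of step (iii), peak locations relative to transcription start sites, reduces to finding each peak's signed distance to the nearest TSS. A minimal sketch with toy coordinates (Nebula's actual implementation is not shown here):

```python
import bisect

tss = sorted([1_000, 5_000, 12_000])          # TSS positions, one chromosome
peaks = [900, 1_200, 4_700, 13_500, 20_000]   # ChIP-seq peak summits

def signed_dist_to_nearest_tss(pos, tss_sorted):
    """Signed distance to the closest TSS (negative = upstream of it)."""
    i = bisect.bisect_left(tss_sorted, pos)
    candidates = tss_sorted[max(0, i - 1):i + 1]   # neighbours around pos
    nearest = min(candidates, key=lambda t: abs(pos - t))
    return pos - nearest

dists = [signed_dist_to_nearest_tss(p, tss) for p in peaks]
print(dists)    # [-100, 200, -300, 1500, 8000]
```

A histogram of such distances gives the density plot, and sorting them gives the cumulative distribution the pipeline reports.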

  11. Borehole logging

    International Nuclear Information System (INIS)

    Olsen, H.

    1995-01-01

    Numerous ground water investigations have been accomplished by means of borehole logging. Borehole logging can be applied to establish new water recovery wells, to control the existing water producing wells and source areas and to estimate ground water quality. (EG)

  12. Server virtualization solutions

    OpenAIRE

    Jonasts, Gusts

    2012-01-01

    Currently, the part of the information technology sector that is responsible for server infrastructure is seeing huge development in the field of server virtualization on the x86 computer architecture. A prerequisite for this virtualization development is the growth in server productivity and the underutilization of available computing power. Several companies in the market are working on two virtualization architectures – hypervisor and hosting. In this paper several of the virtualization products that use host...

  13. Disk Storage Server

    CERN Multimedia

    This model was a disk storage server used in the Data Centre up until 2012. Each tray contains a hard disk drive (see the 5TB hard disk drive on the main disk display section - this actually fits into one of the trays). There are 16 trays in all per server. There are hundreds of these servers mounted on racks in the Data Centre, as can be seen.

  14. Group-Server Queues

    OpenAIRE

    Li, Quan-Lin; Ma, Jing-Yu; Xie, Mingzhou; Xia, Li

    2017-01-01

    By analyzing energy-efficient management of data centers, this paper proposes and develops a class of interesting Group-Server Queues, and establishes two representative group-server queues through loss networks and impatient customers, respectively. Furthermore, these two group-server queues are given model descriptions and the necessary interpretation. Also, a simple mathematical discussion is provided, and simulations are made to study the expected queue lengths, the expected sojourn times ...

  15. Single-trial log transformation is optimal in frequency analysis of resting EEG alpha.

    Science.gov (United States)

    Smulders, Fren T Y; Ten Oever, Sanne; Donkers, Franc C L; Quaedflieg, Conny W E M; van de Ven, Vincent

    2018-02-01

    The appropriate definition and scaling of the magnitude of electroencephalogram (EEG) oscillations is an underdeveloped area. The aim of this study was to optimize the analysis of resting EEG alpha magnitude, focusing on alpha peak frequency and nonlinear transformation of alpha power. A family of nonlinear transforms, Box-Cox transforms, were applied to find the transform that (a) maximized a non-disputed effect: the increase in alpha magnitude when the eyes are closed (Berger effect), and (b) made the distribution of alpha magnitude closest to normal across epochs within each participant, or across participants. The transformations were performed either at the single epoch level or at the epoch-average level. Alpha peak frequency showed large individual differences, yet good correspondence between various ways to estimate it in 2 min of eyes-closed and 2 min of eyes-open resting EEG data. Both alpha magnitude and the Berger effect were larger for individual alpha than for a generic (8-12 Hz) alpha band. The log-transform on single epochs (a) maximized the t-value of the contrast between the eyes-open and eyes-closed conditions when tested within each participant, and (b) rendered near-normally distributed alpha power across epochs and participants, thereby making further transformation of epoch averages superfluous. The results suggest that the log-normal distribution is a fundamental property of variations in alpha power across time in the order of seconds. Moreover, effects on alpha power appear to be multiplicative rather than additive. These findings support the use of the log-transform on single epochs to achieve appropriate scaling of alpha magnitude. © 2018 The Authors. European Journal of Neuroscience published by Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
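
The paper's central contrast, applying the log-transform to single epochs before testing eyes-closed against eyes-open alpha power, can be sketched on simulated data. Epoch powers are drawn log-normal with a multiplicative eyes-closed gain; all parameters are illustrative, not the study's EEG values.

```python
import numpy as np

rng = np.random.default_rng(42)
n_epochs = 60
# Log-normal single-epoch alpha power; eyes closed gets a multiplicative gain.
open_pow   = np.exp(rng.normal(loc=1.0, scale=0.6, size=n_epochs))
closed_pow = np.exp(rng.normal(loc=1.7, scale=0.6, size=n_epochs))

def paired_t(a, b):
    """Paired t statistic for matched epochs."""
    d = a - b
    return d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))

# Berger-effect contrast on raw power vs single-epoch log power.
t_raw = paired_t(closed_pow, open_pow)
t_log = paired_t(np.log(closed_pow), np.log(open_pow))
print(f"t (raw power): {t_raw:.2f}   t (log power): {t_log:.2f}")
```

Because the simulated effect is multiplicative and the noise log-normal, the log-domain differences are near-normally distributed, which is exactly the situation the paper argues holds for real resting alpha power.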

  16. Direct push driven in situ color logging tool (CLT): technique, analysis routines, and application

    Science.gov (United States)

    Werban, U.; Hausmann, J.; Dietrich, P.; Vienken, T.

    2014-12-01

    Direct push technologies have recently seen a broad development providing several tools for in situ parameterization of unconsolidated sediments. One of these techniques is the measurement of soil colors - a proxy information that reveals to soil/sediment properties. We introduce the direct push driven color logging tool (CLT) for real-time and depth-resolved investigation of soil colors within the visible spectrum. Until now, no routines exist on how to handle high-resolved (mm-scale) soil color data. To develop such a routine, we transform raw data (CIEXYZ) into soil color surrogates of selected color spaces (CIExyY, CIEL*a*b*, CIEL*c*h*, sRGB) and denoise small-scale natural variability by Haar and Daublet4 wavelet transformation, gathering interpretable color logs over depth. However, interpreting color log data as a single application remains challenging. Additional information, such as site-specific knowledge of the geological setting, is required to correlate soil color data to specific layers properties. Hence, we exemplary provide results from a joint interpretation of in situ-obtained soil color data and 'state-of-the-art' direct push based profiling tool data and discuss the benefit of additional data. The developed routine is capable of transferring the provided information obtained as colorimetric data into interpretable color surrogates. Soil color data proved to correlate with small-scale lithological/chemical changes (e.g., grain size, oxidative and reductive conditions), especially when combined with additional direct push vertical high resolution data (e.g., cone penetration testing and soil sampling). Thus, the technique allows enhanced profiling by means of providing another reproducible high-resolution parameter for analysis subsurface conditions. This opens potential new areas of application and new outputs for such data in site investigation. It is our intention to improve color measurements by means method of application and data

  17. Cluster analysis and quality assessment of logged water at an irrigation project, eastern Saudi Arabia.

    Science.gov (United States)

    Hussain, Mahbub; Ahmed, Syed Munaf; Abderrahman, Walid

    2008-01-01

    A multivariate statistical technique, cluster analysis, was used to assess the logged surface water quality at an irrigation project at Al-Fadhley, Eastern Province, Saudi Arabia. The principal idea behind using the technique was to utilize all available hydrochemical variables in the quality assessment, including trace elements and other ions which are not considered in conventional techniques for water quality assessment like Stiff and Piper diagrams. Furthermore, the area belongs to an irrigation project where water contamination associated with the use of fertilizers, insecticides and pesticides is expected. This quality assessment study was carried out on a total of 34 surface/logged water samples. To gain greater insight into the seasonal variation of water quality, 17 samples were collected in each of the summer and winter seasons. The collected samples were analyzed for a total of 23 water quality parameters including pH, TDS, conductivity, alkalinity, sulfate, chloride, bicarbonate, nitrate, phosphate, bromide, fluoride, calcium, magnesium, sodium, potassium, arsenic, boron, copper, cobalt, iron, lithium, manganese, molybdenum, nickel, selenium, mercury and zinc. Cluster analysis in both Q and R modes was used. Q-mode analysis resulted in three distinct water types for both the summer and winter seasons. Q-mode analysis also showed the spatial as well as temporal variation in water quality. R-mode cluster analysis led to the conclusion that there are two major sources of contamination for the surface/shallow groundwater in the area: fertilizers, micronutrients, pesticides and insecticides used in agricultural activities, and non-point natural sources.
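
The Q-mode/R-mode distinction is simply clustering the rows (samples) versus the columns (parameters) of the data matrix. A minimal single-linkage sketch on a toy water-quality matrix; the values and the three-parameter set are invented, not the study's 23-parameter dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two synthetic water types: high-TDS/high-sulfate vs dilute samples.
A = rng.normal([900, 400, 60], 30, size=(5, 3))   # TDS, SO4, NO3 (assumed)
B = rng.normal([200, 50, 5], 10, size=(5, 3))
X = np.vstack([A, B])                             # 10 samples x 3 parameters

def single_linkage(D, k):
    """Greedy single-linkage: merge closest clusters until k remain."""
    clusters = [{i} for i in range(len(D))]
    while len(clusters) > k:
        a, b = min(((a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))),
                   key=lambda ab: min(D[i, j] for i in clusters[ab[0]]
                                      for j in clusters[ab[1]]))
        clusters[a] |= clusters.pop(b)
    labels = np.empty(len(D), dtype=int)
    for lab, c in enumerate(clusters):
        labels[list(c)] = lab
    return labels

# Q-mode: cluster samples (rows) on Euclidean distance.
Dq = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
labels_q = single_linkage(Dq, 2)
# R-mode: cluster parameters (columns) on correlation distance.
Dr = 1 - np.abs(np.corrcoef(X.T))
labels_r = single_linkage(Dr, 2)
print("Q-mode (samples):", labels_q)
print("R-mode (parameters):", labels_r)
```

Real studies typically use Ward or average linkage and standardized variables, but the row/column duality is the same.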

  18. Gulf of Mexico Gas Hydrate Joint Industry Project Leg II logging-while-drilling data acquisition and analysis

    Science.gov (United States)

    Collett, Timothy S.; Lee, Wyung W.; Zyrianova, Margarita V.; Mrozewski, Stefan A.; Guerin, Gilles; Cook, Ann E.; Goldberg, Dave S.

    2012-01-01

    One of the objectives of the Gulf of Mexico Gas Hydrate Joint Industry Project Leg II (GOM JIP Leg II) was the collection of a comprehensive suite of logging-while-drilling (LWD) data within gas-hydrate-bearing sand reservoirs in order to make accurate estimates of the concentration of gas hydrates under various geologic conditions and to understand the geologic controls on the occurrence of gas hydrate at each of the sites drilled during this expedition. The LWD sensors just above the drill bit provided important information on the nature of the sediments and the occurrence of gas hydrate. There have been significant advancements in the use of downhole well-logging tools to acquire detailed information on the occurrence of gas hydrate in nature: from using electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells, to the routine use of wireline and advanced logging-while-drilling tools to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Recent integrated sediment coring and well-log studies have confirmed that electrical resistivity and acoustic velocity data can yield accurate gas hydrate saturations in sediment grain supported (isotropic) systems such as sand reservoirs, but more advanced log analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. In support of the GOM JIP Leg II effort, well-log data montages have been compiled and presented in this report, including downhole logs obtained from all seven wells drilled during this expedition, with a focus on identifying and characterizing the potential gas-hydrate-bearing sedimentary section in each of the wells. Also presented and reviewed in this report are the gas-hydrate saturation and sediment porosity logs for each of the wells as calculated from available downhole well logs.
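
Resistivity-based saturation estimates in isotropic sand reservoirs are conventionally made with the Archie relation; the sketch below uses it as a stand-in for the report's log-analysis models. The parameter values (a, m, n, Rw) are generic textbook choices, not the GOM JIP Leg II calibration.

```python
def hydrate_saturation(Rt, phi, Rw=0.25, a=1.0, m=2.0, n=2.0):
    """Fraction of pore space occupied by gas hydrate (Archie sketch).

    Rt  : measured formation resistivity (ohm-m)
    phi : porosity (fraction)
    Rw  : formation-water resistivity (ohm-m), assumed
    """
    Sw = (a * Rw / (phi ** m * Rt)) ** (1.0 / n)   # Archie water saturation
    return max(0.0, 1.0 - min(1.0, Sw))            # hydrate fills the rest

# A 40%-porosity sand reading 10 ohm-m:
print(round(hydrate_saturation(Rt=10.0, phi=0.40), 3))    # -> 0.605
```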

  19. Performance analysis of MIMO wireless optical communication system with Q-ary PPM over correlated log-normal fading channel

    Science.gov (United States)

    Wang, Huiqin; Wang, Xue; Lynette, Kibe; Cao, Minghua

    2018-06-01

    The performance of multiple-input multiple-output wireless optical communication systems that adopt Q-ary pulse position modulation over a spatially correlated log-normal fading channel is analyzed in terms of un-coded bit error rate and ergodic channel capacity. The analysis is based on Wilkinson's method, which approximates the distribution of a sum of correlated log-normal random variables by a log-normal random variable. The analytical and simulation results corroborate that increasing the correlation coefficients among sub-channels leads to system performance degradation. Moreover, receiver diversity performs better in resisting the channel fading caused by spatial correlation.
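
Wilkinson's method, which the analysis rests on, matches the first two moments of the sum of correlated log-normals to a single log-normal. A sketch with a Monte Carlo sanity check; the two-branch parameters are illustrative, not the paper's channel model.

```python
import numpy as np

def wilkinson(mu, sigma, rho):
    """(mu_z, sigma_z) of the log-normal exp(Z) matching sum_i exp(X_i),
    where X ~ N(mu, Sigma) with common pairwise correlation rho."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    # First moment of the sum: E[exp(X_i)] = exp(mu_i + sigma_i^2 / 2).
    u1 = np.sum(np.exp(mu + sigma ** 2 / 2))
    # Second moment: E[exp(X_i + X_j)] for jointly normal X_i, X_j.
    u2 = 0.0
    for i in range(len(mu)):
        for j in range(len(mu)):
            cov = sigma[i] ** 2 if i == j else rho * sigma[i] * sigma[j]
            u2 += np.exp(mu[i] + mu[j]
                         + (sigma[i] ** 2 + sigma[j] ** 2) / 2 + cov)
    var_z = np.log(u2 / u1 ** 2)
    return np.log(u1) - var_z / 2, np.sqrt(var_z)

mu_z, sig_z = wilkinson([0.0, 0.0], [0.5, 0.5], rho=0.6)

# Monte Carlo check of the matched first moment.
rng = np.random.default_rng(7)
C = np.array([[0.25, 0.6 * 0.25], [0.6 * 0.25, 0.25]])
Xs = rng.multivariate_normal([0.0, 0.0], C, size=200_000)
mc = np.exp(Xs).sum(axis=1).mean()
approx = np.exp(mu_z + sig_z ** 2 / 2)
print(f"Monte Carlo mean: {mc:.4f}  Wilkinson mean: {approx:.4f}")
```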

  20. SHIFT: server for hidden stops analysis in frame-shifted translation.

    Science.gov (United States)

    Gupta, Arun; Singh, Tiratha Raj

    2013-02-23

    Frameshift is one of the three classes of recoding. Frame-shifts lead to waste of energy, resources and activity of the biosynthetic machinery. In addition, some peptides synthesized after frame-shifts are probably cytotoxic, which serves as a plausible cause of numerous diseases and disorders such as muscular dystrophies, lysosomal storage disorders, and cancer. Hidden stop codons occur naturally in coding sequences among all organisms. These codons are associated with the early termination of translation after incorrect reading frame selection and help to reduce the metabolic cost related to frameshift events. Researchers have identified several consequences of hidden stop codons and their association with myriad disorders. However, the wealth of information available is scattered and not effortlessly amenable to data-mining. To reduce this gap, this work describes an algorithmic web-based tool to study hidden stops in frameshifted translation for all lineages through their respective genetic code systems. This paper describes SHIFT, an algorithmic web application tool that provides a user-friendly interface for identifying and analyzing hidden stops in frameshifted translation of genomic sequences for all available genetic code systems. We have calculated the correlation between codon usage frequencies and the plausible contribution of codons towards hidden stops in an off-frame context. Markovian chains of various orders have been used to model hidden stops in frameshifted peptides and their evolutionary association with naturally occurring hidden stops. In order to obtain reliable and persuasive estimates for the naturally occurring and predicted hidden stops, statistical measures have been implemented. This paper presented SHIFT, an algorithmic tool that allows user-friendly exploration, analysis, and visualization of hidden stop codons in frameshifted translations. It is expected that this web-based tool would serve as a useful complement for
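
The basic quantity such a tool tabulates, stop codons read in the +1 and +2 shifted frames, can be counted directly. A minimal sketch using the standard genetic code's stop codons and a toy sequence (SHIFT itself supports all genetic code tables):

```python
STOPS = {"TAA", "TAG", "TGA"}    # standard genetic code stop codons

def hidden_stops(seq, frame):
    """Count stop codons read in the given shifted frame (1 or 2).

    frame=0 counts in-frame stops, which a healthy ORF body lacks.
    """
    return sum(1 for i in range(frame, len(seq) - 2, 3)
               if seq[i:i + 3] in STOPS)

# Toy coding sequence: in frame 0 it reads ATG AAA TTA ACT GAC CC
# with no premature stop, yet the shifted frames hit hidden stops.
seq = "ATGAAATTAACTGACCC"
print(hidden_stops(seq, 1), hidden_stops(seq, 2))    # -> 2 1
```

Comparing such counts against the expectation from codon usage frequencies (e.g., via a Markov model of the coding sequence) is the statistical step the paper describes.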

  1. BSSF: a fingerprint based ultrafast binding site similarity search and function analysis server

    Directory of Open Access Journals (Sweden)

    Jiang Hualiang

    2010-01-01

    Full Text Available Abstract Background Genome sequencing and post-genomics projects such as structural genomics are extending the frontier of the study of the sequence-structure-function relationship of genes and their products. Although many sequence/structure-based methods have been devised with the aim of deciphering this delicate relationship, there still remain large gaps in this fundamental problem, which continuously drives researchers to develop novel methods to extract relevant information from sequences and structures and to infer the functions of newly identified genes by genomics technology. Results Here we present an ultrafast method, named BSSF (Binding Site Similarity & Function), which enables researchers to conduct similarity searches in a comprehensive three-dimensional binding site database extracted from PDB structures. This method utilizes a fingerprint representation of the binding site and a validated statistical Z-score function scheme to judge the similarity between the query and database items, even if their similarities are only constrained in a sub-pocket. This fingerprint-based similarity measurement was also validated on a known binding site dataset by comparison with geometric hashing, which is a standard 3D similarity method. The comparison clearly demonstrated the utility of this ultrafast method. After conducting the database search, the hit list is further analyzed to provide basic statistical information about the occurrences of Gene Ontology terms and Enzyme Commission numbers, which may benefit researchers by helping them to design further experiments to study the query proteins. Conclusions This ultrafast web-based system will not only help researchers interested in drug design and structural genomics to identify similar binding sites, but also assist them by providing further analysis of the hit list from database searching.

  2. Standardizing effect size from linear regression models with log-transformed variables for meta-analysis.

    Science.gov (United States)

    Rodríguez-Barranco, Miguel; Tobías, Aurelio; Redondo, Daniel; Molina-Portillo, Elena; Sánchez, María José

    2017-03-17

    Meta-analysis is very useful to summarize the effect of a treatment or a risk factor for a given disease. Often studies report results based on log-transformed variables in order to achieve the principal assumptions of a linear regression model. If this is the case for some, but not all, studies, the effects need to be homogenized. We derived a set of formulae to transform absolute changes into relative ones, and vice versa, to allow including all results in a meta-analysis. We applied our procedure to all possible combinations of log-transformed independent or dependent variables. We also evaluated it in a simulation based on two variables either normally or asymmetrically distributed. In all the scenarios, and based on different change criteria, the effect size estimated by the derived set of formulae was equivalent to the real effect size. To avoid biased estimates of the effect, this procedure should be used with caution in the case of independent variables with asymmetric distributions that differ significantly from the normal distribution. We illustrate this procedure with an application to a meta-analysis of the potential effects on neurodevelopment in children exposed to arsenic and manganese. The proposed procedure has been shown to be valid and capable of expressing the effect size of a linear regression model based on different change criteria in the variables. Homogenizing the results from different studies beforehand allows them to be combined in a meta-analysis, independently of whether the transformations had been performed on the dependent and/or independent variables.
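
The kind of conversion the paper derives follows from standard algebra on log-transformed regressions; the two identities below are textbook results, not the paper's full set of formulae, and the numeric inputs are illustrative.

```python
import math

def pct_change_outcome(beta, delta_x):
    """Model ln(y) = a + b*x: percent change in y
    for a delta_x-unit increase in x."""
    return (math.exp(beta * delta_x) - 1) * 100

def abs_change_outcome(beta, pct_x):
    """Model y = a + b*ln(x): absolute change in y
    for a pct_x percent increase in x."""
    return beta * math.log(1 + pct_x / 100)

# b = 0.05 on ln(y): a 2-unit rise in x raises y by ~10.5%.
print(round(pct_change_outcome(0.05, 2), 2))    # -> 10.52
# b = 1.8 on ln(x): a 10% rise in x raises y by ~0.17 units.
print(round(abs_change_outcome(1.8, 10), 3))    # -> 0.172
```

Expressing every study's slope on a common scale (e.g., "% change in outcome per 10% change in exposure") is what makes the pooled meta-analytic estimate meaningful.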

  3. NETWORK LOAD WHEN ACCESSING EMAIL FROM SEVERAL MAIL SERVERS

    Directory of Open Access Journals (Sweden)

    Husni Thamrin

    2017-01-01

    Full Text Available Expensive internet facilities require prudence in their use, both as a source of information and as a communication medium. This paper discusses observations of the network bandwidth load perceived when accessing several mail servers using a webmail application. The mail servers in question comprise three commercial servers and two non-commercial servers. Data recorded with the Wireshark sniffer cover downloading the home page, logging in, opening email, idling and logging out. Observations in various situations and scenarios indicate that accessing Yahoo email gives a very high network load, while SquirrelMail gives a very low network load compared with the five other mail servers. For an institution, use of a local (institutional) mail server is highly recommended in the context of bandwidth savings.

  4. Analysis of log rate noise in Ontario's CANDU reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hinds, H.W. [Dynamic Simulation and Analysis Corp., Deep River, Ontario (Canada); Banica, C.; Arguner, D. [Ontario Power Generation, Ajax, Ontario (Canada); Scharfenberg, R. [Bruce Power, Tiverton, Ontario (Canada)

    2007-07-01

    In the fall of 2003, operators noticed that in the recently refurbished Bruce A Shutdown System No. 1 (SDS1) the noise level in the Log Rate signals was much larger than before. At the request of the Canadian Nuclear Safety Commission (CNSC), all Canadian CANDU reactors took action to characterize their Log Rate noise. Staff of the Inspection and Maintenance Services division of Ontario Power Generation (OPG) have collected high-speed, high-accuracy noise data from nearly all 16 Ontario reactors, either as part of routine measurements before planned outages or as dedicated noise recordings. This paper gives the results of examining a suitable subset of this data with respect to the characteristics and possible causes of Log Rate noise. The reactor and instrumentation design differs at each station: the locations of the moderator injection nozzles, the location of the ion chambers for each system, and the design of the Log Rate amplifiers. It was found that the Log noise (the source of Log Rate noise) was much larger for those ion chambers in the path of the moderator injection nozzles than for those which were not in the path. This 'extra' Log noise would then be either attenuated or amplified depending on the transfer function (time constants) of the Log Rate amplifier. It was also observed that most of the Log and Log Rate noise is independent of any other signal measured. Although all CANDU reactors in Ontario have Log and Log Rate noise, the Bruce A SDS1 system has the largest amount of Log Rate noise, because (a) its SDS1 (and RRS) ion chambers are at the top of the reactor in the path of the moderator injection nozzles, and (b) its SDS1 Log Rate amplifiers have the smallest time constants. (author)

  5. CMLOG: A common message logging system

    International Nuclear Information System (INIS)

    Chen, J.; Akers, W.; Bickley, M.; Wu, D.; Watson, W. III

    1997-01-01

The Common Message Logging (CMLOG) system is an object-oriented and distributed system that not only allows applications and systems to log data (messages) of any type into a centralized database but also lets applications view incoming messages in real-time or retrieve stored data from the database according to selection rules. It consists of a concurrent Unix server that handles incoming logging or searching messages, a Motif browser that can view incoming messages in real-time or display stored data in the database, a client daemon that buffers and sends logging messages to the server, and libraries that can be used by applications to send data to or retrieve data from the database via the server. This paper presents the design and implementation of the CMLOG system and also addresses the integration of CMLOG into existing control systems.

  6. Preliminary analysis of geophysical logs from drill hole UE-25p No. 1, Yucca Mountain, Nye County, Nevada

    International Nuclear Information System (INIS)

    Muller, D.C.; Kibler, J.E.

    1984-01-01

    Geophysical logs from drill hole UE-25p No. 1 correlate well with logs through the same geologic units from other drill holes at Yucca Mountain, Nevada. The in-situ physical properties of the rocks as determined from well logs are consistent with laboratory-measured physical properties of core from other drill holes. The density, neutron and caliper logs are very spiky through most of the Topopah Spring Member. This spikiness occurs on the same logs in cored holes where the Topopah Spring Member is highly fractured and lithophysal. The uranium channel of the spectral gamma-ray log through the Topopah Spring Member correlates with uranium logs from cored holes where most of the fractures have not been healed or filled with materials that concentrate uranium. Therefore, fracture porosity and permeability of the Topopah Spring Member are expected to be high and consistent with fracture analysis from other drill holes on Yucca Mountain, and hydrologic tests from well J-13. The Paleozoic dolomites which underlie the Tertiary tuffs are intensely brecciated, and the uranium count rate is much higher than normal for dolomites because uranium has been concentrated in the recementing material. 19 references, 1 figure, 2 tables

  7. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    Science.gov (United States)

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  8. Lithology identification of aquifers from geophysical well logs and fuzzy logic analysis: Shui-Lin Area, Taiwan

    Science.gov (United States)

    Hsieh, Bieng-Zih; Lewis, Charles; Lin, Zsay-Shing

    2005-04-01

The purpose of this study is to construct a fuzzy lithology system from well logs to identify the formation lithology of a groundwater aquifer system, so that conventional well-logging interpretation can be better applied in hydrogeologic studies; well-log responses of aquifers sometimes differ from those of conventional oil and gas reservoirs. The input variables for this system are the gamma-ray log reading, the separation between the spherically focused resistivity and the deep very-enhanced resistivity curves, and the borehole-compensated sonic log reading. The output variable is groundwater formation lithology. All linguistic variables are based on five linguistic terms with a trapezoidal membership function. In this study, 50 data sets are clustered into 40 training sets and 10 testing sets for constructing the fuzzy lithology system and validating its predictive ability, respectively. The rule-based database containing 12 fuzzy lithology rules is developed from the training data sets, and the rule strength is weighted. A Mamdani inference system and the bisector-of-area defuzzification method are used for fuzzy inference and defuzzification. The training success rate and the prediction accuracy were both 90%, with correlations for training and testing of 0.925 and 0.928, respectively. Well logs and core data from a clastic aquifer (depths 100-198 m) in the Shui-Lin area of west-central Taiwan are used for testing the system's construction. Comparison of results from core analysis, well logging, and the fuzzy lithology system indicates that even though the well-logging method can easily define a permeable sand formation, distinguishing between silts and sands and determining grain-size variation in sands is more subjective. These shortcomings can be improved by a fuzzy lithology system, which is able to yield more objective decisions than some conventional methods of log interpretation.
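The trapezoidal membership functions and Mamdani-style rule firing described in the abstract can be sketched as follows. This is an illustrative sketch only: the breakpoints and the gamma-ray/sonic terms are invented for the example, not the values of the Shui-Lin system.

```python
# A trapezoidal membership function like those used for the five
# linguistic terms, plus a min-based Mamdani rule firing strength.
# All numeric breakpoints below are hypothetical.

def trapezoid(x, a, b, c, d):
    """Membership rises from 0 at a to 1 at b, stays 1 until c, falls to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def rule_strength(antecedent_memberships):
    """Mamdani AND: a rule fires with the minimum of its antecedent memberships."""
    return min(antecedent_memberships)

# Hypothetical terms: a "low" gamma-ray reading (API units) and a
# "moderate" sonic reading (us/ft), combined in one rule.
m_gamma = trapezoid(45.0, 0, 10, 30, 60)     # partially "low" -> 0.5
m_sonic = trapezoid(80.0, 50, 70, 120, 140)  # fully "moderate" -> 1.0
strength = rule_strength([m_gamma, m_sonic]) # rule fires at 0.5
```

In a full Mamdani system each rule's strength would then clip or scale its output fuzzy set before bisector-of-area defuzzification aggregates all fired rules into a lithology decision.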

  9. The feasibility of well-logging measurements of arsenic levels using neutron-activation analysis

    Science.gov (United States)

    Oden, C.P.; Schweitzer, J.S.; McDowell, G.M.

    2006-01-01

Arsenic is an extremely toxic metal, which poses a significant problem in many mining environments. Arsenic contamination is also a major problem in ground and surface waters. A feasibility study was conducted to determine if neutron-activation analysis is a practical method of measuring in situ arsenic levels. The response of hypothetical well-logging tools to arsenic was simulated using a readily available Monte Carlo simulation code (MCNP). Simulations were made for probes with both hyperpure germanium (HPGe) and bismuth germanate (BGO) detectors using accelerator and isotopic neutron sources. Both sources produce similar results; however, the BGO detector is much more susceptible to spectral interference than the HPGe detector. Spectral interference from copper can preclude low-level arsenic measurements when using the BGO detector. Results show that a borehole probe could be built that would measure arsenic concentrations of 100 ppm by weight to an uncertainty of 50 ppm in about 15 min. © 2006 Elsevier Ltd. All rights reserved.

  10. Improved analysis of bacterial CGH data beyond the log-ratio paradigm

    Directory of Open Access Journals (Sweden)

    Aakra Ågot

    2009-03-01

Full Text Available Abstract Background Existing methods for analyzing bacterial CGH data from two-color arrays are based on log-ratios only, a paradigm inherited from expression studies. We propose an alternative approach, where microarray signals are used in a different way and sequence identity is predicted using a supervised learning approach. Results A data set containing 32 hybridizations of sequenced versus sequenced genomes has been used to test and compare methods. An ROC analysis has been performed to illustrate the ability to rank probes with respect to Present/Absent calls. Classification into Present and Absent is compared with that of a Gaussian mixture model. Conclusion The results indicate that our proposed method improves on existing methods with respect to the ranking and classification of probes, especially for multi-genome arrays.
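The probe-ranking ROC analysis mentioned in the abstract reduces to a ranking statistic. A minimal sketch of the area under the ROC curve, computed directly as the Mann-Whitney probability that a Present probe outranks an Absent one (the scores below are made up for illustration):

```python
def auc(present_scores, absent_scores):
    """AUC as the probability that a randomly chosen Present probe
    scores higher than a randomly chosen Absent probe; ties count half."""
    wins = 0.0
    for p in present_scores:
        for a in absent_scores:
            if p > a:
                wins += 1.0
            elif p == a:
                wins += 0.5
    return wins / (len(present_scores) * len(absent_scores))

# Perfectly separated scores give AUC = 1.0; random ranking gives 0.5.
perfect = auc([0.9, 0.8], [0.1, 0.2])
```

An AUC of 1.0 means the classifier ranks every Present probe above every Absent probe; 0.5 is no better than chance, which is the baseline any improved ranking method must beat.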

  11. Spectral Noise Logging for well integrity analysis in the mineral water well in Asselian aquifer

    Directory of Open Access Journals (Sweden)

    R.R. Kantyukov

    2017-06-01

Full Text Available This paper describes a mineral water well with a decreasing salinity level according to lab tests. A well-integrity package including Spectral Noise Logging (SNL), High-Precision Temperature (HPT) logging and electromagnetic defectoscopy (EmPulse) was performed in the well, which allowed the casing leaks and the fresh-water source to be found. In the paper, all logging data are thoroughly analyzed and recommendations for workover are given. The SNL-HPT-EmPulse survey made it possible to avoid abandoning the well.

  12. A Multi-temporal Analysis of Logging Impacts on Tropical Forest Structure Using Airborne Lidar Data

    Science.gov (United States)

    Keller, M. M.; Pinagé, E. R.; Duffy, P.; Longo, M.; dos-Santos, M. N.; Leitold, V.; Morton, D. C.

    2017-12-01

The long-term impacts of selective logging on carbon cycling and ecosystem function in tropical forests are still uncertain. Despite improvements in selective logging detection using satellite data, quantifying changes in forest structure from logging and recovery following logging is difficult using orbital data. We analyzed the dynamics of forest structure, comparing logged and unlogged forests in the Eastern Brazilian Amazon (Paragominas Municipality, Pará State), using small-footprint discrete-return airborne lidar data acquired in 2012 and 2014. Logging operations were conducted at the 1200 ha study site from 2006 through 2013 using reduced-impact logging techniques—management practices that minimize canopy and ground damage compared to more common conventional logging. Nevertheless, logging still reduced aboveground biomass by 10% to 20% in logged areas compared to intact forests. We aggregated lidar point-cloud data at spatial scales ranging from 50 m to 250 m, developed a binomial classification model based on the height distribution of lidar returns in 2012, and validated the model against the 2014 lidar acquisition. We accurately classified intact and logged forest classes compared with field data. Classification performance improved as spatial resolution increased (AUC = 0.974 at 250 m). We analyzed the differences in canopy gaps, understory damage (based on a relative density model), and biomass (estimated from total canopy height) of intact and logged classes. As expected, logging greatly increased both canopy gap formation and understory damage. However, while the area identified as canopy gap persisted for at least 8 years (from the oldest logging treatments in 2006 to the most recent lidar acquisition in 2014), the effects of ground damage were mostly erased by vigorous understory regrowth after about 5 years. The rate of new gap formation was 6 to 7 times greater in recently logged forests compared to undisturbed forests. New gaps opened at a…

  13. Analysis of free geo-server software usability from the viewpoint of INSPIRE requirements

    Directory of Open Access Journals (Sweden)

    Tomasz Grasza

    2014-06-01

Full Text Available The paper presents selected server platforms based on free and open-source licenses, coherent with the standards of the Open Geospatial Consortium. The presented programs are evaluated in the context of the INSPIRE Directive. The first part describes the requirements of the Directive; afterwards, the pros and cons of each platform in meeting these demands are presented. The article answers the question of whether free software can provide interoperable network services in accordance with the requirements of the INSPIRE Directive, while presenting application examples and practical tips on the use of the particular programs. Keywords: GIS, INSPIRE, free software, OGC, geoportal, network services, GeoServer, deegree, GeoNetwork

  14. Linux Server Security

    CERN Document Server

    Bauer, Michael D

    2005-01-01

Linux consistently appears high up in the list of popular Internet servers, whether it's for the Web, anonymous FTP, or general services such as DNS and delivering mail. But security is the foremost concern of anyone providing such a service. Any server experiences casual probe attempts dozens of times a day, and serious break-in attempts with some frequency as well. This highly regarded book, originally titled Building Secure Servers with Linux, combines practical advice with a firm knowledge of the technical tools needed to ensure security. The book focuses on the most common use of Linux--

  15. Web Server Embedded System

    Directory of Open Access Journals (Sweden)

    Adharul Muttaqin

    2014-07-01

Full Text Available Embedded systems currently receive particular attention in computer technology; a range of Linux operating systems and web servers has been prepared to support them, and one application that can run on an embedded system is a web server. The choice of web server for embedded environments is still rarely studied, so this research focuses on two web server applications whose main feature is "lightness" in CPU and memory consumption: Light HTTPD and Tiny HTTPD. Using thread parameters (users, ramp-up period, and loop count) in a stress test of the embedded system, this study determines which of Light HTTPD and Tiny HTTPD better suits an embedded system on a BeagleBoard, in terms of CPU and memory consumption. The results show that, with respect to CPU consumption on the BeagleBoard embedded system, Light HTTPD is recommended over Tiny HTTPD, because there is a very significant difference in CPU load between the two web services. Keywords: embedded system, web server

  16. Learning Zimbra Server essentials

    CERN Document Server

    Kouka, Abdelmonam

    2013-01-01

A standard tutorial approach that guides readers through all of the intricacies of the Zimbra Server. If you are any kind of Zimbra user, this book will be useful for you, from newbies to experts who would like to learn how to set up a Zimbra server. If you are an IT administrator or consultant who is exploring the idea of adopting, or has already adopted, Zimbra as your mail server, then this book is for you. No prior knowledge of Zimbra is required.

  17. Big Data demonstrator using Hadoop to build a Linux cluster for log data analysis using R

    DEFF Research Database (Denmark)

    Torbensen, Rune Sonnich; Top, Søren

    2017-01-01

This article walks through the steps to create a Hadoop Linux cluster in the cloud and outlines how to analyze device log data via an example in the R programming language.
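The flavor of such device-log analysis can be conveyed with a minimal aggregation example (shown here in Python rather than R; the line format and device IDs are invented for illustration, not taken from the article):

```python
from collections import Counter

def count_by_device(log_lines):
    """Count log events per device, assuming a hypothetical
    'timestamp device_id message' whitespace-separated line format."""
    counts = Counter()
    for line in log_lines:
        parts = line.split(maxsplit=2)
        if len(parts) >= 2:
            counts[parts[1]] += 1
    return counts

logs = [
    "2017-01-02T10:00:01 dev42 temperature=21.5",
    "2017-01-02T10:00:05 dev7 boot",
    "2017-01-02T10:00:09 dev42 temperature=21.7",
]
counts = count_by_device(logs)  # dev42 -> 2, dev7 -> 1
```

On a Hadoop cluster the same per-device grouping would be expressed as a map step (emit device ID per line) and a reduce step (sum counts), which is why log aggregation is a natural first workload for such a demonstrator.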

  18. Server hardware trends

    CERN Multimedia

    CERN. Geneva

    2014-01-01

This talk will cover the status of current and upcoming offerings on server platforms, focusing mainly on the processing and storage parts. Alternative solutions like Open Compute (OCP) will be covered briefly.

  19. Locating Hidden Servers

    National Research Council Canada - National Science Library

    Oeverlier, Lasse; Syverson, Paul F

    2006-01-01

    .... Announced properties include server resistance to distributed DoS. Both the EFF and Reporters Without Borders have issued guides that describe using hidden services via Tor to protect the safety of dissidents as well as to resist censorship...

  20. Network characteristics for server selection in online games

    Science.gov (United States)

    Claypool, Mark

    2008-01-01

Online gameplay is impacted by the network characteristics of players connected to the same server. Unfortunately, the network characteristics of online game servers are not well understood, particularly for groups that wish to play together on the same server. As a step towards a remedy, this paper presents analysis of an extensive set of measurements of game servers on the Internet. Over the course of many months, actual Internet game servers were queried simultaneously by twenty-five emulated game clients, with both servers and clients spread out on the Internet. The data provides statistics on the uptime and populations of game servers over a month-long period and an in-depth look at the suitability of game servers for multi-player server selection, concentrating on characteristics critical to playability--latency and fairness. Analysis finds most game servers have latencies suitable for third-person and omnipresent games, such as real-time strategy, sports and role-playing games, providing numerous server choices for game players. However, far fewer game servers have the low latencies required for first-person games, such as shooters or racing games. In all cases, groups that wish to play together have a greatly reduced set of servers from which to choose because of inherent unfairness in server latencies, and server selection is particularly limited as the group size increases. These results hold across different game types and even across different generations of games. The data should be useful for game developers and network researchers who seek to improve game server selection, whether for single or multiple players.

  1. Noninvasive detection of inhomogeneities in turbid media with time-resolved log-slope analysis

    International Nuclear Information System (INIS)

    Wan, S.K.; Guo Zhixiong; Kumar, Sunil; Aber, Janice; Garetz, B.A.

    2004-01-01

Detecting foreign objects embedded in turbid media using noninvasive optical tomography techniques is of great importance in many practical applications, such as biomedical imaging and diagnosis, safety inspection of aircraft and submarines, and LIDAR techniques. In this paper we develop a novel optical tomography approach based on slope analysis of time-resolved back-scattered signals collected at the medium boundaries, where the light source is an ultrafast, short-pulse laser. As the optical field induced by the laser pulse propagates, the detected temporal signals are influenced by the optical properties of the medium traversed. The detected temporal signatures therefore contain information that can indicate the presence of an inhomogeneity as well as its size and location relative to the laser source and detection systems. The log-slope analysis of the time-resolved back-scattered intensity is shown to be an effective method for extracting the information contained in the signal. The technique is validated by experimental results and by Monte Carlo simulations.
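At its core, log-slope analysis fits the slope of ln(intensity) against time: for a single-exponential tail I(t) = I0·exp(-t/τ), the log-slope is the constant -1/τ, and an embedded inhomogeneity perturbs that slope. A minimal sketch with a synthetic decay (the paper's actual signals are experimental; the values here are illustrative):

```python
import math

def log_slope(times, intensities):
    """Least-squares slope of ln(I) versus t. For a single-exponential
    decay I(t) = I0 * exp(-t / tau) this recovers -1/tau."""
    logs = [math.log(i) for i in intensities]
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(logs) / n
    num = sum((t - tbar) * (y - ybar) for t, y in zip(times, logs))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

# Synthetic time-resolved signal with tau = 0.5 ns -> log-slope = -2.0 per ns
ts = [0.1 * k for k in range(10)]
ys = [2.0 * math.exp(-t / 0.5) for t in ts]
slope = log_slope(ts, ys)
```

A detector trace whose fitted log-slope deviates from the homogeneous-medium value flags the presence of an inhomogeneity; comparing slopes across detector positions then constrains its size and location.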

  2. Efficient Incremental Garbage Collection for Workstation/Server Database Systems

    OpenAIRE

    Amsaleg, Laurent; Gruber, Olivier; Franklin, Michael

    1994-01-01

Projet RODIN; We describe an efficient server-based algorithm for garbage collecting object-oriented databases in a workstation/server environment. The algorithm is incremental and runs concurrently with client transactions; however, it does not hold any locks on data and does not require callbacks to clients. It is fault tolerant, but performs very little logging. The algorithm has been designed to be integrated into existing OODB systems, and therefore it works with standard implementation ...

  3. Remote sensing of selective logging in Amazonia: Assessing limitations based on detailed field observations, Landsat ETM+, and textural analysis

    Science.gov (United States)

    Gregory P. Asner; Michael Keller; Rodrigo Pereira; Johan C. Zweede

    2002-01-01

    We combined a detailed field study of forest canopy damage with calibrated Landsat 7 Enhanced Thematic Mapper Plus (ETM+) reflectance data and texture analysis to assess the sensitivity of basic broadband optical remote sensing to selective logging in Amazonia. Our field study encompassed measurements of ground damage and canopy gap fractions along a chronosequence of...

  4. The Use of OPAC in a Large Academic Library: A Transactional Log Analysis Study of Subject Searching

    Science.gov (United States)

    Villen-Rueda, Luis; Senso, Jose A.; de Moya-Anegon, Felix

    2007-01-01

The analysis of user searches in catalogs has been a topic of research for over four decades, involving numerous studies and diverse methodologies. The present study looks at how different types of users perform queries in the catalog of a university library. For this purpose, we analyzed log files to determine which was the most frequent type of…

  5. Building a web-based CAD server for clinical use, evaluation, and incremental learning. Implementation of analysis function based on execution result and clinical feedback

    International Nuclear Information System (INIS)

    Nomura, Yukihiro; Hayashi, Naoto; Masutani, Yoshitaka; Yoshikawa, Takeharu; Nemoto, Mitsutaka; Hanaoka, Shouhei; Maeda, Eriko; Ohtomo, Kuni; Miki, Soichiro

    2010-01-01

Development of clinical image analysis software such as computer-assisted detection/diagnosis (CAD) involves a cycle of algorithm development, software implementation, clinical use, and refinement of the algorithm and software based on feedback. This cycle is expected to accelerate the development of CAD software. We have been building a web-based CAD server that enables radiologists to use CAD software and to give feedback in a clinical environment. The platform has been used in our hospital for 16 months, and more than 2,000 cases of feedback data have been accumulated. In this report, we introduce additional functions for performance evaluation based on the execution results of CAD software and clinical feedback. (author)

  6. Comparison of Quantitative Analysis of Image Logs for Shale Volume and Net to Gross Calculation of a Thinly Laminated Reservoir between VNG-NERGE and LAGIA-EGYPT

    Directory of Open Access Journals (Sweden)

    Ahmed Z. Nooh

    2017-09-01

The resolution of gamma-ray log data is considerably lower than that of the FMI log for reflecting lithology changes accurately in thinly bedded reservoirs. It has been found that, after some calibrations and corrections on the FMI resistivity log, the newly processed log can be used for clay-volume and net-to-gross calculation of the reservoir, indicating the potential of this log for the analysis of thin beds. A comparison between VNG-NERGE (North Sea well NERWING) and LAGIA-8 (Lagia, Egypt) presents the shale-volume calculation at different intervals using FMI tools.

  7. An Analysis Platform for Mobile Ad Hoc Network (MANET) Scenario Execution Log Data

    Science.gov (United States)

    2016-01-01

these technologies. Backend technologies: Java 1.8, mysql-connector-java-5.0.8.jar, Tomcat, VirtualBox, and the Kali MANET virtual machine. Frontend technologies: LAMPP. Database: MySQL Server. The SEDAP database settings and structure are described in this section... contains all the backend Java functionality, including the web services, and should be placed in the webapps directory inside the Tomcat installation

  8. Comparative Analysis of Load Balancing Between a Single Web Server and a Web Server Cluster Using Linux Virtual Server

    OpenAIRE

    Lukitasari, Desy; Oklilas, Ahmad Fali

    2010-01-01

A virtual server is a server with high scalability and high availability, built on top of a cluster of several real servers. The real servers and the load balancer are interconnected either over a high-speed local network or across geographically dispersed locations. The load balancer can dispatch requests to the different servers and make the parallel services of a cluster appear at a single IP address, and request dispatching can use IP load...

  9. Calibration of a neutron log in partially saturated media. Part II. Error analysis

    International Nuclear Information System (INIS)

    Hearst, J.R.; Kasameyer, P.W.; Dreiling, L.A.

    1981-01-01

Four sources of error (uncertainty) are studied in water content obtained from neutron logs calibrated in partially saturated media for holes up to 3 m. For this calibration a special facility was built, and an algorithm for a commercial epithermal neutron log was developed that obtains water content from count rate, bulk density, and the gap between the neutron sonde and the borehole wall. The algorithm contained errors due to the calibration and lack of fit, while the field measurements included uncertainties in the count rate (caused by statistics and a short time constant), gap, and density. There can also be inhomogeneity in the material surrounding the borehole. Under normal field conditions the hole-size-corrected water content obtained from such neutron logs can have an uncertainty as large as 15% of its value.

  10. A mixed-methods analysis of logging injuries in Montana and Idaho.

    Science.gov (United States)

    Lagerstrom, Elise; Magzamen, Sheryl; Rosecrance, John

    2017-12-01

Despite advances in mechanization, logging continues to be one of the most dangerous occupations in the United States. Logging in the Intermountain West region (Montana and Idaho) is especially hazardous due to steep terrain, extreme weather, and remote work locations. We implemented a mixed-methods approach combining analyses of workers' compensation claims and focus groups to identify factors associated with injuries and fatalities in the logging industry. Inexperienced workers (<6 months' experience) accounted for over 25% of claims. Sprain/strain injuries were the most common, accounting for 36% of claims, while fatalities had the highest median claim cost ($274 411). Focus groups identified job tasks involving felling trees, skidding, and truck driving as having the highest risk. Injury prevention efforts should focus on training related to safe work methods (especially for inexperienced workers), the development of a safety culture and safety leadership, and the implementation of engineering controls. © 2017 Wiley Periodicals, Inc.

  11. NeXT server

    CERN Document Server

    1989-01-01

    The first website at CERN - and in the world - was dedicated to the World Wide Web project itself and was hosted on Berners-Lee's NeXT computer. The website described the basic features of the web; how to access other people's documents and how to set up your own server. This NeXT machine - the original web server - is still at CERN. As part of the project to restore the first website, in 2013 CERN reinstated the world's first website to its original address.

  12. Reconciling the Log-Linear and Non-Log-Linear Nature of the TSH-Free T4 Relationship: Intra-Individual Analysis of a Large Population.

    Science.gov (United States)

    Rothacker, Karen M; Brown, Suzanne J; Hadlow, Narelle C; Wardrop, Robert; Walsh, John P

    2016-03-01

    The TSH-T4 relationship was thought to be inverse log-linear, but recent cross-sectional studies report a complex, nonlinear relationship; large, intra-individual studies are lacking. Our objective was to analyze the TSH-free T4 relationship within individuals. We analyzed data from 13 379 patients, each with six or more TSH/free T4 measurements and at least a 5-fold difference between individual median TSH and minimum or maximum TSH. Linear and nonlinear regression models of log TSH on free T4 were fitted to data from individuals and goodness of fit compared by likelihood ratio testing. Comparing all models, the linear model achieved best fit in 31% of individuals, followed by quartic (27%), cubic (15%), null (12%), and quadratic (11%) models. After eliminating least favored models (with individuals reassigned to best fitting, available models), the linear model fit best in 42% of participants, quartic in 43%, and null model in 15%. As the number of observations per individual increased, so did the proportion of individuals in whom the linear model achieved best fit, to 66% in those with more than 20 observations. When linear models were applied to all individuals and averaged according to individual median free T4 values, variations in slope and intercept indicated a nonlinear log TSH-free T4 relationship across the population. The log TSH-free T4 relationship appears linear in some individuals and nonlinear in others, but is predominantly linear in those with the largest number of observations. A log-linear relationship within individuals can be reconciled with a non-log-linear relationship in a population.
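The within-individual model comparison described above can be sketched with a toy example: fit an intercept-only (null) model and a linear model of log TSH on free T4, then form the Gaussian likelihood-ratio statistic 2(llₗᵢₙ − llₙᵤₗₗ) = n·ln(RSSₙᵤₗₗ/RSSₗᵢₙ). The data values below are invented for illustration; the study fitted up to quartic models per patient.

```python
import math

def rss_null(y):
    """Residual sum of squares of the intercept-only (null) model."""
    ybar = sum(y) / len(y)
    return sum((v - ybar) ** 2 for v in y)

def rss_linear(x, y):
    """Residual sum of squares after an ordinary least-squares line fit."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    a = ybar - b * xbar
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

def likelihood_ratio(x, y):
    """2*(ll_linear - ll_null) under Gaussian errors = n * ln(RSS_null / RSS_linear)."""
    n = len(x)
    return n * math.log(rss_null(y) / rss_linear(x, y))

# Invented per-patient observations: free T4 (pmol/L) vs log TSH
x = [10.0, 12.0, 14.0, 16.0]
y = [1.0, 0.6, 0.1, -0.3]
lr = likelihood_ratio(x, y)  # large -> linear model fits much better than null
```

Comparing this statistic against a chi-squared threshold (degrees of freedom = difference in parameter counts) is how nested models such as linear versus quartic are ranked per individual.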

  13. Annual rainfall statistics for stations in the Top End of Australia: normal and log-normal distribution analysis

    International Nuclear Information System (INIS)

    Vardavas, I.M.

    1992-01-01

A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia, where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
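The central computation, finding the rainfall value exceeded with a specified probability under a log-normal fit, can be sketched as follows. This is an illustrative sketch of the general log-normal exceedance calculation, not the paper's exact procedure, and the rainfall figures are invented.

```python
import math
from statistics import NormalDist, mean, stdev

def lognormal_exceedance_value(rainfall_mm, p_exceed):
    """Annual-rainfall value exceeded with probability p_exceed,
    assuming the log of the data is normally distributed:
    value = exp(mu + sigma * z), where z is the (1 - p_exceed) normal quantile."""
    logs = [math.log(r) for r in rainfall_mm]
    mu, sigma = mean(logs), stdev(logs)
    z = NormalDist().inv_cdf(1.0 - p_exceed)
    return math.exp(mu + sigma * z)

# At p_exceed = 0.5 the result is the geometric mean of the data:
median_value = lognormal_exceedance_value([100.0, 400.0], 0.5)  # sqrt(100*400) = 200
```

The χ²-test in the abstract decides whether the normal or log-normal fit is acceptable before such a quantile is read off; the ζ-test checks that the sample is large enough for the χ²-test itself to be meaningful.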

  14. Professional SQL Server 2005 administration

    CERN Document Server

    Knight, Brian; Snyder, Wayne; Armand, Jean-Claude; LoForte, Ross; Ji, Haidong

    2007-01-01

    SQL Server 2005 is the largest leap forward for SQL Server since its inception. With this update comes new features that will challenge even the most experienced SQL Server DBAs. Written by a team of some of the best SQL Server experts in the industry, this comprehensive tutorial shows you how to navigate the vastly changed landscape of the SQL Server administration. Drawing on their own first-hand experiences to offer you best practices, unique tips and tricks, and useful workarounds, the authors help you handle even the most difficult SQL Server 2005 administration issues, including blockin

  15. Performance Analysis of MTD64, our Tiny Multi-Threaded DNS64 Server Implementation: Proof of Concept

    Directory of Open Access Journals (Sweden)

    Gábor Lencse

    2016-07-01

In this paper, the performance of MTD64 is measured and compared to that of the industry-standard BIND in order to check the correctness of the design concepts of MTD64, especially the decision to use a new thread for each request. For the performance measurements, our earlier proposed dns64perf program is enhanced as dns64perf2, which is also documented in this paper. We found that MTD64 significantly outperformed BIND, and hence our design principles may be useful for the design of a high-performance, production-class DNS64 server. As an additional test, we also examined the effect of dynamic CPU frequency scaling on the performance of the implementations.

  16. People searching for people: analysis of a people search engine log

    NARCIS (Netherlands)

    Weerkamp, W.; Berendsen, R.; Kovachev, B.; Meij, E.; Balog, K.; de Rijke, M.

    2011-01-01

    Recent years show an increasing interest in vertical search: searching within a particular type of information. Understanding what people search for in these "verticals" gives direction to research and provides pointers for the search engines themselves. In this paper we analyze the search logs of

  17. Analysis and Summary of Historical Dry Well Gamma Logs for S Tank Farm 200 West

    International Nuclear Information System (INIS)

    MYERS, D.A.

    1999-01-01

    Gross gamma ray logs, recorded from January 1975 through mid-year 1994 as part of the Single-Shell Tank Farm Dry Well Surveillance Program, have been reanalyzed for the S tank farm to locate the presence of mobile radionuclides in the subsurface

  18. Progress in analysis of computed tomography (CT) images of hardwood logs for defect detection

    Science.gov (United States)

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt

    2003-01-01

    This paper addresses the problem of automatically detecting internal defects in logs using computed tomography (CT) images. The overall purpose is to assist in breakdown optimization. Several studies have shown that the commercial value of resulting boards can be increased substantially if defect locations are known in advance, and if this information is used to make...

  19. Combining Data Warehouse and Data Mining Techniques for Web Log Analysis

    DEFF Research Database (Denmark)

    Pedersen, Torben Bach; Jespersen, Søren; Thorhauge, Jesper

    2008-01-01

    a number of approaches that combine data warehousing and data mining techniques in order to analyze Web logs. After introducing the well-known click and session data warehouse (DW) schemas, the chapter presents the subsession schema, which allows fast queries on sequences...

  20. Logs Analysis of Adapted Pedagogical Scenarios Generated by a Simulation Serious Game Architecture

    Science.gov (United States)

    Callies, Sophie; Gravel, Mathieu; Beaudry, Eric; Basque, Josianne

    2017-01-01

    This paper presents an architecture designed for simulation serious games, which automatically generates game-based scenarios adapted to the learner's learning progression. We present three central modules of the architecture: (1) the learner model, (2) the adaptation module and (3) the logs module. The learner model estimates the progression of the…

  1. Comparison of Certification Authority Roles in Windows Server 2003 and Windows Server 2008

    Directory of Open Access Journals (Sweden)

    A. I. Luchnik

    2011-03-01

    Full Text Available An analysis of the Certification Authority components of Microsoft server operating systems was conducted. Based on the results, the main directions of development of certification authorities and PKI were highlighted.

  2. SU-F-T-295: MLCs Performance and Patient-Specific IMRT QA Using Log File Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Osman, A [King Fahd University of Petroleum and Minerals, Dhahran (Saudi Arabia); American University of Beirut Medical Center, Beirut (Lebanon); Maalej, N [King Fahd University of Petroleum and Minerals, Dhahran (Saudi Arabia); Jayesh, K; Abdel-Rahman, W [King Fahad Specialist Hospital-Dammam, Eastern Province (Saudi Arabia)

    2016-06-15

    Purpose: To analyze the performance of the multi-leaf collimators (MLCs) from the log files recorded during intensity modulated radiotherapy (IMRT) treatment, and to construct relative fluence maps and perform gamma analysis to compare the planned and executed MLC movements. Methods: We developed a program to extract and analyze the data from dynamic log files (dynalog files) generated from sliding-window IMRT delivery treatments. The program extracts the planned and executed (actual or delivered) MLC movements, and calculates and compares the relative planned and executed fluences. The fluence maps were used to perform gamma analysis (with 3% dose difference and 3 mm distance to agreement) for 3 IMRT patients. We compared our gamma analysis results with those obtained from the portal dose image prediction (PDIP) algorithm performed using the EPID. Results: For 3 different IMRT patient treatments, the maximum difference between the planned and the executed MLC positions was 1.2 mm. The gamma analysis results of the planned and delivered fluences were in good agreement with the gamma analysis from portal dosimetry. The maximum difference in the number of pixels passing the gamma criteria (3%/3mm) was 0.19% with respect to the portal dosimetry results. Conclusion: MLC log files can be used to verify the performance of the MLCs. Patient-specific IMRT QA based on MLC movement log files gives similar results to EPID dosimetry. This promising method for patient-specific IMRT QA is fast, does not require dose measurements in a phantom, can be done before the treatment and for every fraction, and significantly reduces the IMRT workload. The author would like to thank King Fahd University of Petroleum and Minerals for the support.
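The fluence comparison in this record can be illustrated with a simplified one-dimensional global gamma computation. This is a generic sketch of the 3%/3 mm criterion, not the clinical algorithm or the authors' code; the function names are made up for illustration:

```python
import math

def gamma_1d(ref, ev, spacing_mm, dd=0.03, dta_mm=3.0):
    """Simplified 1-D global gamma index: for each reference point, the
    minimum combined dose-difference / distance-to-agreement metric
    over all evaluated points (brute-force search)."""
    max_ref = max(ref)
    gammas = []
    for i, r in enumerate(ref):
        best = float("inf")
        for j, e in enumerate(ev):
            dose_term = ((e - r) / (dd * max_ref)) ** 2       # 3% of global max
            dist_term = (((j - i) * spacing_mm) / dta_mm) ** 2  # 3 mm DTA
            best = min(best, math.sqrt(dose_term + dist_term))
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Fraction of points with gamma <= 1, the usual passing criterion."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```

Identical planned and delivered profiles give gamma 0 everywhere; a dose error larger than 3% of the maximum that cannot be rescued by a nearby point fails.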

  3. Direct potable reuse microbial risk assessment methodology: Sensitivity analysis and application to State log credit allocations.

    Science.gov (United States)

    Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P

    2018-01-01

    Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risk estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log-credit allocation comparisons will be useful to regulators considering DPR projects and design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
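Two standard quantitative microbial risk assessment (QMRA) relations underlie calculations like these: the pathogen fraction surviving a log-credit allocation, and the cumulative annual risk from repeated independent exposures. The sketch below uses the common textbook forms; the function names are illustrative and this is not necessarily the paper's exact model:

```python
def surviving_fraction(log_credits):
    """Fraction of pathogens remaining after a given log10 reduction:
    e.g. 14 log credits leave a 1e-14 fraction of the raw density."""
    return 10.0 ** (-log_credits)

def annual_risk(per_exposure_risk, exposures_per_year=365):
    """Cumulative annual infection risk assuming independent exposures:
    1 - (1 - p)^n, which is close to n*p when p is small."""
    return 1.0 - (1.0 - per_exposure_risk) ** exposures_per_year
```

The second relation is why per-event risks must sit well below an annual benchmark such as 1e-4: daily exposures compound roughly 365-fold.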

  4. Analysis of historical gross gamma logging data from BX tank farm

    International Nuclear Information System (INIS)

    MYERS, D.A.

    1999-01-01

    Gross gamma ray logs, recorded from January 1975 through mid-year 1994 as part of the Single-Shell Tank Farm Dry Well Surveillance Program, have been reanalyzed for the BX tank farm to locate the presence of mobile radionuclides in the subsurface. This report presents the BX tank farm gross gamma ray data in such a way as to assist others in their study of vadose zone mechanisms.

  5. UNIX secure server : a free, secure, and functional server example

    OpenAIRE

    Sastre, Hugo

    2016-01-01

    The purpose of this thesis work was to introduce a UNIX server as a personal server, but also as a starting point for investigation and development at a professional level. The objective of this thesis was to build a secure server providing not only an FTP server but also an HTTP server and a cloud system for remote backups. OpenBSD was used as the operating system. OpenBSD is a UNIX-like operating system made by hackers for hackers. The difference with other systems that might partially provid...

  6. Preliminary analysis of downhole logging data from ICDP Lake Junin drilling Project, Peru

    Science.gov (United States)

    Pierdominici, Simona; Kück, Jochem; Rodbell, Donald T.; Abbott, Mark B.

    2016-04-01

    The International Continental Drilling Program (ICDP) has supported a scientific drilling campaign in Peru during the summer season 2015. The Lake Junin Drilling Project mainly aims at obtaining high-resolution paleoclimate records from lacustrine sediments to reconstruct the history of the continental records covering the glacial-interglacial cycles. Lake Junín is located at 4000 m a.s.l. in the tropical Andes of Peru and is characterized by a thick (> 125 m) sediment package deposited at a high rate (0.2 to 1.0 mm yr-1). Lake Junín is one of the few lakes in the tropical Andes that predates the maximum extent of glaciation and is in a geomorphic position to record the waxing and waning of glaciers in the nearby cordillera, making the lake a key site for the investigation of the Quaternary climate evolution in the inner tropics of the Southern Hemisphere. Continuous coring was performed at three sites in 11 boreholes overall, with at least two overlapping boreholes per site to avoid core gaps. The depth of the boreholes varied between approx. 30 m and 110 m depending on the drill site. The core bit had a bit size of 122.6 mm and yielded a core diameter of 85 mm. Upon completion of coring operations, downhole geophysical logging was performed in five of the 11 boreholes (1A, 1C, 1D, 2A and 3B) by the Operational Support Group of ICDP. The main objective was to record in situ the physical properties of the lacustrine sediments of Lake Junin. Downhole logs provide a powerful tool to fill in information at intervals with core gaps and serve as a depth reference for depth matching of the discontinuous cores. Furthermore, they will be used for the lithological reconstruction and interpretation. The OSG downhole logging comprised total and spectrum gamma ray, magnetic susceptibility, borehole geometry, temperature, and sonic P-wave velocity. Unstable and collapsing borehole walls made it necessary to carry out logging in several sections instead of in one run. The

  7. Effect of video server topology on contingency capacity requirements

    Science.gov (United States)

    Kienzle, Martin G.; Dan, Asit; Sitaram, Dinkar; Tetzlaff, William H.

    1996-03-01

    Video servers need to assign a fixed set of resources to each video stream in order to guarantee on-time delivery of the video data. If a server has insufficient resources to guarantee the delivery, it must reject the stream request rather than slowing down all existing streams. Large scale video servers are being built as clusters of smaller components, so as to be economical, scalable, and highly available. This paper uses a blocking model developed for telephone systems to evaluate video server cluster topologies. The goal is to achieve high utilization of the components and low per-stream cost combined with low blocking probability and high user satisfaction. The analysis shows substantial economies of scale achieved by larger server images. Simple distributed server architectures can result in partitioning of resources with low achievable resource utilization. By comparing achievable resource utilization of partitioned and monolithic servers, we quantify the cost of partitioning. Next, we present an architecture for a distributed server system that avoids resource partitioning and results in highly efficient server clusters. Finally, we show how, in these server clusters, further optimizations can be achieved through caching and batching of video streams.
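The telephone-style blocking analysis described above is commonly based on the Erlang B formula. The sketch below uses the standard stable recurrence and a hypothetical comparison, one 100-stream server versus two independent 50-stream partitions at the same utilization, to illustrate the economies of scale the abstract quantifies; the stream counts are made up for illustration:

```python
def erlang_b(servers, offered_load):
    """Erlang B blocking probability, computed with the numerically
    stable recurrence B(0) = 1, B(n) = a*B(n-1) / (n + a*B(n-1))."""
    b = 1.0
    for n in range(1, servers + 1):
        b = (offered_load * b) / (n + offered_load * b)
    return b

# Same 80% utilization: one large pool vs. two partitions each seeing half the load.
monolithic = erlang_b(100, 80.0)
partitioned = erlang_b(50, 40.0)
```

At equal utilization the larger pool blocks fewer stream requests; the gap between the two numbers is one way to express the "cost of partitioning" discussed in the abstract.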

  8. Microsoft SQL Server 2012 bible

    CERN Document Server

    Jorgensen, Adam; LeBlanc, Patrick; Cherry, Denny; Nelson, Aaron

    2012-01-01

    Harness the powerful new SQL Server 2012 Microsoft SQL Server 2012 is the most significant update to this product since 2005, and it may change how database administrators and developers perform many aspects of their jobs. If you're a database administrator or developer, Microsoft SQL Server 2012 Bible teaches you everything you need to take full advantage of this major release. This detailed guide not only covers all the new features of SQL Server 2012, it also shows you step by step how to develop top-notch SQL Server databases and new data connections and keep your databases performing at p

  9. Windows Home Server users guide

    CERN Document Server

    Edney, Andrew

    2008-01-01

    Windows Home Server brings the idea of centralized storage, backup and computer management out of the enterprise and into the home. Windows Home Server is built for people with multiple computers at home and helps to synchronize them, keep them updated, stream media between them, and back them up centrally. Built on a similar foundation as the Microsoft server operating products, it's essentially Small Business Server for the home.This book details how to install, configure, and use Windows Home Server and explains how to connect to and manage different clients such as Windows XP, Windows Vist

  10. Mastering Microsoft Exchange Server 2010

    CERN Document Server

    McBee, Jim

    2010-01-01

    A top-selling guide to Exchange Server-now fully updated for Exchange Server 2010. Keep your Microsoft messaging system up to date and protected with the very newest version, Exchange Server 2010, and this comprehensive guide. Whether you're upgrading from Exchange Server 2007 SP1 or earlier, installing for the first time, or migrating from another system, this step-by-step guide provides the hands-on instruction, practical application, and real-world advice you need.: Explains Microsoft Exchange Server 2010, the latest release of Microsoft's messaging system that protects against spam and vir

  11. A property rights-based analysis of the illegal logging for fuelwood in Kosovo

    OpenAIRE

    Bouriaud, L.; Nichiforel , L.; Ribeiro Nunes, L. M.; Pereira, H.; Bajraktari, A.

    2014-01-01

    The increased demand for fuelwood may have the side-effect of unsustainable use of the forest resource. The case of Kosovo fuelwood production is of particular relevance to studying the drivers of unsustainable patterns of forest biomass use in a post-war and poor economic context. The domestic market demand for fuelwood in Kosovo is estimated at more than 1.5 hm3, while the legal supply, including imports, is slightly higher than 0.3 hm3. Illegal logging for satisfying Kosovo population fuel...

  12. Linking downhole logging data and clay mineralogy analysis in the ICDP Lake Junín drilling Project, Peru

    Science.gov (United States)

    Pierdominici, S.; Schleicher, A.; Kueck, J.; Rodbell, D. T.; Abbott, M. B.

    2017-12-01

    The Lake Junin drilling project, co-funded by the International Continental Drilling Program (ICDP), is located at 4000 m a.s.l. in the tropical Andes of Peru. Several boreholes were drilled with the goal of obtaining high-resolution paleoclimate records from lacustrine sediments and reconstructing the history of the continental records covering the glacial-interglacial cycles. Lake Junín is characterized by a thick package of lacustrine sediments (> 125 m) deposited at a high rate (0.2 to 1.0 mm yr-1), and it is one of the few lakes in the tropical Andes that is hundreds of thousands of years old with a continuous sedimentation rate, preserving a very long and continuous record of past ice age cycles. The boreholes reached a maximum depth of 110.08 m, and continuous coring was performed at three sites with 11 boreholes. Additionally, an extensive geophysical downhole logging campaign was performed on five boreholes (1A, 1C, 1D, 2A and 3B) by the Operational Support Group of ICDP. Downhole logging measurements comprise total and spectrum gamma ray, magnetic susceptibility, borehole geometry, temperature, and sonic P-wave velocity. In order to fit the downhole logging depths to the composite profile depths, each borehole was depth-matched with the core data. Interpreting the downhole logging data makes it possible to establish a complete lithological log, to characterize the in-situ physical properties of the drilled lacustrine sediments, to determine sedimentary structures, and to obtain evidence about palaeoclimatic conditions over up to 200 ka. Th and K values are used as a proxy for a first estimate and characterization of the clay content in the sediments, which is present as montmorillonite, smectite, illite, and kaolinite in different amounts. Linking the clay minerals that occur in the core material with the downhole logging data allows assessment of the geological history of the lake and its relationship to climate change processes. Additional laboratory analysis will be

  13. Analysis of multi-species point patterns using multivariate log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao; Jalilian, Abdollah

    Multivariate log Gaussian Cox processes are flexible models for multivariate point patterns. However, they have so far only been applied in bivariate cases. In this paper we move beyond the bivariate case in order to model multi-species point patterns of tree locations. In particular we address the problems of identifying parsimonious models and of extracting biologically relevant information from the fitted models. The latent multivariate Gaussian field is decomposed into components given in terms of random fields common to all species and components which are species specific. The selected number of common latent fields provides an index of complexity of the multivariate covariance structure. Hierarchical clustering is used to identify groups of species with similar patterns of dependence on the common latent fields.

  14. Fast neutron (14 MeV) attenuation analysis in saturated core samples and its application in well logging

    International Nuclear Information System (INIS)

    Amin Attarzadeh; Mohammad Kamal Ghassem Al Askari; Tagy Bayat

    2009-01-01

    To introduce the application of nuclear logging, it is appropriate to provide a motivation for the use of nuclear measurement techniques in well logging. Important aspects of the geological sciences are, for instance, the grain and porosity structure and porosity volume of rocks, as well as the transport properties of a fluid in the porous media. Nuclear measurements are, as a rule, non-intrusive: a measurement does not destroy the sample, and it does not interfere with the process to be measured. Non-intrusive measurements are also often much faster than alternative methods and can be applied in field measurements. A common type of nuclear measurement employs neutron irradiation, which is a powerful technique for geophysical analysis. In this research we illustrate the details of this technique and its applications to well logging and the oil industry. Experiments have been performed to investigate the possibility of using neutron attenuation measurements to determine the water and oil content of rock samples. A beam of 14 MeV neutrons produced by a 150 kV neutron generator was attenuated by different samples and subsequently detected with NE102 plastic scintillators (fast counters). Each sample was saturated with water and oil. The difference in neutron attenuation between dry and wet samples was compared with the fluid content determined by mass balance of the sample. In this experiment we were able to determine 3% humidity in a standard sample model (SiO 2 ) and estimate porosity in geological samples when saturated with different fluids. (Author)
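The attenuation measurement described above rests on the narrow-beam exponential law I = I0·exp(−Σx). The sketch below shows how a dry/wet count ratio converts to an equivalent water path length; the cross-section constant is a made-up illustrative value, not a measured property:

```python
import math

def transmitted_intensity(i0, sigma, x_cm):
    """Narrow-beam attenuation I = I0 * exp(-Sigma * x), with Sigma the
    macroscopic cross-section (1/cm) and x the path length (cm)."""
    return i0 * math.exp(-sigma * x_cm)

def water_path_from_ratio(i_dry, i_wet, sigma_water):
    """Equivalent water path inferred from the dry/wet count ratio:
    x = ln(I_dry / I_wet) / Sigma_water."""
    return math.log(i_dry / i_wet) / sigma_water

SIGMA_WATER = 0.10  # illustrative macroscopic cross-section, 1/cm (assumption)
```

Because the rock matrix is common to both measurements, its attenuation cancels in the ratio, which is what makes the dry/wet comparison sensitive to the fluid alone.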

  15. CpGAVAS, an integrated web server for the annotation, visualization, analysis, and GenBank submission of completely sequenced chloroplast genome sequences

    Science.gov (United States)

    2012-01-01

    Background The complete sequences of chloroplast genomes provide a wealth of information regarding the evolutionary history of species. With the advance of next-generation sequencing technology, the number of completely sequenced chloroplast genomes is expected to increase exponentially, and powerful computational tools for annotating the genome sequences are in urgent need. Results We have developed a web server, CPGAVAS. The server accepts a complete chloroplast genome sequence as input. First, it predicts protein-coding and rRNA genes based on the identification and mapping of the most similar, full-length protein, cDNA and rRNA sequences by integrating results from the Blastx, Blastn, protein2genome and est2genome programs. Second, tRNA genes and inverted repeats (IR) are identified using tRNAscan, ARAGORN and vmatch, respectively. Third, it calculates summary statistics for the annotated genome. Fourth, it generates a circular map ready for publication. Fifth, it can create a Sequin file for GenBank submission. Last, it allows the extraction of protein and mRNA sequences for a given list of genes and species. The annotation results in GFF3 format can be edited using any compatible annotation editing tool. The edited annotations can then be uploaded to CPGAVAS for update and re-analysis repeatedly. Using known chloroplast genome sequences as a test set, we show that CPGAVAS performs comparably to another application, DOGMA, while having several superior functionalities. Conclusions CPGAVAS allows the semi-automatic and complete annotation of a chloroplast genome sequence, and the visualization, editing and analysis of the annotation results. It will become an indispensable tool for researchers studying chloroplast genomes. The software is freely accessible from http://www.herbalgenomics.org/cpgavas. PMID:23256920

  16. CpGAVAS, an integrated web server for the annotation, visualization, analysis, and GenBank submission of completely sequenced chloroplast genome sequences

    Directory of Open Access Journals (Sweden)

    Liu Chang

    2012-12-01

    Full Text Available Abstract Background The complete sequences of chloroplast genomes provide a wealth of information regarding the evolutionary history of species. With the advance of next-generation sequencing technology, the number of completely sequenced chloroplast genomes is expected to increase exponentially, and powerful computational tools for annotating the genome sequences are in urgent need. Results We have developed a web server, CPGAVAS. The server accepts a complete chloroplast genome sequence as input. First, it predicts protein-coding and rRNA genes based on the identification and mapping of the most similar, full-length protein, cDNA and rRNA sequences by integrating results from the Blastx, Blastn, protein2genome and est2genome programs. Second, tRNA genes and inverted repeats (IR) are identified using tRNAscan, ARAGORN and vmatch, respectively. Third, it calculates summary statistics for the annotated genome. Fourth, it generates a circular map ready for publication. Fifth, it can create a Sequin file for GenBank submission. Last, it allows the extraction of protein and mRNA sequences for a given list of genes and species. The annotation results in GFF3 format can be edited using any compatible annotation editing tool. The edited annotations can then be uploaded to CPGAVAS for update and re-analysis repeatedly. Using known chloroplast genome sequences as a test set, we show that CPGAVAS performs comparably to another application, DOGMA, while having several superior functionalities. Conclusions CPGAVAS allows the semi-automatic and complete annotation of a chloroplast genome sequence, and the visualization, editing and analysis of the annotation results. It will become an indispensable tool for researchers studying chloroplast genomes. The software is freely accessible from http://www.herbalgenomics.org/cpgavas.

  17. FocusStack and StimServer: a new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data.

    Science.gov (United States)

    Muir, Dylan R; Kampa, Björn M

    2014-01-01

    Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.
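A peri-stimulus time histogram of the kind mentioned in this record can be sketched in a few lines. This is a generic Python illustration, not the toolbox's MATLAB code; the window and bin-width defaults are arbitrary:

```python
def psth(event_times, stim_onsets, window=(-1.0, 2.0), bin_width=0.5):
    """Peri-stimulus time histogram: mean event count per time bin,
    averaged across stimulus presentations (times in seconds)."""
    n_bins = int(round((window[1] - window[0]) / bin_width))
    counts = [0] * n_bins
    for onset in stim_onsets:
        for t in event_times:
            rel = t - onset  # event time relative to stimulus onset
            if window[0] <= rel < window[1]:
                counts[int((rel - window[0]) // bin_width)] += 1
    # Normalize by the number of trials to get the mean per-bin count.
    return [c / len(stim_onsets) for c in counts]
```

With calcium imaging the "events" would typically be detected transients per cell; aligning them to stimulus onsets this way is what turns raw stacks into tuning curves.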

  18. Installing and Testing a Server Operating System

    Directory of Open Access Journals (Sweden)

    Lorentz JÄNTSCHI

    2003-08-01

    Full Text Available The paper is based on the experience of the author with FreeBSD server operating system administration on three servers in use under the academicdirect.ro domain.The paper describes a set of installation, preparation, and administration aspects of a FreeBSD server.The first issue of the paper is the installation procedure of the FreeBSD operating system on the i386 computer architecture. Discussed problems are boot disk preparation and use, hard disk partitioning, and operating system installation using an existing network topology and an internet connection.The second issue is the optimization of the operating system and the installation and configuration of server services. Discussed problems are kernel and service configuration and system and service optimization.The third issue is about client-server applications. Using operating system utility calls, we present an original application that displays system information in a friendly web interface. An original program designed for molecular structure analysis was adapted for system performance comparisons, and it serves as the basis for a discussion of the computation speed of Pentium, Pentium II and Pentium III processors.The last issue of the paper discusses the installation and configuration aspects of dial-in service on a UNIX-based operating system. The discussion includes serial port configuration, ppp and pppd service configuration, and the use of ppp and tun devices.

  19. SU-C-BRD-03: Analysis of Accelerator Generated Text Logs for Preemptive Maintenance

    International Nuclear Information System (INIS)

    Able, CM; Baydush, AH; Nguyen, C; Munley, MT; Gersh, J; Ndlovu, A; Rebo, I; Booth, J; Perez, M; Sintay, B

    2014-01-01

    Purpose: To develop a model to analyze medical accelerator generated parameter and performance data that will provide an early warning of performance degradation and impending component failure. Methods: A robust 6 MV VMAT quality assurance treatment delivery was used to test the constancy of accelerator performance. The generated text log files were decoded and analyzed using statistical process control (SPC) methodology. The text file data is a single snapshot of energy specific and overall systems parameters. A total of 36 system parameters were monitored which include RF generation, electron gun control, energy control, beam uniformity control, DC voltage generation, and cooling systems. The parameters were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and the parameter/system specification. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: the value of 1 standard deviation from the mean operating parameter of 483 TB systems, a small fraction (≤ 5%) of the operating range, or a fraction of the minor fault deviation. Results: There were 34 parameters in which synthetic errors were introduced. There were 2 parameters (radial position steering coil, and positive 24V DC) in which the errors did not exceed the limit of the I/MR chart. The I chart limit was exceeded for all of the remaining parameters (94.2%). The MR chart limit was exceeded in 29 of the 32 parameters (85.3%) in which the I chart limit was exceeded. Conclusion: Statistical process control I/MR evaluation of text log file parameters may be effective in providing an early warning of performance degradation or component failure for digital medical accelerator systems. Research is Supported by Varian Medical Systems, Inc

  20. SU-C-BRD-03: Analysis of Accelerator Generated Text Logs for Preemptive Maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Able, CM; Baydush, AH; Nguyen, C; Munley, MT [Wake Forest School of Medicine, Department of Radiation Oncology, Winston Salem, NC (United States); Gersh, J [Gibbs Cancer Center and Research Institute, Spartenburg Regional Medical Ce, Spartenburg, SC (United States); Ndlovu, A; Rebo, I [John Theuer Cancer Center, Hackensack University Medical Center, Hackensack, NJ (United States); Booth, J; Perez, M [North Sydney Cancer Center, Royal North Shore Hospital, Sydney, St Leonards (Australia); Sintay, B [Cone Health Cancer Center, Greensboro, NC (United States)

    2014-06-15

    Purpose: To develop a model to analyze medical accelerator generated parameter and performance data that will provide an early warning of performance degradation and impending component failure. Methods: A robust 6 MV VMAT quality assurance treatment delivery was used to test the constancy of accelerator performance. The generated text log files were decoded and analyzed using statistical process control (SPC) methodology. The text file data is a single snapshot of energy specific and overall systems parameters. A total of 36 system parameters were monitored which include RF generation, electron gun control, energy control, beam uniformity control, DC voltage generation, and cooling systems. The parameters were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and the parameter/system specification. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: the value of 1 standard deviation from the mean operating parameter of 483 TB systems, a small fraction (≤ 5%) of the operating range, or a fraction of the minor fault deviation. Results: There were 34 parameters in which synthetic errors were introduced. There were 2 parameters (radial position steering coil, and positive 24V DC) in which the errors did not exceed the limit of the I/MR chart. The I chart limit was exceeded for all of the remaining parameters (94.2%). The MR chart limit was exceeded in 29 of the 32 parameters (85.3%) in which the I chart limit was exceeded. Conclusion: Statistical process control I/MR evaluation of text log file parameters may be effective in providing an early warning of performance degradation or component failure for digital medical accelerator systems. Research is Supported by Varian Medical Systems, Inc.
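The Individuals/Moving-Range limit computation described above follows standard SPC constants for a moving range of size 2 (3/d2 ≈ 2.66 for the I chart, D4 ≈ 3.267 for the MR chart). The sketch below shows the generic textbook limits only, not the authors' hybrid specification-based limits; the sample values are hypothetical:

```python
def imr_limits(samples):
    """Control limits for Individuals (I) and Moving Range (MR) charts,
    using the standard n=2 constants: 3/d2 = 2.66 and D4 = 3.267."""
    mrs = [abs(b - a) for a, b in zip(samples, samples[1:])]
    mr_bar = sum(mrs) / len(mrs)   # average moving range
    x_bar = sum(samples) / len(samples)
    return {
        "I": (x_bar - 2.66 * mr_bar, x_bar + 2.66 * mr_bar),
        "MR": (0.0, 3.267 * mr_bar),
    }

# Hypothetical daily snapshots of one accelerator parameter (arbitrary units).
limits = imr_limits([10.0, 12.0, 11.0, 13.0, 12.0])
```

A new snapshot outside the I limits, or a jump whose moving range exceeds the MR limit, is the kind of signal the abstract proposes as an early warning of component degradation.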

  1. SQL Server Integration Services

    CERN Document Server

    Hamilton, Bill

    2007-01-01

    SQL Server 2005 Integration Services (SSIS) lets you build high-performance data integration solutions. SSIS solutions wrap sophisticated workflows around tasks that extract, transform, and load (ETL) data from and to a wide variety of data sources. This Short Cut begins with an overview of key SSIS concepts, capabilities, standard workflow and ETL elements, the development environment, execution, deployment, and migration from Data Transformation Services (DTS). Next, you'll see how to apply the concepts you've learned through hands-on examples of common integration scenarios. Once you've

  2. Analysis of Effects of Sensor Multithreading to Generate Local System Event Timelines

    Science.gov (United States)

    2014-03-27

    logging server, which Huang et al. later use for statistical analysis. In addition to sending logs to a MySQL database, the centralized logging server...anomalous increase in the number of log entries over the course of a month instead of appearing as one single malicious entry. 2.3 Windows® Thread...same resources, as previously described in Subsection 4.2.1.1. Of course, the last point is in addition to the medium load scripts crippling the

  3. A General Purpose Connections type CTI Server Based on SIP Protocol and Its Implementation

    Science.gov (United States)

    Watanabe, Toru; Koizumi, Hisao

    In this paper, we propose a general-purpose connections-type CTI (Computer Telephony Integration) server that provides various CTI services such as voice logging. The CTI server communicates with an IP-PBX using SIP (Session Initiation Protocol) and accumulates the voice packets of external-line telephone calls flowing between an IP extension telephone and a VoIP gateway connected to outside-line networks. The CTI server realizes services such as voice logging, telephone conferencing, and IVR (interactive voice response) by accumulating and processing the sampled voice packets. Furthermore, the CTI server incorporates a web server function that can provide services such as a web telephone directory via a web browser to PCs, cellular telephones, or smartphones in mobile environments.

  4. Sending servers to Morocco

    CERN Multimedia

    Joannah Caborn Wengler

    2012-01-01

    Did you know that computer centres are like people? They breathe air in and out like a person, they have to be kept at the right temperature, and they can even be organ donors. As part of a regular cycle of equipment renewal, the CERN Computer Centre has just donated 161 retired servers to universities in Morocco.   Prof. Abdeslam Hoummada and CERN DG Rolf Heuer seeing off the servers at the beginning of their journey to Morocco. “Many people don’t realise, but the Computer Centre is like a living thing. You don’t just install equipment and it runs forever. We’re continually replacing machines, broken parts and improving things like the cooling.” Wayne Salter, Leader of the IT Computing Facilities Group, watches over the Computer Centre a bit like a nurse monitoring a patient’s temperature, especially since new international recommendations for computer centre environmental conditions were released. “A new international s...

  5. Geophysical log analysis of selected test and residential wells at the Shenandoah Road National Superfund Site, East Fishkill, Dutchess County, New York

    Science.gov (United States)

    Reynolds, Richard J.; Anderson, J. Alton; Williams, John H.

    2015-01-01

    The U.S. Geological Survey collected and analyzed geophysical logs from 20 test wells and 23 residential wells at the Shenandoah Road National Superfund Site in East Fishkill, New York, from 2006 through 2010 as part of an Interagency Agreement to provide hydrogeologic technical support to the U.S. Environmental Protection Agency, Region 2. The geophysical logs collected include caliper, gamma, acoustic and optical televiewer, deviation, electromagnetic-induction, magnetic-susceptibility, fluid-property, and flow under ambient and pumped conditions. The geophysical logs were analyzed along with single-well aquifer test data and drilling logs to characterize the lithology, fabric, fractures, and flow zones penetrated by the wells. The results of the geophysical log analysis were used as part of the hydrogeologic characterization of the site and in the design of discrete-zone monitoring installations in the test wells and selected residential wells.

  6. Mastering Microsoft Exchange Server 2013

    CERN Document Server

    Elfassy, David

    2013-01-01

    The bestselling guide to Exchange Server, fully updated for the newest version Microsoft Exchange Server 2013 is touted as a solution for lowering the total cost of ownership, whether deployed on-premises or in the cloud. Like the earlier editions, this comprehensive guide covers every aspect of installing, configuring, and managing this multifaceted collaboration system. It offers Windows systems administrators and consultants a complete tutorial and reference, ideal for anyone installing Exchange Server for the first time or those migrating from an earlier Exchange Server version.Microsoft

  7. Microsoft Windows Server Administration Essentials

    CERN Document Server

    Carpenter, Tom

    2011-01-01

    The core concepts and technologies you need to administer a Windows Server OS Administering a Windows operating system (OS) can be a difficult topic to grasp, particularly if you are new to the field of IT. This full-color resource serves as an approachable introduction to understanding how to install a server, the various roles of a server, and how server performance and maintenance impacts a network. With a special focus placed on the new Microsoft Technology Associate (MTA) certificate, the straightforward, easy-to-understand tone is ideal for anyone new to computer administration looking t

  8. Advanced 3-D analysis, client-server systems, and cloud computing-Integration of cardiovascular imaging data into clinical workflows of transcatheter aortic valve replacement.

    Science.gov (United States)

    Schoenhagen, Paul; Zimmermann, Mathis; Falkner, Juergen

    2013-06-01

    Degenerative aortic stenosis is highly prevalent in the aging populations of industrialized countries and is associated with poor prognosis. Surgical valve replacement has been the only established treatment with documented improvement of long-term outcome. However, many of the older patients with aortic stenosis (AS) are high-risk or ineligible for surgery. For these patients, transcatheter aortic valve replacement (TAVR) has emerged as a treatment alternative. The TAVR procedure is characterized by a lack of visualization of the operative field. Therefore, pre- and intra-procedural imaging is critical for patient selection, pre-procedural planning, and intra-operative decision-making. Incremental to conventional angiography and 2-D echocardiography, multidetector computed tomography (CT) has assumed an important role before TAVR. The analysis of 3-D CT data requires extensive post-processing during direct interaction with the dataset, using advanced analysis software. Organization and storage of the data according to complex clinical workflows and sharing of image information have become a critical part of these novel treatment approaches. Optimally, the data are integrated into a comprehensive image data file accessible to multiple groups of practitioners across the hospital. This creates new challenges for data management requiring a complex IT infrastructure, spanning multiple locations, but is increasingly achieved with client-server solutions and private cloud technology. This article describes the challenges and opportunities created by the increased amount of patient-specific imaging data in the context of TAVR.

  9. NCI's Distributed Geospatial Data Server

    Science.gov (United States)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach to batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments on distributed computing using interactive access to significant cloud infrastructure opens the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. 
The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under

  10. Optimal control of a server farm

    NARCIS (Netherlands)

    Adan, I.J.B.F.; Kulkarni, V.G.; Wijk, van A.C.C.

    2013-01-01

    We consider a server farm consisting of ample exponential servers that serve a Poisson stream of arriving customers. Each server can be busy, idle, or off. An arriving customer will immediately occupy an idle server, if there is one; otherwise, an off server will be turned on and start

  11. Server farms with setup costs

    NARCIS (Netherlands)

    Gandhi, A.; Harchol-Balter, M.; Adan, I.J.B.F.

    2010-01-01

    In this paper we consider server farms with a setup cost. This model is common in manufacturing systems and data centers, where there is a cost to turn servers on. Setup costs always take the form of a time delay, and sometimes there is additionally a power penalty, as in the case of data centers.

  12. Structural equation and log-linear modeling: a comparison of methods in the analysis of a study on caregivers' health

    Directory of Open Access Journals (Sweden)

    Rosenbaum Peter L

    2006-10-01

    Full Text Available Abstract Background In this paper we compare the results in an analysis of determinants of caregivers' health derived from two approaches, a structural equation model and a log-linear model, using the same data set. Methods The data were collected from a cross-sectional population-based sample of 468 families in Ontario, Canada who had a child with cerebral palsy (CP. The self-completed questionnaires and the home-based interviews used in this study included scales reflecting socio-economic status, child and caregiver characteristics, and the physical and psychological well-being of the caregivers. Both analytic models were used to evaluate the relationships between child behaviour, caregiving demands, coping factors, and the well-being of primary caregivers of children with CP. Results The results were compared, together with an assessment of the positive and negative aspects of each approach, including their practical and conceptual implications. Conclusion No important differences were found in the substantive conclusions of the two analyses. The broad confirmation of the Structural Equation Modeling (SEM results by the Log-linear Modeling (LLM provided some reassurance that the SEM had been adequately specified, and that it broadly fitted the data.

  13. Unsupervised signature extraction from forensic logs

    NARCIS (Netherlands)

    Thaler, S.M.; Menkovski, V.; Petkovic, M.; Altun, Y.; Das, K.; Mielikäinen, T.; Malerba, D.; Stefanowski, J.; Read, J.; Žitnik, M.; Ceci, M.

    2017-01-01

    Signature extraction is a key part of forensic log analysis. It involves recognizing patterns in log lines such that log lines that originated from the same line of code are grouped together. A log signature consists of immutable parts and mutable parts. The immutable parts define the signature, and
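
The immutable/mutable split described above can be illustrated with a deliberately simple baseline: mask obviously mutable tokens (numbers, hex ids) so that log lines emitted by the same line of code collapse onto one template. The paper's unsupervised method is more sophisticated; the log lines and masking rule here are illustrative assumptions.

```python
import re
from collections import defaultdict

# Replace mutable tokens (decimal or hex numbers) with a placeholder, leaving
# the immutable parts of the line as the signature.
MUTABLE = re.compile(r"\b(0x[0-9a-f]+|\d+)\b")

def signature(line: str) -> str:
    return MUTABLE.sub("<*>", line)

logs = [
    "Accepted connection from 10.0.0.5 port 51234",
    "Accepted connection from 10.0.0.9 port 60110",
    "Disk quota exceeded for uid 1003",
]
groups = defaultdict(list)
for line in logs:
    groups[signature(line)].append(line)

for sig, members in groups.items():
    print(f"{len(members)}x  {sig}")
```

The first two lines share a signature because they differ only in mutable parts; the third forms its own group.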

  14. Upgrading a TCABR data analysis and acquisition system for remote participation using Java, XML, RCP and modern client/server communication/authentication

    International Nuclear Information System (INIS)

    Sa, W.P. de

    2010-01-01

    The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses Java language as programming environment. Since application parameters and hardware in a joint experiment are complex with a large variability of components, requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, systems are developed using the eXtensible Markup Language (XML) technology. The communication between clients and servers uses remote procedure call (RPC) based on the XML (RPC-XML technology). The integration of the Java language, XML and RPC-XML technologies allows easy development of a standard data and communication access layer between users and laboratories using common software libraries and Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application allows a simple graphical user interface (GUI) access. The TCABR tokamak team in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa) is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment, organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on 'Joint Research Using Small Tokamaks'.
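
The system above is written in Java; the client/server XML-RPC pattern it relies on can be sketched in a few lines of Python with the standard library, which helps make the protocol shape concrete. The method name `get_signal` and its payload are hypothetical, not part of the TCABR system.

```python
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy
import threading

def get_signal(shot_number):
    """Pretend data-access layer: return a (fake) diagnostic trace."""
    return {"shot": shot_number, "samples": [0.0, 0.5, 1.0]}

# Port 0 asks the OS for any free port; the server runs in a daemon thread.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(get_signal, "get_signal")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client calls the remote procedure as if it were local; arguments and
# results are marshalled through XML on the wire.
client = ServerProxy(f"http://127.0.0.1:{port}")
reply = client.get_signal(4321)
print(reply)
server.shutdown()
```

The same access-layer idea lets every collaborating laboratory retrieve data with identical method calls, regardless of the hardware behind the server.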

  15. Upgrading a TCABR data analysis and acquisition system for remote participation using Java, XML, RCP and modern client/server communication/authentication

    Energy Technology Data Exchange (ETDEWEB)

    Sa, W.P. de, E-mail: pires@if.usp.b [Instituto de Fisica, Universidade de Sao Paulo, Rua do Matao, Travessa R, 187 CEP 05508-090 Cidade Universitaria, Sao Paulo (Brazil)

    2010-07-15

    The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses Java language as programming environment. Since application parameters and hardware in a joint experiment are complex with a large variability of components, requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, systems are developed using the eXtensible Markup Language (XML) technology. The communication between clients and servers uses remote procedure call (RPC) based on the XML (RPC-XML technology). The integration of the Java language, XML and RPC-XML technologies allows easy development of a standard data and communication access layer between users and laboratories using common software libraries and Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application allows a simple graphical user interface (GUI) access. The TCABR tokamak team in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa) is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment, organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on 'Joint Research Using Small Tokamaks'.

  16. Finding External Indicators of Load on a Web Server via Analysis of Black-Box Performance Measurements

    Science.gov (United States)

    Chiarini, Marc A.

    2010-01-01

    Traditional methods for system performance analysis have long relied on a mix of queuing theory, detailed system knowledge, intuition, and trial-and-error. These approaches often require construction of incomplete gray-box models that can be costly to build and difficult to scale or generalize. In this thesis, we present a black-box analysis…

  17. Aligning Event Logs to Task-Time Matrix Clinical Pathways in BPMN for Variance Analysis.

    Science.gov (United States)

    Yan, Hui; Van Gorp, Pieter; Kaymak, Uzay; Lu, Xudong; Ji, Lei; Chiau, Choo Chiap; Korsten, Hendrikus H M; Duan, Huilong

    2018-03-01

    Clinical pathways (CPs) are popular healthcare management tools to standardize care and ensure quality. Analyzing CP compliance levels and variances is known to be useful for training and CP redesign purposes. The flexible semantics of the business process model and notation (BPMN) language have been shown to be useful for the modeling and analysis of complex protocols. However, in practical cases one may want to exploit the fact that CPs often have the form of task-time matrices. This paper presents a new method for parsing complex BPMN models and heuristically aligning traces to them. A case study on variance analysis is undertaken, using a CP from practice and two large sets of patient data from an electronic medical record (EMR) database. The results demonstrate that automated variance analysis between BPMN task-time models and real-life EMR data is feasible, whereas that was not the case for the existing analysis techniques. We also provide meaningful insights for further improvement.

  18. Analysis of the Factors Affecting the Interval between Blood Donations Using Log-Normal Hazard Model with Gamma Correlated Frailties.

    Science.gov (United States)

    Tavakol, Najmeh; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    The time to donating blood plays a major role in a first-time donor becoming a regular one. The aim of this study was to determine the factors affecting the interval between blood donations. In a longitudinal study in 2008, 864 first-time donors at the Shahrekord Blood Transfusion Center, in the capital of Chaharmahal and Bakhtiari Province, Iran, were selected by systematic sampling and followed up for five years. Among these, a subset of 424 donors who had at least two successful blood donations was chosen for this study, and the time intervals between their donations were measured as the response variable. Sex, body weight, age, marital status, education, stay and job were recorded as independent variables. Data analysis was performed using a log-normal hazard model with gamma correlated frailty. In this model, the frailties are the sum of two independent components, each assumed to follow a gamma distribution. The analysis was done via a Bayesian approach using a Markov chain Monte Carlo algorithm in OpenBUGS. Convergence was checked via the Gelman-Rubin criterion using the BOA package in R. Age, job and education had significant effects on the chance to donate blood: the chance of donation was higher for higher-aged donors, clerical workers, laborers, the self-employed, students and educated donors and, accordingly, the time intervals between their blood donations were shorter. Given the significant effect of some variables in the log-normal correlated frailty model, it is necessary to plan educational and cultural programs to encourage people with longer inter-donation intervals to donate more frequently.

  19. NEOS Server 4.0 Administrative Guide

    OpenAIRE

    Dolan, Elizabeth D.

    2001-01-01

    The NEOS Server 4.0 provides a general Internet-based client/server as a link between users and software applications. The administrative guide covers the fundamental principles behind the operation of the NEOS Server, installation and trouble-shooting of the Server software, and implementation details of potential interest to a NEOS Server administrator. The guide also discusses making new software applications available through the Server, including areas of concern to remote solver adminis...

  20. Medical video server construction.

    Science.gov (United States)

    Dańda, Jacek; Juszkiewicz, Krzysztof; Leszczuk, Mikołaj; Loziak, Krzysztof; Papir, Zdzisław; Sikora, Marek; Watza, Rafal

    2003-01-01

    The paper discusses two implementation options for a Digital Video Library, a repository used for archiving, accessing, and browsing of video medical records. Two crucial issues to be decided on are a video compression format and a video streaming platform. The paper presents numerous decision factors that have to be taken into account. The compression formats being compared are DICOM as a format representative for medical applications, both MPEGs, and several new formats targeted for IP networking. The comparison includes transmission rates supported, compression rates, and options for controlling a compression process. The second part of the paper presents the ISDN technique as a solution for provisioning of tele-consultation services between medical parties that are accessing resources uploaded to a digital video library. There are several backbone techniques available (like corporate LANs/WANs, leased lines or even radio/satellite links); however, the availability of network resources for hospitals was the prevailing choice criterion pointing to ISDN solutions. Another way to provide access to the Digital Video Library is based on radio frequency domain solutions. The paper describes the possibilities of using both wireless and cellular networks' data transmission services as a medical video server transport layer. For the cellular network-based solution two communication techniques are used: Circuit Switched Data and Packet Switched Data.

  1. Getting started with SQL Server 2012 cube development

    CERN Document Server

    Lidberg, Simon

    2013-01-01

    As a practical tutorial for Analysis Services, this book gets you started with developing cubes. "Getting Started with SQL Server 2012 Cube Development" walks you through the basics, working with SSAS to build cubes and get them up and running. Written for SQL Server developers who have not previously worked with Analysis Services, it assumes that you have experience with relational databases, but no prior knowledge of cube development is required. You need SQL Server 2012 in order to follow along with the exercises in this book.

  2. SPEER-SERVER: a web server for prediction of protein specificity determining sites.

    Science.gov (United States)

    Chakraborty, Abhijit; Mandloi, Sapan; Lanczycki, Christopher J; Panchenko, Anna R; Chakrabarti, Saikat

    2012-07-01

    Sites that show specific conservation patterns within subsets of proteins in a protein family are likely to be involved in the development of functional specificity. These sites, generally termed specificity determining sites (SDS), might play a crucial role in binding to a specific substrate or proteins. Identification of SDS through experimental techniques is a slow, difficult and tedious job. Hence, it is very important to develop efficient computational methods that can more expediently identify SDS. Herein, we present Specificity prediction using amino acids' Properties, Entropy and Evolution Rate (SPEER)-SERVER, a web server that predicts SDS by analyzing quantitative measures of the conservation patterns of protein sites based on their physico-chemical properties and the heterogeneity of evolutionary changes between and within the protein subfamilies. This web server provides an improved representation of results, adds useful input and output options and integrates a wide range of analysis and data visualization tools when compared with the original standalone version of the SPEER algorithm. Extensive benchmarking finds that SPEER-SERVER exhibits sensitivity and precision performance that, on average, meets or exceeds that of other currently available methods. SPEER-SERVER is available at http://www.hpppi.iicb.res.in/ss/.
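
A toy illustration of one ingredient SPEER combines: per-column Shannon entropy of a protein alignment, computed within each subfamily. A column that is conserved (low entropy) inside each subfamily but conserved to different residues across subfamilies is a candidate specificity determining site. The sequences and the 1-bit threshold are made up; SPEER itself also weighs physico-chemical properties and evolutionary rates.

```python
from math import log2
from collections import Counter

def column_entropy(column):
    """Shannon entropy (bits) of the residue distribution in one column."""
    counts = Counter(column)
    total = len(column)
    return -sum((c / total) * log2(c / total) for c in counts.values())

subfamily_a = ["ACDKG", "ACDKG", "ACEKG"]
subfamily_b = ["ACDRG", "ACDRG", "ACDRG"]

for i in range(5):
    col_a = [s[i] for s in subfamily_a]
    col_b = [s[i] for s in subfamily_b]
    # Conserved within each subfamily, but to different consensus residues?
    differs = Counter(col_a).most_common(1)[0][0] != Counter(col_b).most_common(1)[0][0]
    if column_entropy(col_a) < 1.0 and column_entropy(col_b) < 1.0 and differs:
        print(f"column {i}: candidate SDS ({col_a} vs {col_b})")
```

On this toy alignment only the K-versus-R column is flagged, which matches the intuition behind SDS.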

  3. FARO server: Meta-analysis of gene expression by matching gene expression signatures to a compendium of public gene expression data

    DEFF Research Database (Denmark)

    Manijak, Mieszko P.; Nielsen, Henrik Bjørn

    2011-01-01

    circumvented by instead matching gene expression signatures to signatures of other experiments. FINDINGS: To facilitate this we present the Functional Association Response by Overlap (FARO) server, that match input signatures to a compendium of 242 gene expression signatures, extracted from more than 1700...... Arabidopsis microarray experiments. CONCLUSIONS: Hereby we present a publicly available tool for robust characterization of Arabidopsis gene expression experiments which can point to similar experimental factors in other experiments. The server is available at http://www.cbs.dtu.dk/services/faro/....

  4. Identification and Analysis of Multi-tasking Product Information Search Sessions with Query Logs

    Directory of Open Access Journals (Sweden)

    Xiang Zhou

    2016-09-01

    Full Text Available Purpose: This research aims to identify product search tasks in online shopping and analyze the characteristics of consumer multi-tasking search sessions. Design/methodology/approach: The experimental dataset contains 8,949 queries of 582 users from 3,483 search sessions. A sequential comparison of the Jaccard similarity coefficient between two adjacent search queries and hierarchical clustering of queries is used to identify search tasks. Findings: (1) Users issued a similar number of queries (1.43 to 1.47) with similar lengths (7.3-7.6 characters per task) in mono-tasking and multi-tasking sessions, and (2) users spent more time on average in sessions with more tasks, but spent less time on each task when the number of tasks in a session increased. Research limitations: The task identification method, which relies only on query terms, does not completely reflect the complex nature of consumer shopping behavior. Practical implications: These results provide an exploratory understanding of the relationships among multiple shopping tasks, and can be useful for product recommendation and shopping task prediction. Originality/value: The originality of this research is its use of query clustering for online shopping task identification and analysis, and its analysis of product search session characteristics.
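
The sequential-comparison step can be sketched as follows: compute the Jaccard similarity between the term sets of adjacent queries and start a new search task whenever it drops below a threshold. The threshold value and the whitespace tokenization are illustrative assumptions, not taken from the paper.

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity coefficient of two term sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def split_tasks(queries, threshold=0.2):
    """Segment a session's query stream into tasks at low-similarity breaks."""
    tasks, current = [], [queries[0]]
    for prev, curr in zip(queries, queries[1:]):
        if jaccard(set(prev.split()), set(curr.split())) >= threshold:
            current.append(curr)
        else:
            tasks.append(current)
            current = [curr]
    tasks.append(current)
    return tasks

session = ["wireless mouse", "wireless mouse ergonomic", "usb c hub", "usb c hub 4k"]
print(split_tasks(session))
```

The mouse queries and the hub queries share no terms across the boundary, so the session splits into two tasks.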

  5. Server application for automated management training system of NPP personnel

    International Nuclear Information System (INIS)

    Poplavskij, I.A.; Pribysh, P.I.; Karpej, A.L.

    2016-01-01

    This paper describes the server side of an automated management training system. This system will increase the efficiency of planning and accounting for training activities, and will simplify collecting the necessary documentation and analyzing the results. (authors)

  6. CERN Document Server (CDS): Introduction

    CERN Multimedia

    CERN. Geneva; Costa, Flavio

    2017-01-01

    A short online tutorial introducing the CERN Document Server (CDS). Basic functionality description, the notion of Revisions and the CDS test environment. Links: CDS Production environment CDS Test environment  

  7. A Preliminary Analysis of Keystroke Log Data from a Timed Writing Task. Research Report. ETS RR-12-23

    Science.gov (United States)

    Almond, Russell; Deane, Paul; Quinlan, Thomas; Wagner, Michael; Sydorenko, Tetyana

    2012-01-01

    The Fall 2007 and Spring 2008 pilot tests for the "CBAL"™ Writing assessment included experimental keystroke logging capabilities. This report documents the approaches used to capture the keystroke logs and the algorithms used to process the outputs. It also includes some preliminary findings based on the pilot data. In particular, it…

  8. Analysis of geophysical well logs from the Mariano Lake-Lake Valley drilling project, San Juan Basin, Northwestern New Mexico

    International Nuclear Information System (INIS)

    Scott, J.H.

    1986-01-01

    Geophysical well logs were obtained in eight deep holes drilled and cored by the U.S. Geological Survey to examine the geology of the Mariano Lake-Lake Valley area in the southern part of the San Juan basin, New Mexico. The logs were made to determine the petrophysical properties of the rocks penetrated by the holes, to aid in making stratigraphic correlations between the holes, and to estimate the grade of uranium enrichment in mineralized zones. The logs can be divided into six categories: nuclear, electric, sonic, magnetic, dipmeter, and borehole conditions. Examples of these logs are presented and related to lithological and petrophysical properties of the cores recovered. Gamma-ray and prompt fission neutron logs were used to estimate uranium grade in mineralized zones. Resistivity and spontaneous potential logs were used to make stratigraphic correlations between drill holes and to determine the variability of the sandstone:mudstone ratios of the major sedimentary units. In one drill hole a dipmeter log was used to estimate the direction of sediment transport of the fluvial host rock. Magnetic susceptibility logs provided supportive information for a laboratory study of magnetic mineral alteration in drill cores. This study was used to infer the geochemical and hydrologic environment associated with uranium deposition in the project area.

  9. Mastering Citrix XenServer

    CERN Document Server

    Reed, Martez

    2014-01-01

    If you are an administrator who is looking to gain a greater understanding of how to design and implement a virtualization solution based on Citrix® XenServer®, then this book is for you. The book will serve as an excellent resource for those who are already familiar with other virtualization platforms, such as Microsoft Hyper-V or VMware vSphere.The book assumes that you have a good working knowledge of servers, networking, and storage technologies.

  10. Upgrading a TCABR Data Analysis and Acquisition System for Remote Participation Using Java, XML, RCP and Modern Client/Server Communication/Authentication

    Energy Technology Data Exchange (ETDEWEB)

    De Sa, W. [University of Sao Paulo - Institute of Physics - Plasma Physics Laboratory, Sao Paulo (Brazil)

    2009-07-01

    Each plasma physics laboratory has a proprietary control and data acquisition scheme, which usually differs from one laboratory to another: each laboratory has its own way of controlling the experiment and retrieving data from the database. Fusion research relies to a great extent on international collaboration, and it is difficult to follow the work remotely with a private system. The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as its programming environment. Since application parameters and hardware in a joint experiment are very complex, with a large variability of components, requirement and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, systems are developed using the eXtensible Markup Language (XML) technology. The communication between clients and servers uses Remote Procedure Calls (RPC) based on XML (the RPC-XML technology). The integration of the Java language, XML, and RPC-XML technologies makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides a simple Graphical User Interface (GUI).
    The TCABR tokamak team, collaborating with the CFN (Nuclear Fusion Center, Technical University of Lisbon), is implementing these remote participation technologies, which are going to be tested at the Joint Experiment on TCABR (TCABR-JE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on
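
    The XML-based RPC pattern the record above describes can be sketched with Python's standard xmlrpc modules. The method name (`get_signal`) and payload below are illustrative assumptions for this sketch, not the TCABR system's actual API.

    ```python
    # Minimal sketch of an XML-RPC data-access layer: a server registers a
    # retrieval method, and a client calls it over XML-encoded RPC.
    import threading
    from xmlrpc.server import SimpleXMLRPCServer
    from xmlrpc.client import ServerProxy

    def get_signal(shot, channel):
        """Hypothetical retrieval method: return a few samples for a shot/channel."""
        return {"shot": shot, "channel": channel, "samples": [0.1, 0.2, 0.3]}

    # Port 0 lets the OS pick a free port; run the server in a background thread.
    server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False, allow_none=True)
    server.register_function(get_signal)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # Any client (any laboratory) retrieves data through the same method.
    client = ServerProxy(f"http://127.0.0.1:{port}")
    reply = client.get_signal(12345, "H-alpha")
    server.shutdown()
    print(reply["samples"])
    ```

    Because the call and response are marshalled as XML, the same access layer works regardless of the client's operating system or architecture, which is the point of the design described in the abstract.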

  11. Performance analysis of a cellular automation algorithm for the solution of the track reconstruction problem on a manycore server at LIT, JINR

    International Nuclear Information System (INIS)

    Kulakov, I.S.; Baginyan, S.A.; Ivanov, V.V.; Kisel', P.I.

    2013-01-01

    The results of tests of the track reconstruction efficiency, the speed of the algorithm, and its scalability with respect to the number of cores of a server with two Intel Xeon E5640 CPUs (8 physical or 16 logical cores in total) are presented and discussed.

  12. Windows Server 2012 R2 administrator cookbook

    CERN Document Server

    Krause, Jordan

    2015-01-01

    This book is intended for system administrators and IT professionals with experience in Windows Server 2008 or Windows Server 2012 environments who are looking to acquire the skills and knowledge necessary to manage and maintain the core infrastructure required for a Windows Server 2012 and Windows Server 2012 R2 environment.

  13. A Capacity Supply Model for Virtualized Servers

    Directory of Open Access Journals (Sweden)

    Alexander PINNOW

    2009-01-01

    Full Text Available This paper deals with determining the capacity supply for virtualized servers. First, a server is modeled as a queue based on a Markov chain. Then, the effect of server virtualization on the capacity supply will be analyzed with the distribution function of the server load.
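
    The queueing model mentioned in the record above can be illustrated with the simplest birth-death Markov chain, the M/M/1 queue; this is a generic sketch of that standard model, not a reproduction of the paper's own capacity-supply analysis.

    ```python
    # Steady-state metrics of an M/M/1 queue (server modeled as a Markov chain).
    def mm1_metrics(arrival_rate, service_rate):
        """Return utilization, mean number of jobs, and idle probability."""
        rho = arrival_rate / service_rate      # server load (utilization)
        assert rho < 1, "queue is unstable when load >= 1"
        mean_jobs = rho / (1.0 - rho)          # E[N] of the geometric steady state
        p_idle = 1.0 - rho                     # probability the server is empty
        return rho, mean_jobs, p_idle

    rho, mean_jobs, p_idle = mm1_metrics(arrival_rate=4.0, service_rate=5.0)
    print(rho, mean_jobs, p_idle)
    ```

    The distribution function of the load (here summarized by the utilization `rho`) is what drives how much capacity a virtualized host must supply to its guests.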

  14. Mac OS X Lion Server For Dummies

    CERN Document Server

    Rizzo, John

    2011-01-01

    The perfect guide to help administrators set up Apple's Mac OS X Lion Server With the overwhelming popularity of the iPhone and iPad, more Macs are appearing in corporate settings. The newest version of Mac Server is the ideal way to administer a Mac network. This friendly guide explains to both Windows and Mac administrators how to set up and configure the server, including services such as iCal Server, Podcast Producer, Wiki Server, Spotlight Server, iChat Server, File Sharing, Mail Services, and support for iPhone and iPad. It explains how to secure, administer, and troubleshoot the networ

  15. An Improved Algorithm Research on the PrefixSpan Based on the Server Session Constraint

    Directory of Open Access Journals (Sweden)

    Cai Hong-Guo

    2017-01-01

    Full Text Available When mining long sequential patterns and discovering knowledge with the PrefixSpan algorithm in Web Usage Mining (WUM), the large number of elements and suffix sequences can cause computational problems such as space explosion. To address this problem more effectively, a server session-based server log file format is first proposed. Then, an improved PrefixSpan algorithm based on the server session constraint is discussed for mining frequent sequential patterns on the website. Finally, the validity and superiority of the method are demonstrated by the experiment presented in the paper.
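
    For context, the basic PrefixSpan idea (without the paper's server-session constraint) can be sketched in a few lines: recursively grow frequent prefixes by scanning projected suffix databases. This is a minimal illustration over single-item sequences, not the authors' improved algorithm.

    ```python
    # Basic PrefixSpan: mine frequent sequential patterns by prefix projection.
    def prefixspan(sequences, min_support):
        results = []

        def project(db, item):
            # Keep the suffix after the first occurrence of `item` in each sequence.
            return [seq[seq.index(item) + 1:] for seq in db if item in seq]

        def mine(prefix, db):
            counts = {}
            for seq in db:
                for item in set(seq):          # count each sequence once per item
                    counts[item] = counts.get(item, 0) + 1
            for item, count in sorted(counts.items()):
                if count >= min_support:
                    pattern = prefix + [item]
                    results.append((pattern, count))
                    mine(pattern, project(db, item))

        mine([], sequences)
        return results

    # Toy clickstream sessions (one sequence of page ids per session).
    logs = [["a", "b", "c"], ["a", "c"], ["a", "b", "c"], ["b", "c"]]
    patterns = prefixspan(logs, min_support=3)
    print(patterns)
    ```

    The space-explosion problem the abstract mentions comes from the projected databases created at every recursion step; constraining projection to server sessions is the paper's way of pruning that growth.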

  16. KFC Server: interactive forecasting of protein interaction hot spots.

    Science.gov (United States)

    Darnell, Steven J; LeGault, Laura; Mitchell, Julie C

    2008-07-01

    The KFC Server is a web-based implementation of the KFC (Knowledge-based FADE and Contacts) model, a machine learning approach for the prediction of binding hot spots, or the subset of residues that account for most of a protein interface's binding free energy. The server facilitates the automated analysis of a user submitted protein-protein or protein-DNA interface and the visualization of its hot spot predictions. For each residue in the interface, the KFC Server characterizes its local structural environment, compares that environment to the environments of experimentally determined hot spots and predicts if the interface residue is a hot spot. After the computational analysis, the user can visualize the results using an interactive job viewer able to quickly highlight predicted hot spots and surrounding structural features within the protein structure. The KFC Server is accessible at http://kfc.mitchell-lab.org.

  17. Web Log Pre-processing and Analysis for Generation of Learning Profiles in Adaptive E-learning

    Directory of Open Access Journals (Sweden)

    Radhika M. Pai

    2016-03-01

    Full Text Available Adaptive E-learning Systems (AESs) enhance the efficiency of online courses in education by providing personalized contents and user interfaces that change according to the learner’s requirements and usage patterns. This paper presents an approach to generating a learning profile for each learner, which helps identify learning styles and provide an Adaptive User Interface that includes adaptive learning components and learning material. The proposed method analyzes the captured web usage data to identify the learning profile of the learners. The learning profiles are identified by an algorithmic approach based on the frequency of accessing the materials and the time spent on the various learning components on the portal. The captured log data is pre-processed and converted into a standard XML format to generate learners’ sequence data corresponding to the different sessions and time spent. The learning style model adopted in this approach is the Felder-Silverman Learning Style Model (FSLSM). This paper also presents the analysis of learners’ activities, preprocessed XML files and generated sequences.
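
    The pre-processing step described above (cleaned log records converted to per-session XML) can be sketched with Python's standard `xml.etree.ElementTree`. The element and attribute names below are illustrative assumptions, not the paper's actual schema.

    ```python
    # Convert cleaned (session, component, time-spent) records into session XML.
    import xml.etree.ElementTree as ET

    records = [  # hypothetical records after log cleaning
        ("s1", "video_lecture", 420),
        ("s1", "quiz", 180),
        ("s2", "notes", 300),
    ]

    root = ET.Element("learner")
    sessions = {}
    for sid, component, seconds in records:
        # One <session> element per session id, accesses nested beneath it.
        if sid not in sessions:
            sessions[sid] = ET.SubElement(root, "session", id=sid)
        ET.SubElement(sessions[sid], "access", component=component,
                      seconds=str(seconds))

    xml_text = ET.tostring(root, encoding="unicode")
    print(xml_text)
    ```

    Grouping accesses by session like this is what makes the frequency and time-spent statistics per learning component straightforward to compute downstream.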

  18. Web Log Pre-processing and Analysis for Generation of Learning Profiles in Adaptive E-learning

    Directory of Open Access Journals (Sweden)

    Radhika M. Pai

    2016-04-01

    Full Text Available Adaptive E-learning Systems (AESs) enhance the efficiency of online courses in education by providing personalized contents and user interfaces that change according to the learner’s requirements and usage patterns. This paper presents an approach to generating a learning profile for each learner, which helps identify learning styles and provide an Adaptive User Interface that includes adaptive learning components and learning material. The proposed method analyzes the captured web usage data to identify the learning profile of the learners. The learning profiles are identified by an algorithmic approach based on the frequency of accessing the materials and the time spent on the various learning components on the portal. The captured log data is pre-processed and converted into a standard XML format to generate learners’ sequence data corresponding to the different sessions and time spent. The learning style model adopted in this approach is the Felder-Silverman Learning Style Model (FSLSM). This paper also presents the analysis of learners’ activities, preprocessed XML files and generated sequences.

  19. Encyclopedia of well logging

    International Nuclear Information System (INIS)

    Desbrandes, R.

    1985-01-01

    The 16 chapters of this book aim to provide students, trainees and engineers with a manual covering all well-logging measurements, ranging from drilling to production, from oil to minerals, by way of geothermal energy. Each chapter is a summary, with a bibliography given at its end. Well-logging during drilling, wireline logging equipment and techniques, petroleum logging, data processing of borehole data, interpretation of well-logging, sampling tools, completion and production logging, logging in relief wells to kill off uncontrolled blowouts, techniques for high temperature geothermal energy, small-scale mining and hydrology, logging with oil-base mud and finally recommended logging programs are all topics covered. There is one chapter on nuclear well-logging which is indexed separately. (UK)

  20. Holocene tree-line variability in the Kauner Valley, Central Eastern Alps, indicated by dendrochronological analysis of living trees and subfossil logs

    NARCIS (Netherlands)

    Nicolussi, Kurt; Kaufmann, Matthias; Patzelt, Gernot; van der Plicht, Johannes; Thurner, Andrea

    2005-01-01

    The altitude of the Alpine tree-line has often been used as proxy for the climatic conditions in the Holocene epoch. The usual approach for establishing a record for this proxy is the analysis of pollen and macro remains. We analysed living trees and subfossil logs from the timberline ecotone in the

  1. Windows Server 2012 vulnerabilities and security

    Directory of Open Access Journals (Sweden)

    Gabriel R. López

    2015-09-01

    Full Text Available This investigation analyses the history of the vulnerabilities of the base system Windows Server 2012, highlighting the most critical vulnerabilities reported every four months from its release to the date of the research, organized by vulnerability type according to the NIST classification. Next, given the official vulnerabilities of the system, the authors show how a critical vulnerability is treated by Microsoft in order to counter the security flaw. Then, the authors present the recommended security approaches for Windows Server 2012, which focus on the baseline software given by Microsoft; update, patch and change management; hardening practices; and the application of Active Directory Rights Management Services (AD RMS). AD RMS is considered an important feature since it is able to protect the system, even when compromised, using access lists at a document level. Finally, the investigation of the state of the art related to the security of Windows Server 2012 presents an analysis of solutions from third-party vendors, which offer security products to secure the base system that is the object of this study. The authors highlight the security vendor Symantec, discussing its successful features as well as characteristics that they consider could be improved in future versions of its security solution.

  2. TBI server: a web server for predicting ion effects in RNA folding.

    Science.gov (United States)

    Zhu, Yuhong; He, Zhaojian; Chen, Shi-Jie

    2015-01-01

    Metal ions play a critical role in the stabilization of RNA structures. Therefore, accurate prediction of the ion effects in RNA folding can have a far-reaching impact on our understanding of RNA structure and function. Multivalent ions, especially Mg²⁺, are essential for RNA tertiary structure formation. These ions can possibly become strongly correlated in the close vicinity of the RNA surface. Most of the currently available software packages, which have widespread success in predicting ion effects in biomolecular systems, however, do not explicitly account for the ion correlation effect. Therefore, it is important to develop a software package/web server for the prediction of ion electrostatics in RNA folding by including ion correlation effects. The TBI web server http://rna.physics.missouri.edu/tbi_index.html provides predictions for the total electrostatic free energy, the different free energy components, and the mean number and the most probable distributions of the bound ions. A novel feature of the TBI server is its ability to account for ion correlation and ion distribution fluctuation effects. By accounting for the ion correlation and fluctuation effects, the TBI server is a unique online tool for computing ion-mediated electrostatic properties for given RNA structures. The results can provide important data for in-depth analysis for ion effects in RNA folding including the ion-dependence of folding stability, ion uptake in the folding process, and the interplay between the different energetic components.

  3. TBI server: a web server for predicting ion effects in RNA folding.

    Directory of Open Access Journals (Sweden)

    Yuhong Zhu

    Full Text Available Metal ions play a critical role in the stabilization of RNA structures. Therefore, accurate prediction of the ion effects in RNA folding can have a far-reaching impact on our understanding of RNA structure and function. Multivalent ions, especially Mg²⁺, are essential for RNA tertiary structure formation. These ions can possibly become strongly correlated in the close vicinity of the RNA surface. Most of the currently available software packages, which have widespread success in predicting ion effects in biomolecular systems, however, do not explicitly account for the ion correlation effect. Therefore, it is important to develop a software package/web server for the prediction of ion electrostatics in RNA folding by including ion correlation effects. The TBI web server http://rna.physics.missouri.edu/tbi_index.html provides predictions for the total electrostatic free energy, the different free energy components, and the mean number and the most probable distributions of the bound ions. A novel feature of the TBI server is its ability to account for ion correlation and ion distribution fluctuation effects. By accounting for the ion correlation and fluctuation effects, the TBI server is a unique online tool for computing ion-mediated electrostatic properties for given RNA structures. The results can provide important data for in-depth analysis for ion effects in RNA folding including the ion-dependence of folding stability, ion uptake in the folding process, and the interplay between the different energetic components.

  4. Professional Team Foundation Server 2012

    CERN Document Server

    Blankenship, Ed; Holliday, Grant; Keller, Brian

    2012-01-01

    A comprehensive guide to using Microsoft Team Foundation Server 2012 Team Foundation Server has become the leading Microsoft productivity tool for software management, and this book covers what developers need to know to use it effectively. Fully revised for the new features of TFS 2012, it provides developers and software project managers with step-by-step instructions and even assists those who are studying for the TFS 2012 certification exam. You'll find a broad overview of TFS, thorough coverage of core functions, a look at extensibility options, and more, written by Microsoft ins

  5. GeoServer beginner's guide

    CERN Document Server

    Youngblood, Brian

    2013-01-01

    Step-by-step instructions are included, and the needs of a beginner are fully satisfied by the book. The book contains plenty of examples with accompanying screenshots and code for an easy learning curve. You are a web developer with knowledge of server-side scripting and experience installing applications on a server. You want more than Google Maps, offering dynamically built maps on your site with your latest geospatial data stored in MySQL, PostGIS, MsSQL or Oracle. If this is the case, this book is meant for you.

  6. Professional Team Foundation Server 2010

    CERN Document Server

    Blankenship, Ed; Holliday, Grant; Keller, Brian

    2011-01-01

    Authoritative guide to TFS 2010 from a dream team of Microsoft insiders and MVPs! Microsoft Visual Studio Team Foundation Server (TFS) has evolved until it is now an essential tool for Microsoft's Application Lifecycle Management suite of productivity tools, enabling collaboration within and among software development teams. By 2011, TFS will replace Microsoft's leading source control system, VisualSourceSafe, resulting in an even greater demand for information about it. Professional Team Foundation Server 2010, written by an accomplished team of Microsoft insiders and Microsoft MVPs, provides

  7. Selective logging in the Brazilian Amazon.

    Science.gov (United States)

    G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva

    2005-01-01

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...

  8. Analisis Forensik Jaringan Studi Kasus Serangan SQL Injection pada Server Universitas Gadjah Mada

    Directory of Open Access Journals (Sweden)

    Resi Utami Putri

    2013-07-01

    Abstract Network forensics is a computer security investigation to find the sources of attacks on the network by examining log evidence: identifying, analyzing and reconstructing the incidents. This research was conducted at the Center of Information System and Communication Service, Gadjah Mada University. The method used was the Forensic Process Model, a model of the digital investigation process consisting of collection, examination, analysis, and reporting. The research was conducted over five months, retrieving data collected by the Snort Intrusion Detection System (IDS). Several log files were retrieved and merged into a single log file, and the data were then cleaned to fit the research. Based on the research, 68 IP addresses performed illegal actions, namely SQL injection, on the server www.ugm.ac.id. Most attackers used Havij and SQLmap (automated tools to exploit vulnerabilities on a website). Besides that, there was also a Python script originating from Romania, in Europe.   Keywords— Network Forensics, The Forensic Process Models, SQL Injection
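
    The log-consolidation step described above (merging several alert files, then tallying distinct source IPs per attack signature) can be sketched as follows. The log line format and field names are illustrative assumptions, not Snort's actual alert format.

    ```python
    # Merge alert log fragments chronologically, then count attacker IPs
    # per attack signature found in the merged log.
    import re
    from collections import defaultdict

    log_parts = [
        "2013-05-01 10:00:01 [SQL Injection] src=10.0.0.5 dst=www.example.edu\n",
        "2013-05-01 10:00:07 [SQL Injection] src=10.0.0.9 dst=www.example.edu\n",
        "2013-05-01 10:01:02 [SQL Injection] src=10.0.0.5 dst=www.example.edu\n",
    ]
    merged = "".join(sorted(log_parts))  # timestamps sort lexicographically here

    attackers = defaultdict(set)
    for line in merged.splitlines():
        m = re.search(r"\[(?P<sig>[^\]]+)\] src=(?P<ip>\S+)", line)
        if m:
            attackers[m.group("sig")].add(m.group("ip"))

    print({sig: len(ips) for sig, ips in attackers.items()})
    ```

    Deduplicating by source IP, as the set does here, is what turns thousands of raw alerts into the kind of "68 distinct attacking addresses" summary the study reports.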

  9. Learning SQL Server Reporting Services 2012

    CERN Document Server

    Krishnaswamy, Jayaram

    2013-01-01

    The book is packed with clear instructions and plenty of screenshots, providing all the support and guidance you will need as you begin to generate reports with SQL Server 2012 Reporting Services.This book is for those who are new to SQL Server Reporting Services 2012 and aspiring to create and deploy cutting edge reports. This book is for report developers, report authors, ad-hoc report authors and model developers, and Report Server and SharePoint Server Integrated Report Server administrators. Minimal knowledge of SQL Server is assumed and SharePoint experience would be helpful.

  10. Implementing Citrix XenServer Quickstarter

    CERN Document Server

    Ahmed, Gohar

    2013-01-01

    Implementing Citrix XenServer Quick Starter is a practical, hands-on guide that will help you get started with the Citrix XenServer Virtualization technology with easy-to-follow instructions.Implementing Citrix XenServer Quick Starter is for system administrators who have little to no information on virtualization and specifically Citrix XenServer Virtualization. If you're managing a lot of physical servers and are tired of installing, deploying, updating, and managing physical machines on a daily basis over and over again, then you should probably explore your option of XenServer Virtualizati

  11. Open client/server computing and middleware

    CERN Document Server

    Simon, Alan R

    2014-01-01

    Open Client/Server Computing and Middleware provides a tutorial-oriented overview of open client/server development environments and how client/server computing is being done.This book analyzes an in-depth set of case studies about two different open client/server development environments-Microsoft Windows and UNIX, describing the architectures, various product components, and how these environments interrelate. Topics include the open systems and client/server computing, next-generation client/server architectures, principles of middleware, and overview of ProtoGen+. The ViewPaint environment

  12. Beginning Microsoft SQL Server 2012 Programming

    CERN Document Server

    Atkinson, Paul

    2012-01-01

    Get up to speed on the extensive changes to the newest release of Microsoft SQL Server The 2012 release of Microsoft SQL Server changes how you develop applications for SQL Server. With this comprehensive resource, SQL Server authority Robert Vieira presents the fundamentals of database design and SQL concepts, and then shows you how to apply these concepts using the updated SQL Server. Publishing time and date with the 2012 release, Beginning Microsoft SQL Server 2012 Programming begins with a quick overview of database design basics and the SQL query language and then quickly proceeds to sho

  13. Analysis of borehole-radar reflection logs from selected HC boreholes at the Project Shoal area, Churchill County, Nevada; TOPICAL

    International Nuclear Information System (INIS)

    Lane, J.W. Jr.; Joesten, P.K.; Pohll, Greg; Mihevic, Todd

    2001-01-01

    Single-hole borehole-radar reflection logs were collected and interpreted in support of a study to characterize ground-water flow and transport at the Project Shoal Area (PSA) in Churchill County, Nevada. Radar logging was conducted in six boreholes using 60-MHz omni-directional electric-dipole antennas and a 60-MHz magnetic-dipole directional receiving antenna. Radar data from five boreholes were interpreted to identify the location, orientation, estimated length, and spatial continuity of planar reflectors present in the logs. The overall quality of the radar data is marginal and ranges from very poor to good. Twenty-seven reflectors were interpreted from the directional radar reflection logs. Although the range of orientation interpreted for the reflectors is large, a significant number of reflectors strike northeast-southwest and east-west to slightly northwest-southeast. Reflectors are moderate to steeply dipping and reflector length ranged from less than 7 m to more than 133 m. Qualitative scores were assigned to each reflector to provide a sense of the spatial continuity of the reflector and the characteristics of the field data relative to an ideal planar reflector (orientation score). The overall orientation scores are low, which reflects the general data quality, but also indicates that the properties of most reflectors depart from the ideal planar case. The low scores are consistent with reflections from fracture zones that contain numerous, closely spaced, sub-parallel fractures. 
Interpretation of borehole-radar direct-wave velocity and amplitude logs identified several characteristics of the logged boreholes: (1) low-velocity zones correlate with decreased direct-wave amplitude, indicating the presence of fracture zones; (2) direct-wave amplitude increases with depth in three of the boreholes, suggesting an increase in electrical resistivity with depth resulting from changes in mineral assemblage or from a decrease in the specific conductance of ground

  14. Analysis of borehole-radar reflection logs from selected HC boreholes at the Project Shoal area, Churchill County, Nevada

    Science.gov (United States)

    Lane, J.W.; Joesten, P.K.; Pohll, G.M.; Mihevic, Todd

    2001-01-01

    Single-hole borehole-radar reflection logs were collected and interpreted in support of a study to characterize ground-water flow and transport at the Project Shoal Area (PSA) in Churchill County, Nevada. Radar logging was conducted in six boreholes using 60-MHz omni-directional electric-dipole antennas and a 60-MHz magnetic-dipole directional receiving antenna. Radar data from five boreholes were interpreted to identify the location, orientation, estimated length, and spatial continuity of planar reflectors present in the logs. The overall quality of the radar data is marginal and ranges from very poor to good. Twenty-seven reflectors were interpreted from the directional radar reflection logs. Although the range of orientation interpreted for the reflectors is large, a significant number of reflectors strike northeast-southwest and east-west to slightly northwest-southeast. Reflectors are moderate to steeply dipping and reflector length ranged from less than 7 m to more than 133 m. Qualitative scores were assigned to each reflector to provide a sense of the spatial continuity of the reflector and the characteristics of the field data relative to an ideal planar reflector (orientation score). The overall orientation scores are low, which reflects the general data quality, but also indicates that the properties of most reflectors depart from the ideal planar case. The low scores are consistent with reflections from fracture zones that contain numerous, closely spaced, sub-parallel fractures. Interpretation of borehole-radar direct-wave velocity and amplitude logs identified several characteristics of the logged boreholes: (1) low-velocity zones correlate with decreased direct-wave amplitude, indicating the presence of fracture zones; (2) direct-wave amplitude increases with depth in three of the boreholes, suggesting an increase in electrical resistivity with depth resulting from changes in mineral assemblage or from a decrease in the specific conductance of ground

  15. Building server capabilities in China

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi; Slepniov, Dmitrij; Wæhrens, Brian Vejrum

    2012-01-01

    The purpose of this paper is to further our understanding of multinational companies building server capabilities in China. The paper is based on the cases of two western companies with operations in China. The findings highlight a number of common patterns in the 1) managerial challenges related...

  16. Client-server password recovery

    NARCIS (Netherlands)

    Chmielewski, Ł.; Hoepman, J.H.; Rossum, P. van

    2009-01-01

    Human memory is not perfect - people constantly memorize new facts and forget old ones. One example is forgetting a password, a common problem raised at IT help desks. We present several protocols that allow a user to automatically recover a password from a server using partial knowledge of the

  17. Client-Server Password Recovery

    NARCIS (Netherlands)

    Chmielewski, L.; Hoepman, J.H.; Rossum, P. van

    2009-01-01

    Human memory is not perfect – people constantly memorize new facts and forget old ones. One example is forgetting a password, a common problem raised at IT help desks. We present several protocols that allow a user to automatically recover a password from a server using partial knowledge of the

  18. Team Foundation Server 2013 customization

    CERN Document Server

    Beeming, Gordon

    2014-01-01

    This book utilizes a tutorial based approach, focused on the practical customization of key features of the Team Foundation Server for collaborative enterprise software projects.This practical guide is intended for those who want to extend TFS. This book is for intermediate users who have an understanding of TFS, and basic coding skills will be required for the more complex customizations.

  19. SedMob: A mobile application for creating sedimentary logs in the field

    Science.gov (United States)

    Wolniewicz, Pawel

    2014-05-01

    SedMob is an open-source mobile software package for creating sedimentary logs, targeted for use on tablets and smartphones. The user can create an unlimited number of logs, save data for each bed in the log, and export and synchronize the data with a remote server. SedMob is designed as a mobile interface to SedLog, a free multiplatform package for drawing graphic logs that runs on desktop computers. Data entered into SedMob are saved in the CSV file format, fully compatible with SedLog.
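
    Per-bed CSV export of the kind SedMob performs can be sketched with Python's standard `csv` module. The column names below are illustrative assumptions for the sketch, not SedLog's actual header.

    ```python
    # Write one CSV row per bed of a sedimentary log.
    import csv
    import io

    beds = [  # hypothetical bed records captured in the field
        {"thickness_m": 0.4, "grain_size": "fine sand", "lithology": "sandstone"},
        {"thickness_m": 1.2, "grain_size": "silt", "lithology": "siltstone"},
    ]

    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["thickness_m", "grain_size", "lithology"])
    writer.writeheader()
    writer.writerows(beds)
    csv_text = buf.getvalue()
    print(csv_text)
    ```

    Keeping the export as plain CSV is what makes round-tripping between a mobile recorder and a desktop drawing package straightforward.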

  20. SHADE3 server

    DEFF Research Database (Denmark)

    Madsen, Anders Østergaard; Hoser, Anna Agnieszka

    2014-01-01

    translation-libration-screw analysis with input from periodic ab initio calculations. The second method allows the user to input experimental information from spectroscopic measurements or from neutron diffraction experiments on related structures and utilize this information to evaluate ADPs of H atoms...

  1. Identifying APT Malware Domain Based on Mobile DNS Logging

    Directory of Open Access Journals (Sweden)

    Weina Niu

    2017-01-01

    Full Text Available Advanced Persistent Threat (APT) is a serious threat against sensitive information. Current detection approaches are time-consuming, since they detect APT attacks by in-depth analysis of massive amounts of data after data breaches. Specifically, APT attackers make use of DNS to locate their command and control (C&C) servers and victims’ machines. In this paper, we propose an efficient approach to detect APT malware C&C domains with high accuracy by analyzing DNS logs. We first extract 15 features from DNS logs of mobile devices. According to Alexa ranking and VirusTotal’s judgement results, we give each domain a score. Then, we select the most normal domains by the score metric. Finally, we utilize our anomaly detection algorithm, called Global Abnormal Forest (GAF), to identify malware C&C domains. We conduct a performance analysis to demonstrate that our approach is more efficient than other existing works in terms of calculation efficiency and recognition accuracy. Compared with Local Outlier Factor (LOF), k-Nearest Neighbor (KNN), and Isolation Forest (iForest), our approach obtains more than 99% F-M and R for the detection of C&C domains. Our approach not only reduces the volume of data that needs to be recorded and analyzed but is also applicable to unsupervised learning.
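
    The general shape of the pipeline above (per-domain features extracted from DNS logs, then outlier detection) can be illustrated with a toy example. The paper uses 15 features and the authors' Global Abnormal Forest; here a single feature (query count) and a plain z-score stand in, purely for illustration.

    ```python
    # Toy DNS-log anomaly sketch: count queries per domain, flag z-score outliers.
    import math
    from collections import Counter

    queries = (["mail.example.com"] * 50 + ["cdn.example.net"] * 46 +
               ["x9f2a.bad-c2.biz"] * 4)  # hypothetical query log

    counts = Counter(queries)
    values = list(counts.values())
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))

    # Domains whose query count deviates strongly from the rest are suspects.
    flagged = [d for d, v in counts.items() if abs(v - mean) / std > 1.2]
    print(flagged)
    ```

    A rarely queried, oddly named domain standing apart from the bulk of normal traffic is exactly the signature such scoring is meant to surface; the real system scores many more features than query volume.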

  2. Artificial intelligence approach to interwell log correlation

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Jong-Se [Korea Maritime University, Pusan(Korea); Kang, Joo Myung [Seoul National University, Seoul(Korea); Kim, Jung Whan [Korea National Oil Corp., Anyang(Korea)

    2000-04-30

    This paper describes a new approach to automated interwell log correlation using artificial intelligence and principal component analysis. The approach correlates wireline logging data on the basis of a large set of subjective rules intended to represent human logical processes. The data processed are mainly qualitative information, such as the characteristics of the shapes extracted along log traces. The apparent geologic zones are identified by pattern recognition applied to the specific characteristics of log traces, collected as a set of objects through object-oriented programming. The correlation of zones between wells is made by a rule-based inference program. A reliable correlation can be established from the first principal component logs, derived both from the important information around the well bore and from the largest common part of the variances of all available well log data. Correlation with field log data shows that this approach can make interwell log correlation more reliable and accurate. (author). 6 refs., 7 figs.
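
    A "first principal component log" of the kind mentioned above can be sketched for two measured curves: centre the curves, form their covariance matrix, find the dominant eigenvector by power iteration, and project each depth sample onto it. The curve names and values are illustrative assumptions, not the paper's data.

    ```python
    # Derive a first-principal-component log from two well-log curves.
    def first_pc_log(curve_a, curve_b, iters=200):
        n = len(curve_a)
        ma, mb = sum(curve_a) / n, sum(curve_b) / n
        a = [x - ma for x in curve_a]          # centred curves
        b = [x - mb for x in curve_b]
        # 2x2 covariance matrix entries
        caa = sum(x * x for x in a) / n
        cbb = sum(x * x for x in b) / n
        cab = sum(x * y for x, y in zip(a, b)) / n
        # power iteration for the dominant eigenvector
        v = [1.0, 1.0]
        for _ in range(iters):
            w = [caa * v[0] + cab * v[1], cab * v[0] + cbb * v[1]]
            norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
            v = [w[0] / norm, w[1] / norm]
        # project each depth sample onto the principal component
        return [v[0] * x + v[1] * y for x, y in zip(a, b)]

    gamma = [80.0, 82.0, 60.0, 58.0, 81.0]   # hypothetical gamma-ray readings
    sonic = [95.0, 97.0, 70.0, 68.0, 96.0]   # hypothetical sonic readings
    pc1 = first_pc_log(gamma, sonic)
    print(pc1)
    ```

    Because the first component captures the largest common part of the curves' variance, zone boundaries show up as sign changes or jumps in `pc1`, which is what the rule-based correlation then works from.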

  3. Web Server Configuration for an Academic Intranet

    National Research Council Canada - National Science Library

    Baltzis, Stamatios

    2000-01-01

    .... One of the factors that boosted this ability was the evolution of Web Servers. Using web server technology, one can connect and exchange information with the most remote places all over the...

  4. Implementation of SRPT Scheduling in Web Servers

    National Research Council Canada - National Science Library

    Harchol-Balter, Mor

    2000-01-01

    .... Experiments use the Linux operating system and the Flash web server. All experiments are repeated under a range of server loads and under both trace-based workloads and those generated by a Web workload generator...

  5. Locating Nearby Copies of Replicated Internet Servers

    National Research Council Canada - National Science Library

    Guyton, James D; Schwartz, Michael F

    1995-01-01

    In this paper we consider the problem of choosing among a collection of replicated servers focusing on the question of how to make choices that segregate client/server traffic according to network topology...

  6. Log N-log S is inconclusive

    Science.gov (United States)

    Klebesadel, R. W.; Fenimore, E. E.; Laros, J.

    1983-01-01

    The log N-log S data acquired by the Pioneer Venus Orbiter Gamma Burst Detector (PVO) are presented and compared to similar data from the Soviet KONUS experiment. Although the PVO data are consistent with and suggestive of a -3/2 power law distribution, the results are not adequate at this stage of observations to differentiate between a -3/2 and a -1 power law slope.

  7. The Meaning of Logs

    NARCIS (Netherlands)

    Etalle, Sandro; Massacci, Fabio; Yautsiukhin, Artsiom

    2007-01-01

    While logging events is becoming increasingly common in computing, in communication and in collaborative work, log systems need to satisfy increasingly challenging (if not conflicting) requirements. Despite the growing pervasiveness of log systems, to date there is no high-level framework which

  8. The Meaning of Logs

    NARCIS (Netherlands)

    Etalle, Sandro; Massacci, Fabio; Yautsiukhin, Artsiom; Lambrinoudakis, Costas; Pernul, Günther; Tjoa, A Min

    While logging events is becoming increasingly common in computing, in communication and in collaborative environments, log systems need to satisfy increasingly challenging (if not conflicting) requirements. In this paper we propose a high-level framework for modeling log systems, and reasoning about

  9. Instant MDX queries for SQL Server 2012

    CERN Document Server

    Emond, Nicholas

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This short, focused guide is a great way to get started with writing MDX queries. New developers can use this book as a reference for how to use functions and the syntax of a query, as well as how to use Calculated Members and Named Sets. This book is great for new developers who want to learn the MDX query language from scratch and install SQL Server 2012 with Analysis Services.

  10. Parallel Computing Using Web Servers and "Servlets".

    Science.gov (United States)

    Lo, Alfred; Bloor, Chris; Choi, Y. K.

    2000-01-01

    Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…

  11. A polling model with an autonomous server

    NARCIS (Netherlands)

    de Haan, Roland; Boucherie, Richardus J.; van Ommeren, Jan C.W.

    Polling models are used as an analytical performance tool in several application areas. In these models, the focus often is on controlling the operation of the server as to optimize some performance measure. For several applications, controlling the server is not an issue as the server moves

  12. Zope based electronic operation log system - Zlog

    International Nuclear Information System (INIS)

    Yoshii, K.; Satoh, Y.; Kitabayashi, T.

    2004-01-01

    Since January 2004, the Zope-based electronic operation logging system, named Zlog, has been running at the KEKB and AR accelerator facilities. Since Zope is a Python-based open-source web application server and the Python language is familiar to the members of the KEKB accelerator control group, we were able to develop the Zlog system rapidly. In this paper, we report the development history and the present status of the Zlog system. We also show that some general plug-in components, called Zope products, have been useful for our Zlog development. (author)

  13. Identification and Management of Pump Thrombus in the HeartWare Left Ventricular Assist Device System: A Novel Approach Using Log File Analysis.

    Science.gov (United States)

    Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir

    2015-11-01

    The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute power consumption and in its rate of increase. Successful thrombolysis was defined as a clinical resolution of pump thrombus, including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87). Log file parameters can potentially predict the likelihood of successful tPA treatments and, if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
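The two log-file measures reported above (percent of expected power, and the rate of power increase before the event) are simple to compute from a daily power series. A hedged sketch; the 7-day window, readings, and expected-power value are hypothetical, not the study's data:

```python
import numpy as np

def power_metrics(power_watts, expected_watts):
    """Compute two metrics of the kind described in the study:
    percent of expected power at the peak, and the rate of increase
    in power (least-squares slope, in watts per day) over the window.
    """
    pct_expected = 100.0 * max(power_watts) / expected_watts
    days = np.arange(len(power_watts))
    slope = np.polyfit(days, power_watts, 1)[0]
    return pct_expected, slope

# Hypothetical 7-day power trend before a suspected thrombus event
readings = [5.0, 5.1, 5.3, 5.6, 6.1, 6.6, 7.2]
pct, slope = power_metrics(readings, expected_watts=5.0)
print(round(pct, 1), round(slope, 2))  # 144.0 percent, ~0.37 W/day
```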

  14. LogScope

    Science.gov (United States)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed specifically to assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).

  15. PENGEMBANGAN ANTIVIRUS BERBASIS CLIENT SERVER

    Directory of Open Access Journals (Sweden)

    Richki Hardi

    2015-07-01

    Full Text Available We live in an era in which computer viruses have grown rapidly, no longer a matter of mere academic research but a common problem for computer users worldwide. The resulting losses are increasing with the widespread use of the Internet as a global communication line between computer users around the world, as shown by the results of the CSI/FBI survey. Along with this progress, computer viruses have evolved in form, characteristics, and distribution media, including worms, spyware, Trojan horses, and other malcode programs. Through the development of a client-server-based antivirus, users can easily determine the behavior of viruses and worms, learn which part of an operating system is being attacked, and rely on the resulting network-based client-server antivirus as a fast and dependable scanning engine that recognizes viruses while economizing on memory management.

  16. CERN servers go to Mexico

    CERN Multimedia

    Stefania Pandolfi

    2015-01-01

    On Wednesday, 26 August, 384 servers from the CERN Computing Centre were donated to the Faculty of Science in Physics and Mathematics (FCFM) and the Mesoamerican Centre for Theoretical Physics (MCTP) at the University of Chiapas, Mexico.   CERN’s Director-General, Rolf Heuer, met the Mexican representatives in an official ceremony in Building 133, where the servers were prepared for shipment. From left to right: Frédéric Hemmer, CERN IT Department Head; Raúl Heredia Acosta, Deputy Permanent Representative of Mexico to the United Nations and International Organizations in Geneva; Jorge Castro-Valle Kuehne, Ambassador of Mexico to the Swiss Confederation and the Principality of Liechtenstein; Rolf Heuer, CERN Director-General; Luis Roberto Flores Castillo, President of the Swiss Chapter of the Global Network of Qualified Mexicans Abroad; Virginia Romero Tellez, Coordinator of Institutional Relations of the Swiss Chapter of the Global Network of Qualified Me...

  17. PostgreSQL server programming

    CERN Document Server

    Krosing, Hannu

    2013-01-01

    This practical guide leads you through numerous aspects of working with PostgreSQL. Step-by-step examples allow you to easily set up and extend PostgreSQL. "PostgreSQL Server Programming" is for moderate to advanced PostgreSQL database professionals. To get the best understanding of this book, you should have general experience in writing SQL, a basic idea of query tuning, and some coding experience in a language of your choice.

  18. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  19. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  20. Mastering Microsoft Windows Server 2008 R2

    CERN Document Server

    Minasi, Mark; Finn, Aidan

    2010-01-01

    The one book you absolutely need to get up and running with Windows Server 2008 R2. One of the world's leading Windows authorities and top-selling author Mark Minasi explores every nook and cranny of the latest version of Microsoft's flagship network operating system, Windows Server 2008 R2, giving you the most in-depth coverage in any book on the market: Focuses on Windows Server 2008 R2, the newest version of Microsoft's Windows Server line of operating systems, and the ideal server for new Windows 7 clients; Author Mark Minasi is one of the world's leading Windows authorities and h

  1. Professional Microsoft SQL Server 2012 Administration

    CERN Document Server

    Jorgensen, Adam; LoForte, Ross; Knight, Brian

    2012-01-01

    An essential how-to guide for experienced DBAs on the most significant product release since 2005! Microsoft SQL Server 2012 introduces major changes throughout SQL Server that will impact how DBAs administer their databases. With this book, a team of well-known SQL Server experts introduces the many new features of the most recent version of SQL Server and deciphers how these changes will affect the methods that administrators have been using for years. Loaded with unique tips, tricks, and workarounds for handling the most difficult SQL Server admin issues, this how-to guide deciphers topics s

  2. Multimedia medical data archive and retrieval server on the Internet

    Science.gov (United States)

    Komo, Darmadi; Levine, Betty A.; Freedman, Matthew T.; Mun, Seong K.; Tang, Y. K.; Chiang, Ted T.

    1997-05-01

    The Multimedia Medical Data Archive and Retrieval Server has been installed at the imaging science and information systems (ISIS) center in Georgetown University Medical Center to provide medical data archive and retrieval support for medical researchers. The medical data includes text, images, sound, and video. All medical data is keyword indexed using a database management system and placed temporarily in a staging area and then transferred to a StorageTek one terabyte tape library system with a robotic arm for permanent archive. There are two methods of interaction with the system. The first method is to use a web browser with HTML functions to perform insert, query, update, and retrieve operations. These generate dynamic SQL calls to the database and produce StorageTek API calls to the tape library. The HTML functions consist of a database, StorageTek interface, HTTP server, common gateway interface, and Java programs. The second method is to issue a DICOM store command, which is translated by the system's DICOM server to SQL calls and then produce StorageTek API calls to the tape library. The system performs as both an Internet and a DICOM server using standard protocols such as HTTP, HTML, Java, and DICOM. Users with proper authentication can log on to the server from anywhere on the Internet using a standard web browser resulting in a user-friendly, open environment, and platform independent solution for archiving multimedia medical data. It represents a complex integration of different components including a robotic tape storage system, database, user-interface, WWW protocols, and TCP/IP networking. The user will only deal with the WWW and DICOM server components of the system, the database and robotic tape library system are transparent and the user will not know that the medical data is stored on magnetic tapes. The server provides the researchers a cost-effective tool for archiving and retrieving medical data across a TCP/IP network environment. It will

  3. Using Cluster Analysis for Data Mining in Educational Technology Research

    Science.gov (United States)

    Antonenko, Pavlo D.; Toy, Serkan; Niederhauser, Dale S.

    2012-01-01

    Cluster analysis is a group of statistical methods that has great potential for analyzing the vast amounts of web server-log data to understand student learning from hyperlinked information resources. In this methodological paper we provide an introduction to cluster analysis for educational technology researchers and illustrate its use through…
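Clustering server-log data typically means extracting per-user session features (e.g. pages visited, time on site) and grouping similar sessions. A minimal k-means sketch under those assumptions; the feature names and data are illustrative, not from the paper:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means for clustering session features derived from
    web server logs. Returns per-row cluster labels and the centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each session to its nearest center (squared Euclidean)
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Recompute centers; keep the old center if a cluster empties
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels, centers

# Hypothetical per-student features: (pages visited, minutes on site)
X = np.array([[3, 5], [4, 6], [2, 4], [20, 45], [22, 40], [19, 50]], float)
labels, centers = kmeans(X, k=2)
print(labels)  # two clearly separated groups of three sessions each
```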

  4. Application of Well Log Analysis to Assess the Petrophysical Parameters of the Early Eocene Sui Main Limestone (SML in Kharnhak-1 Well, Middle Indus Basin, Pakistan

    Directory of Open Access Journals (Sweden)

    Asad Zia

    2016-04-01

    Full Text Available The petrophysical analysis of the early Eocene Sui Main Limestone (SML) has been conducted in the Kharnhak-1 well to assess the prospects for hydrocarbon exploration of the Khairpur-Jacobabad High, Middle Indus Basin, Pakistan. The petrophysical analysis of the SML is carried out on the basis of well logs including gamma ray, spontaneous potential, resistivity, neutron, and density logs. These analyses lead to an interpretation of the vertical distribution of porosity and permeability in order to measure the reservoir potential of the SML. The Archie equation was used to assess the petrophysical characteristics. The SML has good porosity and poor permeability, with a positive correlation coefficient between the two parameters. The average volume of shale is 18%. The log signature of the SML shows dominance of carbonates (limestone). The reservoir quality of the SML in the Kharnhak-1 well is such that it is 77% water saturated. The porosity (Φ) varies inversely with the formation resistivity factor (F) and compressional wave velocity (Vp). However, F and Vp are directly related to each other. Thus, the electric and elastic properties of the carbonate rocks can be influenced by postdepositional alterations, which include porosity enhancement and reduction processes respectively.
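The Archie workflow mentioned above computes the formation resistivity factor F = a/Φ^m and water saturation Sw = (F · Rw/Rt)^(1/n). A sketch with illustrative constants; the a, m, n defaults and the resistivity values are textbook assumptions, not the paper's calibrated parameters:

```python
def archie_sw(phi, rw, rt, a=1.0, m=2.0, n=2.0):
    """Water saturation from the Archie equation.

    phi: fractional porosity; rw: formation water resistivity (ohm-m);
    rt:  true formation resistivity (ohm-m); a, m, n: Archie constants.
    """
    F = a / phi ** m                    # formation resistivity factor
    return (F * rw / rt) ** (1.0 / n)   # water saturation, fraction

# Hypothetical SML interval: 12% porosity, Rw = 0.03 ohm-m, Rt = 3.5 ohm-m
sw = archie_sw(phi=0.12, rw=0.03, rt=3.5)
print(round(sw, 2))  # 0.77, i.e. 77% water saturated
```

Note the inverse relationship the abstract reports: F = a/Φ^m grows as porosity shrinks, pushing Sw up for a fixed Rt.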

  5. Amino Acid Interaction (INTAA) web server.

    Science.gov (United States)

    Galgonek, Jakub; Vymetal, Jirí; Jakubec, David; Vondrášek, Jirí

    2017-07-03

    Large biomolecules-proteins and nucleic acids-are composed of building blocks which define their identity, properties and binding capabilities. In order to shed light on the energetic side of interactions of amino acids between themselves and with deoxyribonucleotides, we present the Amino Acid Interaction web server (http://bioinfo.uochb.cas.cz/INTAA/). INTAA offers the calculation of the residue Interaction Energy Matrix for any protein structure (deposited in Protein Data Bank or submitted by the user) and a comprehensive analysis of the interfaces in protein-DNA complexes. The Interaction Energy Matrix web application aims to identify key residues within protein structures which contribute significantly to the stability of the protein. The application provides an interactive user interface enhanced by 3D structure viewer for efficient visualization of pairwise and net interaction energies of individual amino acids, side chains and backbones. The protein-DNA interaction analysis part of the web server allows the user to view the relative abundance of various configurations of amino acid-deoxyribonucleotide pairs found at the protein-DNA interface and the interaction energies corresponding to these configurations calculated using a molecular mechanical force field. The effects of the sugar-phosphate moiety and of the dielectric properties of the solvent on the interaction energies can be studied for the various configurations. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. A Universal Logging System for LHCb Online

    International Nuclear Information System (INIS)

    Nikolaidis, Fotis; Brarda, Loic; Garnier, Jean-Christophe; Neufeld, Niko

    2011-01-01

    A log is a recording of a system's activity, aimed at helping system administrators trace back an attack, find the causes of a malfunction, and assist generally with troubleshooting. The fact that logs may be the only information an administrator has about an incident makes the logging system a crucial part of an IT infrastructure. In large-scale infrastructures, such as LHCb Online, where quite a few GB of logs are produced daily, it is impossible for a human to review all of these logs. Moreover, a great percentage of them is just noise. This makes it clear that a more automated and sophisticated approach is needed. In this paper, we present a low-cost centralized logging system which allows us to do in-depth analysis of every log.

  7. Nova Event Logging System

    International Nuclear Information System (INIS)

    Calliger, R.J.; Suski, G.J.

    1981-01-01

    Nova is a 200 terawatt, 10-beam High Energy Glass Laser currently under construction at LLNL. This facility, designed to demonstrate the feasibility of laser driven inertial confinement fusion, contains over 5000 elements requiring coordinated control, data acquisition, and analysis functions. The large amounts of data that will be generated must be maintained over the life of the facility. Often the most useful but inaccessible data is that related to time dependent events associated with, for example, operator actions or experiment activity. We have developed an Event Logging System to synchronously record, maintain, and analyze, in part, this data. We see the system as being particularly useful to the physics and engineering staffs of medium and large facilities in that it is entirely separate from experimental apparatus and control devices. The design criteria, implementation, use, and benefits of such a system will be discussed

  8. Querying Workflow Logs

    Directory of Open Access Journals (Sweden)

    Yan Tang

    2018-01-01

    Full Text Available A business process or workflow is an assembly of tasks that accomplishes a business goal. Business process management is the study of the design, configuration/implementation, enactment and monitoring, analysis, and re-design of workflows. The traditional methodology for the re-design and improvement of workflows relies on the well-known sequence of extract, transform, and load (ETL), data/process warehousing, and online analytical processing (OLAP) tools. In this paper, we study the ad hoc querying of process enactments for (data-centric) business processes, bypassing the traditional methodology for more flexibility in querying. We develop an algebraic query language based on “incident patterns” with four operators inspired by the Business Process Model and Notation (BPMN) representation, allowing the user to formulate ad hoc queries directly over workflow logs. A formal semantics of this query language, a preliminary query evaluation algorithm, and a group of elementary properties of the operators are provided.

  9. Assessing the quality of proton PBS treatment delivery using machine log files: comprehensive analysis of clinical treatments delivered at PSI Gantry 2

    International Nuclear Information System (INIS)

    Scandurra, D; Albertini, F; Van der Meer, R; Meier, G; Weber, D C; Bolsi, A; Lomax, A

    2016-01-01

    Pencil beam scanning (PBS) proton therapy requires the delivery of many thousands of proton beams, each modulated for position, energy and monitor units, to provide a highly conformal patient treatment. The quality of the treatment is dependent on the delivery accuracy of each beam at each fraction. In this work we describe the use of treatment log files, which are a record of the machine parameters for a given field delivery on a given fraction, to investigate the integrity of treatment delivery compared to the nominal planned dose. The dosimetry-relevant log file parameters are used to reconstruct the 3D dose distribution on the patient anatomy, using a TPS-independent dose calculation system. The analysis was performed for patients treated at the Paul Scherrer Institute on Gantry 2, both for individual fields and per series (or plan), and delivery quality was assessed by determining the percentage of voxels in the log file dose distribution within ±1% of the nominal dose. It was seen that, for all series delivered, the mean pass rate is 96.4%. Furthermore, this work establishes a correlation between the delivery quality of a field and the beam position accuracy. This correlation is evident for all delivered fields regardless of individual patient or plan characteristics. We have also detailed further usefulness of log file analysis within our clinical workflow. In summary, we have highlighted that the integrity of PBS treatment delivery is dependent on daily machine performance and is specifically highly correlated with the accuracy of beam position. We believe this information will be useful for driving machine performance improvements in the PBS field. (paper)
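The per-field pass rate described above (percentage of voxels whose log-file-reconstructed dose is within ±1% of the nominal dose) reduces to a voxel-wise comparison. A hedged sketch with made-up dose arrays; the real analysis runs on full 3D dose grids:

```python
import numpy as np

def pass_rate(log_dose, nominal_dose, tol=0.01):
    """Percentage of voxels whose log-file-reconstructed dose lies
    within +/- tol (as a fraction of the nominal value) of the plan."""
    nominal = np.asarray(nominal_dose, float)
    diff = np.abs(np.asarray(log_dose, float) - nominal)
    return 100.0 * np.mean(diff <= tol * nominal)

# Hypothetical 5-voxel comparison (doses in Gy)
nominal = [2.00, 2.00, 1.50, 1.00, 0.50]
logged  = [2.01, 1.97, 1.505, 1.00, 0.52]
print(pass_rate(logged, nominal))  # 3 of 5 voxels within 1% -> 60.0
```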

  10. Analysis of the magnetic susceptibility well log in drill hole UE25a-5, Yucca Mountain, Nevada Test Site

    International Nuclear Information System (INIS)

    Hagstrum, J.T.; Daniels, J.J.; Scott, J.H.

    1980-01-01

    Magnetic susceptibility measurements have been shown to be dependent upon the magnetite content of rocks with variations in rock susceptibility arising from changes in the shape, size, composition, and quantity of the contained magnetite grains. The present study was undertaken to determine the factor(s) responsible for the variation in magnetic susceptibility measurements from borehole UE25a-5 on the Nevada Test Site (NTS). The well logs and sample analyses presented in this paper form part of a larger geophysical well-logging project studying the physical properties of welded tuffs at NTS. The ash-flow sheets at NTS appear to be the products of single compositionally zoned magmas that tend, within a cooling unit, to erupt hotter, more mafic, and more crystal-rich with time. These factors, however, have little effect on the degree to which the tuffs become welded. Furthermore, zones of crystallization and alteration are superimposed upon the welded units. X-ray data show poor correspondence between the relative abundance of magnetite in a sample and the borehole magnetic susceptibility measurement associated with it. Curie balance experiments demonstrate no change in the magnetic mineralogy that could account for the susceptibility variation. Thin-section observations corroborate the x-ray data, but indicate a proportional relationship between the borehole susceptibility measurements and the grain-size distribution of magnetite. The association of magnetic susceptibility anomalies with the crystal-rich zones of the welded tuffs will aid in the identification and correlation of the eruptive sequences at NTS

  11. The weighted 2-server problem

    Czech Academy of Sciences Publication Activity Database

    Chrobak, M.; Sgall, Jiří

    2004-01-01

    Roč. 324, 2-3 (2004), s. 289-319 ISSN 0304-3975 R&D Projects: GA MŠk ME 103; GA MŠk ME 476; GA ČR GA201/01/1195; GA MŠk LN00A056; GA AV ČR IAA1019901; GA AV ČR IAA1019401 Institutional research plan: CEZ:AV0Z1019905 Keywords: online algorithms * k-server problem Subject RIV: BA - General Mathematics Impact factor: 0.676, year: 2004

  12. Measuring SIP proxy server performance

    CERN Document Server

    Subramanian, Sureshkumar V

    2013-01-01

    Internet Protocol (IP) telephony is an alternative to the traditional Public Switched Telephone Networks (PSTN), and the Session Initiation Protocol (SIP) is quickly becoming a popular signaling protocol for VoIP-based applications. SIP is a peer-to-peer multimedia signaling protocol standardized by the Internet Engineering Task Force (IETF), and it plays a vital role in providing IP telephony services through its use of the SIP Proxy Server (SPS), a software application that provides call routing services by parsing and forwarding all the incoming SIP packets in an IP telephony network.SIP Pr

  13. Virtualisasi Server Sederhana Menggunakan Proxmox

    Directory of Open Access Journals (Sweden)

    Teguh Prasandy

    2015-05-01

    The use of Proxmox as a virtual server: Proxmox provides a local desktop and several nodes. Inside these nodes, operating systems are installed according to the user's needs. IP routing is set up so that the operating systems inside Proxmox can connect to the Internet: the VirtualBox desktop IP 192.168.56.102 is used as the gateway for Proxmox and the operating systems inside it, with Proxmox at IP 192.168.56.105, Linux Ubuntu at IP 192.168.56.109, and Linux Debian at IP 192.168.56.108.

  14. Logging Work Injuries in Appalachia

    Science.gov (United States)

    Charles H. Wolf; Gilbert P. Dempsey

    1978-01-01

    Logging accidents are costly. They may bring pain to injured workers, hardship to their families, and higher insurance premiums and lower productivity to their employers. Our analysis of 1,172 injuries in central Appalachia reveals that nearly half of all time lost, and almost all fatalities, resulted from accidents during felling and unloading. The largest proportion of...

  15. Log files for testing usability

    NARCIS (Netherlands)

    Klein Teeselink, G.; Siepe, A.H.M.; Pijper, de J.R.

    1999-01-01

    The aim of this study is to gain insight in the usefulness of log file analysis as a method to evaluate the usability of individual interface components and their influence on the usability of the overall user interface. We selected a music player as application, with four different interfaces and

  16. Radiometric well logging instruments

    International Nuclear Information System (INIS)

    Davydov, A.V.

    1975-01-01

    The technical properties of well instruments for radioactive logging used in the radiometric logging complexes PKS-1000-1 (''Sond-1'') and PRKS-2 (''Vitok-2'') are described. The main features of the electric circuit of the measuring channels are given

  17. Power to the logs!

    CERN Multimedia

    CERN. Geneva; MACMAHON, Joseph

    2015-01-01

    Are you tired of using grep, vi and emacs to read your logs? Do you feel like you’re missing the big picture? Does the word "statistics" put a smile on your face? Then it’s time to give power to the logs!

  18. Clinical Accuracy of the Respiratory Tumor Tracking System of the CyberKnife: Assessment by Analysis of Log Files

    International Nuclear Information System (INIS)

    Hoogeman, Mischa; Prevost, Jean-Briac; Nuyttens, Joost; Poell, Johan; Levendag, Peter; Heijmen, Ben

    2009-01-01

    Purpose: To quantify the clinical accuracy of the respiratory motion tracking system of the CyberKnife treatment device. Methods and Materials: Data in log files of 44 lung cancer patients treated with tumor tracking were analyzed. Errors in the correlation model, which relates the internal target motion with the external breathing motion, were quantified. The correlation model error was compared with the geometric error obtained when no respiratory tracking was used. Errors in the prediction method were calculated by subtracting the predicted position from the actual measured position after 192.5 ms (the time lag to prediction in our current system). The prediction error was also measured for a time lag of 115 ms and a new prediction method. Results: The mean correlation model errors were less than 0.3 mm. Standard deviations describing intrafraction variations around the whole-fraction mean error were 0.2 to 1.9 mm for cranio-caudal, 0.1 to 1.9 mm for left-right, and 0.2 to 2.5 mm for anterior-posterior directions. Without the use of respiratory tracking, these variations would have been 0.2 to 8.1 mm, 0.2 to 5.5 mm, and 0.2 to 4.4 mm. The overall mean prediction error was small (0.0 ± 0.0 mm) for all directions. The intrafraction standard deviation ranged from 0.0 to 2.9 mm for a time delay of 192.5 ms but was halved by using the new prediction method. Conclusions: Analyses of the log files of real clinical cases have shown that the geometric error caused by respiratory motion is substantially reduced by the application of respiratory motion tracking.
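The prediction error analyzed above is simply the predicted target position minus the position actually measured after the system's time lag, summarized by its whole-fraction mean and intrafraction standard deviation. A hedged sketch with hypothetical 1D position samples (the real log files contain full 3D traces):

```python
import numpy as np

def prediction_error_stats(predicted, actual):
    """Mean and (sample) standard deviation of the prediction error:
    predicted target position minus the position measured after the
    prediction time lag."""
    err = np.asarray(predicted, float) - np.asarray(actual, float)
    return err.mean(), err.std(ddof=1)

# Hypothetical cranio-caudal positions (mm) at five log-file samples
pred   = [10.2, 11.0, 12.1, 11.5, 10.9]
actual = [10.0, 11.2, 12.0, 11.6, 11.0]
mean_e, sd_e = prediction_error_stats(pred, actual)
print(round(mean_e, 2), round(sd_e, 2))  # near-zero mean, small spread
```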

  19. 0 + 5 Vascular Surgery Residents' Operative Experience in General Surgery: An Analysis of Operative Logs from 12 Integrated Programs.

    Science.gov (United States)

    Smith, Brigitte K; Kang, P Chulhi; McAninch, Chris; Leverson, Glen; Sullivan, Sarah; Mitchell, Erica L

    2016-01-01

    Integrated (0 + 5) vascular surgery (VS) residency programs must include 24 months of training in core general surgery. The Accreditation Council for Graduate Medical Education (ACGME) currently does not require specific case numbers in general surgery for 0 + 5 trainees; however, program directors have structured this time to optimize operative experience. The aim of this study is to determine the case volume and types of cases that VS residents are exposed to during their core surgery training. ACGME operative logs for current 0 + 5 VS residents were obtained and retrospectively reviewed to determine general surgery case volume and the distribution between open and laparoscopic cases performed. Standard statistical methods were applied. A total of 12 integrated VS residency programs provided operative case logs for current residents, covering 41 integrated VS residents in clinical years 2 through 5. During the postgraduate year-1 training year, residents participated in significantly more open than laparoscopic general surgery cases. The most frequently logged general surgery cases are hernia repair (20%), skin and soft tissue (7.4%), and breast (6.3%). Residents in programs with core surgery concentrated in 3 years participated in significantly more general surgery operations than residents in programs with core surgery spread out over 4 years (p = 0.035). 0 + 5 VS residents perform significantly more open operations than laparoscopic operations during their core surgery training. The majority of these operations are minor, nonabdominal procedures. The 0 + 5 VS residency program general surgery operative training requirements should be reevaluated and case minimums defined. The general surgery training component of 0 + 5 VS residencies may need to be restructured to meet the needs of current and future trainees. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  20. HDF-EOS Web Server

    Science.gov (United States)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: extract metadata in Object Definition Language (ODL) from an HDF-EOS file; convert the metadata from ODL to Extensible Markup Language (XML); reformat the XML metadata into human-readable Hypertext Markup Language (HTML); publish the HTML metadata and the original HDF-EOS file to a Web server and an Open-source Project for a Network Data Access Protocol (OPeNDAP) server computer; and reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, which is a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth-science data.
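
    The metadata conversion chain described above can be sketched in miniature. The following Python sketch is purely illustrative, not the actual Goddard tools: the flat pair representation of ODL, the XML tag names, and the HTML layout are all assumptions.

```python
import xml.etree.ElementTree as ET

def odl_pairs_to_xml(pairs):
    """Convert flat ODL-style name/value pairs to an XML string.
    The <metadata>/<attribute> tag names are illustrative assumptions."""
    root = ET.Element("metadata")
    for name, value in pairs:
        attr = ET.SubElement(root, "attribute", name=name)
        attr.text = str(value)
    return ET.tostring(root, encoding="unicode")

def xml_to_html(xml_text):
    """Render the XML metadata as a simple human-readable HTML table."""
    root = ET.fromstring(xml_text)
    rows = "".join(
        f"<tr><td>{a.get('name')}</td><td>{a.text}</td></tr>"
        for a in root.findall("attribute")
    )
    return f"<table>{rows}</table>"

# Two metadata fields as they might come out of a (hypothetical) HDF-EOS file
xml = odl_pairs_to_xml([("ShortName", "MOD021KM"), ("DayNightFlag", "Day")])
html = xml_to_html(xml)
```

    In the real pipeline each stage would be a separate tool invoked by the shell script, with the HTML and the original file then copied to the Web and OPeNDAP servers.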

  1. CERN servers donated to Ghana

    CERN Multimedia

    CERN Bulletin

    2012-01-01

    Cutting-edge research requires a constantly high performance of the computing equipment. At the CERN Computing Centre, computers typically need to be replaced after about four years of use. However, while servers may be withdrawn from cutting-edge use, they are still good for other uses elsewhere. This week, 220 servers and 30 routers were donated to the Kwame Nkrumah University of Science and Technology (KNUST) in Ghana.   “KNUST will provide a good home for these computers. The university has also developed a plan for using them to develop scientific collaboration with CERN,” said John Ellis, a professor at King’s College London and a visiting professor in CERN’s Theory Group.  John Ellis was heavily involved in building the relationship with Ghana, which started in 2006 when a Ghanaian participated in the CERN openlab student programme. Since 2007 CERN has hosted Ghanaians especially from KNUST in the framework of the CERN Summer Student Progr...

  2. Home media server content management

    Science.gov (United States)

    Tokmakoff, Andrew A.; van Vliet, Harry

    2001-07-01

    With the advent of set-top boxes, the convergence of TV (broadcasting) and PC (Internet) is set to enter the home environment. Currently, a great deal of activity is occurring in developing standards (TV-Anytime Forum) and devices (TiVo) for local storage on Home Media Servers (HMS). These devices lie at the heart of convergence of the triad: communications/networks - content/media - computing/software. Besides massive storage capacity and being a communications 'gateway', the home media server is characterised by the ability to handle metadata and software that provides an easy-to-use on-screen interface and intelligent search/content-handling facilities. In this paper, we describe a research prototype HMS that is being developed within the GigaCE project at the Telematica Instituut. Our prototype demonstrates advanced search and retrieval (video browsing), adaptive user profiling and an innovative 3D component of the Electronic Program Guide (EPG) which represents online presence. We discuss the use of MPEG-7 for representing metadata, the use of MPEG-21 working draft standards for content identification, description and rights expression, and the use of HMS peer-to-peer content distribution approaches. Finally, we outline explorative user behaviour experiments that aim to investigate the effectiveness of the prototype HMS during development.

  3. The log S - log N distribution of gamma ray bursts

    International Nuclear Information System (INIS)

    Yamagami, Takamasa; Nishimura, Jun; Fujii, Masami

    1982-01-01

    The relation between the size S and the frequency N of gamma ray bursts has been studied. This relation may be determined from the celestial distribution of gamma ray burst sources; the present analysis shows that the log S - log N relation for any direction is determined by that distribution. The observed bursts were analyzed. The celestial distribution of gamma ray burst sources was observed by the satellites of the USSR, and the results showed that the distribution seemed to be isotropic. However, the calculated log S - log N relation based on an isotropic distribution was in disagreement with the observed one. The analysis revealed that the observed bursts missed the low-energy part because of the threshold of the detectors, and the discrimination levels of detection were not clear. When a proper threshold level is set for each type of burst and the size of bursts is determined accordingly, the above-mentioned discrepancy disappears regardless of the luminosity and spatial distribution of the bursts. (Kato, T.)

  4. Passive Detection of Misbehaving Name Servers

    Science.gov (United States)

    2013-10-01

    name servers that changed IP address five or more times in a month. Solid red line indicates those servers possibly linked to pharmaceutical scams . 12...malicious and stated that fast-flux hosting “is considered one of the most serious threats to online activities today” [ICANN 2008, p. 2]. The...that time, apparently independent of filters on name-server flux, a large number of pharmaceutical scams1 were taken down. These scams apparently

  5. Lenovo acquires IBM's x86 low-end server business

    Directory of Open Access Journals (Sweden)

    Singh Pal Netra

    2015-01-01

    Full Text Available This paper presents an analysis of the key events, impacts and issues of Lenovo buying IBM's x86 low-end server business. The analysis includes (i) approval of the deal by regulatory bodies in the United States, Canada, India and China, (ii) security concerns of US government departments, (iii) pricing of the deal, (iv) the possible impact on IBM in the future, and (v) the possibility of Lenovo repeating its successful acquisition of IBM's ThinkPad business. The paper presents an analysis of qualitative and time-series quantitative data. The qualitative data consist mainly of different events before and after the acquisition of the x86 server business by Lenovo. The quantitative data are analyzed with respect to growth parameters of the overall server business and of Lenovo's server business. The paper also attempts to answer 9 specific research questions with respect to the impact on the ecosystems of IBM and Lenovo. Based on the analysis, it is inferred that IBM was not able to manage this traditional and well-accepted product business in the face of fierce competition and low demand, but Lenovo will manage it. The deal was a financial necessity for IBM and a strategic expansion into new markets for Lenovo.

  6. The SQL Server Database for Non Computer Professional Teaching Reform

    Science.gov (United States)

    Liu, Xiangwei

    2012-01-01

    This paper summarizes the teaching methods of the SQL Server database course for non-computer majors and analyzes the current state of the course. Based on the characteristics of the curriculum for non-computer majors, it puts forward several teaching reform methods and puts them into practice, improving students' analytical and practical abilities and…

  7. Mastering Microsoft Windows Small Business Server 2008

    CERN Document Server

    Johnson, Steven

    2010-01-01

    A complete, winning approach to the number one small business solution. Do you have 75 or fewer users or devices on your small-business network? Find out how to integrate everything you need for your mini-enterprise with Microsoft's new Windows Server 2008 Small Business Server, a custom collection of server and management technologies designed to help small operations run smoothly without a giant IT department. This comprehensive guide shows you how to master all SBS components as well as handle integration with other Microsoft technologies.: Focuses on Windows Server 2008 Small Business Serv

  8. Mastering Windows Server 2008 Networking Foundations

    CERN Document Server

    Minasi, Mark; Mueller, John Paul

    2011-01-01

    Find in-depth coverage of general networking concepts and basic instruction on Windows Server 2008 installation and management including active directory, DNS, Windows storage, and TCP/IP and IPv4 networking basics in Mastering Windows Server 2008 Networking Foundations. One of three new books by best-selling author Mark Minasi, this guide explains what servers do, how basic networking works (IP basics and DNS/WINS basics), and the fundamentals of the under-the-hood technologies that support staff must understand. Learn how to install Windows Server 2008 and build a simple network, security co

  9. National Medical Terminology Server in Korea

    Science.gov (United States)

    Lee, Sungin; Song, Seung-Jae; Koh, Soonjeong; Lee, Soo Kyoung; Kim, Hong-Gee

    Interoperable EHR (Electronic Health Record) necessitates at least the use of standardized medical terminologies. This paper describes a medical terminology server, LexCare Suite, which houses terminology management applications, such as a terminology editor, and a terminology repository populated with international standard terminology systems such as the Systematized Nomenclature of Medicine (SNOMED). The server is intended to satisfy the need of local primary to tertiary hospitals for quality terminology systems. Our partner general hospitals have used the server to test its applicability. This paper describes the server and the results of the applicability test.

  10. Microsoft Windows Server 2012 administration instant reference

    CERN Document Server

    Hester, Matthew

    2013-01-01

    Fast, accurate answers for common Windows Server questions Serving as a perfect companion to all Windows Server books, this reference provides you with quick and easily searchable solutions to day-to-day challenges of Microsoft's newest version of Windows Server. Using helpful design features such as thumb tabs, tables of contents, and special heading treatments, this resource boasts a smooth and seamless approach to finding information. Plus, quick-reference tables and lists provide additional on-the-spot answers. Covers such key topics as server roles and functionality, u

  11. Designing a scalable video-on-demand server with data sharing

    Science.gov (United States)

    Lim, Hyeran; Du, David H. C.

    2001-01-01

    As current disk space and transfer speeds increase, the bandwidth between a server and its disks has become critical for video-on-demand (VOD) services. Our VOD server consists of several hosts sharing data on disks through a ring-based network. Data sharing provided by the spatial-reuse ring network between servers and disks not only increases utilization towards full bandwidth but also improves the availability of videos. Striping and replication methods are introduced in order to improve the efficiency of our VOD server system as well as the availability of videos. We consider two kinds of resources in a VOD server system. Given a representative access profile, our intention is to propose an algorithm that finds an initial condition and places videos on the disks in the system successfully. If any copy of a video cannot be placed due to lack of resources, more servers/disks are added. When all videos are placed on the disks by our algorithm, the final configuration is determined, together with an indicator of how tolerant it is to fluctuations in the demand for videos. Although the placement problem is NP-hard, our algorithm generates the final configuration in O(M log M) time at best, where M is the number of movies.
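
    The abstract does not spell out the placement algorithm itself, but one classic way to achieve O(M log M) placement is a greedy heap-based assignment: sort videos by expected demand and repeatedly assign the next video to the least-loaded disk. The sketch below is a hypothetical illustration of that general idea, not the authors' algorithm; the demand values are made up and per-disk capacity constraints are ignored.

```python
import heapq

def place_videos(demands, num_disks):
    """Greedily place videos on disks to balance expected bandwidth load.

    demands: dict of video name -> expected demand (arbitrary units).
    Returns a dict of disk index -> list of video names.
    The sort dominates the cost: O(M log M) for M videos.
    """
    placement = {d: [] for d in range(num_disks)}
    # min-heap of (current_load, disk_index)
    heap = [(0.0, d) for d in range(num_disks)]
    heapq.heapify(heap)
    # most-demanded videos first, so large items spread across disks
    for video, demand in sorted(demands.items(), key=lambda kv: -kv[1]):
        load, disk = heapq.heappop(heap)
        placement[disk].append(video)
        heapq.heappush(heap, (load + demand, disk))
    return placement

layout = place_videos({"A": 9.0, "B": 7.0, "C": 3.0, "D": 2.0}, num_disks=2)
# layout -> {0: ["A", "D"], 1: ["B", "C"]}, loads 11.0 and 10.0
```

    A real VOD placement would additionally respect disk capacity and striping/replication constraints, which is what makes the full problem NP-hard.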

  12. Digital mineral logging system

    International Nuclear Information System (INIS)

    West, J.B.

    1980-01-01

    A digital mineral logging system acquires data from a mineral logging tool passing through a borehole and transmits the data uphole to an electronic digital signal processor. A predetermined combination of sensors, including a deviometer, is located in a logging tool for the acquisition of the desired data as the logging tool is raised from the borehole. Sensor data in analog format is converted in the logging tool to a digital format and periodically batch transmitted to the surface at a predetermined sampling rate. An identification code is provided for each mineral logging tool, and the code is transmitted to the surface along with the sensor data. The self-identifying tool code is transmitted to the digital signal processor to identify the code against a stored list of the range of numbers assigned to that type of tool. The data is transmitted up the d-c power lines of the tool by a frequency shift key transmission technique. At the surface, a frequency shift key demodulation unit transmits the decoupled data to an asynchronous receiver interfaced to the electronic digital signal processor. During a recording phase, the signals from the logging tool are read by the electronic digital signal processor and stored for later processing. During a calculating phase, the stored data is processed by the digital signal processor and the results are outputted to a printer or plotter, or both

  13. Uncertainty analysis of the radiological characteristics of radioactive waste using a method based on log-normal distributions

    International Nuclear Information System (INIS)

    Gigase, Yves

    2007-01-01

    Available in abstract form only. Full text of publication follows: The uncertainty on the characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package, one has to combine these various uncertainties. This paper discusses an approach to this problem, based on the use of the log-normal distribution, which is both elegant and easy to use. It can provide, for example, quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other more complex characteristics, such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, in particular in those decision processes where the uncertainty on the amount of activity is considered important, such as in probabilistic risk assessment or the definition of criteria for acceptance or categorization. (author)
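
    The convenience the abstract alludes to is that the product of independent log-normal factors is itself log-normal, with parameters that simply add in log space. As a hedged illustration (not the paper's actual scaling-factor implementation), each multiplicative uncertainty factor can be carried as a (median, geometric standard deviation) pair; the numbers below are made up.

```python
import math

def combine_lognormal(factors):
    """Combine independent multiplicative log-normal factors.

    factors: iterable of (median, geometric_std_dev) pairs.
    Returns (median, gsd, approx. 95% interval) of the product, using
    mu = sum of log-medians and sigma^2 = sum of squared log-GSDs.
    """
    mu = sum(math.log(median) for median, _ in factors)
    var = sum(math.log(gsd) ** 2 for _, gsd in factors)
    sigma = math.sqrt(var)
    median, gsd = math.exp(mu), math.exp(sigma)
    interval = (median / gsd ** 1.96, median * gsd ** 1.96)
    return median, gsd, interval

# e.g. activity = scaling factor (median 10, GSD 2) x key-nuclide
# measurement (median 1, GSD 1.5); both factors are invented here
med, gsd, (lo, hi) = combine_lognormal([(10.0, 2.0), (1.0, 1.5)])
```

    Because everything stays log-normal under multiplication, the combined uncertainty interval is asymmetric around the median, which is often exactly the "make sense" behaviour wanted for activity estimates that cannot go negative.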

  14. Well log and seismic data analysis for complex pore-structure carbonate reservoir using 3D rock physics templates

    Science.gov (United States)

    Li, Hongbing; Zhang, Jiajia

    2018-04-01

    The pore structure in heterogeneous carbonate rock is usually very complex. This complex pore system makes the relationship between the velocity and porosity of the rock highly scattered, so the classical two-dimensional rock physics template (2D RPT) is not enough to accurately describe the quantitative relationship between the elastic parameters of this kind of reservoir and its porosity and water saturation. A 2D RPT can mistakenly attribute the effect of pore type to that of porosity or water saturation, which leads to great deviations when such a template is applied to predict porosity and water saturation in seismic reservoir prediction and hydrocarbon detection. This paper first presents a method to establish a new three-dimensional rock physics template (3D RPT) by integrating the Gassmann equations with a porous rock physics model, and then uses it to characterize the quantitative relation between rock elastic properties and the reservoir parameters, including the pore aspect ratio, porosity and water saturation, and to predict these parameters from known elastic properties. Test results on real logging and seismic inversion data show that the 3D RPT can accurately describe the variations of elastic properties with the porosity, water saturation and pore-structure parameters, and effectively improve the accuracy of reservoir parameter prediction.

  15. Causal analysis of self-sustaining processes in the log-layer of wall-bounded turbulence

    Science.gov (United States)

    Lozano-Duran, Adrian; Bae, Hyunji Jane

    2017-11-01

    Despite the large amount of information provided by direct numerical simulations of turbulent flows, the underlying dynamics remain elusive even in the most simple and canonical configurations. Most standard methods used to investigate turbulence do not provide a clear causal inference between events, which is necessary to determine these dynamics, particularly in self-sustaining processes. In the present work, we examine the causal interactions between streaks and rolls in the logarithmic layer of minimal turbulent channel flow. Causality between structures is assessed in a non-intrusive manner by transfer entropy, i.e., how much the uncertainty of one structure is reduced by knowing the past states of the others. Streaks are represented by the first Fourier modes of the streamwise velocity, while rolls are defined by the wall-normal and spanwise velocities. The results show that the process is mainly unidirectional rather than cyclic, and that the log-layer motions are sustained by extracting energy from the mean shear, which controls the dynamics and time scales. The well-known lift-up effect is shown not to be a key ingredient in the causal network between shear, streaks and rolls. Funded by ERC Coturb Madrid Summer Program.
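
    For discrete signals, transfer entropy can be estimated directly from joint-occurrence counts. The sketch below is a minimal plug-in estimator with one-step histories for binary symbol streams; it illustrates the quantity used in the abstract, not the authors' implementation (their continuous Fourier-mode signals would first have to be discretized, a step omitted here).

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of T_{X->Y} in bits, with one-step histories.

    Measures how much knowing x[t] reduces the uncertainty about
    y[t+1] beyond what y[t] already tells us.
    """
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_next, y_now, x_now)
    yx = Counter(zip(y[:-1], x[:-1]))
    yy = Counter(zip(y[1:], y[:-1]))
    y_now = Counter(y[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / yx[(y0, x0)]          # P(y_next | y_now, x_now)
        p_self = yy[(y1, y0)] / y_now[y0]  # P(y_next | y_now)
        te += p_joint * math.log2(p_full / p_self)
    return te

# y copies x with a one-step delay, so X strongly "causes" Y but not vice versa
random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]
te_xy = transfer_entropy(x, y)  # close to 1 bit
te_yx = transfer_entropy(y, x)  # close to 0 bits
```

    The asymmetry te_xy >> te_yx is what lets transfer entropy distinguish a driver from a follower, which is exactly the streak/roll question posed above.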

  16. Essential Mac OS X panther server administration integrating Mac OS X server into heterogeneous networks

    CERN Document Server

    Bartosh, Michael

    2004-01-01

    If you've ever wondered how to safely manipulate Mac OS X Panther Server's many underlying configuration files or needed to explain AFP permission mapping--this book's for you. From the command line to Apple's graphical tools, the book provides insight into this powerful server software. Topics covered include installation, deployment, server management, web application services, data gathering, and more

  17. The ASDEX Upgrade Parameter Server

    Energy Technology Data Exchange (ETDEWEB)

    Neu, Gregor, E-mail: gregor.neu@ipp.mpg.de [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, 85748 Garching (Germany); Cole, Richard [Unlimited Computer Systems, Seeshaupter Str. 15, 82393 Iffeldorf (Germany); Gräter, Alex [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, 85748 Garching (Germany); Lüddecke, Klaus [Unlimited Computer Systems, Seeshaupter Str. 15, 82393 Iffeldorf (Germany); Rapson, Christopher J.; Raupp, Gerhard; Treutterer, Wolfgang; Zasche, Dietrich; Zehetbauer, Thomas [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, 85748 Garching (Germany)

    2015-10-15

    Highlights: • We describe our main tool in the plasma control configuration process. • Parameter access and computation are configurable with XML files. • Simple implementation of in situ tests by rerouting requests to test data. • Pulse-specific overriding of parameters. - Abstract: Concepts for the configuration of plant systems and plasma control of modern devices such as ITER and W7-X are based on global data structures, or “pulse schedules” or “experiment programs”, which specify all physics characteristics (waveforms for controlled actuators and plasma quantities) and all technical characteristics of the plant systems (diagnostics and actuators operation settings) for a planned pulse. At ASDEX Upgrade we use a different approach. We observed that the physics characteristics driving the discharge control system (DCS) are frequently modified on a pulse-to-pulse basis. Plant system operation, however, relies on technical standard settings, or “basic configurations”, to provide guaranteed resources or services, which evolve according to longer-term session or campaign operation schedules. This is why AUG manages technical configuration items separately from physics items. Consistent computation of the DCS configuration requires access to all this physics and technical data, which includes the discharge programme (DP), settings of actuator systems and real-time diagnostics, the current system state and a database of static parameters. A Parameter Server provides a unified view of all these parameter sets and acts as the central point of access. We describe the functionality and architecture of the Parameter Server and its embedding into the control environment.

  18. Mariners Weather Log

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Mariners Weather Log (MWL) is a publication containing articles, news and information about marine weather events and phenomena, worldwide environmental impact...

  19. Log-inject-log in sand consolidation

    International Nuclear Information System (INIS)

    Murphy, R.P.; Spurlock, J.W.

    1977-01-01

    A method is described for gathering information for the determination of the adequacy of placement of sand consolidating plastic for sand control in oil and gas wells. The method uses a high neutron cross-section tracer which becomes part of the plastic and uses pulsed neutron logging before and after injection of the plastic. Preferably, the method uses lithium, boron, indium, and/or cadmium tracers. Boron oxide is especially useful and can be dissolved in alcohol and mixed with the plastic ingredients

  20. Elephant logging and environment

    International Nuclear Information System (INIS)

    Tin-Aung-Hla

    1995-01-01

    The natural environment comprises non-biological elements such as air, water, light and heat, and biological elements of animal and plant life; all interact with each other to create an ecosystem. Human activities such as the over-exploitation of forests result in deforestation and desertification, which consequently changes the ecological balance. The topics discussed are: (1) the history of elephant utilization; (2) elephant logging; (3) classification of elephants; (4) dragging gear; (5) elephant power; and (6) elephant logging and the environment.

  1. A polling model with an autonomous server

    NARCIS (Netherlands)

    de Haan, Roland; Boucherie, Richardus J.; van Ommeren, Jan C.W.

    2009-01-01

    This paper considers polling systems with an autonomous server that remains at a queue for an exponential amount of time before moving to the next queue, incurring a generally distributed switch-over time. The server remains at a queue until the exponential visit time expires, also when the queue

  2. A tandem queue with delayed server release

    NARCIS (Netherlands)

    Nawijn, W.M.

    1997-01-01

    We consider a tandem queue with two stations. The first station is an s-server queue with Poisson arrivals and exponential service times. After terminating his service in the first station, a customer enters the second station to require service at an exponential single server, while in the meantime he

  3. Tandem queue with server slow-down

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, W.R.W.; Mandjes, M.R.H.

    2007-01-01

    We study how rare events happen in the standard two-node tandem Jackson queue and in a generalization, the so-called slow-down network, see [2]. In the latter model the service rate of the first server depends on the number of jobs in the second queue: the first server slows down if the amount of

  4. Personalized Pseudonyms for Servers in the Cloud

    Directory of Open Access Journals (Sweden)

    Xiao Qiuyu

    2017-10-01

    Full Text Available A considerable and growing fraction of servers, especially of web servers, is hosted in compute clouds. In this paper we opportunistically leverage this trend to improve the privacy of clients from network attackers residing between the clients and the cloud: We design a system that can be deployed by the cloud operator to prevent a network adversary from determining which of the cloud’s tenant servers a client is accessing. The core innovation in our design is a PoPSiCl (pronounced “popsicle”), a persistent pseudonym for a tenant server that can be used by a single client to access the server, whose real identity is protected by the cloud from both passive and active network attackers. When instantiated for TLS-based access to web servers, our design works with all major browsers and requires no additional client-side software and minimal changes to the client user experience. Moreover, changes to tenant servers can be hidden in supporting software (operating systems and web-programming frameworks) without imposing on web-content development. Perhaps most notably, our system boosts privacy with minimal impact to web-browsing performance, after some initial setup during a user’s first access to each web server.

  5. Building mail server on distributed computing system

    International Nuclear Information System (INIS)

    Akihiro Shibata; Osamu Hamada; Tomoko Oshikubo; Takashi Sasaki

    2001-01-01

    Electronic mail has become an indispensable function in daily work, and server stability and performance are required. Using DCE and DFS we have built a distributed electronic mail server; that is, servers such as SMTP and IMAP are distributed symmetrically and provide seamless access

  6. A Novel Approach for Analysis of the Log-Linear Age-Period-Cohort Model: Application to Lung Cancer Incidence

    Directory of Open Access Journals (Sweden)

    Tengiz Mdzinarishvili

    2009-12-01

    Full Text Available A simple, computationally efficient procedure for analyses of the time period and birth cohort effects on the distribution of the age-specific incidence rates of cancers is proposed. Assuming that cohort effects for neighboring cohorts are almost equal and using the Log-Linear Age-Period-Cohort Model, this procedure allows one to evaluate temporal trends and birth cohort variations of any type of cancer without prior knowledge of the hazard function. This procedure was used to estimate the influence of time period and birth cohort effects on the distribution of the age-specific incidence rates of first primary, microscopically confirmed lung cancer (LC cases from the SEER9 database. It was shown that since 1975, the time period effect coefficients for men increase up to 1980 and then decrease until 2004. For women, these coefficients increase from 1975 up to 1990 and then remain nearly constant. The LC birth cohort effect coefficients for men and women increase from the cohort of 1890–94 until the cohort of 1925–29, then decrease until the cohort of 1950–54 and then remain almost unchanged. Overall, LC incidence rates, adjusted by period and cohort effects, increase up to the age of about 72–75, turn over, and then fall after the age of 75–78. The peak of the adjusted rates in men is around the age of 77–78, while in women, it is around the age of 72–73. Therefore, these results suggest that the age distribution of the incidence rates in men and women fall at old ages.

  7. Sequence Stratigraphy of lower zones of Asmari Formation in Marun Oilfield by using of microfacies analysis, isolith maps and γ- Ray log

    Directory of Open Access Journals (Sweden)

    Jalil Jafari

    2015-01-01

    Full Text Available The Oligo-Miocene Asmari Formation is one of the most important reservoir units of the Marun Oilfield in the Dezful Embayment, SW Iran, deposited in the Zagros foreland basin. The goal of this study is to interpret the depositional environment and sequence stratigraphy of the lower zones of the Asmari Formation in Wells No. 281, 342 and 312 in the Marun Oilfield, based on changes in the shape of the γ-Ray log, isolith maps and microfacies properties. Accordingly, 9 carbonate microfacies and 2 siliciclastic petrofacies were identified, deposited in four depositional environments, including open marine, barrier, lagoon and tidal flat, on a homoclinal ramp (consisting of outer, middle and inner ramp). Based on the shape of the γ-Ray log, these sediments were deposited in a marine environment. In the open marine and barrier environments, the shape of the γ-Ray log is serrated bell-shaped, serrated funnel-shaped, left bow-shaped, serrated and right boxcar-shaped, while in the beach environment it is cylinder- and funnel-shaped, and in the lagoon and tidal flat environments it is right bow- to cylinder-shaped. Based on the isolith maps, the sandstones of the lower zones of the Asmari Formation in the Marun Oilfield were spread by a deltaic system along the southwestern margin of the basin and were constantly influenced by changes in sea level. Sequence stratigraphic analysis led to the identification of three third-order depositional sequences (DS1, DS2 and DS3).

  9. Distributed control system for demand response by servers

    Science.gov (United States)

    Hall, Joseph Edward

    Within the broad topical designation of smart grid, research in demand response, or demand-side management, focuses on investigating possibilities for electrically powered devices to adapt their power consumption patterns to better match generation and more efficiently integrate intermittent renewable energy sources, especially wind. Devices such as battery chargers, heating and cooling systems, and computers can be controlled to change the time, duration, and magnitude of their power consumption while still meeting workload constraints such as deadlines and rate of throughput. This thesis presents a system by which a computer server, or multiple servers in a data center, can estimate the power imbalance on the electrical grid and use that information to dynamically change the power consumption as a service to the grid. Implementation on a testbed demonstrates the system with a hypothetical but realistic usage case scenario of an online video streaming service in which there are workloads with deadlines (high-priority) and workloads without deadlines (low-priority). The testbed is implemented with real servers, estimates the power imbalance from the grid frequency with real-time measurements of the live outlet, and uses a distributed, real-time algorithm to dynamically adjust the power consumption of the servers based on the frequency estimate and the throughput of video transcoder workloads. Analysis of the system explains and justifies multiple design choices, compares the significance of the system in relation to similar publications in the literature, and explores the potential impact of the system.
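    The frequency-based adjustment described above can be sketched as a simple droop controller: map the measured grid frequency to a server power target, clamped to the machine's feasible band. All names and constants below are illustrative assumptions, not values from the thesis.

```python
# Hypothetical sketch of frequency-based demand response for a server.
# Constants are assumed for illustration only.

NOMINAL_HZ = 60.0            # nominal grid frequency (US)
DROOP_GAIN = 400.0           # watts of adjustment per Hz of deviation (assumed)
P_MIN, P_MAX = 150.0, 350.0  # allowed server power band in watts (assumed)

def target_power(measured_hz: float, baseline_w: float) -> float:
    """Map a grid-frequency measurement to a server power target.

    Under-frequency (generation deficit) -> reduce consumption;
    over-frequency (surplus) -> allow consumption to rise.
    """
    deviation = measured_hz - NOMINAL_HZ
    target = baseline_w + DROOP_GAIN * deviation
    return max(P_MIN, min(P_MAX, target))
```

    In the scheme above, the power target would then be realized by throttling low-priority workloads (e.g. the non-deadline video transcodes) while preserving throughput for the high-priority ones.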

  10. Standby-Loss Elimination in Server Power Supply

    Directory of Open Access Journals (Sweden)

    Jong-Woo Kim

    2017-07-01

    Full Text Available In a server power system, a standby converter is required in order to provide the standby output, monitor the system's status, and communicate with the server power system. Since these functions are always required, the standby converter produces losses even when the system operates in normal mode, and these losses degrade the total efficiency of the system. In this paper, a new structure is proposed to eliminate the losses from the standby converter of a server power supply. The key feature of the proposed structure is that the main direct current (DC/DC) converter supplies all of the output power of the standby converter, and the standby converter is turned off in normal mode. With the proposed structure, the losses from the standby converter can be eliminated in normal mode, which leads to higher efficiency across all load conditions. Although the structure was proposed in previous work, very important issues such as the steady-state analysis, the transient responses, and how to control the standby converter were not discussed; this paper examines these issues further. The feasibility of the proposed structure has been verified with a server power system with a 400 V link voltage, a 12 V/62.5 A main output, and a 12 V/2.1 A standby output.
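    The efficiency argument can be illustrated with back-of-envelope arithmetic: removing an always-on standby loss from the input power budget raises overall efficiency at every load point. The numbers below are assumed for illustration, not taken from the paper.

```python
# Back-of-envelope comparison of system efficiency with and without an
# always-on standby converter loss. All wattages are assumed values.

def system_efficiency(p_out_w: float, p_loss_main_w: float, p_loss_standby_w: float) -> float:
    """Output power divided by total input power (output + losses)."""
    return p_out_w / (p_out_w + p_loss_main_w + p_loss_standby_w)

P_OUT = 750.0        # main 12 V output power (W), assumed
LOSS_MAIN = 45.0     # main DC/DC converter loss (W), assumed
LOSS_STANDBY = 6.0   # always-on standby converter loss (W), assumed

conventional = system_efficiency(P_OUT, LOSS_MAIN, LOSS_STANDBY)
proposed = system_efficiency(P_OUT, LOSS_MAIN, 0.0)  # standby converter off
print(f"{conventional:.3f} -> {proposed:.3f}")  # efficiency rises once standby loss is removed
```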

  11. Server for experimental data from LHD

    International Nuclear Information System (INIS)

    Emoto, M.; Ohdachi, S.; Watanabe, K.; Sudo, S.; Nagayama, Y.

    2006-01-01

    In order to unify various types of data, the Kaiseki Server was developed. This server provides physical experimental data from large helical device (LHD) experiments. Many data acquisition systems are currently in operation, and they produce files in various formats. It has therefore been difficult to analyze different types of acquisition data together, because each system's data must be read in its own particular manner. To facilitate the use of this data by researchers, the authors developed a new server system that provides a unified data format and a single data retrieval interface. Although the Kaiseki Server satisfied the initial demand, new requests arose from researchers, one of which was remote use of the server; the current system cannot be used remotely because of security issues. Another request was group ownership, i.e., users belonging to the same group should have equal access to data. To satisfy these demands, the authors modified the server. However, since other requests may arise in the future, the new system must be flexible enough to satisfy future demands. Therefore, the authors decided to develop a new server using a three-tier structure.

  12. Log4J

    CERN Document Server

    Perry, Steven

    2009-01-01

    Log4j has been around for a while now, and it seems like so many applications use it. I've used it in my applications for years now, and I'll bet you have too. But every time I need to do something with log4j I've never done before, I find myself searching for examples of how to do whatever that is, and I don't usually have much luck. I believe the reason for this is that there is not a great deal of useful information about log4j, either in print or on the Internet. The information is too simple to be of real-world use, too complicated to be distilled quickly (which is what most developers

  13. A CDC 1700 on-line system for the analysis, data logging and monitoring of big bubble chamber pictures

    International Nuclear Information System (INIS)

    Guyonnet, J.-L.

    1975-01-01

    This work presents the analysis system of large bubble chamber such as Gargamelle, BEBC pictures realized in the heavy liquid bubble chamber group with scanning and measurement stations on-line with a CDC 1700 computer. This work deals with the general characteristics of these stations and of the computer, and puts emphasis on the conception and functions of the analysis programmes: scanning, measurement and data processing. The data acquisition system runs in a context of real time multiprogrammation [fr

  14. Mechanics of log calibration

    International Nuclear Information System (INIS)

    Waller, W.C.; Cram, M.E.; Hall, J.E.

    1975-01-01

    For any measurement to have meaning, it must be related to generally accepted standard units by a valid and specified system of comparison. To calibrate well-logging tools, sensing systems are designed which produce consistent and repeatable indications over the range for which the tool was intended. The basics of calibration theory, procedures, and calibration record presentations are reviewed. Calibrations for induction, electrical, radioactivity, and sonic logging tools will be discussed. The authors' intent is to provide an understanding of the sources of errors, of the way errors are minimized in the calibration process, and of the significance of changes in recorded calibration data

  15. Environment server. Digital field information archival technology

    International Nuclear Information System (INIS)

    Kita, Nobuyuki; Kita, Yasuyo; Yang, Hai-quan

    2002-01-01

    For the safety operation of nuclear power plants, it is important to store various information about plants for a long period and visualize those stored information as desired. The system called Environment Server is developed for realizing it. In this paper, the general concepts of Environment Server is explained and its partial implementation for archiving the image information gathered by inspection mobile robots into virtual world and visualizing them is described. An extension of Environment Server for supporting attention sharing is also briefly introduced. (author)

  16. Optimizing queries in SQL Server 2008

    Directory of Open Access Journals (Sweden)

    Ion LUNGU

    2010-05-01

    Full Text Available Starting from the need to develop efficient IT systems, we intend to review the optimization methods and tools that can be used by SQL Server database administrators and developers of applications based on Microsoft technology, focusing on the latest version of the proprietary DBMS, SQL Server 2008. We reflect on the objectives to be considered in improving the performance of SQL Server instances, tackle the most commonly used techniques for analyzing and optimizing queries, and describe the new "Optimize for ad hoc workloads", "Plan Freezing", and "Optimize for unknown" options, accompanied by relevant code examples.

  17. Personalized Pseudonyms for Servers in the Cloud

    OpenAIRE

    Xiao Qiuyu; Reiter Michael K.; Zhang Yinqian

    2017-01-01

    A considerable and growing fraction of servers, especially of web servers, is hosted in compute clouds. In this paper we opportunistically leverage this trend to improve privacy of clients from network attackers residing between the clients and the cloud: We design a system that can be deployed by the cloud operator to prevent a network adversary from determining which of the cloud’s tenant servers a client is accessing. The core innovation in our design is a PoPSiCl (pronounced “popsicle”), ...

  18. Getting started with SQL Server 2014 administration

    CERN Document Server

    Ellis, Gethyn

    2014-01-01

    This is an easy-to-follow, hands-on tutorial that includes real-world examples of SQL Server 2014's new features. Each chapter is explained in a step-by-step manner that guides you in implementing the new technology. If you want to create a highly efficient database server, then this book is for you. This book is for database professionals and system administrators who want to use the added features of SQL Server 2014 to create a hybrid environment, which is both highly available and allows you to get the best performance from your databases.

  19. Quality assurance of geometric accuracy based on an electronic portal imaging device and log data analysis for Dynamic WaveArc irradiation.

    Science.gov (United States)

    Hirashima, Hideaki; Miyabe, Yuki; Nakamura, Mitsuhiro; Mukumoto, Nobutaka; Mizowaki, Takashi; Hiraoka, Masahiro

    2018-04-06

    The purpose of this study was to develop a simple verification method for the routine quality assurance (QA) of Dynamic WaveArc (DWA) irradiation using electronic portal imaging device (EPID) images and log data analysis. First, an automatic calibration method utilizing the outermost multileaf collimator (MLC) slits was developed to correct the misalignment between the center of the EPID and the beam axis. Moreover, to verify the accuracy with which the EPID images detect the MLC position, various MLC positions with intentional errors in the range 0.1-1 mm were assessed. Second, to validate the geometric accuracy during DWA irradiation, tests were designed around three indices. Test 1 evaluated the accuracy of the MLC position. Test 2 assessed dose output consistency with variable dose rate (160-400 MU/min), gantry speed (2.2-6°/s), and ring speed (0.5-2.7°/s). Test 3 validated dose output consistency with variable values of the above parameters plus MLC speed (1.6-4.2 cm/s). All tests were delivered to the EPID and compared with results obtained using a stationary radiation beam at a 0° gantry angle. Irradiation log data were recorded simultaneously. An intentional MLC-position error of 0.1 mm, smaller than the EPID pixel size, could be detected by the EPID. In Test 1, the MLC slit widths agreed within 0.20 mm of their exposed values. The averaged root-mean-square error (RMSE) of the dose outputs was less than 0.8% in Tests 2 and 3. Using log data analysis in Test 3, the RMSE between the planned and recorded data was 0.1 mm, 0.12°, and 0.07° for the MLC position, gantry angle, and ring angle, respectively. The proposed method is useful for routine QA of the accuracy of DWA. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
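    The RMSE figures quoted above (e.g. 0.1 mm between planned and log-recorded MLC positions) come from a standard root-mean-square error computation, which can be sketched as follows; the sample values are invented for illustration.

```python
import math

def rmse(planned, recorded):
    """Root-mean-square error between planned and log-recorded values
    (e.g. MLC positions in mm, or gantry/ring angles in degrees)."""
    if len(planned) != len(recorded):
        raise ValueError("series must have equal length")
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(planned, recorded)) / len(planned))

# Invented planned vs. log-recorded MLC positions (mm):
planned_mm = [10.0, 12.5, 15.0, 17.5]
recorded_mm = [10.1, 12.4, 15.1, 17.4]
print(round(rmse(planned_mm, recorded_mm), 6))  # 0.1
```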

  20. Nuclear well logging in hydrology

    International Nuclear Information System (INIS)

    1971-01-01

    hydrologists trained in the techniques of borehole logging and log analysis. In their report to the IHD Coordinating Council, the fourth session of the Working Group proposed that a technical document be prepared to summarize the status of nuclear logging in hydrology. This report is intended to fulfil that proposal and to meet the need, insofar as is possible at present (1970), for coordinated information on the subject

  1. Improving consensus contact prediction via server correlation reduction.

    Science.gov (United States)

    Gao, Xin; Bu, Dongbo; Xu, Jinbo; Li, Ming

    2009-05-06

    Protein inter-residue contacts play a crucial role in the determination and prediction of protein structures. Previous studies on contact prediction indicate that although template-based consensus methods outperform sequence-based methods on targets with typical templates, such consensus methods perform poorly on new fold targets. However, we find out that even for new fold targets, the models generated by threading programs can contain many true contacts. The challenge is how to identify them. In this paper, we develop an integer linear programming model for consensus contact prediction. In contrast to the simple majority voting method assuming that all the individual servers are equally important and independent, the newly developed method evaluates their correlation by using maximum likelihood estimation and extracts independent latent servers from them by using principal component analysis. An integer linear programming method is then applied to assign a weight to each latent server to maximize the difference between true contacts and false ones. The proposed method is tested on the CASP7 data set. If the top L/5 predicted contacts are evaluated where L is the protein size, the average accuracy is 73%, which is much higher than that of any previously reported study. Moreover, if only the 15 new fold CASP7 targets are considered, our method achieves an average accuracy of 37%, which is much better than that of the majority voting method, SVM-LOMETS, SVM-SEQ, and SAM-T06. These methods demonstrate an average accuracy of 13.0%, 10.8%, 25.8% and 21.2%, respectively. Reducing server correlation and optimally combining independent latent servers show a significant improvement over the traditional consensus methods. This approach can hopefully provide a powerful tool for protein structure refinement and prediction use.
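    The headline metric above, the fraction of true contacts among the top L/5 predictions, is straightforward to compute once each candidate contact has a consensus score. A minimal sketch (the scoring itself, i.e. the ILP-weighted latent servers, is not reproduced here, and the sample data are invented):

```python
def top_l_over_5_accuracy(scored_contacts, true_contacts, L):
    """Accuracy of the top L/5 scored contacts.

    scored_contacts: list of ((i, j), score) predictions
    true_contacts:   set of (i, j) residue pairs in the native structure
    L:               protein length
    """
    k = max(1, L // 5)
    # rank predictions by score, keep the top L/5
    ranked = sorted(scored_contacts, key=lambda c: c[1], reverse=True)[:k]
    hits = sum(1 for (pair, _score) in ranked if pair in true_contacts)
    return hits / k

# Invented example: 4 scored predictions, 2 of which are native contacts.
preds = [((1, 9), 0.9), ((2, 12), 0.8), ((3, 20), 0.4), ((5, 30), 0.2)]
native = {(1, 9), (3, 20)}
print(top_l_over_5_accuracy(preds, native, L=10))  # top 2 predictions, 1 hit -> 0.5
```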

  2. Improving consensus contact prediction via server correlation reduction

    Directory of Open Access Journals (Sweden)

    Xu Jinbo

    2009-05-01

    Full Text Available Abstract Background Protein inter-residue contacts play a crucial role in the determination and prediction of protein structures. Previous studies on contact prediction indicate that although template-based consensus methods outperform sequence-based methods on targets with typical templates, such consensus methods perform poorly on new fold targets. However, we find out that even for new fold targets, the models generated by threading programs can contain many true contacts. The challenge is how to identify them. Results In this paper, we develop an integer linear programming model for consensus contact prediction. In contrast to the simple majority voting method assuming that all the individual servers are equally important and independent, the newly developed method evaluates their correlation by using maximum likelihood estimation and extracts independent latent servers from them by using principal component analysis. An integer linear programming method is then applied to assign a weight to each latent server to maximize the difference between true contacts and false ones. The proposed method is tested on the CASP7 data set. If the top L/5 predicted contacts are evaluated where L is the protein size, the average accuracy is 73%, which is much higher than that of any previously reported study. Moreover, if only the 15 new fold CASP7 targets are considered, our method achieves an average accuracy of 37%, which is much better than that of the majority voting method, SVM-LOMETS, SVM-SEQ, and SAM-T06. These methods demonstrate an average accuracy of 13.0%, 10.8%, 25.8% and 21.2%, respectively. Conclusion Reducing server correlation and optimally combining independent latent servers show a significant improvement over the traditional consensus methods. This approach can hopefully provide a powerful tool for protein structure refinement and prediction use.

  3. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    OpenAIRE

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-01-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3%...

  4. Log of Apollo 11.

    Science.gov (United States)

    National Aeronautics and Space Administration, Washington, DC.

    The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)

  5. Borehole logging system

    International Nuclear Information System (INIS)

    Allen, L.S.

    1988-01-01

    A radioactive borehole logging tool employs an epithermal neutron detector having a neutron counter surrounded by an inner thermal neutron filter and an outer thermal neutron filter. Located between the inner and outer filters is a neutron moderating material for extending the lifetime of epithermal neutrons to enhance the counting rate of such epithermal neutrons by the neutron counter

  6. On Advice Complexity of the k-server Problem under Sparse Metrics

    DEFF Research Database (Denmark)

    Gupta, S.; Kamali, S.; López-Ortiz, A.

    2013-01-01

    We consider the k-Server problem under the advice model of computation when the underlying metric space is sparse. On one side, we introduce Θ(1)-competitive algorithms for a wide range of sparse graphs, which require advice of (almost) linear size. Namely, we show that for graphs of size N and treewidth α, there is an online algorithm which receives O(n(log α + log log N)) bits of advice and optimally serves a sequence of length n. With a different argument, we show that if a graph admits a system of μ collective tree (q, r)-spanners, then there is a (q + r)-competitive algorithm which receives O(n(log μ + log log N)) bits of advice. Among other results, this gives a 3-competitive algorithm for planar graphs, provided with O(n log log N) bits of advice. On the other side, we show that an advice of size Ω(n) is required to obtain a 1-competitive algorithm for sequences of size n even...

  7. Data Mining of Network Logs

    Science.gov (United States)

    Collazo, Carlimar

    2011-01-01

    The statement of purpose is to analyze network monitoring logs to support the computer incident response team. Specifically, to gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and to provide a way to break down a URL based on protocol, host name, domain name, path, and other attributes. Finally, to provide a method to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedure used for analysis and data reduction will be a computer program that analyzes the URL and identifies advertisement links among the actual content links.
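    The URL breakdown described above maps naturally onto the standard library's URL parser. A minimal sketch, with a deliberately naive domain-name heuristic (a real tool would need a public-suffix list) and an invented example URL:

```python
from urllib.parse import urlparse

def breakdown(url: str) -> dict:
    """Split a URL into the attributes mentioned above:
    protocol, host name, domain name, path, and query string."""
    parts = urlparse(url)
    host = parts.hostname or ""
    # Naive domain extraction: last two labels of the host. This is an
    # assumption for illustration; it fails for suffixes like .co.uk.
    domain = ".".join(host.split(".")[-2:]) if "." in host else host
    return {
        "protocol": parts.scheme,
        "host": host,
        "domain": domain,
        "path": parts.path,
        "query": parts.query,
    }

info = breakdown("https://ads.example.com/banner/img.gif?id=42")
print(info["protocol"], info["domain"], info["path"])
# https example.com /banner/img.gif
```

    Grouping log entries by the extracted domain is one simple way to separate known advertisement hosts from content hosts during data reduction.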

  8. Mastering Windows Server 2012 R2

    CERN Document Server

    Minasi, Mark; Booth, Christian; Butler, Robert; McCabe, John; Panek, Robert; Rice, Michael; Roth, Stefan

    2013-01-01

    Check out the new Hyper-V, find new and easier ways to remotely connect back into the office, or learn all about Storage Spaces-these are just a few of the features in Windows Server 2012 R2 that are explained in this updated edition from Windows authority Mark Minasi and a team of Windows Server experts led by Kevin Greene. This book gets you up to speed on all of the new features and functions of Windows Server, and includes real-world scenarios to put them in perspective. If you're a system administrator upgrading to, migrating to, or managing Windows Server 2012 R2, find what you need to

  9. Microsoft SQL Server OLAP Solution - A Survey

    OpenAIRE

    Badiozamany, Sobhan

    2010-01-01

    Microsoft SQL Server 2008 offers technologies for performing On-Line Analytical Processing (OLAP), directly on data stored in data warehouses, instead of moving the data into some offline OLAP tool. This brings certain benefits, such as elimination of data copying and better integration with the DBMS compared with off-line OLAP tools. This report reviews SQL Server support for OLAP, solution architectures, tools and components involved. Standard storage options are discussed but the focus of ...

  10. APLIKASI SERVER VIRTUAL IP UNTUK MIKROKONTROLER

    OpenAIRE

    Ashari, Ahmad

    2008-01-01

    Until now, a microcontroller connected to a single computer could only be accessed through one IP address, even though most current operating systems can provide more than one IP address per computer in the form of virtual IPs. This study examines the use of virtual IPs, via IP aliasing on the Linux operating system, as a Virtual IP Server for microcontrollers. The basic principle of the Virtual IP Server is the creation of a Virtual Host on each IP address to process data packets and transl...

  11. Using Servers to Enhance Control System Capability

    International Nuclear Information System (INIS)

    Bickley, M.; Bowling, B. A.; Bryan, D. A.; Zeijts, J. van; White, K. S.; Witherspoon, S.

    1999-01-01

    Many traditional control systems include a distributed collection of front-end machines to control hardware. Back-end tools are used to view, modify, and record the signals generated by these front-end machines. Software servers, which form a middleware layer between the front and back ends, can improve a control system in several ways. Servers can enable on-line processing of raw data and consolidation of functionality. In many cases data retrieved from the front end must be processed in order to convert the raw data into useful information. These calculations are often redundantly performed by different programs, frequently offline. Servers can monitor the raw data and rapidly perform calculations, producing new signals which can be treated like any other control system signal and can be used by any back-end application. Algorithms can be incorporated to actively modify signal values in the control system based upon changes in other signals, essentially producing feedback in a control system. Servers thus increase the flexibility of a control system. Lastly, servers running on inexpensive UNIX workstations can relay or cache frequently needed information, reducing the load on front-end hardware by functioning as concentrators. Rather than many back-end tools connecting directly to the front-end machines, increasing the workload of those machines, they instead connect to the server. Servers like those discussed above have been used successfully at the Thomas Jefferson National Accelerator Facility to provide functionality such as beam steering, fault monitoring, storage of machine parameters, and on-line data processing. The authors discuss the potential uses of such servers and share the results of work performed to date.

  12. Solution for an Improved WEB Server

    Directory of Open Access Journals (Sweden)

    George PECHERLE

    2009-12-01

    Full Text Available We want to present a solution with maximum performance from a web server,in terms of services that the server provides. We do not always know what tools to useor how to configure what we have in order to get what we need. Keeping the Internetrelatedservices you provide in working condition can sometimes be a real challenge.And with the increasing demand in Internet services, we need to come up with solutionsto problems that occur every day.

  13. Difficulties in everyday life: young persons with attention-deficit/hyperactivity disorder and autism spectrum disorders perspectives. A chat-log analysis.

    Science.gov (United States)

    Ahlström, Britt H; Wentz, Elisabet

    2014-01-01

    This study focuses on the everyday life of young persons with attention-deficit/hyperactivity disorder (ADHD) and autism spectrum disorder (ASD). There are follow-up studies describing ADHD and ASD in adults, and residual impairments that affect life. Few qualitative studies have been conducted on the subject of their experiences of everyday life, and even fewer are from young persons' perspectives. This study's aim was to describe how young persons with ADHD and ASD function and how they manage their everyday life, based on analyses of Internet-based chat logs. Twelve young persons (7 males and 5 females aged 15-26) diagnosed with ADHD and ASD were included consecutively and offered 8 weeks of Internet-based Support and Coaching (IBSC). Data were collected from 12 chat logs (445 pages of text) produced interactively by the participants and the coaches. Qualitative content analysis was applied. The text was coded and sorted into subthemes and further interpreted into themes. The findings revealed two themes: "fighting against an everyday life lived in vulnerability" with the following subthemes: "difficult things," "stress and rest," and "when feelings and thoughts are a concern"; and the theme "struggling to find a life of one's own" with the following subthemes: "decide and carry out," "making life choices," and "taking care of oneself." Dealing with the problematic situations that everyday life encompasses requires personal strength and a desire to find adequate solutions, as well as to discover a role in society. This study, into the provision of support and coaching over the Internet, led to more in-depth knowledge about these young persons' everyday lives and revealed their ability to use IBSC to express the complexity of everyday life for young persons with ADHD and ASD. The implications of the findings are that using online coaching makes available new opportunities for healthcare professionals to acknowledge these young persons' problems.

  14. Inverse Porosity-Hydraulic Conductivity Relationship in Sand-and-Gravel Aquifers Determined From Analysis of Geophysical Well Logs: Implications for Transport Processes

    Science.gov (United States)

    Morin, R. H.

    2004-05-01

    It is intuitive to think of hydraulic conductivity K as varying directly and monotonically with porosity P in porous media. However, laboratory studies and field observations have documented a possible inverse relationship between these two parameters in unconsolidated deposits under certain grain-size distributions and packing arrangements. This was confirmed at two sites in sand-and-gravel aquifers on Cape Cod, Massachusetts, where sets of geophysical well logs were used to examine the interdependence of several aquifer properties. Along with K and P, the resistivity R and the natural-gamma activity G of the surrounding sediments were measured as a function of depth. Qualitative examination of field results from the first site was useful in locating a contaminant plume and inferred an inverse relation between K and P; this was substantiated by a rigorous multivariate analysis of log data collected from the second site where K and P were determined to respond in a bipolar manner among the four independent variables. Along with this result come some implications regarding our conceptual understanding of contaminant transport processes in the shallow subsurface. According to Darcy's law, the interstitial fluid velocity V is proportional to the ratio K/P and, consequently, a general inverse K-P relationship implies that values of V can extend over a much wider range than conventionally assumed. This situation introduces a pronounced flow stratification within these granular deposits that can result in large values of longitudinal dispersivity; faster velocities occur in already fast zones and slower velocities in already slow zones. An inverse K-P relationship presents a new perspective on the physical processes associated with groundwater flow and transport. Although the results of this study apply strictly to the Cape Cod aquifers, they may merit a re-evaluation of modeling approaches undertaken at other locations having similar geologic environments.
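    The transport implication follows directly from Darcy's law: the interstitial velocity is V = K·i / P, so it scales with the ratio K/P, and an inverse K-P relationship widens the velocity spread between zones. A small numeric sketch with assumed values (not data from the study):

```python
def interstitial_velocity(K: float, porosity: float, hydraulic_gradient: float) -> float:
    """Average linear (interstitial) velocity from Darcy's law:
    V = K * i / P, i.e. V scales with the ratio K/P."""
    return K * hydraulic_gradient / porosity

i = 0.001  # hydraulic gradient (dimensionless), assumed

# If K and P vary inversely between zones, the spread in V widens
# beyond what the K contrast alone would give (here 10x in K):
fast = interstitial_velocity(K=100.0, porosity=0.20, hydraulic_gradient=i)  # high-K, low-P zone
slow = interstitial_velocity(K=10.0, porosity=0.35, hydraulic_gradient=i)   # low-K, high-P zone
print(fast / slow)  # ~17.5x velocity contrast -> stronger flow stratification
```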

  15. Beginning SQL Server Modeling Model-driven Application Development in SQL Server

    CERN Document Server

    Weller, Bart

    2010-01-01

    Get ready for model-driven application development with SQL Server Modeling! This book covers Microsoft's SQL Server Modeling (formerly known under the code name "Oslo") in detail and contains the information you need to be successful with designing and implementing workflow modeling. Beginning SQL Server Modeling will help you gain a comprehensive understanding of how to apply DSLs and other modeling components in the development of SQL Server implementations. Most importantly, after reading the book and working through the examples, you will have considerable experience using SQL M

  16. A history of nuclear well logging in the oil industry

    International Nuclear Information System (INIS)

    Tittle, C.W.

    1989-01-01

    Spurred by an interest in logging through steel casing, γ-ray logging began in the late 1930s, followed soon by neutron logging for porosity. These were the first two nuclear well logs. Gamma-gamma density logging was developed during the 1950s. Pulsed neutron lifetime logging appeared in the 1960s; the slim tools came in the early 1970s. Developments since then have included dual-detector devices of several types which offered improved measurements or interpretation, γ-ray spectrometry logging (natural and neutron-induced) which identifies certain chemical elements, induced radioactivity logging, and the photoelectric absorption log, which, combined with the density log in a single tool, is known as litho-density logging. A combination of several γ-ray spectrometers in one tool, designed to determine 10 formation elements, was recently introduced, and a new neutron porosity tool measuring epithermal neutron die-away time has been developed. Digital transmission of logging data was a step forward in about 1975. Also, log interpretation techniques have greatly expanded since the advent of digital computers, and the microcomputer has had a distinct impact. It is now practical and economical to do iterative analysis on a suite of logs to obtain an optimum overall interpretation. (author)

  17. Round-Trip Delay Estimation in OPC UA Server-Client Communication Channel

    OpenAIRE

    Nakutis, Zilvinas; Deksnys, Vytautas; Jarusevicius, Ignas; Dambrauskas, Vilius; Cincikas, Gediminas; Kriauceliunas, Alenas

    2017-01-01

    In this paper an estimation of round-trip delay (RTD) in an OPC UA server-client channel was investigated over various data communication networks, including Ethernet, WiFi, and 3G. Testing was carried out using a developed IoT gateway device running an OPC UA server and a remote computer running an OPC UA client. The server and client machines were configured to operate in a Virtual Private Network powered by OpenVPN. Experimental analysis revealed that RTD values are distributed in a wide range exh...
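    Application-level RTD of the kind measured in the paper can be approximated with a plain TCP echo exchange. This sketch is only an analogue of the OPC UA measurement, and all endpoint details are assumptions.

```python
import socket
import time

def round_trip_delay(host: str, port: int, payload: bytes = b"ping") -> float:
    """Measure one application-level round-trip delay in seconds by
    sending a payload over TCP and waiting until it is echoed back.
    Illustrative only: the paper measures RTD inside an OPC UA
    server-client channel, which this plain-socket sketch approximates."""
    with socket.create_connection((host, port), timeout=5) as sock:
        start = time.perf_counter()
        sock.sendall(payload)
        received = b""
        while len(received) < len(payload):  # wait for the full echo
            chunk = sock.recv(len(payload) - len(received))
            if not chunk:
                raise ConnectionError("peer closed before echoing")
            received += chunk
        return time.perf_counter() - start
```

    Repeating such a measurement many times and recording the distribution (rather than a single value) is what lets the spread across Ethernet, WiFi, and 3G be compared.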

  18. Financial feasibility of a log sort yard handling small-diameter logs: A preliminary study

    Science.gov (United States)

    Han-Sup Han; E. M. (Ted) Bilek; John (Rusty) Dramm; Dan Loeffler; Dave Calkin

    2011-01-01

    The value and use of the trees removed in fuel reduction thinning and restoration treatments could be enhanced if the wood were effectively evaluated and sorted for quality and highest value before delivery to the next manufacturing destination. This article summarizes a preliminary financial feasibility analysis of a log sort yard that would serve as a log market to...

  19. The Medicago truncatula gene expression atlas web server

    Directory of Open Access Journals (Sweden)

    Tang Yuhong

    2009-12-01

    Full Text Available Abstract Background Legumes (Leguminosae or Fabaceae) play a major role in agriculture. Transcriptomics studies in the model legume species, Medicago truncatula, are instrumental in helping to formulate hypotheses about the role of legume genes. With the rapid growth of publicly available Affymetrix Medicago Genome Array GeneChip data from a great range of tissues, cell types, growth conditions, and stress treatments, the legume research community desires an effective bioinformatics system to aid efforts to interpret the Medicago genome through functional genomics. We developed the Medicago truncatula Gene Expression Atlas (MtGEA) web server for this purpose. Description The Medicago truncatula Gene Expression Atlas (MtGEA) web server is a centralized platform for analyzing the Medicago transcriptome. Currently, the web server hosts gene expression data from 156 Affymetrix GeneChip® Medicago genome arrays in 64 different experiments, covering a broad range of developmental and environmental conditions. The server enables flexible, multifaceted analyses of transcript data and provides a range of additional information about genes, including different types of annotation and links to the genome sequence, which help users formulate hypotheses about gene function. Transcript data can be accessed using Affymetrix probe identification number, DNA sequence, gene name, functional description in natural language, GO and KEGG annotation terms, and InterPro domain number. Transcripts can also be discovered through co-expression or differential expression analysis. Flexible tools to select a subset of experiments and to visualize and compare expression profiles of multiple genes have been implemented. Data can be downloaded, in part or full, in a tabular form compatible with common analytical and visualization software. 
The web server will be updated on a regular basis to incorporate new gene expression data and genome annotation, and is accessible

  20. Neutron--neutron logging

    International Nuclear Information System (INIS)

    Allen, L.S.

    1977-01-01

    A borehole logging tool includes a steady-state source of fast neutrons, two epithermal neutron detectors, and two thermal neutron detectors. A count rate meter is connected to each neutron detector. A first ratio detector provides an indication of the porosity of the formation surrounding the borehole by determining the ratio of the outputs of the two count rate meters connected to the two epithermal neutron detectors. A second ratio detector provides an indication of both porosity and macroscopic absorption cross section of the formation surrounding the borehole by determining the ratio of the outputs of the two count rate meters connected to the two thermal neutron detectors. By comparing the signals of the two ratio detectors, oil bearing zones and salt water bearing zones within the formation being logged can be distinguished and the amount of oil saturation can be determined. 6 claims, 2 figures
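
The dual-ratio idea described in the claim can be illustrated numerically. The sketch below is a loose illustration, not the patented circuit: the classification rule and the `brine_threshold` that discriminates salt-water zones from oil zones are invented placeholders.

```python
def detector_ratio(near_counts, far_counts):
    """Ratio of near- to far-detector count rates (counts per second)."""
    return near_counts / far_counts

def classify_zone(epithermal_ratio, thermal_ratio, brine_threshold=1.15):
    """Illustrative comparison of the two ratio signals.

    The epithermal ratio responds mainly to porosity, while the thermal
    ratio also responds to the macroscopic absorption cross section
    (e.g. chlorine in salt water). A thermal ratio elevated well above
    the epithermal ratio therefore suggests a salt-water-bearing zone.
    The threshold is a made-up number for demonstration purposes.
    """
    excess = thermal_ratio / epithermal_ratio
    return "salt water" if excess > brine_threshold else "oil"
```

For example, with equal porosity responses (epithermal ratio 2.0), a thermal ratio of 2.6 would be flagged as a salt-water zone, while 2.1 would not.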

  1. Pulsed radiation decay logging

    International Nuclear Information System (INIS)

    Mills, W.R. Jr.

    1983-01-01

    There are provided new and improved well logging processes and systems wherein the detection of secondary radiation is accomplished during a plurality of time windows in a manner to accurately characterize the decay rate of the secondary radiation. The system comprises a well logging tool having a primary pulsed radiation source which emits repetitive time-spaced bursts of primary radiation and detector means for detecting secondary radiation resulting from the primary radiation and producing output signals in response to the detected radiation. A plurality of measuring channels are provided, each of which produces a count rate function representative of signals received from the detector means during successive time windows occurring between the primary radiation bursts. The logging system further comprises means responsive to the measuring channels for producing a plurality of functions representative of the ratios of the radiation count rates measured during adjacent pairs of the time windows. Comparator means function to compare the ratio functions and select at least one of the ratio functions to generate a signal representative of the decay rate of the secondary radiation
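
For a single-exponential decay, the ratio of counts registered in adjacent equal time windows determines the decay time directly, which is the essence of the ratio functions described above. A minimal sketch using synthetic expected counts (no statistical noise, and none of the patent's channel-selection logic):

```python
import math

def window_counts(tau, window, n_windows, n0=1e6):
    """Expected counts of a decaying signal n0 * exp(-t / tau), integrated
    over successive equal time windows after the radiation burst."""
    counts = []
    for k in range(n_windows):
        t0, t1 = k * window, (k + 1) * window
        counts.append(n0 * tau * (math.exp(-t0 / tau) - math.exp(-t1 / tau)))
    return counts

def decay_time_from_ratio(c_early, c_late, window):
    """For a single exponential, the count ratio of adjacent equal windows
    is exp(window / tau); inverting it recovers the decay time tau."""
    return window / math.log(c_early / c_late)
```

With a true decay time of 200 µs and 50 µs windows, the ratio of any two adjacent windows is exp(0.25), and the formula returns 200 µs exactly; real measurements would select among several window pairs, as the patent describes, to cope with counting statistics.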

  2. Geophysical borehole logging

    International Nuclear Information System (INIS)

    McCann, D.; Barton, K.J.; Hearn, K.

    1981-08-01

    Most of the available literature on geophysical borehole logging refers to studies carried out in sedimentary rocks. It is only in recent years that any great interest has been shown in geophysical logging in boreholes in metamorphic and igneous rocks following the development of research programmes associated with geothermal energy and nuclear waste disposal. This report is concerned with the programme of geophysical logging carried out on the three deep boreholes at Altnabreac, Caithness, to examine the effectiveness of these methods in crystalline rock. Of particular importance is the assessment of the performance of the various geophysical sondes run in the boreholes in relation to the rock mass properties. The geophysical data can be used to provide additional in-situ information on the geological, hydrogeological and engineering properties of the rock mass. Fracturing and weathering in the rock mass have a considerable effect on both the design parameters for an engineering structure and the flow of water through the rock mass; hence, the relation between the geophysical properties and the degree of fracturing and weathering is examined in some detail. (author)

  3. Interactive machine learning for postprocessing CT images of hardwood logs

    Science.gov (United States)

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt

    2003-01-01

    This paper concerns the nondestructive evaluation of hardwood logs through the analysis of computed tomography (CT) images. Several studies have shown that the commercial value of resulting boards can be increased substantially if log sawing strategies are chosen using prior knowledge of internal log defects. Although CT imaging offers a potential means of obtaining...

  4. PUMA Internet Task Logging Using the IDAC-1

    Directory of Open Access Journals (Sweden)

    K. N. Tarchanidis

    2014-08-01

    Full Text Available This project uses an IDAC-1 board to sample the joint angle position of the PUMA 761 robot and log the results on a computer. The robot is at the task location and the logging computer is located in a different one. The task the robot is performing is based on a Pseudo Stereo Vision System (PSVS). The Internet is the transport medium. The protocol used in this project is UDP/IP. The actual angle is taken straight from the PUMA controller. High-resolution potentiometers are connected on each robot joint and are buffered and sampled as potential difference on an A/D converter integrated on the IDAC-1. The logging computer, acting as client through the Internet, asks for the angle set; the IDAC-1 responds as server with the 10-bit resolution sampling of the joint position. The whole task is logged in a file on the logging computer. This application gives Internet users the ability to monitor and log the robot tasks anywhere in the World Wide Web (www).
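
The 10-bit sampling described above maps a buffered potentiometer voltage to a joint angle. A minimal conversion sketch, where the ±160° joint range is a placeholder and not an actual PUMA 761 specification:

```python
def adc_to_angle(raw, angle_min=-160.0, angle_max=160.0, adc_bits=10):
    """Map a raw ADC reading from a joint potentiometer onto a joint angle
    in degrees, assuming the pot sweeps linearly over [angle_min, angle_max].
    The default range is illustrative, not a PUMA 761 datasheet value."""
    full_scale = (1 << adc_bits) - 1           # 1023 for a 10-bit converter
    if not 0 <= raw <= full_scale:
        raise ValueError("reading out of range")
    return angle_min + (angle_max - angle_min) * raw / full_scale
```

A raw reading of 0 maps to one mechanical limit, 1023 to the other, and mid-scale readings to roughly the centre of travel; the server side would apply such a conversion per joint before (or instead of) shipping raw counts over UDP.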

  5. Exam 70-411 administering Windows Server 2012

    CERN Document Server

    Course, Microsoft Official Academic

    2014-01-01

    Microsoft Windows Server is a multi-purpose server designed to increase reliability and flexibility of  a network infrastructure. Windows Server is the paramount tool used by enterprises in their datacenter and desktop strategy. The most recent versions of Windows Server also provide both server and client virtualization. Its ubiquity in the enterprise results in the need for networking professionals who know how to plan, design, implement, operate, and troubleshoot networks relying on Windows Server. Microsoft Learning is preparing the next round of its Windows Server Certification program

  6. Multivariate and spatial statistical analysis of Callovo-Oxfordian physical properties from lab and borehole logs data: towards a characterization of lateral and vertical spatial trends in the Meuse/Haute-Marne transposition zone

    International Nuclear Information System (INIS)

    Garcia, M.H.; Rabaute, A.; Yven, B.; Guillemot, D.

    2010-01-01

    Document available in extended abstract form only. The geological exploration of the Meuse/Haute-Marne area began in 1994. Several boreholes were drilled, and the Callovo-Oxfordian argillite, considered a potential storage formation, was cored and logged. 2D and 3D seismic surveys were completed, as well as geological field observations, and an underground research laboratory was created. A 250 km²-wide Transposition Zone was delimited, which was subject to further investigations in 2007 and 2008, including another series of coring and logging in four additional boreholes, and a tighter 2D seismic survey. The main objective of this study was to improve the knowledge of the spatial variability of geological and physical properties of the Callovo-Oxfordian formation. The paper focuses on the three following aspects of the study to present and discuss the methods that have been used and the results that have been obtained: - Use of well-log data to identify equivalent homogeneous log-units on the boreholes. - Relating log attributes to physical properties of argillites measured on cores in laboratory. - Study of lateral and vertical spatial trends of selected physical properties across the Transposition Zone (TZ). To identify equivalent homogeneous log-units, a combination of Principal Component Analysis (PCA) and Fuzzy Cluster Analysis (FCA) was used. PCA was classically performed to reduce the number of variables to retain principal components gathering most of the original dataset variance. PCA was also used to identify isolated groups of correlated variables that could be associated to different properties of the formation. Then, FCA was applied to identify homogeneous log-units on the eight boreholes across the TZ. 
Because well-log data are much more numerous and better distributed along the boreholes than lab data measured on rock samples, relations and correlations were sought between the two types of data to identify log attributes that were likely to provide
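
The PCA step described above — reducing a set of correlated log variables to a few components that retain most of the dataset variance — can be sketched with a plain eigendecomposition of the covariance matrix. This is illustrative only; the study's actual log variables and software are not reproduced.

```python
import numpy as np

def principal_components(X, k):
    """Return the first k principal component scores of data matrix X
    (samples x variables) and the fraction of variance they retain."""
    Xc = X - X.mean(axis=0)                    # centre each log variable
    cov = np.cov(Xc, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)       # eigh returns ascending order
    order = np.argsort(eigval)[::-1]           # sort descending by variance
    eigval, eigvec = eigval[order], eigvec[:, order]
    scores = Xc @ eigvec[:, :k]
    explained = eigval[:k].sum() / eigval.sum()
    return scores, explained
```

On strongly correlated variables (as with redundant log curves), a single component typically captures nearly all of the variance; the retained scores would then feed a fuzzy clustering step to delimit homogeneous log-units.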

  7. Generation and performance of a multigroup coupled neutron-gamma cross-section library for deterministic and Monte Carlo borehole logging analysis

    International Nuclear Information System (INIS)

    Kodeli, I.; Aldama, D. L.; De Leege, P. F. A.; Legrady, D.; Hoogenboom, J. E.; Cowan, P.

    2004-01-01

    As part of the IRTMBA (Improved Radiation Transport Modelling for Borehole Applications) project of the EU's 5th Framework Programme, a special-purpose multigroup cross-section library was prepared for use in deterministic and Monte Carlo oil well logging particle transport calculations. This library is expected to improve the prediction of the neutron and gamma spectra at the detector positions of the logging tool, and its use for the interpretation of neutron logging measurements was studied. Preparation and testing of this library are described. (authors)

  8. Freiburg RNA Tools: a web server integrating INTARNA, EXPARNA and LOCARNA.

    Science.gov (United States)

    Smith, Cameron; Heyne, Steffen; Richter, Andreas S; Will, Sebastian; Backofen, Rolf

    2010-07-01

    The Freiburg RNA tools web server integrates three tools for the advanced analysis of RNA in a common web-based user interface. The tools IntaRNA, ExpaRNA and LocARNA support the prediction of RNA-RNA interaction, exact RNA matching and alignment of RNA, respectively. The Freiburg RNA tools web server and the software packages of the stand-alone tools are freely accessible at http://rna.informatik.uni-freiburg.de.

  9. Pulse neutron logging technique

    International Nuclear Information System (INIS)

    Bespalov, D.F.; Dylyuk, A.A.

    1975-01-01

    A new method of neutron-burst logging is proposed, which consists in irradiating rocks with fast neutron bursts and registering the integrated flux of a burst of thermal and/or epithermal neutrons from the moment of its initiation to that of full absorption. The obtained value is representative of the rock properties (porosity, hydrogen content). The integrated flux in a burst of thermal and epithermal neutrons can be measured both by way of activation of a reference sample of known chemical composition during the neutron burst and by recording the radiation of induced activity of the sample within the interval between two bursts. The proposed method features high informative value, accuracy and efficiency

  10. 3Drefine: an interactive web server for efficient protein structure refinement.

    Science.gov (United States)

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-07-08

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen bonding network combined with atomic-level energy minimization of the optimized model using composite physics- and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Difficulties in everyday life: Young persons with attention-deficit/hyperactivity disorder and autism spectrum disorders perspectives. A chat-log analysis

    Directory of Open Access Journals (Sweden)

    Britt H. Ahlström

    2014-05-01

    Full Text Available This study focuses on the everyday life of young persons with attention-deficit/hyperactivity disorder (ADHD) and autism spectrum disorder (ASD). There are follow-up studies describing ADHD and ASD in adults, and residual impairments that affect life. Few qualitative studies have been conducted on the subject of their experiences of everyday life, and even fewer are from young persons’ perspectives. This study's aim was to describe how young persons with ADHD and ASD function and how they manage their everyday life, based on analyses of Internet-based chat logs. Twelve young persons (7 males and 5 females aged 15–26 diagnosed with ADHD and ASD were included consecutively and offered 8 weeks of Internet-based Support and Coaching (IBSC). Data were collected from 12 chat logs (445 pages of text produced interactively by the participants and the coaches. Qualitative content analysis was applied. The text was coded and sorted into subthemes and further interpreted into themes. The findings revealed two themes: “fighting against an everyday life lived in vulnerability” with the following subthemes: “difficult things,” “stress and rest,” and “when feelings and thoughts are a concern”; and the theme “struggling to find a life of one's own” with the following subthemes: “decide and carry out,” “making life choices,” and “taking care of oneself.” Dealing with the problematic situations that everyday life encompasses requires personal strength and a desire to find adequate solutions, as well as to discover a role in society. This study, into the provision of support and coaching over the Internet, led to more in-depth knowledge about these young persons’ everyday lives and revealed their ability to use IBSC to express the complexity of everyday life for young persons with ADHD and ASD. The implications of the findings are that using online coaching makes available new opportunities for healthcare professionals to acknowledge

  12. APPLICATION OF WELL LOG ANALYSIS IN ASSESSMENT OF PETROPHYSICAL PARAMETERS AND RESERVOIR CHARACTERIZATION OF WELLS IN THE “OTH” FIELD, ANAMBRA BASIN, SOUTHERN NIGERIA

    Directory of Open Access Journals (Sweden)

    Eugene URORO

    2014-12-01

    Full Text Available Over the past years, the Anambra Basin, one of Nigeria’s inland basins, has recorded a significant level of hydrocarbon exploration activity. The basin has been confirmed by several authors, from source rock analyses, to have the potential for generating hydrocarbon. For the hydrocarbon to be exploited, it is imperative to have a thorough understanding of the reservoir. Computer-assisted log analyses were employed to effectively evaluate petrophysical parameters such as the shale volume (Vsh), total porosity (TP), effective porosity (EP), water saturation (Sw), and hydrocarbon saturation (Sh). Cross-plots of the petrophysical parameters versus depth were illustrated. Five hydrocarbon bearing reservoirs were delineated in well 1, four in well 2. The reservoirs in well 3 do not contain hydrocarbon. The estimated reservoir porosity varies from 10% to 21% while the permeability values range from 20md to 1400md. The porosity and permeability values suggest that the reservoirs are good enough to store and also permit free flow of fluid. The volume of shale (0.05% to 0.35% analysis reveals that the reservoirs range from shaly sand to slightly shaly sand to clean sand reservoir. On the basis of the petrophysical data, the reservoirs are interpreted as good quality reservoir rocks, as confirmed by high effective porosity of around 20% and hydrocarbon saturation exceeding 55% in well 1 and well 2. Water saturation in well 3 is nearly 100%, although the reservoir properties are good.  

  13. Energy-efficient server management; Energieeffizientes Servermanagement

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, B.

    2003-07-01

    This final report for the Swiss Federal Office of Energy (SFOE) presents the results of a project that aimed to develop an automatic shut-down system for the servers used in typical electronic data processing installations to be found in small and medium-sized enterprises. The purpose of shutting down these computers - the saving of energy - is discussed. The development of a shutdown unit on the basis of a web-server that automatically shuts down the servers connected to it and then interrupts their power supply is described. The functions of the unit, including pre-set times for switching on and off, remote operation via the Internet and its interaction with clients connected to it are discussed. Examples of the system's user interface are presented.

  14. AML (Advanced Mud Logging): First Among Equals

    Directory of Open Access Journals (Sweden)

    T. Loermans

    2017-09-01

    Full Text Available During the past ten years, enormous developments in mud logging technology have been made. Traditional mud logging was only qualitative in nature, and mudlogs could not be used for the petrophysical well evaluations which form the basis for all subsequent activities on wells and fields. AML, however, can provide quantitative information: logs with a reliability, trueness and precision like those of LWD and WLL. Hence for well evaluation programmes there are now three different logging methods available, each with its own pros and cons on specific aspects: AML, LWD and WLL. The largest improvements have been made in mud gas analysis and elemental analysis of cuttings. Mud gas analysis can yield hydrocarbon fluid composition for some components with a quality like that of PVT analysis, not only revolutionising the sampling programme so far done only with LWD/WLL, but also making it possible to geosteer on fluid properties. Elemental analysis of cuttings, e.g. with XRF, with an ability well beyond the capabilities of the spectroscopy measurements possible earlier with LWD/WLL tools, is opening up improved ways to evaluate formations, especially where the traditional methods fall short of requirements, such as in unconventional reservoirs. An overview and specific examples of these AML logs is given, from which it may be concluded that AML now ought to be considered as “first among its equals”.

  15. ACFIS: a web server for fragment-based drug discovery.

    Science.gov (United States)

    Hao, Ge-Fei; Jiang, Wen; Ye, Yuan-Nong; Wu, Feng-Xu; Zhu, Xiao-Lei; Guo, Feng-Biao; Yang, Guang-Fu

    2016-07-08

    In order to foster innovation and improve the effectiveness of drug discovery, there is considerable interest in exploring unknown 'chemical space' to identify new bioactive compounds with novel and diverse scaffolds. Hence, fragment-based drug discovery (FBDD) developed rapidly due to its expansive search of 'chemical space', which can lead to a higher hit rate and ligand efficiency (LE). However, computational screening of fragments is always hampered by the promiscuous binding model. In this study, we developed a new web server, Auto Core Fragment in silico Screening (ACFIS). It includes three computational modules, PARA_GEN, CORE_GEN and CAND_GEN. ACFIS can generate a core fragment structure from the active molecule using fragment deconstruction analysis and perform in silico screening by growing fragments onto the junction of the core fragment structure. An integrated energy calculation rapidly identifies which fragments fit the binding site of a protein. We constructed a simple interface to enable users to view top-ranking molecules in 2D and the binding mode in 3D for further experimental exploration. This makes ACFIS a highly valuable tool for drug discovery. The ACFIS web server is free and open to all users at http://chemyang.ccnu.edu.cn/ccb/server/ACFIS/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. The new protein topology graph library web server.

    Science.gov (United States)

    Schäfer, Tim; Scheck, Andreas; Bruneß, Daniel; May, Patrick; Koch, Ina

    2016-02-01

    We present a new, extended version of the Protein Topology Graph Library web server. The Protein Topology Graph Library describes protein topology on the super-secondary structure level. It allows users to compute and visualize protein ligand graphs and to search for protein structural motifs. The new server features additional information on ligand binding to secondary structure elements, increased usability and an application programming interface (API) to retrieve data, allowing for an automated analysis of protein topology. The Protein Topology Graph Library server is freely available on the web at http://ptgl.uni-frankfurt.de. The website is implemented in PHP, JavaScript, PostgreSQL and Apache. It is supported by all major browsers. The VPLG software that was used to compute the protein ligand graphs and all other data in the database is available under the GNU public license 2.0 from http://vplg.sourceforge.net. tim.schaefer@bioinformatik.uni-frankfurt.de; ina.koch@bioinformatik.uni-frankfurt.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. AlignMe—a membrane protein sequence alignment web server

    Science.gov (United States)

    Stamm, Marcus; Staritzbichler, René; Khafizov, Kamil; Forrest, Lucy R.

    2014-01-01

    We present a web server for pair-wise alignment of membrane protein sequences, using the program AlignMe. The server makes available two operational modes of AlignMe: (i) sequence to sequence alignment, taking two sequences in fasta format as input, combining information about each sequence from multiple sources and producing a pair-wise alignment (PW mode); and (ii) alignment of two multiple sequence alignments to create family-averaged hydropathy profile alignments (HP mode). For the PW sequence alignment mode, four different optimized parameter sets are provided, each suited to pairs of sequences with a specific similarity level. These settings utilize different types of inputs: (position-specific) substitution matrices, secondary structure predictions and transmembrane propensities from transmembrane predictions or hydrophobicity scales. In the second (HP) mode, each input multiple sequence alignment is converted into a hydrophobicity profile averaged over the provided set of sequence homologs; the two profiles are then aligned. The HP mode enables qualitative comparison of transmembrane topologies (and therefore potentially of 3D folds) of two membrane proteins, which can be useful if the proteins have low sequence similarity. In summary, the AlignMe web server provides user-friendly access to a set of tools for analysis and comparison of membrane protein sequences. Access is available at http://www.bioinfo.mpg.de/AlignMe PMID:24753425
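
A hydropathy profile of the kind AlignMe's HP mode averages over homologs starts from a per-residue scale. A minimal single-sequence sketch using the standard Kyte-Doolittle values (the 19-residue window is a common choice for highlighting transmembrane helices, not necessarily AlignMe's own default):

```python
# Kyte-Doolittle hydropathy scale (standard published values)
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def hydropathy_profile(seq, window=19):
    """Sliding-window mean hydropathy over a protein sequence; strong
    positive stretches suggest membrane-spanning segments."""
    vals = [KD[a] for a in seq.upper()]
    half = window // 2
    return [sum(vals[i - half:i + half + 1]) / window
            for i in range(half, len(vals) - half)]
```

Averaging such profiles across a multiple sequence alignment, then aligning the two averaged profiles, gives a family-level comparison of transmembrane topology even when the raw sequence similarity is low.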

  18. ACFIS: a web server for fragment-based drug discovery

    Science.gov (United States)

    Hao, Ge-Fei; Jiang, Wen; Ye, Yuan-Nong; Wu, Feng-Xu; Zhu, Xiao-Lei; Guo, Feng-Biao; Yang, Guang-Fu

    2016-01-01

    In order to foster innovation and improve the effectiveness of drug discovery, there is considerable interest in exploring unknown ‘chemical space’ to identify new bioactive compounds with novel and diverse scaffolds. Hence, fragment-based drug discovery (FBDD) developed rapidly due to its expansive search of ‘chemical space’, which can lead to a higher hit rate and ligand efficiency (LE). However, computational screening of fragments is always hampered by the promiscuous binding model. In this study, we developed a new web server, Auto Core Fragment in silico Screening (ACFIS). It includes three computational modules, PARA_GEN, CORE_GEN and CAND_GEN. ACFIS can generate a core fragment structure from the active molecule using fragment deconstruction analysis and perform in silico screening by growing fragments onto the junction of the core fragment structure. An integrated energy calculation rapidly identifies which fragments fit the binding site of a protein. We constructed a simple interface to enable users to view top-ranking molecules in 2D and the binding mode in 3D for further experimental exploration. This makes ACFIS a highly valuable tool for drug discovery. The ACFIS web server is free and open to all users at http://chemyang.ccnu.edu.cn/ccb/server/ACFIS/. PMID:27150808

  19. Log(s) physics results from CDF

    International Nuclear Information System (INIS)

    1989-01-01

    The Collider Detector at Fermilab (CDF) is a large, azimuthally symmetric detector designed to study p̄p interactions at the Fermilab Tevatron Collider. Results are presented from data taken with a minimum bias trigger at √s = 630 and 1800 GeV during the 1987 run. The topics include the current analysis of dn/dη and some very preliminary results on short-range pseudorapidity correlations and Bose-Einstein correlations. 7 refs., 5 figs., 2 tabs

  20. Instant Hyper-v Server Virtualization starter

    CERN Document Server

    Eguibar, Vicente Rodriguez

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. The approach is tutorial in style, guiding users in an orderly manner toward virtualization. This book is conceived for system administrators and advanced PC enthusiasts who want to venture into the virtualization world. Although this book starts from scratch, knowledge of server operating systems, LANs and networking has to be in place. A good background in server administration is desirable, including networking service

  1. Saving Money and Time with Virtual Server

    CERN Document Server

    Sanders, Chris

    2006-01-01

    Microsoft Virtual Server 2005 consistently proves to be worth its weight in gold, with new implementations thought up every day. With this product now a free download from Microsoft, scores of new users are able to experience what the power of virtualization can do for their networks. This guide is aimed at network administrators who are interested in ways that Virtual Server 2005 can be implemented in their organizations in order to save money and increase network productivity. It contains information on setting up a virtual network, virtual consolidation, virtual security, virtual honeypo

  2. Professional Microsoft SQL Server 2012 Integration Services

    CERN Document Server

    Knight, Brian; Moss, Jessica M; Davis, Mike; Rock, Chris

    2012-01-01

    An in-depth look at the radical changes in the newest release of SSIS. Microsoft SQL Server 2012 Integration Services (SSIS) builds on the revolutionary database product suite first introduced in 2005. With this crucial resource, you will explore how this newest release serves as a powerful tool for performing extraction, transformation, and load (ETL) operations. A team of SQL Server experts deciphers this complex topic and provides detailed coverage of the new features of the 2012 product release. In addition to technical updates and additions, the authors present you with a new set of SSIS b

  3. Windows Server® 2008 Inside Out

    CERN Document Server

    Stanek, William R

    2009-01-01

    Learn how to conquer Windows Server 2008-from the inside out! Designed for system administrators, this definitive resource features hundreds of timesaving solutions, expert insights, troubleshooting tips, and workarounds for administering Windows Server 2008-all in concise, fast-answer format. You will learn how to perform upgrades and migrations, automate deployments, implement security features, manage software updates and patches, administer users and accounts, manage Active Directory® directory services, and more. With INSIDE OUT, you'll discover the best and fastest ways to perform core a

  4. On the single-server retrial queue

    Directory of Open Access Journals (Sweden)

    Djellab Natalia V.

    2006-01-01

    Full Text Available In this work, we review the stochastic decomposition for the number of customers in M/G/1 retrial queues with a reliable server and with a server subject to breakdowns, which has been the subject of investigation in the literature. Using the decomposition property of M/G/1 retrial queues with breakdowns, which holds under the exponential assumption for retrial times, as an approximation in the non-exponential case, we consider an approximate solution for the steady-state queue size distribution.
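
The decomposition discussed above expresses the retrial-queue size as the queue size of an ordinary M/G/1 system plus a component due to customers in orbit. The ordinary M/G/1 term is given by the Pollaczek-Khinchine formula, sketched below; the orbit component, which depends on the retrial-rate model, is deliberately omitted.

```python
def mg1_mean_customers(lam, mean_s, scv_s):
    """Pollaczek-Khinchine mean number in an ordinary M/G/1 system:

        E[N] = rho + rho^2 * (1 + C_s^2) / (2 * (1 - rho)),

    where rho = lam * E[S] is the offered load and C_s^2 is the squared
    coefficient of variation of the service time. This is only the
    "standard M/G/1" term of the retrial-queue decomposition."""
    rho = lam * mean_s
    if rho >= 1:
        raise ValueError("unstable system: rho >= 1")
    return rho + rho * rho * (1 + scv_s) / (2 * (1 - rho))
```

As a sanity check, exponential service (C_s² = 1) reduces the formula to the familiar M/M/1 value ρ/(1 − ρ), while deterministic service (C_s² = 0) halves the queueing term.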

  5. AMMOS2: a web server for protein–ligand–water complexes refinement via molecular mechanics

    Science.gov (United States)

    Labbé, Céline M.; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O.; Pajeva, Ilza

    2017-01-01

    Abstract AMMOS2 is an interactive web server for efficient computational refinement of protein–small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein–ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein–ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein–ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein–ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein–ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein–ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. PMID:28486703

  6. AMMOS2: a web server for protein-ligand-water complexes refinement via molecular mechanics.

    Science.gov (United States)

    Labbé, Céline M; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O; Pajeva, Ilza; Miteva, Maria A

    2017-07-03

    AMMOS2 is an interactive web server for efficient computational refinement of protein-small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein-ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein-ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein-ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein-ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein-ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein-ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. Network data analysis server (NDAS) prototype development

    International Nuclear Information System (INIS)

    Marka, Szabolcs; Mours, Benoit; Williams, Roy

    2002-01-01

    We have developed a simple and robust system based on standard UNIX tools and frame library code to transfer and merge data from multiple gravitational wave detectors distributed worldwide. The transfer and merger take place with less than 20 minute delay and the output frames are available for all participants. Presently VIRGO and LIGO participate in the exchange and only environmental data are shared. The system is modular to allow future improvements and the use of new tools like Grid

  8. Delay Bound: Fractal Traffic Passes through Network Servers

    Directory of Open Access Journals (Sweden)

    Ming Li

    2013-01-01

    Full Text Available Delay analysis plays a role in real-time systems in computer communication networks. This paper gives our results on the delay analysis of fractal traffic passing through servers. There are three contributions presented in this paper. First, we explain the reasons why the conventional theory of queuing systems ceases to hold in the general sense when arrival traffic is fractal. Then, we propose a concise method of delay computation for hard real-time systems. Finally, the delay computation of fractal traffic passing through servers is presented.

  9. Server-side Statistics Scripting in PHP

    Directory of Open Access Journals (Sweden)

    Jan de Leeuw

    1997-06-01

    Full Text Available On the UCLA Statistics WWW server there are a large number of demos and calculators that can be used in statistics teaching and research. Some of these demos require substantial amounts of computation, others mainly use graphics. These calculators and demos are implemented in various different ways, reflecting developments in WWW based computing. As usual, one of the main choices is between doing the work on the client-side (i.e. in the browser) or on the server-side (i.e. on our WWW server). Obviously, client-side computation puts fewer demands on the server. On the other hand, it requires that the client downloads Java applets, or installs plugins and/or helpers. If JavaScript is used, client-side computations will generally be slow. We also have to assume that the client is installed properly, and has the required capabilities. Requiring too much on the client-side has caused browsing machines such as Netscape Communicator to grow beyond all reasonable bounds, both in size and RAM requirements. Moreover, requiring Java and JavaScript rules out such excellent browsers as Lynx or Emacs W3. For server-side computing, we can configure the server and its resources ourselves, and we need not worry about browser capabilities and configuration. Nothing needs to be downloaded, except the usual HTML pages and graphics. In the same way as on the client side, there is a scripting solution, where code is interpreted, or an object-code solution using compiled code. For server-side scripting, we use embedded languages, such as PHP/FI. The scripts in the HTML pages are interpreted by a CGI program, and the output of the CGI program is sent to the clients. Of course the CGI program is compiled, but the statistics procedures will usually be interpreted, because PHP/FI does not have the appropriate functions in its scripting language. This will tend to be slow, because embedded languages do not deal efficiently with loops and similar constructs. Thus a first
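
The server-side approach described in this abstract (the computation runs on the server; only the rendered HTML reaches the browser) can be sketched with Python's standard-library HTTP server standing in for PHP/FI. The statistic, page layout, and handler names are illustrative:

```python
# Sketch of server-side statistics: the computation (here, a mean) runs on
# the server and only the rendered HTML is sent to the client.
# Python stands in for PHP/FI purely for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer
from statistics import mean
import threading
import urllib.request

class StatsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        data = [2.0, 4.0, 9.0]   # would normally come from the request
        body = f"<html><body>mean = {mean(data)}</body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), StatsHandler)   # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"
page = urllib.request.urlopen(url).read().decode()
server.shutdown()
print(page)  # → <html><body>mean = 5.0</body></html>
```

The client needs nothing beyond an HTML-capable browser, which is exactly the trade-off the abstract argues for.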

  10. Microsoft Exchange Server PowerShell cookbook

    CERN Document Server

    Andersson, Jonas

    2015-01-01

    This book is for messaging professionals who want to build real-world scripts with Windows PowerShell 5 and the Exchange Management Shell. If you are a network or systems administrator responsible for managing and maintaining Exchange Server 2013, you will find this highly useful.

  11. Client/Server Architecture Promises Radical Changes.

    Science.gov (United States)

    Freeman, Grey; York, Jerry

    1991-01-01

    This article discusses the emergence of the client/server paradigm for the delivery of computer applications, its emergence in response to the proliferation of microcomputers and local area networks, the applicability of the model in academic institutions, and its implications for college campus information technology organizations. (Author/DB)

  12. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  13. Solarwinds Server & Application Monitor deployment and administration

    CERN Document Server

    Brant, Justin

    2013-01-01

    A concise and practical guide to using SolarWinds Server & Application Monitor. If you are an IT professional, from an entry-level technician to a more advanced network or system administrator, who is new to network monitoring services and/or SolarWinds SAM, this book is ideal for you.

  14. Creating a Data Warehouse using SQL Server

    DEFF Research Database (Denmark)

    Sørensen, Jens Otto; Alnor, Karl

    1999-01-01

    In this paper we construct a Star Join Schema and show how this schema can be created using the basic tools delivered with SQL Server 7.0. Major objectives are to keep the operational database unchanged so that data loading can be done without disturbing the business logic of the operational...

  15. Mastering SQL Server 2014 data mining

    CERN Document Server

    Bassan, Amarpreet Singh

    2014-01-01

    If you are a developer who is working on data mining for large companies and would like to enhance your knowledge of SQL Server Data Mining Suite, this book is for you. Whether you are brand new to data mining or are a seasoned expert, you will be able to master the skills needed to build a data mining solution.

  16. Non-destructive analysis and detection of internal characteristics of spruce logs through X computerized tomography; Detection et analyse non destructive de caracteristiques internes de billons d'epicea commun (PICEA ABIES (L.) KARST) par tomographie a rayons X

    Energy Technology Data Exchange (ETDEWEB)

    Longuetaud, F

    2005-10-15

    Computerized tomography allows direct access to internal features of scanned logs on the basis of density and moisture content variations. The objective of this work is to assess the feasibility of automatic detection of internal characteristics with the final aim of conducting scientific analyses. The database is constituted by CT images of 24 spruces obtained with a medical CT scanner. The studied trees are representative of several social statuses and come from four stands located in North-Eastern France, themselves representative of several age, density and fertility classes. The automatic processing steps developed are the following. First, pith detection in logs, dealing with the problems of knot presence and ring eccentricity; the accuracy of the localisation was less than one mm. Secondly, detection of the sapwood/heartwood limit in logs, dealing with the problem of knot presence (the main source of difficulty); the error on the diameter was 1.8 mm, which corresponds to a relative error of 1.3 per cent. Thirdly, detection of whorl locations and comparison with an optical method. Fourthly, detection of individualized knots; this process allows knots to be counted and located in a log (longitudinal position and azimuth), but validation of the method and extraction of branch diameter and inclination are still to be developed. An application of this work was a variability analysis of the sapwood content in the trunk: at the within-tree level, the sapwood width was found to be constant under the living crown; at the between-tree level, a strong correlation was found with the amount of living branches. A great number of analyses are possible from our results, among others: architectural analysis with pith tracking and apex death occurrence; analysis of radial variations of the heartwood shape; and analysis of the knot distribution in logs. (author)

  18. DESIGN AND IMPLEMENTATION OF WEB-BASED SQL SERVER DATABASE MANAGEMENT SOFTWARE

    Directory of Open Access Journals (Sweden)

    Muchammad Husni

    2005-01-01

    Full Text Available Microsoft SQL Server is a client/server desktop database server application: it has a client component, which displays and manipulates data, and a server component, which stores, retrieves, and secures databases. Management operations on all database servers in a network are performed by the database administrator using SQL Server's main administrative tool, Enterprise Manager. As a result, the database administrator can only perform these operations on a computer where Microsoft SQL Server has been installed. In this study, a web-based application was designed using ASP.NET for managing database servers. The application uses ADO.NET, leveraging Transact-SQL and stored procedures on the server, to perform database management operations on a SQL database server and present the results on the web. The database administrator can run this web-based application from any computer on the network and connect to the SQL database server using a web browser, which makes the administrator's job easier by removing the need to work on the server computer itself. Keywords: Transact-SQL, ASP.Net, ADO.NET, SQL Server

  19. Interpretation of horizontal well production logs: influence of logging tool

    Energy Technology Data Exchange (ETDEWEB)

    Ozkan, E. [Colorado School of Mines, Boulder, CO (United States); Sarica, C. [Pennsylvania State Univ., College Park, PA (United States); Haci, M. [Drilling Measurements, Inc (United States)

    1998-12-31

    The influence of a production-logging tool on wellbore flow rate and pressure measurements was investigated, focusing on the disturbance caused by the production-logging tool and the coiled tubing to the original flow conditions in the wellbore. The investigation was carried out using an analytical model, and single-phase liquid flow was assumed. Results showed that the production-logging tool influenced the measurements, as shown by the deviation of the original flow-rate and pressure profiles, particularly in low-conductivity wellbores. High production rates increase the effect of the production-logging tool. Recovering or inferring the original flow conditions in the wellbore from the production-logging data is a very complex process which cannot be solved easily. For this reason, the conditions under which the information obtained by production logging is meaningful are of considerable practical interest. 7 refs., 2 tabs., 15 figs.

  20. Artificial neural network modeling and cluster analysis for organic facies and burial history estimation using well log data: A case study of the South Pars Gas Field, Persian Gulf, Iran

    Science.gov (United States)

    Alizadeh, Bahram; Najjari, Saeid; Kadkhodaie-Ilkhchi, Ali

    2012-08-01

    Intelligent and statistical techniques were used to extract hidden organic facies from well log responses in the Giant South Pars Gas Field, Persian Gulf, Iran. Data from the Mid-Cretaceous Kazhdomi Formation and the Permo-Triassic Kangan-Dalan Formations were used for this purpose. Initially, GR, SGR, CGR, THOR, POTA, NPHI and DT logs were applied to model the relationship between wireline logs and Total Organic Carbon (TOC) content using Artificial Neural Networks (ANN). The correlation coefficient (R2) between the measured and ANN-predicted TOC equals 89%. The performance of the model is measured by the Mean Squared Error function, which does not exceed 0.0073. Using the Cluster Analysis technique and creating a binary hierarchical cluster tree, the constructed TOC column of each formation was clustered into 5 organic facies according to their geochemical similarity. Later, a second model with an accuracy of 84% was created by ANN to determine the specified clusters (facies) directly from well logs for quick cluster recognition in other wells of the studied field. Each created facies was correlated to its appropriate burial history curve. Hence each and every facies of a formation could be scrutinized separately and directly from its well logs, demonstrating the time and depth of oil or gas generation. Therefore the potential production zone of the Kazhdomi probable source rock and the Kangan-Dalan reservoir formation could be identified while well logging operations (especially in LWD cases) were in progress. This could reduce uncertainty, save considerable time and cost for oil industries, and aid in the successful implementation of exploration and exploitation plans.

  1. A distributed design for monitoring, logging, and replaying device readings at LAMPF

    International Nuclear Information System (INIS)

    Burns, M.

    1992-01-01

    As control of the Los Alamos Meson Physics linear accelerator and Proton Storage Ring moves to a more distributed system, it has been necessary to redesign the software which monitors, logs, and replays device readings throughout the facility. The new design allows devices to be monitored and their readings logged locally on a network of computers. Control of the monitoring and logging process is available throughout the network from user interfaces which communicate via remote procedure calls with server processes running on each node which monitors and records device readings. Similarly, the logged data can be replayed from anywhere on the network. Two major requirements influencing the final design were the need to reduce the load on the CPU of the control machines, and the need for much faster replay of the logged device readings. (author)

  2. A distributed design for monitoring, logging, and replaying device readings at LAMPF

    International Nuclear Information System (INIS)

    Burns, M.

    1991-01-01

    As control of the Los Alamos Meson Physics linear accelerator and Proton Storage Ring moves to a more distributed system, it has been necessary to redesign the software which monitors, logs, and replays device readings throughout the facility. The new design allows devices to be monitored and their readings logged locally on a network of computers. Control of the monitoring and logging process is available throughout the network from user interfaces which communicate via remote procedure calls with server processes running on each node which monitors and records device readings. Similarly, the logged data can be replayed from anywhere on the network. Two major requirements influencing the final design were the need to reduce the load on the CPU of the control machines, and the need for much faster replay of the logged device readings. 1 ref., 2 figs

  3. Server virtualization management of corporate network with hyper-v

    OpenAIRE

    Kovalenko, Taras

    2012-01-01

    In this paper, the main tasks and problems of server virtualization are considered, along with the practical value of virtualization in a corporate network and the advantages and disadvantages of applying server virtualization.

  4. Web server's reliability improvements using recurrent neural networks

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Rǎzvan-Daniel; Felea, Ioan

    2012-01-01

    In this paper we describe an interesting approach to error prediction illustrated by experimental results. The application consists of monitoring the activity for the web servers in order to collect the specific data. Predicting an error with severe consequences for the performance of a server (t...... usage, network usage and memory usage. We collect different data sets from monitoring the web server's activity and for each one we predict the server's reliability with the proposed recurrent neural network. © 2012 Taylor & Francis Group...

  5. The Development of Mobile Server for Language Courses

    OpenAIRE

    Tokumoto, Hiroko; Yoshida, Mitsunobu

    2009-01-01

    The aim of this paper is to introduce the conceptual design of the mobile server software "MY Server" for language teaching, drafted by Tokumoto, and to report how this software is designed and can be adopted effectively for Japanese language teaching. Most current server systems for education require large-scale facilities, including high-spec server machines and professional administrators, which naturally result in big-budget projects that individual teachers or small schools canno...

  6. Distance Based Root Cause Analysis and Change Impact Analysis of Performance Regressions

    Directory of Open Access Journals (Sweden)

    Junzan Zhou

    2015-01-01

    Full Text Available Performance regression testing is applied to uncover both performance and functional problems of software releases. A performance problem revealed by performance testing can be high response time, low throughput, or even being out of service. A mature performance testing process helps systematically detect software performance problems. However, it is difficult to identify the root cause and evaluate the potential change impact. In this paper, we present an approach leveraging server-side logs for identifying root causes of performance problems. First, server-side logs are used to recover the call tree of each business transaction. We define a novel distance-based metric computed from call trees for root cause analysis and apply an inverted index from methods to business transactions for change impact analysis. Empirical studies show that our approach can effectively and efficiently help developers diagnose the root cause of performance problems.
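
The log-to-call-tree step and the distance metric described in this abstract might be sketched as follows. The log format, the flat (depth, method) encoding of a call tree, and the symmetric-difference distance are illustrative assumptions, not the paper's actual definitions:

```python
# Sketch: recover per-transaction call trees from server-side logs and
# compare two trees with a simple distance metric. The log format
# (transaction_id, call_depth, method) is a hypothetical illustration.
from collections import defaultdict

def parse_logs(lines):
    """Group log lines into per-transaction call sequences."""
    trees = defaultdict(list)
    for line in lines:
        txn, depth, method = line.split()
        trees[txn].append((int(depth), method))
    return trees

def tree_distance(a, b):
    """Toy distance: number of (depth, method) calls present in exactly one tree."""
    return len(set(a) ^ set(b))

logs = [
    "txn1 0 handleRequest", "txn1 1 queryDB", "txn1 1 render",
    "txn2 0 handleRequest", "txn2 1 queryDB", "txn2 2 slowJoin", "txn2 1 render",
]
trees = parse_logs(logs)
d = tree_distance(trees["txn1"], trees["txn2"])
print(d)  # → 1  (the slow transaction contains one extra call, slowJoin)
```

A large distance between a slow transaction's tree and the trees of fast transactions of the same type points at the methods responsible, which is the intuition behind distance-based root cause analysis.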

  7. Client/server approach to image capturing

    Science.gov (United States)

    Tuijn, Chris; Stokes, Earle

    1998-01-01

    The diversity of the digital image capturing devices on the market today is quite astonishing and ranges from low-cost CCD scanners to digital cameras (for both action and stand-still scenes), mid-end CCD scanners for desktop publishing and pre-press applications and high-end CCD flatbed scanners and drum-scanners with photo multiplier technology. Each device and market segment has its own specific needs which explains the diversity of the associated scanner applications. What all those applications have in common is the need to communicate with a particular device to import the digital images; after the import, additional image processing might be needed as well as color management operations. Although the specific requirements for all of these applications might differ considerably, a number of image capturing and color management facilities as well as other services are needed which can be shared. In this paper, we propose a client/server architecture for scanning and image editing applications which can be used as a common component for all these applications. One of the principal components of the scan server is the input capturing module. The specification of the input jobs is based on a generic input device model. Through this model we make abstraction of the specific scanner parameters and define the scan job definitions by a number of absolute parameters. As a result, scan job definitions will be less dependent on a particular scanner and have a more universal meaning. In this context, we also elaborate on the interaction of the generic parameters and the color characterization (i.e., the ICC profile). Other topics that are covered are the scheduling and parallel processing capabilities of the server, the image processing facilities, the interaction with the ICC engine, the communication facilities (both in-memory and over the network) and the different client architectures (stand-alone applications, TWAIN servers, plug-ins, OLE or Apple-event driven

  8. Construction of a nuclear data server using TCP/IP

    Energy Technology Data Exchange (ETDEWEB)

    Kawano, Toshihiko; Sakai, Osamu [Kyushu Univ., Fukuoka (Japan)

    1997-03-01

    We construct a nuclear data server which provides data in the evaluated nuclear data library through the network by means of TCP/IP. The client is not necessarily a user but a computer program. Two examples with a prototype server program are demonstrated, the first is data transfer from the server to a user, and the second is to a computer program. (author)
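
A minimal sketch of the idea, a TCP server whose client may be a program rather than a person, can be written with Python sockets. The record keys, payloads, and one-shot request/reply protocol below are invented for illustration:

```python
# Sketch of a minimal TCP data server: a client (user or program) sends a
# key and receives the matching record. Data and protocol are illustrative.
import socket
import threading

DATA = {"U-235": "fission cross section record",
        "Fe-56": "elastic scattering record"}

def serve_once(sock):
    """Accept one connection, answer one lookup, then return."""
    conn, _ = sock.accept()
    with conn:
        key = conn.recv(1024).decode().strip()
        conn.sendall(DATA.get(key, "not found").encode())

def request(port, key):
    """Client side: connect, send a key, return the server's reply."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(key.encode())
        return c.recv(1024).decode()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))          # port 0 = let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=serve_once, args=(srv,), daemon=True).start()
reply = request(port, "U-235")
srv.close()
print(reply)  # → fission cross section record
```

Because the client side is just a function call over a socket, another computer program can consume the data directly, which is the point the abstract emphasizes.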

  9. On-line single server dial-a-ride problems

    NARCIS (Netherlands)

    Feuerstein, E.; Stougie, L.

    1998-01-01

    In this paper results on the dial-a-ride problem with a single server are presented. Requests for rides consist of two points in a metric space, a source and a destination. A ride has to be made by the server from the source to the destination. The server travels at unit speed in the metric space

  10. JAFA: a protein function annotation meta-server

    DEFF Research Database (Denmark)

    Friedberg, Iddo; Harder, Tim; Godzik, Adam

    2006-01-01

    Annotations, or JAFA server. JAFA queries several function prediction servers with a protein sequence and assembles the returned predictions in a legible, non-redundant format. In this manner, JAFA combines the predictions of several servers to provide a comprehensive view of what are the predicted functions...

  11. Logging Activity in the Trinational Amazonian Region of Pando/Bolivia, Acre and Rondônia/Brazil, and Madre de Dios/Peru: Analysis of Existing Data

    Science.gov (United States)

    Mendoza, E.; Brilhante, S. H.; Brown, I.; Peralta, R.; Rivero, S.; Melendez, N.

    2002-12-01

    Logging activity in the trinational southwestern Amazonia will grow in importance as a driver of regional land-use change as expanding road access facilitates both timber extraction and transport to international markets. Official data on current activity in this ~50 million ha region are limited and inconsistent with differences as much as twenty-fold between official estimates; nevertheless, they serve as guides for understanding the relative magnitude of logging activities. For 2000, an estimated 5 million m3 of timber were commercialized in Rondonia, 400,000 m3 in Acre, Brazil, and 200,000 m3 for the combined departments of Pando, Bolivia and Madre de Dios, Peru. About 70% of this timber originates from clear cutting done for pasture and agriculture activities, nearly a third from unregulated selective logging, and only 2% from managed selective logging. Eight timber species are preferentially extracted. The total area for timber concessions in Acre, Pando and Madre de Dios extends to about 4 million ha for a potential timber supply of 65 million m3. About 150,000 m3/yr of illegal timber is confiscated by federal and state agencies in Acre, Pando and Madre de Dios. Problems of enforcement in the region are due principally to the lack of trained personnel and little cooperation among agencies of the three countries. Proposed development plans indicate a 3- to >10-fold increase in logging activity in the Acre and Pando regions during the coming decade. More detailed studies are urgently needed to guide sustainable development of this resource in southwestern Amazonia.

  12. PERFORMANCE MEASUREMENT OF A ROUND-ROBIN SCHEDULER FOR LINUX VIRTUAL SERVER IN A WEB SERVER CASE

    Directory of Open Access Journals (Sweden)

    Royyana Muslim Ijtihadie

    2005-07-01

    Full Text Available With the growing number of internet users and the adoption of the internet in daily life, data traffic on the Internet has increased significantly. Accordingly, the workload of servers providing services on the Internet has also risen considerably, and a server can become overloaded at times. To address this, a server cluster configuration scheme using the load balancing concept is applied. A load-balancing server applies an algorithm to distribute the work; the round-robin algorithm is used in Linux Virtual Server. This study measures the performance of a Linux Virtual Server that uses the round-robin algorithm to schedule the distribution of load across servers. Performance is measured from the side of a client attempting to access the web server: the number of requests completed per second (requests per second), the time to complete a single request, and the resulting throughput. The experiments showed that using LVS improves performance, increasing the number of requests per second
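
The round-robin dispatch policy measured in this study can be sketched in a few lines. The balancer class and server names are illustrative; LVS itself implements this scheduling inside the kernel:

```python
# Sketch of round-robin dispatch as used by a load balancer:
# incoming requests are handed to back-end servers in cyclic order.
from itertools import cycle

class RoundRobinBalancer:
    def __init__(self, servers):
        self._servers = cycle(servers)   # endless cyclic iterator

    def dispatch(self):
        """Return the server that should handle the next request."""
        return next(self._servers)

lb = RoundRobinBalancer(["web1", "web2", "web3"])
assignments = [lb.dispatch() for _ in range(6)]
print(assignments)  # → ['web1', 'web2', 'web3', 'web1', 'web2', 'web3']
```

Round robin ignores how loaded each back end currently is, which is why benchmarks like the one above measure requests per second and per-request latency from the client side to see whether the even rotation actually balances the load.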

  13. A Server-Client-Based Graphical Development Environment for Physics Analyses (VISPA)

    International Nuclear Information System (INIS)

    Bretz, H-P; Erdmann, M; Fischer, R; Hinzmann, A; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steffens, J; Steggemann, J; Urban, M; Winchen, T

    2012-01-01

    The Visual Physics Analysis (VISPA) project provides a graphical development environment for data analysis. It addresses the typical development cycle of (re-)designing, executing, and verifying an analysis. We present the new server-client-based web application of the VISPA project to perform physics analyses via a standard internet browser. This enables individual scientists to work with a large variety of devices including touch screens, and teams of scientists to share, develop, and execute analyses on a server via the web interface.

  14. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    Science.gov (United States)

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-04-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations, and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.

  16. Energy Servers Deliver Clean, Affordable Power

    Science.gov (United States)

    2010-01-01

    K.R. Sridhar developed a fuel cell device for Ames Research Center that could use solar power to split water into oxygen for breathing and hydrogen for fuel on Mars. Sridhar saw the potential of the technology, when reversed, to create clean energy on Earth. He founded Bloom Energy, of Sunnyvale, California, to advance the technology. Today, the Bloom Energy Server is providing cost-effective, environmentally friendly energy to a host of companies such as eBay, Google, and The Coca-Cola Company. Bloom's NASA-derived Energy Servers generate energy that is about 67 percent cleaner than a typical coal-fired power plant when using fossil fuels and 100 percent cleaner with renewable fuels.

  17. Extracting useful knowledge from event logs

    DEFF Research Database (Denmark)

    Djenouri, Youcef; Belhadi, Asma; Fournier-Viger, Philippe

    2018-01-01

    Business process analysis is a key activity that aims at increasing the efficiency of business operations. In recent years, several data mining based methods have been designed for discovering interesting patterns in event logs. A popular type of method consists of applying frequent itemset mining...

  18. Reporting with Microsoft SQL Server 2012

    CERN Document Server

    Serra, James

    2014-01-01

    This is a step-by-step tutorial that deals with the Microsoft SQL Server 2012 reporting tools: SSRS and Power View. If you are a BI developer, consultant, or architect who wishes to learn how to use SSRS and Power View, and want to understand the best use for each tool, then this book will get you up and running quickly. No prior experience is required with either tool!

  19. Descriptors of server capabilities in China

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi; Slepniov, Dmitrij; Wæhrens, Brian Vejrum

    China, with the huge market potential it possesses, is an important issue for subsidiaries of western multinational companies. The objective of this paper is therefore to strengthen researchers' and practitioners' perspectives on what the descriptors of server capabilities are. These descriptors are relevant for determining subsidiary roles and as an indication of the capabilities required. They are identified through an extensive literature review and validated by case studies of two Danish multinational companies' subsidiaries operating in China, which provided the empirical basis...

  20. Instant Debian build a web server

    CERN Document Server

    Parrella, Jose Miguel

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks, this is a concise guide full of step-by-step recipes that teach you how to install and configure a Debian web server. It is an ideal book if you are an administrator on a Development Operations team or in infrastructure management who is passionate about Linux and web applications but has no previous experience with Debian or APT-based systems.

  1. SQL Server 2012 reporting services blueprints

    CERN Document Server

    Ribunal, Marlon

    2013-01-01

    Follow the fictional John Kirkland through a series of real-world reporting challenges based on actual business conditions. Use his detailed blueprints to develop your own reports for every requirement.This book is for report developers, data analysts, and database administrators struggling to master the complex world of effective reporting in SQL Server 2012. Knowledge of how data sources and data sets work will greatly help readers to speed through the tutorials.

  2. TwiddleNet: Smartphones as Personal Servers

    OpenAIRE

    Gurminder, Singh; Center for the Study of Mobile Devices and Communications

    2012-01-01

    TwiddleNet uses smartphones as personal servers to enable instant content capture and dissemination for first responders. It supports the information sharing needs of first responders in the early stages of an emergency response operation. In TwiddleNet, content, once captured, is automatically tagged and disseminated using one of the several networking channels available in smartphones. TwiddleNet pays special attention to minimizing the equipment, network set-up time, and content...

  3. Preprint server seeks way to halt plagiarists

    CERN Multimedia

    Giles, J

    2003-01-01

    "An unusual case of plagiarism has struck ArXiv, the popular physics preprint server at Cornell University in Ithaca, New York, resulting in the withdrawal of 22 papers...The plagiarism case traces its origins to June 2002, when Yasushi Watanabe, a high-energy physicist at the Tokyo Institute of Technology, was contacted by Ramy Noboulsi, who said he was a mathematical physicist" (1 page)

  4. Metastability of Queuing Networks with Mobile Servers

    Science.gov (United States)

    Baccelli, F.; Rybko, A.; Shlosman, S.; Vladimirov, A.

    2018-04-01

    We study symmetric queuing networks with moving servers and FIFO service discipline. The mean-field limit dynamics demonstrates unexpected behavior which we attribute to the metastability phenomenon. Large enough finite symmetric networks on regular graphs are proved to be transient for arbitrarily small inflow rates. However, the limiting non-linear Markov process possesses at least two stationary solutions. The proof of transience is based on martingale techniques.

  5. Smartphone log data in a qualitative perspective

    DEFF Research Database (Denmark)

    Ørmen, Jacob; Thorhauge, Anne Mette

    2015-01-01

    Log data from smartphones have primarily been used in large-scale research designs to draw statistical inferences from hundreds or even thousands of participants. In this article, we argue that more qualitatively oriented designs can also benefit greatly from integrating these rich data sources into studies of smartphones in everyday life. Through an illustrative study, we explore a more nuanced perspective on what can be considered “log data” and how these types of data can be collected and analysed. A qualitative approach to log data analysis offers researchers new opportunities to situate... Log data can serve as cues to instigate discussion and reflection as well as act as resources for contextualizing and organizing related empirical material. In the discussion, the advantages of a qualitative perspective for research designs are assessed in relation to issues of validity. Further perspectives...

  6. Securing SQL Server Protecting Your Database from Attackers

    CERN Document Server

    Cherry, Denny

    2012-01-01

    Written by Denny Cherry, a Microsoft MVP for the SQL Server product, a Microsoft Certified Master for SQL Server 2008, and one of the biggest names in SQL Server today, Securing SQL Server, Second Edition explores the potential attack vectors someone can use to break into your SQL Server database as well as how to protect your database from these attacks. In this book, you will learn how to properly secure your database from both internal and external threats using best practices and specific tricks the author uses in his role as an independent consultant while working on some of the largest

  7. Experience with Server Self Service Center (S3C)

    CERN Multimedia

    Sucik, J

    2009-01-01

    CERN has a successful experience with running Server Self Service Center (S3C) for virtual server provisioning which is based on Microsoft® Virtual Server 2005. With the introduction of Windows Server 2008 and its built-in hypervisor based virtualization (Hyper-V) there are new possibilities for the expansion of the current service. This paper describes the architecture of the redesigned virtual Server Self Service based on Hyper-V which provides dynamically scalable virtualized resources on demand as needed and outlines the possible implications on the future use of virtual machines at CERN.

  8. Experience with Server Self Service Center (S3C)

    International Nuclear Information System (INIS)

    Sucik, Juraj; Bukowiec, Sebastian

    2010-01-01

    CERN has a successful experience with running Server Self Service Center (S3C) for virtual server provisioning which is based on Microsoft (registered) Virtual Server 2005. With the introduction of Windows Server 2008 and its built-in hypervisor based virtualization (Hyper-V) there are new possibilities for the expansion of the current service. This paper describes the architecture of the redesigned virtual Server Self Service based on Hyper-V which provides dynamically scalable virtualized resources on demand as needed and outlines the possible implications on the future use of virtual machines at CERN.

  9. Securing SQL server protecting your database from attackers

    CERN Document Server

    Cherry, Denny

    2015-01-01

    SQL server is the most widely-used database platform in the world, and a large percentage of these databases are not properly secured, exposing sensitive customer and business data to attack. In Securing SQL Server, Third Edition, you will learn about the potential attack vectors that can be used to break into SQL server databases as well as how to protect databases from these attacks. In this book, Denny Cherry - a Microsoft SQL MVP and one of the biggest names in SQL server - will teach you how to properly secure an SQL server database from internal and external threats using best practic

  10. SNG-logs at Skjern

    DEFF Research Database (Denmark)

    Korsbech, Uffe C C; Petersen, Jesper; Aage, Helle Karina

    1998-01-01

    Spectral Natural Gamma-ray logs have been run in two water supply borings at Skjern. The log data have been examined by a new technique - Noise Adjusted Singular Value Decomposition - in order to get a detailed and reliable picture of the distribution of uranium and thorium gamma-rays from heavy...
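Noise-Adjusted Singular Value Decomposition, the technique named above, is commonly published as a noise-equalizing rescaling followed by a truncated SVD. The numpy sketch below is a generic illustration of that idea, not the authors' implementation; the scaling by the mean spectrum is one common convention:

```python
import numpy as np

def nasvd_denoise(spectra, n_components=3):
    """Noise-Adjusted SVD smoothing of gamma-ray spectra.

    spectra: (n_depths, n_channels) array of channel counts.
    Poisson counting noise is approximately equalized by scaling each
    channel by the mean spectrum before the SVD (a common NASVD choice),
    then the spectra are rebuilt from the leading components only.
    """
    spectra = np.asarray(spectra, dtype=float)
    mean_spec = spectra.mean(axis=0)
    mean_spec[mean_spec == 0] = 1.0              # avoid division by zero
    scaled = spectra / np.sqrt(mean_spec)        # noise-adjust
    U, s, Vt = np.linalg.svd(scaled, full_matrices=False)
    s[n_components:] = 0.0                       # keep leading components
    return (U * s) @ Vt * np.sqrt(mean_spec)     # undo the scaling
```

Keeping only a few components suppresses channel-to-channel statistical noise while preserving the spectral shapes that carry the uranium and thorium signatures.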

  11. 4DGeoBrowser: A Web-Based Data Browser and Server for Accessing and Analyzing Multi-Disciplinary Data

    National Research Council Canada - National Science Library

    Lerner, Steven

    2001-01-01

    .... Once the information is loaded onto a Geobrowser server, the investigator-user is able to log in to the website and use a set of data access and analysis tools to search, plot, and display this information...

  12. SciServer: An Online Collaborative Environment for Big Data in Research and Education

    Science.gov (United States)

    Raddick, Jordan; Souter, Barbara; Lemson, Gerard; Taghizadeh-Popp, Manuchehr

    2017-01-01

    For the past year, SciServer Compute (http://compute.sciserver.org) has offered access to big data resources running within server-side Docker containers. Compute has allowed thousands of researchers to bring advanced analysis to big datasets like the Sloan Digital Sky Survey and others, while keeping the analysis close to the data for better performance and easier read/write access. SciServer Compute is just one part of the SciServer system being developed at Johns Hopkins University, which provides an easy-to-use collaborative research environment for astronomy and many other sciences.SciServer enables these collaborative research strategies using Jupyter notebooks, in which users can write their own Python and R scripts and execute them on the same server as the data. We have written special-purpose libraries for querying, reading, and writing data. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files.SciServer Compute’s virtual research environment has grown with the addition of task management and access control functions, allowing collaborators to share both data and analysis scripts securely across the world. These features also open up new possibilities for education, allowing instructors to share datasets with students and students to write analysis scripts to share with their instructors. We are leveraging these features into a new system called “SciServer Courseware,” which will allow instructors to share assignments with their students, allowing students to engage with big data in new ways.SciServer has also expanded to include more datasets beyond the Sloan Digital Sky Survey. A part of that growth has been the addition of the SkyQuery component, which allows for simple, fast

  13. A comparison of dried shiitake mushroom in log cultivation and mycelial cultivation from different geographical origins using stable carbon and nitrogen isotope analysis

    International Nuclear Information System (INIS)

    Suzuki, Yaeko; Nakashita, Rumiko; Ishikawa, Noemia Kazue; Tabuchi, Akiko; Sakuno, Emi; Tokimoto, Keisuke

    2015-01-01

    We determined the carbon and nitrogen isotopic compositions (δ13C and δ15N) of dried shiitake mushroom (Lentinula edodes) samples from Japan, China, South Korea and Brazil in order to discriminate their geographical origins. In log cultivation, the δ13C values of Japanese dried shiitake samples were lower than those of Chinese samples, depending on the δ13C values of the log and the growth conditions. In mycelial cultivation, the δ13C and δ15N values of Japanese dried shiitake samples were higher than those of Chinese samples. Using the δ13C and δ15N values, 87.4% of Japanese dried shiitake samples (n = 95) and 87.9% of Chinese dried shiitake samples (n = 66) in log cultivation, and 90.0% of Japanese dried shiitake samples (n = 50) and 93.9% of Chinese dried shiitake samples (n = 114) in mycelial cultivation, were correctly classified according to the production site. These results suggest that the δ13C and δ15N values are potentially useful for tracing the geographical origin of dried shiitake samples. (author)
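The abstract does not state which classifier produced the percentages above; a minimal nearest-centroid sketch on two isotope features shows the kind of origin discrimination involved. All numeric values below are synthetic placeholders, not the paper's data:

```python
# Nearest-centroid origin classifier on (d13C, d15N) pairs.
# Training values are synthetic placeholders, NOT the paper's measurements.
train = {
    "Japan": [(-26.0, 1.5), (-25.5, 1.8), (-26.2, 1.2)],
    "China": [(-24.0, 0.2), (-24.5, 0.5), (-23.8, 0.0)],
}

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

centroids = {origin: centroid(pts) for origin, pts in train.items()}

def classify(sample):
    """Assign a (d13C, d15N) sample to the nearest origin centroid."""
    return min(centroids, key=lambda o: (sample[0] - centroids[o][0]) ** 2
                                        + (sample[1] - centroids[o][1]) ** 2)
```

Classification accuracy in such a scheme depends on how well the origin groups separate in the (δ13C, δ15N) plane, which is exactly what the reported 87-94% rates quantify.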

  14. Working with Data: Discovering Knowledge through Mining and Analysis; Systematic Knowledge Management and Knowledge Discovery; Text Mining; Methodological Approach in Discovering User Search Patterns through Web Log Analysis; Knowledge Discovery in Databases Using Formal Concept Analysis; Knowledge Discovery with a Little Perspective.

    Science.gov (United States)

    Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.

    2000-01-01

    These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)

  15. Development of an Intelligent System to Synthesize Petrophysical Well Logs

    Directory of Open Access Journals (Sweden)

    Morteza Nouri Taleghani

    2013-07-01

    Full Text Available Porosity is one of the fundamental petrophysical properties that should be evaluated for hydrocarbon bearing reservoirs. It is a vital factor in precise understanding of reservoir quality in a hydrocarbon field. Log data are exceedingly crucial information in petroleum industries, for many hydrocarbon parameters are obtained by virtue of petrophysical data. There are three main petrophysical logging tools for the determination of porosity, namely neutron, density, and sonic well logs. Porosity can be determined by the use of each of these tools; however, a precise analysis requires a complete set of these tools. Log sets are commonly either incomplete or unreliable for many reasons (i.e. incomplete logging, measurement errors, and loss of data owing to unsuitable data storage). To overcome this drawback, in this study several intelligent systems such as fuzzy logic (FL), neural network (NN), and support vector machine (SVM) are used to predict synthesized petrophysical logs including neutron, density, and sonic. To accomplish this, the petrophysical well log data were collected from a real reservoir in one of Iran's southwest oil fields. The corresponding correlation was obtained through the comparison of synthesized log values with real log values. The results showed that all intelligent systems were capable of synthesizing petrophysical well logs, but SVM had better accuracy and could be used as the most reliable method compared to the other techniques.
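As a deliberately simplified stand-in for the FL/NN/SVM estimators in this record, an ordinary least-squares fit on synthetic data illustrates the core idea of synthesizing one log from the others (here: sonic from neutron and density; the coefficients and data are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in logs: neutron porosity (frac), bulk density (g/cc),
# and a sonic log generated from them plus noise. Not real well data.
n = 200
neutron = rng.uniform(0.05, 0.35, n)
density = rng.uniform(2.2, 2.7, n)
sonic = 180.0 * neutron - 40.0 * (density - 2.0) + 60.0 + rng.normal(0, 1, n)

# Least-squares fit: sonic ~ a*neutron + b*density + c
X = np.column_stack([neutron, density, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, sonic, rcond=None)
predicted = X @ coef   # the "synthesized" sonic log
```

The intelligent systems in the paper replace this linear map with nonlinear learned functions, but the workflow (train on wells with complete log suites, then predict the missing curve) is the same.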

  16. Intelligent approaches for the synthesis of petrophysical logs

    International Nuclear Information System (INIS)

    Rezaee, M Reza; Kadkhodaie-Ilkhchi, Ali; Alizadeh, Pooya Mohammad

    2008-01-01

    Log data are of prime importance in acquiring petrophysical data from hydrocarbon reservoirs. Reliable log analysis in a hydrocarbon reservoir requires a complete set of logs. For many reasons, such as incomplete logging in old wells, destruction of logs due to inappropriate data storage and measurement errors due to problems with logging apparatus or hole conditions, log suites are either incomplete or unreliable. In this study, fuzzy logic and artificial neural networks were used as intelligent tools to synthesize petrophysical logs including neutron, density, sonic and deep resistivity. The petrophysical data from two wells were used for constructing intelligent models in the Fahlian limestone reservoir, Southern Iran. A third well from the field was used to evaluate the reliability of the models. The results showed that fuzzy logic and artificial neural networks were successful in synthesizing wireline logs. The combination of the results obtained from fuzzy logic and neural networks in a simple averaging committee machine (CM) showed a significant improvement in the accuracy of the estimations. This committee machine performed better than fuzzy logic or the neural network model in the problem of estimating petrophysical properties from well logs

  17. CovalentDock Cloud: a web server for automated covalent docking.

    Science.gov (United States)

    Ouyang, Xuchang; Zhou, Shuo; Ge, Zemei; Li, Runtao; Kwoh, Chee Keong

    2013-07-01

    Covalent binding is an important mechanism for many drugs to gain their function. We developed a computational algorithm to model this chemical event and extended it to a web server, the CovalentDock Cloud, to make it accessible directly online without any local installation or configuration. It provides a simple yet user-friendly web interface to perform covalent docking experiments and analysis online. The web server accepts the structures of both the ligand and the receptor uploaded by the user or retrieved from online databases with a valid access id. It identifies the potential covalent binding patterns, carries out the covalent docking experiments and provides visualization of the result for user analysis. This web server is free and open to all users at http://docking.sce.ntu.edu.sg/.

  18. CheD: chemical database compilation tool, Internet server, and client for SQL servers.

    Science.gov (United States)

    Trepalin, S V; Yarkov, A V

    2001-01-01

    An efficient program, which runs on a personal computer, for the storage, retrieval, and processing of chemical information is presented. The program can work either as a stand-alone application or in conjunction with a specifically written Web server application or with some standard SQL servers, e.g., Oracle, Interbase, and MS SQL. New types of data fields are introduced, e.g., arrays for spectral information storage, HTML and database links, and user-defined functions. CheD has an open architecture; thus, custom data types, controls, and services may be added. A WWW server application for chemical data retrieval features an easy and user-friendly installation on Windows NT or 95 platforms.

  19. Effect of Temporal Relationships in Associative Rule Mining for Web Log Data

    Science.gov (United States)

    Mohd Khairudin, Nazli; Mustapha, Aida

    2014-01-01

    The advent of web-based applications and services has created such diverse and voluminous web log data stored in web servers, proxy servers, client machines, or organizational databases. This paper attempts to investigate the effect of temporal attribute in relational rule mining for web log data. We incorporated the characteristics of time in the rule mining process and analysed the effect of various temporal parameters. The rules generated from temporal relational rule mining are then compared against the rules generated from the classical rule mining approach such as the Apriori and FP-Growth algorithms. The results showed that by incorporating the temporal attribute via time, the number of rules generated is subsequently smaller but is comparable in terms of quality. PMID:24587757
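The effect described above (a temporal constraint yields fewer but comparable rules) can be demonstrated with a tiny pure-Python frequent-pair miner. The sessions, page names, and time gap below are hypothetical, and this counts only pairs rather than running full Apriori or FP-Growth:

```python
from itertools import combinations
from collections import Counter

# Hypothetical web-log sessions: (page, timestamp-in-seconds) per visit.
sessions = [
    [("home", 0), ("search", 30), ("cart", 600)],
    [("home", 0), ("search", 20), ("help", 45)],
    [("home", 0), ("search", 10), ("cart", 25)],
]

def frequent_pairs(sessions, min_support=2, max_gap=None):
    """Count page pairs co-occurring in a session; optionally require the
    two visits to be at most max_gap seconds apart (temporal constraint)."""
    counts = Counter()
    for s in sessions:
        seen = set()
        for (a, ta), (b, tb) in combinations(s, 2):
            if max_gap is not None and abs(tb - ta) > max_gap:
                continue
            seen.add(tuple(sorted((a, b))))
        counts.update(seen)
    return {p: c for p, c in counts.items() if c >= min_support}

classic = frequent_pairs(sessions)                 # no temporal constraint
temporal = frequent_pairs(sessions, max_gap=60)    # time-aware mining
```

On this toy data the temporal run drops the home-cart and search-cart pairs (the cart visit in the first session is too far away in time), mirroring the paper's finding that the temporal attribute shrinks the rule set.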

  20. Geomicrobial Optical Logging Detectors (GOLD)

    Science.gov (United States)

    Bramall, N. E.; Stoker, C. R.; Price, P. B.; Coates, J. D.; Allamandola, L. J.; Mattioda, A. L.

    2008-12-01

    We will present concepts for downhole instrumentation that could be used in the Deep Underground Science and Engineering Laboratory (DUSEL). We envision optical borehole-logging instruments that could monitor bacterial concentration, mineralogy, aromatic organics, temperature, and oxygen concentration, allowing for the in situ monitoring of time-dependent microbial and short-scale geologic processes and providing valuable in situ data on stratigraphy to supplement core analyses, especially where instances of missing or damaged core sections make such studies difficult. Incorporated into these instruments will be a sampling/inoculation tool to allow for the recovery and/or manipulation of particularly interesting sections of the borehole wall for further study, enabling a series of microbiological studies. The borehole tools we will develop revolve around key emerging technologies and methods, some of which are briefly described below: 1) Autofluorescence Spectroscopy: Building on past instruments, we will develop a new borehole logger that searches for microbial life and organics using fluorescence spectroscopy. Many important organic compounds (e.g. PAHs) and biomolecules (e.g. aromatic amino acids, proteins, methanogenic coenzymes) fluoresce when excited with ultraviolet and visible light. Through the careful selection of excitation wavelength(s) and temporal gating parameters, a borehole logging instrument can detect and differentiate between these different compounds and the mineral matrix in which they exist. 2) Raman Spectroscopy: Though less sensitive than fluorescence spectroscopy, Raman spectroscopy is more definitive: it can provide important mineral phase distribution/proportions and other chemical data, enabling studies of mineralogy and microbe-mineral interactions (when combined with fluorescence). 3) Borehole Camera: Imaging of the borehole wall with extended information in the UV, visible, and NIR for a more informative view can provide a lot of insight

  1. Engineering aspects of radiometric logging

    International Nuclear Information System (INIS)

    Huppert, P.

    1982-01-01

    Engineering problems encountered in the development of nuclear borehole logging techniques are discussed. Spectrometric techniques require electronic stability of the equipment. In addition the electronics must be capable of handling high count rates of randomly distributed pulses of fast rise time from the detector and the systems must be designed so that precise calibration is possible under field operating conditions. Components of a logging system are discussed in detail. They include the logging probe (electronics, detector, high voltage supply, preamplifier), electronic instrumentation for data collection and processing and auxiliary equipment

  2. Log-balanced combinatorial sequences

    Directory of Open Access Journals (Sweden)

    Tomislav Došlic

    2005-01-01

    Full Text Available We consider log-convex sequences that satisfy an additional constraint imposed on their rate of growth. We call such sequences log-balanced. It is shown that all such sequences satisfy a pair of double inequalities. Sufficient conditions for log-balancedness are given for the case when the sequence satisfies a two- (or more- term linear recurrence. It is shown that many combinatorially interesting sequences belong to this class, and, as a consequence, that the above-mentioned double inequalities are valid for all of them.
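The two defining conditions can be stated precisely. The following formulation (log-convexity plus log-concavity of the scaled sequence a_n/n!) follows the standard definition used in Došlić's work on log-balanced sequences and is included here for reference:

```latex
A positive sequence $(a_n)_{n \ge 0}$ is \emph{log-convex} if
\[
  a_n^2 \le a_{n-1}\,a_{n+1}, \qquad n \ge 1,
\]
and \emph{log-balanced} if, in addition, the sequence $(a_n/n!)$ is
log-concave, i.e.
\[
  (n+1)\,a_n^2 \ge n\,a_{n-1}\,a_{n+1}, \qquad n \ge 1.
\]
```

The second inequality is exactly the growth constraint alluded to in the abstract: it bounds how fast the quotients a_{n+1}/a_n of a log-convex sequence may increase.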

  3. An Efficient Algorithm for Server Thermal Fault Diagnosis Based on Infrared Image

    Science.gov (United States)

    Liu, Hang; Xie, Ting; Ran, Jian; Gao, Shan

    2017-10-01

    It is essential for a data center to maintain server security and stability. Long-time overload operation or high room temperature may cause service disruption or even a server crash, which would result in great economic loss for the business. Currently, the methods to avoid server outages are monitoring and forecasting. A thermal camera can provide fine texture information for monitoring and intelligent thermal management in a large data center. This paper presents an efficient method for server thermal fault monitoring and diagnosis based on infrared images. Initially, the thermal distribution of the server is standardized and the regions of interest in the image are segmented manually. Then the texture feature, Hu moments feature, and a modified entropy feature are extracted from the segmented regions. These characteristics are applied to analyze and classify thermal faults, and then to make efficient energy-saving thermal management decisions such as job migration. For the larger feature space, principal component analysis is employed to reduce the feature dimensions and guarantee high processing speed without losing the fault feature information. Finally, different feature vectors are taken as input for SVM training, and the thermal fault diagnosis is performed with the optimized SVM classifier. This method supports suggestions for optimizing data center management; it can improve air conditioning efficiency and reduce the energy consumption of the data center. The experimental results show that the maximum detection accuracy is 81.5%.
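The dimensionality-reduction step in this pipeline can be sketched with plain numpy: PCA via the covariance eigendecomposition, followed by a nearest-centroid stand-in for the paper's SVM classifier. The feature vectors are synthetic, and the classifier is simplified for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in feature vectors (e.g. texture + Hu-moment + entropy
# features per thermal-image region); two classes: normal vs. overheated.
normal = rng.normal(0.0, 1.0, size=(40, 12))
faulty = rng.normal(3.0, 1.0, size=(40, 12))
X = np.vstack([normal, faulty])
y = np.array([0] * 40 + [1] * 40)

# PCA via eigendecomposition of the covariance matrix, keeping 3 components.
Xc = X - X.mean(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
components = eigvec[:, ::-1][:, :3]      # top-3 principal directions
Z = Xc @ components                      # reduced feature vectors

# Nearest-centroid stand-in for the paper's SVM classifier.
c0, c1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
pred = np.where(np.linalg.norm(Z - c0, axis=1) <
                np.linalg.norm(Z - c1, axis=1), 0, 1)
accuracy = (pred == y).mean()
```

Reducing 12 features to 3 principal components keeps the between-class separation while cutting the classifier's input size, which is the speed/accuracy trade-off the paper exploits.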

  4. Energy Efficiency in Small Server Rooms: Field Surveys and Findings

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, Iris [Hoi; Greenberg, Steve; Mahdavi, Roozbeh; Brown, Richard; Tschudi, William

    2014-08-11

    Fifty-seven percent of US servers are housed in server closets, server rooms, and localized data centers, in what are commonly referred to as small server rooms, which comprise 99 percent of all server spaces in the US. While many mid-tier and enterprise-class data centers are owned by large corporations that consider energy efficiency a goal to minimize business operating costs, small server rooms typically are not similarly motivated. They are characterized by decentralized ownership and management and come in many configurations, which creates a unique set of efficiency challenges. To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions and selected four of them for detailed assessments. The four rooms had Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy saving opportunities ranged from no- to low-cost measures such as raising cooling set points and better airflow management, to more involved but cost-effective measures including server consolidation and virtualization, and dedicated cooling with economizers. We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers, rather than server rooms. Backup power requirements, and IT and cooling efficiency, should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation, and in the implementation of energy efficiency measures in small server rooms.
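The PUE metric cited above is the ratio of total facility power to IT equipment power. A minimal sketch, with illustrative (not surveyed) wattages:

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power over IT power.
    PUE 2.0 means every watt of IT load needs another watt of overhead
    (cooling, power distribution, lighting)."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A hypothetical server room drawing 30 kW overall for 20 kW of IT load
# sits at the low (better) end of the surveyed 1.5-2.1 range:
example = pue(30.0, 20.0)   # -> 1.5
```

Lowering cooling overhead (raising set points, airflow management, economizers) shrinks the numerator while IT load stays fixed, which is why those measures move PUE toward 1.0.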

  5. Web server for priority ordered multimedia services

    Science.gov (United States)

    Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund

    2001-10-01

    In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions for CM services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write, and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for improved disk access and higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority ordered buffering of the retrieved Web pages and CM data streams that are fed into an autoregressive moving average (ARMA)-based traffic shaping circuitry before being transmitted through the network.
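The priority ordering listed in this abstract maps naturally onto a priority queue. The sketch below uses Python's heapq with shorthand class names of my own choosing; a sequence number breaks ties so that requests within one class stay FIFO:

```python
import heapq

# Priority order from the abstract, highest first (lower number = first).
# The short class names here are illustrative shorthand, not the paper's.
PRIORITY = {
    "admin_rw": 0, "hot_cm_multicast": 1, "cm_read": 2,
    "web_read": 3, "cm_write": 4, "web_write": 5,
}

def schedule(requests):
    """Serve requests in priority order, FIFO within the same class."""
    heap = [(PRIORITY[kind], seq, kind) for seq, kind in enumerate(requests)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

order = schedule(["web_write", "cm_read", "admin_rw", "web_read", "cm_read"])
```

A real server would also need aging or weighted sharing so low-priority web writes are not starved under sustained CM load; this sketch shows only the ordering itself.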

  6. Expitope: a web server for epitope expression.

    Science.gov (United States)

    Haase, Kerstin; Raffegerst, Silke; Schendel, Dolores J; Frishman, Dmitrij

    2015-06-01

Adoptive T cell therapies based on the introduction of new T cell receptors (TCRs) into recipient patient T cells are a promising new treatment for various kinds of cancers. A major challenge, however, is the choice of target antigens. If an engineered TCR can cross-react with self-antigens in healthy tissue, the side effects can be devastating. We present the first web server for assessing epitope sharing when designing new potential lead targets. We enable users to find all known proteins containing their peptide of interest. The web server returns not only exact matches, but also approximate ones, allowing a number of mismatches of the user's choice. For the identified candidate proteins, the expression values in various healthy tissues, representing all vital human organs, are extracted from RNA sequencing (RNA-Seq) data, as well as from some cancer tissues as control. All results are returned to the user sorted by a score, which is calculated using well-established methods and tools for immunological predictions. It depends on the probability that the epitope is created by proteasomal cleavage and on its affinities to the transporter associated with antigen processing and the major histocompatibility complex class I alleles. With this framework, we hope to provide a helpful tool to exclude potential cross-reactivity in the early stage of TCR selection for use in the design of adoptive T cell immunotherapy. The Expitope web server can be accessed via http://webclu.bio.wzw.tum.de/expitope. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. The HMMER Web Server for Protein Sequence Similarity Search.

    Science.gov (United States)

    Prakash, Ananth; Jeffryes, Matt; Bateman, Alex; Finn, Robert D

    2017-12-08

Protein sequence similarity search is one of the most commonly used bioinformatics methods for identifying evolutionarily related proteins. In general, sequences that are evolutionarily related share some degree of similarity, and sequence-search algorithms use this principle to identify homologs. The requirement for a fast and sensitive sequence search method led to the development of the HMMER software, which in the latest version (v3.1) uses a combination of sophisticated acceleration heuristics and mathematical and computational optimizations to enable the use of profile hidden Markov models (HMMs) for sequence analysis. The HMMER Web server provides a common platform by linking the HMMER algorithms to databases, thereby enabling the search for homologs, as well as providing sequence and functional annotation by linking to external databases. This unit describes three basic protocols and two alternate protocols that explain how to use the HMMER Web server with various input formats and user-defined parameters. © 2017 by John Wiley & Sons, Inc.

  8. Implementing VMware vCenter Server

    CERN Document Server

    Kuminsky, Konstantin

    2013-01-01

    This book is a practical, hands-on guide that will help you learn everything you need to know to administer your environment with VMware vCenter Server. Throughout the book, there are best practices and useful tips and tricks which can be used for day-to-day tasks.If you are an administrator or a technician starting with VMware, with little or no knowledge of virtualization products, this book is ideal for you. Even if you are an IT professional looking to expand your existing environment, you will be able to use this book to help you improve the management of these environments. IT managers w

  9. Getting started with Microsoft Lync server 2013

    CERN Document Server

    Volpe, Fabrizio

    2013-01-01

    This book has a practical approach with a lot of step-by-step guides and explanations as to where and why we're doing the various operations.Getting Started with Microsoft Lync Server 2013 is a starting point for system administrators, IT pros, unified communication technicians, and decision makers in companies or in the consultancy business. For people who have never managed Lync (or a U.C. product), the book will guide you through the basic concepts and mistakes. If you are already managing a Lync deployment you will find important explanations and ideas put together in a single text. If you

  10. Map server of Slovak Environmental Agency

    International Nuclear Information System (INIS)

    Koska, M.

    2005-01-01

The Slovak Environmental Agency (SAZP) is a professional organization of the Ministry of Environment of the Slovak Republic. In the area of informatics, SAZP is responsible for operating the information system on the environment in the Slovak Republic (ISE). The main goal of the ISE is the collection, evaluation, and sharing of relevant environmental information among state administration bodies, public administration, the public, scientific institutes, etc. SAZP uses technology for publishing geospatial data, so-called WEB maps (dynamic mapping), in which maps are formed online. The technological part of the information system is an Internet map server

  11. HS06 Benchmark for an ARM Server

    Science.gov (United States)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  12. Professional Microsoft SQL Server 2012 Reporting Services

    CERN Document Server

    Turley, Paul; Silva, Thiago; Withee, Ken; Paisley, Grant

    2012-01-01

    A must-have guide for the latest updates to the new release of Reporting Services SQL Server Reporting Services allows you to create reports and business intelligence (BI) solutions. With this updated resource, a team of experts shows you how Reporting Services makes reporting faster, easier and more powerful than ever in web, desktop, and portal solutions. New coverage discusses the new reporting tool called Crescent, BI semantic model's impact on report design and creation, semantic model design, and more. You'll explore the major enhancements to Report Builder and benefit from best practice

  13. HS06 benchmark for an ARM server

    International Nuclear Information System (INIS)

    Kluth, Stefan

    2014-01-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  14. Publication Life Cycle at CERN Document Server

    CERN Multimedia

    Witowski, Sebastian; Costa, Flavio; Gabancho, Esteban; Marian, Ludmila; Tzovanakis, Harris

    2017-01-01

    This presentation guides listeners through all the stages of publication life cycle at CERN Document Server, from the ingestion using one of the various tools, through curation and processing, until the data is ready to be exported to other systems. It describes different tools that we are using to curate the incoming publications as well as to further improve the existing data on CDS. The second part of the talk goes through various challenges we have faced in the past and how we are going to overcome them in the new version of CDS.

  15. VT Route Log Points 2017

    Data.gov (United States)

    Vermont Center for Geographic Information — This data layer is used with VTrans' Integrated Route Log System (IRA). It is also used to calibrate the linear referencing systems, including the End-to-End and...

  16. New materials for fireplace logs

    Science.gov (United States)

    Kieselback, D. J.; Smock, A. W.

    1971-01-01

Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. Insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as aggregate.

  17. 3USS: a web server for detecting alternative 3'UTRs from RNA-seq experiments.

    KAUST Repository

    Le Pera, Loredana; Mazzapioda, Mariagiovanna; Tramontano, Anna

    2015-01-01

Protein-coding genes with multiple alternative polyadenylation sites can generate mRNA 3'UTR sequences of different lengths, thereby causing the loss or gain of regulatory elements, which can affect stability, localization and translation efficiency. 3USS is a web server developed with the aim of giving experimentalists the possibility to automatically identify alternative 3'UTRs (shorter or longer with respect to a reference transcriptome), an option that is not available in standard RNA-seq data analysis procedures. The tool reports as putative novel the 3'UTRs not annotated in available databases. Furthermore, if data from two related samples are uploaded, common and specific alternative 3'UTRs are identified and reported by the server. 3USS is freely available at http://www.biocomputing.it/3uss_server.

  18. Data decomposition of Monte Carlo particle transport simulations via tally servers

    International Nuclear Information System (INIS)

    Romano, Paul K.; Siegel, Andrew R.; Forget, Benoit; Smith, Kord

    2013-01-01

    An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations
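The tracking-processor/tally-server split described above can be illustrated with a toy sketch. This is our own illustration, not OpenMC's actual implementation: MPI message passing is replaced with plain Python lists, and the particle "physics" is reduced to random scoring into tally bins.

```python
import random
from collections import defaultdict

def track_particles(n_particles, n_bins, seed):
    """A tracking rank: simulate particle histories, emit (bin, score) messages."""
    rng = random.Random(seed)
    messages = []
    for _ in range(n_particles):
        # each simulated 'collision' scores into some spatial tally bin
        for _ in range(rng.randint(1, 5)):
            messages.append((rng.randrange(n_bins), rng.random()))
    return messages

class TallyServer:
    """A tally server rank: only receives messages and accumulates scores."""
    def __init__(self):
        self.tallies = defaultdict(float)

    def receive(self, messages):
        for bin_id, score in messages:
            self.tallies[bin_id] += score

# four tracking ranks stream their scores to a single tally server,
# so no tracking rank ever has to hold the full tally array in memory
server = TallyServer()
for rank in range(4):
    server.receive(track_particles(1000, n_bins=10, seed=rank))

total = sum(server.tallies.values())
```

In the real algorithm the receive step happens continuously over the network while particles are still in flight; the point of the sketch is only the separation of roles.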

  19. 3USS: a web server for detecting alternative 3'UTRs from RNA-seq experiments.

    KAUST Repository

    Le Pera, Loredana

    2015-01-22

Protein-coding genes with multiple alternative polyadenylation sites can generate mRNA 3'UTR sequences of different lengths, thereby causing the loss or gain of regulatory elements, which can affect stability, localization and translation efficiency. 3USS is a web server developed with the aim of giving experimentalists the possibility to automatically identify alternative 3'UTRs (shorter or longer with respect to a reference transcriptome), an option that is not available in standard RNA-seq data analysis procedures. The tool reports as putative novel the 3'UTRs not annotated in available databases. Furthermore, if data from two related samples are uploaded, common and specific alternative 3'UTRs are identified and reported by the server. 3USS is freely available at http://www.biocomputing.it/3uss_server.

  20. MARSIS data and simulation exploited using array databases: PlanetServer/EarthServer for sounding radars

    Science.gov (United States)

    Cantini, Federico; Pio Rossi, Angelo; Orosei, Roberto; Baumann, Peter; Misev, Dimitar; Oosthoek, Jelmer; Beccati, Alan; Campalani, Piero; Unnithan, Vikram

    2014-05-01

MARSIS is an orbital synthetic aperture radar for both ionosphere and subsurface sounding on board ESA's Mars Express (Picardi et al. 2005). It transmits electromagnetic pulses centered at 1.8, 3, 4 or 5 MHz that penetrate below the surface and are reflected by compositional and/or structural discontinuities in the subsurface of Mars. MARSIS data are available as a collection of single-orbit data files. The availability of tools for more effective access to such data would greatly ease data analysis and exploitation by the community of users. For this purpose, we are developing a database built on the raster database management system RasDaMan (e.g. Baumann et al., 1994), to be populated with MARSIS data and integrated in the PlanetServer/EarthServer (e.g. Oosthoek et al., 2013; Rossi et al., this meeting) project. The data (and related metadata) are stored in the db for each frequency used by the MARSIS radar. The capability of retrieving data belonging to a certain orbit, or to multiple orbits on the basis of latitude/longitude boundaries, is a key requirement of the db design, allowing, besides the "classical" radargram representation of the data, a 3D extraction, subsetting and analysis of subsurface structures in areas with sufficiently high orbit density. Moreover, the use of the OGC WCPS (Web Coverage Processing Service) standard allows calculations on database query results for multiple echoes and/or subsets of a certain data product. Because of the low directivity of its dipole antenna, MARSIS receives echoes from portions of the surface of Mars that are distant from nadir and can be mistakenly interpreted as subsurface echoes. For this reason, methods have been developed to simulate surface echoes (e.g. Nouvel et al., 2004) and to reveal the true origin of an echo through comparison with instrument data. These simulations are usually time-consuming, and so far have been performed either on a case-by-case basis or in some simplified form. A code for

  1. Explorations in statistics: the log transformation.

    Science.gov (United States)

    Curran-Everett, Douglas

    2018-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This thirteenth installment of Explorations in Statistics explores the log transformation, an established technique that rescales the actual observations from an experiment so that the assumptions of some statistical analysis are better met. A general assumption in statistics is that the variability of some response Y is homogeneous across groups or across some predictor variable X. If the variability-the standard deviation-varies in rough proportion to the mean value of Y, a log transformation can equalize the standard deviations. Moreover, if the actual observations from an experiment conform to a skewed distribution, then a log transformation can make the theoretical distribution of the sample mean more consistent with a normal distribution. This is important: the results of a one-sample t test are meaningful only if the theoretical distribution of the sample mean is roughly normal. If we log-transform our observations, then we want to confirm the transformation was useful. We can do this if we use the Box-Cox method, if we bootstrap the sample mean and the statistic t itself, and if we assess the residual plots from the statistical model of the actual and transformed sample observations.
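The abstract's central point, that a log transformation equalizes standard deviations when the SD grows in rough proportion to the mean, can be checked numerically. A minimal sketch with simulated lognormal data (our own example, not the article's):

```python
import numpy as np

rng = np.random.default_rng(0)

# three groups whose means differ roughly 10-fold; on the log scale
# they share the same sigma, so the raw SD grows with the mean
groups = [rng.lognormal(mean=mu, sigma=0.4, size=5000) for mu in (1.0, 2.0, 3.0)]

raw_sds = [g.std() for g in groups]
log_sds = [np.log(g).std() for g in groups]

# raw SDs differ by nearly an order of magnitude;
# after the log transform the SDs are nearly identical
spread_raw = max(raw_sds) / min(raw_sds)
spread_log = max(log_sds) / min(log_sds)
```

The same transformation also symmetrizes the right-skewed lognormal samples, which is the article's second point about making the sampling distribution of the mean closer to normal.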

  2. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    CERN Document Server

    Valassi, A; Kalkhof, A; Salnikov, A; Wache, M

    2011-01-01

    The CORAL software is widely used at CERN for accessing the data stored by the LHC experiments using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several backends and deployment models, including local access to SQLite files, direct client access to Oracle and MySQL servers, and read-only access to Oracle through the FroNTier web server and cache. Two new components have recently been added to CORAL to implement a model involving a middle tier "CORAL server" deployed close to the database and a tree of "CORAL server proxy" instances, with data caching and multiplexing functionalities, deployed close to the client. The new components are meant to provide advantages for read-only and read-write data access, in both offline and online use cases, in the areas of scalability and performance (multiplexing for several incoming connections, optional data caching) and security (authentication via proxy certificates). A first implementation of the two new c...

  3. From Server to Desktop: Capital and Institutional Planning for Client/Server Technology.

    Science.gov (United States)

    Mullig, Richard M.; Frey, Keith W.

    1994-01-01

    Beginning with a request for an enhanced system for decision/strategic planning support, the University of Chicago's biological sciences division has developed a range of administrative client/server tools, instituted a capital replacement plan for desktop technology, and created a planning and staffing approach enabling rapid introduction of new…

  4. Palm distributions for log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus Plenge

    2017-01-01

This paper establishes a remarkable result regarding Palm distributions for a log Gaussian Cox process: the reduced Palm distribution for a log Gaussian Cox process is itself a log Gaussian Cox process that only differs from the original log Gaussian Cox process in the intensity function. This new result is used to study functional summaries for log Gaussian Cox processes.
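The result stated above can be written compactly. In notation we assume here (see the paper for the precise statement): if the log-intensity Y is a Gaussian process with mean function μ and covariance function c, the reduced Palm distribution at a point u is again an LGCP whose Gaussian field has its mean shifted by the covariance, with the covariance itself unchanged:

```latex
\Lambda(s) = \exp\{Y(s)\}, \quad Y \sim \mathrm{GP}(\mu, c)
\;\Longrightarrow\;
\Lambda_u(s) \overset{d}{=} \exp\{Y(s) + c(s, u)\}
```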

  5. Triple-server blind quantum computation using entanglement swapping

    Science.gov (United States)

    Li, Qin; Chan, Wai Hong; Wu, Chunhui; Wen, Zhonghua

    2014-04-01

Blind quantum computation allows a client who does not have enough quantum resources or technologies to achieve quantum computation on a remote quantum server such that the client's input, output, and algorithm remain unknown to the server. Up to now, single- and double-server blind quantum computation have been considered. In this work, we propose a triple-server blind computation protocol in which the client can delegate quantum computation to three quantum servers by the use of entanglement swapping. Furthermore, the three quantum servers can communicate with each other, and the client is almost classical, since she requires no quantum computational power or quantum memory and no ability to prepare quantum states, needing only access to quantum channels.
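Entanglement swapping, the primitive this protocol relies on, can be verified numerically: a Bell measurement on the middle two qubits of two Bell pairs leaves the two outer qubits entangled even though they never interacted. A small numpy check (our own illustration of the primitive, not the protocol itself):

```python
import numpy as np

# Two Bell pairs |Phi+>_{12} and |Phi+>_{34}; projecting qubits 2 and 3
# onto |Phi+> (one Bell-measurement outcome) leaves qubits 1 and 4 in |Phi+>.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # |Phi+> = (|00>+|11>)/sqrt(2)

psi = np.kron(bell, bell).reshape(2, 2, 2, 2)        # tensor indices: q1, q2, q3, q4
proj = bell.reshape(2, 2)                            # bra <Phi+| on qubits 2 and 3

# contract qubits 2 and 3 with the bra, leaving the state of qubits 1 and 4
post = np.einsum('abcd,bc->ad', psi, proj.conj())
prob = float(np.sum(np.abs(post) ** 2))              # probability of this outcome
post = (post / np.sqrt(prob)).reshape(4)             # renormalized state of q1, q4

entangled = np.allclose(post, bell)
```

The other three Bell-measurement outcomes produce the remaining Bell states on qubits 1 and 4, each with the same probability.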

  6. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    Science.gov (United States)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014, Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367), tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), WCPS is applied to diverse datasets on the Moon, as well as Mercury. Other Solar System bodies are going to be progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as well as derived imagery colour-combination products, dynamically generated and accessed also through the OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the Open Source NASA WorldWind (e.g. Hogan, 2011) virtual globe as visualisation engine, and the array database Rasdaman Community Edition as core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible at http://planetserver.eu. All its code base is going to be available on GitHub, on

  7. UPGRADE OF THE CENTRAL WEB SERVERS

    CERN Multimedia

    WEB Services

    2000-01-01

During the weekend of 25-26 March, the infrastructure of the CERN central web servers will undergo a major upgrade. As a result, the web services hosted by the central servers (that is, the services whose address starts with www.cern.ch) will be unavailable on Friday 24th from 17:30 to 18:30, and may suffer from short interruptions until 20:00. This includes access to the CERN top-level page as well as the services referenced by this page (such as access to the scientific program and events information, or training, recruitment, and housing services). After the upgrade, the change will be transparent to users. Expert readers may however notice that when they connect to a web page starting with www.cern.ch, this address is slightly changed when the page is actually displayed on their screen (e.g. www.cern.ch/Press will be changed to Press.web.cern.ch/Press). They should not worry: this behaviour, necessary for technical reasons, is normal. web.services@cern.ch, Tel. 74989

  8. Supervisory control system implemented in programmable logical controller web server

    OpenAIRE

    Milavec, Simon

    2012-01-01

    In this thesis, we study the feasibility of supervisory control and data acquisition (SCADA) system realisation in a web server of a programmable logic controller. With the introduction of Ethernet protocol to the area of process control, the more powerful programmable logic controllers obtained integrated web servers. The web server of a programmable logic controller, produced by Siemens, will also be described in this thesis. Firstly, the software and the hardware equipment used for real...

  9. Log quality enhancement: A systematic assessment of logging company wellsite performance and log quality

    International Nuclear Information System (INIS)

    Farnan, R.A.; Mc Hattie, C.M.

    1984-01-01

To improve the monitoring of logging company performance, computer programs were developed to assess information en masse from log quality checklists completed at the wellsite by the service company engineer and the Phillips representative. A study of all logging jobs performed by different service companies for Phillips in Oklahoma (panhandle excepted) during 1982 enabled several pertinent and beneficial interpretations to be made. Company A provided the best tool and crew service. Company B incurred an excessive amount of lost time related to tool failure, in particular with the neutron-density tool combination. Company C, although used only three times, incurred no lost time. With a reasonable data base, valid conclusions were made pertaining, for example, to repeated tool malfunctions. The actual logs were then assessed for quality

  10. The Case for Adopting Server-side Analytics

    Science.gov (United States)

    Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.

    2017-12-01

    The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher who in turn locally stores the data for analysis. The analyses tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored a new paradigm to this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that in close network proximity to the data store there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be at a much lower volume, easier to transport to and store locally by the user and easier for the user to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is both often required with these data sets and which drives much of the throughput challenges. NASA's Big Data Task Force studied this issue. This paper will present the results of this study including examples of SSAs that are being developed and demonstrated and suggestions for architectures that might be developed for

  11. Single Sign-on Authentication server (part of CLARIN infrastructure)

    NARCIS (Netherlands)

    de Jong, H.A.; Maas, M.

    2013-01-01

The Huygens Single Sign-on server allows federated logins (authentication) via SURFconext affiliates, thus enabling all connected (academic/research) institutes to use online Huygens ING software services.

  12. Enforcing Resource Sharing Agreements Among Distributed Server Clusters

    National Research Council Canada - National Science Library

    Zhao, Tao; Karamcheti, Vijay

    2001-01-01

    Future scalable, high throughput, and high performance applications are likely to execute on platforms constructed by clustering multiple autonomous distributed servers, with resource access governed...

  13. Single-server blind quantum computation with quantum circuit model

    Science.gov (United States)

    Zhang, Xiaoqian; Weng, Jian; Li, Xiaochun; Luo, Weiqi; Tan, Xiaoqing; Song, Tingting

    2018-06-01

Blind quantum computation (BQC) enables a client, who has few quantum technologies, to delegate her quantum computation to a server, who has strong quantum computational power and learns nothing about the client's quantum inputs, outputs and algorithms. In this article, we propose a single-server BQC protocol with the quantum circuit model by replacing any quantum gate with a combination of rotation operators. Trap quantum circuits are introduced, together with the combination of rotation operators, so that the server learns nothing about the quantum algorithms. The client only needs to perform operations X and Z, while the server honestly performs rotation operators.
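The abstract's building block, replacing a gate with a combination of rotation operators, can be illustrated for a single-qubit gate: any single-qubit unitary equals a product of Ry/Rz rotations up to a global phase. A minimal numpy check for the Hadamard gate (our own illustration, not the paper's construction):

```python
import numpy as np

def Ry(t):
    """Rotation about the y axis by angle t."""
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)

def Rz(t):
    """Rotation about the z axis by angle t."""
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]], dtype=complex)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Hadamard written as rotations, up to the global phase i
decomposed = 1j * Ry(np.pi / 2) @ Rz(np.pi)
ok = np.allclose(decomposed, H)
```

Multi-qubit gates additionally need an entangling gate, which is why the full protocol involves more than rotations; the sketch only shows the single-qubit case.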

  14. Securing SQL Server Protecting Your Database from Attackers

    CERN Document Server

    Cherry, Denny

    2011-01-01

    There is a lot at stake for administrators taking care of servers, since they house sensitive data like credit cards, social security numbers, medical records, and much more. In Securing SQL Server you will learn about the potential attack vectors that can be used to break into your SQL Server database, and how to protect yourself from these attacks. Written by a Microsoft SQL Server MVP, you will learn how to properly secure your database, from both internal and external threats. Best practices and specific tricks employed by the author will also be revealed. Learn expert techniques to protec

  15. An Electronic Healthcare Record Server Implemented in PostgreSQL

    Directory of Open Access Journals (Sweden)

    Tony Austin

    2015-01-01

Full Text Available This paper describes the implementation of an Electronic Healthcare Record server inside a PostgreSQL relational database without dependency on any further middleware infrastructure. The five-part international standard for communicating healthcare records (ISO EN 13606) is used as the information basis for the design of the server. We describe some of the features that this standard demands that are provided by the server, and other areas where assumptions about the durability of communications or the presence of middleware lead to a poor fit. Finally, we discuss the use of the server in two real-world scenarios, including a commercial application.

  16. Pro SQL Server 2012 relational database design and implementation

    CERN Document Server

    Davidson, Louis

    2012-01-01

    Learn effective and scalable database design techniques in a SQL Server environment. Pro SQL Server 2012 Relational Database Design and Implementation covers everything from design logic that business users will understand, all the way to the physical implementation of design in a SQL Server database. Grounded in best practices and a solid understanding of the underlying theory, Louis Davidson shows how to "get it right" in SQL Server database design and lay a solid groundwork for the future use of valuable business data. Gives a solid foundation in best practices and relational theory Covers

  17. Server-Aided Two-Party Computation with Simultaneous Corruption

    DEFF Research Database (Denmark)

    Cascudo Pueyo, Ignacio; Damgård, Ivan Bjerre; Ranellucci, Samuel

We consider secure two-party computation in the client-server model where there are two adversaries that operate separately but simultaneously, each of them corrupting one of the parties and a restricted subset of the servers that they interact with. We model security via the local universal composability framework.

  18. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

Full Text Available With the development of wireless technology, much data communication and processing is conducted on mobile devices with wireless connections. Mobile devices will always be resource-poor relative to static ones, even as they improve in absolute capability, and therefore cannot handle some expensive computational tasks with their constrained resources. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices outsource expensive computations to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn some information about the message-signature pair to be verified, which is undesirable, especially when the message includes secret information. In this paper, we study server-aided verification signatures with privacy, in which the message-signature pair to be verified is protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Based on existing signatures, two concrete server-aided verification signature schemes with privacy are then proposed, both of which are proved secure.

  19. Foundations of SQL Server 2008 R2 Business Intelligence

    CERN Document Server

    Fouche, Guy

    2011-01-01

    Foundations of SQL Server 2008 R2 Business Intelligence introduces the entire exciting gamut of business intelligence tools included with SQL Server 2008. Microsoft has designed SQL Server 2008 to be more than just a database. It's a complete business intelligence (BI) platform. The database is at its core, and surrounding the core are tools for data mining, modeling, reporting, analyzing, charting, and integration with other enterprise-level software packages. SQL Server 2008 puts an incredible amount of BI functionality at your disposal. But how do you take advantage of it? That's what this

  20. Log Gaussian Cox processes on the sphere

    DEFF Research Database (Denmark)

    Pacheco, Francisco Andrés Cuevas; Møller, Jesper

We define and study the existence of log Gaussian Cox processes (LGCPs) for the description of inhomogeneous and aggregated/clustered point patterns on the d-dimensional sphere, with d = 2 of primary interest. Useful theoretical properties of LGCPs are studied and applied for the description of sky positions of galaxies, in comparison with a previous analysis using a Thomas process. We focus on simple estimation procedures and model checking based on functional summary statistics and the global envelope test.