WorldWideScience

Sample records for users comparative analysis

  1. A Comparative Analysis of User Preferences for Major Internet Based Education Media in China

    Science.gov (United States)

    Wan, Chunyang; Jiang, Yanqing

    2014-01-01

    Internet-based education media are developing at an amazing rate and are seen as an upstart likely to take the place of traditional education worldwide in the future. This paper presents the results of a comparative analysis of user preferences for four major categories of internet-based media used in China. In this paper, we first…

  2. A comparative analysis of users and non-users of prescribed psychotropic medication among individuals who reported mental health problems

    OpenAIRE

    Silvia Gallagher; Donna Tedstone Doherty

    2010-01-01

    Objective: The use of psychotropic medication has increased over the years and there are concerns about the inappropriate use and prescribing of such medication. The objective of this study was to compare the characteristics of users and non-users of prescribed psychotropic medication among individuals who report mental health problems. Method: Data from the 2006 Health Research Board, National Psychological Wellbeing and Distress Survey (HRB NPWDS) was used to compare users and non-users of ...

  3. A comparative analysis of user preference-based and existing knowledge management systems attributes in the aerospace industry

    Science.gov (United States)

    Varghese, Nishad G.

    Knowledge management (KM) exists in various forms throughout organizations. Process documentation, training courses, and experience sharing are examples of KM activities performed daily. The goal of KM systems (KMS) is to provide a tool set which serves to standardize the creation, sharing, and acquisition of business critical information. Existing literature provides numerous examples of targeted evaluations of KMS, focusing on specific system attributes. This research serves to bridge the targeted evaluations with an industry-specific, holistic approach. The user preferences of aerospace employees in engineering and engineering-related fields were compared to profiles of existing aerospace KMS based on three attribute categories: technical features, system administration, and user experience. The results indicated there is a statistically significant difference between aerospace user preferences and existing profiles in the user experience attribute category, but no statistically significant difference in the technical features and system administration attribute categories. Additional analysis indicated in-house developed systems exhibit higher technical features and user experience ratings than commercial-off-the-shelf (COTS) systems.

  4. Non-Academic Service Quality: Comparative Analysis of Students and Faculty as Users

    Science.gov (United States)

    Sharif, Khurram; Kassim, Norizan Mohd

    2012-01-01

    The research focus was a non-academic service quality assessment within higher education. In particular, non-academic service quality perceptions of faculty and students were evaluated using a service profit chain. This enabled a comparison which helped understanding of non-academic service quality orientation from a key users' perspective. Data…

  5. Comparing Text-based and Graphic User Interfaces for Novice and Expert Users

    OpenAIRE

    Chen, Jung-Wei; Zhang, Jiajie

    2007-01-01

    Graphic User Interface (GUI) is commonly considered to be superior to Text-based User Interface (TUI). This study compares GUI and TUI in an electronic dental record system. Several usability analysis techniques compared the relative effectiveness of a GUI and a TUI. Expert users and novice users were evaluated on the time required and the steps needed to complete the task. A within-subject design was used to evaluate whether experience with either interface would affect task performance. The results s...

  6. Emotional Dimensions of User Experience – A User Psychological Analysis

    OpenAIRE

    Saariluoma, Pertti; Jokinen, Jussi

    2014-01-01

    User psychology is a human–technology interaction research approach that uses psychological concepts, theories, and findings to structure problems of human–technology interaction. As the notion of user experience has become central in human–technology interaction research and in product development, it is necessary to investigate the user psychology of user experience. This analysis of emotional human–technology interaction is based on the psychological theory of basic emotions. Three...

  7. Trajectory analysis and optimization system (TAOS) user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Salguero, D.E.

    1995-12-01

    The Trajectory Analysis and Optimization System (TAOS) is software that simulates point-mass trajectories for multiple vehicles. It expands upon the capabilities of the Trajectory Simulation and Analysis program (TAP) developed previously at Sandia National Laboratories. TAOS is designed to be a comprehensive analysis tool capable of analyzing nearly any type of three degree-of-freedom, point-mass trajectory. Trajectories are broken into segments, and within each segment, guidance rules provided by the user control how the trajectory is computed. Parametric optimization provides a powerful method for satisfying mission-planning constraints. Although TAOS is not interactive, its input and output files have been designed for ease of use. When compared to TAP, the capability to analyze trajectories for more than one vehicle is the primary enhancement, although numerous other small improvements have been made. This report documents the methods used in TAOS as well as the input and output file formats.

  8. A COMPARATIVE STUDY OF PERSONALITY CHARACTERISTICS OF FACEBOOK USERS AND NON-USERS.

    Directory of Open Access Journals (Sweden)

    Ahire Rajkumarsing Bhagwan

    2015-01-01

    Full Text Available The present study compared the personality characteristics (introversion and extroversion) of Facebook users and non-users. The sample consisted of 60 youths, Facebook users and non-users, selected from Aurangabad district; the age range of both groups was 18 to 21. The research tool selected was the Neymann-Kohlstedt Extraversion-Introversion Scale. The statistical procedure was descriptive statistics, i.e. mean and SD were computed and a 't' test was applied. On the basis of the data and the discussion of results, the hypotheses were tested and verified. The result found was that there is a significant difference between Facebook users and non-users in their personality characteristics, extroversion and introversion.
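The comparison described above is essentially an independent-samples 't' test on extraversion scores. A minimal, self-contained sketch (the scores below are invented; the study's actual data are not reproduced here):

```python
# Hypothetical sketch: Welch's t test comparing extraversion scores of two
# groups. The score values are invented for illustration only.
from statistics import mean, stdev

facebook_users = [22, 25, 19, 28, 24, 27, 21, 26, 23, 25]
non_users      = [18, 16, 20, 15, 19, 17, 21, 14, 18, 16]

def t_statistic(a, b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / (va / na + vb / nb) ** 0.5

t = t_statistic(facebook_users, non_users)  # large |t| suggests a group difference
```

With real data one would also compute degrees of freedom and a p-value (e.g. via scipy.stats.ttest_ind).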

  9. Comparative risk analysis

    International Nuclear Information System (INIS)

    In this paper, the risks of various energy systems are discussed, considering severe accident analysis, particularly probabilistic safety analysis and probabilistic safety criteria, and the applications of these criteria and analyses. Comparative risk analysis has demonstrated that the largest source of risk in every society is daily small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (solar)) for the public has shown that the main sources of risk are coal and oil. The latest comparative risk study of various energy sources, conducted in the USA, revealed that the number of victims from coal is 42 times as many as the victims from nuclear. A study of severe accidents from hydro-dams in the United States estimated the probability of dam failure at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident per 1,000 workers/year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. 'Fault tree' analysis is used to determine the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, quantitative and qualitative goals. To achieve these goals, several systems have been devised by various IAEA member countries. The probabilistic safety analysis method has been developed by establishing a computer program permitting access to different categories of safety-related information. 19 tabs. (author)
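The fault-tree arithmetic mentioned above can be illustrated with a minimal sketch: under an independence assumption, an AND gate multiplies basic-event probabilities and an OR gate combines their complements. The event names and probabilities below are invented:

```python
# Minimal fault-tree sketch. AND gate: all independent inputs must fail.
# OR gate: at least one input fails. Probabilities are hypothetical.
def and_gate(*p):
    out = 1.0
    for x in p:
        out *= x
    return out

def or_gate(*p):
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

pump_fails  = 1e-3
valve_fails = 5e-4
power_fails = 1e-4

# Top event: cooling is lost if power fails OR both the pump and valve fail.
top = or_gate(power_fails, and_gate(pump_fails, valve_fails))
```

Real probabilistic safety analyses handle common-cause failures and dependencies, which this independence-based sketch ignores.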

  10. Cybermetrics: User Identification through Network Flow Analysis

    OpenAIRE

    Melnikov, Nikolay; Schönwälder, Jürgen

    2010-01-01

    Recent studies on user identification focused on behavioral aspects of biometric patterns, such as keystroke dynamics or activity cycles in on-line games. The aim of our work is to identify users through the detection and analysis of characteristic network flow patterns. The transformation of concepts from the biometric domain into the network domain leads to the concept of a cybermetric pattern -- a pattern that identifies a user based on her characteristic Internet activity.
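A hypothetical sketch of the idea: summarize each user's traffic as a small feature vector and attribute a new session to the closest stored profile. The feature names, values, and the nearest-neighbour rule are illustrative assumptions, not the authors' method:

```python
# Illustrative "cybermetric" matching: per-user aggregate flow features
# (all values invented) compared by Euclidean distance.
import math

profiles = {
    "alice": [120.0, 40.0, 3.5],   # flows/hour, distinct hosts, mean flow size (KB)
    "bob":   [15.0, 5.0, 800.0],
}

def identify(session, profiles):
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # attribute the session to the profile with the smallest distance
    return min(profiles, key=lambda u: dist(session, profiles[u]))

who = identify([118.0, 38.0, 4.0], profiles)
```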

  11. Electronic Medical Record Cancer Incidence over Six Years Comparing New Users of Glargine with New Users of NPH Insulin

    Science.gov (United States)

    He, Wei; Bianca, Porneala C.; Yelibi, Carine; Marquis, Alison; Stürmer, Til; Buse, John B.; Meigs, James B.

    2014-01-01

    Background Recent studies suggested that insulin glargine use could be associated with increased risk of cancer. We compared the incidence of cancer in new users of glargine versus new users of NPH in a longitudinal clinical cohort with diabetes for up to 6 years. Methods and Findings From all patients who had been regularly followed at Massachusetts General Hospital from 1/01/2005 to 12/31/2010, 3,680 patients who had a medication record for glargine or NPH usage were obtained from the electronic medical record (EMR). From those we selected 539 new glargine users (age: 60.1±13.6 years, BMI: 32.7±7.5 kg/m2) and 343 new NPH users (61.5±14.1 years, 32.7±8.3 kg/m2) who had no prevalent cancer during the 19 months prior to glargine or NPH initiation. All incident cancer cases were ascertained from the EMR, requiring at least 2 ICD-9 codes within a 2-month period. Insulin exposure time and cumulative dose were validated. The statistical analysis compared the rates of cancer in new glargine vs. new NPH users while on treatment, adjusted for the propensity to receive one or the other insulin. There were 26 and 28 new cancer cases in new glargine and new NPH users over 1559 and 1126 person-years of follow-up, respectively. There were no differences in the propensity-adjusted clinical characteristics between groups. The adjusted hazard ratio for the cancer incidence comparing glargine vs. NPH use was 0.65 (95% CI: 0.36–1.19). Conclusions Insulin glargine is not associated with the development of cancers when compared with NPH in these longitudinal and carefully retrieved EMR data. PMID:25329887
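From the counts reported in the abstract one can reproduce the crude (unadjusted) incidence rates; note that the published hazard ratio of 0.65 is propensity-adjusted, so the crude rate ratio below is expected to differ from it:

```python
# Crude incidence rates from the abstract's counts: 26 cancers over 1559
# person-years (glargine) vs. 28 over 1126 person-years (NPH).
glargine_cases, glargine_py = 26, 1559
nph_cases, nph_py = 28, 1126

rate_glargine = glargine_cases / glargine_py   # cancers per person-year
rate_nph = nph_cases / nph_py
crude_rate_ratio = rate_glargine / rate_nph    # unadjusted, glargine vs. NPH
```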

  12. Similar methodological analysis involving the user experience.

    Science.gov (United States)

    Almeida e Silva, Caio Márcio; Okimoto, Maria Lúcia R L; Tanure, Raffaela Leane Zenni

    2012-01-01

    This article deals with the use of a protocol for the analysis of similar methodologies related to user experience. For this purpose, articles recounting experiments in the area were selected. They were analyzed based on the similar-analysis protocol and, finally, synthesized and associated. PMID:22316847

  13. LEXICAL ANALYSIS TO EFFECTIVELY DETECT USERS’ OPINION

    Directory of Open Access Journals (Sweden)

    Anil Kumar K.M

    2011-11-01

    Full Text Available In this paper we present a lexical approach that identifies the opinions of web users popularly expressed using short words or SMS words. These words are pretty popular with diverse web users and are used for expressing their opinions on the web. The study of opinion from the web arises from the need to know the diverse opinions of web users. The opinions expressed by web users may be on diverse topics such as politics, sports, products, movies etc. These opinions will be very useful to others such as leaders of political parties, selection committees of various sports, business analysts and other stakeholders of products, directors and producers of movies, as well as to other concerned web users. We use a semantics-based approach to find users' opinions from short words or SMS words apart from regular opinionated phrases. Our approach efficiently detects opinion from opinionated texts using lexical analysis and is found to be better than other approaches on different data sets.
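A toy version of a lexicon-based scorer in the spirit of the approach above; the lexicon (including SMS-style tokens such as "gr8" and "luv") and the scoring rule are invented for illustration and are not the authors' actual resources:

```python
# Toy lexicon-based opinion detector. Each known token carries a polarity;
# the sum of polarities decides the overall opinion of the text.
LEXICON = {
    "good": 1, "great": 1, "gr8": 1, "luv": 1, "awesome": 1,
    "bad": -1, "poor": -1, "wrst": -1, "hate": -1, "boring": -1,
}

def opinion(text):
    score = sum(LEXICON.get(tok, 0) for tok in text.lower().split())
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

A real system would also handle negation ("not gr8") and weighting, which this sketch omits.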

  14. User analysis of LHCb data with Ganga

    International Nuclear Information System (INIS)

    GANGA (http://cern.ch/ganga) is a job-management tool that offers a simple, efficient and consistent user analysis tool in a variety of heterogeneous environments: from local clusters to global Grid systems. Experiment specific plug-ins allow GANGA to be customised for each experiment. For LHCb users GANGA is the officially supported and advertised tool for job submission to the Grid. The LHCb specific plug-ins allow support for end-to-end analysis helping the user to perform his complete analysis with the help of GANGA. This starts with the support for data selection, where a user can select data sets from the LHCb Bookkeeping system. Next comes the set up for large analysis jobs: with tailored plug-ins for the LHCb core software, jobs can be managed by the splitting of these analysis jobs with the subsequent merging of the resulting files. Furthermore, GANGA offers support for Toy Monte-Carlos to help the user tune their analysis. In addition to describing the GANGA architecture, typical usage patterns within LHCb and experience with the updated LHCb DIRAC workload management system are presented.
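The split/merge pattern described above can be sketched generically (this is not the actual Ganga API; the function names and the toy per-subjob "analysis" are invented):

```python
# Generic job split/merge sketch: a large analysis job is split over its
# input files, each subjob produces a partial result, and the results are
# merged afterwards.
def split(dataset, files_per_subjob):
    return [dataset[i:i + files_per_subjob]
            for i in range(0, len(dataset), files_per_subjob)]

def run_subjob(files):
    # stand-in for the real analysis: here, just total file-name length
    return sum(len(f) for f in files)

def merge(results):
    return sum(results)

dataset = [f"LFN:/lhcb/data/file_{i}.dst" for i in range(10)]
subjobs = split(dataset, 3)                      # 4 subjobs: 3+3+3+1 files
total = merge(run_subjob(s) for s in subjobs)    # combined result
```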

  15. User analysis of LHCb data with Ganga

    CERN Document Server

    Maier, A; Cowan, G; Egede, U; Elmsheuser, J; Gaidioz, B; Harrison, K; Lee, H -C; Liko, D; Moscicki, J; Muraru, A; Pajchel, K; Reece, W; Samset, B; Slater, M; Soroko, A; van der Ster, D; Williams, M; Tan, C L; 10.1088/1742-6596/219/7/072008

    2010-01-01

    GANGA (http://cern.ch/ganga) is a job-management tool that offers a simple, efficient and consistent user analysis tool in a variety of heterogeneous environments: from local clusters to global Grid systems. Experiment specific plug-ins allow GANGA to be customised for each experiment. For LHCb users GANGA is the officially supported and advertised tool for job submission to the Grid. The LHCb specific plug-ins allow support for end-to-end analysis helping the user to perform his complete analysis with the help of GANGA. This starts with the support for data selection, where a user can select data sets from the LHCb Bookkeeping system. Next comes the set up for large analysis jobs: with tailored plug-ins for the LHCb core software, jobs can be managed by the splitting of these analysis jobs with the subsequent merging of the resulting files. Furthermore, GANGA offers support for Toy Monte-Carlos to help the user tune their analysis. In addition to describing the GANGA architecture, typical usage patterns with...

  16. User analysis of LHCb data with Ganga

    Science.gov (United States)

    Maier, Andrew; Brochu, Frederic; Cowan, Greg; Egede, Ulrik; Elmsheuser, Johannes; Gaidioz, Benjamin; Harrison, Karl; Lee, Hurng-Chun; Liko, Dietrich; Moscicki, Jakub; Muraru, Adrian; Pajchel, Katarina; Reece, Will; Samset, Bjørn; Slater, Mark; Soroko, Alexander; van der Ster, Daniel; Williams, Mike; Lik Tan, Chun

    2010-04-01

    GANGA (http://cern.ch/ganga) is a job-management tool that offers a simple, efficient and consistent user analysis tool in a variety of heterogeneous environments: from local clusters to global Grid systems. Experiment specific plug-ins allow GANGA to be customised for each experiment. For LHCb users GANGA is the officially supported and advertised tool for job submission to the Grid. The LHCb specific plug-ins allow support for end-to-end analysis helping the user to perform his complete analysis with the help of GANGA. This starts with the support for data selection, where a user can select data sets from the LHCb Bookkeeping system. Next comes the set up for large analysis jobs: with tailored plug-ins for the LHCb core software, jobs can be managed by the splitting of these analysis jobs with the subsequent merging of the resulting files. Furthermore, GANGA offers support for Toy Monte-Carlos to help the user tune their analysis. In addition to describing the GANGA architecture, typical usage patterns within LHCb and experience with the updated LHCb DIRAC workload management system are presented.

  17. Spreadsheet End-User Behaviour Analysis

    OpenAIRE

    Bishop, Brian; McDaid, Kevin

    2008-01-01

    To aid the development of spreadsheet debugging tools, a knowledge of end-users' natural behaviour within the Excel environment would be advantageous. This paper details the design and application of a novel data acquisition tool, which can be used for the unobtrusive recording of end-users' mouse, keyboard and Excel-specific actions during the debugging of Excel spreadsheets. A debugging experiment was conducted using this data acquisition tool, and based on analysis of end-u...

  18. Crianças usuárias de lente de contato nos serviços público e privado: análise comparativa / Pediatric contact lens users in public and private services: comparative analysis

    Scientific Electronic Library Online (English)

    Daniela Araújo, Toscano; Ana Cláudia Tabosa, Florêncio; Maria da Conceição, Sales; Márcia Trovão Duarte, Cavalcanti; Daniela Almeida Lyra, Antunes.

    2009-04-01

    Full Text Available PURPOSE: To analyze the indications, type and complications of contact lens use and visual acuity in children, in public and private ophthalmological services. METHODS: The information from the medical records of 59 contact lens users at a private service (Hospital de Olhos de Pernambuco - Recife [...] - PE - Brazil - group 1), and 43 at a public service (Fundação Altino Ventura - Recife - PE - Brazil - group 2), was analyzed.
The collected data included: demographic information; age at first examination; indication of lens use; contact lens type; complications and visual acuity. RESULTS: The most common indications of contact lenses in group 1 were: ametropia (55.9%), anisometropia (18.6%) and esotropia (16.9%). In this group leukoma and phthisis were not present. In group 2 the most common indications were: anisometropia (23.2%), ametropia (18.6%), leukoma (18.6%) and phthisis (16.3%). Esotropia was not found in group 2. The most prescribed contact lens was soft and of permanent use in group 1 (45.8%) and in group 2 (32.6%). The most frequent complication in group 1 was discomfort (33.3%) and in group 2 was the loss of the lens (60%). CONCLUSIONS: The most frequent indication in private services was ametropia and anisometropia in the public ones. The type of lens mostly prescribed in both groups was soft and of permanent use. The most frequent complication in group 1 was discomfort and in group 2 loss of the lens. The visual acuity was the same in the majority of the patients.

  19. Language workbench user interfaces for data analysis.

    Science.gov (United States)

    Benson, Victoria M; Campagne, Fabien

    2015-01-01

    Biological data analysis is frequently performed with command line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, it also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to try to provide generic user interfaces that can wrap command line analysis software. These solutions are useful for problems that can be solved with workflows, and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, to train models and view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with a LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/). PMID:25755929
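The four language concepts named in the abstract can be sketched as plain data types; this is a hypothetical approximation (field names and values are invented), whereas the actual MPS language defines much richer structure and editors:

```python
# Sketch of the modeled concepts: Dataset, Endpoint, Feature Selection
# Method and Classifier, assembled into a biomarker project configuration.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    samples: int

@dataclass
class Endpoint:
    name: str                 # e.g. the clinical outcome being predicted

@dataclass
class FeatureSelectionMethod:
    name: str
    max_features: int

@dataclass
class Classifier:
    name: str

project = {
    "dataset": Dataset("cohortA", 120),
    "endpoint": Endpoint("treatment response"),
    "feature_selection": FeatureSelectionMethod("t-test ranking", 50),
    "classifier": Classifier("SVM"),
}
```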

  20. Analysis of the 2011 CERN Document Server User Satisfaction Survey

    CERN Document Server

    Le Meur, J-Y

    2012-01-01

    This document analyses the results of the CDS User Satisfaction Survey that ran during autumn 2011. It shows the feedback received on the search engine, the submission procedures and the collaborative tools. It then describes the feedback received on the content of the CERN Document Server and general user impressions. The feedback is compared with key statistics automatically extracted from the CDS user activity logs in 2011. A selection of the most useful free-text comments made by survey respondents is also listed. In the last part, an action list derived from the combined analysis of the survey, the statistics and the comments is drafted. 150 answers were received in total (some with sections not completed); two thirds of them came from users working at CERN.

  1. Semantic compared cross impact analysis

    OpenAIRE

    Thorleuchter, Dirk; Den Poel, Dirk

    2014-01-01

    The aim of cross impact analysis (CIA) is to predict the impact of a first event on a second. For an organization's strategic planning, it is helpful to identify the impacts among the organization's internal events and to compare these impacts to the corresponding impacts of external events from the organization's competitors. For this, the literature has introduced compared cross impact analysis (CCIA), which depicts advantages and disadvantages of the relationships between organizations' events to the relations...

  2. Analysis of the SKPOS® Users' Initialisation Times

    Science.gov (United States)

    Droščák, Branislav; Smolík, Karol

    2014-09-01

    From the establishment of the Slovak real-time positioning service (SKPOS®), the reference stations' observations, network solutions and outputs from user communications with the service control software have been archived. Today we know that all these archived data have the potential to give us valuable information about the service's character and quality and about the conditions during the performance of Real Time Kinematic (RTK) measurements. After conducting some analyses, we are able to easily understand how important factors such as the number of satellites used, the state of the ionosphere, network solutions in the border zone, densification of the network, etc., can affect those measurements. For these purposes the users' initialisation times derived from the archived National Marine Electronics Association (NMEA) messages are used. As a tool for analysis the new Application for SKPOS® Monitoring and R
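One way such initialisation times could be derived from archived NMEA logs: measure the time between the first reported position and the first RTK-fixed solution, using the GGA sentence's fix-quality field (value 4 indicates an RTK fixed solution, 5 an RTK float). The sentences below are fabricated and the parser is a deliberate simplification:

```python
# Sketch: RTK initialisation time from NMEA GGA sentences. Field 1 is the
# UTC time (hhmmss.ss) and field 6 is the fix quality (1 = autonomous,
# 5 = RTK float, 4 = RTK fixed). Checksums and edge cases are ignored.
def gga_time_and_quality(sentence):
    f = sentence.split(",")
    t = f[1]
    seconds = int(t[0:2]) * 3600 + int(t[2:4]) * 60 + float(t[4:])
    return seconds, int(f[6])

log = [
    "$GPGGA,120000.00,4809.00,N,01131.00,E,1,07,1.0,545.4,M,46.9,M,,",
    "$GPGGA,120010.00,4809.00,N,01131.00,E,5,07,1.0,545.4,M,46.9,M,,",  # RTK float
    "$GPGGA,120025.00,4809.00,N,01131.00,E,4,07,1.0,545.4,M,46.9,M,,",  # RTK fixed
]

times = [gga_time_and_quality(s) for s in log]
start = times[0][0]
fixed = next(t for t, q in times if q == 4)
init_time = fixed - start  # seconds until the RTK-fixed solution
```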

  3. Automation of user analysis workflow in CMS

    Science.gov (United States)

    Spiga, D.; Cinquilli, M.; Codispoti, G.; Fanfani, A.; Fanzago, F.; Farina, F.; Lacaprara, S.; Miccio, E.; Riahi, H.; Vaandering, E.

    2010-04-01

    CMS has a distributed computing model, based on a hierarchy of tiered regional computing centres. However, the end physicist is not interested in the details of the computing model or the complexity of the underlying infrastructure, but only in accessing and using the remote services efficiently and easily. The CMS Remote Analysis Builder (CRAB) is the official CMS tool that allows access to the distributed data in a transparent way. We present the current development direction, which is focused on improving the interface presented to the user and adding intelligence to CRAB so that it can be used to automate more and more of the work done on behalf of the user. We also present the status of deployment of the CRAB system and the lessons learnt in deploying this tool to the CMS collaboration.

  4. Automation of user analysis workflow in CMS

    International Nuclear Information System (INIS)

    CMS has a distributed computing model, based on a hierarchy of tiered regional computing centres. However, the end physicist is not interested in the details of the computing model or the complexity of the underlying infrastructure, but only in accessing and using the remote services efficiently and easily. The CMS Remote Analysis Builder (CRAB) is the official CMS tool that allows access to the distributed data in a transparent way. We present the current development direction, which is focused on improving the interface presented to the user and adding intelligence to CRAB so that it can be used to automate more and more of the work done on behalf of the user. We also present the status of deployment of the CRAB system and the lessons learnt in deploying this tool to the CMS collaboration.

  5. Security Analysis of the Swedish Road User Charging System

    OpenAIRE

    Carlsson, Bengt; Boldt, Martin

    2008-01-01

    A security analysis based on probabilities, consequences and costs resulted in a priority ranking for physical, logical and human threats for the proposed Swedish road user charging system using a smartcard solution. Countermeasures are described as top prioritized, highly prioritized, average prioritized and low prioritized and compared to operational errors. Logical countermeasures like encryption and local buffering are most cost efficient to implement and different...

  6. AXAF user interfaces for heterogeneous analysis environments

    Science.gov (United States)

    Mandel, Eric; Roll, John; Ackerman, Mark S.

    1992-01-01

    The AXAF Science Center (ASC) will develop software to support all facets of data center activities and user research for the AXAF X-ray Observatory, scheduled for launch in 1999. The goal is to provide astronomers with the ability to utilize heterogeneous data analysis packages, that is, to allow astronomers to pick the best packages for doing their scientific analysis. For example, ASC software will be based on IRAF, but non-IRAF programs will be incorporated into the data system where appropriate. Additionally, it is desired to allow AXAF users to mix ASC software with their own local software. The need to support heterogeneous analysis environments is not special to the AXAF project, and therefore finding mechanisms for coordinating heterogeneous programs is an important problem for astronomical software today. The approach to solving this problem has been to develop two interfaces that allow the scientific user to run heterogeneous programs together. The first is an IRAF-compatible parameter interface that provides non-IRAF programs with IRAF's parameter handling capabilities. Included in the interface is an application programming interface to manipulate parameters from within programs, and also a set of host programs to manipulate parameters at the command line or from within scripts. The parameter interface has been implemented to support parameter storage formats other than IRAF parameter files, allowing one, for example, to access parameters that are stored in data bases. An X Windows graphical user interface called 'agcl' has been developed, layered on top of the IRAF-compatible parameter interface, that provides a standard graphical mechanism for interacting with IRAF and non-IRAF programs. Users can edit parameters and run programs for both non-IRAF programs and IRAF tasks. The agcl interface allows one to communicate with any command line environment in a transparent manner and without any changes to the original environment. 
For example, the authors routinely layer the GUI on top of IRAF, ksh, SMongo, and IDL. The agcl, based on the facilities of a system called Answer Garden, also has sophisticated support for examining documentation and help files, asking questions of experts, and developing a knowledge base of frequently required information. Thus, the GUI becomes a total environment for running programs, accessing information, examining documents, and finding human assistance. Because the agcl can communicate with any command-line environment, most projects can make use of it easily. New applications are continually being found for these interfaces. It is the authors' intention to evolve the GUI and its underlying parameter interface in response to these needs - from users as well as developers - throughout the astronomy community. This presentation describes the capabilities and technology of the above user interface mechanisms and tools. It also discusses the design philosophies guiding the work, as well as hopes for the future.
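The IRAF-compatible parameter interface described above can be loosely illustrated by reading an IRAF-style parameter file; the field layout below (name, type, mode, default, min, max, prompt) follows the common .par convention, but this parser is a simplified assumption, not the ASC implementation:

```python
# Illustrative reader for IRAF-style .par lines; real parameter files and
# the ASC interface support more types, modes and storage backends.
import csv
import io

PAR_TEXT = """\
input,s,a,"",,,"Input image name"
scale,r,h,1.0,0.0,10.0,"Scale factor"
verbose,b,h,no,,,"Print progress messages"
"""

def read_par(text):
    params = {}
    for row in csv.reader(io.StringIO(text)):
        if not row:
            continue
        name, ptype, mode, default = row[0], row[1], row[2], row[3]
        params[name] = {"type": ptype, "mode": mode, "value": default}
    return params

pars = read_par(PAR_TEXT)
```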

  7. Factors Affecting Mobile Users’ Switching Intentions: A Comparative Study between the Brazilian and German Markets

    Directory of Open Access Journals (Sweden)

    Rodrigo C. Martins

    2013-07-01

    Full Text Available In the competitive wireless market, there are many drivers behind customer defection. Switching barriers, service performance, perceived value in carriers' offers, satisfaction and other constructs can play a pivotal role in customer switching processes among carriers. This study attempts to compare the influence of these factors, taking into account cultural similarities and dissimilarities, between Brazilian and German mobile users. A survey was conducted on two samples, comprising 202 users in Brazil and 200 users in Germany, with culture being employed as a context variable to compare their behavior. Analysis by means of multi-group structural equation modeling suggests that, in both countries, customer satisfaction, service performance and perceived value have important roles in defining customer switching intentions, while switching barriers did not prove to have significant effects upon switching behavior. The results also suggest that the two cultures are sufficiently similar (considering the sample and the variables involved in the model) to not present differences in the studied consumer behavior, except for the effect of service performance upon satisfaction.

  8. Factors affecting mobile users' switching intentions: a comparative study between the brazilian and german markets

    Scientific Electronic Library Online (English)

    Rodrigo C., Martins; Luis Fernando, Hor-Meyll; Jorge Brantes, Ferreira.

    2013-09-01

    Full Text Available In the competitive wireless market, there are many drivers behind customer defection. Switching barriers, service performance, perceived value in carriers' offers, satisfaction and other constructs can play a pivotal role in customer switching processes among carriers. This study attempts to compare [...] the influence of these factors, taking into account cultural similarities and dissimilarities, between Brazilian and German mobile users. A survey was conducted on two samples, comprising 202 users in Brazil and 200 users in Germany, with culture being employed as a context variable to compare their behavior. Analysis by means of multi-group structural equation modeling suggests that, in both countries, customer satisfaction, service performance and perceived value have important roles in defining customer switching intentions, while switching barriers did not prove to have significant effects upon switching behavior. The results also suggest that the two cultures are sufficiently similar (considering the sample and the variables involved in the model) to not present differences in the studied consumer behavior, except for the effect of service performance upon satisfaction.

  9. Social network based microblog user behavior analysis

    Science.gov (United States)

    Yan, Qiang; Wu, Lianren; Zheng, Lan

    2013-04-01

    The influence of microblogs on information transmission is becoming more and more obvious. By characterizing the behaviors of following and being followed as out-degree and in-degree respectively, a microblog social network was built in this paper. It was found to have a short connected-graph diameter, a short average path length and a high average clustering coefficient. The distributions of out-degree, in-degree and total number of microblogs posted exhibit power-law characteristics. The exponent of the distribution of the total number of microblogs is negatively correlated with the degree of each user; as degree increases, the exponent decreases much more slowly. Based on this empirical analysis, we proposed a social-network-based human dynamics model, and pointed out that an inducing drive and a spontaneous drive lead to the behavior of posting microblogs. The simulation results of our model match the empirical observations well.
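
The network measures named in this record (out-degree/in-degree from follow relations, average clustering coefficient, average path length) can be sketched in plain Python; the tiny `follows` edge list below is invented purely for illustration, not data from the study:

```python
from collections import defaultdict, deque

def degree_distributions(edges):
    """Count out-degree (following) and in-degree (being followed) per user."""
    out_deg, in_deg = defaultdict(int), defaultdict(int)
    for src, dst in edges:
        out_deg[src] += 1
        in_deg[dst] += 1
    return dict(out_deg), dict(in_deg)

def avg_clustering(edges):
    """Average clustering coefficient of the undirected projection."""
    nbrs = defaultdict(set)
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    coeffs = []
    for v, ns in nbrs.items():
        k = len(ns)
        if k < 2:
            coeffs.append(0.0)
            continue
        # count links among the neighbors of v
        links = sum(1 for x in ns for y in ns if x < y and y in nbrs[x])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

def avg_path_length(edges):
    """Mean shortest-path length over reachable ordered pairs (undirected BFS)."""
    nbrs = defaultdict(set)
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    total, pairs = 0, 0
    for s in nbrs:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in nbrs[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        for t, d in dist.items():
            if t != s:
                total += d
                pairs += 1
    return total / pairs

# "follows" edges: (follower, followed) -- toy data
follows = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "a"), ("d", "a")]
out_deg, in_deg = degree_distributions(follows)
```

On real microblog data one would then fit power laws to the resulting degree histograms rather than inspect them by hand.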

  10. Comparative Analysis of Classifier Fusers

    OpenAIRE

    Marcin Zmyslony; Michal Wozniak; Konrad Jackowski

    2012-01-01

    There are many methods of decision making by an ensemble of classifiers. The most popular are methods that have their origin in the voting method, where the decision of the common classifier is a combination of individual classifiers’ outputs. This work presents a comparative analysis of some classifier fusion methods based on weighted voting of classifiers’ responses and combination of classifiers’ discriminant functions. We discuss different methods of producing combined classifiers based on...

  11. Comparative analysis of collaboration networks

    International Nuclear Information System (INIS)

    In this paper we carry out a comparative analysis of the word network as the collaboration network based on the novel by M. Bulgakov 'Master and Margarita', the synonym network of the Russian language, as well as the Russian movie actor network. We have constructed one-mode projections of these networks, defined degree distributions for them and calculated their main characteristics. A generation algorithm for collaboration networks is offered which allows one to generate networks statistically equivalent to the studied ones. It lets us reveal a structural correlation between the word network, the synonym network and the movie actor network. We show that the degree distributions of all the analyzed networks are described by a q-type distribution.
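
The one-mode projection this record relies on can be sketched as follows: given a two-mode (participant, event) edge list, link two participants whenever they share an event. The actor/movie identifiers below are made up for illustration:

```python
from collections import defaultdict
from itertools import combinations

def one_mode_projection(bipartite_edges):
    """Project a bipartite (participant, event) edge list onto participants.
    Returns {(a, b): weight} where weight = number of shared events."""
    members = defaultdict(set)
    for participant, event in bipartite_edges:
        members[event].add(participant)
    weights = defaultdict(int)
    for event, group in members.items():
        for a, b in combinations(sorted(group), 2):
            weights[(a, b)] += 1
    return dict(weights)

# hypothetical actor-movie data (names invented)
edges = [("Ivanov", "m1"), ("Petrov", "m1"), ("Sidorov", "m1"),
         ("Ivanov", "m2"), ("Petrov", "m2")]
proj = one_mode_projection(edges)

# degree of each actor in the projected collaboration network
deg = defaultdict(int)
for (a, b) in proj:
    deg[a] += 1
    deg[b] += 1
```

The same routine works unchanged for word-sentence or synonym-set bipartite data; the degree histogram of `deg` is what one would then fit with a q-type distribution.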

  12. Comparative Analysis of Classifier Fusers

    Directory of Open Access Journals (Sweden)

    Marcin Zmyslony

    2012-06-01

    Full Text Available There are many methods of decision making by an ensemble of classifiers. The most popular are methods that have their origin in the voting method, where the decision of the common classifier is a combination of individual classifiers’ outputs. This work presents a comparative analysis of some classifier fusion methods based on weighted voting of classifiers’ responses and combination of classifiers’ discriminant functions. We discuss different methods of producing combined classifiers based on weights. We show that it is not possible to obtain a classifier better than an abstract model of a committee known as an Oracle if fusion is based only on weighted voting, but models based on discriminant functions, or classifiers using feature values and class numbers, could outperform the Oracle. The delivered conclusions are confirmed by the results of computer experiments carried out on benchmark and computer-generated data.
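
The two models compared in this record, weighted voting of class labels and the abstract Oracle (correct whenever at least one ensemble member is correct), can be illustrated with a minimal sketch; all classifier outputs and weights below are invented:

```python
from collections import defaultdict

def weighted_vote(labels, weights):
    """Fuse one sample's class labels from an ensemble by weighted voting."""
    scores = defaultdict(float)
    for label, w in zip(labels, weights):
        scores[label] += w
    return max(scores, key=scores.get)

def oracle_correct(labels, truth):
    """Abstract Oracle: counted correct iff any ensemble member is correct."""
    return truth in labels

# three toy classifiers, four samples (all values invented)
preds = [[0, 1, 1, 0],   # classifier 1
         [0, 0, 1, 1],   # classifier 2
         [1, 1, 1, 0]]   # classifier 3
truth = [0, 1, 1, 0]
weights = [0.5, 0.2, 0.3]

fused = [weighted_vote([p[i] for p in preds], weights)
         for i in range(len(truth))]
oracle = [oracle_correct([p[i] for p in preds], t)
          for i, t in enumerate(truth)]
```

By construction the Oracle's per-sample accuracy is an upper bound for any label-level weighted vote, which is the comparison baseline the paper argues from.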

  13. Comparative analysis of collaboration networks

    Science.gov (United States)

    Progulova, Tatiana; Gadjiev, Bahruz

    2011-03-01

    In this paper we carry out a comparative analysis of the word network as the collaboration network based on the novel by M. Bulgakov "Master and Margarita", the synonym network of the Russian language, as well as the Russian movie actor network. We have constructed one-mode projections of these networks, defined degree distributions for them and calculated their main characteristics. A generation algorithm for collaboration networks is offered which allows one to generate networks statistically equivalent to the studied ones. It lets us reveal a structural correlation between the word network, the synonym network and the movie actor network. We show that the degree distributions of all the analyzed networks are described by a q-type distribution.

  14. Analysis of user profile in social networks

    OpenAIRE

    Rodrigues, Adão Carlos Fernandes

    2013-01-01

    With this work it is intended to create / identify user profiles through their actions on social networks. This identification aims to determine, in a specific way, which profile each user has by linking the following dimensions and their sets of variables: sociodemographic characteristics (gender, age, education, situation before the economic activity indicator and occupational class) and the specific type of aggregate practices conducted over the internet (study, work, services, search f...

  15. Statistical Analysis and Learning Method on Users' Feedbacks

    OpenAIRE

    Wong, Doris H. T.; Sureswaran Ramadass

    2011-01-01

    Problem statement: The purpose of this study was to construct an effective algorithm to learn users' feedback from their displayed visualization. This is because existing visualization tools typically present network data without considering the level of network-data knowledge among different levels of computer users. Machine learning algorithms have been applied in order to find the most effective statistical analysis and learning algorithm in learning users' fe...

  16. Characteristics of Bitcoin Users: An Analysis of Google Search Data

    OpenAIRE

    Wilson, Matthew; Yelowitz, Aaron

    2014-01-01

    The anonymity of Bitcoin prevents analysis of its users. We collect Google Trends data to examine determinants of interest in Bitcoin. Based on anecdotal evidence regarding Bitcoin users, we construct proxies for four possible clientele: computer programming enthusiasts, speculative investors, Libertarians, and criminals. Computer programming and illegal activity search terms are positively correlated with Bitcoin interest, while Libertarian and investment terms are not.
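
The correlation check at the heart of this record can be sketched with a plain Pearson coefficient; the weekly search-interest series below are fabricated stand-ins, not real Google Trends data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# hypothetical search-interest indices (0-100 scale), invented for illustration
bitcoin  = [10, 20, 35, 30, 50, 65, 60, 80]
prog_proxy   = [12, 18, 30, 33, 48, 60, 66, 75]   # programming-enthusiast proxy
invest_proxy = [50, 48, 52, 49, 51, 50, 47, 53]   # speculative-investor proxy

r_prog = pearson(bitcoin, prog_proxy)
r_inv  = pearson(bitcoin, invest_proxy)
```

In the toy data the programming proxy tracks Bitcoin interest closely while the investment proxy does not, mirroring the sign pattern the paper reports.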

  17. User-Centered Analysis of Corpora using Semantic Features Redundancy

    OpenAIRE

    Roy, Thibault; Beust, Pierre; Ferrari, Stéphane

    2008-01-01

    Accessing textual information is still a complex activity when users have to browse through large corpora or long texts. In order to help users in such tasks, we propose a model dedicated to the lexical representation of thematic domains, as well as tools for personal corpora analysis.

  18. User office proposal handling and analysis software

    CERN Document Server

    Beckmann, J; Beckmann, Joern; Pulz, Joerg

    2002-01-01

    At FRM-II the installation of a user office software system is under consideration, supporting tasks like proposal handling, beam time allocation, data handling and report creation. Although there are several software systems in use at major facilities, most of them are not portable to other institutes. In this paper the requirements for a modular and extendable user office software are discussed, with a focus on security-related aspects such as how to prevent a denial-of-service attack on fully automated systems. A suitable approach seems to be the creation of a web-based application using Apache as the web server, MySQL as the database system and PHP as the scripting language.

  19. User Behavior and IM Topology Analysis

    Directory of Open Access Journals (Sweden)

    Qiang Yan

    2008-07-01

    Full Text Available The use of Instant Messaging, or IM, has become widely adopted in private and corporate communication. IM provides instant, multi-directed, multi-type communication, which makes message spreading in IM different from that in WWW, blog and email systems. Groups have great impacts on message spreading in IM. The research demonstrates the power-law distribution of group sizes in MSN, with an exponent ranging from 0.76 to 1.22. Based on an online survey, IM user behavior is analyzed from the aspects of message sending/receiving and contact maintenance. According to the results, the degree distribution of users has a peak value and does not present a power-law character. This may indicate that social networks should be a prospective direction for research on IM topology.
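
A power-law exponent like the one this record reports for MSN group sizes is commonly estimated with the continuous maximum-likelihood (Hill) estimator, alpha = 1 + n / sum(ln(x_i / xmin)). A sketch on synthetic data (the sample is generated by inverse-transform sampling, not taken from the study):

```python
import math
import random

def powerlaw_alpha_mle(samples, xmin=1.0):
    """Continuous MLE (Hill estimator) of the exponent alpha of
    p(x) ~ x^(-alpha) for x >= xmin."""
    xs = [x for x in samples if x >= xmin]
    return 1.0 + len(xs) / sum(math.log(x / xmin) for x in xs)

# synthetic power-law samples: x = xmin * u^(-1/(alpha-1)), u uniform in (0, 1]
random.seed(0)
true_alpha = 2.0
data = [1.0 * (1.0 - random.random()) ** (-1.0 / (true_alpha - 1.0))
        for _ in range(20000)]
est = powerlaw_alpha_mle(data)
```

With 20,000 samples the estimate recovers the generating exponent to within about one percent, which is why MLE is preferred over eyeballing a log-log plot.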

  20. Composing user models through logic analysis.

    OpenAIRE

    Bergeron, B. P.; Shiffman, R. N.; Rouse, R. L.; Greenes, R. A.

    1991-01-01

    The evaluation of tutorial strategies, interface designs, and courseware content is an area of active research in the medical education community. Many of the evaluation techniques that have been developed (e.g., program instrumentation) commonly produce data that are difficult to decipher or to interpret effectively. We have explored the use of decision tables to automatically simplify and categorize data for the composition of user models--descriptions of students' learning styles and pref...

  1. User Behavior and IM Topology Analysis

    OpenAIRE

    Qiang Yan; Xiaoyan Huang

    2008-01-01

    The use of Instant Messaging, or IM, has become widely adopted in private and corporate communication. IM provides instant, multi-directed, multi-type communication, which makes message spreading in IM different from that in WWW, blog and email systems. Groups have great impacts on message spreading in IM. The research demonstrates the power-law distribution of group sizes in MSN, with an exponent ranging from 0.76 to 1.22. Based on an online survey, IM user behavior is analyzed fr...

  2. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    International Nuclear Information System (INIS)

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods.
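
The "decay and ingrowth" calculation this record mentions follows, in its simplest two-member case, the Bateman equations: the parent decays exponentially while the daughter grows in and decays in turn. A sketch, with half-lives chosen arbitrarily for illustration (they are not RSAC-5 data):

```python
import math

def decay_ingrowth(n1_0, lam1, lam2, t, n2_0=0.0):
    """Two-member Bateman chain.
    Returns (N1(t), N2(t)) for parent/daughter decay constants lam1, lam2:
      N1(t) = N1(0) e^{-lam1 t}
      N2(t) = lam1 N1(0)/(lam2-lam1) (e^{-lam1 t} - e^{-lam2 t}) + N2(0) e^{-lam2 t}
    """
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = (lam1 * n1_0 / (lam2 - lam1)) * (math.exp(-lam1 * t) - math.exp(-lam2 * t)) \
         + n2_0 * math.exp(-lam2 * t)
    return n1, n2

# illustrative half-lives: 8 days (parent), 2 days (daughter)
lam_p = math.log(2) / 8.0
lam_d = math.log(2) / 2.0
n1, n2 = decay_ingrowth(1e6, lam_p, lam_d, t=8.0)
```

After one parent half-life exactly half the parent atoms remain, and the daughter population reflects the competition between its production and its own faster decay.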

  3. TRANSNET: a user-accessible network of transportation analysis models

    International Nuclear Information System (INIS)

    Models and associated data bases, developed under the sponsorship of the Department of Energy (DOE) through the Transportation Technology Center at Sandia National Laboratories, have been used to support transportation analysis efforts for specific sites and for assessments of the impacts of transporting specific waste forms to processing/storage sites. TRANSNET, an interactive computer network, was developed to allow outside users access to these models. TRANSNET contains the most recent versions of models developed under DOE/TTC sponsorship; code modifications made since the last published documentation are noted to the user on the TRANSNET screens. To permit a greater spectrum of users to utilize the models, considerable attention has been given to making the models user-friendly and to providing default data sets for typical problems. TRANSNET access and use are limited to support of DOE-related program activities; for such activities there are currently no access or user charges.

  4. Sociological analysis and comparative education

    Science.gov (United States)

    Woock, Roger R.

    1981-12-01

    It is argued that comparative education is essentially a derivative field of study, in that it borrows theories and methods from academic disciplines. After a brief humanistic phase, in which history and philosophy were central for comparative education, sociology became an important source. In the mid-50's and 60's, sociology in the United States was characterised by Structural Functionalism as a theory, and Social Survey as a dominant methodology. Both were incorporated into the development of comparative education. Increasingly in the 70's, and certainly today, the new developments in sociology are characterised by an attack on Positivism, which is seen as the philosophical position underlying both functionalism and survey methods. New or re-discovered theories with their attendant methodologies included Marxism, Phenomenological Sociology, Critical Theory, and Historical Social Science. The current relationship between comparative education and social science is one of uncertainty, but since social science is seen to be returning to its European roots, the hope is held out for the development of an integrated social theory and method which will provide a much stronger basis for developments in comparative education.

  5. Reinforcing user data analysis with Ganga in the LHC era: scalability, monitoring and user-support

    International Nuclear Information System (INIS)

    Ganga is a grid job submission and management system widely used in the ATLAS and LHCb experiments and in several other communities in the context of the EGEE project. The particle physics communities have entered the LHC operation era, which brings new challenges for user data analysis: a strong growth in the number of users and jobs is already noticeable. Current work in the Ganga project focuses on dealing with these challenges. Recent Ganga releases have strengthened support for the pilot-job-based grid systems Panda and Dirac of the ATLAS and LHCb experiments respectively. A more scalable job repository architecture, which allows efficient storage of many thousands of jobs in XML or several database formats, was recently introduced. Better integration with monitoring systems, including the Dashboard and job execution monitor systems, is underway; these will provide comprehensive and easy job monitoring. A simple-to-use error reporting tool integrated at the Ganga command line will help to improve user support and the debugging of user problems. Ganga is a mature, stable and widely used tool with long-term support from the HEP community. We report on how it is being constantly improved following the users' needs for faster and easier distributed data analysis on the grid.

  6. Reinforcing user data analysis with Ganga in the LHC era: scalability, monitoring and user-support

    Science.gov (United States)

    Elmsheuser, Johannes; Brochu, Frederic; Dzhunov, Ivan; Ebke, Johannes; Egede, Ulrik; Jha, Manoj Kumar; Kokoszkiewicz, Lukasz; Lee, Hurng-Chun; Maier, Andrew; Mościcki, Jakub; München, Tim; Reece, Will; Samset, Bjorn; Slater, Mark; Tuckett, David; Vanderster, Daniel; Williams, Michael

    2011-12-01

    Ganga is a grid job submission and management system widely used in the ATLAS and LHCb experiments and in several other communities in the context of the EGEE project. The particle physics communities have entered the LHC operation era, which brings new challenges for user data analysis: a strong growth in the number of users and jobs is already noticeable. Current work in the Ganga project focuses on dealing with these challenges. Recent Ganga releases have strengthened support for the pilot-job-based grid systems Panda and Dirac of the ATLAS and LHCb experiments respectively. A more scalable job repository architecture, which allows efficient storage of many thousands of jobs in XML or several database formats, was recently introduced. Better integration with monitoring systems, including the Dashboard and job execution monitor systems, is underway; these will provide comprehensive and easy job monitoring. A simple-to-use error reporting tool integrated at the Ganga command line will help to improve user support and the debugging of user problems. Ganga is a mature, stable and widely used tool with long-term support from the HEP community. We report on how it is being constantly improved following the users' needs for faster and easier distributed data analysis on the grid.

  7. An Analysis of User Attitudes to SNS

    OpenAIRE

    Tsuyoshi Aburai; Yasuo Ishii; Kazuhiro Takeyasu

    2013-01-01

    Social Networking Services (SNS) have become widely used in Japan in recent years, with Facebook, mixi and Twitter being the most popular. These are used in various fields of life together with convenient devices such as smart-phones. A questionnaire investigation was used to clarify current usage conditions, issues, desired functions etc. Information for marketing purposes was then extracted. Fundamental Statistical Analysis, Multiple Correspondence Analysis, Quantitative Analysis and Tex...

  8. Forecasting methods: a comparative analysis

    OpenAIRE

    Iqbal, Javed

    2001-01-01

    Forecasting is an important tool for management, planning and administration in various fields. In this paper the forecasting performance of different methods is compared using time series data of Pakistan's exports to the United States and money supply. It is found that, as in other studies of this nature, no single forecasting method provides the better forecast for both series. The techniques considered are ARIMA, Regression Analysis, Vector Autoregression (VAR), Error Correction Model ...
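
The kind of horse-race this record describes, comparing methods by out-of-sample forecast error, can be sketched minimally: a naive last-value forecast versus an OLS-fitted AR(1), scored by mean absolute error on a holdout. The series below is invented for illustration and is far simpler than the ARIMA/VAR/ECM models the paper compares:

```python
def fit_ar1(series):
    """Least-squares fit of x_t = c + phi * x_{t-1}."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    c = my - phi * mx
    return c, phi

def mae(forecasts, actual):
    """Mean absolute forecast error."""
    return sum(abs(f - a) for f, a in zip(forecasts, actual)) / len(actual)

# invented series with persistence; first 8 points train, last 4 test
series = [10, 12, 11, 13, 14, 13, 15, 16, 15, 17, 18, 17]
train, test = series[:8], series[8:]

c, phi = fit_ar1(train)
ar1_fc, naive_fc, hist = [], [], train[:]
for actual in test:                      # one-step-ahead, rolling origin
    ar1_fc.append(c + phi * hist[-1])
    naive_fc.append(hist[-1])
    hist.append(actual)
```

Comparing `mae(ar1_fc, test)` against `mae(naive_fc, test)` per series is exactly the logic behind the paper's conclusion that no single method wins on every series.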

  9. An Analysis of User Attitudes to SNS

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Aburai

    2013-04-01

    Full Text Available Social Networking Services (SNS) have become widely used in Japan in recent years, with Facebook, mixi and Twitter being the most popular. These are used in various fields of life together with convenient devices such as smart-phones. A questionnaire investigation was used to clarify current usage conditions, issues, desired functions etc. Information for marketing purposes was then extracted. Fundamental Statistical Analysis, Multiple Correspondence Analysis, Quantitative Analysis and Text Mining Analysis were then performed. Reviewing past research, there are some related papers, but they do not cover the new tools, which are evolving rapidly. Moreover, there has been little research conducted on this precise topic. Some interesting results were obtained.

  10. Dairy Analytics and Nutrient Analysis (DANA) Prototype System User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Sam Alessi; Dennis Keiser

    2012-10-01

    This document is a user manual for the Dairy Analytics and Nutrient Analysis (DANA) model. DANA provides an analysis of dairy anaerobic digestion technology and allows users to calculate biogas production, co-product valuation, capital costs, expenses, revenue and financial metrics for user-customizable scenarios, dairy types and digester types. The model provides results for three anaerobic digester types (Covered Lagoon, Modified Plug Flow, and Complete Mix) and three main energy production technologies: electricity generation, renewable natural gas generation, and compressed natural gas generation. Additional options include different dairy types, bedding types and backend treatment types, as well as numerous production and economic parameters. DANA’s goal is to extend the National Market Value of Anaerobic Digester Products analysis (informa economics, 2012; Innovation Center, 2011) to a greater and more flexible set of regional digester scenarios, and to provide a modular framework for creating a tool that supports farmer and investor needs. Users can set up scenarios from combinations of existing parameters or add new parameters, run the model and view a variety of reports, charts and tables that are automatically produced and delivered over the web interface. DANA is built on INL’s analysis architecture, the Generalized Environment for Modeling Systems (GEMS), which offers extensive collaboration, analysis and integration opportunities and greatly speeds the construction of highly scalable, web-delivered, user-oriented decision tools. DANA uses server-based data processing and web-based user interfaces rather than a client-based spreadsheet approach, which offers a number of benefits. Server processing and storage can scale up to handle a very large number of scenarios, so that analysis at the county, or even field, level across the whole U.S. can be performed. Server-based databases allow dairy and digester parameters to be held and managed in a single data repository while allowing users to customize standard values and perform individual analyses. Server-based calculations can be easily extended, versions and upgrades can be managed centrally, and any changes are immediately available to all users. This user manual describes how to use and/or modify input database tables, run DANA, and view and modify reports.
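
To give a flavor of the kind of calculation a digester model like DANA automates, here is a back-of-envelope electricity-revenue sketch. Every parameter value below is an illustrative assumption chosen by the editor, not a DANA default; the only physical constant used is the approximate lower heating value of methane (~9.97 kWh/m^3):

```python
def biogas_revenue(cows, vs_per_cow_kg_day, biogas_yield_m3_per_kg_vs,
                   ch4_fraction, gen_eff, price_per_kwh, days=365):
    """Back-of-envelope annual electricity revenue from a dairy digester.
    All input parameters are illustrative assumptions."""
    CH4_KWH_PER_M3 = 9.97                      # approx. LHV of methane
    vs = cows * vs_per_cow_kg_day * days       # volatile solids fed, kg/yr
    biogas = vs * biogas_yield_m3_per_kg_vs    # biogas produced, m^3/yr
    kwh = biogas * ch4_fraction * CH4_KWH_PER_M3 * gen_eff
    return kwh * price_per_kwh

# hypothetical 1000-cow dairy scenario (all numbers invented)
rev = biogas_revenue(cows=1000, vs_per_cow_kg_day=8.0,
                     biogas_yield_m3_per_kg_vs=0.30,
                     ch4_fraction=0.6, gen_eff=0.35, price_per_kwh=0.08)
```

A real scenario run would layer capital costs, expenses and co-product values on top of this gross-revenue figure to produce the financial metrics the manual describes.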

  11. GRAFLAB 2.3 for UNIX - A MATLAB database, plotting, and analysis tool: User's guide

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, W.N.

    1998-03-01

    This report is a user's manual for GRAFLAB, which is a new database, analysis, and plotting package that has been written entirely in the MATLAB programming language. GRAFLAB is currently used for data reduction, analysis, and archival. GRAFLAB was written to replace GRAFAID, which is a FORTRAN database, analysis, and plotting package that runs on VAX/VMS.

  12. Conversation Analysis and the User Experience

    CERN Document Server

    Woodruff, A; Woodruff, Allison; Aoki, Paul M.

    2004-01-01

    We provide two case studies in the application of ideas drawn from conversation analysis to the design of technologies that enhance the experience of human conversation. We first present a case study of the design of an electronic guidebook, focusing on how conversation analytic principles played a role in the design process. We then discuss how the guidebook project has inspired our continuing work in social, mobile audio spaces. In particular, we describe some as yet unrealized concepts for adaptive audio spaces.

  13. Conversation Analysis and the User Experience

    OpenAIRE

    Woodruff, Allison; Aoki, Paul M.

    2004-01-01

    We provide two case studies in the application of ideas drawn from conversation analysis to the design of technologies that enhance the experience of human conversation. We first present a case study of the design of an electronic guidebook, focusing on how conversation analytic principles played a role in the design process. We then discuss how the guidebook project has inspired our continuing work in social, mobile audio spaces. In particular, we describe some as yet unrea...

  14. Rent control: a comparative analysis

    Scientific Electronic Library Online (English)

    S, Maass.

    Full Text Available Recent case law shows that vulnerable, previously disadvantaged private sector tenants are currently facing eviction orders - and consequential homelessness - on the basis that their leases have expired. In terms of the case law it is evident that once their leases have expired, these households do [...] not have access to alternative accommodation. In terms of the Constitution, this group of marginalised tenants have a constitutional right of access to adequate housing and a right to occupy land with legally secure tenure. The purpose of this article is to critically analyse a number of legislative interventions, and specifically rent control, that were imposed in various jurisdictions in order to provide strengthened tenure protection for tenants. The rationale for this analysis is to determine whether the current South African landlord-tenant regime is able to provide adequate tenure protection for vulnerable tenants and therefore in the process of transforming in line with the Constitution. The legal construction of rent control was adopted in pre-1994 South Africa, England and New York City to provide substantive tenure protection for tenants during housing shortages. These statutory interventions in the different private rental markets were justified on the basis that there was a general need to protect tenants against exploitation by landlords. However, the justification for the persistent imposition of rent control in New York City is different since it protects a minority group of financially weak tenants against homelessness. The English landlord-tenant regime highlights the importance of a well-structured social sector that can provide secure, long-term housing options for low-income households who are struggling to access the private rental sector. 
Additionally, the English rental housing framework shows that if the social sector is functioning as a "safety net" for low-income households, the private sector would be able to uphold deregulation. In light of these comparisons and the fact that the South African social sector is not functioning optimally yet, the question is whether the South African private sector is able to provide the required level of tenure protection for struggling tenants. Recent case law shows that tenants are at liberty to lodge unfair practice complaints with the Rental Housing Tribunals on the basis that the landlords' ground for termination of the lease constitutes an unfair practice. The Court defined an unfair practice as a practice that unreasonably prejudices the tenants' rights or interests. This judicial development signifies some transformation in the private sector since it allows the Tribunals to scrutinise landlords' reasons for termination of tenancies in light of tenants' personal and socioeconomic circumstances. The Tribunals are therefore empowered to weigh the interests of both parties and decide whether to confirm termination of the lease or set aside such termination. In light of this recent development, the Tribunals can provide strengthened tenure protection for destitute tenants on a case by case basis, which incorporates a flexible context-sensitive approach to the provision of secure housing rights in the landlord-tenant framework. This methodology is similar to the German approach. Even though this judicial development is welcomed, it raises some concerns with regard to landlords' property rights and specifically landlords' constitutional property rights since Tribunals are now at liberty to set aside contractually agreed grounds for termination of leases without any statutory guidance. The legislation fails to provide any information regarding legitimate grounds for termination, which might have to be rectified in future. 
The grounds listed in the rent control legislation should serve as a starting point to determine which grounds for termination of a lease should generally be upheld. However, German landlord-tenant law shows that a statutory ground for termination of a lease should not be imposed in an absolutist fashion but rather place a

  15. RENT CONTROL: A COMPARATIVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Sue-Mari Maass

    2012-11-01

    Full Text Available Recent case law shows that vulnerable, previously disadvantaged private sector tenants are currently facing eviction orders – and consequential homelessness – on the basis that their leases have expired. In terms of the case law it is evident that once their leases have expired, these households do not have access to alternative accommodation. In terms of the Constitution, this group of marginalised tenants have a constitutional right of access to adequate housing and a right to occupy land with legally secure tenure. The purpose of this article is to critically analyse a number of legislative interventions, and specifically rent control, that were imposed in various jurisdictions in order to provide strengthened tenure protection for tenants. The rationale for this analysis is to determine whether the current South African landlord-tenant regime is able to provide adequate tenure protection for vulnerable tenants and therefore in the process of transforming in line with the Constitution. The legal construction of rent control was adopted in pre-1994 South Africa, England and New York City to provide substantive tenure protection for tenants during housing shortages. These statutory interventions in the different private rental markets were justified on the basis that there was a general need to protect tenants against exploitation by landlords. However, the justification for the persistent imposition of rent control in New York City is different since it protects a minority group of financially weak tenants against homelessness. The English landlord-tenant regime highlights the importance of a well-structured social sector that can provide secure, long-term housing options for low-income households who are struggling to access the private rental sector. 
Additionally, the English rental housing framework shows that if the social sector is functioning as a "safety net" for low-income households, the private sector would be able to uphold deregulation. In light of these comparisons and the fact that the South African social sector is not functioning optimally yet, the question is whether the South African private sector is able to provide the required level of tenure protection for struggling tenants. Recent case law shows that tenants are at liberty to lodge unfair practice complaints with the Rental Housing Tribunals on the basis that the landlords' ground for termination of the lease constitutes an unfair practice. The Court defined an unfair practice as a practice that unreasonably prejudices the tenants' rights or interests. This judicial development signifies some transformation in the private sector since it allows the Tribunals to scrutinise landlords' reasons for termination of tenancies in light of tenants' personal and socio-economic circumstances. The Tribunals are therefore empowered to weigh the interests of both parties and decide whether to confirm termination of the lease or set aside such termination. In light of this recent development, the Tribunals can provide strengthened tenure protection for destitute tenants on a case by case basis, which incorporates a flexible context-sensitive approach to the provision of secure housing rights in the landlord-tenant framework. This methodology is similar to the German approach. Even though this judicial development is welcomed, it raises some concerns with regard to landlords' property rights and specifically landlords' constitutional property rights since Tribunals are now at liberty to set aside contractually agreed grounds for termination of leases without any statutory guidance. The legislation fails to provide any information regarding legitimate grounds for termination, which might have to be rectified in future. 
The grounds listed in the rent control legislation should serve as a starting point to determine which grounds for termination of a lease should generally be upheld. However, German landlord-tenant law shows that a statutory ground for termination of a lease should not be imposed in an absolutist fashion but rather place a

  16. Comparative Study and Analysis of Variability Tools

    OpenAIRE

    Bhumula, Mahendra Reddy

    2013-01-01

The dissertation provides a comparative analysis of a number of variability tools currently in use. It serves as a catalogue for practitioners interested in the topic. We compare a range of modelling, configuration and management tools for product line engineering. The tools surveyed are compared against functional, non-functional, governance and technical criteria. The outcome of the analysis is provided in tabular format.

  17. User's manual for rocket combustor interactive design (ROCCID) and analysis computer program. Volume 1: User's manual

    Science.gov (United States)

    Muss, J. A.; Nguyen, T. V.; Johnson, C. W.

    1991-01-01

The user's manual for the rocket combustor interactive design (ROCCID) computer program is presented. The program, written in Fortran 77, provides a standardized methodology using state-of-the-art codes and procedures for the analysis of a liquid rocket engine combustor's steady-state combustion performance and combustion stability. ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like-doublet or unlike-triplet, showerhead, shear coaxial, and swirl coaxial elements, as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can easily be added. The analysis model in ROCCID can account for the influence of acoustic cavities, Helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady-state performance and combustion stability of this design are evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.

  18. Factors Affecting Mobile Users’ Switching Intentions: A Comparative Study between the Brazilian and German Markets

    OpenAIRE

    Martins, Rodrigo C.; Luis Fernando Hor-Meyll; Jorge Brantes Ferreira

    2013-01-01

    In the competitive wireless market, there are many drivers behind customer defection. Switching barriers, service performance, perceived value in carriers’ offers, satisfaction and other constructs can play a pivotal role in customer switching processes among carriers. This study attempts to compare the influence of these factors, taking into account cultural similarities and dissimilarities, between Brazilian and German mobile users. A survey was conducted on two samples, comprising 202 us...

  19. A comparative study of approaches in user-centered health information retrieval

    OpenAIRE

    Thakkar, Harsh; Iyer, Ganesh; Majumder, Prasenjit

    2015-01-01

    In this paper, we survey various user-centered or context-based biomedical health information retrieval systems. We present and discuss the performance of systems submitted in CLEF eHealth 2014 Task 3 for this purpose. We classify and focus on comparing the two most prevalent retrieval models in biomedical information retrieval namely: Language Model (LM) and Vector Space Model (VSM). We also report on the effectiveness of using external medical resources and ontologies like...

  20. User's manual of a support system for human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yokobayashi, Masao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Tamura, Kazuo

    1995-10-01

Many kinds of human reliability analysis (HRA) methods have been developed. However, users must be skilled to apply them, and the methods involve complicated work such as drawing event trees (ET) and calculating uncertainty bounds. Moreover, no single method is complete enough on its own to evaluate human reliability. Therefore, a personal computer (PC) based support system for HRA has been developed to execute HRA practically and efficiently. The system offers two methods, namely a simple one and a detailed one. The former uses ASEP, a simplified THERP technique, while the latter combines OAT with HRA-ET/DeBDA. Users can select the method suited to their purpose. Human error probability (HEP) data were collected and built into a database for use with the support system. This paper describes the outline of the HRA methods, the support functions and the user's guide of the system. (author).

  1. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by the stimulus-response method contains all the methods for data processing of radiotracer experiments
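As a minimal illustration of the stimulus-response method described above, the sketch below normalizes a tracer impulse-response curve into an RTD and takes its first moment as the mean residence time. The detector readings and time grid are hypothetical.

```python
# Hypothetical tracer concentration readings C(t) at uniform time steps (s).
times = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
conc  = [0.0, 0.4, 1.0, 0.6, 0.2, 0.0]

def trapz(ys, xs):
    """Trapezoidal integration over paired samples."""
    return sum((xs[i + 1] - xs[i]) * (ys[i] + ys[i + 1]) / 2
               for i in range(len(xs) - 1))

area = trapz(conc, times)                 # zeroth moment of the curve
E = [c / area for c in conc]              # normalized RTD, integrates to 1
mean_rt = trapz([t * e for t, e in zip(times, E)], times)  # first moment
```

The same moment machinery extends to variance (second central moment), which RTD software typically reports alongside the mean residence time.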

  2. Single and Multiple Hand Gesture Recognition Systems: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Siddharth Rautaray

    2014-10-01

Full Text Available With the evolution of higher computing speeds, efficient communication technologies and advanced display techniques, legacy HCI techniques have become obsolete and no longer support accurate, fast information flow on present-day computing devices. Hence user-friendly human-machine interfaces for real-time human-computer interaction have to be designed and developed to make man-machine interaction more intuitive. Vision-based hand gesture recognition affords users the ability to interact with computers in more natural and intuitive ways. Such gesture recognition systems generally consist of three main modules: hand segmentation, hand tracking and gesture recognition from hand features, designed using different image processing techniques and integrated with different applications. New interfaces based on hand gesture recognition are increasingly being designed for interaction with computing devices. This paper provides a comparative analysis of real-time vision-based hand gesture recognition systems based on interaction using single and multiple hand gestures. Single hand gesture recognition systems (SHGRS) are less complex to implement, but are constrained in the number of distinct gestures, which is far larger with the permutations and combinations of gestures possible in multiple hand gesture recognition systems (MHGRS). The thorough comparative analysis covers further vital parameters of the recognition systems.
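The three-module pipeline named above (segmentation, tracking, recognition) can be sketched in miniature. The intensity threshold, the toy frame and the area-based classifier are illustrative stand-ins for real image-processing techniques, not components of any surveyed system.

```python
# Toy frame: 2D grid of pixel intensities; values above THRESH count as "hand".
THRESH = 128

def segment(frame):
    """Hand segmentation: binary mask of pixels brighter than a skin threshold."""
    return [[1 if px > THRESH else 0 for px in row] for row in frame]

def track(mask):
    """Hand tracking: centroid of the segmented region (None if no hand)."""
    pts = [(r, c) for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts), sum(p[1] for p in pts) / len(pts))

def recognize(mask):
    """Gesture classification from a crude feature: segmented area."""
    area = sum(map(sum, mask))
    return "open_palm" if area >= 4 else "fist"

frame = [[0, 200, 210, 0],
         [0, 220, 230, 0],
         [0,   0,   0, 0]]
mask = segment(frame)
centroid = track(mask)
gesture = recognize(mask)
```

A real system would replace each stage with image-processing operators (skin-color models, Kalman or mean-shift trackers, feature-based classifiers), but the dataflow between the three modules is the same.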

  3. User's manual of a support system for human reliability analysis

    International Nuclear Information System (INIS)

Many kinds of human reliability analysis (HRA) methods have been developed. However, users must be skilled to apply them, and the methods involve complicated work such as drawing event trees (ET) and calculating uncertainty bounds. Moreover, no single method is complete enough on its own to evaluate human reliability. Therefore, a personal computer (PC) based support system for HRA has been developed to execute HRA practically and efficiently. The system offers two methods, namely a simple one and a detailed one. The former uses ASEP, a simplified THERP technique, while the latter combines OAT with HRA-ET/DeBDA. Users can select the method suited to their purpose. Human error probability (HEP) data were collected and built into a database for use with the support system. This paper describes the outline of the HRA methods, the support functions and the user's guide of the system. (author)
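The abstract above mentions a human error probability (HEP) database. A common way such subtask HEPs are aggregated, assuming independent subtasks, is the complement of the joint success probability; the values below are hypothetical and not drawn from any real THERP or ASEP table.

```python
# Illustrative HEPs for three independent subtasks (hypothetical values).
heps = [0.003, 0.01, 0.001]

def combined_hep(probabilities):
    """Probability that at least one subtask fails, assuming independence."""
    success = 1.0
    for p in probabilities:
        success *= (1.0 - p)
    return 1.0 - success

total = combined_hep(heps)
```

Real HRA methods refine this with dependency models and performance-shaping factors, which is precisely the complicated work the support system automates.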

  4. Comparative Analysis of Virtual Education Applications

    Directory of Open Access Journals (Sweden)

    Mehmet KURT

    2006-10-01

Full Text Available The research was conducted in order to make a comparative analysis of virtual education applications, using a survey model. The study group consists of 300 institutions providing virtual education in the fall, spring and summer semesters of 2004: 246 in the USA, 10 in Australia, 3 in South Africa, 10 in India, 21 in the UK, 6 in Japan and 4 in Turkey. The information was collected via an online questionnaire sent to the target group by e-mail. The questionnaire was developed in two categories: personal information, and institutions and their virtual education applications. The English web design of the online questionnaire and the database were prepared with Microsoft ASP code, the script language of the Microsoft FrontPage editor, and tested on a personal web site. The questionnaire was piloted at institutions providing virtual education in Australia. The English text of the questionnaire and the web site design were sent to educational technology and virtual education specialists in the countries of the study group. Based on the feedback received, spelling mistakes were corrected and concept and language validity were established. The questionnaire was administered over 40 weeks during March-November 2004. Only 135 institutions replied. Two questionnaires were discarded because they included mistaken coding of institution and country names. The 133 valid questionnaires cover approximately 44% of the study group. Questionnaires saved in the online database were transferred to Microsoft Excel and then to SPSS via an external database connection. In line with the research objectives, the data were analyzed on computer using the SPSS statistics package. In the data analysis, frequency (f), percentage (%) and arithmetic mean were used.
For comparisons by country, institution, year and other variables, the chi-square test, independent t-test and one-way analysis of variance (F test) were used, along with the Kruskal-Wallis H test and Mann-Whitney U test. Although virtual education applications differ in choices and practices across countries, education levels and types, the analysis shows that the study group consists of graduate- and undergraduate-level personal users with educational expectations, aged 18-45 and working full time. The institutions mostly offer undergraduate and graduate programs in the social sciences, granting accredited documents, certificates and titles. Most instructors have received planned training, work full time and receive technical support. Financial resources are obtained from student fees and are mostly used for personnel costs. Administration and organization are centralized and intertwined with universities; for physical facilities, institutions use information-processing centers and virtual classrooms, and for infrastructure and support services they use information-processing services. In the teaching process they use both synchronous and asynchronous presentation technologies; to support course content they use e-mail, web, CD and course-book technologies as the basic learning environment; they prefer different environments to cover face-to-face education needs; they take self-directed learning and collaboration as a basis and take project and term-paper evaluation seriously; they mostly prefer multiple-choice tests and usually hold virtual course exams over the internet.
Regarding the characteristics of their institutions’ applications, the study group mostly agreed on connectivity and dependence on connection opportunities. A significant difference was found between the institutions’ characteristics and the model for developing computer labs, when they had started to provide virtual lessons and presentation technologies u

  5. ModelMate - A graphical user interface for model analysis

    Science.gov (United States)

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  6. Development of output user interface software to support analysis

    Science.gov (United States)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-09-01

Data processing software packages such as VSOP and MCNPX have been scientifically validated and are complete. However, their outputs are huge, complex text files, so in the analysis process users need additional tools such as Microsoft Excel to present informative results. This research develops a user interface for the output of VSOP and MCNPX: VSOP output supports neutronic analysis and MCNPX output supports burn-up analysis. The software was developed with an iterative method that allows revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with the conventional approach using Microsoft Excel. Python was chosen as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. The values supporting neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). The values are visualized graphically to support analysis.
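The kind of post-processing this interface automates can be sketched as a small parser that pulls burn-up and k-eff values out of an output listing. The line format below is an assumption for illustration, not the actual VSOP or MCNPX layout.

```python
import re

# Hypothetical excerpt of a reactor-code output listing.
output_text = """\
cycle   1   burnup =   1.25 GWd/t   k-eff = 1.04321
cycle   2   burnup =   2.50 GWd/t   k-eff = 1.02910
cycle   3   burnup =   3.75 GWd/t   k-eff = 1.01587
"""

# Capture the burn-up and k-eff value from each cycle line.
pattern = re.compile(r"burnup =\s+([\d.]+) GWd/t\s+k-eff = ([\d.]+)")
records = [(float(b), float(k)) for b, k in pattern.findall(output_text)]
```

Once the records are structured pairs, plotting k-eff against burn-up (e.g. with matplotlib) replaces the manual Excel step the abstract describes.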

  7. Development of output user interface software to support analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

Data processing software packages such as VSOP and MCNPX have been scientifically validated and are complete. However, their outputs are huge, complex text files, so in the analysis process users need additional tools such as Microsoft Excel to present informative results. This research develops a user interface for the output of VSOP and MCNPX: VSOP output supports neutronic analysis and MCNPX output supports burn-up analysis. The software was developed with an iterative method that allows revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with the conventional approach using Microsoft Excel. Python was chosen as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. The values supporting neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). The values are visualized graphically to support analysis.

  8. Development of output user interface software to support analysis

    International Nuclear Information System (INIS)

Data processing software packages such as VSOP and MCNPX have been scientifically validated and are complete. However, their outputs are huge, complex text files, so in the analysis process users need additional tools such as Microsoft Excel to present informative results. This research develops a user interface for the output of VSOP and MCNPX: VSOP output supports neutronic analysis and MCNPX output supports burn-up analysis. The software was developed with an iterative method that allows revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with the conventional approach using Microsoft Excel. Python was chosen as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. The values supporting neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). The values are visualized graphically to support analysis

  9. FISCAL DISCIPLINE WITHIN THE EU: COMPARATIVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    SORIN CELEA

    2013-12-01

Full Text Available This paper focuses on the analysis of the convergence indicators relating to the fiscal area in the EU; following a description of the main peculiarities of the convergence criteria, the research develops a critical, comparative analysis of the actual values of the fiscal convergence indicators registered in EU countries against the reference values of those indicators, with emphasis on the differences between emerging and developed countries.
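The comparison of actual fiscal indicator values against reference values can be illustrated with the two Maastricht reference ratios (government deficit at most 3% of GDP, government debt at most 60% of GDP); the country figures below are hypothetical.

```python
# Maastricht reference values: deficit <= 3% of GDP, debt <= 60% of GDP.
DEFICIT_REF, DEBT_REF = 3.0, 60.0

def meets_fiscal_criteria(deficit_pct, debt_pct):
    """True if both fiscal convergence indicators are within reference values."""
    return deficit_pct <= DEFICIT_REF and debt_pct <= DEBT_REF

# Hypothetical country figures (% of GDP), for illustration only.
countries = {"A": (2.1, 45.0), "B": (4.8, 72.3)}
status = {name: meets_fiscal_criteria(*vals) for name, vals in countries.items()}
```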

  10. Comparative Environmental Threat Analysis: Three Case Studies.

    Science.gov (United States)

    Latour, J. B.; Reiling, R.

    1994-01-01

    Reviews how carrying capacity for different environmental problems is operationalized. Discusses whether it is possible to compare threats, using the exceeding of carrying capacity as a yardstick. Points out problems in comparative threat analysis using three case studies: threats to European groundwater resources, threats to ecosystems in Europe,…

  11. MultiMetEval: Comparative and Multi-Objective Analysis of Genome-Scale Metabolic Models

    OpenAIRE

    Zakrzewski, P.; Medema, M. H.; Gevorgyan, A.; Kierzek, A. M.; Breitling, R.; Takano, E.

    2012-01-01

    Comparative metabolic modelling is emerging as a novel field, supported by the development of reliable and standardized approaches for constructing genome-scale metabolic models in high throughput. New software solutions are needed to allow efficient comparative analysis of multiple models in the context of multiple cellular objectives. Here, we present the user-friendly software framework Multi-Metabolic Evaluator (MultiMetEval), built upon SurreyFBA, which allows the user to compose collect...

  12. Learning Mobile App Design from User Review Analysis

    OpenAIRE

    Elisabeth Platzer; Otto Petrovic

    2011-01-01

    This paper presents a new learning environment for developers of mobile apps that merges two quite different views of the same topic. Creative design and system engineering are core issues in the development process that are based on diverging principles. This new learning environment aims to address both points of view by not suppressing one of them but trying to benefit from both. User review content analysis is introduced as a tool to generate information that is useful for both aspects.

  13. A Comparative analysis: QA evaluation questions versus real-world queries

    OpenAIRE

    Leveling, Johannes

    2010-01-01

    This paper presents a comparative analysis of user queries to a web search engine, questions to a Q&A service (answers.com), and questions employed in question answering (QA) evaluations at TREC and CLEF. The analysis shows that user queries to search engines contain mostly content words (i.e. keywords) but lack structure words (i.e. stopwords) and capitalization. Thus, they resemble natural language input after case folding and stopword removal. In contrast, topics for QA evaluation and ques...

  14. Graphical User Interface for Simulink Integrated Performance Analysis Model

    Science.gov (United States)

    Durham, R. Caitlyn

    2009-01-01

The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for its Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, containing the output values for each test of the simulation so that they may be graphed and compared to other values.
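The start-box check at the heart of the model can be sketched as a simple range test over temperature and pressure; the limits below are placeholders chosen for illustration, not actual J-2X values.

```python
# Hypothetical start-box limits per propellant (temperature in K, pressure in
# kPa); the real J-2X limits are not given in the abstract.
START_BOX = {
    "LH2": {"temp": (20.0, 24.0), "press": (200.0, 310.0)},
    "LOX": {"temp": (90.0, 97.0), "press": (240.0, 360.0)},
}

def in_start_box(propellant, temp, press):
    """Check whether a propellant state lies inside its start box."""
    box = START_BOX[propellant]
    t_lo, t_hi = box["temp"]
    p_lo, p_hi = box["press"]
    return t_lo <= temp <= t_hi and p_lo <= press <= p_hi

ok = in_start_box("LH2", 21.5, 250.0)    # inside both ranges
bad = in_start_box("LOX", 101.0, 300.0)  # temperature above the box
```

Sweeping this test over a grid of heat-leak and mass-flow-rate cases is what the GUI-driven simulation batch automates.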

  15. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.
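The default-settings idea described above (reasonable defaults that succeed for most data sets, overridable by the user) can be sketched as follows; the setting name and the straight-line least-squares fit are illustrative, not the tool's actual options or method.

```python
# Default analysis settings applied unless the user overrides them; the
# setting name is hypothetical, for illustration only.
DEFAULTS = {"intercept": True}

def fit_line(xs, ys, **overrides):
    """Least-squares straight-line fit, merging user overrides over defaults."""
    settings = {**DEFAULTS, **overrides}
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx if settings["intercept"] else 0.0
    return slope, intercept

slope, intercept = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

A caller who knows the data pass through the origin would write `fit_line(xs, ys, intercept=False)`; everyone else gets the safe default.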

  16. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    International Nuclear Information System (INIS)

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework

  17. Development of a task analysis tool to facilitate user interface design

    Science.gov (United States)

    Scholtz, Jean C.

    1992-01-01

    A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

  18. Comparative Analysis of Protein Domain Organization

    OpenAIRE

    Ye, Yuzhen; Godzik, Adam

    2004-01-01

    We have developed a set of graph theory-based tools, which we call Comparative Analysis of Protein Domain Organization (CADO), to survey and compare protein domain organizations of different organisms. In the language of CADO, the organization of protein domains in a given organism is shown as a domain graph in which protein domains are represented as vertices, and domain combinations, defined as instances of two domains found in one protein, are represented as edges. CADO provides a new way ...
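The domain-graph construction described above can be sketched directly from its definition: vertices are domains, and an edge links two domains that co-occur in one protein. The domain names below are hypothetical.

```python
# Each protein is modelled as its list of domains (hypothetical names).
proteins = [["SH3", "SH2", "Kinase"], ["SH2", "Kinase"], ["PDZ", "SH3"]]

def domain_graph(protein_list):
    """Build (vertices, edges): domains, and unordered co-occurring pairs."""
    vertices, edges = set(), set()
    for domains in protein_list:
        vertices.update(domains)
        for i, a in enumerate(domains):
            for b in domains[i + 1:]:
                edges.add(frozenset((a, b)))
    return vertices, edges

V, E = domain_graph(proteins)
```

Comparing two organisms then reduces to set operations on their edge sets, which is the kind of graph-level comparison a tool like CADO systematizes.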

  19. Comparative analysis of methods of hardness assessment

    OpenAIRE

    Czarski, A.

    2009-01-01

Purpose: The aim of this paper is to show how statistical methods can be utilized for process management. Design/methodology/approach: The research methodology is based on theoretical analysis and empirical research. A practical solution is presented to compare measurement methods for hardness and to estimate capability indices of the measurement system. Findings: Measurement system analysis (MSA), particularly the theory of statistical tests, gives correct results for the analysed case. Resea...

  20. Strategic Analysis of Trust Models for User-Centric Networks

    Directory of Open Access Journals (Sweden)

    Marta Kwiatkowska

    2013-03-01

    Full Text Available We present a strategic analysis of a trust model that has recently been proposed for promoting cooperative behaviour in user-centric networks. The mechanism for cooperation is based on a combination of reputation and virtual currency schemes in which service providers reward paying customers and punish non-paying ones by adjusting their reputation, and hence the price they pay for services. We model and analyse this system using PRISM-games, a tool that performs automated verification and strategy synthesis for stochastic multi-player games using the probabilistic alternating-time temporal logic with rewards (rPATL. We construct optimal strategies for both service users and providers, which expose potential risks of the cooperation mechanism and which we use to devise improvements that counteract these risks.
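The reward-and-punish mechanism the abstract describes can be sketched in a few lines; this is a minimal toy, not the paper's PRISM-games model, and the update rule and pricing function are invented for illustration.

```python
# Minimal reputation/price mechanism: paying raises reputation, defaulting
# lowers it, and the price charged falls as reputation rises.
BASE_PRICE = 10.0

def price(reputation):
    """Service price decreases as the customer's reputation grows."""
    return BASE_PRICE / (1.0 + reputation)

def update(reputation, paid):
    """Reward payment, punish non-payment; reputation never drops below 0."""
    return reputation + 1 if paid else max(0, reputation - 2)

rep = 0
rep = update(rep, True)    # customer pays
rep = update(rep, True)    # pays again
p_good = price(rep)        # cheaper service for a cooperative customer
rep = update(rep, False)   # customer defaults
p_bad = price(rep)         # back to the base price
```

The strategic analysis in the paper asks whether players can game exactly this kind of rule, e.g. by alternating payment and default to keep prices low.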

  1. Reinforcing User Data Analysis with Ganga in the LHC Era: Scalability, Monitoring and User-support

    CERN Document Server

    Brochu, F; The ATLAS collaboration; Ebke, J; Egede, U; Elmsheuser, J; Jha, M K; Kokoszkiewicz, L; Lee, H C; Maier, A; Moscicki, J; Munchen, T; Reece, W; Samset, B; Slater, M; Tuckett, D; Van der Ster, D; Williams, M

    2010-01-01

Ganga is a grid job submission and management system widely used in the ATLAS and LHCb experiments and several other communities in the context of the EGEE project. The particle physics communities have entered the LHC operation era which brings new challenges for user data analysis: a strong growth in the number of users and jobs is already noticeable. Current work in the Ganga project is focusing on dealing with these challenges. In recent Ganga releases the support for the pilot job based grid systems Panda and Dirac of the ATLAS and LHCb experiment respectively have been strengthened. A more scalable job repository architecture, which allows efficient storage of many thousands of jobs in XML or several database formats, was recently introduced. A better integration with monitoring systems, including the Dashboard and job execution monitor systems is underway. These will provide comprehensive and easy job monitoring. A simple to use error reporting tool integrated at the Ganga command-line will help to impr...

  2. Reinforcing user data analysis with Ganga in the LHC era: scalability, monitoring and user-support.

    CERN Document Server

    Brochu, F; The ATLAS collaboration; Ebke, J; Egede, U; Elmsheuser, J; Jha, M K; Kokoszkiewicz, L; Lee, H C; Maier, A; Moscicki, J; Munchen, T; Reece, W; Samset, B; Slater, M; Tuckett, D; Van der Ster, D; Williams, M

    2011-01-01

    Ganga is a grid job submission and management system widely used in the ATLAS and LHCb experiments and several other communities in the context of the EGEE project. The particle physics communities have entered the LHC operation era which brings new challenges for user data analysis: a strong growth in the number of users and jobs is already noticeable. Current work in the Ganga project is focusing on dealing with these challenges. In recent Ganga releases the support for the pilot job based grid systems Panda and Dirac of the ATLAS and LHCb experiment respectively have been strengthened. A more scalable job repository architecture, which allows efficient storage of many thousands of jobs in XML or several database formats, was recently introduced. A better integration with monitoring systems, including the Dashboard and job execution monitor systems is underway. These will provide comprehensive and easy job monitoring. A simple to use error reporting tool integrated at the Ganga command-line will help to imp...

  3. User's Guide for the NREL Force and Loads Analysis Program. Version 2.2

    Energy Technology Data Exchange (ETDEWEB)

    Wright, A D

    1992-08-01

The following report gives the reader an overview of and instructions on the proper use of the National Renewable Energy Laboratory Force and Loads Analysis Program (FLAP, version 2.2). It is intended as a tool for prediction of rotor and blade loads and response for two- or three-bladed rigid hub wind turbines. The effects of turbulence are accounted for. The objectives of the report are to give an overview of the code and also show the methods of data input and correct code execution steps in order to model an example two-bladed rigid hub turbine. A large portion of the discussion (Sections 6.0, 7.0, and 8.0) is devoted to the subject of inputting and running the code for wind turbulence effects. The ability to include turbulent wind effects is perhaps the biggest change in the code since the release of FLAP version 2.01 in 1988. This report is intended to be a user's guide. It does not contain a theoretical discussion on equations of motion, assumptions, underlying theory, etc. It is intended to be used in conjunction with Wright, Buhl, and Thresher (1988).

  4. Comparative Efficiency Analysis of Referral Costs in

    OpenAIRE

    Portela, Maria; Thanassoulis, Emmanuel; Graveney, Mike

    2010-01-01

    The aim of this paper is to compare English General Practitioner (GP) units in terms of their overall referral costs through Data Envelopment Analysis (DEA). Results revealed potential cost savings and benchmark practices under 4 perspectives: ‘overall cost efficiency’, ‘technical efficiency’, ‘allocative efficiency’, and ‘price efficiency’.

  5. Diagnosing MOV problems using comparative trace analysis

    International Nuclear Information System (INIS)

    The paper presents the concept of comparative trace analysis and shows it to be very effective in diagnosing motor operated valve (MOV) problems. Comparative trace analysis is simply the process of interpreting simultaneously gathered traces, each presenting a different perspective on the same series of events. The opening and closing of a motor operated valve is such a series of events. The simultaneous traces are obtained using Liberty Technologies' Valve Operation Test and Evaluation System (VOTES®). The traces include stem thrust, motor current, motor power factor, motor power, switch actuations, vibration in three different frequency bands, spring pack displacement, and spring pack force. Spare and auxiliary channels enable additional key parameters to be measured, such as differential pressure and stem displacement. Though not specifically illustrated in this paper, the VOTES system also provides for FFT analysis on all traces except switches.

  6. Motives, barriers and quality evaluation in fish consumption situations : Exploring and comparing heavy and light users in Spain and Belgium

    DEFF Research Database (Denmark)

    Brunsø, Karen; Verbeke, Wim

    2009-01-01

    Purpose - The purpose of this paper is to investigate motives and barriers for eating fish among light users and heavy users, to discuss consumer evaluation of fish quality, and to explore the existence of cross-cultural fish consumer segments. Design/methodology/approach - Qualitative data were collected through six focus group discussions, three in Spain and three in Belgium. In each country, one group consisted of heavy users while two groups included light users. Findings - The same attitudinal motives and barriers for fish consumption can be found in both countries and across user groups, even though fish consumption levels differ considerably. The main motives for eating fish are health and taste, while the main barriers are price perception, smell when cooking fish, and that fish does not deliver the same level of satiety as meat. Large differences are found between countries and user groups with respect to preparation skills and the use of quality cues. Heavy users are very skilled in evaluating fish quality, especially those in Spain, while light users, especially those in Belgium, make seemingly irrational assumptions when evaluating the quality of fish. Research limitations/implications - This study is based on qualitative focus group discussions in two European countries only. Originality/value - This study explores and compares motives, barriers and quality evaluation among heavy and light fish consumers in two European countries. The paper yields valuable insights for further quantitative research into explaining variations in fish consumption, as well as for fish quality evaluation and fish market segmentation studies.

  7. Learning Mobile App Design from User Review Analysis

    Directory of Open Access Journals (Sweden)

    Elisabeth Platzer

    2011-07-01

    This paper presents a new learning environment for developers of mobile apps that merges two quite different views of the same topic. Creative design and system engineering are core issues in the development process that are based on diverging principles. This new learning environment aims to address both points of view by not suppressing one of them but trying to benefit from both. User review content analysis is introduced as a tool to generate information that is useful for both aspects.

  8. BWR plant dynamic analysis code BWRDYN user's manual

    International Nuclear Information System (INIS)

    Computer code BWRDYN has been developed for thermal-hydraulic analysis of a BWR plant. It can analyze the various types of transients caused not only by small but also by large disturbances, such as operating mode changes and/or system malfunctions. The main analytical models of the BWRDYN code have been verified against measured data from an actual BWR plant. Furthermore, the installation of a BOP (Balance of Plant) model has made it possible to analyze the effect of the BOP on the reactor system. This report describes the analytical models and gives instructions for users of the BWRDYN code. (author)

  9. User Interface Design for Analysis of Sensor Systems

    OpenAIRE

    Jonsson, Lisa; Sallhammar, Karin

    2003-01-01

    In the future network-based Swedish Defence (NBD), attaining information superiority will be of great importance. This will be achieved by a network of networks where decision-makers, information systems and weapon systems are linked together. As a part of the development of NBD, we have performed a study of user interface design for a future network-based tool package for analysis of sensor systems, referred to as the C2SR-system. This thesis was performed at Ericsson Microwave Systems AB, Sensor ...

  10. A graphical user-interface for propulsion system analysis

    Science.gov (United States)

    Curlett, Brian P.; Ryall, Kathleen

    1993-01-01

    NASA LeRC uses a series of computer codes to calculate installed propulsion system performance and weight. The need to evaluate more advanced engine concepts with a greater degree of accuracy has resulted in an increase in complexity of this analysis system. Therefore, a graphical user interface was developed to allow the analyst to more quickly and easily apply these codes. The development of this interface and the rationale for the approach taken are described. The interface consists of a method of pictorially representing and editing the propulsion system configuration, forms for entering numerical data, on-line help and documentation, post processing of data, and a menu system to control execution.

  11. Comparing the performance of expert user heuristics and an integer linear program in aircraft carrier deck operations.

    Science.gov (United States)

    Ryan, Jason C; Banerjee, Ashis Gopal; Cummings, Mary L; Roy, Nicholas

    2014-06-01

    Planning operations across a number of domains can be considered as resource allocation problems with timing constraints. An unexplored instance of such a problem domain is the aircraft carrier flight deck, where, in current operations, replanning is done without the aid of any computerized decision support. Rather, veteran operators employ a set of experience-based heuristics to quickly generate new operating schedules. These expert user heuristics are neither codified nor evaluated by the United States Navy; they have grown solely from the convergent experiences of supervisory staff. As unmanned aerial vehicles (UAVs) are introduced in the aircraft carrier domain, these heuristics may require alterations due to differing capabilities. The inclusion of UAVs also allows for new opportunities for on-line planning and control, providing an alternative to the current heuristic-based replanning methodology. To investigate these issues formally, we have developed a decision support system for flight deck operations that utilizes a conventional integer linear program-based planning algorithm. In this system, a human operator sets both the goals and constraints for the algorithm, which then returns a proposed schedule for operator approval. As a part of validating this system, the performance of this collaborative human-automation planner was compared with that of the expert user heuristics over a set of test scenarios. The resulting analysis shows that human heuristics often outperform the plans produced by an optimization algorithm, but are also often more conservative. PMID:23934675
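The flavor of the comparison above can be sketched with a toy resource-allocation problem with timing constraints. This is not the paper's flight-deck integer linear program; the jobs, their time windows and values, and the earliest-finish greedy rule are all invented for illustration, with brute-force enumeration standing in for the optimization algorithm.

```python
from itertools import combinations

# Hypothetical jobs on a single resource: (start, end, value)
jobs = [(0, 3, 5), (2, 5, 6), (4, 7, 5), (1, 8, 9), (6, 9, 4)]

def compatible(subset):
    # jobs are compatible if, sorted by start time, none overlap
    ivs = sorted(subset)
    return all(a[1] <= b[0] for a, b in zip(ivs, ivs[1:]))

def optimal(jobs):
    # brute-force stand-in for the ILP: best conflict-free total value
    best = 0
    for r in range(len(jobs) + 1):
        for sub in combinations(jobs, r):
            if compatible(sub):
                best = max(best, sum(v for _, _, v in sub))
    return best

def greedy_heuristic(jobs):
    # an "expert rule of thumb": always take the job that finishes earliest
    total, t = 0, 0
    for s, e, v in sorted(jobs, key=lambda j: j[1]):
        if s >= t:
            total += v
            t = e
    return total

g, o = greedy_heuristic(jobs), optimal(jobs)
```

On this particular instance the heuristic matches the optimum, echoing the paper's finding that experience-based rules can rival the optimizer's plans.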

  12. Comparative Genome Analysis of Enterobacter cloacae

    OpenAIRE

    Liu, Wing-yee; Wong, Chi-fat; Chung, Karl Ming-kar; Jiang, Jing-wei; Leung, Frederick Chi-ching

    2013-01-01

    The Enterobacter cloacae species includes an extremely diverse group of bacteria that are associated with plants, soil and humans. Publication of the complete genome sequence of the plant growth-promoting endophytic E. cloacae subsp. cloacae ENHKU01 provided an opportunity to perform the first comparative genome analysis between strains of this dynamic species. Examination of the pan-genome of E. cloacae showed that the conserved core genome retains the general physiological and survival gene...

  13. Comparative analysis of multicriteria decision making methods

    OpenAIRE

    Mota, Pedro Jorge Gomes

    2013-01-01

    The main objective of this dissertation is to perform a Comparative Analysis of different Multicriteria Decision Making Methods applied to real-world problems, in order to produce relevant information to enable the incorporation of those methods on computational platforms. The current document presents a simple case study concerning a decision support application targeted for a real problem regarding retrofitting alternatives of a building with energy efficiency impact. The application proces...

  14. Comparative Analysis on Constitutional Supervision Modes

    OpenAIRE

    Wang, Wenjing; Wang, Xiaorui

    2012-01-01

    Constitution is the fundamental law of a nation and also the general regulations on administering state affairs and ensuring national security. This is why constitutional supervision is so important for a country. However, there are still many problems existing under the supervision mechanism regarding its operability, materiality, and rationality. This paper tries to give proper suggestions on perfecting Chinese constitutional supervision through comparative analysis and other co...

  15. Comparative analysis of twelve Dothideomycete plant pathogens

    Energy Technology Data Exchange (ETDEWEB)

    Ohm, Robin; Aerts, Andrea; Salamov, Asaf; Goodwin, Stephen B.; Grigoriev, Igor

    2011-03-11

    The Dothideomycetes are one of the largest and most diverse groups of fungi. Many are plant pathogens and pose a serious threat to agricultural crops grown for biofuel, food or feed. Most Dothideomycetes have only a single host and related Dothideomycete species can have very diverse host plants. Twelve Dothideomycete genomes have currently been sequenced by the Joint Genome Institute and other sequencing centers. They can be accessed via Mycocosm, which has tools for comparative analysis.

  16. Embedded Hyperchaotic Generators: A Comparative Analysis

    Science.gov (United States)

    Sadoudi, Said; Tanougast, Camel; Azzaz, Mohamad Salah; Dandache, Abbas

    In this paper, we present a comparative analysis of FPGA implementation performance, in terms of throughput and resource cost, of five well-known autonomous continuous hyperchaotic systems. The goal of this analysis is to identify the embedded hyperchaotic generator which leads to designs with small logic area cost, satisfactory throughput rates, low power consumption and the low latency required for embedded applications such as secure digital communications between embedded systems. To implement the four-dimensional (4D) chaotic systems, we use a new structural hardware architecture based on a direct VHDL description of the fourth-order Runge-Kutta method (RK-4). The comparative analysis shows that the hyperchaotic Lorenz generator provides attractive performance compared to the others. Its hardware implementation requires only 2067 CLB slices, 36 multipliers and no block RAMs, and achieves a throughput rate of 101.6 Mbps at the output of the FPGA circuit, at a clock frequency of 25.315 MHz, with a low latency of 316 ns. These implementation results make the embedded hyperchaotic Lorenz generator the best candidate for embedded communications applications.
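The RK-4 stepping that the paper describes in VHDL can be sketched in software. As a hedged illustration only, the code below integrates the classic three-dimensional Lorenz system (the paper's generators are 4D hyperchaotic variants) with a fixed-step fourth-order Runge-Kutta scheme; the step size and initial state are arbitrary.

```python
# Classic Lorenz parameters (the paper's 4D systems differ)
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def lorenz(state):
    x, y, z = state
    return (SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z)

def rk4_step(state, h):
    # one fixed-step fourth-order Runge-Kutta update
    def add(s, k, c):
        return tuple(si + c * ki for si, ki in zip(s, k))
    k1 = lorenz(state)
    k2 = lorenz(add(state, k1, h / 2))
    k3 = lorenz(add(state, k2, h / 2))
    k4 = lorenz(add(state, k3, h))
    return tuple(s + h / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

state = (1.0, 1.0, 1.0)
for _ in range(10000):
    state = rk4_step(state, 0.005)
```

A hardware realization unrolls exactly these four stage evaluations per output sample, which is what drives the logic-area and latency figures compared in the paper.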

  17. GCtool for fuel cell systems design and analysis : user documentation.

    Energy Technology Data Exchange (ETDEWEB)

    Ahluwalia, R.K.; Geyer, H.K.

    1999-01-15

    GCtool is a comprehensive system design and analysis tool for fuel cell and other power systems. A user can analyze any configuration of component modules and flows under steady-state or dynamic conditions. Component models can be arbitrarily complex in modeling sophistication and new models can be added easily by the user. GCtool also treats arbitrary system constraints over part or all of the system, including the specification of nonlinear objective functions to be minimized subject to nonlinear, equality or inequality constraints. This document describes the essential features of the interpreted language and the window-based GCtool environment. The system components incorporated into GCtool include a gas flow mixer, splitter, heater, compressor, gas turbine, heat exchanger, pump, pipe, diffuser, nozzle, steam drum, feed water heater, combustor, chemical reactor, condenser, fuel cells (proton exchange membrane, solid oxide, phosphoric acid, and molten carbonate), shaft, generator, motor, and methanol steam reformer. Several examples of system analysis at various levels of complexity are presented. Also given are instructions for generating two- and three-dimensional plots of data and the details of interfacing new models to GCtool.

  18. A graphical user interface for infant ERP analysis.

    Science.gov (United States)

    Kaatiala, Jussi; Yrttiaho, Santeri; Forssman, Linda; Perdue, Katherine; Leppänen, Jukka

    2014-09-01

    Recording of event-related potentials (ERPs) is one of the best-suited technologies for examining brain function in human infants. Yet the existing software packages are not optimized for the unique requirements of analyzing artifact-prone ERP data from infants. We developed a new graphical user interface that enables an efficient implementation of a two-stage approach to the analysis of infant ERPs. In the first stage, video records of infant behavior are synchronized with ERPs at the level of individual trials to reject epochs with noncompliant behavior and other artifacts. In the second stage, the interface calls MATLAB and EEGLAB (Delorme & Makeig, Journal of Neuroscience Methods 134(1):9-21, 2004) functions for further preprocessing of the ERP signal itself (i.e., filtering, artifact removal, interpolation, and rereferencing). Finally, methods are included for data visualization and analysis by using bootstrapped group averages. Analyses of simulated and real EEG data demonstrated that the proposed approach can be effectively used to establish task compliance, remove various types of artifacts, and perform representative visualizations and statistical comparisons of ERPs. The interface is available for download from http://www.uta.fi/med/icl/methods/eeg.html in a format that is widely applicable to ERP studies with special populations and open for further editing by users. PMID:24264591
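The bootstrapped group averages mentioned above follow a standard recipe: resample subjects with replacement many times and take the spread of the resampled means as a confidence band. A minimal sketch, with hypothetical peak-amplitude values standing in for real ERP data:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible
# Hypothetical per-subject ERP peak amplitudes (µV), invented for illustration
subject_means = [4.2, 5.1, 3.8, 4.9, 5.5, 4.0, 4.7, 5.2]

def bootstrap_ci(data, n_boot=2000, alpha=0.05):
    # resample subjects with replacement, average each resample,
    # and read the CI off the percentiles of the resampled means
    means = sorted(
        sum(random.choices(data, k=len(data))) / len(data)
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2))]
    return lo, hi

lo, hi = bootstrap_ci(subject_means)
```

The interface applies the same idea per time point and channel to draw confidence bands around the grand-average waveform.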

  19. A Comparative Study of Mortality Among Puerto Rican Injection Drug Users in East Harlem, New York, and Bayamon, Puerto Rico

    OpenAIRE

    Colon, Hector Manuel; Deren, Sherry; Robles, Rafaela Rivera; Kang, Sung -Yeon; Cabassa, Myrna; Sahai, Hardeo

    2006-01-01

    Drug users have been found to be at high risk of mortality but the mortality experience of Hispanic drug users remains understudied. This study assessed mortality among Puerto Rican injection drug users (IDUs) in New York City (NY), and in Puerto Rico (PR). Study subjects were 637 IDUs from NY and 319 IDUs from PR. Mortality was ascertained using data from the National Death Index. Annual mortality rate of the NY cohort was 1.3 per 100 person years compared to the PR cohort with a rate of 4.8...

  20. Comparative promoter region analysis powered by CORG

    Directory of Open Access Journals (Sweden)

    Arndt Peter F

    2005-02-01

    Background: Promoters are key players in gene regulation. They receive signals from various sources (e.g. cell surface receptors) and control the level of transcription initiation, which largely determines gene expression. In vertebrates, transcription start sites and surrounding regulatory elements are often poorly defined. To support promoter analysis, we present CORG http://corg.molgen.mpg.de, a framework for studying upstream regions including untranslated exons (5' UTRs). Description: The automated annotation of promoter regions integrates two kinds of information. First, statistically significant cross-species conservation within upstream regions of orthologous genes is detected. Pairwise as well as multiple sequence comparisons are computed. Second, binding site descriptions (position-weight matrices) are employed to predict conserved regulatory elements with a novel approach. Assembled EST sequences and verified transcription start sites are incorporated to distinguish exonic from other sequences. As of now, we have included 5 species in our analysis pipeline (man, mouse, rat, fugu and zebrafish). We characterized promoter regions of 16,127 groups of orthologous genes. All data are presented in an intuitive way via our web site. Users are free to export data for single genes or access larger data sets via our DAS server http://tomcat.molgen.mpg.de:8080/das. The benefits of our framework are illustrated in the context of phylogenetic profiling of transcription factor binding sites and the detection of microRNAs close to the transcription start sites of our gene set. Conclusion: The CORG platform is a versatile tool to support analyses of gene regulation in vertebrate promoter regions. Applications of CORG cover a broad range, from studying the evolution of DNA binding sites and promoter constitution to the discovery of new regulatory sequence elements (e.g. microRNAs and binding sites).
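The position-weight-matrix scan at the heart of such binding-site prediction can be sketched briefly. The matrix below is made up for the example (real PWMs come from databases such as TRANSFAC or JASPAR), and scoring is simple log-odds against a uniform background rather than CORG's actual method:

```python
import math

# Hypothetical 3-column PWM: base probabilities at each motif position
PWM = [
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.1, "C": 0.7, "G": 0.1, "T": 0.1},
]
BACKGROUND = 0.25  # uniform background base frequency

def score(window):
    # log-odds of the window under the motif vs the background
    return sum(math.log2(col[base] / BACKGROUND)
               for col, base in zip(PWM, window))

def best_hit(seq):
    # slide the PWM along the sequence, return (best score, position)
    k = len(PWM)
    return max((score(seq[i:i + k]), i) for i in range(len(seq) - k + 1))

s, pos = best_hit("TTAGCAAT")  # best window is "AGC" at position 2
```

Cross-species conservation then acts as a filter: only hits that recur at aligned positions in orthologous upstream regions are reported.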

  1. Automatic generation of user material subroutines for biomechanical growth analysis.

    Science.gov (United States)

    Young, Jonathan M; Yao, Jiang; Ramasubramanian, Ashok; Taber, Larry A; Perucchio, Renato

    2010-10-01

    The analysis of the biomechanics of growth and remodeling in soft tissues requires the formulation of specialized pseudoelastic constitutive relations. The nonlinear finite element analysis package ABAQUS allows the user to implement such specialized material responses through the coding of a user material subroutine called UMAT. However, hand coding UMAT subroutines is a challenge even for simple pseudoelastic materials and requires substantial time to debug and test the code. To resolve this issue, we develop an automatic UMAT code generation procedure for pseudoelastic materials using the symbolic mathematics package MATHEMATICA and extend the UMAT generator to include continuum growth. The performance of the automatically coded UMAT is tested by simulating the stress-stretch response of a material defined by a Fung-orthotropic strain energy function, subject to uniaxial stretching, equibiaxial stretching, and simple shear in ABAQUS. The MATHEMATICA UMAT generator is then extended to include continuum growth by adding a growth subroutine to the automatically generated UMAT. The MATHEMATICA UMAT generator correctly derives the variables required in the UMAT code, quickly providing a ready-to-use UMAT. In turn, the UMAT accurately simulates the pseudoelastic response. In order to test the growth UMAT, we simulate the growth-based bending of a bilayered bar with differing fiber directions in a nongrowing passive layer. The anisotropic passive layer, being topologically tied to the growing isotropic layer, causes the bending bar to twist laterally. The results of simulations demonstrate the validity of the automatically coded UMAT, used in both standardized tests of hyperelastic materials and for a biomechanical growth analysis. PMID:20887023
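The core idea of the generator, deriving stress from a strain-energy function instead of hand-coding it, can be shown in one dimension. This sketch uses an incompressible neo-Hookean material under uniaxial stretch (not the paper's Fung-orthotropic model) and plain Python numeric differentiation standing in for MATHEMATICA's symbolic derivation; the modulus value is arbitrary.

```python
MU = 1.0  # shear modulus, arbitrary units

def W(lam):
    # strain energy per unit volume for incompressible neo-Hookean
    # uniaxial stretch: W(lambda) = mu/2 * (lambda^2 + 2/lambda - 3)
    return MU / 2 * (lam**2 + 2 / lam - 3)

def stress_hand_coded(lam):
    # the derivative dW/dlambda one would otherwise derive and code by hand
    return MU * (lam - 1 / lam**2)

def stress_auto_derived(lam, h=1e-6):
    # "generated" stress: differentiate the energy function mechanically,
    # here by central differences rather than symbolically
    return (W(lam + h) - W(lam - h)) / (2 * h)
```

Checking the generated response against the hand derivation, as done here, mirrors the paper's validation of the auto-coded UMAT against standardized stress-stretch tests.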

  2. Blog Content and User Engagement - An Insight Using Statistical Analysis.

    Directory of Open Access Journals (Sweden)

    Apoorva Vikrant Kulkarni

    2013-06-01

    Over the past few years, organizations have increasingly realized the value of social media in positioning, propagating and marketing their products, services and the organization itself. Today every organization, be it small or big, has realized the importance of creating a space on the World Wide Web. Social media, through its multifaceted platforms, has enabled organizations to propagate their brands. There are a number of social media networks which are helpful in spreading the message to customers. Many organizations have full-time web analytics teams that regularly try to ensure that prospective customers are visiting the organization through various forms of social media. Web analytics is seen by organizations as a tool for business intelligence, and a large number of analytics tools are available for monitoring the visibility of a particular brand on the web. For example, Google has its own analytics tool that is very widely used, and there are a number of free as well as paid analytical tools available on the internet. The objective of this paper is to study what content in a social media blog creates a greater impact on user engagement. The study statistically analyzes the relation between the content of the blog and user engagement. The statistical analysis was carried out on a blog of a reputed management institute in Pune to arrive at conclusions.
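The statistical question posed above, whether a content feature moves an engagement metric, reduces to a correlation test. A toy sketch with invented numbers (post word counts vs. comment counts, neither taken from the paper):

```python
# Hypothetical data: one point per blog post
word_counts = [300, 450, 500, 700, 820, 900, 1100, 1200]
comments    = [2,   4,   3,   6,   7,   6,   9,    11]

def pearson(xs, ys):
    # Pearson product-moment correlation coefficient
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(word_counts, comments)  # near +1 here: strong positive association
```

A real analysis would add a significance test and control for confounders (posting time, topic), but the correlation is the starting point.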

  3. User's guide for the REBUS-3 fuel cycle analysis capability

    International Nuclear Information System (INIS)

    REBUS-3 is a system of programs designed for the fuel-cycle analysis of fast reactors. This new capability is an extension and refinement of the REBUS-2 code system and complies with the standard code practices and interface dataset specifications of the Committee on Computer Code Coordination (CCCC). The new code is hence divorced from the earlier ARC System. In addition, the coding has been designed to enhance code exportability. Major new capabilities not available in the REBUS-2 code system include: a search on burn cycle time to achieve a specified value for the multiplication constant at the end of the burn step; a general non-repetitive fuel-management capability including temporary out-of-core fuel storage, loading of fresh fuel, and subsequent retrieval and reloading of fuel; significantly expanded user input checking; expanded output edits; provision of prestored burnup chains to simplify user input; a choice of fixed- or free-field BCD input formats; and a choice of finite difference, nodal or spatial flux-synthesis neutronics in one, two, or three dimensions.
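The burn-cycle-time search mentioned above is, in essence, a root find: adjust the cycle length until the end-of-cycle multiplication constant hits its target. The sketch below is not REBUS-3's algorithm; it uses a toy linear depletion model with invented numbers, and simple bisection, purely to illustrate the idea.

```python
# Toy model: k_eff declines linearly with burn time (values invented)
K_BOC, SLOPE = 1.04, 2.0e-4  # beginning-of-cycle k_eff; loss per day

def k_eoc(days):
    # end-of-cycle multiplication constant for a given cycle length
    return K_BOC - SLOPE * days

def search_cycle_length(k_target, lo=0.0, hi=1000.0, tol=1e-6):
    # bisection works because k_eoc decreases monotonically with cycle length
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if k_eoc(mid) > k_target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

days = search_cycle_length(1.0)  # cycle length at which k_eoc reaches 1.0
```

In the real code each trial cycle length requires a full depletion calculation rather than a one-line formula, but the outer search has this shape.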

  4. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Comparative analysis of the methodologies was performed based on the discussions in the international workshop on 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo, on March 2005. Through the workshop and succeeding considerations, it is clarified that the proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situations of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from perceived threat and implementation circumstances inherent to the larger programs. Deep recognitions of the 'difference' among communities would help us to make further essential and progressed discussion with harmonization. (author)

  5. Comparative Analysis of Hand Gesture Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Arpana K. Patel

    2015-03-01

    During the past few years, human hand gesture interaction with computing devices has continued to be an active area of research. In this paper a survey of hand gesture recognition is provided. Hand gesture recognition comprises three stages: pre-processing, feature extraction or matching, and classification or recognition. Each stage involves different methods and techniques. This paper gives a brief description of the different methods used for hand gesture recognition in existing systems, with a comparative analysis of each method's benefits and drawbacks.

  6. Privacy for Sale? : Analysis of Online User Privacy?

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Sørensen, Jannick Kirk

    Data brokers have become central players in the online collection of private user data. Data brokers' activities are, however, not very transparent or even known to users. Many users regard privacy as a central element when they use online services. Based on 12 short interviews with users, this paper analyses how users perceive the concept of online privacy with respect to data brokers' collection of private data, and particularly novel services that offer users the possibility to sell their private data. Two groups of users are identified: those who would consider selling their data under specific conditions, and those who reject the idea completely. Based on the literature we identify two positions on privacy: as an instrumental good, or as an intrinsic good. The paper positions various user perceptions of privacy that are relevant for future service development.

  7. Mobile Phone Usage for M-Learning: Comparing Heavy and Light Mobile Phone Users

    Science.gov (United States)

    Suki, Norbayah Mohd; Suki, Norazah Mohd

    2007-01-01

    Purpose: Mobile technologies offer the opportunity to embed learning in a natural environment. The objective of the study is to examine how the usage of mobile phones for m-learning differs between heavy and light mobile phone users. Heavy mobile phone users are hypothesized to have access to/subscribe to one type of mobile content than light…

  8. Comparative evaluation of user interfaces for robot-assisted laser phonomicrosurgery.

    Science.gov (United States)

    Dagnino, Giulio; Mattos, Leonardo S; Becattini, Gabriele; Dellepiane, Massimo; Caldwell, Darwin G

    2011-01-01

    This research investigates the impact of three different control devices and two visualization methods on the precision, safety and ergonomics of a new medical robotic system prototype for assistive laser phonomicrosurgery. This system allows the user to remotely control the surgical laser beam using either a flight simulator type joystick, a joypad, or a pen display system in order to improve the traditional surgical setup composed by a mechanical micromanipulator coupled with a surgical microscope. The experimental setup and protocol followed to obtain quantitative performance data from the control devices tested are fully described here. This includes sets of path following evaluation experiments conducted with ten subjects with different skills, for a total of 700 trials. The data analysis method and experimental results are also presented, demonstrating an average 45% error reduction when using the joypad and up to 60% error reduction when using the pen display system versus the standard phonomicrosurgery setup. These results demonstrate the new system can provide important improvements in terms of surgical precision, ergonomics and safety. In addition, the evaluation method presented here is shown to support an objective selection of control devices for this application. PMID:22256043

  9. siGnum: graphical user interface for EMG signal analysis.

    Science.gov (United States)

    Kaur, Manvinder; Mathur, Shilpi; Bhatia, Dinesh; Verma, Suresh

    2015-01-01

    Electromyography (EMG) signals, which represent the electrical activity of muscles, can be used for various clinical and biomedical applications. These are complicated and highly varying signals that depend on the anatomical location and physiological properties of the muscles. EMG signals acquired from the muscles require advanced methods for detection, decomposition and processing. This paper proposes siGnum, a novel graphical user interface (GUI) developed in MATLAB that applies efficient and effective techniques to process raw EMG signals and decompose them in a simpler manner. It can be used independently of the MATLAB software by employing a deploy tool. This should enable researchers to gain a good understanding of EMG signals and the analysis procedures that can be utilized for more powerful, flexible and efficient applications in the near future. PMID:25385355
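A typical first step in the kind of EMG processing described above is full-wave rectification followed by a moving-average envelope. The sketch below is only illustrative of that generic chain, not of siGnum's actual methods, and the sample values are invented.

```python
def rectify(signal):
    # full-wave rectification: muscle activity regardless of polarity
    return [abs(s) for s in signal]

def moving_average(signal, window):
    # causal moving average; early samples use a shorter window
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical raw EMG samples (arbitrary units)
raw = [0.1, -0.4, 0.9, -1.2, 0.8, -0.3, 0.2, -0.1]
envelope = moving_average(rectify(raw), window=3)
```

Real toolboxes add band-pass filtering, notch removal of mains interference, and decomposition into motor unit action potentials on top of such an envelope.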

  10. Methodological proposal for the analysis of user participation mechanisms in online media

    Directory of Open Access Journals (Sweden)

    Jaime Alonso, Ph.D.

    2012-01-01

    This paper presents the results of an analysis of user participation mechanisms, particularly those based on Web 2.0 technologies and applications, in a sample of fourteen relevant Spanish online media, including the websites of newspapers, radio stations, and television channels. This analysis was conducted in October and November 2010 as part of the research subproject La evolución de los cibermedios en el marco de la convergencia digital. Tecnología y distribución (The evolution of online media in the context of digital convergence. Technology and distribution). The study is based on a taxonomy of the different user participation mechanisms, which distinguishes between those that are integrated within the media's news sections and those that are independent spaces. The analysis also examines the way these mechanisms are managed by the media according to the role they are assigned. Finally, the study compares the different online media and shows examples and trends in the field of user participation.

  11. User-Defined Material Model for Progressive Failure Analysis

    Science.gov (United States)

    Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)

    2006-01-01

    An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis, including failure initiation and material degradation, are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach, where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
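The ply-discounting logic described above can be sketched in a few lines: test a failure criterion, then knock down the stiffnesses tied to each triggered mode. This is a generic illustration of the approach, not the paper's UMAT; the strengths, moduli, and degradation factor are invented, and only the maximum stress criterion is shown.

```python
# Hypothetical ply allowables and degradation factor (not from the paper)
FAILURE_STRESS = {"fiber": 1500.0, "matrix": 50.0}  # MPa
DEGRADATION = 0.01  # fraction of stiffness retained after failure

def check_max_stress(stresses):
    # maximum stress criterion: a mode fails when |stress| exceeds its allowable
    return {mode for mode, s in stresses.items()
            if abs(s) > FAILURE_STRESS[mode]}

def degrade(moduli, failed_modes):
    # ply discounting: discount the moduli associated with each failed mode
    return {mode: (E * DEGRADATION if mode in failed_modes else E)
            for mode, E in moduli.items()}

moduli = {"fiber": 140000.0, "matrix": 10000.0}  # MPa, hypothetical ply
failed = check_max_stress({"fiber": 900.0, "matrix": 75.0})
moduli = degrade(moduli, failed)  # only the matrix modulus is knocked down
```

Inside a real UMAT this check runs at every material point on every increment, and the degraded stiffness feeds back into the next equilibrium iteration.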

  12. User's manual for seismic analysis code 'SONATINA-2V'

    International Nuclear Information System (INIS)

    The seismic analysis code SONATINA-2V has been developed to analyze the behavior of the HTTR core graphite components under seismic excitation. SONATINA-2V is a two-dimensional computer program capable of analyzing the vertical arrangement of the HTTR graphite components, such as fuel blocks, replaceable reflector blocks and permanent reflector blocks, as well as their restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Moreover, the SONATINA-2V code is capable of analyzing core vibration behavior under simultaneous vertical and horizontal excitation. The code is composed of the main program, a pre-processor for preparing the input data to SONATINA-2V, and a post-processor for data processing and producing graphics from the analytical results. The SONATINA-2V code was originally developed for the MSP computer system of the Japan Atomic Energy Research Institute (JAERI), but that system was retired as computer technology progressed, so the code was updated to run on JAERI's UNIX-based SR8000 computer system. The user's manual for the seismic analysis code SONATINA-2V, including the pre- and post-processors, is given in the present report. (author)

  13. Automatic waveform analysis and measurement system user manual

    Science.gov (United States)

    Chesnut, S. M.; Paulter, N. G.

    1991-12-01

    The theory and operation of an upgraded version of the National Institute of Standards and Technology (NIST) Automatic Waveform Analysis and Measurement System (AWAMS) are described. This system, the AWAMS, was commissioned by the Army Primary Standards Laboratory to facilitate measurement comparability with NIST. The AWAMS was installed at the Redstone Arsenal, Alabama.

  14. Spanish Internet users and tourism: analysis of online tourist behaviour among experienced users.

    OpenAIRE

    Garrido Pintado, Pablo

    2013-01-01

    Internet and E-commerce continue to expand in today's information and communication society. This research focuses on the most relevant characteristics of experienced Spanish Internet users. More specifically, this document concentrates on Internet users and travellers who buy, or may in the future buy, services offered by travel agents or other operators in the tourism industry. The findings from this study were collected through questionnaires. Once collected, all the data ...

  16. Assessing Quality of Experience while comparing competing mobile broadband services from the user perspective

    Science.gov (United States)

    Madruga, Ewerton L.; David, Rodrigo; Sabóia de Souza, Rodolfo; Dantas, Romulo

    2015-01-01

    The growth of mobile traffic is exploding globally, and users can already choose their best smartphone or tablet from a handful of manufacturers based on specific criteria such as price and usability. The choice is much less clear when the user needs to pick among the mobile broadband service providers available. After all, how does one know which provider is best for a given usage profile? This work uses drive tests to investigate the variation of radio-frequency conditions and relate them to the quality of experience from the viewpoint of the user.

  17. Comparative Genome Analysis of Basidiomycete Fungi

    Energy Technology Data Exchange (ETDEWEB)

    Riley, Robert; Salamov, Asaf; Morin, Emmanuelle; Nagy, Laszlo; Manning, Gerard; Baker, Scott; Brown, Daren; Henrissat, Bernard; Levasseur, Anthony; Hibbett, David; Martin, Francis; Grigoriev, Igor

    2012-03-19

    Fungi of the phylum Basidiomycota (basidiomycetes) make up some 37% of the described fungi, and are important in forestry, agriculture, medicine, and bioenergy. This diverse phylum includes the mushrooms, wood rots, symbionts, and plant and animal pathogens. To better understand the diversity of phenotypes in basidiomycetes, we performed a comparative analysis of 35 basidiomycete fungi spanning the diversity of the phylum. Phylogenetic patterns of lignocellulose-degrading genes suggest a continuum rather than a sharp dichotomy between the white rot and brown rot modes of wood decay. Patterns of secondary metabolic enzymes give additional insight into the broad array of phenotypes found in the basidiomycetes. We suggest that an organism's profile of lignocellulose-targeting genes can be used to predict its nutritional mode, and predict Dacryopinax sp. to be a brown rot, and Botryobasidium botryosum and Jaapia argillacea to be white rots.

  18. Comparative Analysis on Visual Cryptographic Schemes?

    Directory of Open Access Journals (Sweden)

    T. Anuradha

    2014-09-01

    Visual cryptography is a technique for providing security to multimedia data. The main concept is to encrypt a secret image into a number of shares; the secret can be revealed only when all the shares are combined. The central theme of visual cryptography is that it does not require any manipulation or deep cryptographic knowledge, and decryption is done by human vision without the help of computers. Visual cryptography is thus known for its low computational complexity while remaining secure. In this work, we compared traditional visual cryptography, extended visual cryptography and colour extended visual cryptography with respect to Peak Signal to Noise Ratio (PSNR), Normalized Correlation Coefficient (NCC) and Mean Square Error (MSE). On analysis, it is found that the performance of colour extended visual cryptography is much better than that of traditional visual cryptography and extended visual cryptography in terms of PSNR, NCC and MSE.
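    The three comparison metrics above reduce to short computations. A minimal sketch with NumPy, using synthetic arrays in place of share-reconstructed images (the peak value of 255 assumes 8-bit data):

```python
import numpy as np

def mse(a, b):
    """Mean Square Error between two images (lower is better)."""
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB (higher is better)."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

def ncc(a, b):
    """Normalized Correlation Coefficient between two images (1.0 = identical up to scale/offset)."""
    a = a.astype(float).ravel() - a.mean()
    b = b.astype(float).ravel() - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Synthetic "secret" image and a noisy reconstruction of it
rng = np.random.default_rng(0)
secret = rng.integers(0, 256, (64, 64))
recon = np.clip(secret + rng.normal(0.0, 5.0, secret.shape), 0, 255)
print(f"MSE={mse(secret, recon):.2f}  PSNR={psnr(secret, recon):.2f} dB  NCC={ncc(secret, recon):.4f}")
```

A scheme with higher PSNR/NCC and lower MSE reconstructs the secret more faithfully, which is how the three schemes are ranked in the paper.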

  19. Construction QA/QC systems: comparative analysis

    International Nuclear Information System (INIS)

    An analysis which compares the quality assurance/quality control (QA/QC) systems adopted in the highway, nuclear power plant, and U.S. Navy construction areas with the traditional quality control approach used in building construction is presented. Full participation and support by the owner as well as the contractor and AE firm are required if a QA/QC system is to succeed. Process quality control, acceptance testing and quality assurance responsibilities must be clearly defined in the contract documents. The owner must audit these responsibilities. A contractor quality control plan, indicating the tasks which will be performed and the fact that QA/QC personnel are independent of project time/cost pressures should be submitted for approval. The architect must develop realistic specifications which consider the natural variability of material. Acceptance criteria based on the random sampling technique should be used. 27 refs

  20. Compare containment subcompartment analysis code evaluation

    International Nuclear Information System (INIS)

    Nuclear power plant subcompartment analyses are required to determine the containment pressure distribution that might result from a loss-of-coolant accident. The pressure distribution is used to calculate structural and mechanical design loads. The COMPARE code is used widely to perform subcompartment analysis. However, several simplifying assumptions are utilized to facilitate solution of the complex transient, two-phase, multidimensional flow problem. In particular, it is assumed that the flow is homogeneous, in thermodynamic equilibrium, and one-dimensional. In this study, these assumptions are evaluated by performing simplified transport and relaxation analyses. This results in definition of (a) geometric features and early-time periods that produce significant deviations from reality and (b) specific areas that require further study

  1. Comparing user acceptance of a computer system in two pediatric offices: a qualitative study.

    OpenAIRE

    Travers, D. A.; Downs, S. M.

    2000-01-01

    The purpose of this qualitative study was to examine user acceptance of a clinical computer system in two pediatric practices in the southeast. Data were gathered through interviews with practice and IS staff, observations in the clinical area, and review of system implementation records. Five months after implementation, Practice A continued to use the system but Practice B had quit using it because it was unacceptable to the users. The results are presented here, in relation to a conceptual...

  2. Comparative Analysis of Competitive Strategy Implementation

    Directory of Open Access Journals (Sweden)

    Maina A. S. Waweru

    2011-09-01

    This paper presents research findings on competitive strategy implementation which compared the levels of strategy implementation achieved by different generic strategy groups, comprising firms inclined towards low cost leadership, differentiation or dual strategic advantage. The study sought to determine the preferences for use of implementation armaments and compared how such armaments related to the level of implementation achieved. Respondents comprised 71 top executives from 59 companies among the top 300 private sector firms in Kenya. SPSS software was used to conduct t-tests, ANOVA, and multiple linear regression analysis, at the 95% confidence level (5% statistical significance). The results indicated that there was no significant difference between the levels of strategy implementation achieved by any pair of the three strategic groups. The study revealed that the predictors of strategy implementation include the firm's capacity to overcome resistance to change, having incentives based on meeting strictly quantitative targets, adopting a win-lose competitive posture, its effectiveness in strategy implementation, and the environmental rate of change. The results also indicated that there was no significant difference between the preferences for use of either win-lose or win-win competition by any pair of the strategic groups.
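    The study's statistical toolchain (group comparison by t-test, prediction by multiple linear regression) can be sketched in a few lines of NumPy. The scores, group sizes and predictor effects below are illustrative stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical implementation scores for two generic strategy groups
low_cost = rng.normal(3.8, 0.6, 25)
diff_grp = rng.normal(3.9, 0.6, 28)

# Welch's t-statistic for the difference in group means;
# |t| below ~2 is consistent with "no significant difference" at the 5% level
se = np.sqrt(low_cost.var(ddof=1) / 25 + diff_grp.var(ddof=1) / 28)
t_stat = (low_cost.mean() - diff_grp.mean()) / se
print(f"t = {t_stat:.3f}")

# Multiple linear regression of implementation level on three predictors
# (e.g. capacity to overcome resistance, incentives, environmental rate of change)
X = rng.normal(size=(53, 3))
y = X @ np.array([0.5, 0.3, 0.2]) + rng.normal(0.0, 0.1, 53)
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(53), X]), y, rcond=None)
print("intercept + coefficients:", np.round(coef, 2))
```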

  3. Performance and security analysis of Gait-based user authentication

    OpenAIRE

    Gafurov, Davrondzhon

    2008-01-01

    Verifying the identity of a user, usually referred to as user authentication, before granting access to services or objects is a very important step in many applications. People pass through some sort of authentication process in their daily lives. For example, to gain access to a computer the user is required to know a password. Similarly, to be able to activate a mobile phone the owner has to know its PIN code, etc. Some user authentication techniques are based on human physio...

  4. Internal versus external preference analysis : an exploratory study on end-user evaluation

    OpenAIRE

    Kleef, E.; Trijp, H. C. M.; Luning, P. A.

    2006-01-01

    Internal and external preference analysis emphasise fundamentally different perspectives on the same data. We extend the literature on comparisons between internal and external preference analysis by incorporating the perspective of the end user of the preference analysis results. From a conceptual analysis of the methodological similarities and differences between these two techniques, we develop and implement a framework for end-user evaluation of preference analysis output in terms of perc...

  5. Sentiment Detection of Web Users Using Probabilistic Latent Semantic Analysis

    Directory of Open Access Journals (Sweden)

    Weijian Ren

    2014-10-01

    With the wide application of the Internet in almost all fields, it has become the most important medium for information publication, providing a large number of channels for spreading public opinion. Public opinions, as the response of Internet users to information such as social events and government policies, reflect the status of both society and the economy, which is highly valuable for the decision-making and public relations of enterprises. At present, analysis methods for Internet public opinion are mainly based on discriminative approaches, such as Support Vector Machines (SVM) and neural networks. However, when these approaches analyze the sentiment of Internet public opinion, they fail to exploit information hidden in the text, e.g. topic. Motivated by this observation, this paper proposes a detection method for public sentiment based on the Probabilistic Latent Semantic Analysis (PLSA) model. PLSA inherits the advantages of LSA, exploiting the semantic topics hidden in the data. The procedure for detecting public sentiment using this algorithm is composed of three main steps: (1) Chinese word segmentation and word refinement, with which each document is represented by a bag of words; (2) modeling the probabilistic distribution of documents using PLSA; (3) using the Z-vector of PLSA as the document features and delivering them to an SVM for sentiment detection. We collect a set of text data from Weibo, blogs, BBS etc. to evaluate the proposed approach. The experimental results show that the proposed method can detect public sentiment with high accuracy, outperforming state-of-the-art approaches such as the word-histogram-based approach. The results also suggest that text semantic analysis using PLSA can significantly boost sentiment detection.

  6. Comparative kinetic analysis of two fungal β-glucosidases

    Directory of Open Access Journals (Sweden)

    Casanave Dominique

    2010-02-01

    Background: The enzymatic hydrolysis of cellulose is still considered one of the main limiting steps of the biological production of biofuels from lignocellulosic biomass. It is a complex multistep process, and various kinetic models have been proposed. The cellulase enzymatic cocktail secreted by Trichoderma reesei has been intensively investigated. β-glucosidases are one of a number of cellulolytic enzymes, and catalyze the last step, releasing glucose from the inhibitory cellobiose. β-glucosidase (BGL1) is very poorly secreted by Trichoderma reesei strains, and complete hydrolysis of cellulose often requires supplementation with a commercial β-glucosidase preparation such as that from Aspergillus niger (Novozymes SP188). Surprisingly, kinetic modeling of β-glucosidases lacks reliable data, and the possible differences between native T. reesei and supplemented β-glucosidases are not taken into consideration, possibly because of the difficulty of purifying BGL1. Results: A comparative kinetic analysis of β-glucosidase from Aspergillus niger and BGL1 from Trichoderma reesei, purified using a new and efficient fast protein liquid chromatography protocol, was performed. This purification is characterized by two major steps, including the adsorption of the major cellulases onto crystalline cellulose, and a final purification factor of 53. Quantitative analysis of the resulting β-glucosidase fraction from T. reesei showed it to be 95% pure. Kinetic parameters were determined using cellobiose and a chromogenic artificial substrate. A new method allowing easy and rapid determination of the kinetic parameters was also developed. β-glucosidase SP188 (Km = 0.57 mM; Kp = 2.70 mM) has a lower specific activity than BGL1 (Km = 0.38 mM; Kp = 3.25 mM) and is also more sensitive to glucose inhibition. A Michaelis-Menten model integrating competitive inhibition by the product (glucose) has been validated and is able to predict the β-glucosidase activity of both enzymes. Conclusions: This article provides a useful comparison between the activity of β-glucosidases from two different fungi, and shows the importance of fully characterizing both enzymes. A Michaelis-Menten model was developed, including glucose inhibition and kinetic parameters, which were accurately determined and compared. This model can be further integrated into a cellulose hydrolysis model dissociating β-glucosidase activity from that of other cellulases. It can also help to define the optimal enzymatic cocktails for new β-glucosidase activities.
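    A minimal sketch of a Michaelis-Menten model with competitive product inhibition, using the Km and Kp values quoted in the abstract; Vmax is normalized to 1 and the substrate/glucose concentrations are chosen purely for illustration:

```python
def rate(s, p, vmax, km, kp):
    """Michaelis-Menten rate with competitive inhibition by product P:
    v = Vmax * S / (Km * (1 + P/Kp) + S)."""
    return vmax * s / (km * (1.0 + p / kp) + s)

# Km and Kp (mM) from the abstract; Vmax normalized to 1 for comparison
enzymes = {"SP188 (A. niger)": (0.57, 2.70), "BGL1 (T. reesei)": (0.38, 3.25)}
for name, (km, kp) in enzymes.items():
    v0 = rate(5.0, 0.0, 1.0, km, kp)        # 5 mM cellobiose, no glucose
    v_g = rate(5.0, 10.0, 1.0, km, kp)      # same, with 10 mM glucose present
    print(f"{name}: v={v0:.3f}, with glucose v={v_g:.3f}, activity loss={1 - v_g / v0:.0%}")
```

Because Kp for SP188 is smaller, the same glucose concentration causes a larger relative activity loss, matching the abstract's statement that SP188 is more sensitive to glucose inhibition.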

  7. SALOME. Software for the analysis of lines or multiplets from Extrap. User's guide

    International Nuclear Information System (INIS)

    This user's guide describes the centrepiece of the spectral analysis programs for Extrap-T1. The method for spectral analysis is presented theoretically. It also presents the actual use of the program PROBESCHUSS and how to work with the multiplet library. The present user's guide covers PROBESCHUSS 2.1 and MULTIFIT 2.0. 7 figs, 5 appendices

  8. Analysis of distributed energy resources for domestic electricity users

    Scientific Electronic Library Online (English)

    Atanda, Raji; Mohamed Tariq, Kahn.

    Full Text Available After over a century of exploiting the benefits of economies of scale, power system planning and development has grown ever larger, and transmission grids have had to carry power over ever wider areas, bringing back the concept of on-site or close-to-load generation. The turnaround strategy is prompted by market liberalization, transmission expansion constraints, related technology advancements, environmental pollution, health hazards, fossil fuel depletion, and climate change concerns. In the last decade, many countries have started the process of liberalisation of their electric systems, opening access to transmission and distribution grids. A technical feasibility analysis of a hybrid energy system for two types of geographical regions in South Africa is performed in this paper using Homer. Wind-PV hybrid systems are modelled as a micro-power system using Homer. The simulation results, for a typical middle-income electricity load profile of both coastal and inland domestic users, showed that the Wind-PV hybrid system is technically feasible and economical.

  9. Model for Analysis of Energy Demand (MAED-2). User's manual

    International Nuclear Information System (INIS)

    The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institute Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then the MEDEE model has been developed and adapted for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries, renaming it the MAED model. The first version of the MAED model was designed for DOS, and was later converted to Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility in representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application.

  10. Comparative genomic analysis of prion genes

    Directory of Open Access Journals (Sweden)

    Gamulin Vera

    2007-01-01

    Background: The homologues of human disease genes are expected to contribute to a better understanding of physiological and pathogenic processes. We made use of the present availability of vertebrate genomic sequences, and we have conducted the most comprehensive comparative genomic analysis so far of the prion protein gene PRNP and its homologues, the shadow of prion protein gene SPRN and the doppel gene PRND, and the prion testis-specific gene PRNT. Results: While the SPRN and PRNP homologues are present in all vertebrates, PRND is known in tetrapods, and PRNT is present in primates. PRNT could be viewed as a TE-associated gene. Using human as the base sequence for genomic sequence comparisons (VISTA), we annotated numerous potential cis-elements. The conserved regions in SPRNs harbour potential Sp1 sites in promoters (mammals, birds), C-rich intron splicing enhancers and PTB intron splicing silencers in introns (mammals, birds), and hsa-miR-34a sites in 3'-UTRs (eutherians). We showed conserved PRNP upstream regions which may be potential enhancers or silencers (primates, dog). In the PRNP 3'-UTRs, there are conserved cytoplasmic polyadenylation element sites (mammals, birds). The PRND core promoters include highly conserved CCAAT, CArG and TATA boxes (mammals). We deduced 42 new protein primary structures, and performed the first phylogenetic analysis of all vertebrate prion genes. Using a protein alignment which included 122 sequences, we constructed a neighbour-joining tree which showed four major clusters: shadoos, shadoo2s and prion protein-likes (cluster 1), fish prion proteins (cluster 2), tetrapod prion proteins (cluster 3) and doppels (cluster 4). We showed that the entire prion protein conformationally plastic region is well conserved between eutherian prion proteins and shadoos (18–25% identity and 28–34% similarity), and there could be a potential structural compatibility between shadoos and the left-handed parallel beta-helical fold. Conclusion: It is likely that the conserved genomic elements identified in this analysis represent bona fide cis-elements. However, this idea needs to be confirmed by functional assays in transgenic systems.

  11. Analysis of Users Web Browsing Behavior Using Markov chain Model

    Directory of Open Access Journals (Sweden)

    Diwakar Shukla

    2011-03-01

    In the present era of growing information technology, many browsers are available for surfing and web mining. A user has the option to use any of them at a time to reach the desired website. Every browser has a pre-defined level of popularity and reputation in the market. This paper considers a setup with only two browsers on a computer system: the user prefers one of them and, if it fails, switches to the other. The user's behavior is modeled through a Markov chain procedure and transition probabilities are calculated. Quitting browsing is treated as a parameter of variation over popularity. A graphical study is performed to explain the interrelationship between user behavior parameters and browser market popularity parameters. If a company's browser has the lowest failure rate and the lowest quitting probability, then that company enjoys better popularity and a larger user proportion.
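    A minimal sketch of such a two-browser Markov chain; the failure and quitting probabilities below are hypothetical, not values from the paper:

```python
import numpy as np

# States: 0 = browsing with A, 1 = browsing with B, 2 = quit (absorbing)
fA, fB = 0.2, 0.3   # hypothetical per-session failure probabilities
q = 0.1             # probability of quitting (rather than switching) after a failure
P = np.array([
    [1 - fA, fA * (1 - q), fA * q],   # A succeeds, or fails -> switch to B / quit
    [fB * (1 - q), 1 - fB, fB * q],   # B succeeds, or fails -> switch to A / quit
    [0.0, 0.0, 1.0],                   # quit is absorbing
])
assert np.allclose(P.sum(axis=1), 1.0)   # each row is a probability distribution

# Distribution over states after 20 sessions, starting on browser A
state = np.array([1.0, 0.0, 0.0])
for _ in range(20):
    state = state @ P
print(state.round(3))  # [share on A, share on B, share quit]
```

Lowering a browser's failure probability both keeps more probability mass on that browser and slows the drift into the absorbing quit state, which is the popularity effect the paper studies graphically.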

  12. Microblogging User Feature Analysis based on Boolean Classification Operations

    Directory of Open Access Journals (Sweden)

    Bing Li

    2014-01-01

    With the advance of many social network applications, social group feature analytics are attracting a lot of attention. Meanwhile, microblogging, as a kind of social network application, attracts more and more users. As microblogging reaches bigger and broader crowds, surveying user features at scale becomes an important aspect of exploiting crowd-sourced data. To better understand microblogging user group features, this study proposes a user classification approach based on Boolean operations, with which different microblogging user group features can easily be found. In the experiment, an exploratory survey over a large amount of microblogging data illustrates how to analyze the features of different user groups.
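    The Boolean classification operations reduce to set algebra over user groups. A minimal sketch with hypothetical users and features (the group names and members are made up for illustration):

```python
# Hypothetical microblogging user groups, one per Boolean feature
all_users  = {"u1", "u2", "u3", "u5", "u7", "u9"}
active     = {"u1", "u2", "u3", "u5"}   # posted in the last week
verified   = {"u2", "u3", "u7"}          # verified accounts
daily_post = {"u3", "u5", "u7", "u9"}   # post at least daily

core     = active & verified & daily_post   # AND: users with all three features
reach    = active | verified                # OR: users with either feature
inactive = all_users - daily_post           # NOT: complement within the universe

print(sorted(core))      # ['u3']
print(sorted(reach))     # ['u1', 'u2', 'u3', 'u5', 'u7']
print(sorted(inactive))  # ['u1', 'u2']
```

Any compound classification (e.g. "verified AND daily, but NOT active") composes from the same three operators, which is what makes the approach easy to apply to large crowds.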

  13. Comparative analysis of safety related site characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan (ed.)

    2010-12-15

    This document presents a comparative analysis of site characteristics related to long-term safety for the two candidate sites for a final repository for spent nuclear fuel in Forsmark (municipality of Oesthammar) and in Laxemar (municipality of Oskarshamn) from the point of view of site selection. The analyses are based on the updated site descriptions of Forsmark /SKB 2008a/ and Laxemar /SKB 2009a/, together with associated updated repository layouts and designs /SKB 2008b and SKB 2009b/. The basis for the comparison is thus two equally and thoroughly assessed sites. However, the analyses presented here are focussed on differences between the sites rather than evaluating them in absolute terms. The document serves as a basis for the site selection, from the perspective of long-term safety, in SKB's application for a final repository. A full evaluation of safety is made for a repository at the selected site in the safety assessment SR-Site /SKB 2011/, referred to as the SR-Site main report in the following.

  15. Structural Analysis of User Association Patterns in Wireless LAN

    CERN Document Server

    Hsu, W; Helmy, A; Hsu, Wei-jen; Dutta, Debojyoti; Helmy, Ahmed

    2006-01-01

    Due to the rapid growth in wireless local area networks (WLANs), it has become important to characterize the fine-grained structure of user association patterns. In this paper, we focus on unraveling the structure of users' daily association patterns in WLANs over the long run. The daily association pattern is defined by the fraction of time a user spends at each location. We answer three questions: 1) Do users demonstrate consistent behavior? Using our novel metrics and clustering, we conclude that many users (more than 50%) are multi-modal. 2) Is it possible to represent user association patterns compactly? Using eigen-decomposition, we show that the intrinsic dimensionality of the constructed user association matrices is low: the top five eigenvalues and their corresponding eigenvectors suffice to reconstruct those association matrices with an error of 5%, in terms of the L1 and L2 matrix norms. 3) How can we decide if two users have similar association patterns? We define t...
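    The low-rank reconstruction argument can be illustrated with an SVD in NumPy. The synthetic association matrix below (rows = days, columns = locations, with a few behavioral "modes") is a stand-in for the WLAN traces, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic user-association matrix: each row is one day's fractions of
# time spent at 6 locations, drawn from 3 underlying behavioral modes
modes = rng.dirichlet(alpha=[5, 3, 1, 1, 0.5, 0.5], size=3)
A = modes[rng.integers(0, 3, size=100)] + rng.normal(0.0, 0.01, (100, 6))

# Reconstruct from only the top k singular values/vectors
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 5
Ak = (U[:, :k] * s[:k]) @ Vt[:k]

l1_err = np.abs(A - Ak).sum() / np.abs(A).sum()
l2_err = np.linalg.norm(A - Ak) / np.linalg.norm(A)   # Frobenius norm ratio
print(f"relative L1 error: {l1_err:.4f}, relative L2 error: {l2_err:.4f}")
```

Because the matrix has only a few underlying modes, the rank-5 reconstruction error is tiny, mirroring the paper's finding that five eigen-components reconstruct real association matrices to within 5%.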

  16. Comparative analysis of planetary laser ranging concepts

    Science.gov (United States)

    Dirkx, D.; Bauer, S.; Noomen, R.; Vermeersen, B. L. A.; Visser, P. N.

    2014-12-01

    Laser ranging is an emerging technology for tracking interplanetary missions, offering improved range accuracy and precision (mm-cm) compared to existing DSN tracking. The ground segment uses existing Satellite Laser Ranging (SLR) technology, whereas the space segment is modified with an active system. In a one-way system, such as that currently being used on the LRO spacecraft (Zuber et al., 2010), only an active detector is required on the spacecraft. For a two-way system, such as that tested using the laser altimeter system on the MESSENGER spacecraft en route to Mercury (Smith et al., 2006), a laser transmitter is additionally placed on the space segment, which asynchronously fires laser pulses towards the ground stations. Although the one-way system requires less hardware, clock errors on both the space and ground segments accumulate over time, polluting the range measurements. For a two-way system, the range measurements are only sensitive to clock errors integrated over the two-way light time. We investigate the performance of both one- and two-way laser range systems by simulating their operation. We generate realizations of clock error time histories from Allan variance profiles, and use them to create range measurement error profiles. We subsequently perform the orbit determination process on these data to quantify the system's performance. For our simulations, we use two test cases: a lunar orbiter similar to LRO and a Phobos lander similar to the Phobos Laser Ranging concept (Turyshev et al., 2010). For the lunar orbiter, we include an empirical model for unmodelled non-gravitational accelerations in our truth model to include errors in the dynamics. We include the estimation of clock parameters over a number of arc lengths for our simulations of the one-way range system, and use a variety of state arc durations for the lunar orbiter simulations. We perform Monte Carlo simulations and generate true error distributions for both missions for various combinations of clock and state arc length. Thereby, we quantify the relative capabilities of the one- and two-way laser range systems. In addition, we study the optimal data analysis strategies for these missions, which we apply to LRO orbit determination. Finally, we compare the performance of the laser ranging systems with typical DSN tracking.
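    The contrast between one- and two-way clock error accumulation can be sketched as follows. The random-walk clock model, noise level, cadence and light time are illustrative assumptions (not a faithful realization of a specific Allan variance profile):

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0                 # s, measurement cadence
n = 86400                # one day of measurements
c = 299792458.0          # m/s

# Hypothetical fractional-frequency noise integrated into a clock time error
freq_noise = rng.normal(0.0, 1e-13, n)     # dimensionless fractional frequency
clock_err = np.cumsum(freq_noise) * dt     # seconds; random-walk accumulation

# One-way range error grows with the accumulated clock error...
one_way_range_err = c * clock_err
# ...while a two-way range only sees clock drift over the two-way light time
light_time = 1.3                            # s, roughly Earth-Moon two-way scale
two_way_range_err = c * np.abs(freq_noise) * light_time

print(f"one-way error after 1 day: {abs(one_way_range_err[-1]) * 1e3:.3f} mm")
print(f"typical two-way error:     {two_way_range_err.mean() * 1e3:.6f} mm")
```

Even with an identical clock, the one-way error wanders to millimetre-centimetre level over a day while the two-way error stays orders of magnitude smaller, which is why the paper estimates clock parameters per arc for the one-way system.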

  17. Comparative analysis of pharmacophore screening tools.

    Science.gov (United States)

    Sanders, Marijn P A; Barbosa, Arménio J M; Zarzycka, Barbara; Nicolaes, Gerry A F; Klomp, Jan P G; de Vlieg, Jacob; Del Rio, Alberto

    2012-06-25

    The pharmacophore concept is of central importance in computer-aided drug design (CADD), mainly because of its successful application in medicinal chemistry and, in particular, high-throughput virtual screening (HTVS). The simplicity of the pharmacophore definition enables the complexity of molecular interactions between ligand and receptor to be reduced to a handful of features. With many pharmacophore screening software packages available, it is of the utmost interest to explore the behavior of these tools when applied to different biological systems. In this work, we present a comparative analysis of eight pharmacophore screening algorithms (Catalyst, Unity, LigandScout, Phase, Pharao, MOE, Pharmer, and POT) for their use in typical HTVS campaigns against four different biological targets using default settings. The results presented here show how the performance of each pharmacophore screening tool can be related to factors such as the characteristics of the binding pocket, the use of specific pharmacophore features, and the use of these techniques in specific steps/contexts of the drug discovery pipeline. Algorithms with rmsd-based scoring functions are able to predict more compound poses correctly than overlay-based scoring functions. However, the ratio of correctly to incorrectly predicted compound poses is better for overlay-based scoring functions, which also ensure better performance in compound library enrichment. While the ensemble of these observations can be used to choose the most appropriate class of algorithm for specific virtual screening projects, we remarked that pharmacophore algorithms are often equally good, and in this respect we also analyzed how pharmacophore algorithms can be combined in order to increase the success of hit compound identification. This study provides a valuable benchmark set for further developments in the field of pharmacophore search algorithms, e.g., by using pose predictions and compound library enrichment criteria. PMID:22646988
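    The rmsd criterion used to judge pose predictions reduces to a short computation once atom correspondences are fixed. A minimal sketch (the coordinates are made up, and no superposition/alignment is performed before measuring):

```python
import numpy as np

def pose_rmsd(a, b):
    """Root-mean-square deviation between two poses given as (N, 3)
    arrays of matched atom coordinates, in the same frame."""
    return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

ref = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.5, 0.0]])
pred = ref + np.array([0.5, 0.0, 0.0])   # pose rigidly shifted 0.5 along x
print(pose_rmsd(ref, pred))  # 0.5
```

In benchmarking studies of this kind a predicted pose is commonly counted as "correct" when its rmsd to the crystallographic pose falls below a fixed cutoff (often around 2 Å); the exact threshold is a convention, not something fixed by the rmsd formula itself.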

  18. AUDITOR ROTATION - A CRITICAL AND COMPARATIVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Mocanu Mihaela

    2011-12-01

    The present paper starts out from the challenge regarding auditor tenure launched in 2010 by the Green Paper of the European Commission, Audit Policy: Lessons from the Crisis. According to this document, the European Commission speaks both in favor of the mandatory rotation of the audit firm and in favor of the mandatory rotation of audit partners. Rotation is considered a solution to mitigate threats to independence generated by familiarity, intimidation and self-interest in the context of a long-term audit-client relationship. At the international level, there are several studies on auditor rotation, both empirical (e.g. Lu and Sivaramakrishnan, 2009; Li, 2010; Kaplan and Mauldin, 2008; Jackson et al., 2008) and normative in nature (e.g. Marten et al., 2007; Muller, 2006; Gelter, 2004). The objective of the present paper is to perform a critical and comparative analysis of the regulations on internal and external rotation in force at the international level, in the European Union and in the United States of America. Moreover, arguments both in favor of and against mandatory rotation are brought into discussion. With regard to the research design, the paper has a normative approach. The main findings are, first, that all regulatory authorities require internal rotation at least in the case of public interest entities, while external rotation is not in the focus of the regulators. In general, the strictest and most detailed requirements are those issued by the Securities and Exchange Commission of the United States of America. Second, mandatory rotation is supported by the fact that the auditor becomes more resilient in case of divergence of opinions between him and company management, less stimulated to follow his own interest, and more scrupulous in conducting the audit. However, mandatory rotation may also have negative consequences, thus the debate on the opportunity of this regulatory measure remains open-ended.

  19. A comparative analysis of the statistical properties of large mobile phone calling networks

    CERN Document Server

    Li, Ming-Xia; Xie, Wen-Jie; Miccichè, Salvatore; Tumminello, Michele; Zhou, Wei-Xing; Mantegna, Rosario N

    2014-01-01

    Mobile phone calling is one of the most widely used communication methods in modern society. The records of calls among mobile phone users provide a valuable proxy for understanding the human communication patterns embedded in social networks. Mobile phone users call each other, forming a directed calling network; if only reciprocal calls are considered, we obtain an undirected mutual calling network. The preferential communication behavior between two connected users can be statistically tested, yielding two Bonferroni networks with statistically validated edges. We perform a comparative analysis of the statistical properties of four networks, which are constructed from the calling records of more than nine million individuals in Shanghai over a period of 110 days. We find that these networks share many common structural properties and also exhibit idiosyncratic features when compared with previously studied large mobile calling networks. The empirical findings provide us an intriguing picture o...
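    The networks described above start from one primitive step: a directed edge per observed caller-callee pair, and an undirected mutual edge only where calls went both ways. A minimal sketch of that construction (the call records and user IDs are invented, and the Bonferroni validation step is omitted):

    ```python
    def build_networks(call_records):
        """call_records: iterable of (caller, callee) pairs."""
        directed = set(call_records)
        # An undirected mutual edge exists only if both directions occur.
        mutual = {frozenset((a, b)) for (a, b) in directed if (b, a) in directed}
        return directed, mutual

    calls = [("u1", "u2"), ("u2", "u1"), ("u1", "u3"), ("u3", "u4"), ("u4", "u3")]
    directed, mutual = build_networks(calls)
    print(len(directed))  # 5 directed edges
    print(sorted(sorted(e) for e in mutual))  # [['u1', 'u2'], ['u3', 'u4']]
    ```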

  20. Interactive user's application to Genie 2000 spectroscopy system for automation of hair neutron activation analysis

    International Nuclear Information System (INIS)

    Full text: In recent years, lower plants such as mosses or lichens and, in arid countries, the bark and leaves of trees have been used as biomonitors in environmental studies. Alongside plants, the trace-element composition of human hair has also been used as an indicator of pollution of natural and industrial environments. Because of its convenience, easy access, non-destructive sampling, and preservation of information over long time periods, human hair is used ever more often and widely in various studies. In the Institute of Nuclear Physics of the Academy of Sciences of the Republic of Uzbekistan, hair trace-element analysis has been used in environmental monitoring and mapping and in health status studies. The activation analysis laboratory always has a large amount of routine work on the analysis of biological objects, so its applied nuclear techniques are regularly improved. At present, one such well-developed technique is multielement instrumental neutron activation analysis (INAA) of hair using the single-comparator standard method. Since the Radioanalytical Center (RAC) was created in October 2004 within the framework of the STCU project 'Enhanced nuclear techniques for materials identification', its unique equipment has been used for the analysis of objects such as metals and alloys, minerals and ores, hydrogeological samples, technological products, soils, fertilizers, biological samples, foodstuffs, water, sediments, construction materials, as well as materials of unknown composition. For example, human hair analysis has been performed with a high-resolution gamma spectrometer based on an HPGe detector from Canberra Industries, Inc. The Genie 2000 Spectroscopy System for Canberra spectrometers represents the state of the art in spectroscopy software platforms. Genie 2000 provides a comprehensive set of capabilities for acquiring and analyzing spectra from multichannel analyzers (MCAs).
Its functions include MCA control, spectral display and manipulation, basic spectrum analysis and reporting. Genie 2000 software is available in several variations and with several layered optional packages. The Genie 2000 Basic Spectroscopy and Gamma Analysis Software available in the RAC allows us to automatically obtain a nuclide identification report with all needed parameters. However, no Genie 2000 application can calculate the concentrations of the analyzed elements. To automate this step of INAA within the Canberra Genie 2000 Spectroscopy System, we developed the user's 'Human hair analysis Application' software for the single-comparator standard method of hair INAA. Work with the developed Application for Genie 2000 begins with a menu containing four items: 1. Copying of the data. 2. Data input. 3. Viewing, editing and analyzing of the data. 4. EXIT. The item 'Copying of the data' copies the entered values of special user parameters from one data source into another. This is very user-friendly: it is enough for the user to enter the values of the necessary parameters (nuclide names, γ-line values, conversion factors for various irradiation and cooling times) once in one data source; the 'Copying of the data' procedure can then transfer them to any other data source. The item 'Data input' is carried out with the help of the Graphical Batch Tools function GBTPARS and a specially developed set of Form Design Specification (FDS) files for this function. The developed Application works interactively as a dialogue system with the user and calculates the required nuclide concentrations in the analyzed samples, separately for long-lived, middle-lived and short-lived nuclides. Using the Nuclide Library Editor and the comprehensive standard libraries of the Genie package, we created three custom libraries, Stdlib.HairL, Stdlib.HairM and Stdlib.HairS, for long-, middle- and short-lived nuclides respectively.
After processing of each data source, the Application returns the user to the menu, from which the user can continue data processing by choosing the next data source, or leave the application through the menu item EXIT. Th

  1. Business intelligence gap analysis: a user, supplier and academic perspective

    OpenAIRE

    Molensky, L.; Ketter, W.; Collins, J.; Bloemhof, J.M.; Koppel, H

    2010-01-01

    Business intelligence (BI) takes many different forms, as indicated by the varying definitions of BI that can be found in industry and academia. These different definitions help us understand which BI issues are important to the main players in the field of BI: users, suppliers and academics. The goal of this research is to discover gaps and trends from the standpoints of BI users, BI suppliers and academics, and to examine their effects on business and academia. Consultants also play an im...

  2. Microblogging User Feature Analysis based on Boolean Classification Operations

    OpenAIRE

    Bing Li; Bingjie Sun; Xuan Wang; Xintong Huang; Xiaoyu Xiu

    2014-01-01

    Due to the advance of many social network applications, social group feature analytics is attracting a lot of attention. In the meantime, microblogging, as a kind of social network application, attracts more and more people to use it. As microblogging reaches bigger and broader crowds, surveying massive user features will be an important aspect of exploiting crowd-sourced data. For a better understanding of microblogging user group features, ...

  3. Comparing Respondent-Driven Sampling and Targeted Sampling Methods of Recruiting Injection Drug Users in San Francisco

    OpenAIRE

    Kral, Alex H.; Malekinejad, Mohsen; Vaudrey, Jason; Martinez, Alexis N.; Lorvick, Jennifer; Mcfarland, Willi; Raymond, H. Fisher

    2010-01-01

    The objective of this article is to compare demographic characteristics, risk behaviors, and service utilization among injection drug users (IDUs) recruited from two separate studies in San Francisco in 2005, one which used targeted sampling (TS) and the other which used respondent-driven sampling (RDS). IDUs were recruited using TS (n = 651) and RDS (n = 534) and participated in quantitative interviews that included demographic characteristics, risk behaviors, and service utilization...

  4. Weighted Centroid Algorithm for Estimating Primary User Location: Theoretical Analysis and Distributed Implementation

    CERN Document Server

    Wang, Jun; Han, Yuxing; Čabrić, Danijela

    2010-01-01

    Information about primary user (PU) location is crucial in enabling several key capabilities in dynamic spectrum access networks, including improved spatio-temporal sensing, intelligent location-aware routing, as well as aiding spectrum policy enforcement. Compared to other proposed non-interactive localization algorithms, the weighted centroid localization (WCL) scheme uses only received signal strength information, which makes it simple and robust to variations in the propagation environment. In contrast to prior work, which focused mainly on proposing algorithmic variations and verifying their performance through simulations, in this paper we present the first theoretical framework for WCL performance analysis in terms of its localization error distribution parameterized by node density, node placement, shadowing variance and correlation distance. Using this analysis, we quantify the robustness of WCL to various physical conditions and provide guidelines, such as node placement, for practical deployment of...
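    The weighted centroid estimate at the core of WCL is simple to sketch: the primary-user position is taken as the RSS-weighted average of the sensing nodes' known coordinates. The node positions and RSS values below are invented for illustration, and a real deployment would use RSS in linear (not dB) scale as the weight:

    ```python
    def wcl_estimate(nodes):
        """nodes: list of ((x, y), rss) with rss in linear scale."""
        total = sum(rss for _, rss in nodes)
        x = sum(px * rss for (px, _), rss in nodes) / total
        y = sum(py * rss for (_, py), rss in nodes) / total
        return x, y

    # Four equidistant nodes with equal RSS: the estimate falls at the center.
    nodes = [((0, 0), 1.0), ((10, 0), 1.0), ((0, 10), 1.0), ((10, 10), 1.0)]
    print(wcl_estimate(nodes))  # (5.0, 5.0) by symmetry
    ```

    The paper's analysis concerns how shadowing variance, node density and placement perturb exactly this weighted average.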

  5. Sentiment Analysis of User-Generated Content on Drug Review Websites

    Directory of Open Access Journals (Sweden)

    Na, Jin-Cheon

    2015-03-01

    Full Text Available This study develops an effective method for sentiment analysis of user-generated content on drug review websites, which has not been investigated extensively compared to other general domains, such as product reviews. A clause-level sentiment analysis algorithm is developed since each sentence can contain multiple clauses discussing multiple aspects of a drug. The method adopts a pure linguistic approach of computing the sentiment orientation (positive, negative, or neutral) of a clause from the prior sentiment scores assigned to words, taking into consideration the grammatical relations and semantic annotation (such as disorder terms) of words in the clause. Experimental results with 2,700 clauses show the effectiveness of the proposed approach, which performed significantly better than baseline approaches using machine learning. Various challenging issues were identified and discussed through error analysis. The application of the proposed sentiment analysis approach will be useful not only for patients, but also for drug makers and clinicians to obtain valuable summaries of public opinion. Since sentiment analysis is domain specific, domain knowledge in drug reviews is incorporated into the sentiment analysis algorithm to provide more accurate analysis. In particular, MetaMap is used to map various health and medical terms (such as disease and drug names) to semantic types in the Unified Medical Language System (UMLS) Semantic Network.
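    A toy sketch of the lexicon-based idea described above: sum prior word sentiment scores within each clause and map the total to an orientation. The lexicon and the comma-based clause splitting here are invented stand-ins for the paper's actual machinery (grammatical relations, UMLS-based annotation):

    ```python
    # Hypothetical prior-sentiment lexicon for drug reviews.
    LEXICON = {"effective": 1, "relief": 1, "nausea": -1, "worse": -1}

    def clause_sentiment(clause):
        # Sum the prior scores of the words in the clause.
        score = sum(LEXICON.get(w.strip(".,").lower(), 0) for w in clause.split())
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"

    sentence = "the drug was effective, but it gave me nausea"
    # Naive clause split on commas; the paper uses real syntactic parsing.
    print([clause_sentiment(c) for c in sentence.split(",")])  # ['positive', 'negative']
    ```

    The point of clause-level analysis is visible even in this toy: a single sentence-level score would blur the opposite opinions about efficacy and side effects.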

  6. Transportation Routing Analysis Geographic Information System (TRAGIS) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, PE

    2003-09-18

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model is used to calculate highway, rail, or waterway routes within the United States. TRAGIS is a client-server application with the user interface and map data files residing on the user's personal computer and the routing engine and network data files on a network server. The user's manual provides documentation on installation and the use of the many features of the model.

  7. Premo and Kansei: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Anitawati Mohd Lokman

    2013-04-01

    Full Text Available Kansei Engineering is a technology that enables the incorporation of human emotion into design requirements. Its perspective is that Kansei is unique to each domain and each target user group, and its methodology relies mainly on verbal measurement instruments. The technology has a shortcoming when there is a need to build a universal design for universal target users: handling semantics becomes complex, since people across the planet do not speak the same language. Hence, a non-verbal emotion measurement tool is assumed to enhance the capability of Kansei Engineering in managing universal Kansei. This study aims to investigate the possibility of integrating into Kansei Engineering PrEmo, a non-verbal self-reporting tool developed on the basis of studies across cultures and demographic settings. The objectives are to analyze the similarities and differences of the Kansei structures produced by two different measurement tools, a non-verbal (PrEmo) and a verbal (Kansei checklist) self-reporting instrument, and to provide hypothetical evidence of the feasibility of PrEmo as a tool to measure Kansei. Ten websites with significant visual design differences were used as stimuli in an evaluation procedure involving 30 respondents, who provided their Kansei responses using both instruments. The results show that the Kansei structures from both instruments are mostly similar, providing hypothetical evidence that PrEmo could be used as a non-verbal self-reporting instrument to measure Kansei. The findings provide insights for future research on the integration of universal Kansei.

  8. Human Capital Development: Comparative Analysis of BRICs

    Science.gov (United States)

    Ardichvili, Alexandre; Zavyalova, Elena; Minina, Vera

    2012-01-01

    Purpose: The goal of this article is to conduct macro-level analysis of human capital (HC) development strategies, pursued by four countries commonly referred to as BRICs (Brazil, Russia, India, and China). Design/methodology/approach: This analysis is based on comparisons of macro indices of human capital and innovativeness of the economy and a…

  9. Comparing Generalized Procrustes Analysis and STATIS

    OpenAIRE

    Meyners, M.; Kunert, Joachim; Qannari, El Mostafa

    1998-01-01

    We consider a model for sensory profiling data including translation, rotation and scaling. We compare two methods to calculate an overall consensus from several data matrices: GPA and STATIS. These methods are briefly illustrated and explained under our model. A series of simulations to compare their performance has been carried out. We found significant differences in performance depending on the variance of random errors and on the dimensionality of the true underlying consensus. Therefore...

  10. Comparative analysis of Indonesian and Korean governance

    OpenAIRE

    Hwang, Yunwon

    2011-01-01

    This paper overviews governance issues in Indonesia and Korea from a comparative perspective. To do so, the WGI (World Governance Index) developed by the World Bank is employed for a more objective and consistent comparison between the two countries. WGI consists of six dimensions of voice and accountability, political stability and absence of violence, government effectiveness, regulatory quality, rule of law, and control of corruption. The two countries are analyzed and compared by ea...

  11. Comparative Analysis of Competitive Strategy Implementation

    OpenAIRE

    Waweru, Maina A. S.

    2011-01-01

    This paper presents research findings on Competitive Strategy Implementation which compared the levels of strategy implementation achieved by different generic strategy groups, comprising firms inclined towards low cost leadership, differentiation or dual strategic advantage.  The study sought to determine the preferences for use of implementation armaments and compared how such armaments related to the level of implementation achieved.   Respondents comprised 71 top executives from 59 com...

  12. Chronic illness and multimorbidity among problem drug users: a comparative cross sectional pilot study in primary care.

    LENUS (Irish Health Repository)

    Cullen, Walter

    2012-02-01

    BACKGROUND: Although multimorbidity has important implications for patient care in general practice, limited research has examined chronic illness and health service utilisation among problem drug users. This study aimed to determine chronic illness prevalence and health service utilisation among problem drug users attending primary care for methadone treatment, to compare these rates with matched 'controls' and to develop and pilot test a valid study instrument. METHODS: A cross-sectional study of patients attending three large urban general practices in Dublin, Ireland for methadone treatment was conducted, and this sample was compared with a control group matched by practice, age, gender and General Medical Services (GMS) status. RESULTS: Data were collected on 114 patients. Fifty-seven patients were on methadone treatment, of whom 52 (91%) had at least one chronic illness (other than substance use) and 39 (68%) were prescribed at least one regular medication. Frequent utilisation of primary care services and secondary care services in the previous six months was observed among patients on methadone treatment and controls, although the former had significantly higher chronic illness prevalence and primary care contact rates. The study instrument facilitated data collection that was feasible and with minimal inter-observer variation. CONCLUSION: Multimorbidity is common among problem drug users attending general practice for methadone treatment. Primary care may therefore have an important role in primary and secondary prevention of chronic illnesses among this population. This study offers a feasible study instrument for further work on this issue. (238 words).

  13. Chronic illness and multimorbidity among problem drug users: a comparative cross sectional pilot study in primary care.

    LENUS (Irish Health Repository)

    Cullen, Walter

    2009-01-01

    BACKGROUND: Although multimorbidity has important implications for patient care in general practice, limited research has examined chronic illness and health service utilisation among problem drug users. This study aimed to determine chronic illness prevalence and health service utilisation among problem drug users attending primary care for methadone treatment, to compare these rates with matched 'controls' and to develop and pilot test a valid study instrument. METHODS: A cross-sectional study of patients attending three large urban general practices in Dublin, Ireland for methadone treatment was conducted, and this sample was compared with a control group matched by practice, age, gender and General Medical Services (GMS) status. RESULTS: Data were collected on 114 patients. Fifty-seven patients were on methadone treatment, of whom 52 (91%) had at least one chronic illness (other than substance use) and 39 (68%) were prescribed at least one regular medication. Frequent utilisation of primary care services and secondary care services in the previous six months was observed among patients on methadone treatment and controls, although the former had significantly higher chronic illness prevalence and primary care contact rates. The study instrument facilitated data collection that was feasible and with minimal inter-observer variation. CONCLUSION: Multimorbidity is common among problem drug users attending general practice for methadone treatment. Primary care may therefore have an important role in primary and secondary prevention of chronic illnesses among this population. This study offers a feasible study instrument for further work on this issue. (238 words).

  14. DOA Estimation-a Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Ayesha Naaz

    2014-03-01

    Full Text Available In this paper, the direction of arrival (DOA) angle estimation of signals impinging on a 3-D array of sensors in a cubical arrangement is studied. The results thus obtained were compared with the directions of arrival obtained with a combination of two uniform square arrays placed in parallel to form a cube. The MUSIC (Multiple Signal Classification) algorithm was used to estimate the directions of arrival of the signals. The cubical array geometry was also tested at low signal-to-noise ratio, and the results were compared with those of the two parallel square arrays. Experimental results demonstrate that the cubical geometry has better detection capability than two 2-D square arrays at the same or even a higher SNR.
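    The MUSIC principle used above can be sketched in one dimension: eigendecompose the sample covariance, keep the noise subspace, and scan steering vectors for angles where that subspace is nearly orthogonal to them. This is a minimal uniform-linear-array sketch, not the paper's cubical geometry; the array size, spacing, source angle and SNR are all invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    M, d, n_snap = 8, 0.5, 200          # sensors, spacing (wavelengths), snapshots
    true_deg = 20.0                     # single narrowband source

    def steering(theta_deg):
        theta = np.deg2rad(theta_deg)
        return np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))

    # Simulated snapshots: one source plus white noise.
    s = (rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)) / np.sqrt(2)
    noise = 0.1 * (rng.standard_normal((M, n_snap)) + 1j * rng.standard_normal((M, n_snap)))
    X = np.outer(steering(true_deg), s) + noise

    R = X @ X.conj().T / n_snap         # sample covariance
    w, V = np.linalg.eigh(R)            # eigenvalues ascending
    En = V[:, :-1]                      # noise subspace (one source assumed)

    # MUSIC pseudospectrum: peaks where steering vector is orthogonal to En.
    grid = np.arange(-90, 90.5, 0.5)
    p = [1 / np.linalg.norm(En.conj().T @ steering(g)) ** 2 for g in grid]
    est = grid[int(np.argmax(p))]
    print(est)  # close to 20.0
    ```

    The cubical and parallel-square geometries of the paper change only the steering vector (element positions become 3-D), not the subspace machinery.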

  15. User's guide for the frequency domain algorithms in the LIFE2 fatigue analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Sutherland, H.J. [Sandia National Labs., Albuquerque, NM (United States); Linker, R.L. [New Mexico Engineering Research Inst., Albuquerque, NM (United States)

    1993-10-01

    The LIFE2 computer code is a fatigue/fracture analysis code that is specialized to the analysis of wind turbine components. The numerical formulation of the code uses a series of cycle count matrices to describe the cyclic stress states imposed upon the turbine. However, many structural analysis techniques yield frequency-domain stress spectra and a large body of experimental loads (stress) data is reported in the frequency domain. To permit the analysis of this class of data, a Fourier analysis is used to transform a frequency-domain spectrum to an equivalent time series suitable for rainflow counting by other modules in the code. This paper describes the algorithms incorporated into the code and their numerical implementation. Example problems are used to illustrate typical inputs and outputs.
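    The transformation described above (frequency-domain spectrum to an equivalent time series for rainflow counting) can be sketched as an inverse FFT of the spectral amplitudes with randomized phases. The spectrum values below are invented, and LIFE2's actual algorithm may differ in normalization and binning:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1024
    psd = np.zeros(n // 2 + 1)
    psd[10] = 4.0                     # one dominant stress peak in one frequency bin

    amp = np.sqrt(psd)                # amplitude per one-sided frequency bin
    phase = rng.uniform(0, 2 * np.pi, amp.size)
    spectrum = amp * np.exp(1j * phase)

    # Equivalent time series; the n/2 factor compensates irfft's 1/n scaling
    # and the one-sided representation, so bin amplitude ~2 appears as a
    # stress cycle of amplitude ~2 in the series.
    series = np.fft.irfft(spectrum, n=n) * n / 2
    print(series.shape)  # (1024,)
    ```

    The resulting series is then suitable for rainflow counting by the other modules of the code.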

  16. Software Graphical User Interface For Analysis Of Images

    Science.gov (United States)

    Leonard, Desiree M.; Nolf, Scott R.; Avis, Elizabeth L.; Stacy, Kathryn

    1992-01-01

    CAMTOOL software provides graphical interface between Sun Microsystems workstation and Eikonix Model 1412 digitizing camera system. Camera scans and digitizes images, halftones, reflectives, transmissives, rigid or flexible flat material, or three-dimensional objects. Users digitize images and select from three destinations: work-station display screen, magnetic-tape drive, or hard disk. Written in C.

  17. Wellness Model of Supervision: A Comparative Analysis

    Science.gov (United States)

    Lenz, A. Stephen; Sangganjanavanich, Varunee Faii; Balkin, Richard S.; Oliver, Marvarene; Smith, Robert L.

    2012-01-01

    This quasi-experimental study compared the effectiveness of the Wellness Model of Supervision (WELMS; Lenz & Smith, 2010) with alternative supervision models for developing wellness constructs, total personal wellness, and helping skills among counselors-in-training. Participants were 32 master's-level counseling students completing their…

  18. Chipster: user-friendly analysis software for microarray and other high-throughput data

    Directory of Open Access Journals (Sweden)

    Scheinin Ilari

    2011-10-01

    Full Text Available Abstract Background The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Results Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Conclusions Chipster is a user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.

  19. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of the hazard distribution modules. The fundamentals of hazard issues are discussed using failure criteria. We demonstrate the flexibility of the hazard modeling distribution, which approximates different distributions.

  20. COMPARATIVE ANALYSIS OF VAT IN FOREIGN COUNTRIES

    Directory of Open Access Journals (Sweden)

    ?.?. ??????

    2011-06-01

    Full Text Available The article is devoted to the study of the implementation of VAT in various countries around the world. The author analyses the peculiarities of VAT in different taxation models and studies modern trends in the application of VAT. Based on the analysis carried out, recommendations on possible improvements to the application of VAT in Ukraine are given.

  1. Comparative Lifecycle Energy Analysis: Theory and Practice.

    Science.gov (United States)

    Morris, Jeffrey; Canzoneri, Diana

    1992-01-01

    Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

  2. Loss Given Default Modelling: Comparative Analysis

    OpenAIRE

    Yashkir, Olga; Yashkir, Yuriy

    2013-01-01

    In this study we investigated several of the most popular Loss Given Default (LGD) models (LSM, Tobit, Three-Tiered Tobit, Beta Regression, Inflated Beta Regression, Censored Gamma Regression) in order to compare their performance. We show that for a given input data set, the quality of the model calibration depends mainly on the proper choice (and availability) of explanatory variables (model factors), but not on the fitting model. Model factors were chosen based on the amplitude of their correlati...

  3. ST-analyzer: a web-based user interface for simulation trajectory analysis.

    Science.gov (United States)

    Jeong, Jong Cheol; Jo, Sunhwan; Wu, Emilia L; Qi, Yifei; Monje-Galvan, Viviana; Yeom, Min Sun; Gorenstein, Lev; Chen, Feng; Klauda, Jeffery B; Im, Wonpil

    2014-05-01

    Molecular dynamics (MD) simulation has become one of the key tools to obtain deeper insights into biological systems using various levels of descriptions such as all-atom, united-atom, and coarse-grained models. Recent advances in computing resources and MD programs have significantly accelerated the simulation time and thus increased the amount of trajectory data. Although many laboratories routinely perform MD simulations, analyzing MD trajectories is still time consuming and often a difficult task. ST-analyzer, http://im.bioinformatics.ku.edu/st-analyzer, is a standalone graphical user interface (GUI) toolset to perform various trajectory analyses. ST-analyzer has several outstanding features compared to other existing analysis tools: (i) handling various formats of trajectory files from MD programs, such as CHARMM, NAMD, GROMACS, and Amber, (ii) intuitive web-based GUI environment--minimizing administrative load and reducing burdens on the user from adapting new software environments, (iii) platform independent design--working with any existing operating system, (iv) easy integration into job queuing systems--providing options of batch processing either on the cluster or in an interactive mode, and (v) providing independence between foreground GUI and background modules--making it easier to add personal modules or to recycle/integrate pre-existing scripts utilizing other analysis tools. The current ST-analyzer contains nine main analysis modules that together contain 18 options, including density profile, lipid deuterium order parameters, surface area per lipid, and membrane hydrophobic thickness. This article introduces ST-analyzer with its design, implementation, and features, and also illustrates practical analysis of lipid bilayer simulations. PMID:24638223

  4. Comparing methods for Twitter Sentiment Analysis

    OpenAIRE

    Psomakelis, Evangelos; Tserpes, Konstantinos; Anagnostopoulos, Dimosthenis; Varvarigou, Theodora

    2015-01-01

    This work extends the set of works which deal with the popular problem of sentiment analysis in Twitter. It investigates the most popular document ("tweet") representation methods which feed sentiment evaluation mechanisms. In particular, we study the bag-of-words, n-grams and n-gram graphs approaches and for each of them we evaluate the performance of a lexicon-based and 7 learning-based classification algorithms (namely SVM, Naïve Bayesian Networks, Logistic Regression, ...
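    The first two representations compared above are simple enough to illustrate directly; the tokenization here is deliberately simplified (lowercase whitespace split), whereas real tweet pipelines handle hashtags, mentions and punctuation:

    ```python
    def bag_of_words(text):
        """Unordered token counts."""
        counts = {}
        for tok in text.lower().split():
            counts[tok] = counts.get(tok, 0) + 1
        return counts

    def word_ngrams(text, n=2):
        """Overlapping sequences of n consecutive tokens."""
        toks = text.lower().split()
        return [tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)]

    tweet = "great phone great battery"
    print(bag_of_words(tweet))  # {'great': 2, 'phone': 1, 'battery': 1}
    print(word_ngrams(tweet))   # [('great', 'phone'), ('phone', 'great'), ('great', 'battery')]
    ```

    Bag-of-words discards word order entirely; n-grams retain local order, which is what the paper's comparison against n-gram graphs probes.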

  5. Dashboard Task Monitor for managing ATLAS user analysis on the Grid

    CERN Document Server

    Sargsyan, L; The ATLAS collaboration; Jha, M; Karavakis, E; Kokoszkiewicz, L; Saiz, P; Schovancova, J; Tuckett, D

    2013-01-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  6. Dashboard Task Monitor for managing ATLAS user analysis on the Grid

    CERN Document Server

    Sargsyan, L; Jha, M; Karavakis, E; Kokoszkiewicz, L; Saiz, P; Schovancova, J; Tuckett, D

    2013-01-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  7. Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid

    Science.gov (United States)

    Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration

    2014-06-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  8. Comparative analysis of some search engines

    OpenAIRE

    Edosomwan, Taiwo O.; Joseph Edosomwan

    2010-01-01

    We compared the information retrieval performances of some popular search engines (namely, Google, Yahoo, AlltheWeb, Gigablast, Zworks, AltaVista and Bing/MSN) in response to a list of ten queries, varying in complexity. These queries were run on each search engine, and the precision and response time of the retrieved results were recorded. The first ten documents on each retrieval output were evaluated as being ‘relevant’ or ‘non-relevant’ for evaluation of the search engine’s pr...

  9. Comparative Analysis of Color Video Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Rajeev Sunakara

    2013-10-01

    Full Text Available Contrast enhancement plays an important role in image processing applications. This paper presents a color enhancement algorithm based on an adaptive filter technique. The proposed method is divided into three major parts: obtaining the luminance image and backdrop image, adaptive modification, and color restoration. Unlike traditional color image enhancement algorithms, the adaptive filter in this algorithm takes color information into consideration. The algorithm shows the significance of color information in color image enhancement and utilizes color space conversion to obtain much better visibility. In practical results, the proposed method produces better enhancement and reduces halo distortion compared with bilateral methods.

  10. Nigerian Criminal Networks; A comparative analysis.

    OpenAIRE

    Alkholt, Aimar

    2010-01-01

    Why is an African federation home to one of the most dominant criminal networks operating globally? Nigeria is not well known for its high level of Internet infrastructure. Still, it is in a class of its own when it comes to e-fraud or 419 spam mails. It is also prominent within the drug trade and the African-European trafficking network. By comparatively analysing other forms of Organized Crime against the Nigerian Brand, the thesis has tried to find the particulars of Nigerian Crimina...

  11. Radionuclides in sediments - a comparative analysis, 1981

    International Nuclear Information System (INIS)

    On behalf of the BMI (Federal German Ministry of the Interior), the BfG in 1981 again started an interlaboratory comparison (among 42 measuring points) on the topic of radionuclides in sediments. The study was intended to test the reliability of G?, G?/R? measurements in sedimentary samples under practical conditions. The comparative analyses again revealed a number of error sources, and errors could be corrected. This was achieved not least by good cooperation among the participating laboratories, who contributed, among other things, very useful information and ideas. (orig./HP)

  12. Research on Analysis of Hindi language Graphical user Interface

    OpenAIRE

    Ms. Nikita Bhati

    2014-01-01

    The interface between humans and computers is an increasingly critical issue due to the growing complexity of computerized systems and the wide variety of problems they solve. Controlled natural languages might prove a promising medium between humans and computers; however, they are not easy to design, and humans need time to adapt to them. The authors propose to solve these issues by using a controlled user interface which is powered by an automatically constructed application-oriented...

  13. Risk analysis of SIP monitoring and control system user interface

    OpenAIRE

    Markevičius, V.; Navikas, D.; Jonynas, V.; Dubauskienė, N.

    2008-01-01

    Properties of the syringe infusion pump (SIP) control system user interface that allow reducing patient risk during infusion are analyzed, along with the main causes of SIP security problems for the patient. A method for evaluating patient risk variation according to statistical SIP reliability and the human-error factor of personnel during work is presented. The reliability of infusion for one patient using n single syringe infusion pumps and of infusion for one patient using SIPCS with n sy...

  14. Analysis of User Requirements in Interactive 3D Video Systems

    OpenAIRE

    Haiyue Yuan; Janko Ćalić; Ahmet Kondoz

    2012-01-01

    The recent development of three-dimensional (3D) display technologies has resulted in a proliferation of 3D video production and broadcasting, attracting a lot of research into capture, compression and delivery of stereoscopic content. However, the predominant design practice of interactions with 3D video content has failed to address its differences and possibilities in comparison to existing 2D video interactions. This paper presents a study of user requirements related to interaction w...

  15. Analysis and Improvement of a User Authentication Improved Protocol

    Directory of Open Access Journals (Sweden)

    Zuowen Tan

    2010-05-01

    Full Text Available Remote user authentication commonly adopts password-based login to a server within insecure network environments. Recently, Peyravian and Jeffries proposed a practical authentication scheme based on one-way collision-resistant hash functions. However, Shim and Munilla independently showed that the scheme is vulnerable to off-line guessing attacks. To remove this weakness, Hölbl, Welzer and Brumen presented improved secure password-based protocols for remote user authentication, password change and session key establishment. Unfortunately, the remedies of their improved scheme do not work: it still suffers from off-line attacks, and the password change protocol is insecure against Denial-of-Service attacks. A new scheme is presented which overcomes these weaknesses. Detailed cryptanalysis shows that the proposed password-based protocols for remote user authentication, password change and session key establishment are immune to man-in-the-middle attacks, replay attacks, password guessing attacks, outsider attacks, Denial-of-Service attacks and impersonation attacks.
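As a generic illustration only, the sketch below shows why the challenge-response pattern discussed in such schemes resists replay attacks: the server issues a fresh random nonce per login, so a captured response is useless later. This is not the Peyravian-Jeffries or Hölbl et al. protocol; every function name and parameter here is hypothetical.

```python
import hashlib
import hmac
import os

# Server-side registration: store a salted password verifier, never the
# plaintext password (hypothetical helper, for illustration only).
def register(password: str):
    salt = os.urandom(16)
    verifier = hashlib.sha256(salt + password.encode()).digest()
    return salt, verifier

# Client proves knowledge of the password by keying an HMAC over the
# server's fresh nonce; the password itself never crosses the wire.
def make_response(password: str, salt: bytes, nonce: bytes) -> bytes:
    verifier = hashlib.sha256(salt + password.encode()).digest()
    return hmac.new(verifier, nonce, hashlib.sha256).digest()

# Server-side check; compare_digest avoids timing side channels.
def verify(verifier: bytes, nonce: bytes, response: bytes) -> bool:
    expected = hmac.new(verifier, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Note that even this pattern remains open to off-line guessing if the salted verifier leaks, which is exactly the class of weakness the abstract's cryptanalysis targets.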

  16. Comparative analysis of life insurance market

    Directory of Open Access Journals (Sweden)

    Malynych, Anna Mykolayivna

    2011-05-01

    Full Text Available The article deals with a comprehensive analysis of statistical insight into the development of the world and regional life insurance markets on the basis of macroeconomic indicators. The author locates the domestic life insurance market on the global scale, analyzes its development and suggests a method to calculate a marketing life insurance index. The method is tested on a database of 77 countries all over the world. The author also defines a national rating on the basis of the marketing life insurance index.

  17. A comparative analysis of Cannabis material.

    Science.gov (United States)

    Coutts, R T; Jones, G R

    1979-04-01

    Extracts of 100 plant-like or resinous materials were analyzed for CBD, CBC, delta 9-THC, and CBN by GC using two different column packings and by GC-MS. Our independent identification of these cannabinoids confirmed those of other forensic science analysts who used microscopic examination, the Duquenois-Levine color test, and TLC for their analyses of the same samples. The identifications of cannabinoids by forensic science analysts using TLC were corroborated by GC-MS analysis of hexane extracts of appropriate chromatogram spots. PMID:541611

  18. The Constant Comparative Method of Qualitative Analysis

    OpenAIRE

    Barney G Glaser, Ph D.

    2008-01-01

    Currently, the general approaches to the analysis of qualitative data are these: 1.) If the analyst wishes to convert qualitative data into crudely quantifiable form so that he can provisionally test a hypothesis, he codes the data first and then analyzes it. He makes an effort to code “all relevant data [that] can be brought to bear on a point,” and then systematically assembles, assesses and analyzes these data in a fashion that will “constitute proof for a given proposition.” 2.) If...

  19. Comparable Analysis of Folk Home Textile of Lithuania and Poland

    Directory of Open Access Journals (Sweden)

    Eglė Kumpikaitė

    2014-10-01

    Full Text Available The region of East Prussia has been investigated very little in Lithuania and in Poland. The aim of this investigation is to carry out a complex analysis and to compare folk home textiles of East Prussia woven in the territories of Lithuania and Poland, presenting technological recommendations for manufacturing reconstructions. In this work 56 folk home textile pieces (27 bedspreads and 29 tablecloths) from 5 Lithuanian museums and 2 Polish museums were investigated. The number of pieces, patterns, weaves, colours, number of heald shafts, etc., and their chronological tendencies were established during the investigation. According to selected samples of home textile from the East Prussia region, reconstructions of home textile fabrics were designed and woven: 24 units (123.77 m), 12 of them on an industrial weaving loom and 12 on a hand weaving loom.

  20. Comparing structural decomposition analysis and index

    International Nuclear Information System (INIS)

    To analyze and understand historical changes in economic, environmental, employment or other socio-economic indicators, it is useful to assess the driving forces or determinants that underlie these changes. Two techniques for decomposing indicator changes at the sector level are structural decomposition analysis (SDA) and index decomposition analysis (IDA). For example, SDA and IDA have been used to analyze changes in indicators such as energy use, CO2 emissions, labor demand and value added. The changes in these variables are decomposed into determinants such as technological, demand, and structural effects. SDA uses information from input-output tables, while IDA uses aggregate data at the sector level. The two methods have developed quite independently, which has resulted in each method being characterized by specific, unique techniques and approaches. This paper has three aims. First, the similarities and differences between the two approaches are summarized. Second, the possibility of transferring specific techniques and indices is explored. Finally, a numerical example is used to illustrate differences between the two approaches.
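As a concrete illustration of the IDA side of this comparison, the sketch below implements an additive LMDI (log-mean Divisia) decomposition, one standard IDA technique, not necessarily the index used in the paper. Sector energy use is modeled as E_i = Q · S_i · I_i (total activity, sector share, energy intensity), and the change in total E splits exactly into activity, structure and intensity effects. The two-sector data in the test are hypothetical.

```python
from math import log

def logmean(a: float, b: float) -> float:
    """Logarithmic mean L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a."""
    return a if a == b else (a - b) / (log(a) - log(b))

def lmdi(Q0, S0, I0, Q1, S1, I1):
    """Additive LMDI-I decomposition of the change in total energy use
    E = sum_i Q * S_i * I_i between period 0 and period 1.
    Returns (activity effect, structure effect, intensity effect, dE)."""
    E0 = [Q0 * s * i for s, i in zip(S0, I0)]   # sector energy, period 0
    E1 = [Q1 * s * i for s, i in zip(S1, I1)]   # sector energy, period 1
    w = [logmean(e1, e0) for e0, e1 in zip(E0, E1)]   # log-mean weights
    act = sum(wi * log(Q1 / Q0) for wi in w)
    struct = sum(wi * log(s1 / s0) for wi, s0, s1 in zip(w, S0, S1))
    inten = sum(wi * log(i1 / i0) for wi, i0, i1 in zip(w, I0, I1))
    return act, struct, inten, sum(E1) - sum(E0)
```

A useful property of LMDI, visible in the identity sum_i L(E1_i, E0_i) · ln(E1_i/E0_i) = E1_i − E0_i, is that the three effects add up to the total change with no residual term.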

  1. The Constant Comparative Method of Qualitative Analysis

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, Ph.D.

    2008-11-01

    Full Text Available Currently, the general approaches to the analysis of qualitative data are these: 1. If the analyst wishes to convert qualitative data into crudely quantifiable form so that he can provisionally test a hypothesis, he codes the data first and then analyzes it. He makes an effort to code “all relevant data [that] can be brought to bear on a point,” and then systematically assembles, assesses and analyzes these data in a fashion that will “constitute proof for a given proposition.” 2. If the analyst wishes only to generate theoretical ideas (new categories and their properties, hypotheses and interrelated hypotheses), he cannot be confined to the practice of coding first and then analyzing the data since, in generating theory, he is constantly redesigning and reintegrating his theoretical notions as he reviews his material. Coding serves his purpose, but the explicit coding itself often seems an unnecessary, burdensome task. As a result, the analyst merely inspects his data for new properties of his theoretical categories, and writes memos on these properties. We wish to suggest a third approach

  2. Comparative analysis of some search engines

    Scientific Electronic Library Online (English)

    J, Edosomwan; TO, Edosomwan.

    2010-12-01

    Full Text Available We compared the information retrieval performances of some popular search engines (namely, Google, Yahoo, AlltheWeb, Gigablast, Zworks, AltaVista and Bing/MSN) in response to a list of ten queries, varying in complexity. These queries were run on each search engine and the precision and response [...] time of the retrieved results were recorded. The first ten documents on each retrieval output were evaluated as being 'relevant' or 'non-relevant' for evaluation of the search engine's precision. To evaluate response time, normalised recall ratios were calculated at various cut-off points for each query and search engine. This study shows that Google appears to be the best search engine in terms of both average precision (70%) and average response time (2 s). Gigablast and AlltheWeb performed the worst overall in this study.

  3. Comparative analysis of some search engines

    Directory of Open Access Journals (Sweden)

    Taiwo O. Edosomwan

    2010-10-01

    Full Text Available We compared the information retrieval performances of some popular search engines (namely, Google, Yahoo, AlltheWeb, Gigablast, Zworks, AltaVista and Bing/MSN) in response to a list of ten queries, varying in complexity. These queries were run on each search engine and the precision and response time of the retrieved results were recorded. The first ten documents on each retrieval output were evaluated as being ‘relevant’ or ‘non-relevant’ for evaluation of the search engine’s precision. To evaluate response time, normalised recall ratios were calculated at various cut-off points for each query and search engine. This study shows that Google appears to be the best search engine in terms of both average precision (70%) and average response time (2 s). Gigablast and AlltheWeb performed the worst overall in this study.
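The precision measure described in this record can be sketched as a small cut-off computation: each of the first ten retrieved documents is judged relevant (1) or not (0), and precision is the fraction of relevant documents at the cut-off, averaged over queries. The relevance judgments in the usage example are hypothetical, not the study's data.

```python
def precision_at_k(judgments, k=10):
    """Fraction of relevant documents among the first k retrieved.
    `judgments` is a list of 1 (relevant) / 0 (non-relevant) flags in
    retrieval order, as produced by a human assessor."""
    top = judgments[:k]
    return sum(top) / len(top)

def average_precision_over_queries(runs, k=10):
    """Mean precision-at-k over a list of per-query judgment lists,
    i.e. one engine's 'average precision' figure in the study's sense."""
    return sum(precision_at_k(r, k) for r in runs) / len(runs)
```

For example, a query whose top ten results contain seven relevant documents scores 0.7, and an engine's overall figure is the mean of these scores across all ten test queries.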

  4. Resilience and electricity systems: A comparative analysis

    International Nuclear Information System (INIS)

    Electricity systems have generally evolved based on the natural resources available locally. Few metrics exist to compare the security of electricity supply of different countries despite the increasing likelihood of potential shocks to the power system like energy price increases and carbon price regulation. This paper seeks to calculate a robust measure of national power system resilience by analysing each step in the process of transformation from raw energy to consumed electricity. Countries with sizeable deposits of mineral resources are used for comparison because of the need for electricity-intensive metals processing. We find that shifts in electricity-intensive industry can be predicted based on countries' power system resilience. - Highlights: We establish a resilience index measure for major electricity systems. We examine a range of OECD and developing-nation electricity systems and their ability to cope with shocks. Robustness measures are established to show the resilience of electricity systems.

  5. Nonlinear analysis of RED - a comparative study

    International Nuclear Information System (INIS)

    Random Early Detection (RED) is an active queue management (AQM) mechanism for routers on the Internet. In this paper, performance of RED and Adaptive RED are compared from the viewpoint of nonlinear dynamics. In particular, we reveal the relationship between the performance of the network and its nonlinear dynamical behavior. We measure the maximal Lyapunov exponent and Hurst parameter of the average queue length of RED and Adaptive RED, as well as the throughput and packet loss rate of the aggregate traffic on the bottleneck link. Our simulation scenarios include FTP flows and Web flows, one-way and two-way traffic. In most situations, Adaptive RED has smaller maximal Lyapunov exponents, lower Hurst parameters, higher throughput and lower packet loss rate than that of RED. This confirms that Adaptive RED has better performance than RED

  6. Detailed description and user's manual of high burnup fuel analysis code EXBURN-I

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Saitou, Hiroaki

    1997-11-01

    EXBURN-I has been developed for the analysis of LWR high burnup fuel behavior in normal operation and power transient conditions. In the high burnup region, phenomena occur that differ qualitatively from what would be expected by extending mid-burnup behavior. To analyze these phenomena, EXBURN-I has been formed by incorporating new models, such as pellet thermal conductivity change, burnup-dependent FP gas release rate, and cladding oxide layer growth, into the basic structure of the low- and mid-burnup fuel analysis code FEMAXI-IV. The present report describes in detail the whole structure of the code, its models, and materials properties. It also includes a detailed input manual, sample output, etc. (author). 55 refs.

  7. Community detection algorithms: a comparative analysis

    CERN Document Server

    Lancichinetti, Andrea

    2009-01-01

    Uncovering the community structure exhibited by real networks is a crucial step towards an understanding of complex systems that goes beyond the local organization of their constituents. Many algorithms have been proposed so far, but none of them has been subjected to strict tests to evaluate their performance. Most of the sporadic tests performed so far involved small networks with known community structure and/or artificial graphs with a simplified structure, which is very uncommon in real systems. Here we test several methods against a recently introduced class of benchmark graphs, with heterogeneous distributions of degree and community size. The methods are also tested against the benchmark by Girvan and Newman and on random graphs. As a result of our analysis, three recent algorithms introduced by Rosvall and Bergstrom, Blondel et al. and Ronhovde and Nussinov, respectively, have an excellent performance, with the additional advantage of low computational complexity, which enables one to analyze large s...

  8. Comparative Analysis Of Cloud Computing Security Issues

    Directory of Open Access Journals (Sweden)

    AKRAM MUJAHID

    2014-01-01

    Full Text Available Almost all organizations are seriously considering adopting cloud computing services, seeing benefits in terms of cost, accessibility, availability, flexibility and a highly automated update process. Cloud computing enhances current capabilities dynamically without further investment. Cloud computing is a band of resources, applications and services. In cloud computing, customers access IT-related services in terms of infrastructure, platform and software without knowledge of the underlying technologies. With the adoption of cloud computing, organizations have strong concerns about the security of their data. Organizations hesitate to take initiatives in the deployment of their businesses due to data security problems. This paper gives an overview of cloud computing and an analysis of security issues in cloud computing.

  9. Investigating and Comparing User Experiences of Course Management Systems: Blackboard vs. Moodle

    Science.gov (United States)

    Unal, Zafer; Unal, Aslihan

    2014-01-01

    The goal of this study is to report the results of a comparative usability study conducted in 2008-2009 on two different course management systems (CMS), BlackBoard and Moodle. 135 students enrolled in the fall 2008 and spring 2009 section of Introduction to Educational Technology participated in the study (72 and 63 respectively). At the…

  10. Point Analysis in Java applied to histological images of the perforant pathway: A user’s account

    Science.gov (United States)

    Wright, Susan N.; Card, J. Patrick; Ascoli, Giorgio A.; Barrionuevo, Germán

    2015-01-01

    The freeware Java tool PAJ, created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (2× objective) comprised the entire perforant pathway, while the high magnification set (100× objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis for down-sampled data sets and a friendly interface with automated plot drawings. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ. PMID:18350259

  11. Comparative economic analysis: Anaerobic digester case study

    International Nuclear Information System (INIS)

    An economic guide is developed to assess the value of anaerobic digesters used on dairy farms. Two varieties of anaerobic digesters, a conventional mixed-tank mesophilic and an innovative earthen psychrophilic, are comparatively evaluated using a cost-effectiveness index. The two case study examples are also evaluated using three other investment merit statistics: simple payback period, net present value, and internal rate of return. Life-cycle savings are estimated for both varieties, with sensitivities considered for investment risk. The conclusion is that an earthen psychrophilic digester can have a significant economic advantage over a mixed-tank mesophilic digester because of lower capital cost and reduced operation and maintenance expenses. Because of this economic advantage, additional projects are being conducted in North Carolina to increase the rate of biogas utilization. The initial step includes using biogas for milk cooling at the dairy farm where the existing psychrophilic digester is located. Further, a new project is being initiated for electricity production with thermal reclaim at a swine operation
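The three investment merit statistics named in this record (simple payback period, net present value, and internal rate of return) can be sketched as follows. The cash flows in the test are hypothetical round numbers, not the case-study data, and the IRR solver assumes a single sign change in the cash-flow series.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) initial
    investment at t = 0, later entries are end-of-year net savings."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0):
    """Internal rate of return: the discount rate at which NPV = 0,
    found by bisection (assumes NPV changes sign once on [lo, hi])."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

def simple_payback(investment, annual_saving):
    """Years of constant annual savings needed to recover the investment,
    ignoring discounting (the crudest of the three statistics)."""
    return investment / annual_saving
```

For an illustrative digester costing 100 units with 30 units of annual net savings over five years, the payback is about 3.3 years and the IRR lands in the mid-teens, which is the kind of comparison the index in the abstract formalizes.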

  12. Comparative proteomics analysis of human gastric cancer

    Directory of Open Access Journals (Sweden)

    Wei Li, Jian-Fang Li, Ying Qu, Xue-Hua Chen, Jian-Min Qin, Qin-Long Gu, Min Yan, Zheng-Gang Zhu, Bing-Ya Liu

    2008-10-01

    Full Text Available AIM: To isolate and identify differentially expressed proteins between cancer and normal tissues of gastric cancer by two-dimensional electrophoresis (2-DE) and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS). METHODS: Soluble-fraction proteins of gastric cancer tissues and paired normal tissues were separated by 2-DE. The differentially expressed proteins were selected and identified by MALDI-TOF-MS and database search. RESULTS: 2-DE profiles with high resolution and reproducibility were obtained. Twenty-three protein spots were excised from the silver-stained gel and digested in gel by trypsin, of which fifteen protein spots were identified successfully. Among the identified proteins, ten were over-expressed and five under-expressed in stomach cancer tissues compared with normal tissues. CONCLUSION: In this study, well-resolved, reproducible 2-DE patterns of human gastric cancer tissue and paired normal tissue were established and optimized, and certain differentially expressed proteins were identified. The combined use of 2-DE and MS provides an effective approach to screen for potential tumor markers.

  13. CARBON SEQUESTRATION: A METHODS COMPARATIVE ANALYSIS

    International Nuclear Information System (INIS)

    All human activities are related to energy consumption. Energy requirements will continue to rise, due to modern life and the growth of developing countries. Most of the energy demand emanates from fossil fuels. Fossil fuel combustion has negative environmental impacts, with CO2 production dominating. Fulfillment of the Kyoto protocol criteria requires the minimization of CO2 emissions, so the management of CO2 emissions is an urgent matter. The use of appliances with low energy consumption and the adoption of an energy policy that prevents unnecessary energy use can lead to the reduction of carbon emissions. A different route is the introduction of 'clean' energy sources, such as renewable energy sources. Last but not least, the development of carbon sequestration methods can be a promising technique with large future potential. The objective of this work is the analysis and comparison of different carbon sequestration and deposit methods. Ocean deposit, land ecosystem deposit, geological formation deposit and radical biological and chemical approaches are analyzed.

  14. Users' guide to the orthopaedic literature: What is a cost-effectiveness analysis?

    Directory of Open Access Journals (Sweden)

    Tanner Stephanie

    2008-01-01

    Full Text Available As the costs of healthcare continue to rise, orthopaedic surgeons are being pressured to practice cost-effective healthcare. Consequently, economic evaluations of treatment options are being reported more commonly in the medical and surgical literature. As new orthopaedic procedures and treatments may improve patient outcome and function over traditional treatment options, the effect of the potentially higher costs of new treatments should be formally evaluated. Unfortunately, the resources available for healthcare spending are typically limited. Therefore, cost-effectiveness analyses have become an important and useful tool in informing which procedure or treatment to implement into practice. Cost-effectiveness analysis is a type of economic analysis that compares both the clinical outcomes and the costs of new treatment options to current treatment options or standards of care. For clinicians to be able to apply the results of a cost-effectiveness analysis to their practice, they must be able to critically review the available literature. Conducting an economic analysis is a challenging process, which has resulted in a number of published economic analyses that are of lower quality and may be fraught with bias. It is important that the reader of an economic analysis or cost-effectiveness analysis have the skills required to properly evaluate and critically appraise the methodology used before applying the recommendations to their practice. Using the principles of evidence-based medicine and the questions outlined in the Journal of the American Medical Association's Users' Guide to the Medical Literature, this article attempts to illustrate how to critically appraise a cost-effectiveness analysis in the orthopaedic surgery literature.
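The core statistic of a cost-effectiveness analysis as described above is the incremental cost-effectiveness ratio (ICER): the extra cost per extra unit of clinical effect (for example, per quality-adjusted life year gained) of a new treatment versus the standard of care. A minimal sketch with hypothetical figures:

```python
def icer(cost_new, effect_new, cost_std, effect_std):
    """Incremental cost-effectiveness ratio of a new treatment versus
    the standard of care: (delta cost) / (delta effect)."""
    d_effect = effect_new - effect_std
    if d_effect == 0:
        raise ValueError("equal effectiveness; compare costs directly")
    return (cost_new - cost_std) / d_effect

def cost_effective(cost_new, effect_new, cost_std, effect_std, threshold):
    """Conventional decision rule: adopt the new treatment if its ICER
    falls below a willingness-to-pay threshold per unit of effect."""
    return icer(cost_new, effect_new, cost_std, effect_std) <= threshold
```

For instance, a new procedure costing 12,000 with 4.5 QALYs against a standard costing 8,000 with 4.0 QALYs has an ICER of 8,000 per QALY. A negative ICER needs care in interpretation: a cheaper and more effective treatment simply dominates.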

  15. Anti-human immunodeficiency virus-1 antibody titers in injection drug users compared to sexually infected individuals

    Directory of Open Access Journals (Sweden)

    Bongertz Vera

    2003-01-01

    Full Text Available Sera from infected injection drug users (IDU) have been shown to have antibodies against synthetic human immunodeficiency virus-1 (HIV-1) envelope peptides more frequently. In this study, the reactivity of 48 IDU plasmas was compared to that of 60 plasmas obtained from sexually infected individuals (S). The overall reactivity of plasma from IDU compared to S was higher, and the reactivity titers were much higher for IDU plasma than for S. IDU plasma also showed a broader antibody response. The higher reactivity titers were observed mainly for the gp41 immunodominant epitope and V3 peptides corresponding to the consensus sequences of HIV-1 subtypes/variants prevalent in Brazil (B, F, C), indicating the specificity of the higher immune response of IDU.

  16. Anti-human immunodeficiency virus-1 antibody titers in injection drug users compared to sexually infected individuals

    Scientific Electronic Library Online (English)

    Vera, Bongertz; Elaine Priscilla, Ouverney; Sylvia LM, Teixeira; Carlos, Silva-de-Jesus; Mariana A, Hacker; Mariza G, Morgado; Francisco I, Bastos.

    2003-03-01

    Full Text Available Sera from infected injection drug users (IDU) have been shown to have antibodies against synthetic human immunodeficiency virus-1 (HIV-1) envelope peptides more frequently. In this study, the reactivity of 48 IDU plasmas was compared to that of 60 plasmas obtained from sexually infected individuals (S). The overall [...] reactivity of plasma from IDU compared to S was higher, and the reactivity titers were much higher for IDU plasma than for S. IDU plasma also showed a broader antibody response. The higher reactivity titers were observed mainly for the gp41 immunodominant epitope and V3 peptides corresponding to the consensus sequences of HIV-1 subtypes/variants prevalent in Brazil (B, F, C), indicating the specificity of the higher immune response of IDU.

  17. A comparative 2D modeling of debris-flow propagation and outcomes for end-users

    Science.gov (United States)

    Bettella, F.; Bertoldi, G.; Pozza, E.; McArdell, B. W.; D'Agostino, V.

    2012-04-01

    In Alpine regions gravity-driven natural hazards, in particular debris flows, endanger settlements and human life. Mitigation strategies based on hazard maps are necessary tools for land planning. These maps can be made more precise by using numerical models to forecast the inundated areas after a careful setting of those 'key parameters' (K-P) which directly affect the flow motion and its interaction with the ground surface. Several physically based 2D models are available for practitioners and governmental agencies, but the selection criteria for the model type and the related K-P remain flexible and partly subjective. This remark has driven us to investigate how different models simulate different types of debris flows (from granular to muddy debris flows, going through intermediate types), in particular when the flow is influenced by the presence of deposition basins. Two commercial 2D physical models (RAMMS and FLO-2D) have been tested on five well-documented debris-flow events from five Italian catchments where different geology and flow dynamics are observed: 1) a viscous debris flow that occurred in 2009 in a catchment with a metamorphic geology (Gadria torrent, Bolzano Province); 2) the 2009 granular debris flow in a granitic geological setting (Rio Dosson, Trento Province); 3-4) two events that occurred in the 'rio Val del Lago' and 'rio Molinara' (Trento Province) in 2010, where porphyritic lithology prevails (intermediate granular debris flows); 5) the Rotolon torrent (Vicenza Province) 2009 debris flow containing sedimentary rocks enclosed in an abundant clay-rich matrix (intermediate viscous case). Event volumes range from 5,000 to 50,000 cubic meters. The Gadria, Rotolon and Val del Lago events are also influenced by artificial retention basins. Case study simulations allowed delineation of some practical end-user suggestions and good practices in order to guide the model choice and the K-P setting, particularly in relation to different flow dynamics.
The presence of mitigation structures (e.g. check dams and retention basins) demands both the implementation of a precise topography and the introduction of devices to better model sediment trapping and the functionality of open/closed check dams. The study results represent: i) a first support for practitioners to directly manage debris-flow simulations; ii) a help for local authorities in properly valuing the simulations carried out by practitioners and the scientific community; iii) a warning that hazard maps should not be based solely on model simulation results.

  18. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

    Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well-known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
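    The EM step that the macros wrap can be sketched outside SPSS. Below is a minimal, illustrative numpy implementation of EM estimation of the mean vector and covariance matrix under a multivariate-normal model with values missing at random; it is not the MVA procedure itself, and the example data are invented:

```python
import numpy as np

def em_covariance(X, n_iter=100, tol=1e-8):
    """EM estimates of mean and covariance for data with np.nan entries,
    assuming multivariate normality and values missing at random."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    miss = np.isnan(X)
    # Start from available-case means and a diagonal covariance.
    mu = np.nanmean(X, axis=0)
    sigma = np.diag(np.nanvar(X, axis=0))
    for _ in range(n_iter):
        Xf = X.copy()
        corr = np.zeros((p, p))          # accumulated conditional covariances
        for i in range(n):
            m = miss[i]
            if not m.any():
                continue
            o = ~m
            soo_inv = np.linalg.inv(sigma[np.ix_(o, o)])
            smo = sigma[np.ix_(m, o)]
            # E-step: conditional mean of the missing block given the observed
            Xf[i, m] = mu[m] + smo @ soo_inv @ (X[i, o] - mu[o])
            # ...plus the conditional covariance of the missing block
            corr[np.ix_(m, m)] += sigma[np.ix_(m, m)] - smo @ soo_inv @ smo.T
        # M-step: update mean and covariance from the completed data
        mu_new = Xf.mean(axis=0)
        d = Xf - mu_new
        sigma_new = (d.T @ d + corr) / n
        done = np.max(np.abs(sigma_new - sigma)) < tol
        mu, sigma = mu_new, sigma_new
        if done:
            break
    return mu, sigma

# Example: three correlated variables with ~10% of cells knocked out.
rng = np.random.default_rng(0)
data = rng.standard_normal((200, 3)) @ np.array([[1, .5, .2], [0, 1, .3], [0, 0, 1]])
data[rng.random(data.shape) < 0.1] = np.nan
em_mu, em_sigma = em_covariance(data)
```

The resulting covariance (or its rescaled correlation matrix) can then play the same role as the MVA output fed into FACTOR. With no missing values the estimate reduces to the ordinary maximum-likelihood sample covariance.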

  19. Comparative analysis from hydroelectric generation versus natural gas

    International Nuclear Information System (INIS)

    The aim of the work was to present a comparative analysis between hydroelectric generation and natural gas based on integrated resource planning and sustainable development. The presented comparative analysis considers the financial aspects; the appropriate technology; and the social, environmental and political factors. The hydroelectric option proved more advantageous than the thermoelectric one, independently of the scale of the enterprise

  20. Tracking and Analysis Framework (TAF) model documentation and user`s guide

    Energy Technology Data Exchange (ETDEWEB)

    Bloyd, C.; Camp, J.; Conzelmann, G. [and others

    1996-12-01

    With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives - specifically, by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO{sub 2}). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this {open_quotes}grand experiment.{close_quotes} Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF`s objectives and its overall design.

  1. A Methodology for Evaluating User Perceptions of the Delivery of ICT Services: a comparative study of six UK local authorities

    Directory of Open Access Journals (Sweden)

    Les Worrall

    2000-11-01

    Full Text Available Evaluating and managing the effective delivery of ICT services is an issue that has recently been brought into sharper relief. This has been particularly prevalent in the UK public sector, where the growing emphasis on formalised client-contractor relationships, outsourcing and benchmarking (both between local authorities and between local authorities and private sector organisations) has meant that the definition of service standards and the agreement of performance criteria have attracted considerable practitioner attention. This research is based on 295 interviews conducted in six UK local authorities. The investigation used both gap analysis and perceptual mapping techniques to develop an understanding of the aspects of ICT service delivery that users value most, in conjunction with an assessment of how well they perceive their ICT department is performing on these criteria. The paper exposes considerable differences in the relative performance of the six local authorities from both the gap analysis and the perceptual mapping elements of the investigation. The methodology is shown to provide an effective way of identifying key performance issues from the user perspective and benchmarking service performance across organisations.
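    The gap-analysis element of such a methodology is simple to sketch: for each service criterion, the gap is the mean importance rating minus the mean perceived-performance rating, and criteria are ranked by gap. The criteria names and ratings below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical criteria and 1-7 ratings (rows = respondents, cols = criteria).
criteria = ["response time", "help desk quality", "system reliability", "training"]
importance = np.array([
    [7, 5, 7, 4],
    [6, 6, 7, 5],
    [7, 4, 6, 4],
])
performance = np.array([
    [4, 5, 6, 4],
    [3, 4, 6, 5],
    [4, 5, 5, 3],
])

# Service-quality gap per criterion: importance minus perceived performance.
gaps = importance.mean(axis=0) - performance.mean(axis=0)
# Rank criteria from largest gap (highest improvement priority) downwards.
ranking = [criteria[i] for i in np.argsort(-gaps, kind="stable")]
```

Perceptual mapping would additionally project criteria into a low-dimensional space (e.g. via principal components); the gap ranking alone already identifies the criteria users value most but rate worst.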

  2. A Cooperative Diversity Analysis of Two User Mobile Communication System with Maximal Ratio Combining

    OpenAIRE

    Sateeshkrishna Dhuli; Mani, V. V.

    2013-01-01

    Cooperative communication is going to play a vital role in the next generation wireless networks. In this paper we derive the expression for symbol error probability (SEP) of a two-user cooperative diversity system, where two users cooperate through the decode-and-forward (DF) relaying with binary phase-shift keying (BPSK) modulation in a flat Rayleigh fading environment. We compare the computational results obtained by the SEP expression with the simulation results using maximal-ratio combi...
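    The setting can be illustrated with a Monte Carlo sketch (not the paper's derivation): BPSK over flat Rayleigh fading, comparing a direct link against a simple always-forward decode-and-forward relay whose retransmission the destination combines with the direct signal via MRC. The closed-form single-link SEP, 0.5(1 - sqrt(g/(1+g))) for average SNR g, serves as a sanity check; the always-forward policy is a simplifying assumption (selective DF avoids the error propagation it suffers from):

```python
import numpy as np

rng = np.random.default_rng(42)

def _rayleigh(n):
    # Unit-power complex Rayleigh fading coefficients.
    return (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

def _awgn(n, snr):
    # Complex noise with total power 1/snr (unit-energy BPSK symbols).
    return (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2 * snr)

def sep_direct(snr_db, n=200_000):
    """Monte Carlo SEP of coherent BPSK over flat Rayleigh fading."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n)
    s = 2.0 * bits - 1.0
    h = _rayleigh(n)
    r = h * s + _awgn(n, snr)
    return np.mean((np.real(np.conj(h) * r) > 0).astype(int) != bits)

def sep_df_mrc(snr_db, n=200_000):
    """Two-user cooperation: always-forward DF relay, MRC at the destination."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n)
    s = 2.0 * bits - 1.0
    h_sd, h_sr, h_rd = _rayleigh(n), _rayleigh(n), _rayleigh(n)
    y_sd = h_sd * s + _awgn(n, snr)               # phase 1: source broadcasts
    y_sr = h_sr * s + _awgn(n, snr)
    s_r = np.sign(np.real(np.conj(h_sr) * y_sr))  # relay decodes...
    y_rd = h_rd * s_r + _awgn(n, snr)             # ...and re-transmits its decision
    # Destination MRC over the direct and relayed observations.
    metric = np.real(np.conj(h_sd) * y_sd) + np.real(np.conj(h_rd) * y_rd)
    return np.mean((metric > 0).astype(int) != bits)

def sep_exact_direct(snr_db):
    """Closed-form SEP of BPSK over flat Rayleigh fading."""
    g = 10 ** (snr_db / 10)
    return 0.5 * (1 - np.sqrt(g / (1 + g)))
```

At moderate SNR the cooperative scheme already beats the direct link even with naive always-forward relaying; the paper's analytical SEP expression accounts for the relay's decoding errors exactly.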

  3. Towards for Analyzing Alternatives of Interaction Design Based on Verbal Decision Analysis of User Experience

    Directory of Open Access Journals (Sweden)

    Marília Soares Mendes

    2010-04-01

    Full Text Available In domains (such as digital TV, smart home, and tangible interfaces) that represent a new paradigm of interactivity, deciding on the most appropriate interaction design solution is a challenge. HCI researchers have promoted in their works the validation of alternative design solutions with users before producing the final solution. User experience with technology is a subject that has also gained ground in these works as a means of analyzing the appropriate solution(s). Following this concept, a study was carried out with the objective of finding a better interaction solution for a mobile TV application. Three executable mobile TV prototypes were built. A Verbal Decision Analysis model was applied to investigate the favorite characteristics of each prototype, based on the users' experience and their intentions of use. This model supported a qualitative analysis that guided the design of a new prototype.

  4. Modelling User-Costs in Life Cycle Cost-Benefit (LCCB) analysis

    OpenAIRE

    Thoft-Christensen, Palle

    2009-01-01

    The importance of including user costs in Life-Cycle Cost-Benefit analysis of structures is discussed in this paper. This is of great importance especially for bridges. Repair or failure of a bridge will usually result in user costs greater than the repair or replacement costs of the bridge itself. For society (and the users) it is therefore of great importance that maintenance or replacement of a bridge is performed in such a way that all costs are minimized - not only the owner's costs.
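    The point can be made concrete with a toy net-present-value comparison in which user delay costs during repairs are either ignored or included. All figures and strategies below are invented for illustration:

```python
def npv(cash_flows, rate):
    """Net present value of yearly costs; cash_flows[t] is incurred at year t."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(cash_flows))

rate = 0.05
# Strategy A: one big repair in year 10 (cheap for the owner, long closure,
# hence large user delay costs).
owner_a = [0] * 10 + [400]
user_a = [0] * 10 + [900]
# Strategy B: small preventive repairs in years 4 and 8 (short closures).
owner_b = [0] * 4 + [200] + [0] * 3 + [200] + [0] * 2
user_b = [0] * 4 + [100] + [0] * 3 + [100] + [0] * 2

# Owner-only comparison vs. the full life-cycle comparison including users.
owner_only = (npv(owner_a, rate), npv(owner_b, rate))
total = (npv([a + u for a, u in zip(owner_a, user_a)], rate),
         npv([b + u for b, u in zip(owner_b, user_b)], rate))
```

With these numbers the owner-only view prefers strategy A while the total-cost view prefers B, which is exactly the distortion the paper warns about when user costs are left out.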

  5. "Do Users Do What They Think They Do?"- A Comparative Study of User Perceived and Actual Information Searching Behaviour in the National Electronic Library of Infection

    Science.gov (United States)

    Roy, Anjana; Kostkova, Patty; Catchpole, Mike; Carson, Ewart

    In the last decade, the Internet has profoundly changed the delivery of healthcare. Medical websites for professionals and patients are playing an increasingly important role in providing the latest evidence-based knowledge for professionals, facilitating virtual patient support groups, and providing an invaluable information source for patients. Information seeking is the key user activity on the Internet. However, the discrepancy between what information is available and what the user is able to find has a profound effect on user satisfaction. The UK National electronic Library of Infection (NeLI, www.neli.org.uk) and its subsidiary projects provide a single-access portal for quality-appraised evidence in infectious diseases. We use this national portal, as test-bed for investigating our research questions. In this paper, we investigate actual and perceived user navigation behaviour that reveals important information about user perceptions and actions, in searching for information. Our results show: (i) all users were able to access information they were seeking; (ii) broadly, there is an agreement between "reported" behaviour (from questionnaires) and "observed" behaviour (from web logs), although some important differences were identified; (iii) both browsing and searching were equally used to answer specific questions and (iv) the preferred route for browsing for data on the NeLI website was to enter via the "Top Ten Topics" menu option. These findings provide important insights into how to improve user experience and satisfaction with health information websites.

  6. User Behavior Analysis from Web Log using Log Analyzer Tool

    Directory of Open Access Journals (Sweden)

    Brijesh Bakariya

    2013-11-01

    Full Text Available Nowadays, the internet plays the role of a huge database in which many websites, information sources and search engines are available. But due to the unstructured and semi-structured data in webpages, extracting relevant information has become a challenging task. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because the discovered patterns are numerous and contain a lot of noise and uncertainty. In this paper, web usage mining is analyzed with the help of web log data, using the web log analyzer tool “Deep Log Analyzer” to extract summary information from a particular server, to investigate user behaviour, and to develop an ontology that captures the relations among the elements of web usage mining.
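    The kind of abstraction a log analyzer extracts can be sketched directly: parse NCSA/Apache combined-format lines and tally successful requests per client and per page. The sample log lines below are invented:

```python
import re
from collections import Counter

# NCSA combined log format: host, identity, user, [time], "request", status, size, ...
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def summarise(lines):
    """Count successful (2xx) requests per client host and per requested path."""
    by_host, by_path = Counter(), Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m or not m.group("status").startswith("2"):
            continue
        by_host[m.group("host")] += 1
        by_path[m.group("path")] += 1
    return by_host, by_path

sample = [
    '10.0.0.1 - - [10/Oct/2013:13:55:36 -0700] "GET /index.html HTTP/1.1" 200 2326 "-" "Mozilla/5.0"',
    '10.0.0.1 - - [10/Oct/2013:13:55:40 -0700] "GET /docs/a.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '10.0.0.2 - - [10/Oct/2013:13:56:02 -0700] "GET /index.html HTTP/1.1" 404 209 "-" "Mozilla/5.0"',
]
hosts, paths = summarise(sample)
```

Sessionization (grouping a host's requests by time gaps) and referrer chains build on the same parsed fields; tools such as the one described above layer reporting and ontology construction on top of counts like these.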

  7. An Analysis and Knowledge Representation System to attain the genuine web user usage behavior

    OpenAIRE

    V.V.R. Maheswara Rao; Dr. V. Valli Kumari

    2013-01-01

    With the explosive growth of the WWW, web mining techniques are intensively applied to discover the relevant behaviors of web users from web log data. In fact, pattern discovery techniques generate many hundreds, often thousands, of patterns that are unwanted, unexpected, disputable and unbelievable in nature. The success of extracting real knowledge out of such patterns is highly reliant on the pattern analysis stage when investigating web user usage behavior. To retain m...

  8. Automated video analysis as a tool for analysing road user behaviour

    OpenAIRE

    Laureshyn, Aliaksei; Ardö, Håkan

    2006-01-01

    At Lund University an automated video analysis system is being developed that can be applied for studying the behaviour of road users in complex traffic environments. It is stressed that the system must be capable of handling all the categories of road users, i.e. vehicles, pedestrians and cyclists. Common problems like detection and tracking of moving objects, occlusion by foreground objects, ground-plane co-ordinates estimation, smoothing of the scattered data and estimation of speed and accele...

  9. Using Latent Semantic Analysis to Identify Quality in Use (QU) Indicators from User Reviews

    OpenAIRE

    Syn, Wendy Tan Wei; How, Bong Chih; Atoum, Issa

    2015-01-01

    The paper describes a novel approach to categorize users' reviews according to the three Quality in Use (QU) indicators defined in ISO: effectiveness, efficiency and freedom from risk. With the tremendous amount of reviews published each day, there is a need to automatically summarize user reviews to tell us whether any of the software is able to meet the requirements of a company according to its quality requirements. We implemented the method of Latent Semantic Analysis (LSA) and its...
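    LSA itself is mechanical enough to sketch with a toy term-document matrix. The reviews and category seed texts below are invented, and the mapping to the ISO indicators is the paper's contribution, not reproduced here; this only shows the truncated-SVD machinery:

```python
import numpy as np

docs = [
    "task goals achieved accuracy effectiveness",   # seed: effectiveness
    "speed fast time resources efficiency",         # seed: efficiency
    "crash data loss security risk failure",        # seed: freedom from risk
    "fast app saves time",                          # a user review
]
vocab = sorted({w for d in docs for w in d.split()})
# Term-document count matrix (terms x docs).
A = np.array([[d.split().count(w) for d in docs] for w in vocab], float)

# LSA: truncated SVD of the term-document matrix.
U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(S[:k]) @ Vt[:k]).T   # rank-k document representations

def cosine(a, b):
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    return float(a @ b / (na * nb)) if na > 0 and nb > 0 else 0.0

# Similarity of the review to each category seed in the latent space.
sims = [cosine(doc_vecs[3], doc_vecs[i]) for i in range(3)]

# Rank-k reconstruction error is non-increasing in k (Eckart-Young).
err = [np.linalg.norm(A - U[:, :r] @ np.diag(S[:r]) @ Vt[:r]) for r in (1, 2, 3)]
```

A real pipeline would use tf-idf weighting and far larger corpora; the review would then be assigned to the QU category whose seed it is most similar to in the latent space.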

  10. FACTOR ANALYSIS OF USERS PREFERENCE ATTRIBUTES IN USING WEB OPAC IN ACADEMIC LIBRARY: A SURVEY

    OpenAIRE

    Sahu, Mahendra K.; SIBA PRASAD PANDA

    2013-01-01

    The purpose of this paper is to gain an understanding of the acceptance and preference attributes of Web OPAC services among the users of Gandhi Group of Institutions, Odisha, who use the library regularly. A Web OPAC is an online public access catalogue that provides full descriptions of the library resources online. In this study a Reliability Analysis was conducted to test users' preference for Web OPAC features.

  11. Tools for comparative protein structure modeling and analysis

    OpenAIRE

    Eswar, Narayanan; John, Bino; Mirkovic, Nebojsa; Fiser, Andras; Ilyin, Valentin A.; Pieper, Ursula; Stuart, Ashley C.; Marti-renom, Marc A.; Madhusudhan, M. S.; Yerkovich, Bozidar; Sali, Andrej

    2003-01-01

    The following resources for comparative protein structure modeling and analysis are described (http://salilab.org): MODELLER, a program for comparative modeling by satisfaction of spatial restraints; MODWEB, a web server for automated comparative modeling that relies on PSI-BLAST, IMPALA and MODELLER; MODLOOP, a web server for automated loop modeling that relies on MODELLER; MOULDER, a CPU intensive protocol of MODWEB for building comparative models based on distant known structures; MODBASE,...

  12. Visual Analysis of Controversy in User-generated Encyclopedias

    OpenAIRE

    Brandes, Ulrik; Lerner, Jürgen

    2007-01-01

    Wikipedia is a large and rapidly growing Web-based collaborative authoring environment, where anyone on the Internet can create, modify, and delete pages about encyclopedic topics. A remarkable property of some Wikipedia pages is that they are written by up to thousands of authors who may have contradicting opinions. In this paper we show that a visual analysis of the who-revises-whom network gives deep insight into controversies. We propose a set of analysis and visualization techniques t...

  13. Vegetation Change Analysis User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    D. J. Hansen; W. K. Ostler

    2002-10-01

    Approximately 70 percent of all U.S. military training lands are located in arid and semi-arid areas. Training activities in such areas frequently adversely affect vegetation, damaging plants and reducing the resilience of vegetation to recover once disturbed. Fugitive dust resulting from a loss of vegetation creates additional problems for human health, increasing accidents due to decreased visibility, and increasing maintenance costs for roads, vehicles, and equipment. Diagnostic techniques are needed to identify thresholds of sustainable military use. A cooperative effort among U.S. Department of Energy, U.S. Department of Defense, and selected university scientists was undertaken to focus on developing new techniques for monitoring and mitigating military impacts in arid lands. This manual focuses on the development of new monitoring techniques that have been implemented at Fort Irwin, California. New mitigation techniques are described in a separate companion manual. This User's Manual is designed to address diagnostic capabilities needed to distinguish between various degrees of sustainable and nonsustainable impacts due to military training and testing and habitat-disturbing activities in desert ecosystems. Techniques described here focus on the use of high-resolution imagery and the application of image-processing techniques developed primarily for medical research. A discussion is provided about the measurement of plant biomass and shrub canopy cover in arid lands using conventional methods. Both semiquantitative methods and quantitative methods are discussed and reference to current literature is provided. A background about the use of digital imagery to measure vegetation is presented.

  14. arrayCGHbase: an analysis platform for comparative genomic hybridization microarrays

    Directory of Open Access Journals (Sweden)

    Moreau Yves

    2005-05-01

    Full Text Available Abstract Background The availability of the human genome sequence as well as the large number of physically accessible oligonucleotides, cDNA, and BAC clones across the entire genome has triggered and accelerated the use of several platforms for analysis of DNA copy number changes, amongst others microarray comparative genomic hybridization (arrayCGH). One of the challenges inherent to this new technology is the management and analysis of large numbers of data points generated in each individual experiment. Results We have developed arrayCGHbase, a comprehensive analysis platform for arrayCGH experiments consisting of a MIAME (Minimal Information About a Microarray Experiment) supportive database using MySQL underlying a data mining web tool, to store, analyze, interpret, compare, and visualize arrayCGH results in a uniform and user-friendly format. Following its flexible design, arrayCGHbase is compatible with all existing and forthcoming arrayCGH platforms. Data can be exported in a multitude of formats, including BED files to map copy number information on the genome using the Ensembl or UCSC genome browser. Conclusion ArrayCGHbase is a web based and platform independent arrayCGH data analysis tool, that allows users to access the analysis suite through the internet or a local intranet after installation on a private server. ArrayCGHbase is available at http://medgen.ugent.be/arrayCGHbase/.
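    The BED export mentioned above is simple to sketch. BED uses 0-based, half-open coordinates, so a 1-based inclusive segment [start, end] becomes (start-1, end); the segment data and score rescaling below are invented for illustration:

```python
def segments_to_bed(segments, track_name="arrayCGH"):
    """Render copy-number segments as BED lines.

    `segments` holds (chrom, start_1based, end_1based, label, log2ratio) tuples.
    BED starts are 0-based and ends exclusive, hence the `start - 1`.
    """
    lines = [f'track name="{track_name}"']
    for chrom, start, end, label, ratio in segments:
        # BED score must be an integer in 0-1000; rescale the log2 ratio crudely.
        score = max(0, min(1000, int(500 + 250 * ratio)))
        lines.append(f"{chrom}\t{start - 1}\t{end}\t{label}\t{score}")
    return "\n".join(lines)

bed = segments_to_bed([
    ("chr1", 1001, 5000, "gain", 0.5),
    ("chr17", 41196312, 41277500, "loss", -1.0),
])
```

Lines in this form can be loaded as a custom track in the UCSC or Ensembl genome browsers, which is exactly the round trip the platform's export enables.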

  15. Neutron activation analysis at the Californium User Facility for Neutron Science

    Energy Technology Data Exchange (ETDEWEB)

    Martin, R.C.; Smith, E.H.; Glasgow, D.C. [Oak Ridge National Lab., TN (United States); Jerde, E.A. [Oak Ridge Research Inst., TN (United States); Marsh, D.L. [Univ. of Tennessee, Knoxville, TN (United States); Zhao, L. [Univ. of Texas, Austin, TX (United States)

    1997-12-01

    The Californium User Facility (CUF) for Neutron Science has been established to provide {sup 252}Cf-based neutron irradiation services and research capabilities including neutron activation analysis (NAA). A major advantage of the CUF is its accessibility and controlled experimental conditions compared with those of a reactor environment. The CUF maintains the world`s largest inventory of compact {sup 252}Cf neutron sources. Neutron source intensities of {le} 10{sup 11} neutrons/s are available for irradiations within a contamination-free hot cell, capable of providing thermal and fast neutron fluxes exceeding 10{sup 8} cm{sup {minus}2} s{sup {minus}1} at the sample. Total flux of {ge}10{sup 9} cm{sup {minus}2} s{sup {minus}1} is feasible for large-volume irradiation rabbits within the {sup 252}Cf storage pool. Neutron and gamma transport calculations have been performed using the Monte Carlo transport code MCNP to estimate irradiation fluxes available for sample activation within the hot cell and storage pool and to design and optimize a prompt gamma NAA (PGNAA) configuration for large sample volumes. Confirmatory NAA irradiations have been performed within the pool. Gamma spectroscopy capabilities including PGNAA are being established within the CUF for sample analysis.
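    The irradiation side of NAA follows the standard activation relation A(t) = N·σ·φ·(1 - e^(-λt)), which saturates at N·σ·φ for long irradiations. A small sketch with illustrative nuclide numbers (not facility data):

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """Activity (Bq) induced in a target after irradiation time t_irr.

    Standard NAA activation equation: A = N * sigma * phi * (1 - exp(-lambda * t)),
    with lambda = ln(2) / half-life. Decay during counting is ignored here.
    """
    lam = math.log(2) / half_life_s
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr_s))

# Illustrative numbers: 1e20 target atoms, a 1-barn (1e-24 cm^2) cross-section,
# a 1e8 n/cm^2/s flux (the order quoted for the hot cell), 10-minute half-life.
N, sigma, phi, t_half = 1e20, 1e-24, 1e8, 600.0
a_one_half_life = induced_activity(N, sigma, phi, t_half, 600.0)
a_saturated = induced_activity(N, sigma, phi, t_half, 86400.0)
```

Irradiating for one half-life yields half the saturation activity, which is why short-lived activation products are attractive at modest {sup 252}Cf fluxes.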

  16. User's guide for 10 CFR 61 impact analysis codes

    International Nuclear Information System (INIS)

    This document explains how to use the Impact Analysis Codes used in the Draft Environmental Impact Statement (DEIS) (NUREG-0782, Vol. 1-4) supporting 10 CFR 61, Licensing Requirements for Land Disposal of Radioactive Waste. The mathematical development of the Impact Analysis Codes and other information necessary to understand the results of using the Codes is contained in the DEIS, and in a supporting document, Data Base for Radioactive Waste Management (NUREG/CR-1759, Vol. 1-3). This document was prepared with the intention of accompanying a computer magnetic tape containing the Impact Analysis Codes. A form is included at the end of this document which can be used to obtain such a tape

  17. Exploiting Formal Concept Analysis in a Customizing Recommendation for New User and Gray Sheep Problems

    Science.gov (United States)

    Li, Xiaohui; Murata, Tomohiro

    Recommender systems are becoming an indispensable application and are re-shaping the world of e-commerce. This paper reviews the major problems of existing recommender systems and presents a tracking recommendation approach based on information about users' behavior and a two-level property of items. A new recommendation model was constructed based on the synergistic use of knowledge from a repository, which includes users' behavior and item properties, and utilizes Formal Concept Analysis (FCA) mapping to guide personalized recommendations for the user. We simulate a prototype recommender system that can make quality recommendations by tracking user behavior, implementing the proposed approach and testing its performance. Experiments using two datasets show that our strategy is more robust against the drawbacks and outperforms traditional recommendation approaches in cold-start conditions.
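    The FCA machinery itself can be sketched with a brute-force concept enumeration over a tiny user-item context. The extent/intent closure below is standard FCA; the context data are invented, and a real recommender would work on far larger contexts with smarter algorithms (e.g. Next-Closure):

```python
from itertools import combinations

# Toy formal context: which user liked which item (invented data).
context = {
    "u1": {"i1", "i2"},
    "u2": {"i2", "i3"},
    "u3": {"i2"},
}
items = sorted(set().union(*context.values()))

def extent(attrs):
    """All objects (users) having every attribute (item) in `attrs`."""
    return {u for u, liked in context.items() if attrs <= liked}

def intent(objs):
    """All attributes (items) shared by every object in `objs`."""
    if not objs:
        return set(items)
    return set.intersection(*(context[u] for u in objs))

def formal_concepts():
    """Brute force: (A, B) is a concept iff A = extent(B) and B = intent(A)."""
    found = set()
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            b = intent(extent(set(combo)))   # closure of the attribute set
            found.add((frozenset(extent(b)), frozenset(b)))
    return found
```

Each concept pairs a maximal group of users with the exact item set they share; recommending to a new user then amounts to locating them in this lattice, which is how FCA-based approaches address cold-start and gray-sheep cases.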

  18. Comparative analysis of myocardial revascularization methods for ischemic heart disease

    Directory of Open Access Journals (Sweden)

    Sinkeev M.S.

    2012-09-01

    Full Text Available The review of the literature is devoted to a comparative analysis of clinical studies of the efficiency and the frequency of complications after application of surgical and medicamentous methods of treatment of coronary heart disease.

  19. Comparative analysis of different methods of radiation diagnosis of choledocholithiasis

    International Nuclear Information System (INIS)

    The comparative analysis of different methods of radiation diagnosis of choledocholithiasis, i.e. transabdominal ultrasonography, helical computed tomography, magnetic resonance and endoscopic retrograde cholangiography to optimize the indications for their use, depending on the clinical situation was performed

  20. TRUFA: A User-Friendly Web Server for de novo RNA-seq Analysis Using Cluster Computing.

    Science.gov (United States)

    Kornobis, Etienne; Cabellos, Luis; Aguilar, Fernando; Frías-López, Cristina; Rozas, Julio; Marco, Jesús; Zardoya, Rafael

    2015-01-01

    Application of next-generation sequencing (NGS) methods for transcriptome analysis (RNA-seq) has become increasingly accessible in recent years and is of great interest to many biological disciplines including, e.g., evolutionary biology, ecology, biomedicine, and computational biology. Although virtually any research group can now obtain RNA-seq data, only a few have the bioinformatics knowledge and computation facilities required for transcriptome analysis. Here, we present TRUFA (TRanscriptome User-Friendly Analysis), an open informatics platform offering a web-based interface that generates the outputs commonly used in de novo RNA-seq analysis and comparative transcriptomics. TRUFA provides a comprehensive service that allows dynamically performing raw read cleaning, transcript assembly, annotation, and expression quantification. Due to the computationally intensive nature of such analyses, TRUFA is highly parallelized and benefits from access to high-performance computing resources. The complete TRUFA pipeline was validated using four previously published transcriptomic data sets. TRUFA's results for the example datasets were globally similar to those of the original studies, and it performed particularly well when analyzing the green tea dataset. The platform permits analyzing RNA-seq data in a fast, robust, and user-friendly manner. Accounts on TRUFA are provided freely upon request at https://trufa.ifca.es. PMID:26056424

  2. NFAP: the nonlinear finite element analysis program. Users manual; Version 1977

    International Nuclear Information System (INIS)

    A brief outline of the analysis capability together with the input instructions is given for a nonlinear finite element analysis program called NFAP, which is an extended version of the NONSAP program. Extensions include additional element types, material models and several user features, as further described in the report. Similar to NONSAP, the NFAP program can be used for conducting linear or nonlinear analysis of various structures under static or dynamic loadings. Nonlinearities involve both nonlinear materials and large deformations

  3. HORECA. Hoger onderwijs reactor elementary core analysis system. User's manual

    International Nuclear Information System (INIS)

    HORECA is developed at IRI Delft for quick analysis of power distribution, burnup and safety for the HOR. It can be used for the manual search of a better loading of the reactor. HORECA is based on the Penn State Fuel Management Package and uses the MCRAC code included in this package as a calculation engine. (orig./HP)

  4. CULTURE AND SOCIAL MEDIA USAGE: ANALYSIS OF JAPANESE TWITTER USERS

    OpenAIRE

    Adam Acar; Ayaka Deguchi

    2013-01-01

    Twitter, one of the most popular microblogging tools, has been used extensively all around the world. However, to date, no study has addressed how culture influences the use of this communication platform. In order to close the literature gap and promote cross-cultural understanding, this paper content-analyzed 4,000 tweets from 200 college students in Japan and the USA. The results showed that Japanese college students post more self-related messages and ask fewer questions compared to ...

  5. A Conjoint Analysis Framework for Evaluating User Preferences in Machine Translation.

    Science.gov (United States)

    Kirchhoff, Katrin; Capurro, Daniel; Turner, Anne M

    2014-03-01

    Despite much research on machine translation (MT) evaluation, there is surprisingly little work that directly measures users' intuitive or emotional preferences regarding different types of MT errors. However, the elicitation and modeling of user preferences is an important prerequisite for research on user adaptation and customization of MT engines. In this paper we explore the use of conjoint analysis as a formal quantitative framework to assess users' relative preferences for different types of translation errors. We apply our approach to the analysis of MT output from translating public health documents from English into Spanish. Our results indicate that word order errors are clearly the most dispreferred error type, followed by word sense, morphological, and function word errors. The conjoint analysis-based model is able to predict user preferences more accurately than a baseline model that chooses the translation with the fewest errors overall. Additionally we analyze the effect of using a crowd-sourced respondent population versus a sample of domain experts and observe that main preference effects are remarkably stable across the two samples. PMID:24683295
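    In its simplest metric form, the part-worth estimation underlying conjoint analysis reduces to a linear model: each translation profile is described by its counts of error types, and ratings are regressed on those counts. The profiles, ratings, and "true" utilities below are invented to demonstrate the mechanics; the paper's choice-based design and error taxonomy are richer:

```python
import numpy as np

error_types = ["word order", "word sense", "morphology", "function word"]

# Each row: counts of the four error types in one translation profile (invented).
profiles = np.array([
    [0, 0, 0, 0],
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
    [2, 0, 1, 0],
], float)

# Synthetic noiseless ratings generated from known part-worths, so the
# regression below can be checked against ground truth.
true_partworths = np.array([-3.0, -2.0, -1.5, -1.0])   # dis-utility per error
baseline = 10.0
ratings = baseline + profiles @ true_partworths

# OLS recovers the part-worths: design matrix = [intercept | error counts].
X = np.hstack([np.ones((len(profiles), 1)), profiles])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
estimated = dict(zip(error_types, coef[1:]))
```

With real respondent ratings the fit is noisy and the estimated part-worths quantify relative dispreference; in this synthetic setup word order carries the largest dis-utility, mirroring the ordering the paper reports.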

  6. “Beautiful picture of an ugly place” : Exploring photo collections using opinion and sentiment analysis of user comments

    OpenAIRE

    Kisilevich, Slava; Rohrdantz, Christian; Keim, Daniel

    2010-01-01

    User generated content in the form of customer reviews, feedbacks and comments plays an important role in all types of Internet services and activities like news, shopping, forums and blogs. Therefore, the analysis of user opinions is potentially beneficial for the understanding of user attitudes or the improvement of various Internet services. In this paper, we propose a practical unsupervised approach to improve user experience when exploring photo collections by using opinions and senti...

  7. Comparative Analysis of the Main Business Intelligence Solutions

    OpenAIRE

    Rusaneanu, Alexandra

    2013-01-01

    Nowadays, Business Intelligence solutions are the main tools for analyzing and monitoring the company’s performance at any organizational level. This paper presents a comparative analysis of the most powerful Business Intelligence solutions using a set of technical features such as infrastructure of the platform, development facilities, complex analysis tools, interactive dashboards and scorecards, mobile integration and complex implementation of performance management methodologies.

  8. E-learning interventions are comparable to user's manual in a randomized trial of training strategies for the AGREE II

    Directory of Open Access Journals (Sweden)

    Durocher Lisa D

    2011-07-01

    Full Text Available Abstract Background Practice guidelines (PGs) are systematically developed statements intended to assist in patient and practitioner decisions. The AGREE II is the revised tool for PG development, reporting, and evaluation, comprised of 23 items, two global rating scores, and a new User's Manual. In this study, we sought to develop, execute, and evaluate the impact of two internet interventions designed to accelerate the capacity of stakeholders to use the AGREE II. Methods Participants were randomized to one of three training conditions. 'Tutorial'--participants proceeded through the online tutorial with a virtual coach and reviewed a PDF copy of the AGREE II. 'Tutorial + Practice Exercise'--in addition to the Tutorial, participants also appraised a 'practice' PG. For the practice PG appraisal, participants received feedback on how their scores compared to expert norms and formative feedback if scores fell outside the predefined range. 'AGREE II User's Manual PDF' (control condition)--participants reviewed a PDF copy of the AGREE II only. All participants evaluated a test PG using the AGREE II. Outcomes of interest were learners' performance, satisfaction, self-efficacy, mental effort, time-on-task, and perceptions of AGREE II. Results No differences emerged between training conditions on any of the outcome measures. Conclusions We believe these results can be explained by better than anticipated performance of the AGREE II PDF materials (control condition) or the participants' level of health methodology and PG experience rather than the failure of the online training interventions. Some data suggest the online tools may be useful for trainees new to this field; however, this requires further study.

  9. [Regulating the internet: a comparative analysis of Brazil, Chile, Spain, the US, and France].

    Science.gov (United States)

    Segurado, Rosemary; Lima, Carolina Silva Mandú de; Ameni, Cauê S

    2014-08-13

    Global governance is of key concern in the current debate over the workings of the world's computer network, and Brazil has played a notable role in this process, especially after approval of the Marco Civil da Internet (law 12.965, April 23, 2014), which defines Brazil's regulatory framework for the internet. Dubbed the internet bill of rights, this law sets out the principles, guarantees, rights, and duties of internet users and providers in Brazil. Based on the fundamental categories of net neutrality, internet users' right to privacy, and copyright discussions from the perspective of intellectual property, the article offers a comparative analysis of regulations in five countries: Brazil, Chile, Spain, the US, and France. PMID:25119248

  10. QoS Perceived by Users of Ubiquitous UMTS: Compositional Models and Thorough Analysis

    Directory of Open Access Journals (Sweden)

    Andrea Bondavalli

    2009-09-01

    Full Text Available This paper provides a QoS analysis of a dynamic, ubiquitous UMTS network scenario in the automotive context identified in the ongoing EC HIDENETS project. The scenario comprises different types of mobile users, applications, traffic conditions, and outage events reducing the available network resources. Adopting a compositional modeling approach based on the Stochastic Activity Networks (SAN) formalism, we analyze the Quality of Service (QoS) both from the users' perspective and from the mobile operator's. The classical QoS analysis is enhanced by taking into account the congestion caused both by the outage events and by the varying traffic conditions. The impact of users' mobility on the selected QoS indicators is further investigated by combining the SAN modelling approach with an ad-hoc mobility simulator, which also allows us to refine the model representing the UMTS network behavior.
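The paper's SAN models are far richer than any closed-form formula, but the basic resource-contention effect it studies can be illustrated with the classic Erlang-B recursion, a minimal sketch (not part of the HIDENETS models) of how an outage that removes channels inflates call-blocking probability; the traffic load and channel counts below are invented for illustration:

```python
def erlang_b(traffic_erlangs: float, channels: int) -> float:
    """Blocking probability via the numerically stable Erlang-B recursion."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic_erlangs * b / (n + traffic_erlangs * b)
    return b

# An outage event that halves the available channels sharply raises blocking:
normal = erlang_b(10.0, 20)   # ample capacity
outage = erlang_b(10.0, 10)   # capacity reduced by an outage event
```

Such a scalar gives only a stationary, single-cell view; the SAN approach in the paper is precisely what lets congestion, mobility, and outages interact dynamically.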

  11. Enabling Semantic Analysis of User Browsing Patterns in the Web of Data

    CERN Document Server

    Hoxha, Julia; Agarwal, Sudhir

    2012-01-01

    A useful step towards better interpretation and analysis of the usage patterns is to formalize the semantics of the resources that users are accessing in the Web. We focus on this problem and present an approach for the semantic formalization of usage logs, which lays the basis for effective techniques of querying expressive usage patterns. We also present a query answering approach, which is useful to find in the logs expressive patterns of usage behavior via formulation of semantic and temporal-based constraints. We have processed over 30 thousand user browsing sessions extracted from usage logs of DBPedia and Semantic Web Dog Food. All these events are formalized semantically using respective domain ontologies and RDF representations of the Web resources being accessed. We show the effectiveness of our approach through experimental results, providing in this way an exploratory analysis of the way users browse the Web of Data.
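As an illustrative sketch only (the authors work over RDF and domain ontologies; the event tuples and type labels here are hypothetical), a semantic-temporal usage pattern such as "a City resource followed shortly by a Country resource" can be queried over annotated sessions like this:

```python
# Each browsing event: (seconds_into_session, resource_uri, semantic_type).
# The semantic types would come from a domain ontology; these are made up.
def sessions_matching(sessions, first_type, then_type, max_gap):
    """Return sessions in which a resource of `first_type` is followed,
    within `max_gap` seconds, by a resource of `then_type`."""
    hits = []
    for s in sessions:
        events = sorted(s)  # order events by timestamp
        for i, (t1, _, ty1) in enumerate(events):
            if ty1 != first_type:
                continue
            if any(ty2 == then_type and 0 < t2 - t1 <= max_gap
                   for t2, _, ty2 in events[i + 1:]):
                hits.append(s)
                break
    return hits

logs = [
    [(0.0, "dbpedia:Berlin", "City"), (12.0, "dbpedia:Germany", "Country")],
    [(0.0, "dbpedia:Tim_Berners-Lee", "Person")],
]
matches = sessions_matching(logs, "City", "Country", max_gap=60.0)
```

In the paper's setting the same constraint would be posed as a semantic query over the RDF-formalized log rather than over Python tuples.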

  12. User-driven Page Layout Analysis of historical printed Books

    OpenAIRE

    Ramel, Jean-Yves; Demonet, Marie-Luce; Busson, Sébastien

    2007-01-01

    In this paper, based on the study of the specificity of historical printed books, we first explain the main error sources in classical methods used for page layout analysis. We show that each method (bottom-up and top-down) provides different types of useful information that should not be ignored, if we want to obtain both a generic method and good segmentation results. Next, we propose to use a hybrid segmentation algorithm that builds two maps: a shape map that focuses on connected componen...

  13. Exploratory analysis of user-generated photos and indicators that influence their appeal

    Directory of Open Access Journals (Sweden)

    Urban Sedlar

    2014-07-01

    Full Text Available In this paper we analyze whether simple indicators related to photo quality (brightness, sharpness, color palette) and established content detection techniques (face detection) can predict the success of photos in obtaining more “likes” from other users of photo-sharing social networks. This provides a unique look into the habits of users of such networks. The analysis was performed on 394,000 images downloaded from the social photo-sharing site Instagram, paired with a de-identified dataset of user liking activity, provided by a seller of a social-media mobile app. Two user groups were analyzed: all users in a two-month period (N = 122,260) and a highly selective group (N = 3,982) of users that only like <10% of what they view. No correlation was found with any of the indicators using the whole (non-selective) population, likely due to their bias towards earning virtual currency in exchange for liking. However, in the selective group, a small positive correlation was found between like ratio and image sharpness (r = 0.09, p < 0.0001) and a small negative correlation between like ratio and the number of faces (r = -0.10, p < 0.0001).
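The reported correlations are plain Pearson coefficients between an image indicator and the like ratio; a self-contained sketch of that computation (with invented toy data, not the Instagram dataset) looks like this:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: per-photo sharpness score vs. fraction of viewers who liked it.
sharpness  = [0.2, 0.4, 0.5, 0.7, 0.9]
like_ratio = [0.05, 0.06, 0.08, 0.07, 0.10]
r = pearson_r(sharpness, like_ratio)
```

With 3,982 selective users even r = 0.09 is statistically distinguishable from zero, which is why the paper can report such small effects with p < 0.0001.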

  14. EMERGY ANALYSIS AND ECONOMIC ANALYSIS A COMPARATIVE STUDY

    Science.gov (United States)

    Our mission at USEPA is to protect human health and safeguard the natural environment. We aim to base our environmental regulations and policies on sound scientific and, where appropriate, economic analyses. Although EPA has conducted analysis of the impact of regulations on ...

  15. CULTURE AND SOCIAL MEDIA USAGE: ANALYSIS OF JAPANESE TWITTER USERS

    Directory of Open Access Journals (Sweden)

    Adam Acar

    2013-06-01

    Full Text Available Twitter, one of the most popular microblogging tools, has been used extensively all around the world. However, to date, no study has addressed how culture influences the use of this communication platform. In order to close the literature gap and promote cross-cultural understanding, this paper content-analyzed 4,000 tweets from 200 college students in Japan and the USA. The results showed that Japanese college students post more self-related messages and ask fewer questions compared to American college students. It was also found that tweets that refer to TV are more common in Japan, whereas sports and news tweets stand out in the USA. The evidence from this study suggests that there is a subtle and complicated relationship between culture and Twitter use.

  16. A multivariate analysis of the factors that influence the modification of sexual desire in oral hormonal contraceptive (OC) users

    Directory of Open Access Journals (Sweden)

    Mariano Martin-Loeches

    2011-09-01

    Full Text Available Objective. This work studied the influence of age, level of education, family planning awareness, relationship with partner, the age at which sexual relationships were initiated, parity, the method of contraception previously used, the type of contraceptive pill used and the duration of oral hormonal contraception (OC) use on the modification of sexual desire in OC users. Materials and Methods. Prospective study of 760 OC users at the Family Planning Center “Marina Alta” in Alicante (Spain). A logistic regression analysis was carried out to study the relative risk of reduction in libido, taking other risk factors into account. Results. In the simple analysis, women who initiated sexual relationships between 18 and 25 years of age had a lower sexual desire in comparison with women who were sexually active before the age of 18 (OR = 2.11; CI: 1.15-3.91). Nulliparous women had a reduced sexual desire compared with those women who had given birth (OR = 2.32; CI: 1.41-3.82). An OC use of between 6 months and 1 year reduced sexual desire in comparison with a use of less than 6 months (OR = 0.24; CI: 0.09-0.64). In the multivariate analysis, age (OR = 1.12; CI: 1.01-1.21) and the use of OC within an initial 6-month to 1-year period (OR = 0.24; CI: 0.09-0.64) presented a statistically significant relationship with the modification of sexual desire. The level of education, family planning awareness, relationship with partner, the method of contraception previously used and the type of contraceptive pill prescribed showed no statistically significant relationship with the modification of sexual desire in OC users. Conclusions. Sexual desire in OC users decreases as a woman’s age increases and in an early stage of use, in the first six months after beginning OC treatment.
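The simple-analysis figures above are odds ratios with confidence intervals; a minimal sketch of how such an OR and a Woolf-type 95% CI are computed from a 2x2 table (the counts below are hypothetical, not the study's data):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Woolf-type 95% CI
    (a = exposed with outcome, b = exposed without, c/d = unexposed)."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: reduced desire yes/no, nulliparous vs. parous users.
or_, lo, hi = odds_ratio_ci(40, 160, 20, 180)
```

The multivariate step in the paper replaces this single-table calculation with logistic regression so that each OR is adjusted for the other factors.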

  17. A STANDARD PROCEDURE FOR COST ANALYSIS OF POLLUTION CONTROL OPERATIONS. VOLUME I. USER GUIDE

    Science.gov (United States)

    Volume I is a user guide for a standard procedure for the engineering cost analysis of pollution abatement operations and processes. The procedure applies to projects in various economic sectors: private, regulated, and public. The models are consistent with cost evaluation pract...

  18. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    Science.gov (United States)

    Interactive modules for data exploration and visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data sets with a user-friendly interface. Individual modules were designed to provide toolsets to enable interactive ...

  19. A novel R-package graphic user interface for the analysis of metabonomic profiles

    Directory of Open Access Journals (Sweden)

    Villa Palmira

    2009-10-01

    Full Text Available Abstract Background Analysis of the plethora of metabolites found in the NMR spectra of biological fluids or tissues requires data complexity to be simplified. We present a graphical user interface (GUI) for NMR-based metabonomic analysis. The "Metabonomic Package" has been developed for metabonomics research as open-source software and uses the R statistical libraries. Results The package offers the following options: raw 1-dimensional spectra processing (phase, baseline correction and normalization); importing processed spectra; including/excluding spectral ranges, optional binning and bucketing, detection and alignment of peaks; sorting of metabolites based on their ability to discriminate, metabolite selection, and outlier identification; multivariate unsupervised analysis: principal components analysis (PCA); multivariate supervised analysis: partial least squares (PLS), linear discriminant analysis (LDA), k-nearest neighbor classification; neural networks; visualization and overlapping of spectra; and plotting values of the chemical shift position for different samples. Furthermore, the "Metabonomic" GUI includes a console to enable other kinds of analyses and to take advantage of all R statistical tools. Conclusion We made complex multivariate analysis user-friendly for both experienced and novice users, which could help to expand the use of NMR-based metabonomics.
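Among the package's options, PCA is the archetypal unsupervised step. As a rough stand-in for what the GUI wraps (the package itself is R; this Python power-iteration sketch on toy data is only illustrative), the first principal component of mean-centred data can be computed as:

```python
def first_principal_component(rows, iters=200):
    """First PC of mean-centred data via power iteration on the
    sample covariance matrix."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(x[i][p] * x[i][q] for i in range(n)) / (n - 1)
            for q in range(d)] for p in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[p][q] * v[q] for q in range(d)) for p in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

# Toy "spectra" whose two intensity bins vary together along one direction:
data = [[1.0, 1.1], [2.0, 2.1], [3.0, 2.9], [4.0, 4.2], [5.0, 4.9]]
pc1 = first_principal_component(data)
```

Real metabonomic spectra have thousands of bins, which is exactly why the binning/bucketing and metabolite-selection steps precede PCA in the package's workflow.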

  20. A comparative analysis of the statistical properties of large mobile phone calling networks

    Science.gov (United States)

    Li, Ming-Xia; Jiang, Zhi-Qiang; Xie, Wen-Jie; Miccichè, Salvatore; Tumminello, Michele; Zhou, Wei-Xing; Mantegna, Rosario N.

    2014-05-01

    Mobile phone calling is one of the most widely used communication methods in modern society. The records of calls among mobile phone users provide us with a valuable proxy for understanding the human communication patterns embedded in social networks. Mobile phone users call each other, forming a directed calling network. If only reciprocal calls are considered, we obtain an undirected mutual calling network. The preferential communication behavior between two connected users can be statistically tested, and it results in two Bonferroni networks with statistically validated edges. We perform a comparative analysis of the statistical properties of these four networks, which are constructed from the calling records of more than nine million individuals in Shanghai over a period of 110 days. We find that these networks share many common structural properties and also exhibit idiosyncratic features when compared with previously studied large mobile calling networks. The empirical findings provide us with an intriguing picture of a representative large social network that might shed new light on the modelling of large social networks.
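The construction of the undirected mutual calling network from directed call records reduces to keeping only reciprocal pairs; a minimal sketch (toy user IDs, not the Shanghai dataset):

```python
def mutual_network(calls):
    """Collapse a directed calling edge list into the undirected network of
    reciprocal pairs: both parties called each other at least once."""
    directed = set(calls)
    mutual = set()
    for a, b in directed:
        if (b, a) in directed and a != b:
            mutual.add((min(a, b), max(a, b)))  # canonical undirected edge
    return mutual

calls = [("u1", "u2"), ("u2", "u1"), ("u1", "u3"), ("u4", "u2")]
edges = mutual_network(calls)
```

The Bonferroni networks go one step further: each remaining edge must also pass a multiple-comparison-corrected statistical test against a null model of random calling.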

  1. Driving an electric vehicle. A sociological analysis on pioneer users

    Energy Technology Data Exchange (ETDEWEB)

    Pierre, M. [EDF R and D, Electricite de France, Research and Development, 1 avenue du General de Gaulle, 92141 Clamart (France); Jemelin, C. [6T research bureau, 11 rue Duhesme, 75018 Paris (France); Louvet, N. [EPFL, Lausanne Federal Polytechnic School, 11 rue Duhesme, 75018 Paris (France)

    2011-11-15

    In most of the western countries, the car is the prevalent means of transport for local mobility. At the same time, sensitivity to environmental issues is increasing, correlated with the consciousness that carbon dioxide emissions have to be reduced. In regard to these two trends (individual mobility and public opinion favourable to a reduction of carbon emissions), energy-efficient vehicles will probably develop in the future; car manufacturers are actually preparing new offers for the mass market. Comparable cases, probably more modest but full of lessons, have occurred during the last decades: some local authorities promoted innovations based on electric vehicles in the 1990s, and some people chose this kind of car for their daily travels. This article deals with these pioneers (this article comes from a communication at the ECEEE Summer Study, June 2009, Panel 6: Energy efficiency in transport and mobility). Reporting studies carried out in 2006 and 2008, we intend to identify the reasons for this innovative modal choice, to show the difficulties that electric vehicle drivers then encountered, and to analyse the patterns of use that governed their mobility and their use of electric vehicles.

  2. Software user's manual for stability and transition analysis with the codes LSH and PSH

    Science.gov (United States)

    Herbert, T.; Stuckert, G. K.; Lin, N.

    1993-09-01

    This manual describes how to use the local linear stability code LSH and the nonlinear Parabolized Stability Equations (PSE) solver PSH for compressible flows. Both codes are adaptable to analysis of different flows over fairly general shapes of bodies. For analysis of a new problem, the user may specify the basic state, the coordinate system, dependent disturbance variables and their boundary conditions to be used for the stability analysis through physics insert files. Once these preliminary steps are completed, different tasks of stability analysis and PSE calculations can be carried out by simple modifications of a few input files. Both codes are highly modular and closely integrated. Output data from LSH can be used directly as initial data for PSH runs. Examples of several insert files and input files of the built-in problems are given to assist the user in setting up a new problem.

  3. Statistical Graphical User Interface Plug-In for Survival Analysis in R Statistical and Graphics Language and Environment

    OpenAIRE

    Daniel C. LEUCUŢA; Andrei ACHIMAŞ CADARIU

    2008-01-01

    Introduction: R is a statistical and graphics language and environment. Although it is extensively used from the command line, graphical user interfaces exist to ease new users' accommodation with it. Rcmdr is an R package providing a basic-statistics graphical user interface to R. A survival analysis interface is not provided by Rcmdr. The aim of this paper was to create a plug-in for Rcmdr to provide a survival analysis user interface for some basic R survival analysis functions. Materials and ...

  4. Comparative analysis of traditional and alternative energy sources

    International Nuclear Information System (INIS)

    The presented thesis, entitled Comparative analysis of traditional and alternative energy sources, draws on theoretical sources, research in the firm, internal data, and trends in company and market development to describe the problem and its application. The theoretical part is devoted to traditional and alternative energy resources: their reserves, trends in their use and development, and their balance in the world, the EU and Slovakia. The analytical part profiles the company and evaluates the thermal pump market using the General Electric method. Since the company implements, among other products, thermal pumps based on geothermal energy and ambient energy (air), the mission of the comparative analysis is to compare traditional energy resources with the thermal pump from the ecological, utility and economic points of view. The results of the comparative analysis are summarized in a SWOT analysis. The thesis also includes a questionnaire proposal for improving effectiveness and analysing customer satisfaction, and the expected possibilities of support (benefits) for alternative energy resources from the government and EU funds. (authors)

  5. Comparative analysis of traditional and alternative energy sources

    Directory of Open Access Journals (Sweden)

    Adriana Csikósová

    2008-11-01

    Full Text Available The presented thesis, entitled Comparative analysis of traditional and alternative energy sources, draws on theoretical sources, research in the firm, internal data, and trends in company and market development to describe the problem and its application. The theoretical part is devoted to traditional and alternative energy resources: their reserves, trends in their use and development, and their balance in the world, the EU and Slovakia. The analytical part profiles the company and evaluates the thermal pump market using the General Electric method. Since the company implements, among other products, thermal pumps based on geothermal energy and ambient energy (air), the mission of the comparative analysis is to compare traditional energy resources with the thermal pump from the ecological, utility and economic points of view. The results of the comparative analysis are summarized in a SWOT analysis. The thesis also includes a questionnaire proposal for improving effectiveness and analysing customer satisfaction, and the expected possibilities of support (benefits) for alternative energy resources from the government and EU funds.

  6. Delight2 Daylighting Analysis in Energy Plus: Integration and Preliminary User Results

    Energy Technology Data Exchange (ETDEWEB)

    Carroll, William L.; Hitchcock, Robert J.

    2005-04-26

    DElight is a simulation engine for daylight and electric lighting system analysis in buildings. DElight calculates interior illuminance levels from daylight, and the subsequent contribution required from electric lighting to meet a desired interior illuminance. DElight has been specifically designed to integrate with building thermal simulation tools. This paper updates the DElight capability set, the status of integration into the simulation tool EnergyPlus, and describes a sample analysis of a simple model from the user perspective.
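The core accounting such daylighting tools perform per zone, computing the electric-lighting contribution needed to top daylight up to a design illuminance, can be sketched as follows (this is an illustrative simplification, not DElight's actual algorithm; the setpoint and hourly daylight values are invented):

```python
def electric_contribution(daylight_lux: float, setpoint_lux: float) -> float:
    """Electric-lighting illuminance needed to top up daylight to a
    desired interior illuminance setpoint."""
    return max(0.0, setpoint_lux - daylight_lux)

# Hourly daylight levels for one zone against a 500 lux design setpoint:
daylight = [0.0, 120.0, 480.0, 650.0, 300.0]
electric = [electric_contribution(d, 500.0) for d in daylight]
```

It is this electric top-up, converted to lighting power, that DElight hands to the thermal simulation in EnergyPlus, which is why the integration matters for whole-building energy results.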

  7. A novel R-package graphic user interface for the analysis of metabonomic profiles

    OpenAIRE

    Villa Palmira; Kyriazis Angelos; Rodríguez Ignacio; Izquierdo-García Jose L; Barreiro Pilar; Desco Manuel; Ruiz-Cabello Jesús

    2009-01-01

    Abstract Background Analysis of the plethora of metabolites found in the NMR spectra of biological fluids or tissues requires data complexity to be simplified. We present a graphical user interface (GUI) for NMR-based metabonomic analysis. The "Metabonomic Package" has been developed for metabonomics research as open-source software and uses the R statistical libraries. Results The package offers the following options: Raw 1-dimensional spectra processing: phase, baseline correction and norma...

  8. Sentiment Analysis Based Approaches for Understanding User Context in Web Content?

    Directory of Open Access Journals (Sweden)

    M. SAKTHIVEL

    2013-07-01

    Full Text Available In our day to day lives, we highly value the opinions of friends in making decisions about issues like which brand to buy or which movie to watch. With the increasing popularity of blogs, online reviews and social networking sites, the current trend is to look up reviews, expert opinions and discussions on the Web, so that one can make an informed decision. Sentiment analysis, also known as opinion mining, is the computational study of opinions, sentiments and emotions expressed in natural language for the purpose of decision making. Sentiment analysis applies natural language processing techniques and computational linguistics to extract information about sentiments expressed by authors and readers about a particular subject, thus helping users in making sense of huge volumes of unstructured Web data. Applications like review classification, product review mining and trend prediction benefit from sentiment analysis based techniques. This paper presents a study of different approaches in this field, the state of the art techniques and current research in sentiment analysis based approaches for understanding the user's context. We show that information about social relationships can be used to improve user-level sentiment analysis. The main motivation behind our approach is that users that are somehow "connected" may be more likely to hold similar opinions; therefore, relationship information can complement what we can extract about a user's viewpoints from their utterances. Employing Twitter as a source for our experimental data, and working within a semi-supervised framework, we propose models that are induced either from the Twitter follower/followee network or from the network in Twitter formed by users referring to each other using "@" mentions. Our transductive learning results reveal that incorporating social-network information can indeed lead to statistically significant sentiment classification improvements over the performance of an approach based on Support Vector Machines having access only to textual features.
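A toy version of the idea that "connected" users hold similar opinions (this is an illustrative label-propagation sketch, not the authors' transductive SVM models) anchors text-classified seed users and lets sentiment scores diffuse over the mention graph:

```python
def propagate_sentiment(edges, seed_scores, rounds=10, alpha=0.5):
    """Semi-supervised sketch: each user's score is pulled toward the mean
    score of their neighbours, while labelled seeds stay anchored to their
    text-based scores (positive > 0, negative < 0)."""
    nbrs = {}
    for a, b in edges:
        nbrs.setdefault(a, set()).add(b)
        nbrs.setdefault(b, set()).add(a)
    scores = {u: 0.0 for u in nbrs}
    scores.update(seed_scores)
    for _ in range(rounds):
        new = {}
        for u in scores:
            around = [scores[v] for v in nbrs.get(u, ()) if v in scores]
            mean = sum(around) / len(around) if around else 0.0
            anchor = seed_scores.get(u, mean)  # unlabeled users just average
            new[u] = alpha * anchor + (1 - alpha) * mean
        scores = new
    return scores

edges = [("a", "b"), ("b", "c"), ("c", "d")]
seeds = {"a": 1.0, "d": -1.0}   # text classifier is confident only here
scores = propagate_sentiment(edges, seeds)
```

Users nearer the positive seed end up with higher scores than those nearer the negative seed, which is the intuition the paper exploits with far stronger models.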

  9. Network Meta-analysis: Users' Guide for Surgeons: Part II - Certainty.

    Science.gov (United States)

    Chaudhry, Harman; Foote, Clary J; Guyatt, Gordon; Thabane, Lehana; Furukawa, Toshi A; Petrisor, Brad; Bhandari, Mohit

    2015-07-01

    In the previous article (Network Meta-analysis: Users' Guide for Surgeons, Part I: Credibility), we presented an approach to evaluating the credibility or methodologic rigor of network meta-analyses (NMA), an innovative approach to simultaneously addressing the relative effectiveness of three or more treatment options for a given medical condition or disease state. In the second part of the Users' Guide for Surgeons, we discuss and demonstrate the application of criteria for determining the certainty in effect sizes and directions associated with a given treatment option through an example pertinent to clinical orthopaedics. PMID:25869062

  10. Advanced Techniques in Web Intelligence-2 Web User Browsing Behaviour and Preference Analysis

    CERN Document Server

    Palade, Vasile; Jain, Lakhmi

    2013-01-01

    This research volume focuses on analyzing web user browsing behaviour and preferences in traditional web-based environments, social networks and web 2.0 applications, by using advanced techniques in data acquisition, data processing, pattern extraction and cognitive science for modeling human actions. The book is directed to graduate students, researchers/scientists and engineers interested in updating their knowledge with the recent trends in web user analysis, for developing the next generation of web-based systems and applications.

  11. Three looks at users: a comparison of methods for studying digital library use. Keywords: User studies, Digital libraries, Digital music libraries, Music, Information use, Information science, Contextual inquiry, Contextual design, User research, Questionnaires, Log file analysis

    OpenAIRE

    Mark Notess

    2004-01-01

    Compares three user research methods of studying real-world digital library usage within the context of the Variations and Variations2 digital music libraries at Indiana University. After a brief description of both digital libraries, each method is described and illustrated with findings from the studies. User satisfaction questionnaires were used in two studies, one of Variations (n=30) and the other of Variations2 (n=12). Second, session activity log files were examined for 175 Variations2...

  12. AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual

    International Nuclear Information System (INIS)

    AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features 100 external node--200 branch capability; conversational, free-format input language; built-in junction, FET, MOS, and switch models; sparse matrix algorithm with extended-precision H matrix and T vector calculations, for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation effects analysis; special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETs, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table
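At its core, DC analysis of a linear circuit means solving the nodal equations G v = I for the node voltages; a minimal sketch of that step (a hypothetical two-resistor divider, not AITRAC's actual sparse-matrix algorithms or device models):

```python
def solve_linear(a, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# Nodal analysis: 1 A source into node 1, R1 = 1 ohm between nodes 1 and 2,
# R2 = 2 ohm from node 2 to ground. Conductance matrix G and currents I:
g = [[1.0, -1.0],
     [-1.0, 1.0 + 0.5]]
i = [1.0, 0.0]
v = solve_linear(g, i)   # node voltages [v1, v2]
```

Production simulators use sparse LU factorization instead of dense elimination, and iterate this linear solve inside Newton's method when nonlinear device models are present.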

  13. AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual

    Energy Technology Data Exchange (ETDEWEB)

    1977-10-01

    AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features 100 external node--200 branch capability; conversational, free-format input language; built-in junction, FET, MOS, and switch models; sparse matrix algorithm with extended-precision H matrix and T vector calculations, for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation effects analysis; special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETs, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table. (RWR)

  14. Analysis of chromosomal aberrations, micronuclei and hematological disorders among workers of wireless communication instruments and cell phone (Mobile) users

    International Nuclear Information System (INIS)

    This study was carried out to investigate the hazardous effects of electromagnetic radiation (EMR), such as chromosomal aberrations, disturbed micronucleus formation and hematological disorders, that may be detected among workers of wireless communication instruments and mobile phone users. Seven individuals (3 males and 4 females) of the central workers in the microwave unit of the wireless station and 7 users of mobile phones (4 males and 3 females) volunteered to give blood samples. Chromosomes and micronuclei were prepared for cytogenetic analysis, as well as blood films for differential counts. The results obtained in the microwave group indicated that the total summation of all types of aberrations (chromosome and chromatid aberrations) had a frequency of 6.14% for the exposed group, whereas the frequency in the control group amounted to 1.57%. In mobile phone users, the total summation of all types of aberrations (chromosome and chromatid aberrations) had a frequency of 4.43% for the exposed group and 1.71% for the control group. The incidence of the total number of micronuclei in the exposed microwave group was increased 4.3-fold as compared with that of the control group. The incidence of the total number of micronuclei in the exposed mobile phone group was increased 2-fold as compared with that in the control group. On the other hand, normal ranges of total white blood cell counts were determined for mobile phone users, but abnormalities in the differential counts of the different types of white blood cells, such as neutropenia, eosinophilia and lymphocytosis, were observed in individuals 1, 2, 3 and 7 of the microwave group.

  15. Models for comparative analysis of culture: the case of Poland

    OpenAIRE

    Todeva, E.

    1999-01-01

    This paper examines the main theoretical frameworks for the analysis of comparative cultural attitudes. A critical discussion of the work by Kluckhohn and Strodtbeck, Hofstede and Trompenaars leads to a new theoretical approach for studying national cultural attitudes and norms of behaviour. A methodology based on research is designed to compare the 'internalized' norms of behaviour with 'perceived' norms. Two different but complementary techniques are applied to a sample of Polish students to investig...

  16. A Comparative Performance Analysis of Approximate String Matching

    OpenAIRE

    Shivani Jain; Dr. A.L.N. Rao

    2013-01-01

    This paper presents a comparative study to evaluate experimental results for approximate string matching algorithms on the basis of edit distance. We compare the algorithms in terms of the number of character comparisons and the running time for molecular data, binary alphabets, English alphabets, etc. Applications like word processors, web search engines, molecular and DNA sequence analysis, and natural language processing have led to the development of many algorithms in the field of patter...
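The edit distance these algorithms approximate-match against is the Levenshtein distance; a standard dynamic-programming sketch, using only the row above to keep memory linear (the DNA strings are invented examples):

```python
def edit_distance(s, t):
    """Levenshtein distance: minimum number of insertions, deletions and
    substitutions (unit cost) turning s into t."""
    prev = list(range(len(t) + 1))  # distances from "" to prefixes of t
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1,                 # delete from s
                           cur[j - 1] + 1,              # insert into s
                           prev[j - 1] + (cs != ct)))   # substitute
        prev = cur
    return prev[-1]

d = edit_distance("ACGTTGCA", "ACTTGGCA")   # e.g. two DNA sequences
```

Approximate-matching algorithms trade exactness of this O(|s||t|) table for fewer character comparisons, which is precisely the dimension the paper measures.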

  17. Aggregate Characterization of User Behavior in Twitter and Analysis of the Retweet Graph

    CERN Document Server

    Bild, David R; Dick, Robert P; Mao, Z Morley; Wallach, Dan S

    2014-01-01

    Most previous analysis of Twitter user behavior is focused on individual information cascades and the social followers graph. We instead study aggregate user behavior and the retweet graph with a focus on quantitative descriptions. We find that the lifetime tweet distribution is a type-II discrete Weibull stemming from a power law hazard function; the tweet rate distribution, although asymptotically power law, exhibits a lognormal cutoff over finite sample intervals; and the inter-tweet interval distribution is power law with exponential cutoff. The retweet graph is small-world and scale-free, like the social graph, but is less disassortative and has much stronger clustering. These differences are consistent with it better capturing the real-world social relationships of and trust between users. Beyond just understanding and modeling human communication patterns and social networks, applications for alternative, decentralized microblogging systems, both predicting real-world performance and detecting spam, are d...
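One statistic behind the "much stronger clustering" finding is the average local clustering coefficient; a minimal sketch of computing it on toy graphs (not the Twitter data):

```python
def avg_clustering(adj):
    """Average local clustering coefficient of an undirected graph given as
    {node: set(neighbours)}: for each node, the fraction of neighbour pairs
    that are themselves connected."""
    total = 0.0
    for u, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue  # clustering undefined; counts as 0
        links = sum(1 for v in nbrs for w in nbrs
                    if v < w and w in adj[v])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

triangle = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b"}}
star = {"h": {"x", "y", "z"}, "x": {"h"}, "y": {"h"}, "z": {"h"}}
```

A retweet graph with strong clustering has many such closed triangles, consistent with retweets tracking genuine social trust rather than broadcast reach.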

  18. Chromosomal damage and apoptosis analysis in exfoliated oral epithelial cells from mouthwash and alcohol users

    Scientific Electronic Library Online (English)

    Rodrigo dos Santos, Rocha; José Roberto Cardoso, Meireles; Eneida de Moraes Marcílio, Cerqueira.

    2014-12-01

    Full Text Available Chromosomal damage and apoptosis were analyzed in users of mouthwash and/or alcoholic beverages, using the micronucleus test on exfoliated oral mucosa cells. Samples from four groups of 20 individuals each were analyzed: three exposed groups (EG1, EG2 and EG3) and a control group (CG). EG1 comprised [...] mouthwash users; EG2 comprised drinkers, and EG3 users of both mouthwashes and alcoholic beverages. Cell material was collected by gently scraping the insides of the cheeks. The cells were then fixed in a methanol/acetic acid (3:1) solution and stained and counterstained, respectively, with Schiff's reagent and fast green. Endpoints were computed on 2,000 cells in a blind test. Statistical analysis showed that chromosomal damage and apoptosis were significantly higher in individuals of groups EG1 and EG3 than in controls (p

  19. Chromosomal damage and apoptosis analysis in exfoliated oral epithelial cells from mouthwash and alcohol users.

    Science.gov (United States)

    Rocha, Rodrigo Dos Santos; Meireles, José Roberto Cardoso; de Moraes Marcílio Cerqueira, Eneida

    2014-10-01

    Chromosomal damage and apoptosis were analyzed in users of mouthwash and/or alcoholic beverages, using the micronucleus test on exfoliated oral mucosa cells. Samples from four groups of 20 individuals each were analyzed: three exposed groups (EG1, EG2 and EG3) and a control group (CG). EG1 comprised mouthwash users; EG2 comprised drinkers, and EG3 users of both mouthwashes and alcoholic beverages. Cell material was collected by gently scraping the insides of the cheeks. The cells were then fixed in a methanol/acetic acid (3:1) solution and stained and counterstained, respectively, with Schiff's reagent and fast green. Endpoints were computed on 2,000 cells in a blind test. Statistical analysis showed that chromosomal damage and apoptosis were significantly higher in individuals of groups EG1 and EG3 than in controls. These data need to be confirmed in larger samples. PMID:25505845

  20. A new comparative approach to macroeconomic modeling and policy analysis

    OpenAIRE

    Wieland, Volker; Cwik, Tobias J.; Müller, Gernot J.; Schmidt, Sebastian; Wolters, Maik H.

    2012-01-01

    In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis has come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some v...

  1. Comparative analysis of traditional and alternative energy sources

    OpenAIRE

    Adriana Csikósová; Lívia Bodonská

    2008-01-01

    The presented thesis, entitled 'Comparative analysis of traditional and alternative energy sources', includes, on the basis of theoretical information sources, research in a firm, internal data, trends in company and market development, a description of the problem and its application. The theoretical part is dedicated to traditional and alternative energy resources, their reserves, trends in their use and development, and their balance in the world, the EU and Slovakia. Analysis ...

  2. Comparability of Mixed IC50 Data – A Statistical Analysis

    OpenAIRE

    Kalliokoski, Tuomo; Kramer, Christian; Vulpetti, Anna; Gedeck, Peter

    2013-01-01

    The biochemical half maximal inhibitory concentration (IC50) is the most commonly used metric for on-target activity in lead optimization. It is used to guide lead optimization and to build large-scale chemogenomics analyses and off-target activity and toxicity models based on public data. However, the use of public biochemical IC50 data is problematic, because the values are assay specific and comparable only under certain conditions. For large-scale analysis it is not feasible to check each data entry man...

  3. Comparative bioinformatics analysis of the mammalian and bacterial glycomes

    OpenAIRE

    Adibekian, Alexander; Stallforth, Pierre; Hecht, Marie-lyn; Werz, Daniel B.; Gagneux, Pascal; Seeberger, Peter H.

    2011-01-01

    A comparative analysis of bacterial and mammalian glycomes based on the statistical analysis of two major carbohydrate databases, Bacterial Carbohydrate Structure Data Base (BCSDB) and GLYCOSCIENCES.de (GS), is presented. An in-depth comparison of these two glycomes reveals both striking differences and unexpected similarities. Within the prokaryotic kingdom, we focus on the glycomes of seven classes of pathogenic bacteria with respect to (i) their most abundant monosaccharide ...

  4. A Comparative Financial Performance Analysis of Bangladeshi Private Commercial Banks

    OpenAIRE

    Mohammad Omar Faruk; Rokshana Alam

    2014-01-01

    Ratio analysis is considered a simple and preferable way to understand the financial performance of a financial organization. We have chosen the top two operating-profit-generating banks from each of the three generations of banking in Bangladesh to compare their financial performance from 2005 to 2008 with Prime Bank Limited (PBL). For the analysis we used profitability ratios, liquidity ratios, efficiency ratios, risk-measure ratios and DuPont analysis. The measurement explores tha...
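
    The DuPont analysis the authors apply decomposes return on equity into profit margin, asset turnover and financial leverage; a sketch with illustrative figures (not the banks' actual numbers):

```python
def dupont_roe(net_income, revenue, total_assets, total_equity):
    """DuPont decomposition: ROE = profit margin * asset turnover * equity multiplier."""
    margin = net_income / revenue          # profitability
    turnover = revenue / total_assets      # asset efficiency
    leverage = total_assets / total_equity # financial leverage
    return margin * turnover * leverage

# Illustrative figures only (in millions, hypothetical bank):
roe = dupont_roe(net_income=120, revenue=1_000, total_assets=8_000, total_equity=640)
print(round(roe, 4))  # 0.1875, i.e. 18.75% ROE, identical to 120/640
```

    The decomposition is useful precisely because two banks with the same ROE can differ sharply in which of the three factors drives it.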

  5. Formal Model for Data Dependency Analysis between Controls and Actions of a Graphical User Interface

    Directory of Open Access Journals (Sweden)

    SKVORC, D.

    2012-02-01

    Full Text Available End-user development is an emerging computer science discipline that provides programming paradigms, techniques, and tools suitable for users not trained in software engineering. One of the techniques that allow ordinary computer users to develop their own applications without the need to learn a classic programming language is GUI-level programming based on programming-by-demonstration. To build wizard-based tools that assist users in application development and to verify the correctness of user programs, a computer-supported method for GUI-level data dependency analysis is necessary. Therefore, a formal model for GUI representation is needed. In this paper, we present a finite state machine for modeling the data dependencies between GUI controls and GUI actions. Furthermore, we present an algorithm for automatic construction of the finite state machine for an arbitrary GUI application. We show that the proposed state aggregation scheme successfully manages state explosion in the state machine construction algorithm, which makes the model applicable to applications with complex GUIs.
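
    A much-simplified stand-in for this idea (not the paper's actual finite-state-machine formalism) records which controls each GUI action reads and writes, from which a data dependency between actions can be derived:

```python
# Hypothetical sketch: names and structure are illustrative, not the paper's model.
class GuiDependencyModel:
    def __init__(self):
        self.actions = {}  # action name -> (controls read, controls written)

    def add_action(self, name, reads=(), writes=()):
        self.actions[name] = (set(reads), set(writes))

    def depends_on(self, later, earlier):
        """True if action 'later' reads a control that action 'earlier' writes."""
        reads, _ = self.actions[later]
        _, writes = self.actions[earlier]
        return bool(reads & writes)

gui = GuiDependencyModel()
gui.add_action("enter_name", writes=["name_field"])
gui.add_action("submit", reads=["name_field"], writes=["status_label"])
print(gui.depends_on("submit", "enter_name"))  # True: submit consumes name_field
```

    The paper's contribution is doing this over aggregated machine states rather than a flat table, which is what keeps the construction tractable for complex GUIs.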

  6. Optimised access to user analysis data using the gLite DPM

    International Nuclear Information System (INIS)

    The ScotGrid distributed Tier-2 now provides more than 4 MSI2K and 500 TB for LHC computing, spread across three sites at Durham, Edinburgh and Glasgow. Tier-2 sites have a dual role to play in the computing models of the LHC VOs. Firstly, their CPU resources are used for the generation of Monte Carlo event data. Secondly, the end-user analysis data is distributed across the grid to the site's storage system and held on disk ready for processing by physicists' analysis jobs. In this paper we show how we have designed the ScotGrid storage and data management resources in order to optimise access by physicists to LHC data. Within ScotGrid, all sites use the gLite DPM storage manager middleware. Using the EGEE grid to submit real ATLAS analysis code to process VO data stored on the ScotGrid sites, we present an analysis of the performance of the architecture at one site, and procedures that may be undertaken to improve it. The results will be presented from the point of view of the end user (in terms of number of events processed per second) and from the point of view of the site, which wishes to minimise load and the impact that analysis activity has on other users of the system.

  7. Comparative DNA sequence analysis of wheat and rice genomes.

    Science.gov (United States)

    Sorrells, Mark E; La Rota, Mauricio; Bermudez-Kandianis, Catherine E; Greene, Robert A; Kantety, Ramesh; Munkvold, Jesse D; Miftahudin; Mahmoud, Ahmed; Ma, Xuefeng; Gustafson, Perry J; Qi, Lili L; Echalier, Benjamin; Gill, Bikram S; Matthews, David E; Lazo, Gerard R; Chao, Shiaoman; Anderson, Olin D; Edwards, Hugh; Linkiewicz, Anna M; Dubcovsky, Jorge; Akhunov, Eduard D; Dvorak, Jan; Zhang, Deshui; Nguyen, Henry T; Peng, Junhua; Lapitan, Nora L V; Gonzalez-Hernandez, Jose L; Anderson, James A; Hossain, Khwaja; Kalavacharla, Venu; Kianian, Shahryar F; Choi, Dong-Woog; Close, Timothy J; Dilbirligi, Muharrem; Gill, Kulvinder S; Steber, Camille; Walker-Simmons, Mary K; McGuire, Patrick E; Qualset, Calvin O

    2003-08-01

    The use of DNA sequence-based comparative genomics for evolutionary studies and for transferring information from model species to crop species has revolutionized molecular genetics and crop improvement strategies. This study compared 4485 expressed sequence tags (ESTs) that were physically mapped in wheat chromosome bins, to the public rice genome sequence data from 2251 ordered BAC/PAC clones using BLAST. A rice genome view of homologous wheat genome locations based on comparative sequence analysis revealed numerous chromosomal rearrangements that will significantly complicate the use of rice as a model for cross-species transfer of information in nonconserved regions. PMID:12902377

  8. Initial implementation of a comparative data analysis ontology.

    Science.gov (United States)

    Prosdocimi, Francisco; Chisham, Brandon; Pontelli, Enrico; Thompson, Julie D; Stoltzfus, Arlin

    2009-01-01

    Comparative analysis is used throughout biology. When entities under comparison (e.g. proteins, genomes, species) are related by descent, evolutionary theory provides a framework that, in principle, allows N-ary comparisons of entities, while controlling for non-independence due to relatedness. Powerful software tools exist for specialized applications of this approach, yet it remains under-utilized in the absence of a unifying informatics infrastructure. A key step in developing such an infrastructure is the definition of a formal ontology. The analysis of use cases and existing formalisms suggests that a significant component of evolutionary analysis involves a core problem of inferring a character history, relying on key concepts: "Operational Taxonomic Units" (OTUs), representing the entities to be compared; "character-state data" representing the observations compared among OTUs; "phylogenetic tree", representing the historical path of evolution among the entities; and "transitions", the inferred evolutionary changes in states of characters that account for observations. Using the Web Ontology Language (OWL), we have defined these and other fundamental concepts in a Comparative Data Analysis Ontology (CDAO). CDAO has been evaluated for its ability to represent token data sets and to support simple forms of reasoning. With further development, CDAO will provide a basis for tools (for semantic transformation, data retrieval, validation, integration, etc.) that make it easier for software developers and biomedical researchers to apply evolutionary methods of inference to diverse types of data, so as to integrate this powerful framework for reasoning into their research. PMID:19812726
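
    The core problem CDAO formalizes, inferring a character history over a phylogenetic tree, can be illustrated with Fitch parsimony; this is an example of the general idea, not part of CDAO itself:

```python
# Fitch parsimony: minimum number of state transitions for one character
# on a rooted binary tree whose leaves are OTUs with observed states.
def fitch(tree, states):
    """tree: nested 2-tuples with OTU-name strings at the leaves;
    states: OTU name -> observed character state.
    Returns (candidate state set at the root, minimum transition count)."""
    if isinstance(tree, str):                     # leaf = OTU
        return {states[tree]}, 0
    (left_set, lc), (right_set, rc) = (fitch(sub, states) for sub in tree)
    common = left_set & right_set
    if common:
        return common, lc + rc                    # intersection: no new change
    return left_set | right_set, lc + rc + 1      # union: one inferred transition

observed = {"A": "purine", "B": "purine", "C": "pyrimidine", "D": "pyrimidine"}
root_set, changes = fitch((("A", "B"), ("C", "D")), observed)
print(changes)  # 1: a single transition explains the observations on this tree
```

    Each concept in the sketch maps onto a CDAO term: the leaf names are OTUs, `observed` is the character-state data, the nested tuple is the tree, and the counted changes are the transitions.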

  9. Initial Implementation of a Comparative Data Analysis Ontology

    Directory of Open Access Journals (Sweden)

    Francisco Prosdocimi

    2009-07-01

    Full Text Available Comparative analysis is used throughout biology. When entities under comparison (e.g. proteins, genomes, species) are related by descent, evolutionary theory provides a framework that, in principle, allows N-ary comparisons of entities, while controlling for non-independence due to relatedness. Powerful software tools exist for specialized applications of this approach, yet it remains under-utilized in the absence of a unifying informatics infrastructure. A key step in developing such an infrastructure is the definition of a formal ontology. The analysis of use cases and existing formalisms suggests that a significant component of evolutionary analysis involves a core problem of inferring a character history, relying on key concepts: “Operational Taxonomic Units” (OTUs), representing the entities to be compared; “character-state data”, representing the observations compared among OTUs; “phylogenetic tree”, representing the historical path of evolution among the entities; and “transitions”, the inferred evolutionary changes in states of characters that account for observations. Using the Web Ontology Language (OWL), we have defined these and other fundamental concepts in a Comparative Data Analysis Ontology (CDAO). CDAO has been evaluated for its ability to represent token data sets and to support simple forms of reasoning. With further development, CDAO will provide a basis for tools (for semantic transformation, data retrieval, validation, integration, etc.) that make it easier for software developers and biomedical researchers to apply evolutionary methods of inference to diverse types of data, so as to integrate this powerful framework for reasoning into their research.

  10. A Comparative Analysis of Three Unique Theories of Organizational Learning

    Science.gov (United States)

    Leavitt, Carol C.

    2011-01-01

    The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories, experiential learning theory and adaptive-generative learning theory, represent the thinking of the cognitive perspective, while…

  11. Initial Implementation of a Comparative Data Analysis Ontology

    Directory of Open Access Journals (Sweden)

    Francisco Prosdocimi

    2009-01-01

    Full Text Available Comparative analysis is used throughout biology. When entities under comparison (e.g. proteins, genomes, species) are related by descent, evolutionary theory provides a framework that, in principle, allows N-ary comparisons of entities, while controlling for non-independence due to relatedness. Powerful software tools exist for specialized applications of this approach, yet it remains under-utilized in the absence of a unifying informatics infrastructure. A key step in developing such an infrastructure is the definition of a formal ontology. The analysis of use cases and existing formalisms suggests that a significant component of evolutionary analysis involves a core problem of inferring a character history, relying on key concepts: “Operational Taxonomic Units” (OTUs), representing the entities to be compared; “character-state data”, representing the observations compared among OTUs; “phylogenetic tree”, representing the historical path of evolution among the entities; and “transitions”, the inferred evolutionary changes in states of characters that account for observations. Using the Web Ontology Language (OWL), we have defined these and other fundamental concepts in a Comparative Data Analysis Ontology (CDAO). CDAO has been evaluated for its ability to represent token data sets and to support simple forms of reasoning. With further development, CDAO will provide a basis for tools (for semantic transformation, data retrieval, validation, integration, etc.) that make it easier for software developers and biomedical researchers to apply evolutionary methods of inference to diverse types of data, so as to integrate this powerful framework for reasoning into their research.

  12. Comparative and Familial Analysis of Handedness in Great Apes

    Science.gov (United States)

    Hopkins, William D.

    2006-01-01

    Historically, population-level handedness has been considered a hallmark of human evolution. Whether nonhuman primates exhibit population-level handedness remains a topic of considerable debate. This paper summarizes published data on handedness in great apes. Comparative analysis indicated that chimpanzees and bonobos show population-level right…

  13. Comparative Analysis and Evaluation of Existing Risk Management Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The focus of this article lies on the specific features of existing software packages for risk management, differentiating three categories. As representative of these categories we consider the Crystal Ball, Haufe Risikomanager and MIS - Risk Management solutions, outlining their strengths and weaknesses in a comparative analysis.

  14. Integrated Reliability and Risk Analysis System (IRRAS) Version 2.0 user's guide

    International Nuclear Information System (INIS)

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Also provided in the system is an integrated full-screen editor for use when interfacing with remote mainframe computer systems. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.0 and is the subject of this user's guide. Version 2.0 of IRRAS provides all of the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 9 refs., 292 figs., 4 tabs
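
    Cut set generation, one of the functions IRRAS provides, can be sketched for a toy AND/OR fault tree; this is an illustrative expansion-and-minimization pass, not IRRAS's actual algorithm:

```python
# A fault tree node is ("basic", name), ("and", [children]) or ("or", [children]).
def cut_sets(node):
    """Return the minimal cut sets (sets of basic events) that fail this node."""
    kind = node[0]
    if kind == "basic":
        return [frozenset([node[1]])]
    children = [cut_sets(child) for child in node[1]]
    if kind == "or":                        # any child's cut set fails an OR gate
        sets = [s for child in children for s in child]
    else:                                   # "and": combine one cut set per child
        sets = [frozenset()]
        for child in children:
            sets = [s | t for s in sets for t in child]
    # keep only minimal cut sets (no proper subset also present)
    return [s for s in sets if not any(t < s for t in sets)]

top = ("and", [("or", [("basic", "pump_fails"), ("basic", "valve_stuck")]),
               ("basic", "power_loss")])
for cs in cut_sets(top):
    print(sorted(cs))
```

    Quantification then assigns each basic event a probability and sums over the minimal cut sets (with suitable corrections for overlap), which is the step that dominates runtime in real PRA models.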

  15. Comparative analysis of equalization methods for SC-FDMA

    DEFF Research Database (Denmark)

    Dogadaev, Anton Konstantinovich; Kozlov, Alexander

    2010-01-01

    In this paper we present a comparative analysis of different types of equalization schemes based on minimum mean square error (MMSE) optimization. The following types of equalizers were compared: linear equalization, decision feedback equalization (DFE) and turbo equalization. The performance and complexity of these schemes were tested for a Single Carrier Frequency Division Multiple Access (SC-FDMA) system with a Single Input Single Output (SISO) antenna configuration. SC-FDMA is the technique used in the UTRA LTE uplink, so the results of the complexity and performance analysis can be applied to select an appropriate equalization algorithm for the uplink channel of LTE, a leading standard in 4G telecommunications. Simulation results at the end of this paper show the bit error ratio (BER) and modulation error ratio (MER) for the compared schemes.
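
    Of the compared schemes, linear MMSE equalization is the simplest; a per-subcarrier frequency-domain sketch with hypothetical channel coefficients (not the paper's simulation setup):

```python
def mmse_fde(received, channel, snr_linear):
    """Per-bin MMSE frequency-domain equalizer: W_k = conj(H_k) / (|H_k|^2 + 1/SNR)."""
    return [r * h.conjugate() / (abs(h) ** 2 + 1.0 / snr_linear)
            for r, h in zip(received, channel)]

H = [1.0 + 0.0j, 0.5 - 0.5j, 0.8 + 0.2j]   # hypothetical channel frequency response
X = [1.0 + 0.0j, -1.0 + 0.0j, 1.0 + 0.0j]  # transmitted BPSK symbols
Y = [h * x for h, x in zip(H, X)]          # noiseless received signal, for illustration
X_hat = mmse_fde(Y, H, snr_linear=100.0)   # estimates close to X, slightly biased to 0
```

    The 1/SNR regularization term is what distinguishes MMSE from zero-forcing: it avoids amplifying noise on deeply faded subcarriers at the cost of a small bias, and DFE and turbo equalization build on this baseline at higher complexity.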

  16. Microcomputer spacecraft thermal analysis routines (MSTAR) Phase I: The user interface

    Science.gov (United States)

    Teti, Nicholas M.

    1993-01-01

    The Microcomputer Spacecraft Thermal Analysis Routines (MSTAR) software package is being developed for NASA/Goddard Space Flight Center by Swales and Associates, Inc. (S&AI). In December 1992, S&AI was awarded a Phase I Small Business Innovative Research contract from NASA to develop a microcomputer-based thermal analysis program to replace the current SSPTA and TRASYS programs. Phase I consists of a six-month effort focused on developing geometric model generation and visualization capabilities using a graphical user interface (GUI). The information contained in this paper encompasses the work performed during the Phase I development cycle, with emphasis on the development of the graphical user interface. This includes both the theory behind and specific examples of how the MSTAR GUI was implemented. Furthermore, this report discusses new applications and enhancements which will improve the capabilities and commercialization of the MSTAR program.

  17. The role of the salience network in processing lexical and nonlexical stimuli in cochlear implant users: an ALE meta-analysis of PET studies.

    Science.gov (United States)

    Song, Jae-Jin; Vanneste, Sven; Lazard, Diane S; Van de Heyning, Paul; Park, Joo Hyun; Oh, Seung Ha; De Ridder, Dirk

    2015-05-01

    Previous positron emission tomography (PET) studies have shown that various cortical areas are activated to process speech signals in cochlear implant (CI) users. Nonetheless, differences in task dimension among studies and low statistical power preclude an understanding of the sound-processing mechanism in CI users. Hence, we performed an activation likelihood estimation meta-analysis of PET studies in CI users and normal hearing (NH) controls to compare the two groups. Eight studies (58 CI subjects/92 peak coordinates; 45 NH subjects/40 peak coordinates) were included and analyzed, retrieving areas significantly activated by lexical and nonlexical stimuli. For lexical and nonlexical stimuli, both groups showed activations in the components of the dual-stream model such as bilateral superior temporal gyrus/sulcus, middle temporal gyrus, left posterior inferior frontal gyrus, and left insula. However, CI users displayed additional unique activation patterns for lexical and nonlexical stimuli. That is, for the lexical stimuli, significant activations were observed in areas comprising the salience network (SN), also known as the intrinsic alertness network, such as the left dorsal anterior cingulate cortex (dACC), left insula, and right supplementary motor area in the CI user group. Also, for the nonlexical stimuli, CI users activated areas comprising the SN such as the right insula and left dACC. Previous episodic observations on lexical stimuli processing using the dual auditory stream in CI users were reconfirmed in this study. However, this study also suggests that dual-stream auditory processing in CI users may need support from the SN. In other words, CI users need to pay extra attention to cope with the degraded auditory signal provided by the implant. PMID:25619989

  18. DisQo: A user needs analysis method for smart home

    OpenAIRE

    Coutaz, Joëlle; Fontaine, Emeric; Mandran, Nadine; Demeure, Alexandre

    2010-01-01

    How can people identify the services that they might expect from their smart home when they have little to no knowledge about novel technologies? This paper reports on a user needs analysis method designed to answer this question: DisQo. We have recruited 17 families and used a combination of interviews and playful cultural probes. Results show that families are willing to couple smart objects to improve their lives.

  19. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    OpenAIRE

    Grapov, Dmitry; NEWMAN, JOHN W.

    2012-01-01

    Summary: Interactive modules for Data Exploration and Visualization (imDEV) is a Microsoft Excel spreadsheet-embedded application providing an integrated environment for the analysis of omics data through a user-friendly interface. Individual modules enable interactive and dynamic analyses of large data sets by interfacing R's multivariate statistics and highly customizable visualizations with the spreadsheet environment, aiding robust inferences and generating information-rich data visualization...

  20. Shape Analysis of 3D Head Scan Data for U.S. Respirator Users

    OpenAIRE

    Stephanie Lynch; Viscusi, Dennis J.; Stacey Benson; Slice, Dennis E.; Ziqing Zhuang

    2010-01-01

    In 2003, the National Institute for Occupational Safety and Health (NIOSH) conducted a head-and-face anthropometric survey of diverse, civilian respirator users. Of the 3,997 subjects measured using traditional anthropometric techniques, surface scans and 26 three-dimensional (3D) landmark locations were collected for 947 subjects. The objective of this study was to report the size and shape variation of the survey participants using the 3D data. Generalized Procrustes Analysis (GPA) was con...

  1. mcaGUI: microbial community analysis R-Graphical User Interface (GUI)

    OpenAIRE

    Copeland, Wade K.; Krishnan, Vandhana; Beck, Daniel; Settles, Matt; Foster, James A.; Cho, Kyu-Chul; Day, Mitch; Hickey, Roxana; Schütte, Ursel M.E.; Xia ZHOU; Williams, Christopher J; Forney, Larry J; Abdo, Zaid

    2012-01-01

    Summary: Microbial communities have an important role in natural ecosystems and have an impact on animal and human health. Intuitive graphic and analytical tools that can facilitate the study of these communities are in short supply. This article introduces Microbial Community Analysis GUI, a graphical user interface (GUI) for the R-programming language (R Development Core Team, 2010). With this application, researchers can input aligned and clustered sequence data to create custom abundance ...

  2. An Irish Cross-Institutional User Needs Analysis of Undergraduate Programming

    Directory of Open Access Journals (Sweden)

    Eileen Mary Costelloe

    2006-07-01

    Full Text Available Research literature and the practical experience of subject experts indicate that teaching programming to novices is challenging for both learner and lecturer. A number of difficulties arise when teaching novices to program. These range from the inadequacy of undergraduate students' problem-solving skills and problems with understanding programming constructs to the complexity of the environments in which the students develop their solutions. This paper outlines a project which aims to address some of the challenges faced by novice programmers by providing them with an innovative learning tool incorporating a set of Reusable Learning Objects (RLOs), based on sound pedagogical principles and encapsulated in a Constructivist Learning Environment (CLE). The Learning Objects focus on the common areas of weakness determined by an Irish cross-institutional User Needs Analysis. The initial research activity was to conduct a User Needs Analysis, carried out in the three third-level academic partner institutions, which will inform and direct the remainder of the research project. The User Needs Analysis confirmed that first-year undergraduate students find programming the most challenging module they study. Programming constructs such as arrays, looping and selection were shown to be the most problematic in semester one, with methods and polymorphism posing difficulties in semester two. Interestingly, the students' actual and perceived difficulties with the concepts were not in line, with the students perceiving their difficulties to be less than they actually were. The students acknowledged that problem-solving abilities impacted their performance, but only 20% of students in one college admitted to thinking about their approach when designing programming solutions. The results of the User Needs Analysis direct the design and development of the RLOs and the learning tool.

  3. An Analysis and Knowledge Representation System to attain the genuine web user usage behavior

    Directory of Open Access Journals (Sweden)

    V.V.R. Maheswara Rao

    2013-05-01

    Full Text Available With the explosive growth of the WWW, web mining techniques are intensively used to discover the relevant behaviors of web users from web log data. In practice, pattern discovery techniques generate many hundreds, often thousands, of patterns that are unwanted, unexpected, disputable or unbelievable in nature. The success of extracting real knowledge from such patterns is highly reliant on the pattern analysis stage when investigating web user usage behavior. To retain the most genuine and interesting patterns it is necessary to filter out unqualified patterns and to use more sophisticated visualization techniques to present the knowledge of web user usage effectively. The authors in the present paper propose an Analysis and Knowledge Representation System (AKRS) that concentrates equally on knowledge identification and representation. Key measures are used in combination for knowledge identification as a three-phase filtering system to determine the interestingness of patterns in the proposed AKRS. Initially, objective measures are applied to the patterns discovered by pattern discovery techniques to filter out patterns that do not meet statistical strength within the framework of the interest factor. Later, subjective measures are applied to identify the patterns of most genuine interest based on web knowledge. Finally, heuristic measures evaluate the semantics of patterns based on both user-specific objectives and the utility of the mined patterns. The measures of AKRS efficiently determine the correlation among the most interesting patterns. In addition, to meet the challenges in knowledge representation, such as identifying relevant information, finding the depth of information and achieving visualization competency, the proposed AKRS also employs recent knowledge visualization techniques such as multidimensional and specialized hierarchical views. The AKRS amplifies the truthfulness and rate of success in representing the final knowledge of web user behavior.
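
    A typical objective measure used in such a filtering phase is the interest factor (lift) of a pattern A -> B; a minimal sketch over hypothetical session data:

```python
def lift(transactions, a, b):
    """Interest factor (lift) of pattern A -> B: P(A and B) / (P(A) * P(B)).
    Values near 1 mean A and B co-occur about as often as independence predicts."""
    n = len(transactions)
    p_a = sum(a <= t for t in transactions) / n        # support of A
    p_b = sum(b <= t for t in transactions) / n        # support of B
    p_ab = sum((a | b) <= t for t in transactions) / n # joint support
    return p_ab / (p_a * p_b)

# Hypothetical web sessions, each a set of visited pages:
logs = [{"home", "search"}, {"home", "cart"}, {"search", "cart"},
        {"home", "search", "cart"}]
print(round(lift(logs, {"home"}, {"search"}), 3))
```

    A filtering system of the kind described would discard patterns whose lift hovers around 1, since they carry no information beyond the individual page popularities.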

  4. A comparative analysis of the effects of economic policy instruments in promoting environmentally sustainable transport

    DEFF Research Database (Denmark)

    Elvik, Rune; Ramjerdi, Farideh

    2014-01-01

    This paper presents a comparative analysis of the effects of economic policy instruments in promoting environmentally sustainable transport. Promoting environmentally sustainable transport is defined as follows: (1) Reducing the volume of motorised travel; (2) Transferring travel to modes generating less external effects, and (3) Modifying road user behaviour in a way that will reduce external effects of transport. External effects include accidents, congestion, traffic noise and emissions to air. Four economic policy instruments are compared: (1) Prices of motor fuel; (2) Congestion charges; (3) Toll schemes; (4) Reward systems giving incentives to reduce driving or change driver behaviour. The effects of these policy instruments are stated in terms of elasticities. All four economic policy instruments have negative elasticities, which means that they do promote environmentally sustainable transport. Long-term elasticities tend to be larger than short-term elasticities. The long-term elasticities of reward systems are unknown. (C) 2014 Elsevier Ltd. All rights reserved.
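
    An elasticity translates a relative price change into a relative change in travel volume; under a constant-elasticity demand model this is a one-line calculation (the elasticity value below is illustrative, not one of the paper's estimates):

```python
def adjusted_volume(q0, p0, p1, elasticity):
    """Constant-elasticity demand: Q1 = Q0 * (P1 / P0) ** e."""
    return q0 * (p1 / p0) ** elasticity

# Hypothetical long-term fuel-price elasticity of -0.6, 10% price increase:
q1 = adjusted_volume(q0=100.0, p0=1.00, p1=1.10, elasticity=-0.6)
print(round(q1, 1))  # roughly 94.4, i.e. about a 5.6% reduction in travel volume
```

    The negative sign is what the paper means by all four instruments "promoting" sustainable transport: raising the price of driving reduces its volume.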

  5. PuppetDroid: A User-Centric UI Exerciser for Automatic Dynamic Analysis of Similar Android Applications

    OpenAIRE

    Gianazza, Andrea; Maggi, Federico; Fattori, Aristide; Cavallaro, Lorenzo; Zanero, Stefano

    2014-01-01

    Popularity and complexity of malicious mobile applications are rising, making their analysis difficult and labor intensive. Mobile application analysis is indeed inherently different from desktop application analysis: In the latter, the interaction of the user (i.e., victim) is crucial for the malware to correctly expose all its malicious behaviors. We propose a novel approach to analyze (malicious) mobile applications. The goal is to exercise the user interface (UI) of an...

  6. Radiological Safety Analysis Computer (RSAC) Program Version 7.2 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2010-10-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.2 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.

  7. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2009-03-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.

  8. Image retrieval: Theoretical analysis and empirical user studies on accessing information in images

    DEFF Research Database (Denmark)

    Ørnager, Susanne

    1997-01-01

    The paper touches upon indexing and retrieval for effective searches of digitized images. Different conceptions of what subject indexing means are described as a basis for defining an operational subject indexing strategy for images. The methodology is based on the art historian Erwin Panofsky, and his work on renaissance paintings. On the basis of works of art he develops a theory about ways in which one analyses representational images. Panofsky describes three levels of meaning in a work of art which indicate a difference in presupposed knowledge, i.e. nothing (or only practical experience), special knowledge about image codes, and special knowledge about history of ideas. The semiologist Roland Barthes has established a semiology for pictorial expressions based on advertising photos. Barthes uses the concepts denotation/connotation where denotations can be explained as the sober expression of signs and connotation as meanings relating to feelings or associations. A joint methodology combining the two researchers' approaches is suggested and implemented in analyzing press photos. Fields of application discussed include the messages in an image and the linking between information running from text, image to object. An empirical study, based on 17 newspaper archives, demonstrates user group requirements including archivists (creators), journalists (immediate users), and newspaper readers (end-users). A word association test is completed and the terms are used to build a user interface. The empirical analysis demonstrates how the results can be applied as the foundation for a semantic model.

  9. Large-System Analysis of Joint User Selection and Vector Precoding for Multiuser MIMO Downlink

    CERN Document Server

    Takeuchi, Keigo; Kawabata, Tsutomu

    2012-01-01

    Joint user selection (US) and vector precoding (US-VP) is proposed for multiuser multiple-input multiple-output (MU-MIMO) downlink. The main difference between joint US-VP and conventional US is that US depends on data symbols for joint US-VP, whereas conventional US is independent of data symbols. The replica method is used to analyze the performance of joint US-VP in the large-system limit, where the numbers of transmit antennas, users, and selected users tend to infinity while their ratios are kept constant. The analysis under the assumptions of replica symmetry (RS) and 1-step replica symmetry breaking (1RSB) implies that optimal data-independent US provides nothing but the same performance as random US in the large-system limit, whereas data-independent US is capacity-achieving as only the number of users tends to infinity. It is shown that joint US-VP can provide a substantial reduction of the energy penalty in the large-system limit. Consequently, joint US-VP outperforms separate US-VP in terms of the ...

  10. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users Manual

    International Nuclear Information System (INIS)

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods

  11. Comparative Study of Data Cluster Analysis for Microarray

    Directory of Open Access Journals (Sweden)

    Lokesh Kumar Sharma, Sourabh Rungta

    2012-06-01

    Full Text Available Microarray has been a popular method for representing biological data. Microarray technology allows biologists to monitor genome-wide patterns of gene expression in a high-throughput fashion. Clustering the biological sequences according to their components may reveal the biological functionality among the sequences. Data cluster analysis is an important task in microarray data. There is no clustering algorithm that can be universally used to solve all problems. Therefore, in this paper a comparative study of data cluster analysis for microarray is presented. Here the most popular cluster algorithms that can be applied to microarray data are discussed. The uncertainty of data, optimization and density estimation are considered for comparison.

  12. Alcohol: view 2000 - comparative analysis gasoline versus alcohol

    International Nuclear Information System (INIS)

    The comparative analysis between alcohol and gas reveals the pros and the cons of the use of each one of those energy sources, taking as a basis an analysis of the world supply and demand of oil, and of PETROBRAS sceneries, including price expectancies for next decade, and the repercussion of PROALCOOL during its existence in the country. Regarding competitiveness, gas and the energy substitute hydrous alcohol are analyzed jointly, as an energy policy for carburetant fuels, taking into account aspects related with both the direct and the indirect cost of each energy source, as well as the benefits provided by then both. (author)

  13. Comparative analysis of two DOPA dioxygenases from Phytolacca americana.

    Science.gov (United States)

    Takahashi, Kana; Yoshida, Kazuko; Sakuta, Masaaki

    2015-05-01

    The comparative analysis of two Phytolacca americana DOPA dioxygenases (PaDOD1 and PaDOD2) that may be involved in betalain biosynthesis was carried out. The recombinant protein of PaDOD1 catalyzed the conversion of DOPA to betalamic acid, whereas DOD activity was not detected for PaDOD2 in vitro. The role of DOD genes is discussed in an evolutionary context using phylogenetic analysis, suggesting that DOD might have been duplicated early in evolution and that accumulation of base substitutions could have led to the different characteristics of DODs within the betalain-producing Caryophyllales. PMID:26058141

  14. MicroScope—an integrated microbial resource for the curation and comparative analysis of genomic and metabolic data

    OpenAIRE

    Vallenet, David; Belda, Eugeni; Calteau, Alexandra; Cruveiller, Stéphane; Engelen, Stefan; Lajus, Aurélie; Le Fèvre, François; Longin, Cyrille; Mornico, Damien; Roche, David; Rouy, Zoé; Salvignol, Gregory; Scarpelli, Claude; Thil Smith, Adam Alexander; Weiman, Marion

    2012-01-01

    MicroScope is an integrated platform dedicated to both the methodical updating of microbial genome annotation and to comparative analysis. The resource provides data from completed and ongoing genome projects (automatic and expert annotations), together with data sources from post-genomic experiments (i.e. transcriptomics, mutant collections) allowing users to perfect and improve the understanding of gene functions. MicroScope (http://www.genoscope.cns.fr/agc/microscope) combines tools and gr...

  15. Comparative Genomics via Wavelet Analysis for Closely Related Bacteria

    Directory of Open Access Journals (Sweden)

    Jiuzhou Song

    2004-01-01

    Full Text Available Comparative genomics has been a valuable method for extracting and extrapolating genome information among closely related bacteria. The efficiency of traditional methods is strongly influenced by the software used. To overcome this problem, we propose using wavelet analysis to perform comparative genomics. First, global comparison using wavelet analysis gives the difference at a quantitative level. Then local comparison using keto-excess or purine-excess plots shows precise positions of inversions, translocations, and horizontally transferred DNA fragments. We first found that the level of energy spectra difference is related to the similarity of bacterial strains; it could be a quantitative index to describe the similarities of genomes. The strategy is described in detail by comparisons of closely related strains: S. typhi CT18, S. typhi Ty2, S. typhimurium LT2, H. pylori 26695, and H. pylori J99.
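    The purine-excess plots and energy spectra mentioned above can be sketched roughly as follows; the toy sequences and the single-level Haar step are illustrative assumptions, not the authors' exact pipeline.

```python
# Toy purine-excess walk plus a one-level Haar detail energy, sketching the
# idea of comparing genomes via wavelet energy spectra. The sequences and
# the single decomposition level are illustrative assumptions only.

def purine_excess(seq):
    """Cumulative count of purines (A, G) minus pyrimidines (C, T)."""
    walk, total = [], 0
    for base in seq:
        total += 1 if base in "AG" else -1
        walk.append(total)
    return walk

def haar_energy(signal):
    """Energy of the detail coefficients from one Haar decomposition level."""
    details = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return sum(d * d for d in details)

a = purine_excess("AGGATTACCAGGAT")
b = purine_excess("AGGATTACCAGGTT")
# Closely related sequences should yield similar energy spectra.
print(haar_energy(a), haar_energy(b))
```

    A full analysis would decompose the walk over several wavelet levels and compare the resulting energy spectra across strains rather than a single level as here.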

  16. Transportation Routing Analysis Geographic Information System (WebTRAGIS) User's Manual

    International Nuclear Information System (INIS)

    In the early 1980s, Oak Ridge National Laboratory (ORNL) developed two transportation routing models: HIGHWAY, which predicts truck transportation routes, and INTERLINE, which predicts rail transportation routes. Both of these models have been used by the U.S. Department of Energy (DOE) community for a variety of routing needs over the years. One of the primary uses of the models has been to determine population-density information, which is used as input for risk assessment with the RADTRAN model, which is available on the TRANSNET computer system. In recent years, advances in the development of geographic information systems (GISs) have resulted in increased demands from the user community for a GIS version of the ORNL routing models. In April 1994, the DOE Transportation Management Division (EM-261) held a Baseline Requirements Assessment Session with transportation routing experts and users of the HIGHWAY and INTERLINE models. As a result of the session, the development of a new GIS routing model, Transportation Routing Analysis GIS (TRAGIS), was initiated. TRAGIS is a user-friendly, GIS-based transportation and analysis computer model. The older HIGHWAY and INTERLINE models are useful to calculate routes, but they cannot display a graphic of the calculated route. Consequently, many users have experienced difficulty determining the proper node for facilities and have been confused by or have misinterpreted the text-based listing from the older routing models. Some of the primary reasons for the development of TRAGIS are (a) to improve the ease of selecting locations for routing, (b) to graphically display the calculated route, and (c) to provide for additional geographic analysis of the route

  17. Mammalian Comparative Sequence Analysis of the Agrp Locus

    OpenAIRE

    Kaelin, Christopher B.; Gregory M. Cooper; Sidow, Arend; Barsh, Gregory S

    2007-01-01

    Agouti-related protein encodes a neuropeptide that stimulates food intake. Agrp expression in the brain is restricted to neurons in the arcuate nucleus of the hypothalamus and is elevated by states of negative energy balance. The molecular mechanisms underlying Agrp regulation, however, remain poorly defined. Using a combination of transgenic and comparative sequence analysis, we have previously identified a 760 bp conserved region upstream of Agrp which contains STAT binding elements that pa...

  18. Comparative analysis of endoscopic precut conventional and needle knife sphincterotomy

    OpenAIRE

    Andrzej Jamry

    2013-01-01

    AIM: To compare the efficacy, complications and post-procedural hyperamylasemia in endoscopic pre-cut conventional and needle knife sphincterotomies. METHODS: We performed a retrospective analysis of two pre-cut sphincterotomy (PS) techniques: pre-cut conventional sphincterotomy (PCS) and pre-cut needle knife (PNK). The study included 143 patients; the classic technique was used in 59 patients (41.3%), and the needle knife technique was used in 84 patients (58.7%). We analyzed the efficacy of...

  19. Revealing Mammalian Evolutionary Relationships by Comparative Analysis of Gene Clusters

    OpenAIRE

    Song, Giltae; Riemer, Cathy; Dickins, Benjamin; Kim, Hie Lim; Zhang, Louxin; Zhang, Yu; Hsu, Chih-Hao; Hardison, Ross C.; NISC Comparative Sequencing Program,; Green, Eric D.; Miller, Webb

    2012-01-01

    Many software tools for comparative analysis of genomic sequence data have been released in recent decades. Despite this, it remains challenging to determine evolutionary relationships in gene clusters due to their complex histories involving duplications, deletions, inversions, and conversions. One concept describing these relationships is orthology. Orthologs derive from a common ancestor by speciation, in contrast to paralogs, which derive from duplication. Discriminating orthologs from pa...

  20. Comparative Analysis of the Value Added Tax Evolution

    OpenAIRE

    Mirela Anca Postole

    2013-01-01

    The impact of indirect taxes, especially VAT, on the economic activity of the company studied is analysed over time. During the reporting period, January 2009 – December 2011, the supporting documents on which the records of deductible and collected VAT were based were checked for compliance with legal norms and principles of financial accounting. The processed data also formed the basis for a comparative analysis of the evolution of VAT. VAT shall be paid for the entire ac...

  1. Initial Implementation of a Comparative Data Analysis Ontology

    OpenAIRE

    Francisco Prosdocimi; Brandon Chisham; Enrico Pontelli; Thompson, Julie D.; Arlin Stoltzfus

    2009-01-01

    Comparative analysis is used throughout biology. When entities under comparison (e.g. proteins, genomes, species) are related by descent, evolutionary theory provides a framework that, in principle, allows N-ary comparisons of entities, while controlling for non-independence due to relatedness. Powerful software tools exist for specialized applications of this approach, yet it remains under-utilized in the absence of a unifying informatics infrastructure. A key step in developing such an infr...

  2. Comparative analysis of some brushless motors based on catalog data

    Directory of Open Access Journals (Sweden)

    Anton Kalapish

    2005-10-01

    Full Text Available Brushless motors (polyphased AC induction, synchronous and brushless DC motors) have no alternatives in modern electric drives. They possess high efficiency and a very wide range of speeds. The objective of this paper is to represent some relations between the basic parameters and magnitudes of electrical machines. This allows a comparative analysis to be made and a motor to be chosen for each particular case based not only on catalogue data or sale price.

  3. Comparative analysis of plasmids in the genus Listeria.

    OpenAIRE

    Kuenne, Carsten; Voget, Sonja; Pischimarov, Jordan; Oehm, Sebastian; Goesmann, Alexander; Daniel, Rolf; Hain, Torsten; Chakraborty, Trinad

    2010-01-01

    BACKGROUND: We sequenced four plasmids of the genus Listeria, including two novel plasmids from L. monocytogenes serotype 1/2c and 7 strains as well as one from the species L. grayi. A comparative analysis in conjunction with 10 published Listeria plasmids revealed a common evolutionary background. PRINCIPAL FINDINGS: All analysed plasmids share a common replicon-type related to theta-replicating plasmid pAMbeta1. Nonetheless plasmids could be broadly divided into two distinct groups bas...

  4. Comparative analysis of solid waste management in 20 cities

    OpenAIRE

    Wilson, D C; Rodic-Wiersma, L.; Scheinberg, A.; Velis, C. A.; Alabaster, G.

    2012-01-01

    This paper uses the ‘lens’ of integrated and sustainable waste management (ISWM) to analyse the new data set compiled on 20 cities in six continents for the UN-Habitat flagship publication Solid Waste Management in the World’s Cities. The comparative analysis looks first at waste generation rates and waste composition data. A process flow diagram is prepared for each city, as a powerful tool for representing the solid waste system as a whole in a comprehensive but concise way. Benchmark...

  5. Malaysian Real Estate Investment Trusts: A Performance and Comparative Analysis

    OpenAIRE

    Tze San Ong; Boon Heng Teh; Chin Hooi Soh; Yat Liang Yan

    2012-01-01

    This study examines the investment performance of conventional and Islamic Real Estate Investment Trusts (REITs) listed in Malaysia over the 2005–10 time period. Analysis reveals that both conventional and Islamic REITs experienced negative monthly returns during the 2008 global financial crisis (GFC) period, and positive monthly returns in the post-GFC period. Compared to market indices, most REITs under-performed before the GFC. Divergent findings were reported during the GFC and post-GFC, depending o...

  6. The diachronic analysis of pastoralism through comparative variables

    OpenAIRE

    Nixon, Lucia; Price, Simon R. F.

    2001-01-01

    Diachronic analyses of pastoralism over the millennia pose a problem. Studies of one period can use models based on other periods as heuristic devices, to pose problems and questions for investigation. But survey archaeologists and others engaged in diachronic analysis cannot assume a period-specific model as a starting point. Instead, we propose that investigation begin from a set of seven variables, which constitute the elements for the formulation of comparative analyses: environment; loca...

  7. Comparative analysis of some brushless motors based on catalog data

    OpenAIRE

    Anton Kalapish; Dimitar Sotirov; Dimitrina Koeva

    2005-01-01

    Brushless motors (polyphased AC induction, synchronous and brushless DC motors) have no alternatives in modern electric drives. They possess high efficiency and a very wide range of speeds. The objective of this paper is to represent some relations between the basic parameters and magnitudes of electrical machines. This allows a comparative analysis to be made and a motor to be chosen for each particular case based not only on catalogue data or sale price.

  8. Overview and comparative analysis of digital airborne photogrammetric surveying systems

    OpenAIRE

    Tamše, Mojca

    2010-01-01

    In the thesis, the three most important digital airborne photogrammetric systems are described, a comparative analysis of them is presented, and the application of this technology in Slovenia is discussed. Digital technology is constantly developing and the use of aerial images is increasing. Traditionally, aerial images were mostly used for digital topographic mapping; nowadays their use is focused on producing digital orthophotos, which are used in different areas. First, main character...

  9. Comparative analysis of the mitochondrial genomes in gastropods

    International Nuclear Information System (INIS)

    In this work we present a comparative analysis of the mitochondrial genomes in gastropods. Nucleotide and amino acid composition was calculated, and a comparative visual analysis of the start and termination codons was performed. The organization of the genome was compared by calculating the number of intergenic sequences, the location of the genes and the number of reorganized genes (breakpoints) in comparison with the sequence that is presumed to be ancestral for the group. In order to calculate variations in the rates of molecular evolution within the group, the relative rate test was performed. In spite of the differences in the size of the genomes, the number of amino acids is conserved. The nucleotide and amino acid composition is similar between Vetigastropoda, Caenogastropoda and Neritimorpha in comparison to Heterobranchia and Patellogastropoda. The mitochondrial genomes of the group are very compact with few intergenic sequences; the only exception is the genome of Patellogastropoda with 26,828 bp. Start codons of the Heterobranchia and Patellogastropoda are very variable and there is also an increase in genome rearrangements for these two groups. Generally, the hypothesis of constant rates of molecular evolution between the groups is rejected, except when the genomes of Caenogastropoda and Vetigastropoda are compared.
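    The breakpoint counts mentioned above compare gene adjacencies between two genome arrangements; a minimal sketch follows, with invented gene orders rather than the actual gastropod data.

```python
# Toy breakpoint count between two gene orders: a breakpoint is a gene
# adjacency present in one arrangement but absent from the other. The gene
# orders below are invented, not the actual gastropod data.

def breakpoints(order_a, order_b):
    """Count adjacencies of order_a that do not occur (in either strand) in order_b."""
    adj_b = {frozenset(p) for p in zip(order_b, order_b[1:])}
    return sum(1 for p in zip(order_a, order_a[1:]) if frozenset(p) not in adj_b)

ancestral  = ["cox1", "cox2", "atp8", "atp6", "cox3"]
rearranged = ["cox1", "atp6", "atp8", "cox2", "cox3"]
print(breakpoints(ancestral, rearranged))  # → 2 broken adjacencies
```

    Comparing each genome against a presumed ancestral order in this way gives a simple rearrangement distance; real analyses also account for gene orientation and circularity.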

  10. Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface

    Science.gov (United States)

    Ratib, Osman M.; Huang, H. K.

    1989-05-01

    A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user interface, its popularity among the academic and medical community, and its low cost. In comparison to other microcomputer-based systems, the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system and networking, and the availability of a very large variety of commercial software packages. In the current configuration the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using the standard TCP/IP protocol, and stored locally on magnetic disk. The use of high-resolution screens (1024×768 pixels × 8 bits) offers sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images and scintigraphic images.

  11. Performance analysis of an adaptive user interface system based on mobile agents

    OpenAIRE

    Mitrovic, Nikola; Royo, José Alberto; Mena Nieto, Eduardo

    2009-01-01

    Adapting graphical user interfaces for various user devices is one of the most interesting topics in today's mobile computation. In this paper we present a system based on mobile agents that transparently adapts user interface specifications to the user device's capabilities and monitors user interaction. Specialized agents manage the GUI specification according to the specific context and user preferences. We show how the user behavior can be monitored at runtime in a transparent way and how lear...

  12. Comparative analysis of breast cancer detection in mammograms and thermograms.

    Science.gov (United States)

    Milosevic, Marina; Jankovic, Dragan; Peulic, Aleksandar

    2015-02-01

    In this paper, we present a system based on feature extraction techniques for detecting abnormal patterns in digital mammograms and thermograms. A comparative study of texture-analysis methods is performed for three image groups: mammograms from the Mammographic Image Analysis Society mammographic database; digital mammograms from the local database; and thermography images of the breast. Also, we present a procedure for the automatic separation of the breast region from the mammograms. Computed features based on gray-level co-occurrence matrices are used to evaluate the effectiveness of textural information possessed by mass regions. A total of 20 texture features are extracted from the region of interest. The ability of the feature set to differentiate abnormal from normal tissue is investigated using a support vector machine classifier, a Naive Bayes classifier and a K-Nearest Neighbor classifier. To evaluate the classification performance, a five-fold cross-validation method and receiver operating characteristic analysis were performed. PMID:25720034
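    A minimal sketch of the gray-level co-occurrence idea behind such texture features, using tiny synthetic "images" rather than mammograms; the two features shown (contrast, energy) and the data are illustrative assumptions, not the paper's 20-feature set.

```python
# Minimal gray-level co-occurrence matrix (GLCM) over horizontal neighbours,
# with two classic texture features. The 4x4 "images" are synthetic stand-ins
# for illustration, not mammogram data.

def glcm(image, levels):
    """Horizontal-neighbour co-occurrence counts, normalised to probabilities."""
    counts = [[0.0] * levels for _ in range(levels)]
    total = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            counts[a][b] += 1
            total += 1
    return [[c / total for c in row] for row in counts]

def contrast(p):
    """Weighted squared gray-level difference: high for abrupt transitions."""
    n = len(p)
    return sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

def energy(p):
    """Angular second moment: high for regular, ordered textures."""
    return sum(v * v for row in p for v in row)

smooth = [[0, 0, 1, 1]] * 4                 # homogeneous patch: low contrast
rough  = [[0, 3, 0, 3], [3, 0, 3, 0]] * 2   # alternating patch: high contrast
print(round(contrast(glcm(smooth, 4)), 2), round(energy(glcm(smooth, 4)), 2))
print(round(contrast(glcm(rough, 4)), 2), round(energy(glcm(rough, 4)), 2))
```

    Feature vectors like these, computed per region of interest, are what a classifier (SVM, Naive Bayes, K-NN) would then consume.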

  13. Comparative analysis of fusion reactors with inertial confinement

    International Nuclear Information System (INIS)

    A comparative analysis of reactor designs with inertial confinement is conducted. Characteristics of targets, drivers, target chambers, blankets and systems for converting thermal energy to electricity are considered. Lasers, high-current generators of electromagnetic pulses, electron and light ion beams, and heavy ion accelerators are considered as drivers. Parametric analysis revealed that if the target ignition energy is less than 1 MJ, a laser reactor is more profitable; if the energy equals 10 MJ, a reactor based on a heavy ion accelerator is more profitable. The service life of the laser should cover 10^8 pulses. Conversion of thermal power to electricity is best realized by means of a standard steam-turbine cycle. The analysis disclosed that a hybrid fusion reactor is more efficient than a fusion reactor

  14. Transversal analysis of public policies on user fees exemptions in six West African countries

    Directory of Open Access Journals (Sweden)

    Ridde Valéry

    2012-11-01

    Full Text Available Abstract Background While more and more West African countries are implementing public user fees exemption policies, there is still little knowledge available on this topic. The long time required for scientific production, combined with the needs of decision-makers, led to the creation in 2010 of a project to support implementers in aggregating knowledge on their experiences. This article presents a transversal analysis of user fees exemption policies implemented in Benin, Burkina Faso, Mali, Niger, Togo and Senegal. Methods This was a multiple case study with several embedded levels of analysis. The cases were public user fees exemption policies selected by the participants because of their instructive value. The data used in the countries were taken from documentary analysis, interviews and questionnaires. The transversal analysis was based on a framework for studying five implementation components and five actors’ attitudes usually encountered in these policies. Results The analysis of the implementation components revealed: a majority of State financing; maintenance of centrally organized financing; a multiplicity of reimbursement methods; reimbursement delays and/or stock shortages; almost no implementation guides; a lack of support measures; communication plans that were rarely carried out, funded or renewed; health workers who were given general information but not details; poorly informed populations; almost no evaluation systems; ineffective and poorly funded coordination systems; low levels of community involvement; and incomplete referral-evacuation systems. 
With regard to actors’ attitudes, the analysis revealed: objectives that were appreciated by everyone; dissatisfaction with the implementation; specific tensions between healthcare providers and patients; overall satisfaction among patients, but still some problems; the perception that while the financial barrier has been removed, other barriers persist; occasionally a reorganization of practices, service rationing due to lack of reimbursement, and some overcharging or shifting of resources. Conclusions This transversal analysis confirms the need to assign a great deal of importance to the implementation of user fees exemption policies once these decisions have been taken. It also highlights some practices that suggest avenues of future research.

  15. GATA: a graphic alignment tool for comparative sequence analysis

    Directory of Open Access Journals (Sweden)

    Nix David A

    2005-01-01

    Full Text Available Abstract Background Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collinearity assumption is often invalid. For example, enhancers contain clusters of transcription factor binding sites that change in number, orientation, and spacing during evolution yet the enhancer retains its activity. Dot plot analysis is often used to estimate non-coding sequence relatedness. Yet dot plots do not actually align sequences and thus cannot account well for base insertions or deletions. Moreover, they lack an adequate statistical framework for comparing sequence relatedness and are limited to pairwise comparisons. Lastly, dot plots and dynamic programming text outputs fail to provide an intuitive means for visualizing DNA alignments. Results To address some of these issues, we created a stand-alone, platform-independent, graphic alignment tool for comparative sequence analysis (GATA, http://gata.sourceforge.net/. GATA uses the NCBI-BLASTN program and extensive post-processing to identify all small sub-alignments above a low cut-off score. These are graphed as two shaded boxes, one for each sequence, connected by a line using the coordinate system of their parent sequence. Shading and colour are used to indicate score and orientation. A variety of options exist for querying, modifying and retrieving conserved sequence elements. Extensive gene annotation can be added to both sequences using a standardized General Feature Format (GFF) file. 
    Conclusions GATA uses the NCBI-BLASTN program in conjunction with post-processing to exhaustively align two DNA sequences. It provides researchers with a fine-grained alignment and visualization tool aptly suited for non-coding, 0–200 kb, pairwise sequence analysis. It functions independently of sequence feature ordering or orientation, and readily visualizes both large and small sequence inversions, duplications, and segment shuffling. Since the alignment is visual and does not contain gaps, gene annotation can be added to both sequences to create a thoroughly descriptive picture of DNA conservation that is well suited for comparative sequence analysis.

  16. Comparative Analysis Of Fuzzy Clustering Algorithms In Data Mining

    Directory of Open Access Journals (Sweden)

    Binsy Thomas, Madhu Nashipudimath

    2012-09-01

    Full Text Available Data clustering acts as an intelligent tool, a method that allows the user to handle large volumes of data effectively. The basic function of clustering is to transform data of any origin into a more compact form, one that represents the original data accurately. Clustering algorithms are used to analyze these large collections of data by subdividing them into groups of similar data. Fuzzy clustering extends the crisp clustering technique in such a way that instead of an object belonging to just one cluster at a time, the object belongs to one or more clusters at the same time, with appropriate membership values assigned to the object in each cluster. This paper addresses the major issues associated with conventional partitional clustering algorithms, namely the difficulty of determining cluster centers and of handling noise or outlier points. Integration of fuzzy logic in data mining allows these traditional methods to handle natural data, which are often vague. The study provides an analysis of two fuzzy clustering algorithms, namely fuzzy c-means and the adaptive fuzzy clustering algorithm, and their illustration in different fields.
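The graded-membership idea described above can be illustrated with a minimal 1-D fuzzy c-means sketch (this is the generic textbook formulation with fuzzifier m, not the implementation analysed in the paper):

```python
# Minimal 1-D fuzzy c-means sketch. Unlike crisp k-means, every point gets
# a membership value in every cluster; memberships for a point sum to 1.

def fcm(data, centers, m=2.0, iters=20):
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        U = []
        for x in data:
            d = [abs(x - c) or 1e-12 for c in centers]  # avoid divide-by-zero
            U.append([1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                for k in range(len(centers)))
                      for j in range(len(centers))])
        # centre update: weighted mean of the data with weights u^m
        centers = [sum(U[i][j] ** m * data[i] for i in range(len(data))) /
                   sum(U[i][j] ** m for i in range(len(data)))
                   for j in range(len(centers))]
    return centers, U

centers, U = fcm([1.0, 1.2, 0.8, 5.0, 5.2, 4.8], [0.0, 6.0])
print([round(c, 2) for c in centers])
```

With two well-separated groups the centres converge near 1.0 and 5.0, and points near a centre receive membership close to 1 in that cluster and close to 0 in the other.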

  17. Discovering Latent Patterns from the Analysis of User-Curated Movie Lists

    CERN Document Server

    Greene, Derek

    2013-01-01

    User content curation is becoming an important source of preference data, as well as providing information regarding the items being curated. One popular approach involves the creation of lists. On Twitter, these lists might contain accounts relevant to a particular topic, whereas on a community site such as the Internet Movie Database (IMDb), this might take the form of lists of movies sharing common characteristics. While list curation involves substantial combined effort on the part of users, researchers have rarely looked at mining the outputs of this kind of crowdsourcing activity. Here we study a large collection of movie lists from IMDb. We apply network analysis methods to a graph that reflects the degree to which pairs of movies are "co-listed", that is, assigned to the same lists. This allows us to uncover a more nuanced grouping of movies that goes beyond categorisation schemes based on attributes such as genre or director.
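The "co-listed" graph described above can be built by counting, for every pair of movies, how many lists contain both; that count becomes the edge weight. A minimal sketch with invented list contents:

```python
from collections import Counter
from itertools import combinations

# Build a weighted "co-listed" graph: the edge weight for a movie pair is
# the number of user-curated lists that contain both movies.

def colist_graph(lists):
    weights = Counter()
    for movies in lists:
        # sort so each undirected pair has one canonical key
        for a, b in combinations(sorted(set(movies)), 2):
            weights[(a, b)] += 1
    return weights

lists = [["Alien", "Blade Runner", "Solaris"],
         ["Alien", "Blade Runner"],
         ["Solaris", "Stalker"]]
g = colist_graph(lists)
print(g[("Alien", "Blade Runner")])
```

Network analysis methods (community detection, for instance) can then be applied to this weighted graph to uncover groupings that cut across genre or director labels.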

  18. A comparative analysis of the Global Land Cover 2000 and MODIS land cover data sets

    Science.gov (United States)

    Giri, C.; Zhu, Z.; Reed, B.

    2005-01-01

    Accurate and up-to-date global land cover data sets are necessary for various global change research studies including climate change, biodiversity conservation, ecosystem assessment, and environmental modeling. In recent years, substantial advancement has been achieved in generating such data products. Yet, we are far from producing geospatially consistent high-quality data at an operational level. We compared the recently available Global Land Cover 2000 (GLC-2000) and MODerate Resolution Imaging Spectroradiometer (MODIS) global land cover data to evaluate the similarities and differences in methodologies and results, and to identify areas of spatial agreement and disagreement. These two global land cover data sets were prepared using different data sources, classification systems, and methodologies, but using the same spatial resolution (i.e., 1 km) satellite data. Our analysis shows a general agreement at the class aggregate level except for savannas/shrublands, and wetlands. The disagreement, however, increases when comparing detailed land cover classes. Similarly, percent agreement between the two data sets was found to be highly variable among biomes. The identified areas of spatial agreement and disagreement will be useful for both data producers and users. Data producers may use the areas of spatial agreement for training area selection and pay special attention to areas of disagreement for further improvement in future land cover characterization and mapping. Users can conveniently use the findings in the areas of agreement, whereas users might need to verify the information in the areas of disagreement with the help of secondary information. 
Learning from past experience and building on the existing infrastructure (e.g., regional networks), further research is necessary to (1) reduce ambiguity in land cover definitions, (2) increase availability of improved spatial, spectral, radiometric, and geometric resolution satellite data, and (3) develop advanced classification algorithms. © 2004 Elsevier Inc. All rights reserved.
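Overall and per-class percent agreement of the kind reported in this comparison can be computed in a few lines; the class labels and the toy flattened "maps" below are invented for illustration:

```python
# Toy agreement statistics between two categorical land cover maps,
# flattened into equal-length label lists (class names are illustrative).

def agreement(map_a, map_b):
    assert len(map_a) == len(map_b)
    matches = sum(a == b for a, b in zip(map_a, map_b))
    overall = matches / len(map_a)
    per_class = {}
    for cls in set(map_a) | set(map_b):
        # pixels where either map assigns this class
        in_either = [i for i in range(len(map_a))
                     if map_a[i] == cls or map_b[i] == cls]
        agree = sum(map_a[i] == map_b[i] == cls for i in in_either)
        per_class[cls] = agree / len(in_either) if in_either else 0.0
    return overall, per_class

a = ["forest", "forest", "savanna", "wetland"]
b = ["forest", "savanna", "savanna", "forest"]
overall, per_class = agreement(a, b)
print(round(overall, 2), round(per_class["forest"], 2))
```

The per-class ratio is a simple intersection-over-union style measure; classes such as savannas and wetlands, where the two products disagree, show markedly lower values than the overall figure.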

  19. GUARDD: user-friendly MATLAB software for rigorous analysis of CPMG RD NMR data

    International Nuclear Information System (INIS)

    Molecular dynamics are essential for life, and nuclear magnetic resonance (NMR) spectroscopy has been used extensively to characterize these phenomena since the 1950s. For the past 15 years, the Carr-Purcell-Meiboom-Gill relaxation dispersion (CPMG RD) NMR experiment has afforded advanced NMR labs access to kinetic, thermodynamic, and structural details of protein and RNA dynamics in the crucial μs-ms time window. However, analysis of RD data is challenging because datasets are often large and require many non-linear fitting parameters, thereby confounding assessment of accuracy. Moreover, novice CPMG experimentalists face an additional barrier because current software options lack an intuitive user interface and extensive documentation. Hence, we present the open-source software package GUARDD (Graphical User-friendly Analysis of Relaxation Dispersion Data), which is designed to organize, automate, and enhance the analytical procedures which operate on CPMG RD data (http://code.google.com/p/guardd/). This MATLAB-based program includes a graphical user interface, permits global fitting to multi-field, multi-temperature, multi-coherence data, and implements χ2-mapping procedures, via grid-search and Monte Carlo methods, to enhance and assess fitting accuracy. The presentation features allow users to seamlessly traverse the large amount of results, and the RD Simulator feature can help design future experiments as well as serve as a teaching tool for those unfamiliar with RD phenomena. Based on these innovative features, we expect that GUARDD will fill a well-defined gap in service of the RD NMR community.
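The grid-search half of a χ2-mapping procedure can be illustrated on a toy two-parameter model; GUARDD's actual relaxation dispersion model, parameter set, and MATLAB code differ, and the exponential model and grids below are invented for the sketch:

```python
import math

# Toy grid-search chi-squared mapping for a two-parameter model
# y = a * exp(-b * x). Evaluating chi2 over a parameter grid both finds a
# starting point for non-linear fitting and maps the error surface.

def chi2(params, xs, ys, sigma):
    a, b = params
    return sum(((y - a * math.exp(-b * x)) / sigma) ** 2
               for x, y in zip(xs, ys))

def grid_search(xs, ys, sigma, a_grid, b_grid):
    best = min(((a, b) for a in a_grid for b in b_grid),
               key=lambda p: chi2(p, xs, ys, sigma))
    return best, chi2(best, xs, ys, sigma)

xs = [0.0, 1.0, 2.0, 3.0]
ys = [2.0 * math.exp(-0.5 * x) for x in xs]  # synthetic, noise-free data
best, c2 = grid_search(xs, ys, 0.05, [1.0, 1.5, 2.0, 2.5], [0.25, 0.5, 0.75])
print(best)
```

A Monte Carlo variant would repeat the fit on noise-perturbed copies of the data to turn the χ2 surface into parameter uncertainty estimates.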

  20. Alkahest NuclearBLAST : a user-friendly BLAST management and analysis system

    Directory of Open Access Journals (Sweden)

    Burke Mark

    2005-06-01

    Full Text Available Abstract Background - Sequencing of EST and BAC end datasets is no longer limited to large research groups. Drops in per-base pricing have made high-throughput sequencing accessible to individual investigators. However, there are few options available which provide a free and user-friendly solution to the BLAST result storage and data mining needs of biologists. Results - Here we describe NuclearBLAST, a batch BLAST analysis, storage and management system designed for the biologist. It is a wrapper for NCBI BLAST that provides a user-friendly web interface, including a request wizard and the ability to view and mine the results. All BLAST results are stored in a MySQL database, which allows for more advanced data mining through supplied command-line utilities or direct database access. NuclearBLAST can be installed on a single machine or clustered amongst a number of machines to improve analysis throughput. NuclearBLAST provides a platform which eases data mining of multiple BLAST results. With the supplied scripts, the program can export data into a spreadsheet-friendly format, automatically assign Gene Ontology terms to sequences and provide bi-directional best hits between two datasets. Users with SQL experience can use the database to ask even more complex questions and extract any subset of data they require. Conclusion - This tool provides a user-friendly interface for requesting, viewing and mining of BLAST results, which makes the management and data mining of large sets of BLAST analyses tractable to biologists.
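The bi-directional best hits (BBH) computation mentioned above can be sketched as follows; the hit tuples stand in for rows retrieved from the MySQL store, and the function names are hypothetical, not NuclearBLAST's actual scripts:

```python
# Bi-directional best hits: a pair (q, s) is a BBH when s is q's top-scoring
# hit in the A->B search and q is s's top-scoring hit in the B->A search.
# Hits are (query, subject, bitscore) tuples.

def best_hits(hits):
    best = {}
    for q, s, score in hits:
        if q not in best or score > best[q][1]:
            best[q] = (s, score)
    return {q: s for q, (s, _) in best.items()}

def bbh(a_to_b, b_to_a):
    fwd, rev = best_hits(a_to_b), best_hits(b_to_a)
    return sorted((q, s) for q, s in fwd.items() if rev.get(s) == q)

a_to_b = [("g1", "h1", 300.0), ("g1", "h2", 120.0), ("g2", "h2", 250.0)]
b_to_a = [("h1", "g1", 290.0), ("h2", "g3", 260.0), ("h2", "g2", 240.0)]
print(bbh(a_to_b, b_to_a))
```

Note that ("g2", "h2") is not reported: h2's best reverse hit is g3, so the pair fails the reciprocal test even though h2 is g2's best forward hit.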

  1. A user's manual for the program TRES4: Random vibration analysis of vertical-axis wind turbines in turbulent winds

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-03-01

    TRES4 is a software package that works with the MSC/NASTRAN finite element analysis code to conduct random vibration analysis of vertical-axis wind turbines. The loads on the turbine are calculated in the time domain to retain the nonlinearities of stalled aerodynamic loadings. The loads are transformed into modal coordinates to reduce the number of degrees of freedom. Power spectra and cross spectra of the loads are calculated in the modal coordinate system. These loads are written in NASTRAN Bulk Data format to be read and applied in a random vibration analysis by NASTRAN. The resulting response is then transformed back to physical coordinates to facilitate user interpretation.
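The step of turning time-domain loads into power spectra can be illustrated with a toy direct DFT; a real analysis would use NASTRAN's random vibration machinery, FFTs, windowing, and proper normalisation, and the one-sided estimator below is only a caricature:

```python
import cmath
import math

# Toy power spectrum of a time-domain load signal via a direct DFT,
# mirroring the step where sampled loads become spectral inputs for a
# random vibration analysis.

def power_spectrum(x):
    n = len(x)
    spec = []
    for k in range(n // 2 + 1):  # one-sided spectrum, bins 0..n/2
        X = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        spec.append(abs(X) ** 2 / n)
    return spec

n = 64
signal = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]  # 4 cycles
spec = power_spectrum(signal)
print(spec.index(max(spec)))
```

A pure sinusoid with four cycles over the record concentrates its power in frequency bin 4, as the printed index confirms.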

  2. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond

    Science.gov (United States)

    Bible, Paul W.; Kanno, Yuka; Wei, Lai; Brooks, Stephen R.; O’Shea, John J.; Morasso, Maria I.; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards co-localization analysis based exploratory research. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST’s functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST’s general purpose features then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis. 
To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work. PMID:25970601
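Peak-centric co-localization ultimately reduces to comparing sets of genomic coordinate intervals. A minimal sketch with made-up peak coordinates follows; PAPST itself offers far richer gene-centric and profile-search modes:

```python
# Toy peak-centric co-localization: report the peaks in set A that overlap
# at least one peak in set B. Peaks are (start, end) intervals on one
# chromosome; coordinates are invented for illustration.

def overlaps(a, b):
    """Closed intervals a and b overlap iff each starts before the other ends."""
    return a[0] <= b[1] and b[0] <= a[1]

def colocalized(peaks_a, peaks_b):
    return [p for p in peaks_a if any(overlaps(p, q) for q in peaks_b)]

tf_peaks = [(100, 200), (500, 600), (900, 950)]    # e.g. a TF's ChIP-Seq peaks
mark_peaks = [(150, 260), (610, 700), (920, 940)]  # e.g. an epigenetic mark
hits = colocalized(tf_peaks, mark_peaks)
print(len(hits))
```

Real tools index the intervals (per chromosome, sorted or in an interval tree) so this comparison stays fast on genome-scale peak sets, which is what makes PAPST's sub-five-second interactive cycles feasible.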

  4. What do service users want? A content analysis of what users may write in psychiatric advance directives in India.

    Science.gov (United States)

    Pathare, Soumitra; Shields, Laura; Nardodkar, Renuka; Narasimhan, Lakshmi; Bunders, Joske

    2015-04-01

    Although psychiatric advance directives (PADs) give service users control over their care, very few studies exist on the content of PADs. This paper aims to contribute to this evidence base by presenting the content of psychiatric advance directives in India. Participants were 75 clients seeking outpatient care at a mental health services organisation in Tamil Nadu, India, who agreed to draft a PAD. Most clients were comfortable with appointing a representative (usually a family member) to make decisions on their behalf during a period of decisional incapacity or relapse, were willing to accept admission to the hospital/clinic and take medication if required, and wanted a trusted person with whom to discuss their mental health problems. No client used the opportunity to refuse treatment outright. This study marks an important first step in improving the quality of mental health care by documenting user preferences for care in India. More in-depth research is needed to elicit rich descriptions of experiences of care and a user-centred understanding of rights. PMID:25486868

  5. User's manual and analysis methodology of probabilistic fracture mechanics analysis code PASCAL Ver.2 for reactor pressure vessel (Contract research)

    International Nuclear Information System (INIS)

    As a part of the aging structural integrity research for LWR components, the probabilistic fracture mechanics (PFM) analysis code PASCAL (PFM Analysis of Structural Components in Aging LWR) has been developed in JAEA. This code evaluates the conditional probabilities of crack initiation and fracture of a reactor pressure vessel (RPV) under transient conditions such as pressurized thermal shock (PTS). The development of the code has aimed to improve the accuracy and reliability of analysis by introducing new analysis methodologies and algorithms that reflect recent developments in fracture mechanics and computer performance. PASCAL Ver.1 has functions for optimized sampling in stratified Monte Carlo simulation, the elastic-plastic fracture criterion of the R6 method, crack growth analysis models for a semi-elliptical crack, recovery of fracture toughness due to thermal annealing, and so on. Since then, under the contract between the Ministry of Economy, Trade and Industry of Japan and JAEA, we have continued to develop and introduce new functions into PASCAL Ver.2, such as an evaluation method for an embedded crack, a KI database for a semi-elliptical crack considering stress discontinuity at the base/cladding interface, a PTS transient database, and others. A generalized analysis method is proposed on the basis of the development of PASCAL Ver.2 and the results of sensitivity analyses. A graphical user interface (GUI) including the generalized method as default values has also been developed for PASCAL Ver.2. This report provides the user's manual and theoretical background of PASCAL Ver.2. (author)
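The Monte Carlo core of a PFM calculation can be caricatured as sampling the fracture toughness and counting samples where the applied stress intensity exceeds it. PASCAL's transient, crack-growth, and annealing models are far more elaborate, and every number below is arbitrary:

```python
import random

# Highly simplified Monte Carlo sketch of a probabilistic fracture mechanics
# estimate: sample fracture toughness K_Ic from a distribution and count the
# fraction of samples where the applied stress intensity K_I exceeds it.

def failure_probability(k_applied, mean_kic, sd_kic, n=100_000, seed=1):
    rng = random.Random(seed)  # seeded for reproducibility
    fails = sum(rng.gauss(mean_kic, sd_kic) < k_applied for _ in range(n))
    return fails / n

# With K_I two standard deviations below the mean toughness, the failure
# probability should land near the normal tail value of about 0.023.
p = failure_probability(k_applied=60.0, mean_kic=100.0, sd_kic=20.0)
print(round(p, 4))
```

Stratified or importance sampling, as used in PASCAL, targets exactly this situation: when the probability is small, naive sampling wastes most samples on the non-failing bulk of the distribution.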

  6. Comparative transcriptome analysis of the metal hyperaccumulator Noccaea caerulescens.

    Science.gov (United States)

    Halimaa, Pauliina; Blande, Daniel; Aarts, Mark G M; Tuomainen, Marjo; Tervahauta, Arja; Kärenlampi, Sirpa

    2014-01-01

    The metal hyperaccumulator Noccaea caerulescens is an established model to study the adaptation of plants to metalliferous soils. Various comparators have been used in these studies. The choice of suitable comparators is important and depends on the hypothesis to be tested and methods to be used. In high-throughput analyses such as microarray, N. caerulescens has been compared to non-tolerant, non-accumulator plants like Arabidopsis thaliana or Thlaspi arvense rather than to the related hypertolerant or hyperaccumulator plants. An underutilized source is N. caerulescens populations with considerable variation in their capacity to accumulate and tolerate metals. Whole transcriptome sequencing (RNA-Seq) is revealing interesting variation in their gene expression profiles. Combining physiological characteristics of N. caerulescens accessions with their RNA-Seq has a great potential to provide detailed insight into the underlying molecular mechanisms, including entirely new gene products. In this review we will critically consider comparative transcriptome analyses carried out to explore metal hyperaccumulation and hypertolerance of N. caerulescens, and demonstrate the potential of RNA-Seq analysis as a tool in evolutionary genomics. PMID:24904610

  7. Peace Negotiations in the Third World: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Raúl Benítez Manaut

    1995-01-01

    Full Text Available In this article the negotiations and peace processes in the Third World are analysed from a comparative viewpoint in order to focus on the case of Central America. Reference is made to the special features and common elements of peace processes in other regions of the Third World, and they are compared to those which have taken place in Central America. It is a retrospective and comparative analysis. For this reason, the author has decided to carry out a brief typology of the conflicts presented by Central America: in Nicaragua, El Salvador and Guatemala. The author then goes on to analyse the most relevant peace and negotiation processes involving the Third World, including one or two from Latin America: the cases of Panama, Afghanistan, Iran-Iraq, Colombia, southern Africa (South Africa, Namibia and Angola) and Cambodia. Later, the author goes over the periods of the peace processes in Central America and the contradictions presented by internal, regional and geopolitical conflict. Finally, a comparative methodological exercise is carried out which makes it possible to focus on the modes of implementation of the peace processes.

  8. Environmental income and rural livelihoods : a global-comparative analysis

    DEFF Research Database (Denmark)

    Angelsen, Arild; Jagger, Pamela

    2014-01-01

    This paper presents results from a comparative analysis of environmental income from approximately 8000 households in 24 developing countries collected by research partners in CIFOR’s Poverty Environment Network (PEN). Environmental income accounts for 28% of total household income, 77% of which comes from natural forests. Environmental income shares are higher for low-income households, but differences across income quintiles are less pronounced than previously thought. The poor rely more heavily on subsistence products such as wood fuels and wild foods, and on products harvested from natural areas other than forests. In absolute terms environmental income is approximately five times higher in the highest income quintile, compared to the two lowest quintiles.

  9. Stigma, sex work, and substance use: a comparative analysis.

    Science.gov (United States)

    Benoit, Cecilia; McCarthy, Bill; Jansson, Mikael

    2015-03-01

    Stigma is a widely used concept in social science research and an extensive literature claims that stigmatisation contributes to numerous negative health outcomes. However, few studies compare groups that vary in the extent to which they are stigmatised and even fewer studies examine stigma's independent and mediating effects. This article addresses these gaps in a comparative study of perceived stigma and drug use among three low-income feminised service occupations: sex work, food and alcoholic beverage serving, and barbering and hairstyling. An analysis of longitudinal data shows positive associations between sex work, perceived stigma, and socially less acceptable drug use (for example, heroin and cocaine), and that stigma mediates part of the link between sex work and the use of these drugs. Our overall findings suggest that perceived stigma is pronounced among those who work in the sex industry and negatively affects health independently of sex work involvement. PMID:25688450

  10. Sequence and comparative analysis of Leuconostoc dairy bacteriophages

    DEFF Research Database (Denmark)

    Kot, Witold; Hansen, Lars Henrik

    2014-01-01

    Bacteriophages attacking Leuconostoc species may significantly influence the quality of the final product. There is, however, limited knowledge of this group of phages in the literature. We have determined the complete genome sequences of nine Leuconostoc bacteriophages virulent to either Leuconostoc mesenteroides or Leuconostoc pseudomesenteroides strains. The phages have dsDNA genomes with sizes ranging from 25.7 to 28.4 kb. Comparative genomics analysis helped classify the nine phages into two classes, which correlate with the host species. A high percentage of similarity within the classes at both the nucleotide and protein levels was observed. Genome comparison also revealed very high conservation of the overall genomic organization between the classes. The genes were organized in functional modules responsible for replication, packaging, head and tail morphogenesis, cell lysis, and regulation and modification, respectively. No lysogeny modules were detected. To our knowledge this report provides the first comparative genomic work done on Leuconostoc dairy phages.

  11. Comparative analysis of vitamin status of schoolchildren in recreational period

    Directory of Open Access Journals (Sweden)

    Podrigalo L.V.

    2013-10-01

    Full Text Available This is a comparative analysis of the vitamin status of schoolchildren during summer recreation in the 1990s and at present. The study involved 167 schoolchildren aged 11-14 years. Using questionnaires developed by the authors, the severity of symptoms of vitamin deficiency, the prevalence of vitamin supplementation, and the frequency and volume of fruit and vegetable consumption were assessed. It is confirmed that vitamin saturation in children is better than in the data from 20 years ago, with multi-vitamin deficiency replaced by mono-vitamin deficit. The results support the need for measures aimed at improving vitamin status, including additional fortification of the diet with fruit and vegetables. The questionnaire method is the most appropriate for monitoring the vitamin status of schoolchildren.

  12. Comparative analysis of five protein-protein interaction corpora

    Directory of Open Access Journals (Sweden)

    Ginter Filip

    2008-04-01

    Full Text Available Abstract Background Growing interest in the application of natural language processing methods to biomedical text has led to an increasing number of corpora and methods targeting protein-protein interaction (PPI) extraction. However, there is no general consensus regarding PPI annotation and consequently resources are largely incompatible and methods are difficult to evaluate. Results We present the first comparative evaluation of the diverse PPI corpora, performing quantitative evaluation using two separate information extraction methods as well as detailed statistical and qualitative analyses of their properties. For the evaluation, we unify the corpus PPI annotations to a shared level of information, consisting of undirected, untyped binary interactions of non-static types with no identification of the words specifying the interaction, no negations, and no interaction certainty. We find that the F-score performance of a state-of-the-art PPI extraction method varies on average by 19 percentage units, and in some cases by over 30 percentage units, between the different evaluated corpora. The differences stemming from the choice of corpus can thus be substantially larger than differences between the performance of PPI extraction methods, which suggests definite limits on the ability to compare methods evaluated on different resources. We analyse a number of potential sources for these differences and identify factors explaining approximately half of the variance. We further suggest ways in which the difficulty of the PPI extraction tasks codified by different corpora can be determined to advance comparability. Our analysis also identifies points of agreement and disagreement in PPI corpus annotation that are rarely explicitly stated by the authors of the corpora. Conclusions Our comparative analysis uncovers key similarities and differences between the diverse PPI corpora, thus taking an important step towards standardization. 
In the course of this study we have created a major practical contribution in converting the corpora into a shared format. The conversion software is freely available at http://mars.cs.utu.fi/PPICorpora.
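The F-score used to compare extraction performance across corpora is the harmonic mean of precision and recall; a minimal sketch with illustrative true/false positive counts (not the paper's actual results):

```python
# F-score (F1) from extraction counts: tp = correctly extracted interactions,
# fp = spurious extractions, fn = missed interactions. Counts are invented.

def f_score(tp, fp, fn):
    precision = tp / (tp + fp)  # fraction of extractions that are correct
    recall = tp / (tp + fn)     # fraction of true interactions recovered
    return 2 * precision * recall / (precision + recall)

print(round(f_score(tp=60, fp=20, fn=40), 3))
```

A 19-percentage-unit swing in this figure between corpora, as reported above, dwarfs the typical gap between competing extraction methods on any single corpus.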

  13. Comparing the impacts of renewables: a preliminary analysis

    International Nuclear Information System (INIS)

    This paper describes a framework for comparing the environmental impacts of renewable energy sources such as solar, wind, wave and tidal power. The approach involves ranking the renewable technologies by the degree to which each technology interferes with natural energy flows, based on an analysis of the basic physical factors influencing the degree of interference. Five basic factors were identified: energy flux density, the proportion of energy in the flow extracted, the efficiency of the conversion device, the number of conversion processes and the increase in energy flux density. (UK)

  14. A comparative analysis of measles virus RNA by oligonucleotide fingerprinting

    International Nuclear Information System (INIS)

    Isolates from two cases of acute measles, one case of acute measles encephalitis and three patients with subacute sclerosing panencephalitis were compared. This comparison was based upon the electrophoretic analysis of T1 oligonucleotides from single-stranded, full-length RNA isolated from cytoplasmic nucleocapsids. Although all viruses have oligonucleotides in common, each isolate generated a unique pattern of oligonucleotides. However, no group of oligonucleotides was observed which would allow differentiation between viruses isolated from acute infections and those isolated from CNS diseases, indicating that probably all measles viruses differ in their nucleotide sequence, regardless of origin. (Author)

  15. COMPARATIVE ANALYSIS OF VAT EVOLUTION IN THE EUROPEAN ECONOMIC SYSTEM

    OpenAIRE

    Mihaela Andreea STROE

    2011-01-01

    In this paper we present a comparative analysis of VAT in different countries of the world. We make some observations on this theme because VAT is very important in carrying out transactions, and the increase or decrease of this tax has a major impact upon national economies and also on the quality of life in developing countries. The paper's purpose is to make a comparison between the American and European systems of taxation, with their advantages and disadvantages, and, in the end to...

  16. Comparative Analysis of Fare Collection System on Bus Operations

    OpenAIRE

    Hafezi, M. H.; Ismail, A.; Shariff, A. A.

    2012-01-01

    This study presents a comparative analysis of fare collection systems for inter-city bus operation. One of the important issues in the bus scheduling model is the stopping of buses at bus stations (called dwell time, where buses have to stop for boarding and alighting passengers). This issue has a direct impact on increased travel time. Subsequently, increased travel time for one bus mission can cause delay in the loops of bus scheduling. This article describes a survey of far...

  17. Arabidopsis transcription factors: genome-wide comparative analysis among eukaryotes.

    Science.gov (United States)

    Riechmann, J L; Heard, J; Martin, G; Reuber, L; Jiang, C; Keddie, J; Adam, L; Pineda, O; Ratcliffe, O J; Samaha, R R; Creelman, R; Pilgrim, M; Broun, P; Zhang, J Z; Ghandehari, D; Sherman, B K; Yu, G

    2000-12-15

    The completion of the Arabidopsis thaliana genome sequence allows a comparative analysis of transcriptional regulators across the three eukaryotic kingdoms. Arabidopsis dedicates over 5% of its genome to code for more than 1500 transcription factors, about 45% of which are from families specific to plants. Arabidopsis transcription factors that belong to families common to all eukaryotes do not share significant similarity with those of the other kingdoms beyond the conserved DNA binding domains, many of which have been arranged in combinations specific to each lineage. The genome-wide comparison reveals the evolutionary generation of diversity in the regulation of transcription. PMID:11118137

  18. Comparative Analysis of Methods to Denoise CT Scan Images

    OpenAIRE

    Tarandeep Chhabra, Geetika Dua

    2013-01-01

    Medical images are generally noisy due to the physical mechanisms of the acquisition process. In CT scanning there is scope to adapt patient image quality and dose. Reduction in radiation dose (i.e., the amount of X-rays) affects the quality of the image and is responsible for image noise in CT. Most denoising algorithms assume additive white Gaussian noise; however, medical images may contain non-Gaussian noise, such as Poisson noise in CT. This paper contains the comparative analysis of a...
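The Gaussian-versus-non-Gaussian distinction matters because common filters behave very differently on impulsive outliers. A toy 1-D comparison of a mean and a median filter follows (the paper evaluates richer 2-D methods on real CT images):

```python
# Toy 1-D denoising comparison on an impulse-corrupted signal: a mean
# (box) filter smears the spike across the window, while a median filter
# rejects it outright.

def _window(x, i, h):
    return x[max(0, i - h):i + h + 1]

def mean_filter(x, w=3):
    h = w // 2
    return [sum(_window(x, i, h)) / len(_window(x, i, h))
            for i in range(len(x))]

def median_filter(x, w=3):
    h = w // 2
    return [sorted(_window(x, i, h))[len(_window(x, i, h)) // 2]
            for i in range(len(x))]

signal = [10.0] * 9
signal[4] = 100.0  # a single impulse "noise" spike
print(median_filter(signal)[4], mean_filter(signal)[4])
```

At the spike position the median filter restores the true value while the mean filter leaves a large residual, which is one reason filter choice must match the noise model of the modality.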

  19. Comparative Analysis of the Value Added Tax Evolution

    Directory of Open Access Journals (Sweden)

    Mirela Anca Postole

    2013-06-01

    Full Text Available The impact of indirect taxes, especially VAT, is analysed in a study of the evolution of the economic activity of the company studied. During the reporting period, namely January 2009 – December 2011, the supporting documents on which the records of deductible and collected VAT were based were checked for compliance with legal norms and principles of financial accounting. The processed data also formed the basis for an analysis comparing the evolution of VAT. VAT was paid on the entire activity of the company.

  20. Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Schaarup-Jensen, Kjeld

    2007-01-01

    In the present paper a comparison between three different surface runoff models in the numerical urban drainage tool MOUSE is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regard to the choice of hydrological parameters when combined overflow volumes are compared, especially when the models are uncalibrated. The occurrences of flooding and surcharge are highly dependent on both hydrological and hydrodynamic parameters. Thus, the conclusion of the paper is that if model simulations are to be a reliable tool for drainage system analysis, further research in improved parameter assessment for surface runoff models is needed.
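
    As an illustration of the kind of parameter sensitivity the abstract reports (not the MOUSE models themselves), a Monte Carlo sweep of a single hydrological parameter in a simple rational-method runoff formula shows how parameter uncertainty propagates to runoff volumes; the coefficient range and catchment figures below are hypothetical.

    ```python
    import random

    def runoff_volume(c, i_mm_per_hr, area_ha, duration_hr):
        # Rational-method style event volume (m^3): hydrological reduction
        # factor c, rainfall intensity i, catchment area, event duration.
        return c * (i_mm_per_hr / 1000.0) * (area_ha * 10000.0) * duration_hr

    rng = random.Random(7)
    volumes = []
    for _ in range(5000):
        # Hydrological reduction factor sampled with roughly +/-20% uncertainty.
        c = rng.uniform(0.4, 0.6)
        volumes.append(runoff_volume(c, i_mm_per_hr=20.0, area_ha=5.0, duration_hr=1.0))

    mean_v = sum(volumes) / len(volumes)
    spread = max(volumes) - min(volumes)
    print(mean_v, spread)
    ```

    Even this toy model shows a roughly 40% spread in computed volumes from a single uncertain parameter, which is the effect uncalibrated models amplify.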

  1. The DNA sequence and comparative analysis of human chromosome 20.

    Science.gov (United States)

    Deloukas, P; Matthews, L H; Ashurst, J; Burton, J; Gilbert, J G; Jones, M; Stavrides, G; Almeida, J P; Babbage, A K; Bagguley, C L; Bailey, J; Barlow, K F; Bates, K N; Beard, L M; Beare, D M; Beasley, O P; Bird, C P; Blakey, S E; Bridgeman, A M; Brown, A J; Buck, D; Burrill, W; Butler, A P; Carder, C; Carter, N P; Chapman, J C; Clamp, M; Clark, G; Clark, L N; Clark, S Y; Clee, C M; Clegg, S; Cobley, V E; Collier, R E; Connor, R; Corby, N R; Coulson, A; Coville, G J; Deadman, R; Dhami, P; Dunn, M; Ellington, A G; Frankland, J A; Fraser, A; French, L; Garner, P; Grafham, D V; Griffiths, C; Griffiths, M N; Gwilliam, R; Hall, R E; Hammond, S; Harley, J L; Heath, P D; Ho, S; Holden, J L; Howden, P J; Huckle, E; Hunt, A R; Hunt, S E; Jekosch, K; Johnson, C M; Johnson, D; Kay, M P; Kimberley, A M; King, A; Knights, A; Laird, G K; Lawlor, S; Lehvaslaiho, M H; Leversha, M; Lloyd, C; Lloyd, D M; Lovell, J D; Marsh, V L; Martin, S L; McConnachie, L J; McLay, K; McMurray, A A; Milne, S; Mistry, D; Moore, M J; Mullikin, J C; Nickerson, T; Oliver, K; Parker, A; Patel, R; Pearce, T A; Peck, A I; Phillimore, B J; Prathalingam, S R; Plumb, R W; Ramsay, H; Rice, C M; Ross, M T; Scott, C E; Sehra, H K; Shownkeen, R; Sims, S; Skuce, C D; Smith, M L; Soderlund, C; Steward, C A; Sulston, J E; Swann, M; Sycamore, N; Taylor, R; Tee, L; Thomas, D W; Thorpe, A; Tracey, A; Tromans, A C; Vaudin, M; Wall, M; Wallis, J M; Whitehead, S L; Whittaker, P; Willey, D L; Williams, L; Williams, S A; Wilming, L; Wray, P W; Hubbard, T; Durbin, R M; Bentley, D R; Beck, S; Rogers, J

    The finished sequence of human chromosome 20 comprises 59,187,298 base pairs (bp) and represents 99.4% of the euchromatic DNA. A single contig of 26 megabases (Mb) spans the entire short arm, and five contigs separated by gaps totalling 320 kb span the long arm of this metacentric chromosome. An additional 234,339 bp of sequence has been determined within the pericentromeric region of the long arm. We annotated 727 genes and 168 pseudogenes in the sequence. About 64% of these genes have a 5' and a 3' untranslated region and a complete open reading frame. Comparative analysis of the sequence of chromosome 20 to whole-genome shotgun-sequence data of two other vertebrates, the mouse Mus musculus and the puffer fish Tetraodon nigroviridis, provides an independent measure of the efficiency of gene annotation, and indicates that this analysis may account for more than 95% of all coding exons and almost all genes. PMID:11780052

  2. Comparative analysis of some soil compaction measurement techniques

    Directory of Open Access Journals (Sweden)

    I. Shmulevich

    1995-09-01

    Full Text Available There is a need to properly define soil compaction as one of the complex soil characteristics relevant to agriculture, since it greatly influences plant growth and energy consumption. The level of soil compaction may be described by many well-known parameters, which can also be comparatively analysed according to their sensitivity and ability to describe soil reaction to the applied load. This paper presents a specific analysis of soil compaction measurement methods based on laboratory testing. The sensitivity of the usual compaction parameters, such as tire sinkage, cone index and soil bulk density, as well as needle penetration, was taken into consideration. The paper also includes a critical analysis of the different measurement techniques and their potential as a source of valuable agricultural information.

  3. The DNA sequence and comparative analysis of human chromosome 10.

    Science.gov (United States)

    Deloukas, P; Earthrowl, M E; Grafham, D V; Rubenfield, M; French, L; Steward, C A; Sims, S K; Jones, M C; Searle, S; Scott, C; Howe, K; Hunt, S E; Andrews, T D; Gilbert, J G R; Swarbreck, D; Ashurst, J L; Taylor, A; Battles, J; Bird, C P; Ainscough, R; Almeida, J P; Ashwell, R I S; Ambrose, K D; Babbage, A K; Bagguley, C L; Bailey, J; Banerjee, R; Bates, K; Beasley, H; Bray-Allen, S; Brown, A J; Brown, J Y; Burford, D C; Burrill, W; Burton, J; Cahill, P; Camire, D; Carter, N P; Chapman, J C; Clark, S Y; Clarke, G; Clee, C M; Clegg, S; Corby, N; Coulson, A; Dhami, P; Dutta, I; Dunn, M; Faulkner, L; Frankish, A; Frankland, J A; Garner, P; Garnett, J; Gribble, S; Griffiths, C; Grocock, R; Gustafson, E; Hammond, S; Harley, J L; Hart, E; Heath, P D; Ho, T P; Hopkins, B; Horne, J; Howden, P J; Huckle, E; Hynds, C; Johnson, C; Johnson, D; Kana, A; Kay, M; Kimberley, A M; Kershaw, J K; Kokkinaki, M; Laird, G K; Lawlor, S; Lee, H M; Leongamornlert, D A; Laird, G; Lloyd, C; Lloyd, D M; Loveland, J; Lovell, J; McLaren, S; McLay, K E; McMurray, A; Mashreghi-Mohammadi, M; Matthews, L; Milne, S; Nickerson, T; Nguyen, M; Overton-Larty, E; Palmer, S A; Pearce, A V; Peck, A I; Pelan, S; Phillimore, B; Porter, K; Rice, C M; Rogosin, A; Ross, M T; Sarafidou, T; Sehra, H K; Shownkeen, R; Skuce, C D; Smith, M; Standring, L; Sycamore, N; Tester, J; Thorpe, A; Torcasso, W; Tracey, A; Tromans, A; Tsolas, J; Wall, M; Walsh, J; Wang, H; Weinstock, K; West, A P; Willey, D L; Whitehead, S L; Wilming, L; Wray, P W; Young, L; Chen, Y; Lovering, R C; Moschonas, N K; Siebert, R; Fechtel, K; Bentley, D; Durbin, R; Hubbard, T; Doucette-Stamm, L; Beck, S; Smith, D R; Rogers, J

    2004-05-27

    The finished sequence of human chromosome 10 comprises a total of 131,666,441 base pairs. It represents 99.4% of the euchromatic DNA and includes one megabase of heterochromatic sequence within the pericentromeric region of the short and long arm of the chromosome. Sequence annotation revealed 1,357 genes, of which 816 are protein coding, and 430 are pseudogenes. We observed widespread occurrence of overlapping coding genes (either strand) and identified 67 antisense transcripts. Our analysis suggests that both inter- and intrachromosomal segmental duplications have impacted on the gene count on chromosome 10. Multispecies comparative analysis indicated that we can readily annotate the protein-coding genes with current resources. We estimate that over 95% of all coding exons were identified in this study. Assessment of single base changes between the human chromosome 10 and chimpanzee sequence revealed nonsense mutations in only 21 coding genes with respect to the human sequence. PMID:15164054

  4. Initial sequencing and comparative analysis of the mouse genome

    Energy Technology Data Exchange (ETDEWEB)

    Waterston, Robert H.; Lindblad-Toh, Kerstin; Birney, Ewan; Rogers, Jane; Abril, Josep F.; Agarwal, Pankaj; Agarwala, Richa; Ainscough, Rachel; Alexandersson, Marina; An, Peter; Antonarakis, Stylianos E.; Attwood, John; Baertsch, Robert; Bailey, Jonathon; Barlow, Karen; Beck, Stephan; Berry, Eric; Birren, Bruce; Bloom, Toby; Bork, Peer; Botcherby, Marc; Bray, Nicolas; Brent, Michael R.; Brown, Daniel G.; Brown, Stephen D.; Bult, Carol; Burton, John; Butler, Jonathan; Campbell, Robert D.; Carninci, Piero; Cawley, Simon; Chiaromonte, Francesca; Chinwalla, Asif T.; Church, Deanna M.; Clamp, Michele; Clee, Christopher; Collins, Francis S.; Cook, Lisa L.; Copley, Richard R.; Coulson, Alan; Couronne, Olivier; Cuff, James; Curwen, Val; Cutts, Tim; Daly, Mark; David, Robert; Davies, Joy; Delehaunty, Kimberly D.; Deri, Justin; Dermitzakis, Emmanouil T.; Dewey, Colin; Dickens, Nicholas J.; Diekhans, Mark; Dodge, Sheila; Dubchak, Inna; Dunn, Diane M.; Eddy, Sean R.; Elnitski, Laura; Emes, Richard D.; Eswara, Pallavi; Eyras, Eduardo; Felsenfeld, Adam; Fewell, Ginger A.; Flicek, Paul; Foley, Karen; Frankel, Wayne N.; Fulton, Lucinda A.; Fulton, Robert S.; Furey, Terrence S.; Gage, Diane; Gibbs, Richard A.; Glusman, Gustavo; Gnerre, Sante; Goldman, Nick; Goodstadt, Leo; Grafham, Darren; Graves, Tina A.; Green, Eric D.; Gregory, Simon; Guigo, Roderic; Guyer, Mark; Hardison, Ross C.; Haussler, David; Hayashizaki, Yoshihide; Hillier, LaDeana W.; Hinrichs, Angela; Hlavina, Wratko; Holzer, Timothy; Hsu, Fan; Hua, Axin; Hubbard, Tim; Hunt, Adrienne; Jackson, Ian; Jaffe, David B.; Johnson, L. Steven; Jones, Matthew; Jones, Thomas A.; Joy, Ann; Kamal, Michael; Karlsson, Elinor K.; Karolchik, Donna; Kasprzyk, Arkadiusz; Kawai, Jun; Keibler, Evan; Kells, Cristyn; Kent, W. James; Kirby, Andrew; Kolbe, Diana L.; Korf, Ian; Kucherlapati, Raju S.; Kulbokas III, Edward J.; Kulp, David; Landers, Tom; Leger, J.P.; Leonard, Steven; Letunic, Ivica; Levine, Rosie; et al.

    2002-12-15

    The sequence of the mouse genome is a key informational tool for understanding the contents of the human genome and a key experimental tool for biomedical research. Here, we report the results of an international collaboration to produce a high-quality draft sequence of the mouse genome. We also present an initial comparative analysis of the mouse and human genomes, describing some of the insights that can be gleaned from the two sequences. We discuss topics including the analysis of the evolutionary forces shaping the size, structure and sequence of the genomes; the conservation of large-scale synteny across most of the genomes; the much lower extent of sequence orthology covering less than half of the genomes; the proportions of the genomes under selection; the number of protein-coding genes; the expansion of gene families related to reproduction and immunity; the evolution of proteins; and the identification of intraspecies polymorphism.

  5. A cluster-randomised controlled trial to compare the effectiveness of different knowledge-transfer interventions for rural working equid users in Ethiopia

    OpenAIRE

    Stringer, A.P.; Bell, C E; Christley, R.M.; Gebreab, F.; Tefera, G.; Reed, K.; Trawford, A.; Pinchbeck, G.L.

    2011-01-01

    There have been few studies evaluating the efficacy of knowledge-transfer methods for livestock owners in developing countries, and to the authors’ knowledge no published work is available that evaluates the effect of knowledge-transfer interventions on the education of working equid users. A cluster-randomised controlled trial (c-RCT) was used to evaluate and compare the effectiveness of three knowledge-transfer interventions on knowledge-change about equid health amongst rural Ethiopian w...

  6. Methods to Recruit Hard-to-Reach Groups: Comparing Two Chain Referral Sampling Methods of Recruiting Injecting Drug Users Across Nine Studies in Russia and Estonia

    OpenAIRE

    Platt, Lucy; Wall, Martin; Rhodes, Tim; Judd, Ali; Hickman, Matthew; Johnston, Lisa G.; Renton, Adrian; Bobrova, Natalia; Sarang, Anya

    2006-01-01

    Evidence suggests rapid diffusion of injecting drug use and associated outbreaks of HIV among injecting drug users (IDUs) in the Russian Federation and Eastern Europe. There remains a need for research among non-treatment and community-recruited samples of IDUs to better estimate the dynamics of HIV transmission and to improve treatment and health services access. We compare two sampling methodologies “respondent-driven sampling” (RDS) and chain referral sampling using “indigenous field...

  7. PCB/transformer techno-economic analysis model: User manual, volume 2

    Science.gov (United States)

    Plum, Martin M.; Geimer, Ray M.

    1989-02-01

    This model is designed to evaluate the economic viability of replacing or retrofilling a PCB transformer with one of numerous non-PCB transformer options. Replacement options include conventional, amorphous, amorphous liquid-filled, or amorphous liquid-filled high-performance transformers. The retrofill option is the process that removes and disposes of the PCB dielectric and replaces it with non-PCB dielectric. Depending on the data available, the skills of the user, and the intent of the analysis, there are three model options available to the user. For practical purposes, Level 1 requires the least input data from the user while Level 3 requires the most. This manual is designed for people who have minimal experience with Lotus 123 and are familiar with electric transformers. It covers system requirements, how to install the model on your system, how to get started, how to move around in the model, how to make changes in the model data, how to print information, how to save your work, and how to exit from the model.
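
    The economic comparison such a model performs can be sketched as a net-present-value calculation over the transformer's service life; the cash flows, lifetime and discount rate below are hypothetical placeholders, not values from the model.

    ```python
    def npv(cashflows, rate):
        # Present value of a list of yearly cash flows, year 0 first.
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

    # Hypothetical figures: replace has a high capital cost but lower annual
    # losses/maintenance; retrofill is cheaper up front but costlier to run.
    rate = 0.08
    replace   = [-120000.0] + [-2000.0] * 20
    retrofill = [-45000.0] + [-6500.0] * 20

    print(npv(replace, rate), npv(retrofill, rate))
    # The less negative NPV identifies the economically preferable option.
    ```

    With these assumed figures the retrofill option comes out ahead; different dose of capital versus running costs can flip the ranking, which is what a techno-economic model explores.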

  8. User Decisions in a (Partly) Digital World : Comparing Digital Piracy to Legal Alternatives for Film and Music

    DEFF Research Database (Denmark)

    Veitch, Rob; Constantiou, Ioanna

    2012-01-01

    Technologies enabling digital piracy have expanded the variety of options available to users when deciding how to access a product. As a result, access-mode decisions for film and music are broader than for other goods where the piracy option is not as prevalent. This paper presents a model of access-mode decisions for film and music which integrates elements of previous digital piracy models and expands upon them to reflect the decision’s complexity. We depict the access-mode decision as being influenced by the user’s product desire, price perceptions, perceived risks, internal regulators of behaviour, resources and legal availability. We test the model for film and music using causal data of access-mode decisions collected from students at two Danish universities. Our findings indicate that the economic considerations of price perception and legal availability are the most consistent factors in influencing the access-mode decision across different legal options. The paper concludes with an outline for future research.

  9. Factors Affecting Collective Action for Forest Fire Management: A Comparative Study of Community Forest User Groups in Central Siwalik, Nepal

    Science.gov (United States)

    Sapkota, Lok Mani; Shrestha, Rajendra Prasad; Jourdain, Damien; Shivakoti, Ganesh P.

    2015-01-01

    The attributes of social ecological systems affect the management of commons. Strengthening and enhancing social capital and the enforcement of rules and sanctions aid in the collective action of communities in forest fire management. Using a set of variables drawn from previous studies on the management of commons, we conducted a study across 20 community forest user groups in Central Siwalik, Nepal, by dividing the groups into two categories based on the type and level of their forest fire management response. Our study shows that the collective action in forest fire management is consistent with the collective actions in other community development activities. However, the effectiveness of collective action is primarily dependent on the complex interaction of various variables. We found that strong social capital, strong enforcement of rules and sanctions, and users' participation in crafting the rules were the major variables that strengthen collective action in forest fire management. Conversely, users' dependency on a daily wage and a lack of transparency were the variables that weaken collective action. In fire-prone forests such as the Siwalik, our results indicate that strengthening social capital and forming and enforcing forest fire management rules are important variables that encourage people to engage in collective action in fire management.

  10. Comparative study of analysis methods in biospeckle phenomenon

    Science.gov (United States)

    da Silva, Emerson Rodrigo; Muramatsu, Mikiya

    2008-04-01

    In this work we present a review of the main statistical properties of speckle patterns and carry out a comparative study of the methods most used for analysis and extraction of information from optical speckle grain. The first- and second-order space-time statistics are discussed from an overview perspective. The biospeckle phenomenon receives detailed attention, especially in its application to monitoring of activity in tissues. The main techniques used to obtain information from speckle patterns are presented, with special prominence given to the autocorrelation function, co-occurrence matrices, Fujii's method, Briers' contrast, and spatial and temporal contrast analysis (LASCA and LASTCA). An incipient method of analysis, based on the study of successive correlation contrasts, is introduced. Numerical simulations, using different probability density functions for the velocities of scatterers, were made with two objectives: to test the analysis methods and to support the interpretation of in vivo results. Vegetable and animal tissues are investigated, achieving the monitoring of senescence processes and vascularization maps on leaves, the monitoring of fungus-contaminated fruits, the mapping of activity in flowers and the analysis of healing in rats subjected to abdominal surgery. Experiments using the biospeckle phenomenon in microscopy are carried out. Finally, the potential of biospeckle as a diagnostic tool for chronic venous ulcers treated with low-intensity laser therapy is evaluated, and the best analysis methods for each kind of tissue are pointed out.
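
    The LASCA-style spatial contrast mentioned in the abstract can be sketched in a few lines: the speckle contrast K = sigma/mean is computed over a sliding window of the time-integrated image, with lower K indicating more motion blur and hence more activity. This is a generic sketch on toy data, not the authors' implementation.

    ```python
    def local_contrast(img, win=3):
        # Spatial speckle contrast K = sigma/mean over a sliding win x win
        # window (the quantity used in LASCA), for an image as list-of-lists.
        h, w = len(img), len(img[0])
        r = win // 2
        out = []
        for y in range(r, h - r):
            row = []
            for x in range(r, w - r):
                vals = [img[j][i] for j in range(y - r, y + r + 1)
                                  for i in range(x - r, x + r + 1)]
                m = sum(vals) / len(vals)
                var = sum((v - m) ** 2 for v in vals) / len(vals)
                row.append((var ** 0.5) / m if m else 0.0)
            out.append(row)
        return out

    flat = [[100.0] * 5 for _ in range(5)]            # blurred (active) region
    grainy = [[100.0, 10.0, 100.0, 10.0, 100.0],      # high-contrast static speckle
              [10.0, 100.0, 10.0, 100.0, 10.0]] * 2 + [[100.0] * 5]
    print(local_contrast(flat)[0][0], local_contrast(grainy)[0][0])
    ```

    The fully blurred patch yields K = 0 while the static speckle patch yields K near 1, which is the contrast scale LASCA maps into activity images.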

  11. Single and Multiple Hand Gesture Recognition Systems: A Comparative Analysis

    OpenAIRE

    Siddharth Rautaray; Manjusha Pandey

    2014-01-01

    With the evolution of higher computing speeds, efficient communication technologies, and advanced display techniques, legacy HCI techniques have become obsolete and are no longer helpful for accurate and fast flow of information in present-day computing devices. Hence, user-friendly human-machine interfaces for real-time human-computer interaction have to be designed and developed to make man-machine interaction more intuitive and user friendly. The vision based hand ...

  12. Comparative Analysis of Different Cryptosystems for Hierarchical Mobile IPv6-based Wireless Mesh Network

    Directory of Open Access Journals (Sweden)

    Ramanarayana Kandikattu

    2010-05-01

    Full Text Available Wireless Mesh Network (WMN) is advocated as the major supporting technology for the next-generation wireless Internet, satisfying the need for anywhere-anytime broadband Internet access. In order to support secure ubiquitous communications for mobile users, a WMN must have an efficient key setup procedure to secure control packets as well as data packets. In this paper we apply four different cryptosystems, namely: (1) RSA; (2) Elliptic Curve Digital Signature Algorithm (ECDSA); (3) Identity-Based Cryptography (IBC); and (4) Elliptic Curve Cryptography-Based Public Key Cryptosystem (ECCSCPKC), to secure Hierarchical Mobile IPv6 (HMIPv6) based WMN. We present detailed cost analysis and numerical results to compare these systems for their suitability to secure HMIPv6-based WMN.

  13. Development of the Graphical User Interface for the Fuel Assembly Bow Analysis

    International Nuclear Information System (INIS)

    KEPCO NF, Westinghouse and ENUSA jointly developed a new fuel assembly growth and bow computer code (SAVAN2D), a new fuel assembly analysis and performance model, and a new GUI (Graphical User Interface) for the prediction of in-core deformation behaviour of fuel assemblies. The SAVAN2D code can analyze fuel assembly growth and bow using fuel assembly design data and core conditions. In this paper, the development results and application areas of the SAVAN2D pre-processing and post-processing program are presented.

  14. User Centered Design as a Framework for Applying Conversation Analysis in Hearing Aid Consultations

    DEFF Research Database (Denmark)

    Egbert, Maria; Matthews, Ben

    2011-01-01

    Recurrent issues in applying CA results to change in institutional practices concern the degree to which the CA researcher is involved and which aspects of the change process CA researchers are involved in. This paper presents a methodology from innovation studies called User Centered Design (Buur and Bagger, 1999) and, more recently, Participatory Innovation (Buur and Matthews, 2008), which is uniquely compatible with conversation analysis. Designers following this approach study how a ‘user’ of goods or services interacts with products and other interaction partners in order to derive ideas for innovation. Although this methodological convergence of disciplines is rooted in different traditions, it augurs well for successful cooperation. This paper reports on such a collaboration carried out within a federally funded research center for innovation. We present principles of the interdisciplinary collaboration, as well as successes and pitfalls. In particular we focus on the role of conversation analysts, both from the perspective of the designers and from that of the conversation analysts. To illustrate this, we have selected a project on hearing aid fitting. Understanding the perspective of the users (the person with hearing loss and the hearing aid fitter) is imperative because the compliance rate for hearing aid use is staggeringly low. One of the barriers to hearing aid use lies in problematic encounters between the person with hearing loss and audiologists. Buur, J. and Matthews, B. (2008) ‘Participatory Innovation’, International Journal of Innovation Management, vol. 12, no. 3, pp. 255-273. Buur, J. and Bagger, K. (1999) ‘Replacing usability testing with user dialogue’, Communications of the ACM 42(5), pp. 63-66.

  15. Analysis of RNAseq datasets from a comparative infectious disease zebrafish model using GeneTiles bioinformatics.

    Science.gov (United States)

    Veneman, Wouter J; de Sonneville, Jan; van der Kolk, Kees-Jan; Ordas, Anita; Al-Ars, Zaid; Meijer, Annemarie H; Spaink, Herman P

    2015-03-01

    We present an RNA deep sequencing (RNAseq) analysis comparing the transcriptome responses of zebrafish larvae to infection with Staphylococcus epidermidis and Mycobacterium marinum bacteria. We show how our GeneTiles software can improve RNAseq analysis approaches by more confidently identifying a large set of markers upon infection with these bacteria. Currently, software programs such as Bowtie2 and Samtools are indispensable for the analysis of RNAseq data. However, these programs, which are designed for a LINUX environment, require dedicated programming skills and have no options for visualisation of the resulting mapped sequence reads. Especially with large data sets, this makes the analysis time consuming and difficult for non-expert users. We have applied the GeneTiles software to the analysis of previously published and newly obtained RNAseq datasets of our zebrafish infection model, and we have shown the applicability of this approach to published RNAseq datasets of other organisms by comparing our data with a published mammalian infection study. In addition, we have implemented the DEXSeq module in the GeneTiles software to identify genes, such as glucagon A, that are differentially spliced under infection conditions. In the analysis of our RNAseq data, this has made it possible to increase the size of the data sets that could be efficiently compared without using problem-dedicated programs, leading to quick identification of marker sets. This approach will therefore also be highly useful for transcriptome analyses of other organisms for which well-characterised genomes are available. PMID:25503064
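
    A common first-pass marker screen of the kind such RNAseq comparisons rely on is a per-gene log2 fold change between conditions; the gene names and read counts below are hypothetical, not taken from the zebrafish datasets or GeneTiles.

    ```python
    import math

    def log2_fold_changes(control, infected, pseudo=1.0):
        # Per-gene log2 fold change with a pseudocount to guard against
        # zero counts, a standard first-pass screen for infection markers.
        return {g: math.log2((infected[g] + pseudo) / (control[g] + pseudo))
                for g in control}

    # Hypothetical read counts per gene under control and infected conditions.
    control  = {"il1b": 10, "mmp9": 20, "actb1": 1000}
    infected = {"il1b": 640, "mmp9": 330, "actb1": 1050}

    lfc = log2_fold_changes(control, infected)
    markers = sorted(g for g, v in lfc.items() if abs(v) >= 2.0)
    print(markers)
    ```

    Genes passing a |log2 fold change| >= 2 cutoff (a four-fold change) are retained as candidate markers; in practice a statistical test and multiple-testing correction would follow.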

  16. Comparative Genomic Analysis of Primary Versus Metastatic Colorectal Carcinomas

    Science.gov (United States)

    Vakiani, Efsevia; Janakiraman, Manickam; Shen, Ronglai; Sinha, Rileen; Zeng, Zhaoshi; Shia, Jinru; Cercek, Andrea; Kemeny, Nancy; D'Angelica, Michael; Viale, Agnes; Heguy, Adriana; Paty, Philip; Chan, Timothy A.; Saltz, Leonard B.; Weiser, Martin; Solit, David B.

    2012-01-01

    Purpose To compare the mutational and copy number profiles of primary and metastatic colorectal carcinomas (CRCs) using both unpaired and paired samples derived from primary and metastatic disease sites. Patients and Methods We performed a multiplatform genomic analysis of 736 fresh-frozen CRC tumors from 613 patients. The cohort included 84 patients in whom tumor tissue from both primary and metastatic sites was available and 31 patients with pairs of metastases. Tumors were analyzed for mutations in the KRAS, NRAS, BRAF, PIK3CA, and TP53 genes, with discordant results between paired samples further investigated by analyzing formalin-fixed, paraffin-embedded tissue and/or by 454 sequencing. Copy number aberrations in primary tumors and matched metastases were analyzed by comparative genomic hybridization (CGH). Results TP53 mutations were more frequent in metastatic versus primary tumors (53.1% v 30.3%, respectively; P < .001), whereas mutational status in matched primary-metastasis pairs was highly concordant (> 90% concordance for all five genes). Clonality analysis of array CGH data suggested that multiple CRC primary tumors or treatment-associated effects were likely etiologies for mutational and/or copy number profile differences between primary tumors and metastases. Conclusion For determining RAS, BRAF, and PIK3CA mutational status, genotyping of the primary CRC is sufficient for most patients. Biopsy of a metastatic site should be considered in patients with a history of multiple primary carcinomas and, in the case of TP53, for patients who have undergone interval treatment with radiation or cytotoxic chemotherapies. PMID:22665543

  17. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperatures and pressures in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, coatings of the pipe, free spans, etc. The buckling initiation force is sensitive to small changes in any initial geometric out-of-straightness, thus the modelling of the as-laid state of the pipeline is an important part of the design process. Recently some dedicated finite element programs have been created, making modelling of the offshore environment more convenient than has been the case with general-purpose finite element software. The present paper aims to compare thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general-purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement for buckling in the elastic range, and in the conclusions further comparative analyses with sensitivity cases are recommended. (author)

  18. 2004/2008 labour market information comparative analysis report

    International Nuclear Information System (INIS)

    The electricity sector has entered into a phase of both challenges and opportunities. Challenges include workforce retirement, labour shortages, and increased competition from other employers to attract and retain the skilled people required to deliver on the increasing demand for electricity in Canada. The electricity sector in Canada is also moving into a new phase, whereby much of the existing infrastructure is either due for significant upgrades, or complete replacement. The increasing demand for electricity means that increased investment and capital expenditure will need to be put toward building new infrastructure altogether. The opportunities for the electricity industry will lie in its ability to effectively and efficiently react to these challenges. The purpose of this report was to provide employers and stakeholders in the sector with relevant and current trend data to help them make appropriate policy and human resource decisions. The report presented a comparative analysis of a 2004 Canadian Electricity Association employer survey with a 2008 Electricity Sector Council employer survey. The comparative analysis highlighted trends and changes that emerged between the 2004 and 2008 studies. Specific topics that were addressed included overall employment trends; employment diversity in the sector; age of non-support staff; recruitment; and retirements and pension eligibility. Recommendations were also offered. It was concluded that the electricity sector could benefit greatly from implementing on-going recruitment campaigns. refs., tabs., figs

  19. Comparative risk evaluation for siting a waste analysis laboratory

    International Nuclear Information System (INIS)

    With the US Department of Energy (DOE) focus on environmental restoration and waste management at DOE nuclear waste sites, a significant increase is forecast in the quantity of transuranic (TRU) mixed waste samples that require detailed environmental analysis. Under the direction of the laboratory management branch (EM-532), Westinghouse Hanford Company (Westinghouse Hanford) was commissioned to lead the preparation of an engineering study for site selection of the proposed $300-million DOE Waste Analysis Laboratory (WAL). This assessment provides the DOE with important comparative risk information and data for the site selection process. The six DOE sites under consideration are the Hanford Site (HANF), Idaho National Engineering Laboratory (INEL), Los Alamos National Laboratory (LANL), Savannah River Site (SRS), Rocky Flats Plant (RFP), and Oak Ridge National Laboratory (ORNL). The study evaluates relative risk contributions from transportation, in terms of severe transportation accidents per year, and from external events, in terms of the respective frequency of exceedance. Detailed calculations and relevant information are documented in report WHC-SD-WM-ES-190. This comparative risk assessment provides objective engineering evidence that HANF is the most desirable location for the proposed WAL. The favorable safety data for HANF can be translated into less risk for identical construction, or less cost for achieving the same level of safety.

  20. Comparative Analysis of Congestion Control Algorithms Using ns-2

    Directory of Open Access Journals (Sweden)

    Sanjeev Patel

    2011-09-01

    Full Text Available In order to curtail the escalating packet loss rates caused by an exponential increase in network traffic, active queue management techniques such as Random Early Detection (RED) have come into the picture. Flow Random Early Drop (FRED) keeps state based on the instantaneous queue occupancy of a given flow. FRED protects fragile flows by deterministically accepting flows from low-bandwidth connections, and fixes several shortcomings of RED by computing queue length during both arrival and departure of the packet. Stochastic Fair Queuing (SFQ) ensures fair access to network resources and prevents a bursty flow from consuming more than its fair share. In the case of Random Exponential Marking (REM), the key idea is to decouple the congestion measure from the performance measure (loss, queue length or delay). Stabilized RED (SRED) is another approach to detecting non-responsive flows. In this paper, we present a comparative analysis of throughput, delay and queue length for the congestion control algorithms RED, SFQ and REM. We also include a comparative analysis of loss rate for these algorithms at different bandwidths.
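
    The RED mechanism compared in the paper can be sketched directly from its definition: an exponentially weighted moving average of the queue length, and a drop/mark probability that rises linearly between two thresholds. The threshold and weight values below are illustrative defaults, not the paper's ns-2 simulation settings.

    ```python
    def red_drop_probability(avg_queue, min_th=5.0, max_th=15.0, max_p=0.1):
        # Classic RED: never drop below min_th, drop with probability 1 at or
        # above max_th, and rise linearly from 0 to max_p in between.
        if avg_queue < min_th:
            return 0.0
        if avg_queue >= max_th:
            return 1.0
        return max_p * (avg_queue - min_th) / (max_th - min_th)

    def ewma(avg, sample, weight=0.002):
        # RED tracks a smoothed average queue length, not the instantaneous one,
        # so short bursts do not trigger drops.
        return (1.0 - weight) * avg + weight * sample

    print([red_drop_probability(q) for q in (2.0, 10.0, 20.0)])
    ```

    The small EWMA weight is what lets SFQ-style bursty flows pass briefly through a RED queue without loss, one of the behavioural differences such comparative studies measure.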

  1. Relationship-level analysis of drug users' anticipated changes in risk behavior following HIV vaccination.

    Science.gov (United States)

    Young, April M; Halgin, Daniel S; Havens, Jennifer R

    2015-08-01

    Formative research into the behavioral factors surrounding HIV vaccine uptake is becoming increasingly important as progress is made in HIV vaccine development. Given that the first vaccines on the market are likely to be partially effective, risk compensation (i.e., increased risk behavior following vaccination) may present a concern. This study characterized the relationships in which HIV vaccine-related risk compensation is most likely to occur, using dyadic data collected from people who use drugs, a high-risk group markedly underrepresented in the extant literature. Data were collected from 433 drug users enrolled in a longitudinal study in the USA. Respondents were asked to provide the first name and last initial of individuals with whom they had injected drugs and/or had sex during the past six months. For each partner, respondents reported their likelihood of increasing risk behavior if they and/or their partner received an HIV vaccine. Using generalized linear mixed models, relationship-level correlates of risk compensation were examined. In bivariate analysis, risk compensation was more likely to occur between partners who had known each other for a shorter time (odds ratio [OR] = 0.95, 95% confidence interval [CI]: 0.90-0.99, p = 0.028) and between those who inject drugs and have sex together (OR = 2.52, CI: 1.05-6.04, p = 0.039). Of relationships involving risk compensation, 37% were between partners who had known each other for a year or less, compared to only 13% of relationships not involving risk compensation. Adjusting for other variables, relationship duration (OR = 0.95, CI: 0.90-1.00, p = 0.033) remained associated with risk compensation intent. These analyses suggest that risk compensation may be more likely to occur in less established relationships and between partners engaging in more than one type of risk behavior.
These data provide further support for the need to expand measures of risk compensation in HIV vaccine preparedness studies to assess not only if people will change their behavior, but also with whom. PMID:25730519
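
    The odds ratios above come from logistic-type models, where an OR is the exponential of a model coefficient and its Wald confidence interval is obtained by exponentiating the coefficient plus or minus z standard errors. A minimal sketch of that conversion (not the authors' code; the coefficient and standard error below are illustrative):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Turn a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% Wald confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Illustrative: a coefficient of log(0.95) per year of relationship
# duration corresponds to OR = 0.95, i.e. risk compensation is less
# likely in longer-established partnerships.
or_, lo, hi = odds_ratio_ci(math.log(0.95), se=0.025)
```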

  2. Massive comparative genomic analysis reveals convergent evolution of specialized bacteria

    Directory of Open Access Journals (Sweden)

    Raoult Didier

    2009-04-01

    Full Text Available Abstract Background Genome size and gene content in bacteria are associated with their lifestyles. Obligate intracellular bacteria (i.e., mutualists and parasites) have small genomes derived from larger free-living bacterial ancestors; however, the different steps of bacterial specialization from free-living to intracellular lifestyle have not been studied comprehensively. The growing number of available sequenced genomes makes it possible to perform a statistical comparative analysis of 317 genomes from bacteria with different lifestyles. Results Compared to free-living bacteria, host-dependent bacteria exhibit fewer rRNA genes, more split rRNA operons and fewer transcriptional regulators, linked to slower growth rates. We found a function-dependent and non-random loss of the same 100 orthologous genes in all obligate intracellular bacteria. Thus, we showed that obligate intracellular bacteria from different phyla are converging according to their lifestyle. Their specialization is an irreversible phenomenon characterized by translation modification and massive gene loss, including the loss of transcriptional regulators. Although both mutualists and parasites converge by genome reduction, these obligate intracellular bacteria have lost distinct sets of genes in the context of their specific host associations: mutualists have significantly more genes that enable nutrient provisioning, whereas parasites have genes that encode Types II, IV, and VI secretion pathways. Conclusion Our findings suggest that gene loss, rather than acquisition of virulence factors, has been a driving force in the adaptation of parasites to eukaryotic cells. This comparative genomic analysis helps to explore the strategies by which obligate intracellular genomes specialize to particular host associations, and advances our knowledge about the mechanisms of bacterial evolution. Reviewers This article was reviewed by Eugene V. Koonin, Nicolas Galtier, and Jeremy Selengut.

  3. Comparative analysis of solid waste management in 20 cities.

    Science.gov (United States)

    Wilson, David C; Rodic, Ljiljana; Scheinberg, Anne; Velis, Costas A; Alabaster, Graham

    2012-03-01

    This paper uses the 'lens' of integrated and sustainable waste management (ISWM) to analyse the new data set compiled on 20 cities in six continents for the UN-Habitat flagship publication Solid Waste Management in the World's Cities. The comparative analysis looks first at waste generation rates and waste composition data. A process flow diagram is prepared for each city, as a powerful tool for representing the solid waste system as a whole in a comprehensive but concise way. Benchmark indicators are presented and compared for the three key physical components/drivers: public health and collection; environment and disposal; and resource recovery--and for three governance strategies required to deliver a well-functioning ISWM system: inclusivity; financial sustainability; and sound institutions and pro-active policies. Key insights include the variety and diversity of successful models - there is no 'one size fits all'; the necessity of good, reliable data; the importance of focusing on governance as well as technology; and the need to build on the existing strengths of the city. An example of the latter is the critical role of the informal sector in the cities in many developing countries: it not only delivers recycling rates that are comparable with modern Western systems, but also saves the city authorities millions of dollars in avoided waste collection and disposal costs. This provides the opportunity for win-win solutions, so long as the related wider challenges can be addressed. PMID:22407700

  4. Rice fortification: a comparative analysis in mandated settings.

    Science.gov (United States)

    Forsman, Carmen; Milani, Peiman; Schondebare, Jill A; Matthias, Dipika; Guyondet, Christophe

    2014-09-01

    Legal mandates can play an important role in the success of rice fortification programs that involve the private sector. However, merely enacting mandatory legislation does not guarantee success; it requires a coordinated, multidimensional cross-sector effort that addresses stewardship, develops an appropriate rice fortification technology, enables sustainable production and distribution channels through a range of private-sector players, ensures quality, generates consumer demand, and monitors progress. Furthermore, economic sustainability must be built into the supply chain and distribution network to enable the program to outlast government administrations and/or time-limited funding. Hence, mandates can serve as valuable long-term enablers of cross-sector mobilization and collaboration and as catalysts of civil society engagement in and ownership of fortification programs. This paper compares the rice fortification experiences of Costa Rica and the Philippines--two countries with mandates, yet distinctly different industry landscapes. Costa Rica has achieved national success through strong government stewardship and active market development--key elements of success regardless of industry structure. With a comparatively more diffuse rice industry structure, the Philippines has also had success in limited geographies where key stakeholders have played an active role in market development. A comparative analysis provides lessons that may be relevant to other rice fortification programs. PMID:24913356

  5. Mycobacterial species as case-study of comparative genome analysis

    DEFF Research Database (Denmark)

    Zakham, F.; Belayachi, L.

    2011-01-01

    The genus Mycobacterium comprises more than 120 species, including important human pathogens that cause major public health problems and illnesses. Further, with more than 100 genome sequences from this genus, comparative genome analysis can provide new insights for better understanding the evolutionary events of these species and improving drugs, vaccines, and diagnostic tools for controlling Mycobacterial diseases. In the present study we aim to outline a comparative genome analysis of fourteen Mycobacterial genomes: M. avium subsp. paratuberculosis K-10, M. bovis AF2122/97, M. bovis BCG str. Pasteur 1173P2, M. leprae Br4923, M. marinum M, M. sp. KMS, M. sp. MCS, M. tuberculosis CDC1551, M. tuberculosis F11, M. tuberculosis H37Ra, M. tuberculosis H37Rv, M. tuberculosis KZN 1435, M. ulcerans Agy99, and M. vanbaalenii PYR-1. For this purpose a comparison has been done based on genome length, GC content, and number of genes in different databases (GenBank, RefSeq, and Prodigal). The BLAST matrix of these genomes has been generated to summarize the similarity between species in a simple scheme. As a result of multiple genome analysis, the pan and core genomes have been defined for twelve Mycobacterial species. We have also introduced the genome atlas of the reference strain M. tuberculosis H37Rv, which gives a good overview of this genome. Finally, to examine the phylogenetic relationships among these bacteria, a phylogenetic tree has been constructed from the 16S rRNA gene for tuberculous and non-tuberculous Mycobacteria to understand the evolutionary events of these species.
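
    Two of the comparisons mentioned, GC content and pan/core genome definition, reduce to simple computations once genes have been grouped into families. A minimal sketch with toy data (real analyses run on whole genomes and orthology tables; the gene names below are only placeholders):

```python
def gc_content(seq):
    """Fraction of G and C bases in a genome sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def pan_core(gene_sets):
    """Pan genome = union of gene families across genomes;
    core genome = families shared by every genome."""
    pan = set().union(*gene_sets)
    core = set.intersection(*(set(g) for g in gene_sets))
    return pan, core

# Toy example with three genomes' gene-family sets
pan, core = pan_core([{"esxA", "katG", "rpoB"},
                      {"esxA", "rpoB", "pks12"},
                      {"esxA", "rpoB"}])
```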

  6. Mycobacterial species as case-study of comparative genome analysis.

    Science.gov (United States)

    Zakham, F; Belayachi, L; Ussery, D; Akrim, M; Benjouad, A; El Aouad, R; Ennaji, M M

    2011-01-01

    The genus Mycobacterium comprises more than 120 species, including important human pathogens that cause major public health problems and illnesses. Further, with more than 100 genome sequences from this genus, comparative genome analysis can provide new insights for better understanding the evolutionary events of these species and improving drugs, vaccines, and diagnostic tools for controlling Mycobacterial diseases. In the present study we aim to outline a comparative genome analysis of fourteen Mycobacterial genomes: M. avium subsp. paratuberculosis K-10, M. bovis AF2122/97, M. bovis BCG str. Pasteur 1173P2, M. leprae Br4923, M. marinum M, M. sp. KMS, M. sp. MCS, M. tuberculosis CDC1551, M. tuberculosis F11, M. tuberculosis H37Ra, M. tuberculosis H37Rv, M. tuberculosis KZN 1435, M. ulcerans Agy99, and M. vanbaalenii PYR-1. For this purpose a comparison has been done based on genome length, GC content, and number of genes in different databases (GenBank, RefSeq, and Prodigal). The BLAST matrix of these genomes has been generated to summarize the similarity between species in a simple scheme. As a result of multiple genome analysis, the pan and core genomes have been defined for twelve Mycobacterial species. We have also introduced the genome atlas of the reference strain M. tuberculosis H37Rv, which gives a good overview of this genome. Finally, to examine the phylogenetic relationships among these bacteria, a phylogenetic tree has been constructed from the 16S rRNA gene for tuberculous and non-tuberculous Mycobacteria to understand the evolutionary events of these species. PMID:21396338

  7. Extreme storm surges: a comparative study of frequency analysis approaches

    Science.gov (United States)

    Hamdi, Y.; Bardet, L.; Duluc, C.-M.; Rebour, V.

    2014-08-01

    In France, nuclear facilities were designed around very low probabilities of failure. Nevertheless, some extreme climatic events have given rise to exceptional observed surges (outliers) much larger than the other observations, and have clearly illustrated the potential to underestimate the extreme water levels calculated with current statistical methods. The objective of the present work is to conduct a comparative study of three approaches to extreme value analysis: the annual maxima (AM), the peaks-over-threshold (POT) and the r-largest order statistics (r-LOS) methods. These methods are illustrated in a real analysis case study. All data sets were screened for outliers. Non-parametric tests for randomness, homogeneity and stationarity of the time series were used. The shape and scale parameter stability plots, the mean excess residual life plot and the stability of the standard errors of return levels were used to select optimal thresholds and r values for the POT and r-LOS methods, respectively. The comparison of methods was based on (i) the degree of uncertainty, (ii) adequacy criteria and tests, and (iii) visual inspection. It was found that the r-LOS and POT methods reduced the uncertainty on the distribution parameters and return level estimates, and systematically gave values of the 100- and 500-year return levels smaller than those estimated with the AM method. Results also showed that none of the compared methods allowed a good fit at the right tail of the distribution in the presence of outliers. As a perspective, the use of historical information was proposed in order to increase the representativeness of outliers in the data sets. The findings are of practical relevance, not only to nuclear energy operators in France for applications in storm surge hazard analysis and flood management, but also for the optimal planning and design of facilities to withstand extreme environmental conditions with an appropriate level of risk.
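
    For the AM approach, a common minimal baseline is to fit a Gumbel distribution to the annual maxima and read off return levels from its quantile function. The sketch below uses a method-of-moments fit for brevity (the paper's actual estimation machinery is more elaborate), and the surge values are invented:

```python
import math

EULER_GAMMA = 0.5772156649015329

def gumbel_fit_moments(annual_maxima):
    """Method-of-moments Gumbel fit: scale from the sample variance,
    location from the sample mean."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    loc = mean - EULER_GAMMA * scale
    return loc, scale

def return_level(loc, scale, T):
    """T-year return level: the quantile exceeded with probability 1/T per year."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

surges = [0.9, 1.1, 1.3, 1.0, 1.6, 1.2, 2.4, 1.1, 1.4, 1.0]  # invented, metres
loc, scale = gumbel_fit_moments(surges)
```

    A single outlier (here the 2.4 m surge) inflates the fitted scale and hence the return levels, which is exactly the sensitivity the paper examines.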

  8. In silico comparative genomic analysis of GABAA receptor transcriptional regulation

    Directory of Open Access Journals (Sweden)

    Joyce Christopher J

    2007-06-01

    Full Text Available Abstract Background Subunits of the GABAA receptor exhibit diverse temporal and spatial expression patterns. In silico comparative analysis was used to predict transcriptional regulatory features in individual mammalian GABAA receptor subunit genes, and to identify potential transcriptional regulatory components involved in the coordinate regulation of the GABAA receptor gene clusters. Results Previously unreported putative promoters were identified for the β2, γ1, γ3, ε, θ and π subunit genes. Putative core elements and proximal transcriptional factors were identified within these predicted promoters, and within the experimentally determined promoters of other subunit genes. Conserved intergenic regions of sequence in the mammalian GABAA receptor gene cluster comprising the α1, γ2, β2 and α6 subunits were identified as potential long-range transcriptional regulatory components involved in the coordinate regulation of these genes. A region of predicted DNase I hypersensitive sites within the cluster may contain transcriptional regulatory features coordinating gene expression. A novel model is proposed for the coordinate control of the gene cluster and parallel expression of the α1 and β2 subunits, based upon the selective action of putative Scaffold/Matrix Attachment Regions (S/MARs). Conclusion The putative regulatory features identified by genomic analysis of GABAA receptor genes were substantiated by cross-species comparative analysis and now require experimental verification. The proposed model for the coordinate regulation of genes in the cluster accounts for the head-to-head orientation and parallel expression of the α1 and β2 subunit genes, and for the disruption of transcription caused by insertion of a neomycin gene in the close vicinity of the α6 gene, which is proximal to a putative critical S/MAR.

  9. ClimatePipes: User-Friendly Data Access, Manipulation, Analysis & Visualization of Community Climate Models

    Science.gov (United States)

    Chaudhary, A.; DeMarle, D.; Burnett, B.; Harris, C.; Silva, W.; Osmari, D.; Geveci, B.; Silva, C.; Doutriaux, C.; Williams, D. N.

    2013-12-01

    The impact of climate change will resonate through a broad range of fields including public health, infrastructure, water resources, and many others. Long-term coordinated planning, funding, and action are required for climate change adaptation and mitigation. Unfortunately, widespread use of climate data (simulated and observed) in non-climate science communities is impeded by factors such as large data size, lack of adequate metadata, poor documentation, and lack of sufficient computational and visualization resources. We present ClimatePipes to address many of these challenges by creating an open source platform that provides state-of-the-art, user-friendly data access, analysis, and visualization for climate and other relevant geospatial datasets, making the climate data available to non-researchers, decision-makers, and other stakeholders. The overarching goals of ClimatePipes are: - Enable users to explore real-world questions related to climate change. - Provide tools for data access, analysis, and visualization. - Facilitate collaboration by enabling users to share datasets, workflows, and visualizations. ClimatePipes uses a web-based application platform for its widespread support on mainstream operating systems, ease-of-use, and inherent collaboration support. The front-end of ClimatePipes uses HTML5 (WebGL, Canvas2D, CSS3) to deliver state-of-the-art visualization and to provide a best-in-class user experience. The back-end of ClimatePipes is built around Python using the Visualization Toolkit (VTK, http://vtk.org), Climate Data Analysis Tools (CDAT, http://uv-cdat.llnl.gov), and other climate and geospatial data processing tools such as GDAL and PROJ4. ClimatePipes provides a web interface to query and access data from remote sources (such as ESGF). The figure shows a climate data layer from ESGF on top of a map data layer from OpenStreetMap.
The ClimatePipes workflow editor provides flexibility and fine grained control, and uses the VisTrails (http://www.vistrails.org) workflow engine in the backend.

  10. Appropriateness of antibiotic treatment in intravenous drug users, a retrospective analysis

    Directory of Open Access Journals (Sweden)

    Fluckiger Ursula

    2008-04-01

    Full Text Available Abstract Background Infectious disease is often the reason for intravenous drug users being seen in a clinical setting. The objective of this study was to evaluate the appropriateness of treatment and outcomes for this patient population in a hospital setting. Methods Retrospective study of all intravenous drug users hospitalized for treatment of infectious diseases and seen by infectious diseases specialists from 1/2001 to 12/2006 at a university hospital. Treatment was administered according to guidelines when possible, or according to an alternative treatment program for patients for whom adherence to standard protocols was not possible. Outcomes were defined with respect to appropriateness of treatment, hospital readmission, relapse and mortality rates. For statistical analysis, adjustment for multiple hospitalizations of individual patients was made by using a generalized estimating equation. Results The total number of hospitalizations for infectious diseases was 344 among 216 intravenous drug users. Skin and soft tissue infections (n = 129, 37.5% of hospitalizations), pneumonia (n = 75, 21.8%) and endocarditis (n = 54, 15.7%) were most prevalent. Multiple infections were present in 25%. Treatment was according to standard guidelines for 78.5%, according to an alternative recommended program for 11.3%, and not according to guidelines or to the infectious diseases specialist's advice for 10.2% of hospitalizations. Psychiatric disorders had a significant negative impact on compliance (compliance problems in 19.8% of hospitalizations) in multiple logistic regression analysis (OR = 2.4, CI 1.1–5.1, p = 0.03). The overall readmission rate and relapse rate within 30 days were 13.7% and 3.8%, respectively. Both non-compliant patient behavior (OR = 3.7, CI 1.3–10.8, p = 0.02) and non-adherence to treatment guidelines (OR = 3.3, CI 1.1–9.7, p = 0.03) were associated with a significant increase in the relapse rate in univariate analysis. In 590 person-years of follow-up, 24.6% of the patients died: 6.4% died during hospitalization (1.2% infection-related) and 13.6% of patients died after discharge. Conclusion Appropriate antibiotic therapy according to standard guidelines in hospitalized intravenous drug users is generally practicable and successful. In a minority, alternative treatments may be indicated, although they are associated with a higher risk of relapse.

  11. Multiple comparative studies of Green Supply Chain Management : Pressures analysis

    DEFF Research Database (Denmark)

    Xu, Lihui; Mathiyazhagan, K.

    2013-01-01

    Environmental sustainability is of great concern among world organizations and enterprises due to recent trends in global warming. Many developed nations have put in place stricter environmental regulations. Industries in such nations have established full-fledged systems to adopt environment-friendly operation strategies to lower their overall carbon footprint. Currently, there is increased awareness among customers, even in developing countries, about eco-friendly manufacturing solutions. Multi-national firms have identified economies of developed nations as a potential market for their products. Such organizations in developing countries like India and China are under pressure to adopt green concepts in supply chain operations to compete in the market and satisfy their customers' increasing needs. This paper offers a comparative study of pressures that impact the adoption of Green Supply Chain Management (GSCM). Thirty-two pressures are identified from extensive literature reviews and classified into five distinct groups based on their similarities. A detailed questionnaire was prepared and circulated among industries in various sectors. Industries were requested through this survey to rate the impact of each pressure. Two independent hypotheses were formulated from the literature to test the nature of the impact and the differences affecting Indian industries. Statistical data analysis was performed using one-way (single-factor) Analysis of Variance (ANOVA), followed by pair-wise comparison of means using Tukey's test. The analysis was performed for different sectors and different scales of production categories. The results and their implications are also discussed. © 2013 Elsevier B.V. All rights reserved.
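
    The one-way ANOVA used here reduces to comparing between-group variance against within-group variance. A minimal illustration of the F statistic with toy ratings (not the survey data):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way (single-factor) ANOVA:
    mean square between groups over mean square within groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

    A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that at least one group mean differs; a pair-wise procedure such as Tukey's test then locates which pairs differ.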

  12. White matter degeneration in schizophrenia: a comparative diffusion tensor analysis

    Science.gov (United States)

    Ingalhalikar, Madhura A.; Andreasen, Nancy C.; Kim, Jinsuh; Alexander, Andrew L.; Magnotta, Vincent A.

    2010-03-01

    Schizophrenia is a serious and disabling mental disorder. Diffusion tensor imaging (DTI) studies performed on schizophrenia have demonstrated white matter degeneration, either due to loss of myelination or deterioration of fiber tracts, although the areas where the changes occur vary across studies. Most population-based studies analyze the changes in schizophrenia using scalar indices computed from the diffusion tensor, such as fractional anisotropy (FA) and relative anisotropy (RA). These scalar measures may not capture the complete information from the diffusion tensor. In this paper we have applied the RADTI method to a group of 9 controls and 9 patients with schizophrenia. The RADTI method converts the tensors to log-Euclidean space, where a linear regression model is applied and hypothesis testing is performed between the control and patient groups. Results show that there is a significant difference in the anisotropy between patients and controls, especially in parts of the forceps minor, superior corona radiata, anterior limb of the internal capsule and genu of the corpus callosum. To check whether the tensor analysis gives a better picture of the changes in anisotropy, we compared the results with voxelwise FA analysis as well as voxelwise geodesic anisotropy (GA) analysis.
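
    For reference, the scalar FA index mentioned above is computed from the eigenvalues of the diffusion tensor; it is precisely this reduction to one number that can discard directional information the full-tensor (RADTI) analysis retains. The standard formula in plain Python:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the diffusion tensor eigenvalues: 0 for perfectly
    isotropic diffusion, approaching 1 for strongly directional
    (anisotropic) diffusion along a single axis."""
    mean = (l1 + l2 + l3) / 3.0
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)
```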

  13. Security analysis and enhancements of an effective biometric-based remote user authentication scheme using smart cards.

    Science.gov (United States)

    An, Younghwa

    2012-01-01

    Recently, many biometrics-based user authentication schemes using smart cards have been proposed to improve on the security weaknesses in user authentication systems. In 2011, Das proposed an efficient biometric-based remote user authentication scheme using smart cards that can provide strong authentication and mutual authentication. In this paper, we analyze the security of Das's authentication scheme and show that it is still insecure against various attacks. We also propose an enhanced scheme that removes these security problems of Das's authentication scheme, even if the secret information stored in the smart card is revealed to an attacker. The security analysis shows that the enhanced scheme is secure against the user impersonation attack, the server masquerading attack, the password guessing attack, and the insider attack, and provides mutual authentication between the user and the server. PMID:22899887
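
    The scheme itself is specified in the paper; as a generic illustration of the mutual-authentication property being analyzed (and emphatically not of Das's protocol or the enhanced scheme), a keyed challenge-response exchange looks like the sketch below, where each side proves knowledge of a shared secret without transmitting it:

```python
import hashlib
import hmac
import os

def respond(secret: bytes, challenge: bytes) -> bytes:
    """Prove knowledge of the shared secret for a fresh challenge."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def mutual_auth(user_secret: bytes, server_secret: bytes) -> bool:
    """Both directions must verify: the server checks the user's
    response and the user checks the server's response."""
    server_challenge = os.urandom(16)  # sent server -> user
    user_challenge = os.urandom(16)    # sent user -> server
    user_ok = hmac.compare_digest(
        respond(user_secret, server_challenge),
        respond(server_secret, server_challenge))
    server_ok = hmac.compare_digest(
        respond(server_secret, user_challenge),
        respond(user_secret, user_challenge))
    return user_ok and server_ok
```

    Fresh random challenges on both sides are what defeat simple replay, impersonation and masquerading attempts of the kind the paper analyzes.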

  14. Three looks at users: a comparison of methods for studying digital library use. User studies, Digital libraries, Digital music libraries, Music, Information use, Information science, Contextual inquiry, Contextual design, User research, Questionnaires, Log file analysis

    Directory of Open Access Journals (Sweden)

    Mark Notess

    2004-01-01

    Full Text Available Compares three user research methods for studying real-world digital library usage within the context of the Variations and Variations2 digital music libraries at Indiana University. After a brief description of both digital libraries, each method is described and illustrated with findings from the studies. User satisfaction questionnaires were used in two studies, one of Variations (n=30) and the other of Variations2 (n=12). Second, session activity log files were examined for 175 Variations2 sessions using both quantitative and qualitative methods. The third method, contextual inquiry, is illustrated with results from field observations of four voice students' information usage patterns. The three methods are compared in terms of expertise required; time required to set up, conduct, and analyse resulting data; and the benefits derived. Further benefits are achieved with a mixed-methods approach, combining the strengths of the methods to answer questions lingering as a result of other methods.

  15. LISA package user guide. Part III: SPOP (Statistical POst Processor). Uncertainty and sensitivity analysis for model output. Program description and user guide

    International Nuclear Information System (INIS)

    This manual is subdivided into three parts. In the third part, the SPOP (Statistical POst Processor) code is described as a tool to perform uncertainty and sensitivity analyses on the output of a user-implemented model. It has been developed at the Joint Research Centre of Ispra as part of the LISA package. SPOP performs Sensitivity Analysis (SA) and Uncertainty Analysis (UA) on a sample output from a Monte Carlo simulation. The sample is generated by the user and contains values of the output variable (in the form of a time series) and values of the input variables for a set of different simulations (runs), which are realised by varying the model input parameters. The user may generate the Monte Carlo sample with the PREP pre-processor, another module of the LISA package. The SPOP code is written entirely in FORTRAN 77 using structured programming. Among the tasks performed by the code are the computation of Tchebycheff and Kolmogorov confidence bounds on the output variable (UA), and the use of effective non-parametric statistics to rank the influence of model input parameters (SA). The statistics employed are described in the present manual. 19 refs., 16 figs., 2 tabs. Note: This Part III is a revised version of the previous EUR report N.12700EN (1990).
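
    A typical non-parametric influence measure of the kind used for such sensitivity analysis is a rank correlation between each input parameter's sampled values and the model output (the manual documents the exact statistics SPOP employs; Spearman's rho below is a common stand-in, not necessarily SPOP's choice):

```python
def ranks(values):
    """Rank data (1 = smallest), averaging tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # average rank of the tied run
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation between an input-parameter sample and
    the model output; a large |rho| flags an influential parameter."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

    Ranking the inputs by |rho| then gives the SA ordering of parameter influence.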

  16. ANALYSIS OF INTERNET TRAFFIC IN EDUCATIONAL NETWORK BASED ON USERS’ PREFERENCES

    Directory of Open Access Journals (Sweden)

    Mustafa M.H. Ibrahim

    2014-01-01

    Full Text Available The demand for Internet services and network resources in educational networks is increasing rapidly. Specifically, the revolution of Web 2.0, also referred to as the Read-Write Web, has changed the way information is exchanged and distributed. Although Web 2.0 has gained attraction in all sectors of the education industry, it results in high traffic loads on networks, which often leads to Internet users' dissatisfaction. Therefore, analyzing Internet traffic becomes an urgent need, to provide high-quality service and to monitor bandwidth usage. In this study, we focus on analyzing the Internet traffic on the Universiti Utara Malaysia (UUM) main campus. We performed measurement analysis of application-level characteristics based on users' preferences. A total of three methodological steps were carried out to meet the objective of this study, namely data collection, data analysis and data presentation. The findings show that social networks are the most visited web applications in UUM. These findings facilitate the enhancement of educational network performance and Internet bandwidth strategies.

  17. Comparative Genomics and Transcriptomic Analysis of Mycobacterium Kansasii

    KAUST Repository

    Alzahid, Yara

    2014-04-01

    The group of Mycobacteria is one of the most intensively studied bacterial taxa, as they cause two historical and worldwide known diseases: leprosy and tuberculosis. Mycobacteria not identified as members of the tuberculosis or leprosy complexes are referred to as 'environmental mycobacteria' or nontuberculous mycobacteria (NTM). Mycobacterium kansasii (M. kansasii) is one of the most frequent NTM pathogens, as it causes pulmonary disease in immuno-competent patients and pulmonary and disseminated disease in patients with various immuno-deficiencies. Five subtypes of this bacterium have been documented by different molecular typing methods, showing that type I causes tuberculosis-like disease in healthy individuals, and type II in immune-compromised individuals. The remaining types are considered environmental and thereby not disease-causing. The aim of this project was to conduct a comparative genomic study of M. kansasii types I-V and to investigate the gene expression levels of those types. Various comparative genomic analyses provided genomic evidence for why M. kansasii type I is considered pathogenic, focusing on three key elements involved in the virulence of Mycobacteria: the ESX secretion system, Phospholipase C (plcB) and the Mammalian cell entry (Mce) operons. The results showed the lack of the espA operon in types II-V, which renders the ESX-1 operon dysfunctional, as espA is one of the key factors that control this secretion system. Gene expression analysis showed this operon to be deleted in types II, III and IV. Furthermore, plcB was found to be truncated in types III and IV. Analysis of the Mce operons (1-4) showed that the mce-1 operon is duplicated, mce-2 is absent, and mce-3 and mce-4 are present in one copy in M. kansasii types I-V. Gene expression profiles of types I-IV showed that the secreted proteins of ESX-1 were slightly upregulated in types II-IV when compared to type I, and the secreted forms of ESX-5 were highly downregulated in the same types. Differentially expressed genes in types II-IV were also evaluated and validated by qPCR for selected genes. This study gave a general view of the genome of this bacterium and its types, highlighted some differing aspects of its subtypes, and was supplemented by gene expression data.

  18. Implementation and evaluation of an interactive user interface for a clinical image analysis workstation

    Science.gov (United States)

    Ratib, Osman M.; Huang, H. K.

    1990-05-01

    Recent developments in digital imaging and Picture Archiving and Communication Systems (PACS) allow physicians and radiologists to assess radiographic images directly in digital form through imaging workstations. The development of medical workstations has primarily been oriented toward providing a convenient tool for rapid display of images. In this project our goal was to design and evaluate a personal desktop workstation that provides a large number of clinically useful image analysis tools. The hardware used is a standard Macintosh II interfaced to our existing PACS network through an Ethernet interface using standard TCP/IP communication protocols. Special emphasis was placed on the design of the user interface, to allow clinicians with minimal or no computer skills to use complex analysis tools.

  19. Sentiment Analysis of User Comments for One-Class Collaborative Filtering over TED Talks

    OpenAIRE

    Pappas, Nikolaos; Popescu-belis, Andrei

    2013-01-01

    User-generated texts such as reviews, comments or discussions are valuable indicators of users’ preferences. Unlike previous works which focus on labeled data from user-contributed reviews, we focus here on user comments which are not accompanied by pre-defined rating labels. We investigate their role in a one-class collaborative filtering task such as bookmarking, where only the user action is given as ground-truth. We propose a sentiment-aware nearest neighbor model (SANN) for multimedia ...
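The idea of blending comment sentiment into a nearest-neighbour recommender for a one-class (bookmark-only) task can be sketched as follows. All data, the similarity blend, and the weight `alpha` are illustrative assumptions, not the paper's exact SANN formulation:

```python
import numpy as np

# Hypothetical data: rows = users, cols = talks; 1 = bookmarked, 0 = unknown.
R = np.array([
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
], dtype=float)

# Hypothetical per-user sentiment scores in [-1, 1] mined from their comments
# on each talk (0 where no comment exists).
S = np.array([
    [0.8, 0.0, 0.5, 0.0],
    [0.6, -0.2, 0.0, 0.0],
    [0.0, 0.9, 0.4, 0.7],
])

def cosine(u, v):
    nu, nv = np.linalg.norm(u), np.linalg.norm(v)
    return 0.0 if nu == 0 or nv == 0 else float(u @ v) / (nu * nv)

def sann_score(R, S, user, item, alpha=0.5):
    """Score `item` for `user`: neighbours' bookmarks weighted by a blend of
    bookmark similarity and sentiment similarity (the blend is an assumption)."""
    score = 0.0
    for v in range(R.shape[0]):
        if v == user:
            continue
        w = alpha * cosine(R[user], R[v]) + (1 - alpha) * cosine(S[user], S[v])
        score += w * R[v, item]
    return score

# Rank the talks user 0 has not bookmarked yet.
scores = {i: sann_score(R, S, 0, i) for i in range(R.shape[1]) if R[0, i] == 0}
```

Only the user action (the bookmark) serves as ground truth here, which is what makes the task one-class: there are no explicit negative labels to learn from.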

  20. Phylogeny and comparative genome analysis of Basidiomycete fungi

    Energy Technology Data Exchange (ETDEWEB)

    Riley, Robert W.; Salamov, Asaf; Grigoriev, Igor; Hibbett, David

    2011-03-14

    Fungi of the phylum Basidiomycota make up some 37% of described fungi and are important from the perspectives of forestry, agriculture, medicine, and bioenergy. This diverse phylum includes the mushrooms, wood rots, plant-pathogenic rusts and smuts, and some human pathogens. To better understand these important fungi, we have undertaken a comparative genomic analysis of the Basidiomycetes with available sequenced genomes. We report a phylogeny that sheds light on previously unclear evolutionary relationships among the Basidiomycetes. We also define a 'core proteome' based on protein families conserved in all Basidiomycetes. We identify key expansions and contractions in protein families that may be responsible for the degradation of plant biomass such as cellulose, hemicellulose, and lignin. Finally, we speculate as to the genomic changes that drove such expansions and contractions.

  1. Urban street networks: a comparative analysis of ten European cities

    CERN Document Server

    Strano, Emanuele; Cardillo, Alessio; Costa, Luciano Da Fontoura; Porta, Sergio; Latora, Vito

    2012-01-01

    We compare the structural properties of the street networks of ten different European cities using their primal representation. We investigate the geometry of the networks and a set of centrality measures, highlighting differences and similarities among the cases. In particular, we find that cities share structural similarities due to their quasi-planarity, but also display several distinctive geometrical properties. A Principal Component Analysis is performed on the distributions of centralities and their moments, and is used to find distinctive characteristics by which cities can be classified into families. We believe that, beyond improving empirical knowledge of street network properties, our findings can open new perspectives on the scientific relationship between city planning and complex networks, stimulating debate on how much statistical physics can contribute to city planning and urban morphology studies.
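The classification step — PCA over moments of centrality distributions, one row per city — can be sketched with NumPy alone. The "cities" below are synthetic stand-ins (gamma-distributed centrality samples), not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

def moments(x):
    """First three moments (mean, std, skewness) of a centrality sample."""
    m, s = x.mean(), x.std()
    return [m, s, ((x - m) ** 3).mean() / s ** 3]

# Ten hypothetical cities, each represented by a sample of node centralities.
cities = {name: rng.gamma(shape=k, scale=1.0, size=500)
          for name, k in zip("ABCDEFGHIJ", range(1, 11))}
X = np.array([moments(v) for v in cities.values()])   # (10 cities, 3 features)

# PCA via SVD on the standardized feature matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, sing, Vt = np.linalg.svd(Z, full_matrices=False)
explained = sing ** 2 / (sing ** 2).sum()             # variance ratio per component
coords = Z @ Vt.T[:, :2]                              # cities in the first two PCs
```

Clustering `coords` (e.g. by inspection or k-means) would then group the cities into "families", mirroring the analysis described in the abstract.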

  2. Comparative Evaluation and Analysis of IAX and RSW

    CERN Document Server

    Kolhar, Manjur S; Abouabdalla, Omar; Wan, Tat Chee; Manasrah, Ahmad M

    2010-01-01

    Voice over IP (VoIP) is a technology for transporting media over IP networks such as the Internet. VoIP can connect people over packet-switched networks instead of traditional circuit-switched networks. Recently, the InterAsterisk eXchange protocol (IAX) has emerged as a new VoIP protocol that is gaining popularity among VoIP products. IAX is known for its simplicity, NAT friendliness, efficiency, and robustness. More recently, the Real-time Switching (RSW) control criterion has emerged as a multimedia conferencing protocol. In this paper, we present a comparative evaluation and analysis of IAX and RSW using the Mean Opinion Score (MOS) rating, and find that both perform well under different network packet delays.

  3. Malaysian Real Estate Investment Trusts: A Performance and Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Tze San Ong

    2012-04-01

    Full Text Available This study examines the investment performance of conventional and Islamic Real Estate Investment Trusts (REITs) listed in Malaysia over the 2005-10 period. The analysis reveals that both conventional and Islamic REITs experienced negative monthly returns during the 2008 global financial crisis (GFC) and positive monthly returns afterwards. Compared to market indices, most REITs under-performed before the GFC. Divergent findings were obtained during and after the GFC, depending on the measurement tool used: based on the Treynor and Sharpe measures, most REITs under-performed the market portfolio during and after the GFC, whereas according to the Jensen measure they out-performed the market indices over the same periods. Despite these seemingly divergent findings, this study provides information and insights that can assist investors, regulatory bodies, fund managers and academics in making better-informed investment decisions on Malaysian REITs.
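The three measures the study relies on have standard textbook forms: Sharpe divides excess return by total volatility, Treynor by systematic risk (beta), and Jensen's alpha is the return left over after the CAPM-predicted return. A sketch on synthetic monthly returns (not the study's REIT data):

```python
import numpy as np

rng = np.random.default_rng(1)
market = rng.normal(0.006, 0.04, 60)                    # hypothetical market index returns
reit = 0.004 + 0.8 * market + rng.normal(0, 0.02, 60)   # hypothetical REIT returns
rf = 0.002                                              # monthly risk-free rate

# Beta of the REIT against the market; np.cov uses ddof=1, so var must too.
beta = np.cov(reit, market)[0, 1] / market.var(ddof=1)

sharpe = (reit.mean() - rf) / reit.std(ddof=1)          # excess return per unit total risk
treynor = (reit.mean() - rf) / beta                     # excess return per unit systematic risk
jensen_alpha = reit.mean() - (rf + beta * (market.mean() - rf))  # CAPM abnormal return
```

The divergence reported in the abstract is natural: Sharpe and Treynor are risk-adjusted ratios ranked against the market's own ratio, while Jensen's alpha is an absolute excess over the CAPM benchmark, so the three need not agree.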

  4. Comparative Analysis of Visco-elastic Models with Variable Parameters

    Directory of Open Access Journals (Sweden)

    Silviu Nastac

    2010-01-01

    Full Text Available The paper presents a theoretical comparative study of the computational behaviour of vibration-isolation elements based on viscous and elastic models with variable parameters. The change of elastic and viscous parameters can be produced by natural degradation over time or by heat developed in the elements during their working cycle. Both linear and non-linear numerical viscous and elastic models, and their combinations, were considered. The results show the importance of tuning the numerical model to the real behaviour, in particular the linearity of the characteristics and the essential damping and stiffness parameters. Multiple comparisons between linear and non-linear simulation cases provide a basis for optimizing the numerical model with respect to mathematical complexity versus reliability of results.

  5. COMPARATIVE ANALYSIS OF VAT EVOLUTION IN THE EUROPEAN ECONOMIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Mihaela Andreea STROE

    2011-08-01

    Full Text Available In this paper we present a comparative analysis of VAT in different states of the world. We focus on this theme because VAT is very important in carrying out transactions, and an increase or decrease of this tax has a major impact on national economies and also on the quality of life in developing countries. The paper's purpose is to compare the American and European systems of taxation, with their advantages and disadvantages, and finally to present an economic model and its statistical components. VAT is a value-added tax that appeared about 50 years ago, initially with two purposes: to replace certain indirect taxes, and to reduce the budget deficit, according to the beliefs of that time. The first country to adopt this model was France, where it is still known today as the value-added tax.

  7. Comparative Analysis of Fare Collection System on Bus Operations

    Directory of Open Access Journals (Sweden)

    M.H. Hafezi

    2012-01-01

    Full Text Available This study presents a comparative analysis of fare collection systems for inter-city bus operation. One of the important issues in bus scheduling models is the stopping of buses at stations (dwell time), where buses have to stop for boarding and alighting passengers. Dwell time directly increases travel time, and increased travel time on one bus mission can cause delays across the loops of a bus schedule. This article describes a survey of fare collection systems for bus operations, covering two methods: paying cash and using the touch-n-go card. We studied this issue in a real inter-city bus operation. The results highlight that fare collection with the touch-n-go card is more efficient than the cash method in reducing bus dwell time at stations.

  8. Comparative Analysis on Turing Machine and Quantum Turing Machine

    Directory of Open Access Journals (Sweden)

    Tanistha Nayak

    2012-06-01

    Full Text Available Nowadays every computing device is based on the Turing machine. It is known that classical physics is sufficient to explain macroscopic phenomena, but not microscopic phenomena such as the interference of electrons. Speed-up and down-sizing of computing devices have been achieved using quantum physical effects; however, the principles of computation on these devices are still based on classical physics. This paper analyzes mathematically the possibility that the Universal Quantum Turing Machine (UQTM) can compute faster than any classical model of computation. We focus on a comparative study of the computational power of the Universal Turing Machine (UTM) and the UQTM; in particular, we try to show that the UQTM can solve any NP-complete problem in polynomial time. The analysis suggests that the UQTM is faster for any computation.

  9. Comparative Traffic Performance Analysis of Urban Transportation Network Structures

    CERN Document Server

    Amini, Behnam; Mojarradi, Morteza; Derrible, Sybil

    2015-01-01

    The network structure of an urban transportation system has a significant impact on its traffic performance. This study uses network indicators along with several traffic performance measures including speed, trip length, travel time, and traffic volume, to compare a selection of seven transportation networks with a variety of structures and under different travel demand conditions. The selected network structures are: modified linear, branch, grid, 3-directional grid, 1-ring web, 2-ring web, and radial. For the analysis, a base origin-destination matrix is chosen, to which different growth factors are applied in order to simulate various travel demand conditions. Results show that overall the 2-ring web network offers the most efficient traffic performance, followed by the grid and the 1-ring networks. A policy application of this study is that the branch, 3-directional grid, and radial networks are mostly suited for small cities with uncongested traffic conditions. In contrast, the 2-ring web, grid, and 1-r...

  10. Vermiculites of the Northeast Brazilian region: comparative analysis

    International Nuclear Information System (INIS)

    Vermiculites are clay minerals similar to montmorillonites, differing in crystalline structure. The exfoliated product is odorless, hydrophobic, and does not irritate the skin or the lungs. These properties make thermally modified vermiculite a product with broad application in construction, agriculture and industry. The aim of this work is a comparative analysis of two micron-fraction vermiculites from different localities of the northeastern Brazilian region, UBM/PB and EUCATEX/PI. Samples exfoliated at 950 deg C were leached for removal of organic matter by oxidation with hydrogen peroxide. Infrared spectroscopy (FTIR) and X-ray diffraction (XRD) were used to characterize the samples. X-ray diffraction data showed that the structural characteristics of the mineral samples were not significantly altered by the leaching process, and the IR spectra confirmed the efficiency of the procedure for removing organic impurities (author)

  11. Comparative analysis of existing models for power-grid synchronization

    CERN Document Server

    Nishikawa, Takashi

    2015-01-01

    The dynamics of power-grid networks is becoming an increasingly active area of research within the physics and network science communities. The results from such studies are typically insightful and illustrative, but are often based on simplifying assumptions that can be either difficult to assess or not fully justified for realistic applications. Here we perform a comprehensive comparative analysis of three leading models recently used to study synchronization dynamics in power-grid networks -- a fundamental problem of practical significance given that frequency synchronization of all power generators in the same interconnection is a necessary condition for a power grid to operate. We show that each of these models can be derived from first principles within a common framework based on the classical model of a generator, thereby clarifying all assumptions involved. This framework allows us to view power grids as complex networks of coupled second-order phase oscillators with both forcing and damping terms. U...

  12. Comparative analysis of cytogenetic manifestations of human genome instability

    International Nuclear Information System (INIS)

    A comparative analysis of cytogenetic manifestations of human genome instability was carried out. The studied parameters are the micronuclei rate (MNR), the levels of single and double chromosome fragments, and the level of premature chromatid division (PCD). PCD and chromosome fragments were chosen as anomalies that may result in MN formation. We analysed the MNR in buccal epithelium (BE) and peripheral blood lymphocytes (PBL), and the levels of single and double chromosome fragments and of PCD in PBL only. The average MNR in BE was higher than in PBL. The studied parameters are independent and have to be considered together for a more comprehensive evaluation of the level and peculiarities of manifestation of human genome instability.

  13. Recurrence quantification analysis to compare the machinability of steels

    Directory of Open Access Journals (Sweden)

    Ravish

    2011-01-01

    Full Text Available Machinability, though a simple term, is difficult to generalize; nevertheless, it can be understood as the ease or difficulty with which a material can be machined. Assessing the machinability of materials before they are used in commercial manufacturing is demanding, as machinability affects the material removal rate, workpiece surface finish, cutting power consumption and tool wear rate. The present work aims at establishing Recurrence Quantification Analysis, a relatively new technique in the study of chaotic systems, as a potential tool to establish and compare the machinability of steels. The technique quantifies the recurrence plots obtained by phase-space reconstruction of time-domain signals. Variation in determinism, one of the variables of the technique, is used as a means to compare machinability.
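A minimal recurrence-plot and determinism (DET) computation of the kind RQA relies on can be sketched as follows. The embedding dimension, delay, and threshold heuristic here are illustrative choices, not the paper's settings:

```python
import numpy as np

def recurrence_matrix(x, dim=3, delay=1, eps=None):
    """Phase-space reconstruct a scalar series and threshold pairwise distances."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.2 * d.max()   # heuristic threshold, an assumption
    return (d <= eps).astype(int)

def determinism(R, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length >= lmin."""
    n = R.shape[0]
    diag_points = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in np.append(np.diagonal(R, k), 0):  # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    return diag_points / R.sum()

t = np.linspace(0, 8 * np.pi, 200)
det_periodic = determinism(recurrence_matrix(np.sin(t)))                    # regular signal
det_noise = determinism(recurrence_matrix(
    np.random.default_rng(2).normal(size=200)))                             # random signal
```

A periodic signal yields long diagonal lines and hence high DET, while noise yields scattered recurrences and lower DET; this contrast is what makes DET usable as a machinability comparator for cutting-force or vibration signals.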

  14. Comparative Analysis of Hydrogen Production Methods with Nuclear Reactors

    International Nuclear Information System (INIS)

    Hydrogen is a highly effective and ecologically clean fuel. It can be produced by a variety of methods; presently the most common are electrolysis of water and steam reforming of natural gas. Nuclear energy is a leading candidate for future hydrogen production. Several types of reactors are being considered for hydrogen production, and several methods exist to produce hydrogen, including thermochemical cycles and high-temperature electrolysis. This article presents a comparative analysis of various hydrogen production methods. The possibility of hydrogen production with nuclear reactors is considered, and the implementation of a research program in this field at the IPPE sodium-potassium eutectic-cooled high-temperature experimental facility (VTS rig) is proposed. (authors)

  15. Comparative analysis of CT and DSA in traumatic splenic salvage

    International Nuclear Information System (INIS)

    Objective: To explore the better diagnostic method for acute splenic artery injury through a comparative analysis of CT and DSA. Methods: Fifty-seven patients with acute splenic injury were examined by CT and DSA, treated with splenic arterial embolization and then followed up. Results: CT possessed higher sensitivity and accuracy than DSA in demonstrating splenic parenchymal laceration, intrasplenic hematoma, subcapsular hematoma, rupture of the splenic capsule and combined injury of intra-abdominal organs, especially in localizing splenic laceration; the difference between the two examinations was highly statistically significant (χ2 = 10.71 and χ2 = 12.57, P < 0.005). Conclusions: CT and DSA are complementary in the diagnosis of splenic injury. After CT confirmation of splenic injury, and provided the patient's vital signs are stable, DSA should be performed as soon as possible for further detail as well as for possible interventional embolization and reduction of surgical complications. (authors)

  16. Comparative Analysis of Congestion Control Algorithms Using ns-2

    CERN Document Server

    Patel, Sanjeev; Garg, Arjun; Mehrotra, Prateek; Chhabra, Manish

    2012-01-01

    In order to curtail the escalating packet loss rates caused by an exponential increase in network traffic, active queue management techniques such as Random Early Detection (RED) have come into the picture. Flow Random Early Drop (FRED) keeps state based on the instantaneous queue occupancy of a given flow. FRED protects fragile flows by deterministically accepting flows from low-bandwidth connections, and fixes several shortcomings of RED by computing queue length during both arrival and departure of packets. Stochastic Fair Queuing (SFQ) ensures fair access to network resources and prevents a bursty flow from consuming more than its fair share. In the case of Random Exponential Marking (REM), the key idea is to decouple the congestion measure from the performance measure (loss, queue length or delay). Stabilized RED (SRED) is another approach for detecting non-responsive flows. In this paper, we show a comparative analysis of throughput, delay and queue length for the congestion control algorithms RED, SFQ and REM...
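The core RED mechanism underlying these comparisons — an exponentially weighted moving average (EWMA) of queue size driving a linear drop-probability ramp between two thresholds — can be sketched in a toy simulation. The thresholds, EWMA weight, and traffic pattern are illustrative values, not ns-2 defaults:

```python
import random

def red_drop_probability(avg, min_th=5.0, max_th=15.0, max_p=0.1):
    """Classic RED: no drops below min_th, forced drop above max_th,
    linear ramp of drop probability in between."""
    if avg < min_th:
        return 0.0
    if avg >= max_th:
        return 1.0
    return max_p * (avg - min_th) / (max_th - min_th)

def simulate(arrivals, service=1, wq=0.002, seed=3):
    """Toy FIFO queue with RED; `wq` is the EWMA weight for the average size."""
    rng = random.Random(seed)
    q, avg, drops = 0, 0.0, 0
    for burst in arrivals:
        for _ in range(burst):
            avg = (1 - wq) * avg + wq * q      # update the average queue size
            if rng.random() < red_drop_probability(avg):
                drops += 1                      # packet dropped early
            else:
                q += 1                          # packet enqueued
        q = max(0, q - service)                 # serve one packet per slot
    return drops

light = simulate([1] * 200)   # arrival rate matches service rate
heavy = simulate([8] * 200)   # sustained overload
```

Under light load the average queue never reaches `min_th`, so RED drops nothing; under sustained overload the EWMA climbs through the ramp and drops become frequent, which is exactly the congestion signal RED is designed to send.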

  17. Imaging hydrated microbial extracellular polymers: Comparative analysis by electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Dohnalkova, A.C.; Marshall, M. J.; Arey, B. W.; Williams, K. H.; Buck, E. C.; Fredrickson, J. K.

    2011-01-01

    Microbe-mineral and -metal interactions represent a major intersection between the biosphere and geosphere but require high-resolution imaging and analytical tools to investigate microscale associations. Electron microscopy has been used extensively for geomicrobial investigations, but although applied in good faith, the traditional methods of sample preparation do not preserve the native morphology of microbiological components, especially extracellular polymers. Herein, we present a direct comparative analysis of microbial interactions using conventional electron microscopy approaches of imaging at room temperature and a suite of cryogenic electron microscopy methods providing imaging in the close-to-natural hydrated state. In situ, we observed an irreversible transformation of the hydrated bacterial extracellular polymers during the traditional dehydration-based sample preparation that resulted in their collapse into filamentous structures. Dehydration-induced polymer collapse can lead to inaccurate spatial relationships and hence could subsequently affect conclusions regarding the nature of interactions between microbial extracellular polymers and their environment.

  18. Comparative analysis of the biochemistry undergraduate courses in Brazil

    Directory of Open Access Journals (Sweden)

    P. A. Granjeiro

    2014-08-01

    Full Text Available INTRODUCTION: The economic and social development of Brazil during recent decades has contributed to the creation of several new undergraduate and graduate study programs, as is the case of the undergraduate biochemistry programs at UFV, UFSJ and UEM. The new biochemistry professionals are being prepared to work mainly in industry, research institutes, government agencies and universities, in all fields that involve Biochemistry and Molecular Biology. The aim of this study was to conduct a comparative analysis of the Biochemistry courses in Brazil. MATERIAL AND METHODS: Comparative analysis of the course units of the UFV, UFSJ and UEM programs, centered on curricular content and organization and on student profiles in terms of parameters such as the number of admissions and graduation completion rates. RESULTS AND DISCUSSION: The UFV and UEM programs present a very similar distribution of workload over the biological sciences, exact sciences, humanities, biochemical specialties and technological applications. The UFSJ program has higher workloads in the biological sciences and technological applications. No significant differences were detected in the distribution of workloads between mandatory and optional disciplines, complementary activities and supervised activities. Over the past five years the number of students abandoning the programs decreased, despite increased retention time in the three courses. Most graduates of UFV and UFSJ continue their academic careers toward Master or Doctoral degrees. CONCLUSION: There is little difference between the study programs analyzed. This is somewhat surprising considering that each program was conceived individually, based on different local conditions and needs, which would justify small differences. The similarity of the programs, on the other hand, reflects the universality of the biochemical sciences and their broad potential for practical application.

  19. A comparative analysis of metal transportomes from metabolically versatile Pseudomonas

    Directory of Open Access Journals (Sweden)

    Rodrigue Agnes

    2008-09-01

    Full Text Available Abstract Background The availability of complete genome sequences of versatile Pseudomonas species occupying remarkably diverse ecological niches has enabled insights into their adaptive assets. The objective of this study was to analyze the complete genetic repertoires of metal transporters (metal transportomes) from four representative Pseudomonas species and to identify metal transporters with "Genomic Island" associated features. Methods A comparative metal transporter inventory was built for the following four Pseudomonas species: P.putida (Ppu KT2440), P.aeruginosa (Pae PA01), P.fluorescens (Pfl Pf-5) and P.syringae (Psy pv. tomato DC3000) using TIGR-CMR and TransportDB. Genomic analysis of essential and toxic metal ion transporters was accomplished from the above inventory. Metal transporters with "Genomic Island" associated features were identified using IslandPath analysis. Results Datasets were catalogued for 262 metal transporters from the four species. Additional metal ion transporters belonging to the NiCoT, Ca P-type ATPase, Cu P-type ATPase, ZIP and MgtC families were identified. In Psy DC3000, 48% of metal transporters showed strong GI features, compared with 45% in Ppu KT2440; in Pfl Pf-5 and Pae PA01 only 26% of metal transporters exhibited GI features. Conclusion Our comparative inventory of 262 metal transporters from four versatile Pseudomonas species is the most complete suite of metal transportomes analysed to date in a prokaryotic genus. This study identified differences in the basic composition of metal transportomes from Pseudomonas occupying diverse ecological niches and also elucidated their novel features. Based on this inventory we analysed the role of horizontal gene transfer in the expansion and variability of metal transporter families.

  20. Extreme storm surges: a comparative study of frequency analysis approaches

    Directory of Open Access Journals (Sweden)

    Y. Hamdi

    2013-11-01

    Full Text Available In France, nuclear facilities were designed for very low probabilities of failure. Nevertheless, exceptional climatic events have given rise to surges much larger than the observations (outliers) and have clearly illustrated the potential to underestimate extreme water levels calculated with current statistical methods. The objective of the present work is to conduct a comparative study of three approaches: Annual Maxima (AM), Peaks-Over-Threshold (POT) and r-Largest Order Statistics (r-LOS). These methods are illustrated in a real case study. All data sets were screened for outliers, and non-parametric tests for randomness, homogeneity and stationarity of the time series were used. Stability plots of the shape and scale parameters, the mean excess residual life plot and the stability of the standard errors of return levels were used to select optimal thresholds and r values for the POT and r-LOS methods, respectively. The comparison of methods was based on: (i) the degree of uncertainty, (ii) adequacy criteria and tests, and (iii) visual inspection. It was found that the r-LOS and POT methods reduced the uncertainty on the distribution parameters and return-level estimates, and systematically yielded 100 and 500 yr return levels smaller than those estimated with the AM method. Results also showed that none of the compared methods allowed a good fit at the right tail of the distribution in the presence of outliers. As a perspective, the use of historical information was proposed in order to increase the representativity of outliers in the data sets. The findings are of practical relevance not only to nuclear energy operators in France, for applications in storm surge hazard analysis and flood management, but also for the optimal planning and design of facilities to withstand extreme environmental conditions with an appropriate level of risk.
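The contrast between the AM and POT approaches can be illustrated with a deliberately simplified sketch: a method-of-moments Gumbel fit to annual maxima, and exponential excesses over a high threshold. These are simplifications of the full GEV/GPD fits and threshold-selection diagnostics the study uses, and the surge record is synthetic:

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical daily surge record: 50 "years" of 365 values (metres).
surges = rng.gumbel(loc=0.5, scale=0.3, size=(50, 365))

# Annual Maxima: fit a Gumbel distribution by the method of moments.
am = surges.max(axis=1)
beta = am.std(ddof=1) * np.sqrt(6) / np.pi      # Gumbel scale from the sample std
mu = am.mean() - 0.5772 * beta                  # Gumbel location (Euler-Mascheroni)

def am_return_level(T):
    """T-year return level from the fitted Gumbel."""
    return mu - beta * np.log(-np.log(1 - 1 / T))

# Peaks-Over-Threshold: exponential excesses above a high threshold
# (a zero-shape-parameter simplification of the GPD).
u = np.quantile(surges, 0.99)
exc = surges[surges > u] - u
lam = exc.size / 50                             # mean exceedances per year
sigma = exc.mean()                              # exponential scale of the excesses

def pot_return_level(T):
    """T-year return level from the exponential-excess model."""
    return u + sigma * np.log(lam * T)

r100_am, r100_pot = am_return_level(100), pot_return_level(100)
```

On this well-behaved synthetic record the two estimates agree closely; the study's point is that on real records with outliers the methods diverge, and POT/r-LOS, which use more of the data than one maximum per year, give tighter parameter estimates.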

  1. Comparative expression pathway analysis of human and canine mammary tumors

    Directory of Open Access Journals (Sweden)

    Marconato Laura

    2009-03-01

    Full Text Available Abstract Background Spontaneous tumors in dog have been demonstrated to share many features with their human counterparts, including relevant molecular targets, histological appearance, genetics, biological behavior and response to conventional treatments. Mammary tumors in dog therefore provide an attractive alternative to more classical mouse models, such as transgenics or xenografts, where the tumour is artificially induced. To assess the extent to which dog tumors represent clinically significant human phenotypes, we performed the first genome-wide comparative analysis of transcriptional changes occurring in mammary tumors of the two species, with particular focus on the molecular pathways involved. Results We analyzed human and dog gene expression data derived from both tumor and normal mammary samples. By analyzing the expression levels of about ten thousand dog/human orthologous genes we observed a significant overlap of genes deregulated in the mammary tumor samples, as compared to their normal counterparts. Pathway analysis of gene expression data revealed a great degree of similarity in the perturbation of many cancer-related pathways, including the 'PI3K/AKT', 'KRAS', 'PTEN', 'WNT-beta catenin' and 'MAPK cascade'. Moreover, we show that the transcriptional relationships between different gene signatures observed in human breast cancer are largely maintained in the canine model, suggesting a close interspecies similarity in the network of cancer signalling circuitries. Conclusion Our data confirm and further strengthen the value of the canine mammary cancer model and open up new perspectives for the evaluation of novel cancer therapeutics and the development of prognostic and diagnostic biomarkers to be used in clinical studies.

  2. Differentiation between Intramedullary spinal ependymoma and astrocytoma: Comparative MRI analysis

    International Nuclear Information System (INIS)

    Aim: To investigate magnetic resonance imaging (MRI) findings that could be used to differentiate intramedullary spinal ependymoma from astrocytoma, and to determine predictors for this differentiation. Materials and methods: MRI images of 43 consecutive patients with pathologically proven intramedullary spinal ependymoma (n = 24) and astrocytoma (n = 19) were comparatively evaluated with regard to size, location, margin, signal intensity, contrast enhancement, presence of syringohydromyelia, tumoural cyst, non-tumoural cyst, and haemorrhage. MRI findings and demographic data were compared between the two tumour groups using univariate and multivariate logistic regression analyses. Results: In patients with ependymoma, older age and a larger solid component were more often observed than in astrocytoma. Central location, presence of enhancement, diffuse enhancement, syringohydromyelia, haemorrhage, and cap sign were more frequently observed in ependymoma. However, multivariate analysis revealed that syringohydromyelia was the only variable able to independently differentiate ependymoma from astrocytoma, with an odds ratio of 62.9 (95% CI: 4.38–903.22; p = 0.002). Conclusion: Among the various findings, the presence of syringohydromyelia is the main factor distinguishing ependymoma from astrocytoma

  3. Comparative analysis of mapping burned areas from landsat TM images

    International Nuclear Information System (INIS)

    Remote sensing is a major source for mapping burned areas caused by forest fires. The focus in this application is to map a single class of interest, i.e. burned area. In this study, three different data combinations were classified using different classifiers and quantitatively compared. The adopted classifiers are Support Vector Data Description (SVDD), a one-class classifier; binary Support Vector Machines (SVMs); and the traditional Maximum Likelihood classifier (ML). First, Principal Component Analysis (PCA) was applied to extract the best possible features from the original multispectral image (OMI) and calculated spectral indices (SI); the resulting subset of features was then fed to the classifiers. The comparative study was undertaken to find, firstly, the best possible set of features (data combination) and, secondly, an effective classifier for mapping burned areas. The best feature set was attained by data combination II (i.e., OMI information). Furthermore, SVM achieved higher classification accuracies than ML. Experimental results demonstrate that although SVDD did not show higher classification accuracy than SVM, it is suitable for cases where few or poorly represented labelled samples are available. The SVDD parameters should be further optimized through intelligent training to improve its accuracy.
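The appeal of one-class classification here is that it needs labelled samples only for the burned class. In the spirit of SVDD this can be sketched with a simple hypersphere in feature space; this is a crude stand-in (true SVDD solves for a minimal enclosing sphere, typically in a kernel space), and all pixel data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical 2-band pixel features for "burned" training samples only.
burned = rng.normal(loc=[0.2, 0.6], scale=0.05, size=(300, 2))

# Minimal data description: a hypersphere around the target class.
center = burned.mean(axis=0)
radius = np.quantile(np.linalg.norm(burned - center, axis=1), 0.95)

def is_burned(pixels):
    """Accept a pixel iff it falls inside the learned hypersphere."""
    return np.linalg.norm(pixels - center, axis=1) <= radius

# Evaluate on fresh synthetic pixels from the burned and an unrelated class.
test_burned = rng.normal(loc=[0.2, 0.6], scale=0.05, size=(100, 2))
test_other = rng.normal(loc=[0.6, 0.2], scale=0.05, size=(100, 2))
rate_burned = is_burned(test_burned).mean()   # should be high
rate_other = is_burned(test_other).mean()     # should be low
```

No negative (unburned) samples were used in training, which mirrors the abstract's argument that SVDD suits problems where the non-target classes are poorly represented.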

  4. Community Service and User Support for the Gridpoint Statistical Interpolation (GSI) Data Assimilation and Analysis System

    Science.gov (United States)

    Shao, H.; Hu, M.; Stark, D.; Newman, K.; Zhou, C.; Derber, J.; Lueken, M.

    2013-12-01

    The Gridpoint Statistical Interpolation (GSI) system is a unified variational data assimilation and analysis system for both global and regional applications. It is currently used as a data assimilation system by various operational centers, including the National Oceanic and Atmospheric Administration (NOAA) (e.g., the Global Forecast System (GFS), the North American Mesoscale (NAM) system, the Hurricane WRF (HWRF), and the Rapid Refresh (RAP) system), the National Aeronautics and Space Administration (NASA) (Goddard Earth Observing System (GEOS) Model), and the Air Force Weather Agency (AFWA). This analysis system is also used to generate certain analysis products, such as output from NOAA's GFS reanalysis and the Real-Time Mesoscale Analysis (RTMA) (e.g., 2 m temperature, 10 m wind gusts, surface pressure and surface visibility). GSI can also be used to generate analyses for climate studies (e.g., ozone and sea surface temperature (SST) analyses) or assimilate non-'traditional' fields (e.g., aerosol data assimilation) for air quality studies (e.g., dust storms). Recently, an effort was initiated to use GSI for data assimilation throughout the entire atmosphere. One example of such an effort is the development of a data assimilation system for the Whole Atmosphere Model (WAM) at NCEP. Over the past few years, GSI has been transitioned to a community resource through a joint effort led by the Developmental Testbed Center (DTC), in collaboration with the National Centers for Environmental Prediction (NCEP) Environmental Modeling Center (EMC) and other GSI partners. The DTC is a distributed facility with a goal of serving as a bridge between the research and operational communities by transitioning the operational capability to a community resource and committing the contributions from the research community to the operational repository.
The DTC has hosted four Community GSI tutorials and released five versions of the community GSI system with a corresponding User's Guide. The DTC has built and continues to maintain a community GSI User's Page to provide GSI code, documentation, and on-line tutorials for the research community. The DTC staff has been providing support to GSI users through the GSI help desk since the release of version 1 of the community code in 2009. This paper will briefly describe the GSI system and emphasize the GSI community services and support available from the DTC and other developers.

  5. Comparative Analysis of Public-Key Encryption Schemes

    Directory of Open Access Journals (Sweden)

    Falaki, S. O.

    2012-09-01

    Full Text Available The introduction of public-key cryptography by Diffie and Hellman in 1976 was an important watershed in the history of cryptography. The work sparked off interest in the cryptographic research community and soon several public-key schemes were proposed and implemented. The Rivest, Shamir and Adleman (RSA) scheme, being the first realisation of this abstract model, is the most widely used public-key scheme today. However, increased processing power and availability of cheaper processing technology occasioned by the exponential growth in digital technology has generated some security concerns, necessitating the review of security parameters for enhanced security. Enhanced processing power requirements do not favour the present class of ubiquitous mobile devices, which are characterised by low power consumption, limited memory and bandwidth, as they may not be able to run this cryptographic algorithm due to the computational burden associated with long key lengths. And since future increases in key lengths look likely given current technological developments, Elliptic Curve Cryptography (ECC) has been proposed as an alternative cryptosystem because it satisfies both security requirements and efficiency with shorter key lengths. This research work focuses on the comparative analysis of the RSA Encryption algorithm, the ElGamal Elliptic Curve Encryption algorithm and the Menezes-Vanstone Elliptic Curve Encryption algorithm. These elliptic curve analogues of the ElGamal Encryption scheme were implemented in Java, using classes from the Flexiprovider library of ECC. The RSA algorithm used in the comparison is the Flexiprovider implementation. Performance evaluation of the three algorithms, based on the time lapse for their key generation, encryption and decryption algorithms, and encrypted data size, was carried out and compared. The results show that our elliptic curve-based implementations are superior to the RSA algorithm on all comparative parameters.
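
    The key-length argument for ECC rests on elliptic-curve point arithmetic. The sketch below is a toy, not any of the Flexiprovider implementations compared in the paper: the tiny curve y^2 = x^3 + 2x + 3 over F_97 with base point (3, 6) is an illustrative choice (the point has order 5, so it offers no security whatsoever), but the double-and-add scalar multiplication is the same primitive that underlies ElGamal and Menezes-Vanstone encryption.

```python
P, A, B = 97, 2, 3          # toy curve y^2 = x^3 + 2x + 3 over F_97
G = (3, 6)                  # base point (order 5 -- demo only)

def inv(x):
    """Modular inverse via Fermat's little theorem (P is prime)."""
    return pow(x, P - 2, P)

def add(p, q):
    """Elliptic-curve point addition; None is the point at infinity."""
    if p is None: return q
    if q is None: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                        # p + (-p) = infinity
    if p == q:
        lam = (3 * x1 * x1 + A) * inv(2 * y1) % P
    else:
        lam = (y2 - y1) * inv(x2 - x1) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, p):
    """Double-and-add scalar multiplication k*p."""
    r = None
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r

# Diffie-Hellman-style exchange: both sides derive the same point.
qa, qb = mul(2, G), mul(3, G)      # public keys for toy secrets 2 and 3
print(mul(2, qb) == mul(3, qa))    # both equal 6G = G, so True
```

    The performance comparison in the paper boils down to how many of these field operations a key generation or encryption costs at a given security level, which is where ECC's shorter keys pay off.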

  6. Comparative Analysis and Performance Evaluation of MANET Routing Protocols

    Directory of Open Access Journals (Sweden)

    Mr. Amit D. Chavan

    2014-12-01

    Full Text Available A mobile ad hoc network (MANET) is a distributed network system, akin to the Internet, the World Wide Web, or a social network, in which nodes are mobile and have the freedom to join or leave the network. Simulations were carried out to evaluate the efficiency of seven MANET routing protocols (DSR, AODV, DSDV, TORA, FSR, CBRP and CGSR) [9]. In this paper we present a comparative performance analysis of these routing protocols, varying the number of nodes under the random-waypoint mobility model; from the detailed simulation results and analysis, a stable routing protocol can be chosen for a given specification. A MANET is a collection of mobile nodes that are dynamically and arbitrarily located in such a manner that the interconnections between nodes are capable of changing on a continual basis. In order to facilitate communication within the network, a routing protocol is used to discover routes between nodes. The primary goal of such an ad hoc network routing protocol is correct and efficient route establishment between a pair of nodes so that messages may be delivered in a timely manner.

  7. Comparative visual analysis of Lagrangian transport in CFD ensembles.

    Science.gov (United States)

    Hummel, Mathias; Obermaier, Harald; Garth, Christoph; Joy, Kenneth I

    2013-12-01

    Sets of simulation runs based on parameter and model variation, so-called ensembles, are increasingly used to model physical behaviors whose parameter space is too large or complex to be explored automatically. Visualization plays a key role in conveying important properties in ensembles, such as the degree to which members of the ensemble agree or disagree in their behavior. For ensembles of time-varying vector fields, there are numerous challenges for providing an expressive comparative visualization, among which is the requirement to relate the effect of individual flow divergence to joint transport characteristics of the ensemble. Yet, techniques developed for scalar ensembles are of little use in this context, as the notion of transport induced by a vector field cannot be modeled using such tools. We develop a Lagrangian framework for the comparison of flow fields in an ensemble. Our techniques evaluate individual and joint transport variance and introduce a classification space that facilitates incorporation of these properties into a common ensemble visualization. Variances of Lagrangian neighborhoods are computed using pathline integration and Principal Components Analysis. This allows for an inclusion of uncertainty measurements into the visualization and analysis approach. Our results demonstrate the usefulness and expressiveness of the presented method on several practical examples. PMID:24051841
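
    The Lagrangian comparison can be sketched by advecting the same seed through every ensemble member and measuring the spread of the resulting end points. The two analytic velocity fields, the Euler integrator, and the step counts below are invented for illustration; the paper integrates whole pathline neighborhoods and applies Principal Components Analysis to them.

```python
def advect(vel, p, dt=0.05, steps=40):
    """Euler pathline integration of point p through velocity field vel."""
    x, y = p
    for _ in range(steps):
        u, v = vel(x, y)
        x, y = x + dt * u, y + dt * v
    return (x, y)

# Two hypothetical ensemble members: a uniform drift vs. a sheared drift.
ensemble = [lambda x, y: (1.0, 0.0),
            lambda x, y: (1.0, 0.3 * x)]

seed = (0.0, 0.0)
ends = [advect(member, seed) for member in ensemble]

# Ensemble transport variance at this seed: spread of the end points
# around their mean (a scalar proxy for the PCA-based measure).
mx = sum(e[0] for e in ends) / len(ends)
my = sum(e[1] for e in ends) / len(ends)
var = sum((e[0] - mx) ** 2 + (e[1] - my) ** 2 for e in ends) / len(ends)
print(round(var, 4))
```

    Repeating this for a grid of seeds yields a scalar field of transport variance that can be mapped into the classification space the authors describe.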

  8. Comparative Analysis of Probabilistic Models for Activity Recognition with an Instrumented Walker

    CERN Document Server

    Omar, Farheen; Truszkowski, Jakub; Poupart, Pascal; Tung, James; Caine, Allen

    2012-01-01

    Rolling walkers are popular mobility aids used by older adults to improve balance control. There is a need to automatically recognize the activities performed by walker users to better understand activity patterns, mobility issues and the context in which falls are more likely to happen. We design and compare several techniques to recognize walker-related activities. A comprehensive evaluation with control subjects and walker users from a retirement community is presented.

  9. Comparative evaluation of gene-set analysis methods

    Directory of Open Access Journals (Sweden)

    Potter John D

    2007-11-01

    Full Text Available Abstract Background Multiple data-analytic methods have been proposed for evaluating gene-expression levels in specific biological pathways, assessing differential expression associated with a binary phenotype. Following Goeman and Bühlmann's recent review, we compared statistical performance of three methods, namely Global Test, ANCOVA Global Test, and SAM-GS, that test "self-contained null hypotheses" via subject sampling. The three methods were compared based on a simulation experiment and analyses of three real-world microarray datasets. Results In the simulation experiment, we found that the use of the asymptotic distribution in the two Global Tests leads to a statistical test with an incorrect size. Specifically, p-values calculated by the scaled χ2 distribution of Global Test and the asymptotic distribution of ANCOVA Global Test are too liberal, while the asymptotic distribution with a quadratic form of the Global Test results in p-values that are too conservative. The two Global Tests with permutation-based inference, however, gave a correct size. While the three methods showed similar power using permutation inference after a proper standardization of gene expression data, SAM-GS showed slightly higher power than the Global Tests. In the analysis of a real-world microarray dataset, the two Global Tests gave markedly different results, compared to SAM-GS, in identifying pathways whose gene expressions are associated with p53 mutation in cancer cell lines. A proper standardization of gene expression variances is necessary for the two Global Tests in order to produce biologically sensible results. After the standardization, the three methods gave very similar biologically-sensible results, with slightly higher statistical significance given by SAM-GS. The three methods gave similar patterns of results in the analysis of the other two microarray datasets.
Conclusion An appropriate standardization makes the performance of all three methods similar, given the use of permutation-based inference. SAM-GS tends to have slightly higher power in the lower α-level region (i.e. gene sets that are of the greatest interest). Global Test and ANCOVA Global Test have the important advantage of being able to analyze continuous and survival phenotypes and to adjust for covariates. A free Microsoft Excel Add-In to perform SAM-GS is available from http://www.ualberta.ca/~yyasui/homepage.html.
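
    The subject-sampling permutation inference that the authors recommend can be sketched as follows. The gene-set statistic, the expression data, and the permutation count are hypothetical stand-ins (none of Global Test, ANCOVA Global Test or SAM-GS is reproduced here), but the scheme of shuffling phenotype labels while leaving the expression matrix intact is the one compared in the paper.

```python
import random

def gene_set_stat(expr, labels):
    """Toy gene-set statistic: the absolute difference of group means,
    averaged over the genes in the set (one row of values per gene)."""
    total = 0.0
    for gene in expr:
        g0 = [v for v, l in zip(gene, labels) if l == 0]
        g1 = [v for v, l in zip(gene, labels) if l == 1]
        total += abs(sum(g1) / len(g1) - sum(g0) / len(g0))
    return total / len(expr)

def permutation_pvalue(expr, labels, n_perm=2000, seed=0):
    """Subject-sampling permutation test: shuffle the phenotype labels,
    which preserves the gene-gene correlation structure."""
    rng = random.Random(seed)
    observed = gene_set_stat(expr, labels)
    hits = 0
    for _ in range(n_perm):
        perm = labels[:]
        rng.shuffle(perm)
        if gene_set_stat(expr, perm) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)      # add-one to avoid p = 0

# Hypothetical 3-gene set over 8 subjects; expression is shifted upward
# in the phenotype-1 group, so the p-value should be small.
labels = [0, 0, 0, 0, 1, 1, 1, 1]
expr = [[0.1, 0.2, 0.0, 0.1, 1.1, 0.9, 1.2, 1.0],
        [0.3, 0.1, 0.2, 0.2, 1.0, 1.3, 0.9, 1.1],
        [0.0, 0.2, 0.1, 0.1, 0.8, 1.2, 1.0, 1.1]]
print(permutation_pvalue(expr, labels))
```

    Because the null distribution is built from the data themselves, this inference has correct size regardless of the asymptotic behaviour of the statistic, which is the paper's central finding.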

  10. Comparative analysis of borehole interaction in the HAW test field

    International Nuclear Information System (INIS)

    A comparative finite element study has been performed on 2 models of the HAW test field, which is an underground experimental facility used for the investigation of the effects of storage of nuclear waste in salt formations. The HAW test field consists of 2 parallel galleries in an anticline type of salt formation. From each gallery 4 vertical boreholes have been drilled at equal distance, in which electrical heaters or heat-producing nuclear sources will be installed. In a former 3D analysis this HAW field was modelled as if an infinite number of boreholes existed in each gallery, which reduced the size of the model due to symmetry conditions. In order to investigate the influence of this assumption, a comparison has been made between two models: one uses the actual 8 boreholes and the other is based on two infinite rows of holes. The comparison has been executed for a horizontal plane through the centre of the heat sources, in which a state of plane strain has been assumed. From the analysis it appears that the maximum compressive stress on the borehole liner can be determined with good accuracy with the model based on an infinite number of holes. For the development of the compressive stress over longer periods of time, however, the comparison shows that the boundary conditions have a larger influence on the evolution of the stress on the liner. A full 3D analysis of the field will be required to account for geometry effects in a correct way. (author). 5 refs.; 21 figs.; 1 tab

  11. Comparative genomic analysis and phylogenetic position of Theileria equi

    Directory of Open Access Journals (Sweden)

    Kappmeyer Lowell S

    2012-11-01

    Full Text Available Abstract Background Transmission of arthropod-borne apicomplexan parasites that cause disease and result in death or persistent infection represents a major challenge to global human and animal health. First described in 1901 as Piroplasma equi, this re-emergent apicomplexan parasite was renamed Babesia equi and subsequently Theileria equi, reflecting an uncertain taxonomy. Understanding mechanisms by which apicomplexan parasites evade immune or chemotherapeutic elimination is required for development of effective vaccines or chemotherapeutics. The continued risk of transmission of T. equi from clinically silent, persistently infected equids impedes the goal of returning the U.S. to non-endemic status. Therefore, comparative genomic analysis of T. equi was undertaken to: (1) identify genes contributing to immune evasion and persistence in equid hosts, (2) identify genes involved in PBMC infection biology and (3) define the phylogenetic position of T. equi relative to sequenced apicomplexan parasites. Results The known immunodominant proteins, EMA1, 2 and 3, were discovered to belong to a ten-member gene family with a mean amino acid identity, in pairwise comparisons, of 39%. Importantly, the amino acid diversity of EMAs is distributed throughout the length of the proteins. Eight of the EMA genes were simultaneously transcribed. As the agents that cause bovine theileriosis infect and transform host cell PBMCs, we confirmed that T. equi infects equine PBMCs; however, there is no evidence of host cell transformation. Indeed, a number of genes identified as potential manipulators of the host cell phenotype are absent from the T. equi genome. Comparative genomic analysis of T. equi relative to seven apicomplexan parasites, using deduced amino acid sequences from 150 genes, placed it as a sister taxon to Theileria spp.
Conclusions The EMA family does not fit the paradigm for classical antigenic variation, and we propose a novel model describing the role of the EMA family in persistence. T. equi has lost the putative genes for host cell transformation, or the genes were acquired by T. parva and T. annulata after divergence from T. equi. Our analysis identified 50 genes that will be useful for definitive phylogenetic classification of T. equi and closely related organisms.

  12. Recent national trends in Salvia divinorum use and substance-use disorders among recent and former Salvia divinorum users compared with nonusers

    Directory of Open Access Journals (Sweden)

    Blazer DG

    2011-04-01

    Full Text Available Li-Tzy Wu1, George E Woody2, Chongming Yang3, Jih-Heng Li4, Dan G Blazer1; 1Department of Psychiatry and Behavioral Sciences, Duke University Medical Center, Durham, NC, USA; 2Department of Psychiatry, University of Pennsylvania and Treatment Research Institute, Philadelphia, PA, USA; 3Social Science Research Institute, Duke University, Durham, NC, USA; 4College of Pharmacy, Kaohsiung Medical University, Kaohsiung, Taiwan. Context: Media and scientific reports have indicated an increase in recreational use of Salvia divinorum. Epidemiological data are lacking on the trends, prevalence, and correlates of S. divinorum use in large representative samples, as well as the extent of substance use and mental health problems among S. divinorum users. Objective: To examine the national trend in prevalence of S. divinorum use and to identify sociodemographic, behavioral, mental health, and substance-use profiles of recent (past-year) and former users of S. divinorum. Design: Analyses of public-use data files from the 2006–2008 United States National Surveys on Drug Use and Health (N = 166,453). Setting: Noninstitutionalized individuals aged 12 years or older were interviewed in their places of residence. Main measures: Substance use, S. divinorum, self-reported substance use disorders, criminality, depression, and mental health treatment were assessed by standardized survey questions administered by the audio computer-assisted self-interviewing method. Results: Among survey respondents, lifetime prevalence of S. divinorum use had increased from 0.7% in 2006 to 1.3% in 2008 (an 83% increase). S. divinorum use was associated with ages 18–25 years, male gender, white or multiple race, residence of large metropolitan areas, arrests for criminal activities, and depression. S. divinorum use was particularly common among recent drug users, including users of lysergic acid diethylamide (53.7%), ecstasy (30.1%), heroin (24.2%), phencyclidine (22.4%), and cocaine (17.5%).
Adjusted multinomial logistic analyses indicated polydrug use as the strongest determinant for recent and former S. divinorum use. An estimated 43.0% of past-year S. divinorum users and 28.9% of former S. divinorum users had an illicit or nonmedical drug-use disorder compared with 2.5% of nonusers. Adjusted logistic regression analyses showed that recent and former S. divinorum users had greater odds of having past-year depression and a substance-use disorder (alcohol or drugs) than past-year alcohol or drug users who did not use S. divinorum. Conclusion: S. divinorum use is prevalent among recent or active drug users who have used other hallucinogens or stimulants. The high prevalence of substance use disorders among recent S. divinorum users emphasizes the need to study health risks of drug interactions. Keywords: alcohol-use disorders, drug-use disorders, ecstasy, lysergic acid diethylamide, major depression, multiple race, nicotine dependence, phencyclidine, prescription drug abuse

  13. Subchannel Notching and Channel Bonding: Comparative Analysis of Opportunistic Spectrum OFDMA Designs

    CERN Document Server

    Park, Jihoon; Grønsund, Pål; Čabrić, Danijela

    2010-01-01

    We present an analytical model that enables a comparison of multiple design options of Opportunistic Spectrum Orthogonal Frequency Division Multiple Access (OS-OFDMA). The model considers continuous and non-continuous subchannel allocation algorithms, as well as different ways to bond separate non-continuous frequency bands. Different user priorities and channel dwell times, for the Secondary Users and the Primary Users of the radio spectrum, are studied. Further, the model allows the inclusion of different types of Secondary User traffic. Finally, the model enables the study of multiple two-stage spectrum sensing algorithms. Analysis is based on a discrete time Markov chain model which allows for the computation of network characteristics such as the average throughput. From the analysis we conclude that OS-OFDMA with subchannel notching and channel bonding could provide, under certain network configurations, almost seven times higher throughput than the design without those options enabled.
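
    The discrete-time Markov chain analysis used to compute average throughput can be sketched as follows. The 3-state chain, its transition probabilities, and the per-state rates are invented for illustration and are far simpler than the OS-OFDMA model in the paper: the stationary distribution is found by power iteration, and the average throughput is the rate expected under that distribution.

```python
def stationary(P, iters=200):
    """Power iteration for the stationary distribution of a
    row-stochastic transition matrix P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical 3-state chain for one subchannel: 0 = idle, 1 = held by a
# Secondary User, 2 = reclaimed by a Primary User.
P = [[0.6, 0.3, 0.1],
     [0.2, 0.7, 0.1],
     [0.5, 0.0, 0.5]]
rate = [0.0, 1.0, 0.0]        # throughput earned per slot in each state
pi = stationary(P)
throughput = sum(p * r for p, r in zip(pi, rate))
print(round(throughput, 3))
```

    In the paper the state space additionally encodes subchannel-notching and channel-bonding configurations, which is what lets the model compare those design options through the same stationary-distribution computation.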

  14. Statistical Graphical User Interface Plug-In for Survival Analysis in the R Statistical and Graphics Language and Environment

    Directory of Open Access Journals (Sweden)

    Daniel C. LEUCUȚA

    2008-12-01

    Full Text Available Introduction: R is a statistical and graphics language and environment. Although it is extensively used from the command line, graphical user interfaces exist to ease its adoption by new users. Rcmdr is an R package providing a basic-statistics graphical user interface to R. A survival analysis interface is not provided by Rcmdr. The aim of this paper was to create a plug-in for Rcmdr to provide a survival analysis user interface for some basic R survival analysis functions. Materials and Methods: The Rcmdr plug-in code was written in Tinn-R. The plug-in package was tested and built with Rtools. The plug-in was installed and tested in R with the Rcmdr package on a Windows XP workstation with the "aml" and "kidney" data sets from the survival R package. Results: The Rcmdr survival analysis plug-in was successfully built and it provides the functionality it was designed to offer: an interface for Kaplan-Meier and log-log survival graphs, an interface for the log-rank test, an interface to create a Cox proportional hazards regression model, and interface commands to test and graphically assess the proportional hazards assumption and influential observations. Conclusion: Rcmdr and R, through their flexible and well-planned structure, offer an easy way to expand their functionality, which was used here to make the statistical environment more user-friendly with respect to survival analysis.
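
    For readers outside R, the Kaplan-Meier product-limit estimator that the plug-in exposes can be sketched in a few lines of Python; the toy data below are hypothetical (in R the same estimate comes from survfit in the survival package).

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of the survival function.
    events[i] is 1 for a failure at times[i], 0 for a censored subject."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data[i:] if tt == t and e == 1)
        ties = sum(1 for tt, _ in data[i:] if tt == t)
        if deaths:                         # censored-only times leave S(t) flat
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= ties
        i += ties
    return curve

# Toy data: failure times with one censored observation (event = 0).
times  = [1, 2, 2, 3, 4]
events = [1, 1, 0, 1, 1]
print(kaplan_meier(times, events))
```

    Each factor 1 - d/n multiplies in the conditional survival probability at an observed failure time, which is exactly the stepped curve the plug-in plots.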

  15. Qualitative analysis of end user computing strategy and experiences in promoting nursing informatics in Taiwan.

    Science.gov (United States)

    Hou, I-Ching; Chang, Polun; Wang, Tsen-Yung

    2006-01-01

    The purpose of this study was to analyse an end user computing strategy and experiences in promoting nursing informatics in Taiwan. In February 2004, an 8-day NI technology training campaign was held in Taipei for 60 clinical nurses. Excel VBA was used as the tool to teach the clinical nurses, who had never written any programs but were very interested in informatics. Three projects were determined after detailed discussion and evaluation of clinical needs and technical feasibility between the nurses and the technical support team, which was composed of one experienced informatics professor and one clinical NI assistant. A qualitative analysis was used: the three pairs of programming clinical nurses and their direct supervisors were interviewed with a structured but open questionnaire. Representative concepts were extracted from the data until all were categorized. The concepts were organized under three categories: the purposes, the benefits and the challenges of system development. According to this study, the end user computing strategy with Excel VBA was successful so far. PMID:17102334

  16. Advanced analysis system and user interface for gyrokinetic simulations of microturbulence

    Science.gov (United States)

    Lestz, Jeff; Shahidain, Sadik; Ethier, Stephane; Wang, Weixing

    2012-10-01

    Fully-global, 5D gyrokinetic simulations of turbulent transport in tokamak devices generate a large amount of time-dependent data that contain a wealth of information about waves, particles, and their self-consistent interactions. To explore these data in spectral space, in both wave numbers and frequencies, the information needs to be written out and analyzed in a post-processing stage. This work describes the development of a MATLAB-based system for the extensive analysis of gyrokinetic simulation data, with particular application to the Gyrokinetic Tokamak Simulation code (GTS), which is being used for studying experimental discharges from NSTX, DIII-D, and C-Mod. Parallel FORTRAN and C routines are used in some cases to read in the large amount of data and carry out the first stage of post-processing. Advanced MATLAB functions are then used for calculating statistical quantities, correlations, etc. A graphical user interface enhances the user experience and provides advanced plotting capabilities. Examples of microturbulence data analyses are given and discussed.

  17. RKWard: A Comprehensive Graphical User Interface and Integrated Development Environment for Statistical Analysis with R

    Directory of Open Access Journals (Sweden)

    Stephan Rödiger

    2012-06-01

    Full Text Available R is a free open-source implementation of the S statistical computing language and programming environment. The current status of R is a command line driven interface with no advanced cross-platform graphical user interface (GUI), but it includes tools for building such. Over the past years, proprietary and non-proprietary GUI solutions have emerged, based on internal or external tool kits, with different scopes and technological concepts. For example, Rgui.exe and Rgui.app have become the de facto GUI on the Microsoft Windows and Mac OS X platforms, respectively, for most users. In this paper we discuss RKWard, which aims to be both a comprehensive GUI and an integrated development environment for R. RKWard is based on the KDE software libraries. Statistical procedures and plots are implemented using an extendable plugin architecture based on ECMAScript (JavaScript), R, and XML. RKWard provides an excellent tool to manage different types of data objects, even allowing for seamless editing of certain types. The objective of RKWard is to provide a portable and extensible R interface for both basic and advanced statistical and graphical analysis, while not compromising on the flexibility and modularity of the R programming environment itself.

  18. A graphical user interface for real-time analysis of XPCS using HPC

    International Nuclear Information System (INIS)

    With the development of third generation synchrotron radiation sources, X-ray photon correlation spectroscopy has emerged as a powerful technique for characterizing equilibrium and non-equilibrium dynamics in complex materials at nanometer length scales over a wide range of time-scales (0.001-1000 s). Moreover, the development of powerful new direct-detection CCD cameras has allowed investigation of faster dynamical processes. A consequence of these technical improvements is the need to reduce a very large amount of area detector data within a short time. This problem can be solved by utilizing a large number of processors (32-64) in a cluster architecture to improve the efficiency of the calculations by 1-2 orders of magnitude (Tieman et al., this issue). However, to make such a data analysis system operational, powerful and user-friendly control software needs to be developed. As a part of the effort to maintain a high data acquisition and reduction rate, we have developed MATLAB-based software that acts as an interface between the user and the high performance computing (HPC) cluster.
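
    At the core of an XPCS reduction pipeline is the normalized intensity autocorrelation g2(tau). A minimal single-pixel sketch is shown below; the function name and toy series are illustrative, and the cluster code described in the paper uses far more elaborate multi-pixel, multi-tau schemes.

```python
def g2(intensity, max_lag):
    """Normalized intensity autocorrelation
    g2(tau) = <I(t) I(t + tau)> / <I>^2 for one pixel's time series."""
    mean = sum(intensity) / len(intensity)
    result = []
    for lag in range(1, max_lag + 1):
        pairs = list(zip(intensity, intensity[lag:]))
        corr = sum(a * b for a, b in pairs) / len(pairs)
        result.append(corr / mean ** 2)
    return result

# A static speckle pattern never decorrelates: g2 stays at 1 at every lag.
print(g2([1.0] * 200, 3))
```

    For a fluctuating sample, g2 starts above 1 at short lags and decays toward 1 on the sample's correlation time; computing this for every detector pixel is what makes the parallel cluster reduction necessary.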

  19. Bitmap indices for fast end-user physics analysis in ROOT

    International Nuclear Information System (INIS)

    Most physics analysis jobs involve multiple selection steps on the input data. These selection steps are called cuts or queries. A common strategy to implement these queries is to read all input data from files and then process the queries in memory. In many applications the number of variables used to define these queries is a relatively small portion of the overall data set; therefore, reading all variables into memory takes an unnecessarily long time. In this paper we describe an integration effort that can significantly reduce this unnecessary reading by using an efficient compressed bitmap index technology. The primary advantage of this index is that it can process arbitrary combinations of queries very efficiently, while most other indexing technologies suffer from the 'curse of dimensionality' as the number of queries increases. By integrating this index technology with the ROOT analysis framework, the end-users can benefit from the added efficiency without having to modify their analysis programs. Our performance results show that for multi-dimensional queries, bitmap indices outperform the traditional analysis method by up to a factor of 10.
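
    The idea behind bitmap-index query evaluation can be sketched with plain Python integers as bitmaps: build one bitmap per cut, then AND them together, touching none of the other event variables. The event records and cuts below are hypothetical, and the compressed bitmap technology integrated with ROOT in the paper is considerably more sophisticated.

```python
class BitmapIndex:
    """Toy bitwise index: one Python int per predicate, with bit i set
    when event i satisfies that predicate. ANDing bitmaps evaluates a
    multi-cut selection without reading any other event variables."""

    def __init__(self, events):
        self.events = events
        self.n = len(events)

    def bitmap(self, predicate):
        bits = 0
        for i, ev in enumerate(self.events):
            if predicate(ev):
                bits |= 1 << i
        return bits

# Hypothetical events with two analysis variables (e.g. pT and eta cuts).
events = [{"pt": 12.0, "eta": 0.5}, {"pt": 45.0, "eta": 2.1},
          {"pt": 33.0, "eta": 0.2}, {"pt": 8.0,  "eta": 1.0}]
idx = BitmapIndex(events)
cut = (idx.bitmap(lambda e: e["pt"] > 10)
       & idx.bitmap(lambda e: abs(e["eta"]) < 1))
selected = [i for i in range(idx.n) if cut >> i & 1]
print(selected)   # events passing both cuts
```

    Because each additional cut is just one more bitwise AND, the cost grows gently with the number of queries, which is the dimensionality advantage the paper measures.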

  20. Design and Analysis of Multi-User SDMA Systems with Noisy Limited CSIT Feedback

    CERN Document Server

    Wu, Tianyu

    2010-01-01

    In this paper, we consider spatial-division multiple-access (SDMA) systems with one base station with multiple antennae and a number of single-antenna mobiles under noisy limited CSIT feedback. We propose a robust noisy limited feedback design for SDMA systems. The solution consists of real-time robust SDMA precoding, user selection and rate adaptation as well as an offline feedback index assignment algorithm. The index assignment problem is cast into a Traveling Salesman Problem (TSP). Based on the specific structure of the feedback constellation and the precoder, we derive a low-complexity but asymptotically optimal solution. Simulation results show that the proposed framework has significant goodput gain compared to the traditional naive designs under a noisy limited feedback channel. Furthermore, we show that despite the noisy feedback channel, the average SDMA system goodput grows with the number of feedback bits in the interference-limited regime, while in the noise-limited regime it increases linearly with the n...

  1. User guide for data analysis of estimation algorithm of loose parts

    International Nuclear Information System (INIS)

    Generally, it is known that loose parts in the reactor coolant system (RCS) cause serious damage to system components and impede the normal function of the system, so a rapid response is necessary when an impact event occurs. However, the existing system is known to provide only alarm information to the operator. This report presents the user guide for the estimation algorithm needed for diagnosis and proposes how to use the database of impact-test data and actual impact data. The database will be used to compare test data with actual data when an impact event occurs. Appendix I proposes the estimation algorithm as applied to the impact-test data and actual impact data. Appendix II reproduces the reports on the actual impact data sent to the operator to date. Appendix III shows the flowchart of LPMS monitoring and diagnosis at each plant.

  2. Comparing Sustainable Forest Management Certifications Standards: A Meta-analysis

    Directory of Open Access Journals (Sweden)

    Joelyn Sarrah Kozar

    2011-03-01

    Full Text Available To solve problems caused by conventional forest management, forest certification has emerged as a driver of sustainable forest management. Several sustainable forest management certification systems exist, including the Forest Stewardship Council and those endorsed by the Programme for the Endorsement of Forest Certification, such as the Canadian Standards Association - Sustainable Forestry Management Standard CAN/CSA - Z809 and Sustainable Forestry Initiative. For consumers to use certified products to meet their own sustainability goals, they must have an understanding of the effectiveness of different certification systems. To understand the relative performance of three systems, we determined: (1) the criteria used to compare the Forest Stewardship Council, Canadian Standards Association - Sustainable Forestry Management, and Sustainable Forestry Initiative, (2) if consensus exists regarding their ability to achieve sustainability goals, and (3) what research gaps must be filled to improve our understanding of how forest certification systems affect sustainable forest management. We conducted a qualitative meta-analysis of 26 grey literature references (books, industry and nongovernmental organization publications) and 9 primary literature references (articles in peer-reviewed academic journals) that compared at least two of the aforementioned certification systems. The Forest Stewardship Council was the highest performer for ecological health and social sustainable forest management criteria. The Canadian Standards Association - Sustainable Forestry Management and Sustainable Forestry Initiative performed best under sustainable forest management criteria of forest productivity and economic longevity of a firm. Sixty-two percent of analyses were comparisons of the wording of certification system principles or criteria; 34% were surveys of foresters or consumers.
An important caveat to these results is that only one comparison was based on empirically collected field data. We recommend that future studies collect ecological and socioeconomic data from forests so purchasers can select certified forest products based on empirical evidence.

  3. A comparative analysis of biclustering algorithms for gene expression data.

    Science.gov (United States)

    Eren, Kemal; Deveci, Mehmet; Küçüktunç, Onur; Çatalyürek, Ümit V

    2013-05-01

    The need to analyze high-dimension biological data is driving the development of new data mining methods. Biclustering algorithms have been successfully applied to gene expression data to discover local patterns, in which a subset of genes exhibit similar expression levels over a subset of conditions. However, it is not clear which algorithms are best suited for this task. Many algorithms have been published in the past decade, most of which have been compared only to a small number of algorithms. Surveys and comparisons exist in the literature, but because of the large number and variety of biclustering algorithms, they are quickly outdated. In this article we partially address this problem of evaluating the strengths and weaknesses of existing biclustering methods. We used the BiBench package to compare 12 algorithms, many of which were recently published or have not been extensively studied. The algorithms were tested on a suite of synthetic data sets to measure their performance on data with varying conditions, such as different bicluster models, varying noise, varying numbers of biclusters and overlapping biclusters. The algorithms were also tested on eight large gene expression data sets obtained from the Gene Expression Omnibus. Gene Ontology enrichment analysis was performed on the resulting biclusters, and the best enrichment terms are reported. Our analyses show that the biclustering method and its parameters should be selected based on the desired model, whether that model allows overlapping biclusters, and its robustness to noise. In addition, we observe that the biclustering algorithms capable of finding more than one model are more successful at capturing biologically relevant clusters. PMID:22772837
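The synthetic-data evaluation described above hinges on scoring how well recovered biclusters match the planted ones. Below is a minimal sketch of one common score, assuming biclusters are represented as (row set, column set) pairs and matched by Jaccard similarity over matrix cells; the function names and toy data are invented for illustration, not taken from the BiBench package.

```python
# Hypothetical sketch: scoring recovered vs planted biclusters on
# synthetic data using the Jaccard index over (row, column) cells.

def cells(bic):
    """Expand a bicluster (row set, column set) into its set of matrix cells."""
    rows, cols = bic
    return {(r, c) for r in rows for c in cols}

def jaccard(a, b):
    """Jaccard similarity between two cell sets."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

def recovery(planted, found):
    """Average, over planted biclusters, of the best Jaccard match found."""
    if not planted:
        return 0.0
    return sum(max((jaccard(cells(p), cells(f)) for f in found), default=0.0)
               for p in planted) / len(planted)

planted = [({0, 1, 2}, {0, 1}), ({3, 4}, {2, 3})]
found   = [({0, 1, 2}, {0, 1}), ({3, 4, 5}, {2, 3})]
print(recovery(planted, found))  # (1.0 + 4/6) / 2 = 0.8333...
```

Swapping the direction of the max (best planted match per found bicluster) gives the complementary relevance score; averaging both is a common overall measure.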

  4. Placing the public library – a comparative analysis of political perceptions

    DEFF Research Database (Denmark)

    Evjen, Sunniva

    2012-01-01

    This thesis explores politicians’ perceptions of the public library and public library development. While many call for a redefined vision for public libraries, eye-catching libraries are built in major cities around the world. What library visions are expressed through such projects? I attempt to discern how local politicians view the role of the public library, and how they want to develop it in their local context, using concepts from institutional theory in the analysis. The research questions include issues concerning norms politicians connect with public libraries compared with those expressed by the professional field, library legitimization, as well as plans and visions for the future library. I have addressed these questions through a comparative case study done in three cities: Oslo, Aarhus, and Birmingham, and taken a qualitative approach, using interviews with local politicians and document analysis of local and national policy documents. One important premise for this study has been to find cases where there are on-going developments, in the shape of main library construction. The findings show that politicians have extensive knowledge about the norms and values found in the professional library field. They share much of the same views regarding library roles and missions, as well as core values such as equal access to knowledge and culture – expressed for instance through a free service. When the informants legitimize public libraries in general, they primarily connect them with citizens’ democratic rights and the country’s democratic practice. Legitimizing the local projects is done using a slightly different argumentation: these are connected to city development and a desire to make visible the city’s knowledge and culture profile. 
The perceptions expressed through this study show that the local politicians through their work with library issues – and the projects in particular – have acquired knowledge about and understanding of the public library institution. There are shared perceptions in the three cases; however, the biggest difference is found in the extent to which the library service is subjected to political planning – both locally and nationally. I find it likely that a stronger degree of formal institutionalisation will render the public library service more resilient when faced with external pressure, in the form of technological or economic challenges.

  5. Prediction of Financial Distress for Tunisian Firms: A Comparative Study Between Financial Analysis and Neuronal Analysis

    Directory of Open Access Journals (Sweden)

    Manel Hamdi

    2012-07-01

    Full Text Available This paper presents a prognosis of financial distress of Tunisian firms. For this purpose, we empirically compared financial analysis to artificial neural network analysis. Five multilayer perceptrons were applied to improve banking decisions. Based on the correct classification rates, the artificial neural networks demonstrated sound predictive ability. Likewise, the findings of a generalization test confirmed the conclusion of the classical financial analysis for a company not included in our base sample. The artificial neural network can effectively automate the credit-granting decision and performed better than traditional financial analysis.

  6. Comparative analysis of discrete exosome fractions obtained by differential centrifugation

    Directory of Open Access Journals (Sweden)

    Dennis K. Jeppesen

    2014-11-01

    Full Text Available Background: Cells release a mixture of extracellular vesicles, amongst these exosomes, that differ in size, density and composition. The standard isolation method for exosomes is centrifugation of fluid samples, typically at 100,000×g or above. Knowledge of the effect of discrete ultracentrifugation speeds on the purification from different cell types, however, is limited. Methods: We examined the effect of applying differential centrifugation g-forces ranging from 33,000×g to 200,000×g on exosome yield and purity, using 2 unrelated human cell lines, embryonic kidney HEK293 cells and bladder carcinoma FL3 cells. The fractions were evaluated by nanoparticle tracking analysis (NTA), total protein quantification and immunoblotting for CD81, TSG101, syntenin, VDAC1 and calreticulin. Results: NTA revealed the lowest background particle count in Dulbecco's Modified Eagle's Medium media devoid of phenol red and cleared by 200,000×g overnight centrifugation. The centrifugation tube fill level impacted the sedimentation efficacy. Comparative analysis by NTA, protein quantification, and detection of exosomal and contamination markers identified differences in vesicle size, concentration and composition of the obtained fractions. In addition, HEK293 and FL3 vesicles displayed marked differences in sedimentation characteristics. Exosomes were pelleted already at 33,000×g, a g-force which also removed most contaminating microsomes. Optimal vesicle-to-protein yield was obtained at 67,000×g for HEK293 cells but 100,000×g for FL3 cells. Relative expression of exosomal markers (TSG101, CD81, syntenin) suggested presence of exosome subpopulations with variable sedimentation characteristics. Conclusions: Specific g-force/k factor usage during differential centrifugation greatly influences the purity and yield of exosomes. The vesicle sedimentation profile differed between the 2 cell lines.
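The abstract's point about g-force and k-factor usage rests on standard rotor arithmetic, which can be sketched as follows. The rotor radii, speed and sedimentation coefficient below are hypothetical examples, not values from the study.

```python
import math

# Standard centrifugation formulas: relative centrifugal force,
# clearing factor (k), and approximate pelleting time t = k / s.
# All numeric inputs in the example are illustrative.

def rcf(rpm, radius_cm):
    """Relative centrifugal force (in multiples of g) at a given radius."""
    return 1.118e-5 * radius_cm * rpm ** 2

def k_factor(rpm, r_max_cm, r_min_cm):
    """Clearing factor of a rotor: lower k means faster pelleting."""
    return 2.53e11 * math.log(r_max_cm / r_min_cm) / rpm ** 2

def pelleting_time_h(k, svedberg):
    """Approximate hours to pellet particles with sedimentation
    coefficient s (in Svedbergs): t = k / s."""
    return k / svedberg

rpm = 40000               # hypothetical rotor speed
r_min, r_max = 4.0, 8.0   # hypothetical rotor radii in cm
k = k_factor(rpm, r_max, r_min)
print(round(rcf(rpm, r_max)))               # max g-force at the tube bottom
print(round(k, 1))                          # clearing factor
print(round(pelleting_time_h(k, 100), 2))   # hours to pellet a 100S particle
```

The k factor makes the abstract's conclusion concrete: two runs at the same nominal g-force but different rotor geometries clear vesicles at different rates, so reporting g-force alone under-specifies the protocol.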

  7. Comparative and retrospective molecular analysis of Parapoxvirus (PPV) isolates.

    Science.gov (United States)

    Friederichs, Schirin; Krebs, Stefan; Blum, Helmut; Wolf, Eckhard; Lang, Heike; von Buttlar, Heiner; Büttner, Mathias

    2014-03-01

    Species members of the genus Parapoxvirus (PPV) within the family Poxviridae cause contagious pustular dermatitis in small ruminants (Orf virus, ORFV) and mostly mild localized inflammation in cattle (bovine papular stomatitis virus, BPSV, and pseudocowpox virus, PCPV). All PPVs are known to be zoonotic, leading to circumscribed skin lesions in humans, historically known as milker's nodules. Human PPV isolates are often ill-defined concerning their allocation to an animal origin. Here we present a comparative molecular analysis of a unique collection of 21 historic and recent human and animal PPV cell culture isolates (and two PPV DNA samples). Cell culture PPV propagation was restricted to primary ruminant fibroblasts and was strictly kept at low passages to avoid genomic changes by in vitro influences. For molecular arrangement of the isolate DNAs and their attribution to established PPV species, DNA fragments of the PPVs were generated by two different discriminating PCR protocols, targeting the major part of the open reading frame (ORF) 011 (B2L gene) and the complete ORF 032. Multiple sequence alignments and phylogenetic analysis of both genes resulted in affiliation to the known PPV species. The sequences from ORF 032 allowed discrimination of the isolate DNAs at a higher resolution. Human PPV isolates could be clearly assigned to the PPV species belonging to the reported or assumed animal host of transmission. For the first time, a whole PPV genome sequence comparison of a human biopsy-derived virus (B029) and its ovine counterpart (B015) originating from a defined Orf outbreak in Germany is provided, revealing their well-conserved relationship. Thus human PPVs can be molecularly retraced to the PPV species indicating the animal of transmission. After transmission to the human host, molecular conservation of the animal virus's peculiarities indicative of a PPV species became evident. PMID:24373950
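The discrimination of isolates by B2L (ORF 011) and ORF 032 sequences ultimately reduces to comparing aligned sequences. A hedged sketch of a pairwise percent-identity calculation follows; the sequences are made up for illustration and stand in for the real PCR-amplified gene fragments.

```python
# Illustrative percent-identity calculation between two pre-aligned
# sequences. The example sequences are invented, not real B2L fragments.

def percent_identity(a, b):
    """Identity over aligned positions, ignoring gap-gap columns."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    pairs = [(x, y) for x, y in zip(a, b) if not (x == "-" and y == "-")]
    matches = sum(1 for x, y in pairs if x == y and x != "-")
    return 100.0 * matches / len(pairs) if pairs else 0.0

seq_human = "ATGGCTTACCGA-TA"   # hypothetical human-isolate fragment
seq_ovine = "ATGGCATACCGAGTA"   # hypothetical ovine-isolate fragment
print(round(percent_identity(seq_human, seq_ovine), 1))
```

In practice such pairwise identities feed a distance matrix from which phylogenetic trees, like those used in the study to affiliate isolates with PPV species, are built.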

  8. Comparative analysis of different methods for graphene nanoribbon synthesis

    Directory of Open Access Journals (Sweden)

    Tošić Dragana D.

    2013-01-01

    Full Text Available Graphene nanoribbons (GNRs) are thin strips of graphene that have captured the interest of scientists due to their unique structure and promising applications in electronics. This paper presents the results of a comparative analysis of morphological properties of graphene nanoribbons synthesized by different methods. Various methods have been reported for graphene nanoribbon synthesis. Lithography methods usually include electron-beam (e-beam) lithography, atomic force microscopy (AFM) lithography, and scanning tunnelling microscopy (STM) lithography. Sonochemical and chemical methods exist as well, namely chemical vapour deposition (CVD) and anisotropic etching. Graphene nanoribbons can also be fabricated by unzipping carbon nanotubes (CNTs). We propose a new highly efficient method for graphene nanoribbon production by gamma irradiation of graphene dispersed in cyclopentanone (CPO). Surface morphology of graphene nanoribbons was visualized with atomic force and transmission electron microscopy. It was determined that dimensions of graphene nanoribbons are inversely proportional to the applied gamma irradiation dose. It was established that the narrowest nanoribbons were 10-20 nm wide and 1 nm high with regular and smooth edges. In comparison to other synthesis methods, dimensions of graphene nanoribbons synthesized by gamma irradiation are slightly larger, but the yield of nanoribbons is much higher. Fourier transform infrared spectroscopy was used for structural analysis of graphene nanoribbons. Results of photoluminescence spectroscopy revealed for the first time that synthesized nanoribbons showed photoluminescence in the blue region of visible light, in contrast to graphene nanoribbons synthesized by other methods. 
Based on disclosed facts, we believe that our synthesis method has good prospects for potential future mass production of graphene nanoribbons with uniform size, as well as for future investigations of carbon nanomaterials for applications in optoelectronics and biological labeling.

  9. Arms control verification costs: the need for a comparative analysis

    International Nuclear Information System (INIS)

    The end of the Cold War era has presented practitioners and analysts of international non-proliferation, arms control and disarmament (NACD) the opportunity to focus more intently on the range and scope of NACD treaties and their verification. Aside from obvious favorable and well-publicized developments in the field of nuclear non-proliferation, progress also has been made in a wide variety of arenas, ranging from chemical and biological weapons, fissile material, conventional forces, ballistic missiles, to anti-personnel landmines. Indeed, breaking from the constraints imposed by the Cold War United States-Soviet adversarial zero-sum relationship that impeded the progress of arms control, particularly on a multilateral level, the post Cold War period has witnessed significant developments in NACD commitments, initiatives, and implementation. The goals of this project - in its final iteration - will be fourfold. First, it will lead to the creation of a costing analysis model adjustable for uses in several current and future arms control verification tasks. Second, the project will identify data accumulated in the cost categories outlined in Table 1 in each of the five cases. By comparing costs to overall effectiveness, the application of the model will demonstrate desirability in each of the cases (see Chart 1). Third, the project will identify and scrutinize 'political costs' as well as real expenditures and investment in the verification regimes (see Chart 2). And, finally, the project will offer some analysis on the relationship between national and multilateral forms of arms control verification, as well as the applicability of multilateralism as an effective tool in the verification of international non-proliferation, arms control, and disarmament agreements. (author)

  10. A Comparative Study of Kernel and Robust Canonical Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Ashad M. Alam

    2010-02-01

    Full Text Available A number of measures of the canonical correlation coefficient are now used in the literature in multimedia-related fields like object recognition, image segmentation, facial expression recognition and pattern recognition. Some robust forms of the classical canonical correlation coefficient have been introduced recently to address the robustness issue of the canonical coefficient in the presence of outliers and departures from normality. A few kernels have also been used in canonical analysis to capture nonlinear relationships in data space that are linear in some higher-dimensional feature space. But not much work has been done to investigate their relative performances through (i) simulation, from the viewpoint of sensitivity and breakdown analysis, as well as (ii) real data sets. In this paper an attempt has been made to compare the performances of kernel canonical correlation coefficients (Gaussian, Laplacian and polynomial functions) with those of robust and classical canonical correlation coefficient measures using simulation with five sample sizes (50, 500, 1000, 1500 and 2000), influence functions and breakdown points, along with several real data sets and a multi-modal data set, focusing on the specific case of segmented images with associated text. We investigate the bias, mean square error (MSE), qualitative robustness index (RI) and sensitivity curve of each estimator under a variety of situations, and also employ box plots and scatter plots of canonical variates to judge their performances. We have observed that the class of kernel estimators performs better than the classes of classical and robust estimators in general, and the kernel estimator with the Laplacian function has shown the best performance for large sample sizes; its breakdown point is also high in the case of nonlinear data.
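The three kernels compared in the study each replace the inner product of classical canonical correlation analysis with a nonlinear similarity. A minimal sketch of their standard definitions follows; the bandwidth and degree values are arbitrary examples, not the paper's settings.

```python
import math

# Standard kernel functions used in kernel canonical correlation
# analysis. Parameter values in the example call are illustrative.

def gaussian_kernel(x, y, sigma=1.0):
    """exp(-||x - y||^2 / (2 * sigma^2))"""
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2 * sigma ** 2))

def laplacian_kernel(x, y, sigma=1.0):
    """exp(-||x - y||_1 / sigma)"""
    d1 = sum(abs(a - b) for a, b in zip(x, y))
    return math.exp(-d1 / sigma)

def polynomial_kernel(x, y, degree=2, c=1.0):
    """(<x, y> + c)^degree"""
    dot = sum(a * b for a, b in zip(x, y))
    return (dot + c) ** degree

x, y = (1.0, 0.0), (0.0, 1.0)
print(round(gaussian_kernel(x, y), 4))   # exp(-1)
print(round(laplacian_kernel(x, y), 4))  # exp(-2)
print(polynomial_kernel(x, y))           # (0 + 1)^2
```

Kernel CCA then works with the Gram matrices K[i][j] = k(x_i, x_j) of each view rather than the raw covariance matrices, which is how a linear method captures nonlinear relationships.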

  11. Comparative analysis of discrete exosome fractions obtained by differential centrifugation

    DEFF Research Database (Denmark)

    Jeppesen, Dennis Kjølhede; Hvam, Michael L

    2014-01-01

    BACKGROUND: Cells release a mixture of extracellular vesicles, amongst these exosomes, that differ in size, density and composition. The standard isolation method for exosomes is centrifugation of fluid samples, typically at 100,000×g or above. Knowledge of the effect of discrete ultracentrifugation speeds on the purification from different cell types, however, is limited. METHODS: We examined the effect of applying differential centrifugation g-forces ranging from 33,000×g to 200,000×g on exosome yield and purity, using 2 unrelated human cell lines, embryonic kidney HEK293 cells and bladder carcinoma FL3 cells. The fractions were evaluated by nanoparticle tracking analysis (NTA), total protein quantification and immunoblotting for CD81, TSG101, syntenin, VDAC1 and calreticulin. RESULTS: NTA revealed the lowest background particle count in Dulbecco's Modified Eagle's Medium media devoid of phenol red and cleared by 200,000×g overnight centrifugation. The centrifugation tube fill level impacted the sedimentation efficacy. Comparative analysis by NTA, protein quantification, and detection of exosomal and contamination markers identified differences in vesicle size, concentration and composition of the obtained fractions. In addition, HEK293 and FL3 vesicles displayed marked differences in sedimentation characteristics. Exosomes were pelleted already at 33,000×g, a g-force which also removed most contaminating microsomes. Optimal vesicle-to-protein yield was obtained at 67,000×g for HEK293 cells but 100,000×g for FL3 cells. Relative expression of exosomal markers (TSG101, CD81, syntenin) suggested presence of exosome subpopulations with variable sedimentation characteristics. CONCLUSIONS: Specific g-force/k factor usage during differential centrifugation greatly influences the purity and yield of exosomes. The vesicle sedimentation profile differed between the 2 cell lines.

  12. Network Meta-analysis: Users' Guide for Surgeons: Part I - Credibility.

    Science.gov (United States)

    Foote, Clary J; Chaudhry, Harman; Bhandari, Mohit; Thabane, Lehana; Furukawa, Toshi A; Petrisor, Brad; Guyatt, Gordon

    2015-07-01

    Conventional meta-analyses quantify the relative effectiveness of two interventions based on direct (that is, head-to-head) evidence typically derived from randomized controlled trials (RCTs). For many medical conditions, however, multiple treatment options exist and not all have been compared directly. This issue limits the utility of traditional synthetic techniques such as meta-analyses, since these approaches can only pool and compare evidence across interventions that have been compared directly by source studies. Network meta-analyses (NMA) use direct and indirect comparisons to quantify the relative effectiveness of three or more treatment options. Interpreting the methodologic quality and results of NMAs may be challenging, as they use complex methods that may be unfamiliar to surgeons; yet for these surgeons to use these studies in their practices, they need to be able to determine whether they can trust the results of NMAs. The first judgment of trust requires an assessment of the credibility of the NMA methodology; the second judgment of trust requires a determination of certainty in effect sizes and directions. In this Users' Guide for Surgeons, Part I, we show the application of evaluation criteria for determining the credibility of a NMA through an example pertinent to clinical orthopaedics. In the subsequent article (Part II), we help readers evaluate the level of certainty NMAs can provide in terms of treatment effect sizes and directions. PMID:25869061
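The indirect comparisons that NMAs add to direct evidence can be illustrated with the Bucher adjusted method: the effect of treatment A versus C is estimated through a common comparator B. The sketch below uses invented log-scale effect estimates (e.g., log odds ratios), not data from any study discussed here.

```python
import math

# Bucher adjusted indirect comparison: d_AC = d_AB - d_CB,
# SE_AC = sqrt(SE_AB^2 + SE_CB^2). All numbers are illustrative.

def indirect_effect(d_ab, se_ab, d_cb, se_cb):
    """Effect of A vs C estimated through common comparator B."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    return d_ac, se_ac

# Hypothetical pooled trial summaries: A vs B and C vs B.
d_ac, se_ac = indirect_effect(d_ab=-0.50, se_ab=0.20, d_cb=-0.10, se_cb=0.25)
lo = d_ac - 1.96 * se_ac
hi = d_ac + 1.96 * se_ac
print(round(d_ac, 3), round(se_ac, 3))
print(round(lo, 3), round(hi, 3))  # 95% confidence interval
```

Note that the indirect standard error is larger than either direct one, which is why the companion article's focus on certainty of effect estimates matters: indirect evidence is inherently less precise.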

  13. Comparative analysis of graphite oxidation behaviour based on microstructure

    Energy Technology Data Exchange (ETDEWEB)

    Badenhorst, Heinrich, E-mail: heinrich.badenhorst@up.ac.za; Focke, Walter

    2013-11-15

    Two unidentified powdered graphite samples, from a natural and a synthetic origin respectively, were examined. These materials are intended for use in nuclear applications, but have an unknown treatment history since they are considered proprietary. In order to establish a baseline for comparison, the samples were compared to two commercial flake natural graphite samples with varying impurity levels. The samples were characterized by conventional techniques such as powder X-ray diffraction, Raman spectroscopy and X-ray fluorescence. The results indicated that all four samples were very similar, with low impurity levels and good crystallinity, yet they exhibit remarkably different oxidation behaviours. The oxidized microstructures of the materials were examined using high-resolution scanning electron microscopy at low acceleration voltages. The relative influence of each factor affecting the oxidation was established, enabling a structured comparison of the different oxidative behaviours. Based on this analysis, it was possible to account for the measured differences in oxidative reactivity. The material with the lowest reactivity was a flake natural graphite which was characterized as having highly visible crystalline perfection, large particles with a high aspect ratio and no traces of catalytic activity. The second sample, which had an identical inherent microstructure, was found to have an increased reactivity due to the presence of small catalytic impurities. This material also exhibited a more gradual reduction in the oxidation rate at higher conversion, caused by the accumulation of particles which impede the oxidation. The sample with the highest reactivity was found to be a milled, natural graphite material, despite its evident crystallinity. The increased reactivity was attributable to a smaller particle size, the presence of catalytic impurities and extensive damage to the particle structure caused by jet milling. 
Despite displaying the lowest levels of crystalline perfection, the synthetic graphite had an intermediate reactivity, comparable to that of the highly crystalline but contaminated sample. The absence of catalytic impurities and the needle coke-derived particle structure were found to account for this behaviour. This work illustrates that the single most important factor when comparing unknown graphite materials from different origins is an assessment of the oxidized microstructure. This approach has the added benefit of identifying further potential processing steps and limitations for material customization.

  14. Comparative analysis of graphite oxidation behaviour based on microstructure

    Science.gov (United States)

    Badenhorst, Heinrich; Focke, Walter

    2013-11-01

    Two unidentified powdered graphite samples, from a natural and a synthetic origin respectively, were examined. These materials are intended for use in nuclear applications, but have an unknown treatment history since they are considered proprietary. In order to establish a baseline for comparison, the samples were compared to two commercial flake natural graphite samples with varying impurity levels. The samples were characterized by conventional techniques such as powder X-ray diffraction, Raman spectroscopy and X-ray fluorescence. The results indicated that all four samples were very similar, with low impurity levels and good crystallinity, yet they exhibit remarkably different oxidation behaviours. The oxidized microstructures of the materials were examined using high-resolution scanning electron microscopy at low acceleration voltages. The relative influence of each factor affecting the oxidation was established, enabling a structured comparison of the different oxidative behaviours. Based on this analysis, it was possible to account for the measured differences in oxidative reactivity. The material with the lowest reactivity was a flake natural graphite which was characterized as having highly visible crystalline perfection, large particles with a high aspect ratio and no traces of catalytic activity. The second sample, which had an identical inherent microstructure, was found to have an increased reactivity due to the presence of small catalytic impurities. This material also exhibited a more gradual reduction in the oxidation rate at higher conversion, caused by the accumulation of particles which impede the oxidation. The sample with the highest reactivity was found to be a milled, natural graphite material, despite its evident crystallinity. The increased reactivity was attributable to a smaller particle size, the presence of catalytic impurities and extensive damage to the particle structure caused by jet milling. 
Despite displaying the lowest levels of crystalline perfection, the synthetic graphite had an intermediate reactivity, comparable to that of the highly crystalline but contaminated sample. The absence of catalytic impurities and the needle coke-derived particle structure were found to account for this behaviour. This work illustrates that the single most important factor when comparing unknown graphite materials from different origins is an assessment of the oxidized microstructure. This approach has the added benefit of identifying further potential processing steps and limitations for material customization.

  15. Comparative analysis of graphite oxidation behaviour based on microstructure

    International Nuclear Information System (INIS)

    Two unidentified powdered graphite samples, from a natural and a synthetic origin respectively, were examined. These materials are intended for use in nuclear applications, but have an unknown treatment history since they are considered proprietary. In order to establish a baseline for comparison, the samples were compared to two commercial flake natural graphite samples with varying impurity levels. The samples were characterized by conventional techniques such as powder X-ray diffraction, Raman spectroscopy and X-ray fluorescence. The results indicated that all four samples were very similar, with low impurity levels and good crystallinity, yet they exhibit remarkably different oxidation behaviours. The oxidized microstructures of the materials were examined using high-resolution scanning electron microscopy at low acceleration voltages. The relative influence of each factor affecting the oxidation was established, enabling a structured comparison of the different oxidative behaviours. Based on this analysis, it was possible to account for the measured differences in oxidative reactivity. The material with the lowest reactivity was a flake natural graphite which was characterized as having highly visible crystalline perfection, large particles with a high aspect ratio and no traces of catalytic activity. The second sample, which had an identical inherent microstructure, was found to have an increased reactivity due to the presence of small catalytic impurities. This material also exhibited a more gradual reduction in the oxidation rate at higher conversion, caused by the accumulation of particles which impede the oxidation. The sample with the highest reactivity was found to be a milled, natural graphite material, despite its evident crystallinity. The increased reactivity was attributable to a smaller particle size, the presence of catalytic impurities and extensive damage to the particle structure caused by jet milling. 
Despite displaying the lowest levels of crystalline perfection, the synthetic graphite had an intermediate reactivity, comparable to that of the highly crystalline but contaminated sample. The absence of catalytic impurities and the needle coke-derived particle structure were found to account for this behaviour. This work illustrates that the single most important factor when comparing unknown graphite materials from different origins is an assessment of the oxidized microstructure. This approach has the added benefit of identifying further potential processing steps and limitations for material customization

  16. Comparative Analysis Of Fuzzy Clustering Algorithms In Data Mining

    OpenAIRE

    Binsy Thomas, Madhu Nashipudimath

    2012-01-01

    Data clustering acts as an intelligent tool, a method that allows the user to handle large volumes of data effectively. The basic function of clustering is to transform data of any origin into a more compact form, one that represents the original data accurately. Clustering algorithms are used to analyze these large collections of data by subdividing them into groups of similar data. Fuzzy clustering extends the crisp clustering technique in such a way that instead of an object belong...
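The partial-membership idea that distinguishes fuzzy from crisp clustering can be sketched with the standard fuzzy c-means membership formula, u_ic = 1 / sum_j (d_ic / d_jc)^(2/(m-1)); the one-dimensional points and cluster centres below are illustrative only.

```python
# Fuzzy c-means membership of a 1-D point in each cluster, given fixed
# centres. A crisp algorithm would assign each point to exactly one
# cluster; here every point gets a degree of membership in all clusters.
# Data and centres are illustrative.

def memberships(point, centers, m=2.0):
    """Membership degrees of `point` in each cluster (fuzzifier m > 1)."""
    dists = [max(abs(point - c), 1e-12) for c in centers]  # avoid /0
    exponent = 2.0 / (m - 1.0)
    return [1.0 / sum((d_i / d_j) ** exponent for d_j in dists)
            for d_i in dists]

centers = [0.0, 10.0]
for p in [1.0, 5.0, 9.0]:
    print([round(u, 3) for u in memberships(p, centers)])
```

A point near a centre gets membership close to 1 there, while the midpoint (5.0) splits evenly at 0.5/0.5; memberships always sum to one, which is exactly the relaxation of crisp assignment the abstract describes.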

  17. Comparative Analysis of Various Authentication Techniques in Cloud Computing

    OpenAIRE

    SHABNAM SHARMA; USHA MITTAL

    2013-01-01

    Over recent years, there has been great advancement in the field of Computer Science. Cloud Computing is the result of advancement in existing technologies. It shares characteristics with Autonomic Computing, the Client-Server Model, Grid Computing, Mainframe Computers, Utility Computing, Peer-to-Peer and Cloud Gaming. Cloud Computing is beneficial not only for users but also for large and small organizations. Security issues are the major concern in Cloud Computing. In this paper, our foc...

  18. Comparative Analysis of Fragment based and Exemplar based Inpainting Techniques

    OpenAIRE

    J.N.KAZI; Patil, Y. M.

    2013-01-01

    Inpainting is an art of modifying the digital image in such a way that the modifications/alterations are undetectable to an observer who does not know the original image. Applications of this technique include restoration of damaged photographs & films, removal of superimposed text, and removal/replacement of unwanted objects. After the user selects a region to be inpainted, the algorithm automatically fills in these holes with data sampled from the remainder of the image. In past the problem of inpaint...

  19. Exploiting all phone media? A multidimensional network analysis of phone users' sociality

    CERN Document Server

    Zignani, Matteo; Gaitto, Sabrina; Rossi, Gian Paolo

    2014-01-01

    The growing awareness that human communications and social interactions are assuming a stratified structure, due to the availability of multiple techno-communication channels, including online social networks, mobile phone calls, short messages (SMS) and e-mails, has recently led to the study of multidimensional networks, as a step beyond classical Social Network Analysis. A few papers have been dedicated to developing the theoretical framework to deal with such multiplex networks and to analyzing some examples of multidimensional social networks. In this context we perform the first study of the multiplex mobile social network, gathered from the records of both call and text message activities of millions of users of a large mobile phone operator over a period of 12 weeks. While social networks constructed from mobile phone datasets have drawn great attention in recent years, so far studies have dealt with text message and call data separately, providing a very partial view of people's sociality expressed on p...
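The multiplex view taken in the study, with the same users linked through separate call and SMS layers, can be sketched with toy edge sets; the edges below are invented for illustration.

```python
# Two layers over the same user set: calls and SMS, as undirected edges.
# The edge lists are toy data, not from the operator dataset.

call_edges = {("a", "b"), ("a", "c"), ("b", "d")}
sms_edges  = {("a", "b"), ("b", "d"), ("c", "d")}

def layer_overlap(e1, e2):
    """Jaccard overlap between two layers' undirected edge sets."""
    norm = lambda es: {frozenset(e) for e in es}
    a, b = norm(e1), norm(e2)
    return len(a & b) / len(a | b)

def neighbours(node, edges):
    """Neighbours of a node within a single layer."""
    out = set()
    for u, v in edges:
        if u == node:
            out.add(v)
        elif v == node:
            out.add(u)
    return out

print(round(layer_overlap(call_edges, sms_edges), 2))
# Multiplex neighbourhood: union of a user's ties across both layers.
print(sorted(neighbours("b", call_edges) | neighbours("b", sms_edges)))
```

Analyzing either layer alone misses the ties present only in the other, which is the "partial view" the abstract argues single-channel studies give.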

  20. The Quantitative Analysis of User Behavior Online - Data, Models and Algorithms

    Science.gov (United States)

    Raghavan, Prabhakar

    By blending principles from mechanism design, algorithms, machine learning and massive distributed computing, the search industry has become good at optimizing monetization on sound scientific principles. This represents a successful and growing partnership between computer science and microeconomics. When it comes to understanding how online users respond to the content and experiences presented to them, we have more of a lacuna in the collaboration between computer science and certain social sciences. We will use a concrete technical example from image search results presentation, developing in the process some algorithmic and machine learning problems of interest in their own right. We then use this example to motivate the kinds of studies that need to grow between computer science and the social sciences; a critical element of this is the need to blend large-scale data analysis with smaller-scale eye-tracking and "individualized" lab studies.

  1. Light water reactor fuel analysis code FEMAXI-IV(Ver.2). Detailed structure and user's manual

    International Nuclear Information System (INIS)

    A light water reactor fuel behavior analysis code, FEMAXI-IV(Ver.2), was developed as an improved version of FEMAXI-IV. Development of FEMAXI-IV was completed in 1992, though a detailed description of the code structure and an input manual had not yet been made available to users. Here, the basic theories and structure, the models and numerical solutions applied to FEMAXI-IV(Ver.2), and the material properties adopted in the code are described in detail. In FEMAXI-IV(Ver.2), programming bugs in the previous FEMAXI-IV were eliminated, the pellet thermal conductivity model was renewed, and a model of thermal-stress restraint on FP gas release was incorporated. To facilitate effective and wide-ranging application of the code, the input/output methods of the code are also described in detail, and sample output is included. (author)

  2. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) has been developed since 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests already in the public literature as well as data from NIST and other highly respected sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and a tank shape was added. Additionally, the tool was converted to Fortran 95 to allow for easier distribution of the material and tool. This document provides the user instructions for the operation of this system.

  3. GPP user's guide: A general-purpose postprocessor for wind turbine data analysis

    Science.gov (United States)

    Buhl, M. L., Jr.

    1995-01-01

    GPP is a General-Purpose Postprocessor for wind turbine data analysis. The author, a member of the Wind Technology Division (WTD) of the National Renewable Energy Laboratory (NREL), developed GPP to postprocess test data and simulation predictions. GPP reads data into large arrays and allows the user to run many types of analyses on the data stored in memory. It runs on inexpensive computers common in the wind industry. One can even use it on a laptop in the field. The author wrote the program in such a way as to make it easy to add new types of analyses and to port it to many types of computers. Although GPP is very powerful and feature-rich, it is still very easy to learn and to use. Exhaustive error trapping prevents one from losing valuable work due to input errors. GPP will, hopefully, make a significant impact on engineering productivity in the wind industry.

  4. ComPath: comparative enzyme analysis and annotation in pathway/subsystem contexts

    Directory of Open Access Journals (Sweden)

    Kim Sun

    2008-03-01

    Full Text Available Abstract Background Once a new genome is sequenced, one of the important questions is to determine the presence and absence of biological pathways. Analysis of biological pathways in a genome is a complicated task, since a number of biological entities are involved in pathways and biological pathways in different organisms are not identical. Computational pathway identification and analysis thus involves a number of computational tools and databases and is typically done in comparison with pathways in other organisms. This computational requirement is well beyond the capability of most biologists, so information systems for reconstructing, annotating, and analyzing biological pathways are much needed. We introduce a new comparative pathway analysis workbench, ComPath, which integrates various resources and computational tools using an interactive spreadsheet-style web interface for reliable pathway analyses. Results ComPath allows users to compare biological pathways in multiple genomes using a spreadsheet-style web interface where various sequence-based analyses can be performed, either to compare enzymes (e.g. sequence clustering) and pathways (e.g. pathway hole identification), to search a genome for de novo prediction of enzymes, or to annotate a genome in comparison with reference genomes of choice. To fill in pathway holes or make de novo enzyme predictions, multiple computational methods such as FASTA, Whole-HMM, CSR-HMM (a method of our own introduced in this paper), and PDB-domain search are integrated in ComPath. Our experiments show that the FASTA and CSR-HMM search methods generally outperform the Whole-HMM and PDB-domain search methods in terms of sensitivity, but FASTA search performs poorly in terms of specificity, detecting more false positives as the E-value cutoff increases. Overall, the CSR-HMM search method performs best in terms of both sensitivity and specificity. Gene neighborhood and pathway neighborhood (global network) visualization tools can be used to get context information that is complementary to the conventional KEGG map representation. Conclusion ComPath is an interactive workbench for pathway reconstruction, annotation, and analysis where experts can perform various sequence, domain, and context analyses using an intuitive and interactive spreadsheet-style interface.

  5. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, is for probabilistic seismic hazard analysis, one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. SHEAT was first developed as a large-sized computer version; in 2001 a personal computer version was provided to improve the operating efficiency and generality of the code. Earthquake hazard analysis, display, and print functions can all be performed through the Graphical User Interface. With the SHEAT for PC code, seismic hazard, defined as the annual exceedance frequency of earthquake ground motions at various levels of intensity at a given site, is calculated in the following two steps, as is done with the large-sized computer version. The first is the modeling of earthquake generation around a site: future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on historical earthquake records, active fault data and expert judgment. The second is the calculation of the probabilistic seismic hazard at the site: an earthquake ground motion is calculated for each postulated earthquake using an attenuation model that takes its standard deviation into account, and the seismic hazard at the site is then obtained by summing the frequencies of ground motions over all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code; (2) the functions of the subprograms and the analytical models within them; (3) guidance on input and output data; (4) a sample run result; and (5) an operational manual. (author)
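The two-step hazard calculation described above can be sketched numerically: each postulated earthquake source contributes its annual rate multiplied by the probability, under a lognormal attenuation model, that its ground motion exceeds a given intensity level. The source parameters below are invented for illustration, not taken from SHEAT:

```python
import math

def p_exceed(median_pga, sigma_ln, level):
    """P(ground motion > level) under a lognormal attenuation model."""
    z = (math.log(level) - math.log(median_pga)) / sigma_ln
    # Survival function of the standard normal, via the complementary
    # error function: P(Z > z) = 0.5 * erfc(z / sqrt(2)).
    return 0.5 * math.erfc(z / math.sqrt(2))

def annual_exceedance(sources, level):
    """Sum each source's annual rate times its exceedance probability."""
    return sum(rate * p_exceed(med, sig, level) for rate, med, sig in sources)

# (annual rate [1/yr], median PGA [g], ln-space sigma) -- hypothetical sources
sources = [(0.01, 0.30, 0.5), (0.10, 0.10, 0.6)]
print(f"annual frequency of PGA > 0.2 g: {annual_exceedance(sources, 0.2):.4f}")
```

Evaluating the sum over a range of intensity levels traces out the hazard curve the abstract describes.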

  6. Comparative Analysis of Various Condenser in Vapour Compression Refrigeration System

    Directory of Open Access Journals (Sweden)

    Patil Deepak P.

    2014-09-01

    Full Text Available The present work analyzes the performance of a refrigeration system with three condensers, viz. micro-channel, round tube and coil tube, using R134a and R290 refrigerants. These three condensers are installed in parallel with the other components of the refrigerating unit during construction. The performance of the refrigeration system is checked for each condenser at various cooling loads in the range from 175 W to 288 W. The performance of the condenser is measured for the whole refrigeration unit in terms of the coefficient of performance, efficiency of the system, heat rejection ratio, heat rejected from the condenser, and heat transfer coefficient. The experimental heat transfer coefficient data are validated against an existing correlation. The results show that for both refrigerants R134a and R290, the coefficient of performance increases with increasing heating load. From the analysis of the three condensers, the coefficient of performance of the refrigeration system using the micro-channel condenser is higher than with the round tube and coil tube condensers: it is found to be 15.3% higher than with the round tube condenser and 8% higher than with the coil tube condenser. Also, R134a gives a better cooling effect than R290 for all operating conditions.
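The headline metric here, the coefficient of performance (COP), is simply the refrigerating effect divided by the work input. A minimal sketch with invented readings (the actual measurements are in the paper; the power figures below are hypothetical):

```python
def cop(cooling_load_w, compressor_power_w):
    """Coefficient of performance: refrigerating effect per unit work input."""
    return cooling_load_w / compressor_power_w

# Hypothetical compressor-power readings at the same 288 W cooling load
cop_microchannel = cop(288, 110)
cop_round_tube = cop(288, 127)

improvement = (cop_microchannel - cop_round_tube) / cop_round_tube * 100
print(f"micro-channel COP is {improvement:.1f}% higher than round tube")
```

Because the cooling load is held fixed, the COP ratio reduces to the inverse ratio of compressor powers, which is why a condenser that lowers condensing pressure (and hence compressor work) improves COP.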

  7. [Comparative analysis of seven marine biological source of mineral drugs].

    Science.gov (United States)

    Si, Wei; A, Ru-na; Li, Shang-rong; Zhang, Jing-Xian; Wu, Wan-ying; Cui, Ya-jun

    2014-09-01

    The marine biological source mineral drugs recorded in the Chinese Pharmacopoeia (2010 version), mainly including pearl, nacre, clam shell, common oyster shell, ark shell, cuttle bone, and sea-ear shell, are widely used clinically. Calcium carbonate and a small amount of protein are the main components of this type of drug. In this paper, a systematic and comparative study was carried out by determining the calcium carbonate content by an EDTA titration method, the crystal form of the calcium carbonate by X-ray powder diffraction, and the total amino acids (TAAs) of the hydrolyzed samples by an ultraviolet spectrophotometry method. As a result, the crystal structure is calcite for common oyster shell, a mixture of calcite and aragonite for nacre and sea-ear shell, and aragonite for the other drugs. The content of calcium carbonate ranged from 86% to 96%. Cuttle bone has the highest amount of TAAs among the seven drugs, reaching 1.7%, while clam shell has the lowest content, 0.16% on average. In conclusion, an effective method was developed for the quality control of marine mineral drugs by comprehensive analysis of calcium carbonate and TAAs in the seven marine mineral drugs. PMID:25522620
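An EDTA titration result converts to a calcium carbonate mass percent via the 1:1 Ca2+:EDTA complexation stoichiometry. The sketch below is illustrative only; the titration volume, molarity, and sample mass are hypothetical values, not data from the paper:

```python
def caco3_percent(v_edta_ml, m_edta, sample_mg):
    """Mass percent of CaCO3 from an EDTA titration (1:1 Ca:EDTA complex)."""
    MW_CACO3 = 100.09             # g/mol (equals mg/mmol)
    mmol_ca = v_edta_ml * m_edta  # mmol EDTA consumed == mmol Ca2+
    mg_caco3 = mmol_ca * MW_CACO3
    return 100.0 * mg_caco3 / sample_mg

# Hypothetical titration: 18.5 mL of 0.05 M EDTA against a 100 mg sample
print(f"CaCO3 content: {caco3_percent(18.5, 0.05, 100.0):.1f}%")
```

A result in this range would sit within the 86-96% span the paper reports for the seven drugs.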

  8. Comparative tectonic analysis of Black sea and western Mediterranean

    International Nuclear Information System (INIS)

    Full text : This paper outlines preliminary results of a comparative tectonic analysis of two regions of the Alpide belt: the Black Sea and the western Mediterranean. It is speculated that these two regions have, despite a number of essential differences, many profound analogies in their geological structure and development, so the similarities found can suggest new prospecting trends for oil and gas exploration in the circum-Black Sea basin using geological knowledge of the western Mediterranean. Such numerous analogies cannot be an accidental coincidence. They are deeply rooted in a common kinematic style reproducing much the same ensemble of tectonic terrains and fault patterns of different scales in remote segments of the western Tethys as a whole and its Ponto-Caspian segment in particular. Understanding the cohesive evolution of the circum-Black Sea structural constituents and the driving forces of its tectonic development represents a longstanding and intriguing geological problem. Over the past two decades, much has been published concerning the structural development of the Black Sea region as a consequence of the advent of modern geodynamical models employing plate tectonic reconstructions and new phases of exploration activity. On the other hand, the existence of mutually exclusive scenarios for the structural development of the area, the timing of its onset and the reconciliation of different data raise many questions puzzling the assessment of the hydrocarbon charge of its subbasins and prospecting efforts to determine top-priority trends for petroleum exploration.

  9. Comparative Analysis of Methods to Denoise CT Scan Images

    Directory of Open Access Journals (Sweden)

    TARANDEEP CHHABRA, GEETIKA DUA, TRIPTI MALHOTRA

    2013-07-01

    Full Text Available Medical images are generally noisy due to the physical mechanisms of the acquisition process. In CT scanning there is scope to trade off patient image quality and dose: a reduction in radiation dose (i.e. the amount of X-rays) affects the quality of the image and is responsible for image noise in CT. Most denoising algorithms assume additive white Gaussian noise; however, most medical images may contain non-Gaussian noise, such as Poisson noise in CT. This paper contains a comparative analysis of a number of denoising algorithms, namely Wiener filtering, wavelet decomposition, anisotropic diffusion, anisotropic diffusion in the wavelet domain, wave atom decomposition, median filtering and NL-means filtering. Quantitative performance metrics such as PSNR, SNR, MSE, S/MSE and MAD are then computed. This comparison helps in the assessment of image quality and fidelity. We conclude that anisotropic diffusion in the wavelet domain is the most efficient method for removing Poisson noise from CT scan images.
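Two of the metrics used to rank the denoising algorithms, MSE and PSNR, are straightforward to compute. A minimal sketch on a toy "image" with invented pixel values:

```python
import math

def mse(a, b):
    """Mean squared error between two equal-sized images (flat pixel lists)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, max_val=255.0):
    """Peak signal-to-noise ratio in dB; higher means less residual error."""
    e = mse(a, b)
    return float("inf") if e == 0 else 10.0 * math.log10(max_val ** 2 / e)

# Toy 2x2 "images": reference vs. denoised output (values invented)
original = [100, 120, 130, 140]
denoised = [101, 118, 131, 138]
print(f"MSE  = {mse(original, denoised):.2f}")
print(f"PSNR = {psnr(original, denoised):.2f} dB")
```

Ranking candidate denoising methods then amounts to comparing their PSNR (higher is better) and MSE (lower is better) against the reference image.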

  10. Comparative Genome Analysis of Lolium-Festuca Complex Species

    DEFF Research Database (Denmark)

    Czaban, Adrian; Byrne, Stephen

    2015-01-01

    The Lolium-Festuca complex incorporates species from the Lolium genera and the broad leaf Fescues. Plants belonging to this complex exhibit significant phenotypic plasticity for agriculturally important traits, such as annuality/perenniality, establishment potential, growth speed, nutritional value, winter hardiness, drought tolerance and resistance to grazing. In this study we have sequenced and assembled the low copy fraction of the genomes of Lolium westerwoldicum, Lolium multiflorum, Festuca pratensis and Lolium temulentum. We have also generated de-novo transcriptome assemblies for each species, and these have aided in the annotation of the genomic sequence. Using this data we were able to generate annotated assemblies of the gene rich regions of the four species to complement the already sequenced Lolium perenne genome. Using these gene models we have identified orthologous genes between the species. Our dataset enabled us to perform comparative gene family analysis for CBF (C-Repeat Binding Factor) proteins, which are key regulators of cold acclimation and freezing tolerance in plants.

  11. Comparative analysis of ADS gene promoter in seven Artemisia species.

    Science.gov (United States)

    Ranjbar, Mojtaba; Naghavi, Mohammad Reza; Alizadeh, Hoshang

    2014-12-01

    Artemisinin is the most effective antimalarial drug and is derived from Artemisia annua. Amorpha-4,11-diene synthase (ADS) controls the first committed step in artemisinin biosynthesis. ADS gene expression is regulated by transcription factors which bind to cis-acting elements on the ADS promoter and are probably responsible for the differences in ADS gene expression among Artemisia species. To identify the elements that are significantly involved in ADS gene expression, the ADS gene promoter of seven Artemisia species was isolated and a comparative analysis was performed on the ADS promoter sequences of these species. The results revealed that some cis-elements were unique to, or more numerous in, the high-artemisinin-producing species A. annua than in the other species. The light-responsive elements, W-box, CAAT-box, 5'-UTR py-rich stretch, TATA-box sequence and tandem repeat sequences were identified as important factors in the increased expression of the ADS gene. PMID:25572235

  12. Comparative exergy analysis of trigeneration systems for a dairy industry

    Energy Technology Data Exchange (ETDEWEB)

    Burbano, Juan Carlos [Technological University of Pereira (Colombia). Mechanical Engineering Faculty]. E-mail: jburbano@utp.edu.co; Pellegrini, Luiz Felipe; Oliveira Junior, Silvio de [University of Sao Paulo, SP (Brazil). Polytechnic School]. E-mails: luiz.pellegrini@poli.usp.br; silvio.oliveira@poli.usp.br

    2008-07-01

    The increasing costs of primary energy sources, such as gas and oil, drive consumers to seek more efficient use of them. High fuel prices and more restrictive ecological requirements are spurring technologies that better exploit those energy sources. In spite of the efforts being made to develop alternative renewable energy sources (biomass, for instance), fossil fuels will remain the predominant energy resource. Most industrialized countries, especially in the United States and Europe, have been developing energy policies to make better use of fossil fuels. These policies include programs for the efficient use of energy and the development of alternative energy sources, among others. The combined generation of heat and power is one of the technologies often used in industrial processes, and a trigeneration process is an alternative that increases the efficiency of power and thermal generation. The term trigeneration can be defined as the production of power, heat and additional cooling (generally, chilled water for air conditioning purposes). Trigeneration systems produce these three energy forms from a primary energy source such as natural gas or oil. This paper presents a comparative exergy analysis of different configurations of this type of system for a dairy industry, involving the use of steam turbines with compression and absorption chillers as well as a combined cycle with an absorption chiller. (author)

  13. Comparative whole genome analysis of six diagnostic brucellaphages.

    Science.gov (United States)

    Farlow, Jason; Filippov, Andrey A; Sergueev, Kirill V; Hang, Jun; Kotorashvili, Adam; Nikolich, Mikeljon P

    2014-05-15

    Whole genome sequencing of six diagnostic brucellaphages, Tbilisi (Tb), Firenze (Fz), Weybridge (Wb), S708, Berkeley (Bk) and R/C, was followed with genomic comparisons including recently described genomes of the Tb phage from Mexico (TbM) and Pr phage to elucidate genomic diversity and candidate host range determinants. Comparative whole genome analysis revealed high sequence homogeneity among these brucellaphage genomes and resolved three genetic groups consistent with defined host range phenotypes. Group I was composed of Tb and Fz phages that are predominantly lytic for Brucella abortus and Brucella neotomae; Group II included Bk, R/C, and Pr phages that are lytic mainly for B. abortus, Brucella melitensis and Brucella suis; Group III was composed of Wb and S708 phages that are lytic for B. suis, B. abortus and B. neotomae. We found that the putative phage collar protein is a variable locus with features that may be contributing to the host specificities exhibited by different brucellaphage groups. The presence of several candidate host range determinants is illustrated herein for future dissection of the differential host specificity observed among these phages. PMID:24530704

  14. Comparative Analysis of Serial Decision Tree Classification Algorithms

    Directory of Open Access Journals (Sweden)

    Matthew Nwokejizie Anyanwu

    2009-09-01

    Full Text Available Classification of data objects based on a predefined knowledge of the objects is a data mining and knowledge management technique used in grouping similar data objects together. It is a form of supervised learning, as it assigns class labels to data objects based on the relationship between the data items and a pre-defined class label. Classification algorithms have a wide range of applications, such as churn prediction, fraud detection, artificial intelligence, and credit card rating. Although many classification algorithms are available in the literature, decision trees are the most commonly used because they are easy to implement and easier to understand than other classification algorithms. A decision tree classification algorithm can be implemented in a serial or parallel fashion based on the volume of data, the memory space available on the computer resource, and the scalability of the algorithm. In this paper we review the serial implementations of decision tree algorithms and identify those that are commonly used. We also use experimental analysis based on sample data records (the Statlog data sets) to evaluate the performance of the commonly used serial decision tree algorithms.
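The core of a serial decision tree learner is a sequential scan over candidate split thresholds, scoring each by an impurity measure. A minimal sketch using Gini impurity; the data are invented, and this is a single-split illustration rather than any of the full algorithms the paper benchmarks on the Statlog data sets:

```python
def gini(labels):
    """Gini impurity of a list of class labels (0 = pure, 0.5 = 50/50)."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    return 1.0 - sum((k / n) ** 2 for k in counts.values())

def best_split(xs, ys):
    """Serial scan over thresholds; return (threshold, weighted impurity)."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

# Invented 1-D feature with two well-separated classes
xs = [1, 2, 3, 10, 11, 12]
ys = ["a", "a", "a", "b", "b", "b"]
print(best_split(xs, ys))  # threshold 3 separates the classes perfectly
```

A full tree is built by applying this scan recursively to each resulting partition; the serial-versus-parallel distinction in the paper concerns how these scans are scheduled over large data volumes.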

  15. Comparative analysis of PSO algorithms for PID controller tuning

    Science.gov (United States)

    Štimac, Goranka; Braut, Sanjin; Žigulić, Roberto

    2014-09-01

    The active magnetic bearing (AMB) suspends the rotating shaft and maintains it in a levitated position by applying controlled electromagnetic forces on the rotor in the radial and axial directions. Although the development of various control methods is rapid, the PID control strategy is still the most widely used in many applications, including AMBs. In order to tune the PID controller, a particle swarm optimization (PSO) method is applied. A comparative analysis is carried out in which two PSO algorithms, namely (1) PSO with linearly decreasing inertia weight (LDW-PSO) and (2) PSO with a constriction factor approach (CFA-PSO), are independently tested for different PID structures. The computer simulations are carried out with the aim of minimizing an objective function defined as the integral of time multiplied by the absolute value of error (ITAE). In order to validate the performance of the analyzed PSO algorithms, one-axis and two-axis radial rotor/active magnetic bearing systems are examined. The results show that the PSO algorithms are effective and easily implemented methods, providing stable convergence and good computational efficiency for different PID structures in rotor/AMB systems. Moreover, the PSO algorithms prove easy to use for controller tuning in both SISO and MIMO systems, taking into account the system delay and the interference between the horizontal and vertical rotor axes.
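The LDW-PSO variant compared above decreases the inertia weight linearly over the iterations, shifting the swarm from exploration toward exploitation. The sketch below is illustrative only: it minimizes a simple sphere function instead of the ITAE cost of a real PID/AMB loop, and all parameter values are conventional defaults rather than those of the paper:

```python
import random

def pso_ldw(f, dim, n_particles=20, iters=200, lo=-5.0, hi=5.0,
            w_start=0.9, w_end=0.4, c1=1.494, c2=1.494, seed=1):
    """PSO with linearly decreasing inertia weight (LDW-PSO sketch)."""
    rng = random.Random(seed)
    vmax = 0.5 * (hi - lo)  # velocity clamp for stability
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for t in range(iters):
        # Inertia weight decreases linearly from w_start to w_end.
        w = w_start - (w_start - w_end) * t / (iters - 1)
        for i in range(n_particles):
            for d in range(dim):
                v = (w * vel[i][d]
                     + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                     + c2 * rng.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-vmax, min(vmax, v))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: a 2-D sphere function in place of the ITAE cost.
best, val = pso_ldw(lambda x: sum(xi * xi for xi in x), dim=2)
print(f"best value found: {val:.6f}")
```

For PID tuning, the objective would instead simulate the closed loop for a candidate (Kp, Ki, Kd) particle and return its ITAE; the swarm mechanics are unchanged.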

  16. Comparative chiasma analysis using a computerised optical digitiser.

    Science.gov (United States)

    Shaw, D D; Knowles, G R

    1976-12-16

    A new computerised technique has been devised for measuring the distribution of chiasmata along diplotene bivalents. The method involves the introduction into the field of view of the microscope, of a fine light spot which can be accurately manipulated along the chromosomes of each bivalent. The data recorded include (a) the positions of the chiasmata along the bivalent in terms of their relative distances from the centromere and (b) the individual bivalent and cellular chiasma frequencies. -- The method has been applied to the analysis of chiasma distribution patterns in the two known species of the genus Caledia, C. species nova 1 and C. captiva and in two chromosomal races of the latter. Statistical tests indicate that within bivalents at least 40% of the comparative distribution patterns of chiasmata between races and species are significantly different. Similar comparisons between populations within races reveal only 18% significant differences. -- The observed distribution patterns of chiasmata in this genus suggest that chiasma formation is sequential from centromere to telomere. -- The variation in the frequency and distribution of chiasmata between races and species suggests that the interference distances between successive chiasmata are, at least partially, independent of chiasma frequency and position. -- The interracial and interspecific differences in chromosome structure are correlated with changes in chiasma pattern. PMID:1009812

  17. Comparative Analysis of German and Anglo-Saxon Business Culture

    Directory of Open Access Journals (Sweden)

    Hamburg Andrea

    2013-05-01

    Full Text Available Two premises form the starting point of the following study: on the one hand, that cultural background and cultural conditioning have a considerable influence upon business, and on the other hand, that nations having common origins are likely to present similar cultural conditioning. The first hypothesis finds support in the works of theoreticians and practitioners like E.T. Hall, Geert Hofstede, Richard Gesteland and others dealing with the "mental programming" of people called culture and with cultural differences around the world. For the second premise we analyze three cultures having common Germanic roots, namely the German, British (focusing on its English component) and American cultures, through the prism of their concept of time, relation to business, working and communication style, structure of management, and attitude towards hierarchy and interpersonal distance, including physical contact. As the results of our comparative analysis show, the above-mentioned business cultures have much in common regarding attitudes to time, business and interpersonal distance, but in the other segments they present considerable differences as well. Taking all aspects into consideration, the similarities deriving from their common Germanic origin offer the three cultures in question some advantages in business relations, but the essential differences they present should be borne in mind, too, to avoid failure in deal-making.

  18. Comparative analysis of solution methods for the point kinetics equations

    International Nuclear Information System (INIS)

    This paper presents a comparative analysis of different analytical solutions of the point kinetics equations, which involve two variables of interest: (a) the temporal behavior of the neutron population, and (b) the temporal behavior of the different groups of delayed neutron precursors. The first solution is based on a method that solves the transfer function of the differential equation for the neutron population, seeking the poles that determine the stability of this transfer function. In this section it is shown that the temporal variation of the system reactivity can be handled as required, since the integration time for this method does not affect the result. The second solution is based on an iterative method such as Runge-Kutta or the Euler method, where the algorithm solves only first-order differential equations, thereby providing a solution to each of the differential equations that make up the point kinetics system. In this section it is shown that a correct temporal behavior of the neutron population is obtained only when the integration is performed over very short time intervals, forcing the temporal variation of the reactivity to change very quickly without any control over the time. In both methods the same changes are applied, both to the system reactivity and to the integration times, and the results are validated by plotting the temporal behavior of the neutron population versus time. (Author)
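The iterative solution described in the second method can be sketched with an explicit Euler step over the one-delayed-group point kinetics equations, dn/dt = ((rho - beta)/Lambda) n + lambda C and dC/dt = (beta/Lambda) n - lambda C. All parameter values below are illustrative, and, as the abstract notes, the explicit scheme is only accurate when the integration step is very short:

```python
# Minimal sketch: explicit Euler integration of the one-delayed-group
# point kinetics equations. Parameter values are illustrative only.

def point_kinetics(rho, beta=0.0065, Lambda=1e-4, lam=0.08,
                   t_end=1.0, dt=1e-5):
    n = 1.0                         # normalized neutron population
    C = beta * n / (Lambda * lam)   # precursors at initial equilibrium
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dt * dn
        C += dt * dC
    return n

# A small positive step reactivity of 0.1 $ raises the population; the
# short dt is required for the explicit Euler solution to stay accurate.
print(f"n(1 s) after a +0.1 $ step: {point_kinetics(0.1 * 0.0065):.3f}")
```

With these parameters the result should track the prompt-jump approximation, n ≈ beta/(beta - rho) shortly after the step, followed by a slow exponential rise on the stable reactor period.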

  19. Comparative analysis of steam generator hideout return test

    Energy Technology Data Exchange (ETDEWEB)

    Song, H. R.; Kim, H. D.; Jeong, H. S. [KEPRI, Taejon (Korea, Republic of)

    1999-10-01

    To reduce steam generator (SG) tube degradation by IGA/SCC, many utilities have used the Molar Ratio Control (MRC) method; maintaining a neutral crevice pH in the SG is a useful way of decreasing the rates of IGA/SCC. The hideout return (HOR) test, performed by analyzing chemical impurities during the overhaul period, makes it possible to assess the crevice environment. HOR test (HORT) results from Model F type and Model System 80 type SGs were evaluated and compared. The analyzed chemical species were Na, K, Ca, Mg, Cl, SO{sub 4}, NO{sub 3}, PO{sub 4}, and silica. Generally, Na, K, Cl and silica, which are soluble phases in the crevice, are known to return during the Hot Zero Power (HZP) period; for Model System 80, however, they returned during the power reduction period. It was shown that the eggcrate tube support structure of the System 80 SG allows the soluble species to return easily. The crevice environments were assessed using the HORT results, which were based on MRC operation.

  20. Comparative analysis of steam generator hideout return test

    International Nuclear Information System (INIS)

    To reduce steam generator (SG) tube degradation by IGA/SCC, many utilities have used the Molar Ratio Control (MRC) method; maintaining a neutral crevice pH in the SG is a useful way of decreasing the rates of IGA/SCC. The hideout return (HOR) test, performed by analyzing chemical impurities during the overhaul period, makes it possible to assess the crevice environment. HOR test (HORT) results from Model F type and Model System 80 type SGs were evaluated and compared. The analyzed chemical species were Na, K, Ca, Mg, Cl, SO4, NO3, PO4, and silica. Generally, Na, K, Cl and silica, which are soluble phases in the crevice, are known to return during the Hot Zero Power (HZP) period; for Model System 80, however, they returned during the power reduction period. It was shown that the eggcrate tube support structure of the System 80 SG allows the soluble species to return easily. The crevice environments were assessed using the HORT results, which were based on MRC operation

  1. Comparative Analysis of Principal Components Can be Misleading.

    Science.gov (United States)

    Uyeda, Josef C; Caetano, Daniel S; Pennell, Matthew W

    2015-07-01

    Most existing methods for modeling trait evolution are univariate, although researchers are often interested in investigating evolutionary patterns and processes across multiple traits. Principal components analysis (PCA) is commonly used to reduce the dimensionality of multivariate data so that univariate trait models can be fit to individual principal components. The problem with using standard PCA on phylogenetically structured data has been previously pointed out, yet it continues to be widely used in the literature. Here we demonstrate precisely how using standard PCA can mislead inferences: The first few principal components of traits evolved under constant-rate multivariate Brownian motion will appear to have evolved via an "early burst" process. A phylogenetic PCA (pPCA) has been proposed to alleviate these issues. However, when the true model of trait evolution deviates from the model assumed in the calculation of the pPCA axes, we find that the use of pPCA suffers from similar artifacts as standard PCA. We show that data sets with high effective dimensionality are particularly likely to lead to erroneous inferences. Ultimately, all of the problems we report stem from the same underlying issue: by considering only the first few principal components as univariate traits, we are effectively examining a biased sample of a multivariate pattern. These results highlight the need for truly multivariate phylogenetic comparative methods. As these methods are still being developed, we discuss potential alternative strategies for using and interpreting models fit to univariate axes of multivariate data. PMID:25841167

  2. Does Offline Political Segregation Affect the Filter Bubble? An Empirical Analysis of Information Diversity for Dutch and Turkish Twitter Users

    CERN Document Server

    Bozdag, Engin; Houben, Geert-Jan; Warnier, Martijn

    2014-01-01

    From a liberal perspective, pluralism and viewpoint diversity are seen as a necessary condition for a well-functioning democracy. Recently, there have been claims that viewpoint diversity is diminishing in online social networks, putting users in a "bubble", where they receive political information which they agree with. The contributions from our investigations are fivefold: (1) we introduce different dimensions of the highly complex value viewpoint diversity using political theory; (2) we provide an overview of the metrics used in the literature of viewpoint diversity analysis; (3) we operationalize new metrics using the theory and provide a framework to analyze viewpoint diversity in Twitter for different political cultures; (4) we share our results for a case study on minorities we performed for Turkish and Dutch Twitter users; (5) we show that minority users cannot reach a large percentage of Turkish Twitter users. With the last of these contributions, using theory from communication scholars and philoso...

  3. Privacy - an Issue for eLearning? A Trend Analysis Reflecting the Attitude of European eLearning Users

    CERN Document Server

    Borcea-Pfitzmann, Katrin

    2007-01-01

    Using services provided via the Internet has become a widely accepted way of organising one's life, and eLearning follows this trend. But while employing Internet services makes life more convenient, it also raises risks with respect to the protection of users' privacy. This paper analyses the attitudes of eLearning users towards their privacy by, initially, pointing out terminology and legal issues connected with privacy. Further, the concept, implementation and result analysis of a conducted study are presented, which explores the problem area from different perspectives. The paper shows that eLearning users do care about the protection of their personal information when using eLearning services, although their attitudes and behaviour differ slightly. In conclusion, we outline first approaches for assisting users in resolving the difference between their privacy requirements and their actual activities.

  4. BPO crude oil analysis data base user's guide: Methods, publications, computer access, correlations, uses, availability

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fox, B.; Paulz, J.

    1996-03-01

    The Department of Energy (DOE) has one of the largest and most complete collections of information on crude oil composition that is available to the public. The computer program that manages this database of crude oil analyses has recently been rewritten to allow easier access to this information. This report describes how the new system can be accessed and how the information contained in the Crude Oil Analysis Data Bank can be obtained.

  5. C language program analysis system (CLAS) part 1: graphical user interface (GUI)

    International Nuclear Information System (INIS)

    CLAS (C Language Program Analysis System) is a reverse engineering tool intended for use in the verification and validation (V and V) phase of software programs developed in the ANSI C language. From the source code, CLAS generates data pertaining to two conceptual models of software programs, viz., the Entity-Relationship (E-R) model and the Control Flow Graph (CFG) model. Browsing tools within CLAS make use of this data to provide different graphical views of the project. Static analysis tools have been developed earlier for analysing assembly language programs. CLAS is a continuation of this work to provide automated support in the analysis of ANSI C language programs. CLAS provides an integrated Graphical User Interface (GUI) based environment under which programs can be analysed into the above mentioned models and the analysed data can be viewed using the browsing tools. The GUI of CLAS is implemented using an OPEN LOOK compliant tool kit, XVIEW, on a Sun SPARC IPC workstation running Sun OS 4.1.1 rev. B. This report describes the GUI of CLAS. CLAS is also expected to be useful in other contexts which may involve understanding the architecture/structure of already developed C language programs. Such requirements can arise while carrying out activities like code modification, parallelising etc. (author). 5 refs., 13 figs., 1 appendix

  6. Translators and Interpreters Certification in Australia, Canada, the USA and Ukraine: Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Skyba Kateryna

    2014-09-01

    Full Text Available The article presents an overview of the certification process by which potential translators and interpreters demonstrate minimum standards of performance to warrant official or professional recognition of their ability to translate or interpret and to practice professionally in Australia, Canada, the USA and Ukraine. The aim of the study is to research and to compare the certification procedures of translators and interpreters in Australia, Canada, the USA and Ukraine, and to outline possible avenues for creating a certification system network in Ukraine. It has been revealed that there is great variation in minimum requirements for practice, availability of training facilities and formal bodies that certify practitioners and that monitor and advance specialists’ practices in these countries. Certification can be awarded by governmental or non-governmental organizations or associations of professionals in the field of translation/interpretation. Testing has been acknowledged as the usual avenue for candidates to gain certification. Less common routes to certification include completed training, presentation of previous relevant experience, and/or recommendations from practicing professionals or service-users. The comparative analysis has revealed elements of the certification procedures and national conventions in the researched countries that may form a basis for a Ukrainian translator/interpreter certifying system and make it part of a cross-national one.

  7. An Economic Analysis of User-Privacy Options in Ad-Supported Services

    OpenAIRE

    Feigenbaum, Joan; Mitzenmacher, Michael; Zervas, Georgios

    2012-01-01

    We analyze the value to e-commerce website operators of offering privacy options to users, e.g., of allowing users to opt out of ad targeting. In particular, we assume that site operators have some control over the cost that a privacy option imposes on users and ask when it is to their advantage to make such costs low. We consider both the case of a single site and the case of multiple sites that compete both for users who value privacy highly and for users who value it less...

  8. HOW CAN ELECTRONIC COMMERCE IN DEVELOPING COUNTRIES ATTRACT USERS FROM DEVELOPED COUNTRIES? A COMPARATIVE STUDY OF THAILAND AND JAPAN

    OpenAIRE

    Tetsuro Kobayashi; Hitoshi Okada; Nagul Cooharojananone; Vanessa Bracamonte; Takahisa Suzuki

    2013-01-01

    A comparative study of Thailand and Japan investigated how electronic commerce (EC) in developing countries can be used to attract customers from developed countries. Thai and Japanese participants were shown language-appropriate versions of a hotel booking website in Thailand. Perceptions of and trust in the website were assessed, as was the willingness to book a room in the hotel using the website. The Thai participants tended to evaluate the quality of the website more highly and to trust ...

  9. Comparative Analysis of Various Authentication Techniques in Cloud Computing

    Directory of Open Access Journals (Sweden)

    SHABNAM SHARMA

    2013-04-01

    Full Text Available Over recent years, there has been great advancement in the field of Computer Science. Cloud Computing is the result of advancement in existing technologies. It shares characteristics with Autonomic Computing, the Client-Server Model, Grid Computing, Mainframe Computing, Utility Computing, Peer-to-Peer Computing and Cloud Gaming. Cloud Computing is beneficial not only for users but also for large and small organizations. Security issues are a major concern in Cloud Computing. In this paper, our focus is on the authentication techniques used for verifying the client's identity to the Cloud Broker.

  10. Integrating Actionable User-defined Faceted Rules into the Hybrid Science Data System for Advanced Rapid Imaging & Analysis

    Science.gov (United States)

    Manipon, G. J. M.; Hua, H.; Owen, S. E.; Sacco, G. F.; Agram, P. S.; Moore, A. W.; Yun, S. H.; Fielding, E. J.; Lundgren, P.; Rosen, P. A.; Webb, F.; Liu, Z.; Smith, A. T.; Wilson, B. D.; Simons, M.; Poland, M. P.; Cervelli, P. F.

    2014-12-01

    The Hybrid Science Data System (HySDS) scalably powers the ingestion, metadata extraction, cataloging, high-volume data processing, and publication of the geodetic data products for the Advanced Rapid Imaging & Analysis for Monitoring Hazard (ARIA-MH) project at JPL. HySDS uses a heterogeneous set of worker nodes from private & public clouds as well as virtual & bare-metal machines to perform every aspect of the traditional science data system. For our science data users, the forefront of HySDS is the facet search interface, FacetView, which allows them to browse, filter, and access the published products. Users are able to explore the collection of product metadata information and apply multiple filters to constrain the result set down to their particular interests. It allows them to download these faceted products for further analysis and generation of derived products. However, we have also employed a novel approach to faceting where it is also used to apply constraints for custom monitoring of products, system resources, and triggers for automated data processing. The power of the facet search interface is well documented across various domains and its usefulness is rooted in the current state of existence of metadata. However, user needs usually extend beyond what is currently present in the data system. A user interested in synthetic aperture radar (SAR) data over Kilauea will download them from FacetView but would also want email notification of future incoming scenes. The user may even want that data pushed to a remote workstation for automated processing. Better still, these future products could trigger HySDS to run the user's analysis on its array of worker nodes, on behalf of the user, and ingest the resulting derived products. 
We will present our findings in integrating an ancillary, user-defined, system-driven processing system for HySDS that allows users to define faceted rules based on facet constraints and triggers actions when new SAR data products arrive that match the constraints. We will discuss use cases where users have defined rules for the automated generation of InSAR derived products: interferograms for California and Kilauea, time-series analyses, and damage proxy maps. These findings are relevant for science data system development of the proposed NASA-ISRO SAR mission.
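    The user-defined faceted rules described in this record can be pictured as facet constraints evaluated against the metadata of each newly ingested product, firing an action on a match. The sketch below is illustrative only; the rule structure and the names `matches` and `on_new_product` are hypothetical, not HySDS APIs.

```python
# Hypothetical faceted rule: fire an action (notify, push, trigger a job)
# when every facet constraint matches a new product's metadata.
def matches(constraints, metadata):
    """True if every faceted constraint is satisfied by the metadata."""
    return all(metadata.get(facet) in allowed
               for facet, allowed in constraints.items())

rules = [
    {"name": "notify-kilauea-sar",
     "constraints": {"dataset": {"SAR"}, "region": {"Kilauea"}},
     "action": lambda p: print(f"email user about {p['id']}")},
]

def on_new_product(product):
    """Evaluate all rules against an ingested product; return fired rule names."""
    fired = []
    for rule in rules:
        if matches(rule["constraints"], product):
            rule["action"](product)
            fired.append(rule["name"])
    return fired

print(on_new_product({"id": "S1-001", "dataset": "SAR", "region": "Kilauea"}))
```

In a production system the constraints would be captured directly from the facet filters a user has applied in the search interface, so a saved search doubles as a standing rule.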

  11. Comparative analysis of 60Co intensity-modulated radiation therapy

    International Nuclear Information System (INIS)

    In this study, we perform a scientific comparative analysis of using 60Co beams in intensity-modulated radiation therapy (IMRT). In particular, we evaluate the treatment plan quality obtained with (i) 6 MV, 18 MV and 60Co IMRT; (ii) different numbers of static multileaf collimator (MLC) delivered 60Co beams and (iii) a helical tomotherapy 60Co beam geometry. We employ a convex fluence map optimization (FMO) model, which allows for the comparison of plan quality between different beam energies and configurations for a given case. A total of 25 clinical patient cases that each contain volumetric CT studies, primary and secondary delineated targets, and contoured structures were studied: 5 head-and-neck (H and N), 5 prostate, 5 central nervous system (CNS), 5 breast and 5 lung cases. The DICOM plan data were anonymized and exported to the University of Florida optimized radiation therapy (UFORT) treatment planning system. The FMO problem was solved for each case for 5-71 equidistant beams as well as a helical geometry for H and N, prostate, CNS and lung cases, and for 3-7 equidistant beams in the upper hemisphere for breast cases, all with 6 MV, 18 MV and 60Co dose models. In all cases, 95% of the target volumes received at least the prescribed dose with clinical sparing criteria for critical organs being met for all structures that were not wholly or partially contained within the target volume. Improvements in critical organ sparing were found with an increasing number of equidistant 60Co beams, yet were marginal above 9 beams for H and N, prostate, CNS and lung. Breast cases produced similar plans for 3-7 beams. A helical 60Co beam geometry achieved similar plan quality as static plans with 11 equidistant 60Co beams. 
Furthermore, 18 MV plans were initially found not to provide the same target coverage as 6 MV and 60Co plans; however, adjusting the trade-offs in the optimization model allowed equivalent target coverage for 18 MV. For plans with comparable target coverage, critical structure sparing was best achieved with 6 MV beams followed closely by 60Co beams, with 18 MV beams requiring significantly increased dose to critical structures. In this paper, we report in detail on a representative set of results from these experiments. The results of the investigation demonstrate the potential for IMRT radiotherapy employing commercially available 60Co sources and a double-focused MLC. Increasing the number of equidistant beams beyond 9 was not observed to significantly improve target coverage or critical organ sparing, and static plans were found to produce comparable plans to those obtained using a helical tomotherapy treatment delivery when optimized using the same well-tuned convex FMO model. While previous studies have shown that 18 MV plans are equivalent to 6 MV for prostate IMRT, we found that the 18 MV beams actually required more fluence to provide similar quality target coverage.

  12. Comparative Analysis of Cross-Platform MAD Frameworks

    Directory of Open Access Journals (Sweden)

    Kunal Verma

    2014-09-01

    Full Text Available Mobile application development is becoming more difficult with diverse platforms and their software development kits. In recent years, mobile computing has undergone a revolution, but one challenge born of that revolution is technology and device fragmentation, which leaves developers struggling. Platform developers and device manufacturers ship so many features and functionalities that it has been hard to give developers an easier method of creating applications that run on every mobile phone in a cost- and time-effective way. To reduce development costs and reach the maximum number of users across a plethora of platforms, developers are migrating to cross-platform application development tools. In this paper, we give some decision criteria, beyond portability concerns, for choosing a suitable cross-platform solution for application development. We found that cross-platform solutions can be recommended in general, yet they are still limited where high requirements apply with respect to performance, usability or native user experience.

  13. Model Based User's Access Requirement Analysis of E-Governance Systems

    Science.gov (United States)

    Saha, Shilpi; Jeon, Seung-Hwan; Robles, Rosslin John; Kim, Tai-Hoon; Bandyopadhyay, Samir Kumar

    The strategic and contemporary importance of e-governance has been recognized across the world. In India too, various ministries of the Government of India and State Governments have taken e-governance initiatives to provide e-services to citizens and the businesses they serve. To achieve the mission objectives and make such e-governance initiatives successful, it is necessary to improve the trust and confidence of the stakeholders. It is assumed that the delivery of government services will share the same public network information that is used in the community at large. In particular, the Internet will be the principal means by which public access to government and government services is achieved. The main aim of the security measures is to identify the users' access requirements for the stakeholders according to the models of Nath's approach. Based on this analysis, the Government can also set security standards based on the e-governance models, leading to fewer human errors and less bias. This analysis leads to the security architecture of the specific G2C application.

  14. A COMPARATIVE STATISTICAL ANALYSIS OF RICE CULTIVARS DATA

    Directory of Open Access Journals (Sweden)

    Mugemangango Cyprien

    2012-12-01

    Full Text Available In this paper, rice cultivars data have been analysed by three different statistical techniques, viz. split-plot analysis in RBD, two-factor factorial analysis in RBD, and analysis of two-way classified data with several observations per cell. The powers of the tests under the different methods of analysis have been calculated. The method of two-way classified data with several observations per cell is found to perform best, followed by the two-factor factorial technique in RBD and split-plot analysis, for analysing the given data.
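    The third technique named in this record, analysis of two-way classified data with several observations per cell, partitions the total sum of squares into factor, interaction, and within-cell error terms. A minimal sketch with made-up data (not the paper's rice data) is:

```python
import numpy as np

# data[i, j, k] = k-th replicate for level i of factor A, level j of factor B.
rng = np.random.default_rng(1)
a, b, n = 3, 4, 5                      # levels of A and B, replicates per cell
data = rng.normal(10, 1, size=(a, b, n))

grand = data.mean()
A_means = data.mean(axis=(1, 2))       # factor A level means
B_means = data.mean(axis=(0, 2))       # factor B level means
cell_means = data.mean(axis=2)

# Sum-of-squares partition for the two-way layout with replication.
ss_total = ((data - grand) ** 2).sum()
ss_A = b * n * ((A_means - grand) ** 2).sum()
ss_B = a * n * ((B_means - grand) ** 2).sum()
ss_AB = n * ((cell_means - A_means[:, None] - B_means[None, :] + grand) ** 2).sum()
ss_error = ((data - cell_means[:, :, None]) ** 2).sum()

# F statistic for factor A against the within-cell error.
ms_A = ss_A / (a - 1)
ms_error = ss_error / (a * b * (n - 1))
F_A = ms_A / ms_error
print(round(F_A, 3))
```

The identity ss_total = ss_A + ss_B + ss_AB + ss_error holds exactly in a balanced layout, which is what makes this decomposition straightforward compared with the split-plot design, where whole-plot and sub-plot errors must be separated.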

  15. Crack users, sexual behavior and risk of HIV infection Usuários de crack, comportamento sexual e risco de infecção pelo HIV

    OpenAIRE

    Renata Cruz Soares de Azevedo; Neury José Botega; Liliana Andolpho Magalhães Guimarães

    2007-01-01

    OBJECTIVE: To compare a sample of injecting cocaine users and crack users, assessing sexual behavior, risk for infection by HIV and its seroprevalence. METHOD: 109 injecting cocaine users and 132 crack users were assessed, using the World Health Organization questionnaire from the expanded "Cross-Site Study of Behaviors and HIV Seroprevalence among Injecting Drug Users" and HIV serology. Data were assessed by Multiple Correspondences Analysis. RESULTS: Crack users showed less time of drug con...

  16. Comparative transcriptome analysis of the metal hyperaccumulator Noccaea caerulescens

    OpenAIRE

    Halimaa, Pauliina; Blande, Daniel; Aarts, Mark G. M.; Tuomainen, Marjo; Tervahauta, Arja; Kärenlampi, Sirpa

    2014-01-01

    The metal hyperaccumulator Noccaea caerulescens is an established model to study the adaptation of plants to metalliferous soils. Various comparators have been used in these studies. The choice of suitable comparators is important and depends on the hypothesis to be tested and methods to be used. In high-throughput analyses such as microarray, N. caerulescens has been compared to non-tolerant, non-accumulator plants like Arabidopsis thaliana or Thlaspi arvense rather than to the related hyper...

  17. Comparative analysis of endoscopic precut conventional and needle knife sphincterotomy

    Directory of Open Access Journals (Sweden)

    Andrzej Jamry

    2013-01-01

    Full Text Available AIM: To compare the efficacy, complications and post-procedural hyperamylasemia in endoscopic pre-cut conventional and needle knife sphincterotomies. METHODS: We performed a retrospective analysis of two pre-cut sphincterotomy (PS) techniques, pre-cut conventional sphincterotomy (PCS) and pre-cut needle knife (PNK). The study included 143 patients; the classic technique was used in 59 patients (41.3%) and the needle knife technique was used in 84 patients (58.7%). We analyzed the efficacy of bile duct access, the need for a two-step procedure, the rates of complications and hyperamylasemia 4 h after the procedure, “endoscopic bleeding” and the need for bleeding control. Furthermore, to assess whether the anatomy of the Vater’s papilla, indications for the procedure or the need for additional procedures could inform the choice of the PS method, we evaluated the additive hyperamylasemia risk 4 h after the procedure with respect to the above mentioned variables. RESULTS: The bile duct access efficacy with PNK and PCS was 100% and 96.6%, respectively, and the difference between the two groups was not significant (P = 0.06). However, the needle knife technique required two-step access significantly more often, in 48.8% vs 8.5% of cases (P 80 U/L, 41/84 vs 23/59 (P = 0.32); hyperamylasemia 4 h after the procedure > 240 U/L, 19/84 vs 11/59 (P = 0.71); pancreatic pain, 13/84 vs 7/59 (P = 0.71); endoscopic bleeding, 10/84 vs 8/59 (P = 0.97); and the need for bleeding control, 10/84 vs 7/59 (P = 0.79). In the next part of the study, we analyzed the influence of the method chosen on the risk of hyperamylasemia with respect to an indication for endoscopic retrograde cholangiopancreatography, papillary anatomy and concomitant procedures performed. 
We determined that the hyperamylasemia risk was increased by more than threefold [odds ratio (OR) = 3.38; P = 0.027] after PCS in patients with a flat Vater’s papilla and more than fivefold (OR = 5.3; P = 0.049) after the PNK procedure in patients who required endoscopic hemostasis. CONCLUSION: PCS and PNK do not differ in terms of efficacy or complication rates, but PNK is more often associated with the necessity for a two-step procedure.

  18. User-based and Cognitive Approaches to Knowledge Organization : A Theoretical Analysis of the Research Literature.

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2013-01-01

    In the 1970s and 1980s, forms of user-based and cognitive approaches to knowledge organization came to the forefront as part of the overall development in library and information science and in the broader society. The specific nature of user-based approaches is their basis in empirical studies of users or the principle that users need to be involved in the construction of knowledge organization systems. It might seem obvious that user-friendly systems should be based on user studies or user involvement, but extremely successful systems such as Apple’s iPhone, Dialog’s search system and Google’s PageRank are not based on empirical studies of users. In knowledge organization, the Book House System is one example of a system based on user studies. In cognitive science the important WordNet database is claimed to be based on psychological research. This article considers such examples. The role of the user is often confused with the role of subjectivity. Knowledge organization systems cannot be objective and must therefore, by implication, be based on some kind of subjectivity. This subjectivity should, however, be derived from collective views in discourse communities rather than from studies of individuals or from the study of abstract minds.

  19. Comparative Analysis between PI and Wavelet Transform for the Fault Detection in Induction Motor.

    Directory of Open Access Journals (Sweden)

    Siddharth Ukey

    2013-07-01

    Full Text Available The squirrel cage induction motor is widely used in industry because of its rugged construction, high reliability, low cost, high efficiency, user friendliness and minimal maintenance compared to other motors. Induction motor monitoring is a challenging task for research engineers and industry. In this paper we discuss the fundamental faults in induction motors. PI control and the wavelet transform are considered among the most popular fault detection methods today because they can easily detect common faults in induction machines such as turn-to-turn short circuits, broken rotor bars, bearing deterioration and open-circuit faults. According to an IEEE-IAS survey, the most severe fault is bearing fault (44%), followed by stator winding fault (26%), rotor broken bar fault (8%) and other faults (22%). According to an Allianz survey, the most severe fault is stator winding fault (66%), followed by rotor fault (13%), bearing fault (13%) and other faults (13%). There are many methods for detecting these faults, broadly divided into conventional methods and signal processing techniques. Automatic fault detection is widely used in industry because it saves maintenance time and money. The overall problem is subdivided into two distinct key modules: (a) operation and control, and (b) fault diagnosis. In this paper we propose a comparative analysis between a PI controller and a wavelet controller for fault detection in induction motors and determine which one is best. We consider only two faults: (a) broken rotor bar fault and (b) short stator winding fault. Since the bearing is on the outer portion of the motor, bearing fault detection is easier than detecting a short stator winding fault or a broken rotor bar fault.
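    As a generic illustration of the wavelet side of this comparison (not the paper's controllers), a single-level Haar wavelet transform localizes an abrupt transient of the kind a broken rotor bar or winding short can imprint on a motor current signal. All signal parameters below are invented for the sketch.

```python
import numpy as np

# Healthy 50 Hz supply current sampled at 1024 Hz, with one simulated
# fault transient injected at sample 600.
fs = 1024
t = np.arange(fs) / fs
current = np.sin(2 * np.pi * 50 * t)
current[600] += 0.8                      # simulated fault transient

# Single-level Haar DWT: the detail band is the scaled difference of
# adjacent sample pairs, so a sudden jump produces a large coefficient
# at the corresponding position while the smooth sine stays small.
even, odd = current[0::2], current[1::2]
detail = (even - odd) / np.sqrt(2.0)

fault_index = int(np.argmax(np.abs(detail)))
print(fault_index * 2)                   # sample where the transient occurred
```

This time localization is what a steady-state method such as a PI-based residual check lacks, and it is the usual argument for wavelet techniques in the kind of comparison the record describes; a multi-level decomposition (e.g. with PyWavelets) would separate the fault signature by frequency band as well.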

  20. HOW CAN ELECTRONIC COMMERCE IN DEVELOPING COUNTRIES ATTRACT USERS FROM DEVELOPED COUNTRIES? A COMPARATIVE STUDY OF THAILAND AND JAPAN

    Directory of Open Access Journals (Sweden)

    Tetsuro Kobayashi

    2013-12-01

    Full Text Available A comparative study of Thailand and Japan investigated how electronic commerce (EC) in developing countries can be used to attract customers from developed countries. Thai and Japanese participants were shown language-appropriate versions of a hotel booking website in Thailand. Perceptions of and trust in the website were assessed, as was the willingness to book a room in the hotel using the website. The Thai participants tended to evaluate the quality of the website more highly and to trust it more than did the Japanese participants. Furthermore, the Thai participants tended to think that the hotel was more responsible for their hotel reservations than was the EC service, and that the content of the website was developed by the hotel rather than by the EC service. Thai participants were more likely to express willingness to reserve a room if they thought that the hotel had developed the website content, whereas the Japanese participants’ willingness to book a room was greater when they thought that the EC service had developed the content. Based on these results, customization strategies for EC in developing countries to attract customers from developed countries are discussed.

  1. Evaluation of a gene information summarization system by users during the analysis process of microarray datasets

    Directory of Open Access Journals (Sweden)

    Cohen Aaron

    2009-02-01

    Full Text Available Abstract Background Summarization of gene information in the literature has the potential to help genomics researchers translate basic research into clinical benefits. Gene expression microarrays have been used to study biomarkers for disease and discover novel types of therapeutics, and the task of finding information in journal articles on sets of genes is common for translational researchers working with microarray data. However, manually searching and scanning the literature references returned from PubMed is a time-consuming task for scientists. We built and evaluated an automatic summarizer of information on genes studied in microarray experiments. The Gene Information Clustering and Summarization System (GICSS) is a system that integrates two related steps of the microarray data analysis process: functional gene clustering and gene information gathering. The system evaluation was conducted while genomic researchers analyzed their own experimental microarray datasets. Results The clusters generated by GICSS were validated by scientists during their microarray analysis process. In addition, presenting sentences from the abstract provided significantly more important information to the users than just showing the title in the default PubMed format. Conclusion The evaluation results suggest that GICSS can be useful for researchers in the genomics area. In addition, the hybrid evaluation method, partway between intrinsic and extrinsic system evaluation, may enable researchers to gauge the true usefulness of the tool for scientists in their natural analysis workflow and also elicit suggestions for future enhancements. Availability GICSS can be accessed online at: http://ir.ohsu.edu/jianji/index.html

  2. Analysis of Optimization Techniques to Improve User Response Time of Web Applications and Their Implementation for MOODLE

    OpenAIRE

    Manchanda, Priyanka

    2013-01-01

    Analysis of seven optimization techniques grouped under three categories (hardware, back-end, and front-end) is done to study the reduction in average user response time for Modular Object Oriented Dynamic Learning Environment (Moodle), a Learning Management System which is scripted in PHP5, runs on Apache web server and utilizes MySQL database software. Before the implementation of these techniques, performance analysis of Moodle is performed for varying number of concurren...

  3. SINGULAB - A Graphical user Interface for the Singularity Analysis of Parallel Robots based on Grassmann-Cayley Algebra

    CERN Document Server

    Ben-Horin, Patricia; Caro, Stéphane; Chablat, Damien; Wenger, Philippe

    2008-01-01

    This paper presents SinguLab, a graphical user interface for the singularity analysis of parallel robots. The algorithm is based on Grassmann-Cayley algebra. The proposed tool is interactive and introduces the designer to the singularity analysis performed by this method, showing all the stages along the procedure and eventually showing the solution algebraically and graphically, allowing as well the singularity verification of different robot poses.

  4. Analysis of pilocytic astrocytoma by comparative genomic hybridization

    OpenAIRE

    Sanoudou, D.; Tingby, O; Ferguson-Smith, M.A.; Collins, V P; Coleman, N.

    2000-01-01

    Very little is known about genetic abnormalities involved in the development of pilocytic astrocytoma, the most frequently occurring brain tumour of childhood. We have analysed 48 pilocytic astrocytoma specimens using comparative genomic hybridization. Only five of 41 tumours from children showed abnormalities detectable by comparative genomic hybridization, and in each case this represented gain of a single chromosome. Interestingly, two of seven tumours from adults showed abnormalities, whi...

  5. Comparative DNA Sequence Analysis of Wheat and Rice Genomes

    OpenAIRE

    Sorrells, Mark E; La Rota, Mauricio; Bermudez-Kandianis, Catherine E.; Greene, Robert A.; Kantety, Ramesh; Munkvold, Jesse D.; Miftahudin; Mahmoud, Ahmed; Ma, Xuefeng; Gustafson, Perry J.; Qi, Lili L.; Echalier, Benjamin; Gill, Bikram S; Matthews, David E; Lazo, Gerard R.

    2003-01-01

    The use of DNA sequence-based comparative genomics for evolutionary studies and for transferring information from model species to crop species has revolutionized molecular genetics and crop improvement strategies. This study compared 4485 expressed sequence tags (ESTs) that were physically mapped in wheat chromosome bins, to the public rice genome sequence data from 2251 ordered BAC/PAC clones using BLAST. A rice genome view of homologous wheat genome locations based ...

  6. EIA BASED COMPARATIVE URBAN TRAFFIC NOISE ANALYSIS BETWEEN OPERATIONAL AND UNDER CONSTRUCTION PHASE PUBLIC TRANSPORT CORRIDOR

    Directory of Open Access Journals (Sweden)

    Rajeev Kumar Mishra

    2014-09-01

    Full Text Available Delhi has a population of 16.75 million and is increasing at a rapid rate. This increase in population has enhanced the need for public transport. In Delhi, this need is served mainly by buses, auto rickshaws, a rapid transit system, taxis and suburban railways. Delhi has one of the highest road densities in India. Buses are the most popular means of transport, catering to about 60% of the total demand. In order to meet the transport demand in Delhi, the State and the Union government started the construction of a mass rapid transit system, including the Delhi Metro. Drawing on various data and public responses, the paper offers a qualitative discussion of the impacts of a mass rapid transit system (MRTS) corridor on land use and on the social aspects of the lives of residents and road users, and proposes mitigation measures for these conditions. The analysis and survey outcomes show that noise levels exceed CPCB standards. The share of public transport in total noise pollution is smaller than that of private transport but still exceeds the standards. This problem demands the design of noise barriers along the corridor to curb noise pollution.

  7. An Empirical Study on User-oriented Association Analysis of Library Classification Schemes

    Directory of Open Access Journals (Sweden)

    Hsiao-Tieh Pu

    2002-12-01

    Full Text Available Library classification schemes are mostly organized by discipline with a hierarchical structure. From the user's point of view, some highly related yet non-hierarchical classes may not be easy to perceive in these schemes. This paper aims to discover hidden associations between classes by analyzing users' usage of library collections. The proposed approach employs collaborative filtering techniques to discover associated classes based on the circulation patterns of similar users. Many associated classes scattered across different subject hierarchies could be discovered from these circulation patterns. The association norms obtained between classes were found to be useful in understanding users' subject preferences for a given class. Classification schemes can, therefore, be made more adaptable to changes in users and in the uses of different library collections. There are implications for applications in information organization and retrieval as well. For example, catalogers could refer to the ranked associated classes when they perform multi-classification, and users could browse the associated classes for related subjects in an enhanced OPAC system. In future research, more empirical studies will be needed to validate the findings, and methods for obtaining user-oriented associations can still be improved. [Article content in Chinese]
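    The collaborative-filtering idea in this record, surfacing associated classes from circulation patterns, can be sketched as cosine similarity between the columns of a user-by-class loan matrix. The class codes and counts below are invented for illustration.

```python
import numpy as np

classes = ["QA76", "Z699", "HF5548", "PN1995"]
# circulation[u, c] = number of loans by user u in class c (made-up data:
# users who borrow computing books also borrow information-retrieval books).
circulation = np.array([
    [5, 4, 0, 0],
    [3, 5, 1, 0],
    [0, 1, 4, 0],
    [0, 0, 0, 6],
], dtype=float)

# Cosine similarity between class columns: classes borrowed by the same
# users score high even if they sit in different subject hierarchies.
norms = np.linalg.norm(circulation, axis=0)
sim = (circulation.T @ circulation) / np.outer(norms, norms)

def associated(cls, k=2):
    """Top-k classes most associated with cls, excluding cls itself."""
    i = classes.index(cls)
    order = np.argsort(-sim[i])
    return [classes[j] for j in order if j != i][:k]

print(associated("QA76"))
```

A ranked list like this is what a cataloger or an enhanced OPAC could display alongside a class; real systems would also normalize for class popularity so that heavily circulated classes do not dominate every ranking.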

  8. User-Centered Design for Interactive Maps: A Case Study in Crime Analysis

    Directory of Open Access Journals (Sweden)

    Robert E. Roth

    2015-02-01

    Full Text Available In this paper, we address the topic of user-centered design (UCD for cartography, GIScience, and visual analytics. Interactive maps are ubiquitous in modern society, yet they often fail to “work” as they could or should. UCD describes the process of ensuring interface success—map-based or otherwise—by gathering input and feedback from target users throughout the design and development of the interface. We contribute to the expanding literature on UCD for interactive maps in two ways. First, we synthesize core concepts on UCD from cartography and related fields, as well as offer new ideas, in order to organize existing frameworks and recommendations regarding the UCD of interactive maps. Second, we report on a case study UCD process for GeoVISTA CrimeViz, an interactive and web-based mapping application supporting visual analytics of criminal activity in space and time. The GeoVISTA CrimeViz concept and interface were improved iteratively by working through a series of user-utility-usability loops in which target users provided input and feedback on needs and designs (user, prompting revisions to the conceptualization and functional requirements of the interface (utility, and ultimately leading to new mockups and prototypes of the interface (usability for additional evaluation by target users (user… and so on. Together, the background review and case study offer guidance for applying UCD to interactive mapping projects, and demonstrate the benefit of including target users throughout design and development.

  9. Music Streaming in Denmark : An analysis of listening patterns and the consequences of a ’per user’ settlement model based on streaming data from WiMP

    DEFF Research Database (Denmark)

    Pedersen, Rasmus Rex

    2014-01-01

    This report analyses how a ’per user’ settlement model differs from the ‘pro rata’ model currently used. The analysis is based on data for all streams by WiMP users in Denmark during August 2013. The analysis has been conducted in collaboration with Christian Schlelein from Koda on the basis of data delivered by WiMP.
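The difference between the two settlement models the report compares can be illustrated with a small sketch. The users, artists, stream counts, and royalty pool below are invented for demonstration and are not taken from the WiMP data set:

```python
# plays[user][artist] = number of streams in the accounting period.
# All figures are invented placeholders.
plays = {
    "heavy_user": {"A": 90, "B": 10},   # 100 streams
    "light_user": {"B": 10},            # 10 streams
}
pool = 110.0  # total royalty pool to distribute

def pro_rata(plays, pool):
    """Each artist's share of ALL streams, regardless of who played them."""
    total = sum(n for u in plays.values() for n in u.values())
    out = {}
    for u in plays.values():
        for artist, n in u.items():
            out[artist] = out.get(artist, 0.0) + pool * n / total
    return out

def per_user(plays, pool):
    """Each user's (equal) share of the pool is split by that user's plays."""
    share = pool / len(plays)
    out = {}
    for u in plays.values():
        total_u = sum(u.values())
        for artist, n in u.items():
            out[artist] = out.get(artist, 0.0) + share * n / total_u
    return out

print(pro_rata(plays, pool))
print(per_user(plays, pool))
```

Under pro rata, the heavy user's 90 plays of artist A dominate the pool; under per-user settlement, the light user's money follows only what that user actually listened to, shifting revenue toward artist B.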

  10. Knowledge Level Assessment in e-Learning Systems Using Machine Learning and User Activity Analysis

    Directory of Open Access Journals (Sweden)

    Nazeeh Ghatasheh

    2015-04-01

    Full Text Available Electronic learning has been one of the foremost trends in education so far. Such importance draws attention to an important shift in the educational paradigm. Due to the complexity of the evolving paradigm, the prospective dynamics of learning require an evolution of knowledge delivery and evaluation. This research work puts forward a futuristic design for an autonomous and intelligent e-Learning system, in which machine learning and user activity analysis play the role of an automatic evaluator of the knowledge level. It is important to assess the knowledge level in order to adapt content presentation and to have a more realistic evaluation of online learners. Several classification algorithms are applied to predict the knowledge level of the learners and the corresponding results are reported. Furthermore, this research proposes a modern design of a dynamic learning environment that follows the most recent trends in e-Learning. The experimental results illustrate the overall performance superiority of a support vector machine model in evaluating knowledge levels, with 98.6% of instances correctly classified and a mean absolute error of 0.0069.
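The classification task described above can be sketched in miniature. The study itself uses a support vector machine; here a dependency-free nearest-centroid classifier stands in, and the activity features and labels are invented for illustration:

```python
# Hypothetical training data: [time_on_content, quiz_score, forum_activity]
# feature vectors paired with a knowledge-level label. Invented values.
train = [
    ([0.1, 0.2, 0.1], "low"),
    ([0.2, 0.1, 0.2], "low"),
    ([0.8, 0.9, 0.7], "high"),
    ([0.9, 0.8, 0.9], "high"),
]

def centroids(data):
    """Mean feature vector per knowledge level."""
    sums, counts = {}, {}
    for x, y in data:
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [a + b for a, b in zip(sums.get(y, [0.0] * len(x)), x)]
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(x, cents):
    """Assign the label whose centroid is closest (squared Euclidean)."""
    def dist2(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(cents, key=lambda y: dist2(x, cents[y]))

cents = centroids(train)
print(predict([0.85, 0.9, 0.8], cents))  # classified as "high"
```

A real system would replace this stand-in with the SVM the paper evaluates, trained on logged learner activity rather than hand-picked vectors.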

  11. Comparative analysis of bacteria in uranium mining wastes

    International Nuclear Information System (INIS)

    Compositional analysis of the predominant bacterial groups in three different kinds of uranium wastes indicates different biogeological processes running at the studied sites, which seem to be influenced by the anthropogenic activities involved in the production of uranium. (orig.)

  12. User centric approach to itemset utility mining in Market Basket Analysis

    Directory of Open Access Journals (Sweden)

    Jyothi Pillai

    2011-01-01

    Full Text Available Business intelligence is information about a company's past performance that is used to help predict the company's future performance. It can reveal emerging trends from which the company might profit [31]. Data mining allows users to sift through the enormous amount of information available in data warehouses; it is from this sifting process that business intelligence gems may be found [31]. Within the area of data mining, the problem of deriving associations from data has received a great deal of attention. This problem is referred to as the “market-basket problem”. Association Rule Mining (ARM), a well-studied technique in the data mining field, identifies frequent itemsets from databases and generates association rules by assuming that all items have the same significance and frequency of occurrence in a record. However, items actually differ in many aspects in a number of real applications, such as retail marketing and nutritional pattern mining [26]. Rare items are less frequent items [32]. For many real-world applications, however, the utility of rare itemsets based on cost, profit or revenue is of importance. For extracting rare itemsets, equal-frequency-based approaches like the Apriori approach suffer from the “rare item problem dilemma”. Utility mining aims at identifying rare itemsets with high utility. The main objective of utility mining is to identify the itemsets with the highest utilities, by considering profit, quantity, cost or other user preferences [40]. Moreover, valuable patterns cannot be discovered by traditional non-temporal data mining approaches that treat all the data as one large segment, with no attention paid to the time information of transactions. As increasingly complex real-world problems are addressed, the temporal rare itemset utility problem is taking center stage. In many real-life applications, high-utility itemsets consist of rare items.
Rare itemsets provide useful information in different decision-making domains such as business transactions, medical, security, fraudulent transactions, and retail communities. For example, in a supermarket, customers purchase microwave ovens or frying pans rarely compared to bread, washing powder or soap, but the former transactions yield more profit for the supermarket. A retail business may be interested in identifying its most valuable customers, i.e. those who contribute a major fraction of overall company profit [40]. In this paper, these problems of analyzing market-basket data are considered and important contributions are presented. It is assumed that the utilities of itemsets may differ, and the high-utility itemsets are determined based on both internal (transaction) and external utilities.
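The internal/external utility idea can be made concrete with a toy sketch. The items, unit profits, and transactions below are invented to echo the supermarket example (a frying pan is rare but high-profit, bread is frequent but low-profit); real utility-mining algorithms use pruning strategies rather than the brute-force enumeration shown here:

```python
from itertools import combinations

# External utility: unit profit per item (invented placeholder values).
profit = {"bread": 0.2, "soap": 0.5, "pan": 15.0, "oven": 120.0}
# Internal utility: purchased quantity per item, per transaction.
transactions = [
    {"bread": 3, "soap": 1},
    {"bread": 2, "pan": 1},
    {"bread": 4},
    {"oven": 1, "soap": 2},
]

def utility(itemset, tx):
    """Utility of an itemset in one transaction: sum of quantity x profit,
    counted only when the whole itemset appears in the transaction."""
    if not all(i in tx for i in itemset):
        return 0.0
    return sum(tx[i] * profit[i] for i in itemset)

def high_utility_itemsets(transactions, min_util):
    """Brute-force scan of 1- and 2-itemsets for total utility >= min_util."""
    items = sorted({i for tx in transactions for i in tx})
    result = {}
    for k in (1, 2):
        for combo in combinations(items, k):
            u = sum(utility(combo, tx) for tx in transactions)
            if u >= min_util:
                result[combo] = u
    return result

print(high_utility_itemsets(transactions, min_util=15.0))
```

Note how bread, despite appearing in three of four transactions, never clears the utility threshold, while the rarely purchased pan and oven do: frequency-based mining and utility-based mining surface different itemsets.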

  13. User's Manual for HPTAM: a Two-Dimensional Heat Pipe Transient Analysis Model, Including the Startup from a Frozen State

    Science.gov (United States)

    Tournier, Jean-Michel; El-Genk, Mohamed S.

    1995-01-01

    This report describes the user's manual for 'HPTAM,' a two-dimensional Heat Pipe Transient Analysis Model. HPTAM is described in detail in the UNM-ISNPS-3-1995 report which accompanies the present manual. The model offers a menu that lists a number of working fluids and wall and wick materials from which the user can choose. HPTAM is capable of simulating the startup of heat pipes from either a fully-thawed or frozen condition of the working fluid in the wick structure. The manual includes instructions for installing and running HPTAM on either a UNIX, MS-DOS or VMS operating system. Samples for input and output files are also provided to help the user with the code.

  14. Phylogenetic analysis of “Volvocacae” for comparative genetic studies

    OpenAIRE

    Coleman, Annette W.

    1999-01-01

    Sequence analysis based on multiple isolates representing essentially all genera and species of the classic family Volvocaeae has clarified their phylogenetic relationships. Cloned internal transcribed spacer sequences (ITS-1 and ITS-2, flanking the 5.8S gene of the nuclear ribosomal gene cistrons) were aligned, guided by ITS transcript secondary structural features, and subjected to parsimony and neighbor joining distance analysis. Results confirm the notion of a single common ancestor, and ...
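Distance methods such as neighbor joining start from a pairwise distance matrix. As a minimal sketch of that first step, the fragment below computes the uncorrected p-distance (fraction of differing aligned sites) for short invented sequences; the study's actual alignments and distance corrections are not reproduced here:

```python
# Invented, already-aligned sequence fragments (placeholders for ITS data).
aligned = {
    "taxonA": "ACGTACGT",
    "taxonB": "ACGTACGA",
    "taxonC": "ACGAACTA",
}

def p_distance(s1, s2):
    """Proportion of aligned sites at which two sequences differ."""
    assert len(s1) == len(s2), "sequences must be aligned to equal length"
    diffs = sum(a != b for a, b in zip(s1, s2))
    return diffs / len(s1)

# Upper-triangular pairwise distance matrix.
matrix = {
    (a, b): p_distance(aligned[a], aligned[b])
    for a in aligned for b in aligned if a < b
}
print(matrix)
```

A neighbor-joining implementation would then cluster taxa from this matrix; in practice a model-based correction (rather than the raw p-distance) is usually applied first.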

  15. Measuring service quality and a comparative analysis in airline industry

    OpenAIRE

    Mohammad Mehdi Izadi; Naser Azad; Seyed Mohsen SeyedAliAkbar; Kuimars Bahreini

    2013-01-01

    Quality of services in the airline industry plays an important role in market penetration and customer retention. In this paper, we present a factor analysis to find important factors in the Iranian airline industry. The study designs a questionnaire consisting of 35 questions and distributes it among 200 customers who regularly use the services of 16 different airlines, and the responses are investigated through factor analysis. The results of our survey determine seven important factors inc...
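The exploratory step behind such a study can be sketched as follows. The ratings are synthetic (two hidden "quality" factors driving four questionnaire items), and the sketch stops at the common eigenvalue-based factor-count decision; real analyses add factor rotation (e.g. varimax) and interpretation on top:

```python
import numpy as np

# Synthetic respondent data: rows = respondents, columns = questionnaire
# items. Two latent factors drive the items (loadings are invented).
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1],   # items 1-2 load on factor 1
                     [0.1, 0.9], [0.0, 0.8]])  # items 3-4 load on factor 2
items = latent @ loadings.T + 0.3 * rng.normal(size=(200, 4))

z = (items - items.mean(0)) / items.std(0)   # standardize each item
corr = np.corrcoef(z, rowvar=False)          # item correlation matrix
eigvals, eigvecs = np.linalg.eigh(corr)      # eigenvalues in ascending order
n_factors = int((eigvals > 1.0).sum())       # Kaiser criterion: keep > 1
print("factors retained:", n_factors)
```

With this construction the Kaiser criterion recovers the two planted factors; a 35-item service-quality survey would proceed the same way, just with more items and a rotated loading matrix to name the factors.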

  16. TEXCAD: Textile Composite Analysis for Design. Version 1.0: User's manual

    Science.gov (United States)

    Naik, Rajiv A.

    1994-01-01

    The Textile Composite Analysis for Design (TEXCAD) code provides the materials/design engineer with a user-friendly desktop computer (IBM PC compatible or Apple Macintosh) tool for the analysis of a wide variety of fabric reinforced woven and braided composites. It can be used to calculate overall thermal and mechanical properties along with engineering estimates of damage progression and strength. TEXCAD also calculates laminate properties for stacked, oriented fabric constructions. It discretely models the yarn centerline paths within the textile repeating unit cell (RUC) by assuming sinusoidal undulations at yarn cross-over points and uses a yarn discretization scheme (which subdivides each yarn into smaller, piecewise straight yarn slices) together with a 3-D stress averaging procedure to compute overall stiffness properties. In the calculations for strength, it uses a curved beam-on-elastic foundation model for yarn undulating regions together with an incremental approach in which stiffness properties for the failed yarn slices are reduced based on the predicted yarn slice failure mode. Nonlinear shear effects and nonlinear geometric effects can be simulated. Input to TEXCAD consists of: (1) materials parameters like impregnated yarn and resin properties such as moduli, Poisson's ratios, coefficients of thermal expansion, nonlinear parameters, axial failure strains and in-plane failure stresses; and (2) fabric parameters like yarn sizes, braid angle, yarn packing density, filament diameter and overall fiber volume fraction. Output consists of overall thermoelastic constants, yarn slice strains/stresses, yarn slice failure history, in-plane stress-strain response and ultimate failure strength. Strength can be computed under the combined action of thermal and mechanical loading (tension, compression and shear).
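The yarn discretization step described above can be sketched geometrically. The sinusoidal centerline and the slice-angle recovery below are an assumed simplification for illustration, not TEXCAD's actual equations:

```python
import math

def yarn_slices(wavelength, amplitude, n_slices):
    """Cut a sinusoidal yarn centerline over one repeating unit cell into
    piecewise straight slices; recover each slice's undulation angle from
    its endpoints (the angle feeds the stress-averaging step)."""
    z = lambda x: amplitude * math.sin(2 * math.pi * x / wavelength)
    xs = [i * wavelength / n_slices for i in range(n_slices + 1)]
    slices = []
    for x0, x1 in zip(xs, xs[1:]):
        angle = math.degrees(math.atan2(z(x1) - z(x0), x1 - x0))
        slices.append({"x0": x0, "x1": x1, "angle_deg": angle})
    return slices

# Illustrative geometry: unit-cell length 2.0, undulation amplitude 0.2.
slices = yarn_slices(wavelength=2.0, amplitude=0.2, n_slices=8)
for s in slices[:3]:
    print(s)
```

Finer slicing (larger `n_slices`) tracks the undulation more closely at the cost of more slice-level stiffness evaluations, which is the usual discretization trade-off.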

  17. The Formation of English Phrasal Comparatives? Study of Lechner’s Small Clause Analysis

    OpenAIRE

    Xiaowen Zhang

    2013-01-01

    Lechner (2001) proposes two hypotheses: the CR-Hypothesis (Conjunction Reduction (hereafter CR) operations can target comparatives) and the PC-Hypothesis (Phrasal Comparatives (hereafter PCs) derive from clausal comparatives by CR). In this paper, I discuss the formation of PCs and its relevance to Lechner's two hypotheses. As to the formation of PCs, based on his two hypotheses, he puts forward a small clause analysis. Compared with the Direct Analysis and the Comparative Ellipsis approach, al...

  18. GENOVA: a generalized perturbation theory program for various applications to CANDU core physics analysis (II) - a user's manual

    International Nuclear Information System (INIS)

    A user's guide for GENOVA, a GENeralized perturbation theory (GPT)-based Optimization and uncertainty analysis program for Canada deuterium uranium (CANDU) physics VAriables, was prepared. The program was developed under the framework of the CANDU physics design and analysis code RFSP. The generalized perturbation method was implemented in GENOVA to estimate the zone controller unit (ZCU) level upon refueling operation and to calculate various sensitivity coefficients for fuel management studies and uncertainty analyses, respectively. This documentation contains descriptions and directions for the four major modules of GENOVA, namely ADJOINT, GADJINT, PERTURB, and PERTXS, so that it can be used as a practical guide for GENOVA users. It includes sample inputs for ZCU level estimation and sensitivity coefficient calculation, which are the main applications of GENOVA. GENOVA can be used as a supplementary tool to the current CANDU physics design code for advanced CANDU core analysis and fuel development.
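Once a GPT code has produced sensitivity coefficients, their typical downstream use is a first-order estimate of how a response changes under parameter perturbations. The sketch below illustrates only that generic first-order bookkeeping; the coefficient names and values are invented, not CANDU data or GENOVA output:

```python
# S[p] = (dR/R) / (dp/p): relative sensitivity of response R to parameter p.
# All values are invented placeholders for illustration.
sensitivities = {"fuel_temp": -0.012, "coolant_density": 0.045}
perturbations = {"fuel_temp": 0.05, "coolant_density": -0.02}  # dp/p

def response_change(S, dp):
    """First-order relative change in the response: sum over parameters
    of sensitivity times relative perturbation."""
    return sum(S[p] * dp.get(p, 0.0) for p in S)

dR_over_R = response_change(sensitivities, perturbations)
print(f"relative response change: {dR_over_R:+.5f}")
```

The same linear combination, with coefficient covariances added, underlies the uncertainty analyses mentioned in the abstract.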

  19. Digital video analysis of health professionals' interactions with an electronic whiteboard : A longitudinal, naturalistic study of changes to user interactions

    DEFF Research Database (Denmark)

    Rasmussen, Rasmus; Kushniruk, Andre

    2013-01-01

    As hospital departments continue to introduce electronic whiteboards in real clinical settings, a range of human factors issues have emerged and it has become clear that there is a need for improved methods for designing and testing these systems. In this study, we employed a longitudinal and naturalistic method in the usability evaluation of an electronic whiteboard system. The goal of the evaluation was to explore the extent to which usability issues experienced by users change as they gain more experience with the system. In addition, the paper explores the use of a new approach to the collection and analysis of continuous digital video recordings of naturalistic "live" user interactions. The method developed and employed in the study included recording the users' interactions with the system during actual use with screen-capturing software and analyzing these recordings for usability issues. In this paper we describe and discuss both the method and the results of the evaluation. We found that the electronic whiteboard system contains system-related usability issues that did not change over time as the clinicians collectively gained more experience with the system. Furthermore, we also found user-related issues that seemed to change as the users gained more experience, and we discuss the underlying reasons for these changes. We also found that the method used in the study has certain advantages over traditional usability evaluation methods, including the ability to collect and analyze live user data over time. However, challenges and drawbacks of the method (including the time taken for analysis and logistical issues in doing live recordings) should be considered before utilizing a similar approach. In conclusion we summarize our findings and call for an increased focus on longitudinal and naturalistic evaluations of health information systems, and encourage others to apply and refine the method utilized in this study.

  20. Analysis of New Dynamic Comparator for ADC Circuit

    Directory of Open Access Journals (Sweden)

    Fazal Noorbasha

    2014-05-01

    Full Text Available The comparator is a basic building block widely used in analog-to-digital converters (ADCs), and faster ADCs are needed for better signal transmission. The existing comparator of an ADC is therefore replaced with a new dynamic comparator for better conversion. The design is implemented and its performance evaluated using CADENCE GPDK 180 nm technology in a LINUX environment. The clock frequency of the new ADC circuit is increased from 200 MHz to 250 MHz and the supply voltage is reduced from 1.2 V to 0.2 V. The power dissipation is decreased from 1.63619 mW to 321.032 µW for the 1-bit ADC and from 1.2765 mW to 82.346 µW for the 2-bit ADC.
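The direction of the reported power savings follows from the standard dynamic-power relation for CMOS circuits, P = α·C·V²·f. The sketch below uses that textbook relation with the abstract's voltage and frequency figures; the activity factor and capacitance are invented placeholders, so only the ratio is meaningful:

```python
def dynamic_power(alpha, cap, vdd, freq):
    """Dynamic CMOS switching power: activity factor x capacitance x
    supply voltage squared x clock frequency."""
    return alpha * cap * vdd ** 2 * freq

# alpha and cap are placeholders; vdd and freq come from the abstract.
old = dynamic_power(alpha=0.1, cap=1e-12, vdd=1.2, freq=200e6)
new = dynamic_power(alpha=0.1, cap=1e-12, vdd=0.2, freq=250e6)
print(f"power ratio new/old: {new / old:.4f}")
```

Even though the clock rose by 25%, the quadratic dependence on supply voltage dominates, which is consistent with the order-of-magnitude power reduction the abstract reports.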