WorldWideScience

Sample records for users comparative analysis

  1. Institutional Repositories: Investigating User Groups and Comparative Evaluation Using Link Analysis

    Wells, Paul

    2009-01-01

    The aim of this investigation was to look at user groups of institutional repositories. Past research on repository users has focused on authors and depositors at the expense of other users, and little is known about what types of user groups are associated with institutional repositories. This investigation used the research techniques of link analysis and content analysis to investigate links to institutional repository websites and determine what types of user groups are using repositories...

  2. Bias in Observational Studies of Prevalent Users: Lessons for Comparative Effectiveness Research From a Meta-Analysis of Statins

    Danaei, Goodarz; Tavakkoli, Mohammad; Hernán, Miguel A.

    2012-01-01

    Randomized clinical trials (RCTs) are usually the preferred strategy with which to generate evidence of comparative effectiveness, but conducting an RCT is not always feasible. Though observational studies and RCTs often provide comparable estimates, the questioning of observational analyses has recently intensified because of randomized-observational discrepancies regarding the effect of postmenopausal hormone replacement therapy on coronary heart disease. Reanalyses of observational data that excluded prevalent users of hormone replacement therapy led to attenuated discrepancies, which begs the question of whether exclusion of prevalent users should be generally recommended. In the current study, the authors evaluated the effect of excluding prevalent users of statins in a meta-analysis of observational studies of persons with cardiovascular disease. The pooled, multivariate-adjusted mortality hazard ratio for statin use was 0.77 (95% confidence interval (CI): 0.65, 0.91) in 4 studies that compared incident users with nonusers, 0.70 (95% CI: 0.64, 0.78) in 13 studies that compared a combination of prevalent and incident users with nonusers, and 0.54 (95% CI: 0.45, 0.66) in 13 studies that compared prevalent users with nonusers. The corresponding hazard ratio from 18 RCTs was 0.84 (95% CI: 0.77, 0.91). It appears that the greater the proportion of prevalent statin users in observational studies, the larger the discrepancy between observational and randomized estimates. PMID:22223710
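
    The pooled hazard ratios quoted above come from inverse-variance weighting of study-level estimates on the log scale. The sketch below illustrates that basic pooling step on made-up study values; it is not the authors' code or data.

```python
# Minimal sketch (not the authors' code): inverse-variance pooling of
# study-level hazard ratios, the basic operation behind pooled estimates
# such as those quoted above. Study values below are invented.
import math

def pool_hazard_ratios(studies):
    """studies: list of (hr, ci_low, ci_high) tuples from individual studies."""
    weights, weighted_logs = [], []
    for hr, lo, hi in studies:
        log_hr = math.log(hr)
        # Standard error recovered from the 95% CI on the log scale.
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2                     # inverse-variance weight
        weights.append(w)
        weighted_logs.append(w * log_hr)
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

# Hypothetical incident-user studies, not the paper's data.
print(pool_hazard_ratios([(0.80, 0.62, 1.03), (0.75, 0.60, 0.94)]))
```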

  3. A comparative analysis of user preference-based and existing knowledge management systems attributes in the aerospace industry

    Varghese, Nishad G.

    Knowledge management (KM) exists in various forms throughout organizations. Process documentation, training courses, and experience sharing are examples of KM activities performed daily. The goal of KM systems (KMS) is to provide a tool set which serves to standardize the creation, sharing, and acquisition of business critical information. Existing literature provides numerous examples of targeted evaluations of KMS, focusing on specific system attributes. This research serves to bridge the targeted evaluations with an industry-specific, holistic approach. The user preferences of aerospace employees in engineering and engineering-related fields were compared to profiles of existing aerospace KMS based on three attribute categories: technical features, system administration, and user experience. The results indicated there is a statistically significant difference between aerospace user preferences and existing profiles in the user experience attribute category, but no statistically significant difference in the technical features and system administration attribute categories. Additional analysis indicated in-house developed systems exhibit higher technical features and user experience ratings than commercial-off-the-shelf (COTS) systems.

  4. Non-Academic Service Quality: Comparative Analysis of Students and Faculty as Users

    Sharif, Khurram; Kassim, Norizan Mohd

    2012-01-01

    The research focus was a non-academic service quality assessment within higher education. In particular, non-academic service quality perceptions of faculty and students were evaluated using a service profit chain. This enabled a comparison which helped understanding of non-academic service quality orientation from a key users' perspective. Data…

  5. Comparing Methods for Involving Users in Ideation

    Nicolajsen, Hanne Westh; Scupola, Ada; Sørensen, Flemming

    2015-01-01

    In this paper we discuss how users may be involved in the ideation phase of innovation. The study compares the use of a blog and three future workshops (students, employees and a mix of the two) in a library. Our study shows that the blog is efficient in giving the users a voice, whereas the mixed workshop method (involving users and employees) is especially good at qualifying and further developing ideas. The findings suggest that methods for involving users in ideation should be carefully selected and combined to achieve optimum benefits and avoid potential disadvantages.

  6. Tree, funny, to_read, google: What are Tags Supposed to Achieve? A Comparative Analysis of User Keywords for Different Digital Resource Types

    Heckner, Markus; Neubauer, Tanja; Wolff, Christian

    2008-01-01

    Social tagging systems have become increasingly popular over the past years. Users' tagging practices have been little studied and understood so far. However, understanding tagging behaviour can contribute towards a thorough understanding of the tagging phenomenon from multiple perspectives. In the present paper, results of a comparative analysis of tag characteristics on the tagging platforms connotea.org (scientific articles), del.icio.us (bookmarks), flickr.com (photos), and youtube.com (v...

  7. Bias in Observational Studies of Prevalent Users: Lessons for Comparative Effectiveness Research From a Meta-Analysis of Statins

    Danaei, Goodarz; Tavakkoli, Mohammad; Hernán, Miguel A.

    2012-01-01

    Randomized clinical trials (RCTs) are usually the preferred strategy with which to generate evidence of comparative effectiveness, but conducting an RCT is not always feasible. Though observational studies and RCTs often provide comparable estimates, the questioning of observational analyses has recently intensified because of randomized-observational discrepancies regarding the effect of postmenopausal hormone replacement therapy on coronary heart disease. Reanalyses of observational data th...

  8. Comparative cytomorphometric analysis of oral mucosal cells in normal, tobacco users, oral leukoplakia and oral squamous cell carcinoma

    Mahadoon Nivia

    2015-01-01

    Conclusion: The cytomorphometric changes observed in samples from oral SCC and oral leukoplakia were consistent with the current diagnostic features. Hence, the semi-automated cytomorphometric analysis of oral mucosal cells can be used as an objective adjunct diagnostic tool in the diagnosis of these lesions.

  9. Trajectory analysis and optimization system (TAOS) user's manual

    Salguero, D.E.

    1995-12-01

    The Trajectory Analysis and Optimization System (TAOS) is software that simulates point-mass trajectories for multiple vehicles. It expands upon the capabilities of the Trajectory Simulation and Analysis program (TAP) developed previously at Sandia National Laboratories. TAOS is designed to be a comprehensive analysis tool capable of analyzing nearly any type of three degree-of-freedom, point-mass trajectory. Trajectories are broken into segments, and within each segment, guidance rules provided by the user control how the trajectory is computed. Parametric optimization provides a powerful method for satisfying mission-planning constraints. Although TAOS is not interactive, its input and output files have been designed for ease of use. When compared to TAP, the capability to analyze trajectories for more than one vehicle is the primary enhancement, although numerous other small improvements have been made. This report documents the methods used in TAOS as well as the input and output file formats.

  10. Benefit beliefs about protein supplements: A comparative study of users and non-users.

    Hartmann, Christina; Siegrist, Michael

    2016-08-01

    The consumption of protein supplements among leisure time exercisers is growing. The present study aims to identify factors that motivate protein supplement consumption by comparing users' and non-users' underlying benefit beliefs about protein supplements. The study is based on an online survey of 813 Swiss adults (376 users of protein supplements and 437 non-users). Participants answered questions related to their benefit beliefs regarding protein supplements, their protein supplement consumption frequency, their activity level (GPAQ), and their reasons for taking protein supplements. In women, the most commonly cited reasons were to increase muscles (57.3%) and to regulate their weight (48.6%); and in men, to increase muscles (83.7%) and to promote regeneration (53.7%). Furthermore, a principal component analysis revealed four benefit belief factors: (a) restore nutrients/avoid weakness; (b) fitness promotion; (c) health/well-being; (d) muscle modulation/competitive performance. The analysis showed that both users and non-users predominantly perceive protein supplement consumption as a strategy to modulate muscle mass, while belief in a health- and well-being-promoting effect was more prevalent among users (M = 3.2, SD = 1.3) than non-users (M = 2.7, SD = 1.3) (p < 0.001). Moreover, health- and wellbeing-related beliefs were associated with an increased likelihood of a higher protein supplement intake frequency (OR = 1.5, 95% CI: 1.1-1.9), while physical activity level was not associated with protein supplement intake frequency. In addition, a negative correlation between physical activity level and beliefs in a fitness-promoting effect of protein supplements (r = -0.14, p < 0.001) was observed, indicating that for a subgroup, protein supplements might license lower activity levels. Despite a lack of scientific evidence, consumers of varying activity levels consume protein supplements and believe in their various positive features. Users should be
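
    The benefit-belief factors above were obtained with a principal component analysis of Likert-scale items. The sketch below shows the general idea on synthetic data; the original study's item set, sample and any factor rotation are not reproduced here.

```python
# Illustrative PCA of Likert-scale belief items reduced to a few factors.
# The item responses are synthetic placeholders, not the survey data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 813 respondents x 12 belief items rated 1-6 (synthetic data)
items = rng.integers(1, 7, size=(813, 12)).astype(float)

pca = PCA(n_components=4)                     # four factors, as in the study
scores = pca.fit_transform(items - items.mean(axis=0))
print("explained variance ratio:", pca.explained_variance_ratio_)
print("loadings of factor 1:", pca.components_[0])
```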

  11. MAUS: MICE Analysis User Software

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
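
    The abstract describes a Map-Reduce structure with JSON as the data structure. The toy sketch below illustrates that pattern only; it is not MAUS code, and the field names are invented.

```python
# Not MAUS itself: a toy illustration of a Map-Reduce pipeline in which
# JSON-like documents ("spills") pass through a mapper and a reducer.
import json

def map_digitise(spill):
    # Pretend processing step: tag each spill with a hit count.
    spill["n_hits"] = len(spill.get("raw_hits", []))
    return spill

def reduce_count(accumulator, spill):
    # Reducer aggregates across spills, e.g. for an online data-quality summary.
    accumulator["total_hits"] = accumulator.get("total_hits", 0) + spill["n_hits"]
    return accumulator

spills = [{"raw_hits": [1, 2, 3]}, {"raw_hits": [4, 5]}]
mapped = [map_digitise(json.loads(json.dumps(s))) for s in spills]  # JSON round-trip copy
summary = {}
for s in mapped:
    summary = reduce_count(summary, s)
print(summary)  # {'total_hits': 5}
```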

  12. Comparative Analysis of Allocative Efficiency in Input use by Credit and Non Credit User Small Scale Poultry Farmers in Delta State, Nigeria

    P. C. Ike; I. Udeh

    2011-01-01

    This study examined the relative allocative efficiencies in input use by credit user and non credit user small scale poultry farmers in Delta State, Nigeria. Relative elasticities of production and returns to scale of the defined poultry farmers were examined. Primary data were collected from a random sample of 108 small scale poultry farmers consisting of 54 credit users and 54 non credit users. A stochastic frontier production function model was used to analyse the data. Results of the find...

  13. Comparative Analysis of Allocative Efficiency in Input use by Credit and Non Credit User Small Scale Poultry Farmers in Delta State, Nigeria

    P.C. Ike

    2011-11-01

    This study examined the relative allocative efficiencies in input use by credit user and non credit user small scale poultry farmers in Delta State, Nigeria. Relative elasticities of production and returns to scale of the defined poultry farmers were examined. Primary data were collected from a random sample of 108 small scale poultry farmers consisting of 54 credit users and 54 non credit users. A stochastic frontier production function model was used to analyse the data. Results of the findings indicate that none of the poultry farmer groups allocated any production input optimally. All the variables entered in the model were significant for credit and non credit users except drugs and veterinary services, which was not significant for non credit user poultry farmers. On the whole, the credit user poultry farmers over utilized (Kij < 1) feed input as well as drugs and veterinary services. The non credit user farmers over utilized (Kij < 1) feed input. It is therefore the recommendation of this study that economic policies and programmes such as the Delta state microcredit programme should be strengthened so as to improve access to credit and enhance efficiency in the use of resources by small scale poultry farmers.
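
    The over-utilisation statements above rest on the allocative-efficiency ratio k = MVP/MFC (marginal value product over marginal factor cost), with k < 1 indicating over-utilisation of an input. The sketch below computes that ratio for one input using invented numbers, not the study's estimates.

```python
# Hedged sketch of the allocative-efficiency ratio k = MVP / MFC:
# k > 1 suggests under-utilisation, k < 1 over-utilisation of an input.
# Elasticity, output value, quantity and price below are illustrative only.
def allocative_efficiency(elasticity, output_value, input_quantity, input_price):
    mvp = elasticity * output_value / input_quantity   # marginal value product
    return mvp / input_price                           # k ratio (MVP / MFC)

k_feed = allocative_efficiency(elasticity=0.45, output_value=250_000,
                               input_quantity=1_200, input_price=110)
print(f"k(feed) = {k_feed:.2f}",
      "-> over-utilised" if k_feed < 1 else "-> under-utilised")
```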

  14. Relative injury severity among vulnerable non-motorised road users: comparative analysis of injury arising from bicycle-motor vehicle and bicycle-pedestrian collisions.

    Chong, Shanley; Poulos, Roslyn; Olivier, Jake; Watson, Wendy L; Grzebieta, Raphael

    2010-01-01

    With the expansion of bicycle usage and limited funding and/or space for segregated pedestrian and bicycle paths, there is a need for traffic, road design and local government engineers to decide if it is more appropriate for space to be shared between either cyclists and pedestrians, or between cars and cyclists, and what restrictions need to be applied in such circumstances. To provide knowledge to aid engineers and policy makers in making these decisions, this study explored death and morbidity data for the state of New South Wales, Australia to examine rates and severity of injury arising from collisions between pedestrians and cyclists, and between cyclists and motor vehicles (MVs). An analysis of the severity of hospitalised injuries was conducted using International Classification of Diseases, Version 10, Australian Modification (ICD-10-AM) diagnosis-based Injury Severity Score (ICISS) and the Disability Adjusted Life Year (DALY) was used to measure burden of injury arising from collisions resulting in death or hospitalisation. The greatest burden of injury in NSW, for the studied collision mechanisms, is for cyclists who are injured in collisions with motor vehicles. Collisions between cyclists and pedestrians also result in significant injuries. For all collision mechanisms, the odds of serious injury on admission are greater for the elderly than for those in other age groups. The significant burden of injury arising from collisions of cyclists and MVs needs to be addressed. However in the absence of appropriate controls, increasing the opportunity for conflict between cyclists and pedestrians (through an increase in shared spaces for these users) may shift the burden of injury from cyclists to pedestrians, in particular, older pedestrians. PMID:19887170
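
    The diagnosis-based ICISS score mentioned above is, in its usual form, the product of the survival risk ratios (SRRs) of a patient's injury diagnoses. The snippet below illustrates that calculation with hypothetical SRR values, not the NSW estimates used in the study.

```python
# Sketch of a diagnosis-based Injury Severity Score (ICISS): the score is the
# product of survival risk ratios (SRRs) over a patient's ICD injury codes.
# SRR values here are placeholders, not estimates from the NSW data.
from functools import reduce

srr_table = {"S06.0": 0.98, "S52.5": 0.995, "S72.0": 0.96}  # hypothetical SRRs

def iciss(diagnoses):
    return reduce(lambda acc, code: acc * srr_table[code], diagnoses, 1.0)

print(round(iciss(["S06.0", "S52.5"]), 4))  # lower ICISS = more severe injury
```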

  15. User analysis of LHCb data with Ganga

    GANGA (http://cern.ch/ganga) is a job-management tool that offers a simple, efficient and consistent user analysis tool in a variety of heterogeneous environments: from local clusters to global Grid systems. Experiment-specific plug-ins allow GANGA to be customised for each experiment. For LHCb users GANGA is the officially supported and advertised tool for job submission to the Grid. The LHCb-specific plug-ins support end-to-end analysis, helping the user to perform a complete analysis with GANGA. This starts with support for data selection, where a user can select data sets from the LHCb Bookkeeping system. Next comes the set-up of large analysis jobs: with tailored plug-ins for the LHCb core software, jobs can be managed by splitting these analysis jobs and subsequently merging the resulting files. Furthermore, GANGA offers support for Toy Monte-Carlos to help the user tune their analysis. In addition to describing the GANGA architecture, typical usage patterns within LHCb and experience with the updated LHCb DIRAC workload management system are presented.

  16. ForAVis: explorative user forum analysis

    Wanner, Franz; Ramm, Thomas; Keim, Daniel

    2011-01-01

    User generated textual content on the internet has become increasingly valuable during the past few years. Forums, blogs, Twitter and other social media websites are accessible to a huge number of people all over the world. Hence, methods and tools are needed to handle this vast bulk of textual data. In this paper we present an explorative forum analysis system helping various stakeholders to cope with the task of analyzing user generated content in online forums. The used mobile communication...

  17. Language workbench user interfaces for data analysis

    Victoria M. Benson

    2015-02-01

    Biological data analysis is frequently performed with command line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, this also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to try and provide generic user interfaces that can wrap command line analysis software. These solutions are useful for problems that can be solved with workflows, and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, to train models and to view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with a LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/).

  18. Analysis of the 2011 CERN Document Server User Satisfaction Survey

    Le Meur, J-Y

    2012-01-01

    This document analyses the results of the CDS User Satisfaction Survey that ran during Autumn 2011. It shows the feedback received relative to the Search engine, the Submission procedures and the Collaborative tools. It then describes the feedback received relative to the content of the CERN Document Server and general user impressions. The feedback is compared with some key statistics that were automatically extracted from the CDS user activity logs in 2011. A selection of the most useful free text comments that were made by the respondents of the survey is also listed. In the last part, an action list derived from the combined analysis of the survey, the statistics and the comments is being drafted. 150 answers have been received in total (some, with sections not completed). 2/3rd of these answers came from users working at CERN.

  19. CONPAS 1.0 (CONtainment Performance Analysis System). User's manual

    CONPAS (CONtainment Performance Analysis System) is a verified computer code package to integrate the numerical, graphical, and results-operation aspects of Level 2 probabilistic safety assessments (PSA) for nuclear power plants automatically under a PC window environment. Compared with the existing DOS-based computer codes for Level 2 PSA, the most important merit of the window-based computer code is that users can easily describe and quantify the accident progression models, and manipulate the resultant outputs in a variety of ways. As the main logic for accident progression analysis, CONPAS employs the concept of a small containment phenomenological event tree (CPET), helpful for visually tracing out individual accident progressions, and of a large supporting event tree (LSET) for its detailed quantification. For the integrated analysis of Level 2 PSA, the code utilizes four distinct, but closely related modules: (1) ET Editor for construction of the event tree models describing the accident progressions, (2) Computer for quantification of the constructed event trees and graphical display of the resultant outputs, (3) Text Editor for preparation of input decks for quantification and utilization of calculational results, and (4) Mechanistic Code Plotter for utilization of results obtained from severe accident analysis codes. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation, and reporting aspects including tables and graphics, as well as a user-friendly interface. 10 refs. (Author)

  20. User controlled analysis of gamma ray spectra

    The program 'ANGES' was designed as a general purpose high-resolution γ ray spectrometry program. It offers all the main features of commercial software packages except control of the acquisition process. The program is able to perform automatic analysis of spectra, but it is described as 'user controlled' because it supplies all intermediate results and gives the user the opportunity to analyze and correct these results. ANGES offers: a multi-document Windows interface; detailed visualization of spectra; a nuclide library based on another contribution to the CRP; energy and FWHM calibrations calculated by means of orthonormal polynomial fitting; a peak processing engine based on a non-linear LSQ method for fitting peaks; a peak location engine based on the first derivative method, provided to ease the preparation of a spectrum for processing; two methods for efficiency calibration (an efficiency calibration curve and a reference table); a peak identification and activity calculation procedure; a number of corrections (true coincidence summing, background correction, pile-up rejection and so on); and an option for processing series of similar spectra. The fitting procedure can be applied to the whole spectrum or to a single Region-of-Interest (ROI). The assumed peak shape is pure Gaussian. All peaks in a single ROI are assumed to have the same FWHM. The maximum number of peaks in a single ROI is restricted to 25, the maximum ROI length is 512 channels, and the baseline is described with a polynomial of a degree up to 4. As a result of the identification procedure a report file is issued containing spectrum processing results, a list of identified and unidentified peaks, and a list of identified nuclides and background nuclides. (author)
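
    The core fitting step described above, non-linear least-squares fitting of a pure Gaussian peak plus a low-order baseline over a region of interest, can be sketched as follows on synthetic data; this illustrates the method only and is not ANGES itself.

```python
# Illustration of Gaussian peak fitting with non-linear least squares over a
# spectrum region of interest: Gaussian + linear baseline, synthetic counts.
import numpy as np
from scipy.optimize import curve_fit

def peak_model(ch, area, centroid, fwhm, b0, b1):
    sigma = fwhm / 2.3548                       # FWHM -> standard deviation
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2)
    return gauss + b0 + b1 * ch                 # Gaussian + linear baseline

channels = np.arange(100, 160, dtype=float)
true_counts = peak_model(channels, 5000, 130, 4.0, 20, 0.1)
counts = np.random.default_rng(1).poisson(true_counts).astype(float)

popt, _ = curve_fit(peak_model, channels, counts, p0=[4000, 128, 5, 10, 0])
print("fitted area, centroid, FWHM:", popt[0], popt[1], popt[2])
```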

  1. Trajectory Based Behavior Analysis for User Verification

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to prevent copying or simulation by other non-authorized users or even automatic programs such as bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
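
    A heavily simplified sketch of the idea described above: model trajectory increments with a Gaussian transition distribution and compare trajectories by log-likelihood. The model and data are illustrative and omit the authors' dissimilarity measure and manifold tuning.

```python
# Simplified illustration: fit a Gaussian step model to an owner's trajectory
# and score new trajectories by log-likelihood. Synthetic trajectories only.
import numpy as np
from scipy.stats import multivariate_normal

def fit_step_model(trajectory):
    steps = np.diff(trajectory, axis=0)                 # (dx, dy) increments
    return steps.mean(axis=0), np.cov(steps.T) + 1e-6 * np.eye(2)

def log_likelihood(trajectory, mean, cov):
    steps = np.diff(trajectory, axis=0)
    return multivariate_normal(mean, cov).logpdf(steps).sum()

rng = np.random.default_rng(0)
owner = np.cumsum(rng.normal([1.0, 0.2], 0.5, size=(200, 2)), axis=0)
intruder = np.cumsum(rng.normal([0.1, 1.0], 0.5, size=(200, 2)), axis=0)

mean, cov = fit_step_model(owner)
# The owner's own (held-out) trajectory should score higher than an intruder's.
print(log_likelihood(owner[:50], mean, cov) > log_likelihood(intruder[:50], mean, cov))
```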

  2. Comparative Analysis of Packet Scheduling Schemes for HSDPA Cellular Networks

    T. Janevski; K. Jakimoski

    2009-01-01

    In this paper we present a comparative analysis of packet scheduling algorithms for HSDPA (High Speed Downlink Packet Access). In particular, we analyze the round robin, max C/I and FCDS packet scheduling algorithms in HSDPA by comparing the average throughput, delay and fairness of the users while changing the number of users in pedestrian and vehicular environments. The results have shown that the number of users in a given coverage area is very important when choosing which packet sched...
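
    A toy illustration of two of the schedulers compared above: round robin serves users in turn, while max C/I always serves the user with the best instantaneous channel quality. This is not the paper's HSDPA simulator; the channel values are random placeholders.

```python
# Toy comparison of round robin and max C/I scheduling over a few time slots.
import random

def round_robin(users, slots):
    return [users[t % len(users)] for t in range(slots)]

def max_ci(users, slots, channel):
    # channel(user, t) returns an instantaneous C/I value for that user and slot.
    return [max(users, key=lambda u: channel(u, t)) for t in range(slots)]

users = ["u1", "u2", "u3"]
random.seed(42)
quality = {(u, t): random.random() for u in users for t in range(6)}
print("round robin:", round_robin(users, 6))
print("max C/I:    ", max_ci(users, 6, lambda u, t: quality[(u, t)]))
```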

  3. User Behavior and IM Topology Analysis

    Qiang Yan

    2008-07-01

    The use of Instant Messaging, or IM, has become widely adopted in private and corporate communication. IM systems provide instant, multi-directed and multi-type communication, which makes message spreading in IM different from that in WWW, blog and email systems. Groups have a great impact on message spreading in IM. The research demonstrates a power-law distribution of groups in MSN with parameter γ ranging from 0.76 to 1.22. Based on an online survey, IM user behavior is analyzed in terms of message sending/receiving and contact maintenance. According to the results, the degree distribution of users has a peak value and does not exhibit a power-law character. This may indicate that social networks are a prospective direction for research on IM topology.

  4. Characteristics of Bitcoin Users: An Analysis of Google Search Data

    Wilson, Matthew; Yelowitz, Aaron

    2014-01-01

    The anonymity of Bitcoin prevents analysis of its users. We collect Google Trends data to examine determinants of interest in Bitcoin. Based on anecdotal evidence regarding Bitcoin users, we construct proxies for four possible clientele: computer programming enthusiasts, speculative investors, Libertarians, and criminals. Computer programming and illegal activity search terms are positively correlated with Bitcoin interest, while Libertarian and investment terms are not.

  5. CMS Configuration Editor: GUI based application for user analysis job

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way, integrated within the CMS framework, which organizes user analysis code in a flexible way. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. It could be a challenging task for users, especially for newcomers, to develop analysis jobs managing the configuration of many required modules. For this reason a graphical tool has been conceived in order to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analysis, visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies existing among the modules and check the data flow. They can see to which values parameters are set and change them according to what is required by their analysis task. The integration of common tools in the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and the definition of a layer of abstraction from which all PAT tools inherit.

  6. CMS Configuration Editor: GUI based application for user analysis job

    de Cosa, A.

    2011-12-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way, integrated within the CMS framework, which organizes user analysis code in a flexible way. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. It could be a challenging task for users, especially for newcomers, to develop analysis jobs managing the configuration of many required modules. For this reason a graphical tool has been conceived in order to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analysis, visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies existing among the modules and check the data flow. They can see to which values parameters are set and change them according to what is required by their analysis task. The integration of common tools in the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and the definition of a layer of abstraction from which all PAT tools inherit.

  7. ORMONTE, Uncertainty Analysis for User-Developed System Models

    1 - Description of program or function: ORMONTE is a generic multivariable uncertainty analysis driver which can be linked to any FORTRAN model supplied by the user. The user tells ORMONTE which variables in his model are uncertain and describes the associated probability distributions. The user also tells ORMONTE which outputs from his model are of interest and for which uncertainty profiles are desired. Given the uncertainties in the inputs, ORMONTE samples the user-defined input distributions and 'drives' or runs the user's model enough times that a probability histogram or profile is constructed for the user-defined outputs of interest. ORMONTE can also perform sequential one-variable-at-a-time sensitivity studies and elasticity analysis. The user-supplied model is not restricted to a shielding model. Any FORTRAN model where uncertain outputs can be represented as functions of uncertain, independent inputs can be used. The ORMONTE package includes a set of Probability Data Analysis (PDA) routines for converting raw probability data into probability distribution format suitable for input to ORMONTE. 2 - Method of solution: ORMONTE uses the Monte Carlo technique to sample user-defined input probability distributions. 3 - Restrictions on the complexity of the problem: None noted
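
    The Monte Carlo procedure ORMONTE is described as using can be sketched generically: sample the input distributions, run the user's model once per sample, and summarise the resulting output distribution. The model and distributions below are arbitrary stand-ins, not ORMONTE.

```python
# Generic Monte Carlo uncertainty propagation: sample uncertain inputs, run a
# stand-in "user model" for each sample, and build an output profile.
import random
import statistics

def user_model(thickness_cm, density):
    # Stand-in for a user-supplied model (here a simple attenuation-like law).
    return 100.0 * 2.718281828 ** (-0.15 * density * thickness_cm)

samples = []
for _ in range(10_000):
    thickness = random.gauss(mu=10.0, sigma=0.5)   # uncertain input 1
    density = random.uniform(7.6, 8.0)             # uncertain input 2
    samples.append(user_model(thickness, density))

cuts = statistics.quantiles(samples, n=20)
print("mean output:", statistics.mean(samples))
print("5th-95th percentile:", cuts[0], cuts[-1])
```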

  8. Preliminary analysis of GASIS user needs

    Vidas, E.H.; Hugman, R.H.

    1993-12-31

    The GASIS (Gas Information System) project is a three year effort to develop a personal computer-based (CD-ROM) natural gas database and information system for the United States. GASIS will have two components: a "Source Directory" documenting natural gas supply-related databases and information centers and a "Reservoir Data System" of information for individual gas reservoirs. The Source Directory will document the location, characteristics, and accessibility of natural gas supply information sources, such as bibliographic databases, engineering and/or geological data compilations, and natural gas information centers. The Data System will be the largest portion of GASIS and will contain geological and engineering data at the reservoir level. The GASIS project will involve the compilation of existing public domain data, excerpts from Dwight's databases, and the collection of new reservoir data. Data assembly and collection will be prioritized by the User Needs study. A "User Needs" assessment for the planned GASIS data system has been underway since September of this year. It is designed to cover all major segments of the gas industry, including major and independent producers, state and federal agencies, pipelines, research organizations, banks, and service companies. The objectives of the evaluation are: to design GASIS to meet the needs of industry and the research community; to determine potential applications for GASIS in order to better design the database; to prioritize data categories and specific data collection activities; to evaluate industry software and data exchange requirements.

  9. Reinforcing user data analysis with Ganga in the LHC era: scalability, monitoring and user-support

    Ganga is a grid job submission and management system widely used in the ATLAS and LHCb experiments and by several other communities in the context of the EGEE project. The particle physics communities have entered the LHC operation era, which brings new challenges for user data analysis: a strong growth in the number of users and jobs is already noticeable. Current work in the Ganga project is focusing on dealing with these challenges. In recent Ganga releases the support for the pilot-job-based grid systems Panda and Dirac of the ATLAS and LHCb experiments, respectively, has been strengthened. A more scalable job repository architecture, which allows efficient storage of many thousands of jobs in XML or several database formats, was recently introduced. A better integration with monitoring systems, including the Dashboard and job execution monitor systems, is underway. These will provide comprehensive and easy job monitoring. A simple-to-use error reporting tool integrated at the Ganga command line will help to improve user support and the debugging of user problems. Ganga is a mature, stable and widely used tool with long-term support from the HEP community. We report on how it is being constantly improved following the user needs for faster and easier distributed data analysis on the grid.

  10. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods

  11. KEY analysis system user's guide. Version 2.0

    The KEY analysis system is a software program designed to process digital wave form data from the United States National Seismograph Network. The KEY system performs many data processing and scientific analysis functions. Detailed operating procedures for the KEY analysis system are provided in this User's Guide

  12. A User Requirements Analysis Approach Based on Business Processes

    ZHENG Yue-bin; HAN Wen-xiu

    2001-01-01

    Requirements analysis is the most important phase of information system development. Existing requirements analysis techniques pay little or no attention to the features of different business processes. This paper presents a user requirements analysis approach which focuses on business processes in the early stage of requirements analysis. It also gives an example of the use of this approach in the analysis of an enterprise information system.

  13. Uncertainty analysis in integrated assessment: the users' perspective

    Gabbert, S.G.M.; Ittersum, van M.K.; Kroeze, C.; Stalpers, S.I.P.; Ewert, F.; Alkan Olsson, J.

    2010-01-01

    Integrated Assessment (IA) models aim at providing information- and decision-support to complex problems. This paper argues that uncertainty analysis in IA models should be user-driven in order to strengthen science–policy interaction. We suggest an approach to uncertainty analysis that starts with

  14. Advanced space system analysis software. Technical, user, and programmer guide

    Farrell, C. E.; Zimbelman, H. F.

    1981-01-01

    The LASS computer program provides a tool for interactive preliminary and conceptual design of LSS. Eight program modules were developed, including four automated model geometry generators, an associated mass properties module, an appendage synthesizer module, an rf analysis module, and an orbital transfer analysis module. The existing rigid body controls analysis module was modified to permit analysis of effects of solar pressure on orbital performance. A description of each module, user instructions, and programmer information are included.

  15. Reliability of an Automated High-Resolution Manometry Analysis Program across Expert Users, Novice Users, and Speech-Language Pathologists

    Jones, Corinne A.; Hoffman, Matthew R.; Geng, Zhixian; Abdelhalim, Suzan M.; Jiang, Jack J.; McCulloch, Timothy M.

    2014-01-01

    Purpose: The purpose of this study was to investigate inter- and intrarater reliability among expert users, novice users, and speech-language pathologists with a semiautomated high-resolution manometry analysis program. We hypothesized that all users would have high intrarater reliability and high interrater reliability. Method: Three expert…

  16. Dairy Analytics and Nutrient Analysis (DANA) Prototype System User Manual

    Sam Alessi; Dennis Keiser

    2012-10-01

    This document is a user manual for the Dairy Analytics and Nutrient Analysis (DANA) model. DANA provides an analysis of dairy anaerobic digestion technology and allows users to calculate biogas production, co-product valuation, capital costs, expenses, revenue and financial metrics for user-customizable scenarios, dairy and digester types. The model provides results for three anaerobic digester types: Covered Lagoons, Modified Plug Flow, and Complete Mix, and three main energy production technologies: electricity generation, renewable natural gas generation, and compressed natural gas generation. Additional options include different dairy types, bedding types and backend treatment type, as well as numerous production and economic parameters. DANA's goal is to extend the National Market Value of Anaerobic Digester Products analysis (informa economics, 2012; Innovation Center, 2011) to include a greater and more flexible set of regional digester scenarios and to provide a modular framework for creation of a tool to support farmer and investor needs. Users can set up scenarios from combinations of existing parameters or add new parameters, run the model and view a variety of reports, charts and tables that are automatically produced and delivered over the web interface. DANA is based in the INL's analysis architecture entitled Generalized Environment for Modeling Systems (GEMS), which offers extensive collaboration, analysis, and integration opportunities and greatly speeds the ability to construct highly scalable, web-delivered, user-oriented decision tools. DANA's approach uses server-based data processing and web-based user interfaces, rather than a client-based spreadsheet approach. This offers a number of benefits over the client-based approach. Server processing and storage can scale up to handle a very large number of scenarios, so that analysis at the county, even field level, across the whole U.S. can be performed. Server based databases allow dairy and digester

  17. User-level sentiment analysis incorporating social networks

    Tan, Chenhao; Tang, Jie; Jiang, Long; Zhou, Ming; Li, Ping

    2011-01-01

    We show that information about social relationships can be used to improve user-level sentiment analysis. The main motivation behind our approach is that users that are somehow "connected" may be more likely to hold similar opinions; therefore, relationship information can complement what we can extract about a user's viewpoints from their utterances. Employing Twitter as a source for our experimental data, and working within a semi-supervised framework, we propose models that are induced either from the Twitter follower/followee network or from the network in Twitter formed by users referring to each other using "@" mentions. Our transductive learning results reveal that incorporating social-network information can indeed lead to statistically significant sentiment-classification improvements over the performance of an approach based on Support Vector Machines having access only to textual features.

  18. User's manual of a support system for human reliability analysis

    Yokobayashi, Masao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]; Tamura, Kazuo

    1995-10-01

    Many kinds of human reliability analysis (HRA) methods have been developed. However, users are required to be skillful in order to use them, and must also perform complicated work such as drawing event trees (ET) and calculating uncertainty bounds. Moreover, no single method is complete enough on its own to evaluate human reliability. Therefore, a personal computer (PC) based support system for HRA has been developed to execute HRA practically and efficiently. The system consists of two methods, namely, a simple method and a detailed one. The former uses ASEP, which is a simplified THERP technique, and a combined method of OAT and HRA-ET/DeBDA is used for the latter. Users can select a suitable method for their purpose. Human error probability (HEP) data were collected and a database of them was built for use with the support system. This paper describes an outline of the HRA methods, the support functions and the user's guide of the system. (author).

  19. Meta-analysis of molecular imaging of serotonin transporters in ecstasy/polydrug users.

    Roberts, Carl Alexander; Jones, Andrew; Montgomery, Catharine

    2016-04-01

    We conducted a meta-analysis on the available data from studies investigating SERTs in ecstasy users and polydrug using controls. From 7 studies we compared data from 157 ecstasy users and 148 controls across 14 brain regions. The main effect suggested ecstasy/MDMA related SERT reductions (SMD=0.52, 95% CIs [0.40, 0.65]; Z=8.36, pMDMA use. PMID:26855234

  20. Meta-analysis of executive functioning in ecstasy/polydrug users

    Roberts, C A; Jones, A.; Montgomery, C.

    2016-01-01

    Ecstasy/3,4-methylenedioxymethamphetamine (MDMA) use is proposed to cause damage to serotonergic (5-HT) axons in humans. Therefore, users should show deficits in cognitive processes that rely on serotonin-rich, prefrontal areas of the brain. However, there is inconsistency in findings to support this hypothesis. The aim of the current study was to examine deficits in executive functioning in ecstasy users compared with controls using meta-analysis. We identified k = 39 studies, contributing 8...

  1. Residence time distribution software analysis. User's manual

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, such as the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by the stimulus-response methods contains all the methods for data processing for radiotracer experiments.

  2. GRAFLAB 2.3 for UNIX - A MATLAB database, plotting, and analysis tool: User's guide

    Dunn, W.N.

    1998-03-01

    This report is a user's manual for GRAFLAB, which is a new database, analysis, and plotting package that has been written entirely in the MATLAB programming language. GRAFLAB is currently used for data reduction, analysis, and archival. GRAFLAB was written to replace GRAFAID, which is a FORTRAN database, analysis, and plotting package that runs on VAX/VMS.

  3. Nuclear power ecology: comparative analysis

    Ecological effects of different energy sources are compared. The main actions needed for further nuclear power development, namely improving safety and managing waste, are noted. The reasons for the public's restrained attitude toward nuclear power, and the role of social and political factors in it, are analyzed. An attempt is made to separate the real difficulties of nuclear power from the imaginary ones that appear in some mass media. International environmental protection actions are noted. Risk factors in the use of different energy sources are compared. The results of the analysis indicate that the ecological impact and risk of nuclear power are minimal.

  4. Experiences of Kratom Users: A Qualitative Analysis.

    Swogger, Marc T; Hart, Elaine; Erowid, Fire; Erowid, Earth; Trabold, Nicole; Yee, Kaila; Parkhurst, Kimberly A; Priddy, Brittany M; Walsh, Zach

    2015-01-01

    Kratom (Mitragyna speciosa) is a psychoactive plant that has been used since at least 1836 in folk medicine in Southeast Asian countries. More recently, kratom has become widely available in the West and is used for both recreational and medicinal purposes. There has, however, been little scientific research into the short- and long-term effects of kratom in humans, and much of the information available is anecdotal. To supplement the increasing scientific understanding of kratom's pharmacology and research into its effects in animals, we report the results of a qualitative analysis of first-hand descriptions of human kratom use that were submitted to, and published by, a psychoactive substance information website (Erowid.org). Themes that emerged from these experience reports indicate that kratom may be useful for analgesia, mood elevation, anxiety reduction, and may aid opioid withdrawal management. Negative response themes also emerged, indicating potential problems and unfavorable "side" effects, especially stomach upset and vomiting. Based on our analyses, we present preliminary hypotheses for future examination in controlled, quantitative studies of kratom. PMID:26595229

  5. BANK RATING. A COMPARATIVE ANALYSIS

    Batrancea Ioan

    2015-07-01

    Banks in Romania offer their customers a wide range of products, but these also involve risk taking. Therefore, researchers seek to build rating models to help bank managers assess the risk of non-recovery of loans and interest. In the following we highlight the ratings of Raiffeisen Bank, BCR-ERSTE Bank and Transilvania Bank based on the CAAMPL and Stickney models, making a comparative analysis of the two rating models.

  6. Comparing user experiences in using Twiki & Mediawiki to facilitate collaborative learning

    Liang, M.; Siu, F; Zhou, A.; Chu, S.

    2009-01-01

    This research seeks to determine the perceived effectiveness of using TWiki and MediaWiki in collaborative work and knowledge management; and to compare the use of TWiki and MediaWiki in terms of user experiences in the master’s level of study at the University of Hong Kong. Through a multiple case study approach, the study adopted a mixed methods research design which used both quantitative and qualitative methods to analyze findings from specific user groups in two study programmes. In the ...

  7. Multiple-user data acquisition and analysis system

    Manzella, V.; Chrien, R.E.; Gill, R.L.; Liou, H.I.; Stelts, M.L.

    1981-01-01

    The nuclear physics program at the Brookhaven National Laboratory High Flux Beam Reactor (HFBR) employs a pair of PDP-11 computers for the dual functions of data acquisition and analysis. The data acquisition is accomplished through CAMAC and features a microprogrammed branch driver to accommodate various experimental inputs. The acquisition computer performs the functions of multi-channel analyzers, multiscaling and time-sequenced multichannel analyzers and gamma-ray coincidence analyzers. The data analysis computer is available for rapid processing of data tapes written by the acquisition computer. The ability to accommodate many users is facilitated by separating the data acquisition and analysis functions, and allowing each user to tailor the analysis to the specific requirements of his own experiment. The system is to be upgraded soon by the introduction of a dual port disk to allow a data base to be available to each computer.

  8. Multiple-user data acquisition and analysis system

    The nuclear physics program at the Brookhaven National Laboratory High Flux Beam Reactor (HFBR) employs a pair of PDP-11 computers for the dual functions of data acquisition and analysis. The data acquisition is accomplished through CAMAC and features a microprogrammed branch driver to accommodate various experimental inputs. The acquisition computer performs the functions of multi-channel analyzers, multiscaling and time-sequenced multichannel analyzers and gamma-ray coincidence analyzers. The data analysis computer is available for rapid processing of data tapes written by the acquisition computer. The ability to accommodate many users is facilitated by separating the data acquisition and analysis functions, and allowing each user to tailor the analysis to the specific requirements of his own experiment. The system is to be upgraded soon by the introduction of a dual port disk to allow a data base to be available to each computer

  9. User's operating procedures. Volume 2: Scout project financial analysis program

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data system, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user friendly menu drivers.

  10. Methamphetamine Users Have Increased Dental Disease: A Propensity Score Analysis.

    Shetty, V; Harrell, L; Clague, J; Murphy, D A; Dye, B A; Belin, T R

    2016-07-01

    Methamphetamine (MA) users are assumed to have a high burden of tooth decay. Less clear is how the distribution and severity of dental caries in MA users differ from the general population. Using a covariate-balancing propensity score strategy, we investigated the differential effects of MA use on dental caries by comparing the patterns of decayed, missing, and filled teeth in a community sample of 571 MA users with a subset of 2,755 demographically similar control individuals selected from a National Health and Nutrition Examination Survey (NHANES) cohort. Recruited over a 2-y period with a stratified sampling protocol, the MA users underwent comprehensive dental examinations by 3 trained and calibrated dentists using NHANES protocols. Propensity scores were estimated with logistic regression based on background characteristics, and a subset of closely matched subjects was stratified into quintiles for comparisons. MA users were twice as likely to have untreated caries (odds ratio [OR] = 2.08; 95% confidence interval [95% CI]: 1.55 to 2.78) and 4 times more likely to have caries experience (OR = 4.06; 95% CI: 2.24 to 7.34) than the control group of NHANES participants. Additionally, MA users were twice as likely to have 2 or more decayed, missing, or filled teeth (OR = 2.08; 95% CI: 1.29 to 2.79) than the NHANES participants. The differential involvement of the teeth surfaces in MA users was quite distinctive, with carious surface involvement being highest for the maxillary central incisors, followed by maxillary posterior premolars and molars. Users injecting MA had significantly higher rates of tooth decay compared with noninjectors (P = 0.04). Although MA users experienced decayed and missing dental surfaces more frequently than NHANES participants, NHANES participants had more restored surfaces, especially on molars. The high rates and distinctive patterns of dental caries observed could be used 1) to alert dentists to covert MA use in their patients and 2) as
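
    The covariate-balancing propensity score strategy summarised above can be sketched as: fit a logistic regression for exposure on background characteristics, then stratify subjects into propensity quintiles before comparing outcomes. The data below are synthetic, not the NHANES or MA-user samples.

```python
# Sketch of propensity score estimation and quintile stratification on
# synthetic data (covariates, exposure model and sizes are invented).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    "age": rng.integers(18, 65, n),
    "income": rng.normal(3, 1, n),
    "smoker": rng.integers(0, 2, n),
})
# Synthetic exposure loosely related to the covariates.
logit = -3 + 0.02 * df["age"] - 0.4 * df["income"] + 0.8 * df["smoker"]
df["exposed"] = rng.random(n) < 1 / (1 + np.exp(-logit))

covariates = ["age", "income", "smoker"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["exposed"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]
df["quintile"] = pd.qcut(df["pscore"], 5, labels=False)   # strata for comparison

print(df.groupby(["quintile", "exposed"]).size().unstack(fill_value=0))
```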

  11. Comparative Perspective of Human Behavior Patterns to Uncover Ownership Bias among Mobile Phone Users

    Ayumi Arai

    2016-06-01

    With the rapid spread of mobile devices, call detail records (CDRs) from mobile phones provide more opportunities to incorporate dynamic aspects of human mobility in addressing societal issues. However, it has been increasingly observed that CDR data are not always representative of the population under study, because they include only device users. To understand the discrepancy between the population captured by CDRs and the general population, we profile the principal populations of CDRs by analyzing routines based on time spent at key locations and compare these data with those of the general population. We employ a topic model to estimate typical routines of mobile phone users from CDRs as topics. Routines are also extracted from field survey data and compared between the general population and mobile phone users. We found that there are two main population groups of mobile phone users in Dhaka: males engaged in an income-generating activity at a specific location other than home, and females performing household tasks and spending most of their time at home. We determine that CDRs tend to omit students, who form a significant component of the Dhaka population.
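
    A simplified illustration of the topic-model idea described above: treat each user's day as a "document" of (hour, place) tokens and learn routine "topics" with LDA. The tokens and data are invented and do not come from the Dhaka CDRs.

```python
# Toy LDA over per-user "documents" of hour_place tokens to recover routines.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

user_days = [
    "h08_home h10_work h12_work h15_work h19_home h22_home",
    "h08_home h11_home h14_market h17_home h21_home",
    "h09_work h12_work h16_work h20_home h23_home",
    "h08_home h13_home h18_home h22_home",
]
counts = CountVectorizer().fit_transform(user_days)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(lda.transform(counts))   # per-user mixture over routine "topics"
```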

  12. User's manual of a support system for human reliability analysis

    Many kinds of human reliability analysis (HRA) methods have been developed. However, users are required to be skillful so as to use them, and also required complicated works such as drawing event tree (ET) and calculation of uncertainty bounds. Moreover, each method is not so complete that only one method of them is not enough to evaluate human reliability. Therefore, a personal computer (PC) based support system for HRA has been developed to execute HRA practically and efficiently. The system consists of two methods, namely, simple method and detailed one. The former uses ASEP that is a simplified THERP-technique, and combined method of OAT and HRA-ET/DeBDA is used for the latter. Users can select a suitable method for their purpose. Human error probability (HEP) data were collected and a database of them was built to use for the support system. This paper describes outline of the HRA methods, support functions and user's guide of the system. (author)

  13. Development of output user interface software to support analysis

    Data processing software packages such as VSOP and MCNPX are software that has been scientifically proven and is complete. The results of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to show informative results. This research develops user interface software for the output of VSOP and MCNPX. VSOP program output is used to support neutronic analysis and MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of actinides (Thorium, Plutonium, Neptunium and Uranium). Values are visualized in graphical form to support analysis.

  14. Development of output user interface software to support analysis

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-09-01

    Data processing packages such as VSOP and MCNPX are scientifically validated and complete software. Their outputs, however, are large and complex text files, and users need additional processing, for example in Microsoft Excel, to present the results informatively. This research develops user interface software for the outputs of VSOP and MCNPX: the VSOP output is used to support neutronic analysis and the MCNPX output is used to support burn-up analysis. The software was developed with an iterative method, which allows revision and the addition of features according to user needs. Processing time with this software is about 500 times faster than with the conventional approach using Microsoft Excel. Python was used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. The values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. The burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). The values are visualized graphically to support the analysis.
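
    The record does not describe the file formats or the code itself, so the following is only a minimal sketch of the idea: scan text output for burn-up and k-eff values with a regular expression and plot them, which is the kind of step otherwise done by hand in Excel. The line layout and numbers below are hypothetical, not real VSOP/MCNPX output.

```python
# Hypothetical output format: lines like "burnup = 10.0 GWd/t   keff = 1.07310".
# Both the layout and the numbers are illustrative assumptions.
import re
import matplotlib.pyplot as plt

sample_output = """\
cycle 1   burnup = 0.0 GWd/t    keff = 1.10520
cycle 2   burnup = 10.0 GWd/t   keff = 1.07310
cycle 3   burnup = 20.0 GWd/t   keff = 1.04480
cycle 4   burnup = 30.0 GWd/t   keff = 1.01950
"""

pattern = re.compile(r"burnup\s*=\s*([\d.]+).*?keff\s*=\s*([\d.]+)")
burnup, keff = [], []
for line in sample_output.splitlines():
    m = pattern.search(line)
    if m:
        burnup.append(float(m.group(1)))
        keff.append(float(m.group(2)))

plt.plot(burnup, keff, marker="o")
plt.xlabel("burn-up (GWd/t)")
plt.ylabel("k-eff")
plt.savefig("keff_vs_burnup.png")
```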

  15. Ontology-Aided vs. Keyword-Based Web Searches: A Comparative User Study

    Kamel, Magdi; Lee, Ann; Powers, Ed

    2007-01-01

    Ontologies are formal, explicit descriptions of the concepts in a domain of discourse, the properties of these concepts, and the restrictions on these properties, specified by semantics that follow the “rules” of the domain of knowledge. As such, ontologies would be extremely useful as knowledge bases for an application attempting to add context to a particular Web search term. This paper describes such an application and reports the results of a user study designed to compare the effec...
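
    The record does not show how the application uses the ontology, so the sketch below illustrates just one plausible mechanism: expanding a keyword query with narrower concepts drawn from a toy ontology held in a plain dictionary. The ontology contents and the expansion rule are assumptions, not the paper's method.

```python
# Toy ontology: each concept maps to its narrower (more specific) concepts.
# Contents are illustrative; a real application would load an OWL/RDF ontology.
ontology = {
    "vehicle": ["car", "truck", "motorcycle"],
    "car": ["sedan", "hatchback"],
}

def expand(term, depth=2):
    """Return the term plus its narrower concepts down to the given depth."""
    terms = {term}
    frontier = [term]
    for _ in range(depth):
        frontier = [n for t in frontier for n in ontology.get(t, [])]
        terms.update(frontier)
    return terms

expanded = " OR ".join(sorted(expand("vehicle"))) + " safety"
print(expanded)   # -> "car OR hatchback OR motorcycle OR sedan OR truck OR vehicle safety"
```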

  16. User's Guide for the NREL Teetering Rotor Analysis Program (STRAP)

    Wright, A D

    1992-08-01

    The following report gives the reader an overview of and instructions on the proper use of the National Renewable Energy Laboratory (formerly the Solar Energy Research Institute, or SERI) Teetering Rotor Analysis Program (STRAP, version 2.20). STRAP is a derivative of the Force and Loads Analysis Program (FLAP). It is intended as a tool for predicting rotor and blade loads and response for two-bladed teetering-hub wind turbines only. The effects of delta-3, undersling, hub mass, and wind turbulence are accounted for. The objectives of the report are to give an overview of the code and also to show the methods of data input and the correct code execution steps in order to model an example two-bladed teetering-hub turbine. A large portion of the discussion (Sections 6.0, 7.0, and 8.0) is devoted to the subject of inputting and running the code for wind turbulence effects. The ability to include turbulent wind effects is perhaps the biggest change in the code since the release of FLAP version 2.01 in 1988. This report is intended to be a user's guide. It does not contain a theoretical discussion of equations of motion, assumptions, underlying theory, etc. It is intended to be used in conjunction with Wright, Buhl, and Thresher (1988).

  18. Comparative Analysis of Classifier Fusers

    Marcin Zmyslony

    2012-05-01

    Full Text Available There are many methods of decision making by an ensemble of classifiers. The most popular are methods that have their origin in the voting method, where the decision of the combined classifier is a combination of the individual classifiers' outputs. This work presents a comparative analysis of classifier fusion methods based on weighted voting of the classifiers' responses and on combining the classifiers' discriminant functions. We discuss different methods of producing weight-based combined classifiers. We show that a classifier better than an abstract model of a committee known as an Oracle cannot be obtained from weighted voting alone, but that models based on discriminant functions, or on a fuser that uses feature values and class numbers, can outperform the Oracle. The conclusions are confirmed by the results of computer experiments carried out on benchmark and computer-generated data.
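
    As a concrete illustration of the weighted-voting fuser discussed above, the short sketch below combines the crisp class predictions of several classifiers using per-classifier weights; the weights and predictions are illustrative placeholders, and the study also considers fusers built on discriminant functions, which are not shown here.

```python
# Weighted majority voting over crisp classifier outputs (illustrative values).
import numpy as np

# Rows: classifiers, columns: samples; entries are predicted class labels.
predictions = np.array([
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [1, 1, 1, 0],
])
weights = np.array([0.5, 0.3, 0.2])   # e.g. proportional to each classifier's accuracy
n_classes = 2

# Accumulate weighted votes per class, then pick the class with the largest support.
support = np.zeros((n_classes, predictions.shape[1]))
for clf_pred, w in zip(predictions, weights):
    for c in range(n_classes):
        support[c] += w * (clf_pred == c)

fused = support.argmax(axis=0)
print(fused)   # -> [0 1 1 0]
```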

  19. Comparative analysis of collaboration networks

    In this paper we carry out a comparative analysis of three collaboration networks: the word network built from M. Bulgakov's novel 'Master and Margarita', the synonym network of the Russian language, and the Russian movie actor network. We constructed one-mode projections of these networks, determined their degree distributions and calculated their main characteristics. We also propose a generation algorithm for collaboration networks that produces networks statistically equivalent to the ones studied, which lets us reveal a structural correlation between the word network, the synonym network and the movie actor network. We show that the degree distributions of all the analyzed networks are described by a q-type distribution.
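
    To make the one-mode projection step concrete, here is a minimal sketch using networkx: a small bipartite actor-movie graph (made-up toy data) is projected onto the actor nodes and its degree distribution is computed; the networks in the paper are of course far larger, and the generation algorithm itself is not reproduced.

```python
# One-mode projection of a toy bipartite collaboration network and its degree distribution.
import networkx as nx
from networkx.algorithms import bipartite

B = nx.Graph()
actors = ["a1", "a2", "a3", "a4"]
movies = ["m1", "m2"]
B.add_nodes_from(actors, bipartite=0)
B.add_nodes_from(movies, bipartite=1)
B.add_edges_from([("a1", "m1"), ("a2", "m1"), ("a3", "m1"), ("a3", "m2"), ("a4", "m2")])

# Two actors are linked if they collaborated on at least one common movie.
G = bipartite.projected_graph(B, actors)

print(dict(G.degree()))          # {'a1': 2, 'a2': 2, 'a3': 3, 'a4': 1}
print(nx.degree_histogram(G))    # index = degree, value = number of nodes -> [0, 1, 2, 1]
```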

  20. A cytomorphometric analysis of oral mucosal changes in tobacco users

    Khot, Komal; Deshmane, Swati; Bagri-Manjarekar, Kriti; Warke, Darshana; Kotak, Keyuri

    2015-01-01

    Aim: Tobacco use is the major cause of oral cancer, which is the sixth most common malignancy globally. Even in the absence of clinical manifestations, early changes in the oral mucosa can be detected microscopically by exfoliative cytology. The present study aimed to compare, using cytomorphometry, the cellular changes in the oral mucosa of tobacco users. Materials and Methods: A total of 80 subjects were included: 20 without any tobacco use habits, 20 tobacco chewers, 20 smo...

  1. Comparative Analysis of Packet Scheduling Schemes for HSDPA Cellular Networks

    T. Janevski

    2009-06-01

    Full Text Available In this paper we present a comparative analysis of packet scheduling algorithms for HSDPA (High Speed Downlink Packet Access) networks. In particular, we analyze the round robin, max C/I and FCDS packet scheduling algorithms in HSDPA by comparing the average throughput, delay and fairness experienced by the users while varying the number of users in pedestrian and vehicular environments. The results show that the number of users in a given coverage area is an important factor when choosing a packet scheduling algorithm for HSDPA networks. These results will be useful for choosing an adequate scheduling algorithm in an HSDPA network with the aim of satisfying the desired quality of service for the mobile users.
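
    To make two of the compared policies concrete, here is a small Monte-Carlo sketch of round robin and max C/I selection of one user per transmission interval from simulated channel-quality values; the channel model, rate proxy and numbers are illustrative assumptions and not the paper's simulation setup (the FCDS scheduler is omitted).

```python
# Illustrative comparison of round-robin vs. max C/I user selection per TTI.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_tti = 8, 10_000
# Per-TTI channel quality (arbitrary units); Rayleigh-like fading is an assumption.
cqi = rng.exponential(scale=1.0, size=(n_tti, n_users))

served_rr = np.arange(n_tti) % n_users          # round robin: take turns
served_ci = cqi.argmax(axis=1)                  # max C/I: always serve the best channel

rate = np.log2(1.0 + cqi)                       # Shannon-like rate proxy
throughput_rr = rate[np.arange(n_tti), served_rr].mean()
throughput_ci = rate[np.arange(n_tti), served_ci].mean()

# Jain's fairness index over how often each user is served.
def jain(counts):
    counts = np.asarray(counts, dtype=float)
    return counts.sum() ** 2 / (len(counts) * (counts ** 2).sum())

print("throughput  RR:", round(throughput_rr, 3), " maxC/I:", round(throughput_ci, 3))
print("fairness    RR:", round(jain(np.bincount(served_rr, minlength=n_users)), 3),
      " maxC/I:", round(jain(np.bincount(served_ci, minlength=n_users)), 3))
```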

  2. Reactor safety analysis computer program features that enhance user productivity

    This paper describes several design features of the MARY computer program that increase user productivity. The MARY program was used to analyze behavior of the Savannah River Site (SRS) K Reactor during postulated nuclear and thermal-hydraulic transients, such as overpower and underflow events, before K Reactor was placed in cold standby in 1993. These analyses provide the bases for portions of the accident chapter of the K-Reactor Safety Analysis Report

  3. User's Guide for the NREL Force and Loads Analysis Program. Version 2.2

    Wright, A D

    1992-08-01

    The following report gives the reader an overview of and instructions on the proper use of the National Renewable Energy Laboratory Force and Loads Analysis Program (FLAP, version 2.2). It is intended as a tool for prediction of rotor and blade loads and response for two- or three-bladed rigid hub wind turbines. The effects of turbulence are accounted for. The objectives of the report are to give an overview of the code and also show the methods of data input and correct code execution steps in order to model an example two-bladed rigid hub turbine. A large portion of the discussion (Sections 6.0, 7.0, and 8.0) is devoted to the subject of inputting and running the code for wind turbulence effects. The ability to include turbulent wind effects is perhaps the biggest change in the code since the release of FLAP version 2.01 in 1988. This report is intended to be a user's guide. It does not contain a theoretical discussion on equations of motion, assumptions, underlying theory, etc. It is intended to be used in conjunction with Wright, Buhl, and Thresher (1988).

  4. New indicators of illegal drug use to compare drug user populations for policy evaluation

    Francesco Fabi

    2013-11-01

    Full Text Available Background: Recent patterns of drug consumption show a trend towards higher poly-use. The epidemiological indicators presently used are mostly based on the prevalence of users of the “main” substances, and the ranking of the harm caused by drug use is based on single-substance analysis. Methods: In this paper new indicators are proposed; the approach considers the segmentation of the population with respect to the frequency of use in the last 30 days and the harm score of the various substances used by a poly-user. Scoring is based on single-substance score tables reported in recent papers, and principal component analysis is applied to reduce dimensionality. Each user is characterized by two new scores: a frequency-of-use score and a poly-use score. Results: The method is applied to the drug user populations interviewed in Communities and Low Threshold Services within the Problem Drug Use 2012 survey in four different European countries. The comparison of the poly-use score cumulative distributions gives insight into behavioural trends of drug use and also allows the efficacy of the intervention services to be evaluated. Furthermore, applying the method to School Population Survey 2011 data defines the behaviour the poly-drug score would be expected to show in a representative general population survey. Conclusions: In general, the method is simple and intuitive, and could be applied to surveys containing questions about drug use. A possible limitation is that the median is chosen for calculating the frequency-of-use score in questionnaires that record the frequency of drug use in classes.
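
    The exact score construction is only summarized above, so the following sketch shows one way the idea could be coded: each respondent's last-30-days use frequencies are weighted by per-substance harm scores and reduced to a single poly-use score with PCA. Both the harm values and the survey rows are illustrative placeholders, not the score tables or data used in the paper.

```python
# Illustrative harm-weighted poly-use scoring with PCA (all numbers are placeholders).
import numpy as np
from sklearn.decomposition import PCA

substances = ["cannabis", "cocaine", "heroin", "amphetamine"]
harm = np.array([0.3, 0.7, 1.0, 0.6])          # illustrative single-substance harm scores

# Rows: respondents; columns: days of use in the last 30 days for each substance.
freq = np.array([
    [20,  0,  0,  0],
    [10,  4,  0,  2],
    [ 0, 15, 10,  0],
    [ 2,  0,  0,  0],
], dtype=float)

frequency_score = freq.sum(axis=1)              # overall intensity of use
weighted = freq * harm                          # harm-weighted use per substance

poly_use_score = PCA(n_components=1).fit_transform(weighted).ravel()
for i, (f, p) in enumerate(zip(frequency_score, poly_use_score)):
    print(f"user {i}: frequency score = {f:.0f}, poly-use score = {p:.2f}")
```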

  5. Pediatric contact lens users in public and private services: comparative analysis

    Daniela Araújo Toscano

    2009-04-01

    Full Text Available PURPOSE: To analyze the indications for, types of, and complications of contact lens use, and the visual acuity, in children attending public and private ophthalmology services. METHODS: Data from the medical records of 59 pediatric contact lens users at a private service (Hospital de Olhos de Pernambuco, Recife, PE, Brazil - Group 1) and 43 at a public service (Fundação Altino Ventura, Recife, PE, Brazil - Group 2) were analyzed. The collected data included sociodemographic characteristics, age at the first visit, indication for lens use, type of lens, complications and visual acuity. RESULTS: The most common indications for contact lens use in Group 1 were ametropia (55.9%), anisometropia (18.6%) and esotropia (16.9%); leucoma and phthisis were not present in this group. In Group 2 the most common indications were anisometropia (23.2%), ametropia and leucoma (18.6% each) and phthisis (16.3%); esotropia did not occur in Group 2. The most frequently prescribed lens type was the permanent-wear (non-disposable) soft lens, both in Group 1 (45.8%) and in Group 2 (32.6%). The most frequent complication was discomfort in Group 1 (33.3%) and lens loss in Group 2 (60%). CONCLUSIONS: Ametropia was the predominant indication among the private patients and anisometropia among the public patients. The most frequently prescribed lens type in both groups was the permanent-wear soft lens. The most frequent complication was discomfort in Group 1 and lens loss in Group 2. Visual acuity was maintained in most patients.

  6. GPP user's guide - a general-purpose postprocessor for wind turbine data analysis

    Buhl, Jr, M L

    1995-01-01

    GPP (pronounced “jeep”) is a General-Purpose Postprocessor for wind turbine data analysis. The author, a member of the Wind Technology Division (WTD) of the National Renewable Energy Laboratory (NREL), developed GPP to postprocess test data and simulation predictions. GPP reads data into large arrays and allows the user to run many types of analyses on the data stored in memory. It runs on inexpensive computers common in the wind industry. One can even use it on a laptop in the field. The author wrote the program in such a way as to make it easy to add new types of analyses and to port it to many types of computers. Although GPP is very powerful and feature-rich, it is still very easy to learn and to use. Exhaustive error trapping prevents one from losing valuable work due to input errors. GPP will, hopefully, make a significant impact on engineering productivity in the wind industry.

  7. User needs and requirements analysis for big data healthcare applications.

    Zillner, Sonja; Lasierra, Nelia; Faix, Werner; Neururer, Sabrina

    2014-01-01

    The realization of big data applications that allow improving the quality and efficiency of healthcare delivery is challenging. In order to take advantage of the promising opportunities of big data technologies, a clear understanding of the user needs and requirements of the various stakeholders of healthcare, such as patients, clinicians and physicians, healthcare providers, payors, the pharmaceutical industry, medical product suppliers and government, is needed. Our study is based on internet, literature and market research as well as on semi-structured interviews with the major stakeholder groups of healthcare delivery settings. The analysis shows that big data technologies could be used to align the opposing user needs of improved quality and improved efficiency of care. However, this requires an integrated view of various heterogeneous data sources, legal frameworks for data sharing, and incentives that foster collaboration. PMID:25160268

  8. Architecture of collaborating frameworks: simulation, visualisation, user interface and analysis

    In modern high energy and astrophysics experiments the variety of user requirements and the complexity of the problem domain often involve the collaboration of several software frameworks, with different components responsible for providing the functionalities related to each domain. For instance, a common use case consists of studying the physics effects and the detector performance resulting from primary events in a given detector configuration, in order to evaluate the physics reach of the experiment or optimise the detector design. Such a study typically involves various components: Simulation, Visualisation, Analysis and (interactive) User Interface. The authors focus on the design aspects of the collaboration of these frameworks and on the technologies that help to simplify the complex process of software design.

  9. Cultural Differences in User Privacy Behavior on Social Networking Sites : An Empirical Study comparing German and Swedish Facebook Users

    Falk, Sebastian; Riel, Nils

    2013-01-01

    Social Networking Sites (SNSs), such as Facebook, are becoming increasingly popular. Their worldwide accessibility is attracting billions of SNS users from all over the globe, which results in a variety of different cultures meeting on the respective platforms. Apart from their growing popularity, privacy issues represent a downside of SNSs attracting strong media and research attention. Considering SNS users’ cultural diversity, recent studies show that a culture influences the privacy conce...

  10. Meta-analysis of executive functioning in ecstasy/polydrug users.

    Roberts, C A; Jones, A; Montgomery, C

    2016-06-01

    Ecstasy/3,4-methylenedioxymethamphetamine (MDMA) use is proposed to cause damage to serotonergic (5-HT) axons in humans. Therefore, users should show deficits in cognitive processes that rely on serotonin-rich, prefrontal areas of the brain. However, there is inconsistency in findings to support this hypothesis. The aim of the current study was to examine deficits in executive functioning in ecstasy users compared with controls using meta-analysis. We identified k = 39 studies, contributing 89 effect sizes, investigating executive functioning in ecstasy users and polydrug-using controls. We compared function-specific task performance in 1221 current ecstasy users and 1242 drug-using controls, from tasks tapping the executive functions - updating, switching, inhibition and access to long-term memory. The significant main effect demonstrated overall executive dysfunction in ecstasy users [standardized mean difference (SMD) = -0.18, 95% confidence interval (CI) -0.26 to -0.11, Z = 5.05, p < 0.001, I² = 82%], with a significant subgroup effect (χ² = 22.06, degrees of freedom = 3, p < 0.001, I² = 86.4%) demonstrating differential effects across executive functions. Ecstasy users showed significant performance deficits in access (SMD = -0.33, 95% CI -0.46 to -0.19, Z = 4.72, p < 0.001, I² = 74%), switching (SMD = -0.19, 95% CI -0.36 to -0.02, Z = 2.16, p < 0.05, I² = 85%) and updating (SMD = -0.26, 95% CI -0.37 to -0.15, Z = 4.49, p < 0.001, I² = 82%). No differences were observed in inhibitory control. We conclude that this is the most comprehensive analysis of executive function in ecstasy users to date and provides a behavioural correlate of potential serotonergic neurotoxicity. PMID:26966023
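
    For readers unfamiliar with how such pooled SMDs are obtained, the sketch below implements a generic DerSimonian-Laird random-effects pooling of standardized mean differences; the effect sizes and variances in it are made-up illustrative numbers, not the studies analysed in this meta-analysis, and the paper's exact pooling procedure may differ.

```python
# Generic DerSimonian-Laird random-effects pooling of SMDs (illustrative data only).
import numpy as np

smd = np.array([-0.35, -0.10, -0.25, -0.30])    # per-study standardized mean differences
var = np.array([0.020, 0.015, 0.030, 0.025])    # per-study variances of the SMDs

w_fixed = 1.0 / var
mean_fixed = np.sum(w_fixed * smd) / np.sum(w_fixed)

# Between-study heterogeneity (tau^2) via the DerSimonian-Laird estimator.
Q = np.sum(w_fixed * (smd - mean_fixed) ** 2)
df = len(smd) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)

w_rand = 1.0 / (var + tau2)
pooled = np.sum(w_rand * smd) / np.sum(w_rand)
se = np.sqrt(1.0 / np.sum(w_rand))
i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

print(f"pooled SMD = {pooled:.2f}, 95% CI = ({pooled - 1.96*se:.2f}, {pooled + 1.96*se:.2f}), I^2 = {i2:.0f}%")
```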

  11. Development of a User Interface for a Regression Analysis Software Tool

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  12. Development of a task analysis tool to facilitate user interface design

    Scholtz, Jean C.

    1992-01-01

    A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

  13. CMS dashboard for monitoring of the user analysis activities

    Karavakis, Edward; Andreeva, Julia; Maier, Gerhild; Khan, Akram

    2012-12-01

    The CMS Virtual Organisation (VO) uses various fully distributed job submission methods and execution backends. The CMS jobs are processed on several middleware platforms such as the gLite, the ARC and the OSG. Up to 200,000 CMS jobs are submitted daily to the Worldwide LHC Computing Grid (WLCG) infrastructure and this number is steadily growing. These mentioned factors increase the complexity of the monitoring of the user analysis activities within the CMS VO. Reliable monitoring is an aspect of particular importance; it is a vital factor for the overall improvement of the quality of the CMS VO infrastructure.

  14. BWR plant dynamic analysis code BWRDYN user's manual

    Computer code BWRDYN has been developed for thermal-hydraulic analysis of a BWR plant. It can analyze various types of transients caused by both small and large disturbances, such as operating mode changes and/or system malfunctions. The main analytical models of the BWRDYN code have been verified against measured data from an actual BWR plant. Furthermore, the addition of a balance-of-plant (BOP) model has made it possible to analyze the effect of the BOP on the reactor system. This report describes the analytical models and gives instructions for users of the BWRDYN code. (author)

  15. Architecture of collaborating frameworks simulation, visualisation, user interface and analysis

    Pfeier, A; Ferrero-Merlino, B; Giannitrapani, R; Longo, F; Nieminen, P; Pia, M G; Santin, G

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  16. Sociological analysis and comparative education

    Woock, Roger R.

    1981-12-01

    It is argued that comparative education is essentially a derivative field of study, in that it borrows theories and methods from academic disciplines. After a brief humanistic phase, in which history and philosophy were central for comparative education, sociology became an important source. In the mid-50's and 60's, sociology in the United States was characterised by Structural Functionalism as a theory, and Social Survey as a dominant methodology. Both were incorporated into the development of comparative education. Increasingly in the 70's, and certainly today, the new developments in sociology are characterised by an attack on Positivism, which is seen as the philosophical position underlying both functionalism and survey methods. New or re-discovered theories with their attendant methodologies included Marxism, Phenomenological Sociology, Critical Theory, and Historical Social Science. The current relationship between comparative education and social science is one of uncertainty, but since social science is seen to be returning to its European roots, the hope is held out for the development of an integrated social theory and method which will provide a much stronger basis for developments in comparative education.

  17. GCtool for fuel cell systems design and analysis : user documentation.

    Ahluwalia, R.K.; Geyer, H.K.

    1999-01-15

    GCtool is a comprehensive system design and analysis tool for fuel cell and other power systems. A user can analyze any configuration of component modules and flows under steady-state or dynamic conditions. Component models can be arbitrarily complex in modeling sophistication and new models can be added easily by the user. GCtool also treats arbitrary system constraints over part or all of the system, including the specification of nonlinear objective functions to be minimized subject to nonlinear equality or inequality constraints. This document describes the essential features of the interpreted language and the window-based GCtool environment. The system components incorporated into GCtool include a gas flow mixer, splitter, heater, compressor, gas turbine, heat exchanger, pump, pipe, diffuser, nozzle, steam drum, feed water heater, combustor, chemical reactor, condenser, fuel cells (proton exchange membrane, solid oxide, phosphoric acid, and molten carbonate), shaft, generator, motor, and methanol steam reformer. Several examples of system analysis at various levels of complexity are presented. Also given are instructions for generating two- and three-dimensional plots of data and the details of interfacing new models to GCtool.
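
    GCtool's own interpreted language is not shown in this record, so the sketch below only illustrates the underlying idea - minimizing a nonlinear objective subject to nonlinear equality and inequality constraints - using Python and scipy rather than GCtool; the objective and constraints are arbitrary placeholders, not a fuel cell model.

```python
# Generic nonlinear constrained minimization, analogous in spirit to GCtool's
# system-constraint handling (placeholder objective and constraints only).
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Stand-in for something like total system cost or fuel consumption.
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 3.0},   # e.g. a mass/energy balance
    {"type": "ineq", "fun": lambda x: 2.0 - x[0]},          # e.g. an operating limit, x0 <= 2
]

result = minimize(objective, x0=np.array([0.0, 0.0]), method="SLSQP", constraints=constraints)
print(result.x, result.fun)
```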

  18. Performance analysis of an opportunistic multi-user cognitive network with multiple primary users

    Khan, Fahd Ahmed

    2014-03-01

    Consider a multi-user underlay cognitive network where multiple cognitive users concurrently share the spectrum with a primary network with multiple users. The channels within the secondary network are assumed to experience independent but not identically distributed Nakagami-m fading. The interference channels between the secondary users (SUs) and the primary users are assumed to experience Rayleigh fading. A power allocation based on the instantaneous channel state information is derived when a peak interference power constraint is imposed on the secondary network in addition to the limited peak transmit power of each SU. The uplink scenario is considered, in which a single SU is selected for transmission. This opportunistic selection depends on the transmission channel power gain and the interference channel power gain, as well as on the power allocation policy adopted at the users. Exact closed-form expressions for the moment-generating function, the outage performance, the symbol error rate performance, and the ergodic capacity are derived. Numerical results corroborate the derived analytical results. The performance is also studied in the asymptotic regimes, and the generalized diversity gain of this scheduling scheme is derived. It is shown that when the interference channel is deeply faded and the peak transmit power constraint is relaxed, the scheduling scheme achieves full diversity, and that increasing the number of primary users does not impact the diversity order. © 2014 John Wiley & Sons, Ltd.
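
    The closed-form analysis itself is not reproduced here; the sketch below is only a Monte-Carlo illustration of the scheme's ingredients - a peak-interference-limited power allocation and opportunistic selection of the secondary user with the best resulting SNR - with the allocation rule, a single interference link per user, and all parameter values taken as assumptions rather than from the paper.

```python
# Monte-Carlo sketch of underlay opportunistic scheduling (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(1)
n_su, n_trials = 4, 200_000
m = np.array([1.0, 1.5, 2.0, 2.5])     # per-user Nakagami-m shape (non-identical fading)
P_max, Q_peak, noise = 1.0, 0.5, 0.1   # peak transmit power, peak interference limit, noise power
snr_threshold = 1.0                    # outage threshold

# Squared Nakagami-m envelope is Gamma(m, 1/m); interference links are Rayleigh (exponential power).
h = rng.gamma(shape=m, scale=1.0 / m, size=(n_trials, n_su))   # SU -> base-station power gains
g = rng.exponential(scale=1.0, size=(n_trials, n_su))          # SU -> primary-user power gains

# Assumed allocation: transmit at P_max unless the interference cap Q_peak/g is tighter.
power = np.minimum(P_max, Q_peak / g)
snr = power * h / noise

best = snr.max(axis=1)                 # opportunistically pick the SU with the best SNR
outage = np.mean(best < snr_threshold)
print(f"outage probability ~ {outage:.4f}, ergodic capacity ~ {np.mean(np.log2(1 + best)):.3f} bit/s/Hz")
```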

  19. Analysis of the Navigation Behavior of the Users' using Grey Relational Pattern Analysis with Markov Chains

    BINDU MADHURI .Ch,; DR. ANAND CHANDULAL.J

    2010-01-01

    Generally, user page visits are sequential in nature. The large number of Web pages on many Web sites has raised navigational problems. Markov chains have been used to model users' sequential navigational behavior on the World Wide Web (WWW). The enormous growth in the number of documents on the WWW increases the need for improved link navigation and path analysis models. Link prediction and path analysis are important problems with a wide range of applications ranging from personalization to web...
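
    As a minimal illustration of the Markov-chain part of such models, the sketch below estimates a first-order transition matrix from a handful of made-up page-visit sequences and uses it to predict the most likely next page; the grey relational pattern analysis layer named in the title is not reproduced here.

```python
# Estimate a first-order Markov transition matrix from toy navigation sequences.
import numpy as np

pages = ["home", "search", "product", "cart"]
idx = {p: i for i, p in enumerate(pages)}

sessions = [
    ["home", "search", "product", "cart"],
    ["home", "product", "cart"],
    ["home", "search", "search", "product"],
]

counts = np.zeros((len(pages), len(pages)))
for s in sessions:
    for a, b in zip(s, s[1:]):
        counts[idx[a], idx[b]] += 1

row_sums = counts.sum(axis=1, keepdims=True)
transition = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(np.round(transition, 2))

current = "search"
print("most likely next page after", current, "->", pages[int(transition[idx[current]].argmax())])
```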

  20. A RANDOMIZED COMPARATIVE STUDY OF MENSTRUAL PATTERNS IN MEGESTROL ACETATE IUD AND TCu220c IUD USERS

    YAN Jun-Hong; PAN Jia-Xiang; ZHANG Ya-Qin; KANG Jin-Fang; LING Xiu-Ying

    1989-01-01

    This is a randomized comparative study of menstrual patterns in 30 users of the megestrol acetate IUD and 30 users of the TCu220c IUD, before insertion and at the 1st, 3rd, 6th and 12th cycles after insertion.

  1. Comparative performance analysis of nonorthogonal joint diagonalization algorithms

    Ammar, Mesloub; Abed-Meraim, Karim; Belouchrani, Adel

    2013-01-01

    Recently, many non-orthogonal joint diagonalization (NOJD) algorithms have been developed and applied in several applications, including blind source separation (BSS) problems. The aim of this paper is to provide an overview of the major complex NOJD (CNOJD) algorithms and to study and compare their performance. The analysis reveals many interesting features that help the non-expert user to select a CNOJD method depending on the application conditions.
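
    One common yardstick for comparing joint diagonalization algorithms is how close the transformed target matrices are to diagonal; the sketch below computes such a normalized off-diagonality criterion for a set of toy matrices and a candidate diagonalizer. Both the random matrices and this particular criterion are assumptions for illustration, not necessarily the paper's exact performance measure.

```python
# Off-diagonality criterion for assessing a candidate joint diagonalizer B on matrices C_k.
import numpy as np

def off_criterion(B, matrices):
    """Average over k of ||offdiag(B C_k B^H)||_F^2 / ||B C_k B^H||_F^2 (0 = perfectly diagonal)."""
    total = 0.0
    for C in matrices:
        T = B @ C @ B.conj().T
        off = T - np.diag(np.diag(T))
        total += np.linalg.norm(off) ** 2 / np.linalg.norm(T) ** 2
    return total / len(matrices)

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))   # toy mixing matrix
targets = [A @ np.diag(rng.standard_normal(3)) @ A.conj().T for _ in range(5)]

B_exact = np.linalg.inv(A)          # the true diagonalizer for these toy matrices
B_naive = np.eye(3)                 # a poor candidate, for comparison
print("exact:", off_criterion(B_exact, targets))   # ~ 0
print("naive:", off_criterion(B_naive, targets))   # clearly larger
```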

  2. Blog Content and User Engagement - An Insight Using Statistical Analysis.

    Apoorva Vikrant Kulkarni

    2013-06-01

    Full Text Available Over the past few years, organizations have increasingly realized the value of social media in positioning, propagating and marketing their products, services and the organization itself. Today every organization, whether small or big, has realized the importance of creating a space on the World Wide Web. Social media, through its multifaceted platforms, has enabled organizations to propagate their brands. There are a number of social media networks which are helpful in spreading the message to customers. Many organizations have full-time web analytics teams that regularly try to ensure that prospective customers are visiting the organization through various forms of social media. Web analytics is seen by organizations as a tool for business intelligence, and a large number of analytics tools are available for monitoring the visibility of a particular brand on the web. For example, Google has its own analytics tool that is very widely used. There are a number of free as well as paid analytical tools available on the internet. The objective of this paper is to study what blog content on social media creates a greater impact on user engagement. The study statistically analyzes the relation between blog content and user engagement. The statistical analysis was carried out on a blog of a reputed management institute in Pune to arrive at conclusions.

  3. User's guide for the REBUS-3 fuel cycle analysis capability

    REBUS-3 is a system of programs designed for the fuel-cycle analysis of fast reactors. This new capability is an extension and refinement of the REBUS-2 code system and complies with the standard code practices and interface dataset specifications of the Committee on Computer Code Coordination (CCCC). The new code is hence divorced from the earlier ARC System. In addition, the coding has been designed to enhance code exportability. Major new capabilities not available in the REBUS-2 code system include: a search on burn cycle time to achieve a specified value of the multiplication constant at the end of the burn step; a general non-repetitive fuel-management capability including temporary out-of-core fuel storage, loading of fresh fuel, and subsequent retrieval and reloading of fuel; significantly expanded user input checking; expanded output edits; provision of prestored burnup chains to simplify user input; the option of fixed- or free-field BCD input formats; and a choice of finite-difference, nodal or spatial flux-synthesis neutronics in one, two or three dimensions.

  4. Usability and use reference in the social network facebook: a netnographic analysis of technological users

    Naiara Silva Ferreira

    2015-10-01

    Full Text Available This article presents a study of usage preferences in virtual social networks, using Facebook as the object of study, in order to identify the motivational factors behind the usability of this technology platform. The social network Facebook was chosen because it presents a technological scenario of high sociability and virtual interaction. The methodology used was netnography, carried out by collecting discussions from North American news sites and online forums with substantial critical user participation about the gains and frustrations experienced in this context. The content analysis compared the categories of users found in the literature on the values that motivate technology consumption, describing hedonic, social and utilitarian values as well as perceptions of risk in consumption related to lack of privacy. The results show two main groups of users of this technology and seven subgroups. The contribution of the study is that the formation of these groups may reflect the technological usability of user groups around the world. The study also brings into the discussion issues related to the behavior of users of virtual networks, which can be useful for businesses and their relationships with consumers, and for developing new knowledge from the criticism and demands that digital consumers express about these technologies.

  5. International Reactor Physics Handbook Database and Analysis Tool (IDAT) - IDAT user manual

    The IRPhEP Database and Analysis Tool (IDAT) was first released in 2013 and is included on the DVD. This database and its corresponding user interface allow easy access to handbook information. Selected information from each configuration was entered into IDAT, such as the measurements performed, benchmark values, calculated values and the materials specifications of the benchmark. In many cases this is supplemented with calculated data such as neutron balance data, spectral data, k-eff sensitivities to nuclear data, and spatial reaction rate plots. IDAT accomplishes two main objectives: 1. Allow users to search the handbook for experimental configurations that satisfy their input criteria. 2. Allow users to trend results and identify suitable benchmark experiments for their application. IDAT provides the user with access to several categories of calculated data, including: - 1-group neutron balance data for each configuration, with individual isotope contributions in the reactor system. - Flux and other reaction-rate spectra in a 299-group energy scheme; plotting capabilities were implemented in IDAT, allowing the user to compare the spectra of selected configurations in the original fine energy structure or on any user-defined broader energy structure. - Sensitivity coefficients (percent changes of k-effective due to an elementary change of basic nuclear data) for the major nuclides and nuclear processes in a 238-group energy structure. IDAT is actively being developed. Those approved to access the online version of the handbook will also have access to an online version of IDAT. As May 2013 marks the first release, IDAT may contain data entry errors and omissions. The handbook remains the primary source of reactor physics benchmark data. A copy of the IDAT user's manual is attached to this document. A copy of the IRPhE Handbook can be obtained on request at http://www.oecd-nea.org/science/wprs/irphe/irphe-handbook/form.html

  6. SOR Users' Guide: How to Navigate Through the SOR Analysis.

    United States. Bonneville Power Administration.

    1996-08-01

    The Columbia River System Operation Review (SOR) gave river managers, users, and the general public a chance to examine system operations in detail, to study how each river use affects others, and to consider the consequences of changing the way the system works. The task was enormous, and it was a multiyear undertaking. In its wake, the SOR left a multitude of documents and six years of analysis that can and should be used broadly for other reference and research purposes. This catalog will introduce you to numerous SOR products to be found throughout the 20 appendices and the Final Environmental Impact Statement (EIS) Main Report. They include maps, models, data bases, current descriptions of Federal hydro projects and river resources, publications, and slide shows.

  7. RENT CONTROL: A COMPARATIVE ANALYSIS

    Sue-Mari Maass

    2012-11-01

    Full Text Available Recent case law shows that vulnerable, previously disadvantaged private sector tenants are currently facing eviction orders – and consequential homelessness – on the basis that their leases have expired. In terms of the case law it is evident that once their leases have expired, these households do not have access to alternative accommodation. In terms of the Constitution, this group of marginalised tenants have a constitutional right of access to adequate housing and a right to occupy land with legally secure tenure. The purpose of this article is to critically analyse a number of legislative interventions, and specifically rent control, that were imposed in various jurisdictions in order to provide strengthened tenure protection for tenants. The rationale for this analysis is to determine whether the current South African landlord-tenant regime is able to provide adequate tenure protection for vulnerable tenants and therefore in the process of transforming in line with the Constitution. The legal construction of rent control was adopted in pre-1994 South Africa, England and New York City to provide substantive tenure protection for tenants during housing shortages. These statutory interventions in the different private rental markets were justified on the basis that there was a general need to protect tenants against exploitation by landlords. However, the justification for the persistent imposition of rent control in New York City is different since it protects a minority group of financially weak tenants against homelessness. The English landlord-tenant regime highlights the importance of a well-structured social sector that can provide secure, long-term housing options for low-income households who are struggling to access the private rental sector. Additionally, the English rental housing framework shows that if the social sector is functioning as a "safety net" for low-income households, the private sector would be able to uphold

  8. Comparing the performance of expert user heuristics and an integer linear program in aircraft carrier deck operations.

    Ryan, Jason C; Banerjee, Ashis Gopal; Cummings, Mary L; Roy, Nicholas

    2014-06-01

    Planning operations across a number of domains can be considered as resource allocation problems with timing constraints. An unexplored instance of such a problem domain is the aircraft carrier flight deck, where, in current operations, replanning is done without the aid of any computerized decision support. Rather, veteran operators employ a set of experience-based heuristics to quickly generate new operating schedules. These expert user heuristics are neither codified nor evaluated by the United States Navy; they have grown solely from the convergent experiences of supervisory staff. As unmanned aerial vehicles (UAVs) are introduced in the aircraft carrier domain, these heuristics may require alterations due to differing capabilities. The inclusion of UAVs also allows for new opportunities for on-line planning and control, providing an alternative to the current heuristic-based replanning methodology. To investigate these issues formally, we have developed a decision support system for flight deck operations that utilizes a conventional integer linear program-based planning algorithm. In this system, a human operator sets both the goals and constraints for the algorithm, which then returns a proposed schedule for operator approval. As a part of validating this system, the performance of this collaborative human-automation planner was compared with that of the expert user heuristics over a set of test scenarios. The resulting analysis shows that human heuristics often outperform the plans produced by an optimization algorithm, but are also often more conservative. PMID:23934675

  9. User's manual for seismic analysis code 'SONATINA-2V'

    The seismic analysis code SONATINA-2V has been developed to analyze the behavior of the HTTR core graphite components under seismic excitation. SONATINA-2V is a two-dimensional computer program capable of analyzing the vertical arrangement of the HTTR graphite components, such as fuel blocks, replaceable reflector blocks and permanent reflector blocks, as well as their restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins, which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Moreover, SONATINA-2V is capable of analyzing the core vibration behavior under simultaneous excitation in the vertical and horizontal directions. The code is composed of the main program, a pre-processor for preparing the SONATINA-2V input data, and a post-processor for data processing and for producing graphics from the analytical results. Although SONATINA-2V was developed to run on the MSP computer system of the Japan Atomic Energy Research Institute (JAERI), that system was retired as computer technology advanced. Therefore, the analysis code was improved so that it can run on JAERI's UNIX-based SR8000 computer system. The user's manual for the seismic analysis code SONATINA-2V, including the pre- and post-processors, is given in the present report. (author)

  10. Development of Point Kernel Shielding Analysis Computer Program Implementing Recent Nuclear Data and Graphic User Interfaces

    Kang, Sang Ho; Lee, Seung Gi; Chung, Chan Young [Korea Power Engineering Co, Ltd., Seoul (Korea, Republic of); Lee, Choon Sik; Lee, Jai Ki [Hanyang Univ., Seoul (Korea, Republic of)

    2001-09-15

    In order to comply with revised national regulations on radiological protection and to implement recent nuclear data and dose conversion factors, KOPEC developed a new point kernel gamma and beta ray shielding analysis computer program. This new code, named VisualShield, adopted mass attenuation coefficients and buildup factors from recent ANSI/ANS standards and flux-to-dose conversion factors from the International Commission on Radiological Protection (ICRP) Publication 74 for estimation of the effective/equivalent dose recommended in ICRP 60. VisualShield utilizes graphical user interfaces and 3-D visualization of the geometric configuration for preparing input data sets and analyzing results, which leads users to error-free processing with visual effects. Code validation and data analysis were performed by comparing the results of various calculations to the data outputs of previous programs such as MCNP 4B, ISOSHLD-II, QAD-CGGP, etc.
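
    To illustrate the point-kernel method such codes implement, the sketch below evaluates the classic kernel - uncollided flux times a buildup factor - for a single point source and shield. The source strength, attenuation coefficient, Taylor-form buildup coefficients and flux-to-dose factor are all illustrative placeholders, not VisualShield data or ANSI/ANS values.

```python
# Point-kernel gamma dose estimate with a Taylor-form buildup factor (illustrative values only).
import math

S = 3.7e10          # source strength, photons/s (placeholder)
mu = 0.06           # linear attenuation coefficient of the shield, 1/cm (placeholder)
r = 100.0           # source-to-detector distance, cm
A, a1, a2 = 2.0, -0.1, 0.05          # Taylor buildup coefficients (placeholders)
flux_to_dose = 1.9e-6                # (rem/h) per (photon/cm^2/s), placeholder conversion

mur = mu * r
buildup = A * math.exp(-a1 * mur) + (1.0 - A) * math.exp(-a2 * mur)
flux = S * buildup * math.exp(-mur) / (4.0 * math.pi * r ** 2)
print(f"dose rate ~ {flux * flux_to_dose:.3e} rem/h")
```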

  12. Mobile Phone Usage for M-Learning: Comparing Heavy and Light Mobile Phone Users

    Suki, Norbayah Mohd; Suki, Norazah Mohd

    2007-01-01

    Purpose: Mobile technologies offer the opportunity to embed learning in a natural environment. The objective of the study is to examine how the usage of mobile phones for m-learning differs between heavy and light mobile phone users. Heavy mobile phone users are hypothesized to have access to/subscribe to one type of mobile content than light…

  13. Analysis of user interfaces and interactions with computers

    PERČIČ, JAN

    2016-01-01

    This diploma thesis is a study of the evolution of user interfaces and of human interaction with computers. The thesis offers an overview of examples from history and mentions people who were important for the development of user interfaces. At the same time, it examines current interaction principles and their potential evolution in the future. The goal was to define a potentially ideal user interface, but because we use different types of computers and in different situations, conclusion was re...

  14. Performance and security analysis of Gait-based user authentication

    2008-01-01

    Verifying the identity of a user, usually referred to as user authentication, before granting access to services or objects is a very important step in many applications. People pass through some sort of authentication process in their daily lives. For example, to be granted access to a computer the user is required to know a password. Similarly, to be able to activate a mobile phone the owner has to know its PIN code, etc. Some user authentication techniques are based on human physio...

  15. Creation and documentation analysis of Mobile shopper guide system: administrative and user portal

    Charževskytė, Egidija

    2008-01-01

    The mobile shopper guide system consists mainly of two parts: a mobile map, and an administrative and user portal. The mobile map consists of a map of Vilnius city and points showing the locations of the shops. The administrative and user portal supports the mobile map. In the paper, an analysis of documentation templates was carried out. During this work the administrative and user portal was created and analyzed.

  16. Users, User-Friendliness and Projected Uses of Isichazamazwi SesiNdebele: An Analysis

    Langa Khumalo

    2011-01-01

    Abstract: This article discusses the first-ever monolingual general Ndebele dictionary, Isichazamazwi SesiNdebele (henceforth the ISN) within the context of the history of lexicography and the compilation of dictionaries in Ndebele. It further assesses the scope of the dictionary with regard to its structure. It also discusses decisions taken by the editors during the writing of the ISN in an attempt to compile a user-friendly dictionary primarily aimed at secondary schools and the general ...

  17. Sentiment Detection of Web Users Using Probabilistic Latent Semantic Analysis

    Weijian Ren

    2014-10-01

    Full Text Available With the wide application of the Internet in almost all fields, it has become the most important medium for publishing information, providing a large number of channels for spreading public opinion. Public opinion, as the response of Internet users to information such as social events and government policies, reflects the state of both society and the economy, which is highly valuable for the decision-making and public relations of enterprises. At present, methods for analyzing Internet public opinion are mainly based on discriminative approaches, such as Support Vector Machines (SVMs) and neural networks. However, when these approaches analyze the sentiment of Internet public opinion, they fail to exploit information hidden in the text, e.g. topics. Motivated by this observation, this paper proposes a public sentiment detection method based on the Probabilistic Latent Semantic Analysis (PLSA) model. PLSA inherits the advantages of LSA, exploiting the semantic topics hidden in the data. The procedure for detecting public sentiment with this algorithm is composed of three main steps: (1) Chinese word segmentation and word refinement, after which each document is represented by a bag of words; (2) modeling the probabilistic distribution of documents using PLSA; (3) using the Z-vector of PLSA as the document features and passing it to an SVM for sentiment detection. We collected a set of text data from Weibo, blogs, BBS etc. to evaluate the proposed approach. The experimental results show that the proposed method can detect public sentiment with high accuracy, outperforming the state-of-the-art word-histogram-based approach. The results also suggest that text semantic analysis using PLSA can significantly boost sentiment detection.
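
    The pipeline described above (topic features fed to a classifier) can be sketched as below. scikit-learn has no PLSA implementation, so LatentDirichletAllocation is substituted purely as a stand-in topic model, and the tiny labelled corpus is made up for illustration; the paper's actual feature extraction and training data are not reproduced.

```python
# Topic-feature + SVM sentiment pipeline (LDA stands in for PLSA; toy data for illustration).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

docs = [
    "great policy strongly support the decision",
    "wonderful news very happy with this event",
    "terrible decision angry users protest the policy",
    "disappointed and upset about this event",
]
labels = [1, 1, 0, 0]   # 1 = positive, 0 = negative (illustrative)

model = make_pipeline(
    CountVectorizer(),
    LatentDirichletAllocation(n_components=2, random_state=0),  # document-topic features
    SVC(kernel="linear"),
)
model.fit(docs, labels)
print(model.predict(["users happy and support the policy"]))
```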

  18. Earth Science Enterprise Solutions Network User Needs Analysis

    Mita, D.; Easson, G.; Anderson, D. J.

    2006-12-01

    The University of Mississippi and Mississippi State University are partnering with National Aeronautics and Space Administration (NASA) to develop a Research Projects Knowledge Base (RPKB) and a Partner Network Knowledge Base (PNKB) that will allow the user community to more easily locate the results of NASA funded research and be able to contact potential partners for future research. The RPKB and PNKB will also help NASA and the user community generate new topics for research projects that benefit society. The assessment of the needs of the user community for the RPKB and the PNKB was conducted through a combination of phone interviews with representatives of the user community, evaluations of existing data and tools currently employed by scientists, collaborations between NASA and the MRC, and the results of a November, 2006 workshop. This information was combined with an email survey, resulting in a report that describes how the users of the RPKB and the PNKB want to access the information in the knowledge bases, input new information into the RPKB and PNKB, and generate reports and diagrams, such as formulation reports. The compilation of the user's needs presented here is being used to design access and input tools for the user community. When fully developed the RPKB and the PNKB will use the NASA enterprise architecture modeling and viewing capability to provide NASA and the user community a more efficient way to identify potential research projects and partners.

  19. Analysis of Users Web Browsing Behavior Using Markov chain Model

    Diwakar Shukla

    2011-03-01

    Full Text Available In the present era of growing information technology, many browsers are available for surfing and web mining. A user has the option to use any one of them at a time to reach the desired website. Every browser has a pre-defined level of popularity and reputation in the market. This paper considers a setup with only two browsers in a computer system; the user prefers one of them and, if it fails, switches to the other. The user's behavior is modeled through a Markov chain procedure and the transition probabilities are calculated. Quitting browsing is treated as a parameter that varies with popularity. A graphical study is performed to explain the interrelationship between the user behavior parameters and the browser market popularity parameters. If a company's browser has the lowest failure rate and the lowest quitting probability, then the company enjoys better popularity and a larger user proportion.
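
    A minimal numerical sketch of the kind of two-browser chain described above is given below: one state per browser plus an absorbing "quit" state, with failure, switching and quitting probabilities chosen arbitrarily for illustration; the paper's actual parameterisation is not reproduced.

```python
# Two browsers plus an absorbing "quit" state; transition probabilities are illustrative.
import numpy as np

# States: 0 = browser A, 1 = browser B, 2 = quit (absorbing).
p_fail = {"A": 0.2, "B": 0.3}     # probability a request fails on each browser
p_quit = 0.1                      # probability of quitting after a failure

P = np.array([
    [1 - p_fail["A"], p_fail["A"] * (1 - p_quit), p_fail["A"] * p_quit],
    [p_fail["B"] * (1 - p_quit), 1 - p_fail["B"], p_fail["B"] * p_quit],
    [0.0, 0.0, 1.0],
])

state = np.array([0.5, 0.5, 0.0])          # start users evenly split between browsers
for _ in range(50):
    state = state @ P
print("after 50 attempts:", dict(zip(["A", "B", "quit"], state.round(3))))

# Expected number of attempts on each browser before quitting, via the fundamental matrix N = (I - Q)^-1.
Q = P[:2, :2]
N = np.linalg.inv(np.eye(2) - Q)
print("expected visits before quitting (starting from A):", N[0].round(2))
```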

  20. Analysis of Factors for Incorporating User Preferences in Air Traffic Management: A system Perspective

    Sheth, Kapil S.; Gutierrez-Nolasco, Sebastian

    2010-01-01

    This paper presents an analysis of factors that impact user flight schedules during air traffic congestion. In pre-departure flight planning, users file one route per flight, which often leads to increased delays, inefficient airspace utilization, and exclusion of user flight preferences. In this paper, first the idea of filing alternate routes and assigning priorities to each of those routes is introduced. Then, the impact of varying the planning interval and the system-imposed departure delay increment is discussed. The metrics of total delay and equity are used for analyzing the impact of these factors on increased traffic and on different users. The results are shown for four cases, with and without the optional routes and priority assignments. Results demonstrate that adding priorities to optional routes further improves system performance compared to filing one route per flight and using a first-come first-served scheme. It was also observed that a two-hour planning interval with a five-minute system-imposed departure delay increment results in the highest delay reduction. The trend holds for a scenario with increased traffic.

  1. Methodological proposal for the analysis of user participation mechanisms in online media

    Jaime Alonso, Ph.D.

    2012-01-01

    Full Text Available This paper presents the results of an analysis of user participation mechanisms, particularly those based on Web 2.0 technologies and applications, drawn from a sample of fourteen relevant Spanish online media, including the websites of newspapers, radio stations, and television channels. This analysis was conducted in October and November 2010 as part of the research subproject La evolución de los cibermedios en el marco de la convergencia digital. Tecnología y distribución (The evolution of online media in the context of digital convergence. Technology and distribution). The study is based on a taxonomy of the different user participation mechanisms, which distinguishes between those that are integrated within the media's news sections and those that are independent spaces. The analysis also examines how these mechanisms are managed by the media according to the role they are assigned. Finally, the study aims to compare the different online media and to show examples and trends in the field of user participation.

  2. Comparative Analysis of Virtual Education Applications

    Mehmet KURT

    2006-10-01

    Full Text Available The research was conducted in order to make a comparative analysis of virtual education applications, using a survey model. The study group consists of a total of 300 institutes providing virtual education in the fall, spring and summer semesters of 2004: 246 in the USA, 10 in Australia, 3 in South Africa, 10 in India, 21 in the UK, 6 in Japan and 4 in Turkey. The information was collected by an online questionnaire sent to the target group by e-mail. The questionnaire was developed in two categories: personal information, and the institutes and their virtual education applications. The English web design of the online questionnaire and the database was prepared with Microsoft ASP code, the script language of the Microsoft FrontPage editor, and was tested on a personal web site. The questionnaire was piloted in institutions providing virtual education in Australia. The English text of the questionnaire and the web site design were sent to educational technology and virtual education specialists in the countries of the study group. Based on the feedback received, spelling mistakes were corrected and concept and language validity were established. The administration of the questionnaire took 40 weeks, from March to November 2004. Only 135 institutes replied. Two of the questionnaires were discarded because they contained mistaken coding and the names of the institutions and countries. The 133 valid questionnaires cover approximately 44% of the study group. Questionnaires saved in the online database were transferred to Microsoft Excel and then to SPSS through an external database connection. In line with the research objectives, the collected data were analyzed by computer using the SPSS statistical package. In the data analysis, frequency (f), percentage (%) and arithmetic mean were used. In comparisons of country, institute, year, and other variables, the chi-square test, independent t

  3. Model for Analysis of Energy Demand (MAED-2). User's manual

    The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institut Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then the MEDEE model has been developed and adapted to be appropriate for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries, and named it the MAED model. The first version of the MAED model was designed for the DOS-based system and was later converted for Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility for representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can now disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has now become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application.

  4. A generic analysis code of dynamic compartment model for evaluation of doses in terrestrial biosphere. GACOM user's manual

    Takahashi, Tomoyuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-02-01

    A computer code GACOM (Generic Analysis code for dynamic COmpartment Model) has been developed to evaluate the behavior of radionuclides in the terrestrial biosphere and the subsequent individual doses. In this code, the simultaneous ordinary differential equations are solved using the sixth-step, fifth-order Runge-Kutta method known as the Fehlberg formula. The principal characteristics of this code are as follows: (1) user definition of, for example, the number of compartments and the transfer pathways of nuclides makes it possible to apply this code to various subjects of analysis; (2) various kinds of equations for evaluating doses in the terrestrial biosphere are available for preparing the input data of this code; (3) the units of time and the nuclides can be defined flexibly; (4) probabilistic analysis using Monte Carlo simulation is possible. This report describes the structure of the GACOM code and the user information needed to execute it. (author)
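
    The report does not include source code; the following is a minimal sketch of the kind of dynamic compartment model GACOM solves, written in Python with SciPy's general-purpose Runge-Kutta integrator standing in for the Runge-Kutta-Fehlberg formula mentioned above. The compartment chain, transfer rates and decay constant are invented for illustration.

    ```python
    # Minimal dynamic compartment model: soil -> plant -> animal, with decay.
    # Transfer rates and the decay constant are hypothetical.
    from scipy.integrate import solve_ivp

    k_soil_plant = 0.05    # transfer rate soil -> plant (1/yr)
    k_plant_animal = 0.10  # transfer rate plant -> animal (1/yr)
    decay = 0.023          # radioactive decay constant (1/yr)

    def compartments(t, y):
        soil, plant, animal = y
        d_soil = -(k_soil_plant + decay) * soil
        d_plant = k_soil_plant * soil - (k_plant_animal + decay) * plant
        d_animal = k_plant_animal * plant - decay * animal
        return [d_soil, d_plant, d_animal]

    y0 = [1.0, 0.0, 0.0]   # initial inventory (Bq), all in the soil compartment
    sol = solve_ivp(compartments, (0.0, 100.0), y0, method="RK45")
    print(sol.y[:, -1])    # compartment inventories after 100 years
    ```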

  5. Cultural politics of user-generated encyclopaedias: comparing Chinese Wikipedia and Baidu Baike

    Liao, Han-Teng; Schroeder, Ralph

    2015-01-01

    The question of how the Internet affects existing geo-cultural or geo-linguistic communities in relation to nation-states has continued to receive attention among academics and policymakers alike. Language-based technologies and services that aggregate, index, and distribute materials online may reshape pre-existing boundaries of the relationship between users and content, for instance with different language versions of user-generated encyclopaedias or different local versions of search engi...

  6. Logatome Discrimination in Cochlear Implant Users: Subjective Tests Compared to the Mismatch Negativity

    Torsten Rahne; Michael Ziese; Dorothea Rostalski; Roland Mühler

    2010-01-01

    This paper describes a logatome discrimination test for the assessment of speech perception in cochlear implant users (CI users), based on a multilingual speech database, the Oldenburg Logatome Corpus, which was originally recorded for the comparison of human and automated speech recognition. The logatome discrimination task is based on the presentation of 100 logatome pairs (i.e., nonsense syllables) with balanced representations of alternating “vowel-replacement” and “consonant-replacement”...

  7. Comparing user acceptance of a computer system in two pediatric offices: a qualitative study.

    Travers, D. A.; Downs, S. M.

    2000-01-01

    The purpose of this qualitative study was to examine user acceptance of a clinical computer system in two pediatric practices in the southeast. Data were gathered through interviews with practice and IS staff, observations in the clinical area, and review of system implementation records. Five months after implementation, Practice A continued to use the system but Practice B had quit using it because it was unacceptable to the users. The results are presented here, in relation to a conceptual...

  8. Consumption capability analysis for Micro-blog users based on data mining

    Yue Sun

    2013-07-01

    Full Text Available Data mining is an effective method of discovering useful information in a large amount of data. The capability of understanding users' consumption is vital for a company, and discovering the significant customers allows the company to focus on the most valuable ones. This paper applies data mining clustering methods to micro-blog users' check-in data and shop information. We analyze users' spending ability quantitatively based on their check-in actions. Compared with other clustering methods, we choose the DBSCAN method to analyze the shop information and positions. Users are divided into different categories according to their spending power. Discovering the users with a high consumption level in large amounts of data is significant for a firm and can help the firm develop better strategies.
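
    As a rough illustration of the clustering step described above, the sketch below applies DBSCAN to a handful of invented check-in coordinates using scikit-learn; the data, eps radius and min_samples value are assumptions, not the authors' settings.

    ```python
    # Cluster check-in locations with DBSCAN; points too far from any cluster
    # are labelled -1 (noise).
    import numpy as np
    from sklearn.cluster import DBSCAN

    # Hypothetical check-in coordinates (latitude, longitude) for one user
    checkins = np.array([
        [31.23, 121.47], [31.24, 121.48], [31.23, 121.46],   # downtown cluster
        [31.30, 121.50], [31.31, 121.51],                    # second cluster
        [31.10, 121.20],                                      # isolated point -> noise
    ])

    # eps is a neighbourhood radius in degrees here; a real analysis would
    # more likely use a haversine metric on coordinates in radians.
    labels = DBSCAN(eps=0.02, min_samples=2).fit_predict(checkins)
    print(labels)   # e.g. [0 0 0 1 1 -1]
    ```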

  10. [A comparative study of the incidence of Gardnerella vaginalis in users of IUD and oral contraceptives].

    Aleixo Neto, A; Peixoto, M L; Cabral, A C

    1987-07-01

    Leukorrhea constitutes one of the most frequent complaints by women visiting out-patient gynecological clinics. The most common etiological agents are Gardnerella vaginalis, Trichomonas vaginalis, Neisseria gonorrhoea and Candida albicans. Some authors have verified an increased presence of certain pathogenic germs in the vaginal flora of users of contraceptive methods; for example, among IUD users, Kivijarvi et al. demonstrated a significantly increased presence of Gardnerella vaginalis, considered responsible for a large number of inflammatory diseases of the pelvis in women. In order to establish the frequency of colpo-cervicitis in users of contraceptive methods, a study was initiated of women attending the Family Planning Clinic of the Department of Medicine at UFMG during the period January-December 1986. 305 cytological smears obtained from 111 users of oral hormonal contraceptives and from 194 users of IUDs were studied using the Papanicolaou staining method. A microscopic examination enabling the detection of cells that indicate the presence of Gardnerella vaginalis, trichomonas, Candida, papilloma, and dysplastic cells was performed. The results are listed in two tables. In 34.5% of cases a significant association was established between users of IUDs and cells indicating the presence of Gardnerella vaginalis. For users of oral contraceptives that figure was lower, 28.8%. This circumstance has been attributed to probable epithelial lesions caused by incorrect insertion of the IUD and the more abundant menstrual flux provoked by the IUD, in combination with the hemophilic characteristics of Gardnerella vaginalis. A significant prevalence of papilloma and trichomonas was also noted for women using IUDs as opposed to those using oral contraceptives. The results demonstrate the necessity of frequent clinical attendance and laboratory tests for women using contraceptive methods, particularly the IUD. PMID:12282423

  11. User-Intention Based Program Analysis for Android Security

    Elish, Karim Omar Mahmoud

    2015-01-01

    The number of mobile applications (i.e., apps) is rapidly growing, as mobile computing becomes an integral part of the modern user experience. Malicious apps have infiltrated open marketplaces for mobile platforms. These malicious apps can exfiltrate a user's private data, abuse system resources, or disrupt regular services. Despite the recent advances in mobile security, the problem of detecting vulnerable and malicious mobile apps with high detection accuracy remains an open problem...

  12. Analysis of the Navigation Behavior of the Users' using Grey Relational Pattern Analysis with Markov Chains

    BINDU MADHURI .Ch,

    2010-10-01

    Full Text Available User page visits are generally sequential in nature. The large number of Web pages on many Web sites has raised navigational problems. Markov chains have been used to model users' sequential navigational behavior on the World Wide Web (WWW). The enormous growth in the number of documents on the WWW increases the need for improved link navigation and path analysis models. Link prediction and path analysis are important problems with a wide range of applications, ranging from personalization to websites. The sheer size of the WWW, coupled with the variation in users' navigation patterns, makes this a very difficult sequence modeling problem. This paper generalizes the concept of grey relational analysis to develop a technique, called grey relational pattern analysis associated with Markov chains for sequential web data, for analyzing the similarity between given patterns. Based on this technique, a clustering algorithm, "Grey Clustering Algorithm for Sequential Data", is proposed for finding clusters of a given data set, and the problem of determining the optimal number of clusters is addressed. We develop an evaluation framework in which the Sum of Squared Error (SSE) is calculated to assess the efficiency of the proposed algorithm. The analyzed user behavior can be used in Web usage mining application areas such as personalization, system improvement, site modification, business intelligence, and usage characterization.
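
    As background for the Markov-chain modelling mentioned above, the sketch below estimates a first-order transition matrix from a few invented navigation sessions; it illustrates the general idea only and is not the grey relational clustering algorithm proposed in the paper.

    ```python
    # Estimate first-order Markov transition probabilities P(next page | current page)
    # from user navigation sessions.
    from collections import defaultdict

    sessions = [
        ["home", "products", "cart", "checkout"],
        ["home", "search", "products", "cart"],
        ["home", "products", "products", "home"],
    ]

    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for current, nxt in zip(session, session[1:]):
            counts[current][nxt] += 1

    # Normalise each row of counts into probabilities
    transitions = {
        page: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
        for page, nxts in counts.items()
    }
    print(transitions["products"])   # e.g. {'cart': 0.5, 'products': 0.25, 'home': 0.25}
    ```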

  13. An Efficient SDN Load Balancing Scheme Based on Variance Analysis for Massive Mobile Users

    Hong Zhong

    2015-01-01

    Full Text Available In a traditional network, server load balancing is used to satisfy the demand for high data volumes. The technique requires large capital investment while offering poor scalability and flexibility, which makes it difficult to support highly dynamic workload demands from massive mobile users. To solve these problems, this paper analyses the principles of software-defined networking (SDN) and presents a new probabilistic method of load balancing based on variance analysis. The method can be used to dynamically manage traffic flows for supporting massive mobile users in SDN networks. The paper proposes a solution using OpenFlow virtual switching technology instead of traditional hardware switching technology. An SDN controller monitors the data traffic of each port by means of variance analysis and provides a probability-based selection algorithm to redirect traffic dynamically with the OpenFlow technology. Compared with the existing load balancing methods, which were designed to support traditional networks, this solution has lower cost, higher reliability, and greater scalability, which satisfy the needs of mobile users.
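
    A rough Python sketch of the general idea - monitor per-server traffic, detect imbalance via variance, and pick a lightly loaded server probabilistically - is shown below. The server names, loads, threshold and weighting rule are assumptions for illustration, not the paper's algorithm or its OpenFlow implementation.

    ```python
    # Variance-triggered, probability-based server selection (illustrative only).
    import random
    import statistics

    loads = {"s1": 120.0, "s2": 80.0, "s3": 40.0}   # observed traffic per server port (Mbps)
    VARIANCE_THRESHOLD = 100.0                       # assumed imbalance threshold

    def pick_server(loads):
        # Weight each server by its spare share: lighter servers are chosen more often.
        total = sum(loads.values())
        servers, weights = zip(*((s, total - l) for s, l in loads.items()))
        return random.choices(servers, weights=weights, k=1)[0]

    if statistics.pvariance(loads.values()) > VARIANCE_THRESHOLD:
        # Imbalance detected: redirect the next flow probabilistically.
        print("redirect new flow to", pick_server(loads))
    else:
        print("load is balanced; keep current flow rules")
    ```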

  14. Large Scale Management of Physicists Personal Analysis Data Without Employing User and Group Quotas

    Norman, A.; Diesbug, M.; Gheith, M.; Illingworth, R.; Lyon, A.; Mengel, M.

    2015-12-01

    The ability of modern HEP experiments to acquire and process unprecedented amounts of data and simulation has led to an explosion in the volume of information that individual scientists deal with on a daily basis. This explosion has resulted in a need for individuals to generate and keep large personal analysis data sets which represent the skimmed portions of official data collections pertaining to their specific analysis. While a significant reduction in size compared to the original data, these personal analysis and simulation sets can be many terabytes or tens of TB in size and consist of tens of thousands of files. When this personal data is aggregated across the many physicists in a single analysis group or experiment, it can represent data volumes on par with or exceeding the official production samples, which require special data handling techniques to deal with effectively. In this paper we explore the changes to the Fermilab computing infrastructure and computing models which have been developed to allow experimenters to effectively manage their personal analysis data and other data that falls outside of the typically centrally managed production chains. In particular we describe the models and tools that are being used to provide modern neutrino experiments like NOvA with storage resources that are sufficient to meet their analysis needs, without imposing specific quotas on users or groups of users. We discuss the storage mechanisms and the caching algorithms that are being used, as well as the toolkits that have been developed to allow users to easily operate with terascale+ datasets.

  15. FISCAL DISCIPLINE WITHIN THE EU: COMPARATIVE ANALYSIS

    SORIN CELEA

    2013-12-01

    Full Text Available This paper focuses on the analysis of the convergence indicators relating to the fiscal area in the EU. Following a description of the main peculiarities of the convergence criteria, the research develops a critical analysis, from a comparative perspective, of the actual values of the fiscal convergence indicators registered in EU countries against the reference values of the indicators, with emphasis on the differences between emerging and developed countries.

  16. ERP Software Evaluation and Comparative Analysis

    Kalpic, Damir; Fertalj, Kresimir

    2004-01-01

    This paper presents the results of an investigation performed in 2001 under the title Comparative Analysis of Information Systems Software in Croatia. The focus was set on the comparative analysis of domestic and foreign Enterprise Resource Planning (ERP) software, which is present in Croatia. The investigation was performed from the standpoint of ERP applicability, regardless of the development methods and information technology. In other words, the evaluation was performed primarily from th...

  17. Comparative Environmental Threat Analysis: Three Case Studies.

    Latour, J. B.; Reiling, R.

    1994-01-01

    Reviews how carrying capacity for different environmental problems is operationalized. Discusses whether it is possible to compare threats, using the exceeding of carrying capacity as a yardstick. Points out problems in comparative threat analysis using three case studies: threats to European groundwater resources, threats to ecosystems in Europe,…

  18. SALOME. Software for the analysis of lines or multiplets from Extrap. User's guide

    This user's guide describes the centre piece of spectral analysis programs for Extrap-T1. The method for spectral analysis is presented theoretically. It also presents the actual use of the program PROBESCHUSS and how to work on the multiplet library. The present user's guide is about PROBESCHUSS 2.1 and MULTIFIT 2.0. 7 figs, 5 appendices

  19. Analysis of payment cards in terms of user

    Čechová, Jana

    2012-01-01

    The main goal of this thesis is to assess people’s knowledge of payment cards for those that already have one or are considering getting one, and to establish a profile of a normal cardholder. Another goal of this thesis is to perform a comparative analysis of three banks on the Czech banking market from the perspective of selected types of fees billed for card services, and from these three, to select the bank with the most favourable fees from the perspective of the customer. Bibliograph...

  20. Comparing writing style feature-based classification methods for estimating user reputations in social media.

    Suh, Jong Hwan

    2016-01-01

    In recent years, the anonymous nature of the Internet has made it difficult to detect manipulated user reputations in social media, as well as to ensure the quality of users and their posts. To deal with this, this study designs and examines an automatic approach that adopts writing style features to estimate user reputations in social media. Under varying ways of defining Good and Bad classes of user reputations based on the collected data, it evaluates the classification performance of the state-of-the-art methods: four writing style features, i.e. lexical, syntactic, structural, and content-specific, and eight classification techniques, i.e. four base learners - C4.5, Neural Network (NN), Support Vector Machine (SVM), and Naïve Bayes (NB) - and four Random Subspace (RS) ensemble methods based on the four base learners. When South Korea's Web forum, Daum Agora, was selected as a test bed, the experimental results show that the configuration of the full feature set containing content-specific features and RS-SVM, combining RS and SVM, gives the best classification accuracy if the test bed posters' reputations are segmented strictly into Good and Bad classes by the portfolio approach. Pairwise t tests on accuracy confirm two expectations arising from the literature review: first, feature sets that add content-specific features outperform the others; second, ensemble learning methods are more viable than base learners. Moreover, among the four ways of defining the classes of user reputations, i.e. like, dislike, sum, and portfolio, the results show that the portfolio approach gives the highest accuracy. PMID:27006870
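
    For readers unfamiliar with the RS-SVM configuration, the sketch below shows a Random Subspace ensemble of linear SVMs in scikit-learn (assuming a recent version); the toy posts, labels, and TF-IDF features stand in for the paper's writing-style features and are illustrative only.

    ```python
    # Random Subspace ensemble: each SVM sees a random subset of the features,
    # with no resampling of the training instances.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.ensemble import BaggingClassifier
    from sklearn.svm import SVC

    posts = ["great insight, well sourced", "spam spam buy now",
             "thoughtful reply with references", "nonsense flame bait"]
    labels = ["Good", "Bad", "Good", "Bad"]

    X = TfidfVectorizer().fit_transform(posts)   # stand-in for writing-style features

    rs_svm = BaggingClassifier(
        estimator=SVC(kernel="linear"),
        n_estimators=10,
        max_features=0.5,          # each learner trains on half of the features
        bootstrap=False,           # keep all training instances
        bootstrap_features=False,  # draw feature subsets without replacement
        random_state=0,
    )
    rs_svm.fit(X, labels)
    print(rs_svm.predict(X[:1]))   # predicted reputation class for the first post
    ```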

  1. Analysis of diversified concept of users on the energy service

    Ono, Kenji; Morikiyo, Takashi

    1987-09-10

    Given the rapid environmental change in the electricity business, fine and delicate service on the user's side is desired in addition to the conventional abundant, cheap, and stable power supply. A questionnaire survey conducted throughout the country during 1986-1987 clarified features of the concepts and needs of users concerning energy service. The results of the survey were as follows: (1) the determining factors for the image of an energy source are benefit factors such as comfort and cleanliness as well as cost factors such as economy and energy saving; (2) the diversified life styles of the users were extracted to identify the concepts and needs of each life style; (3) middle-aged housewives in particular regard electricity as a more costly energy source than other energies, although it is comfortable and clean. (7 figs, 3 tabs, 4 refs)

  2. Data oriented job submission scheme for the PHENIX user analysis in CCJ

    The RIKEN Computing Center in Japan (CCJ) has been developed to make it possible to analyze the huge amount of data collected by the PHENIX experiment at RHIC. The collected raw data or reconstructed data are transferred via SINET3 with 10 Gbps bandwidth from Brookhaven National Laboratory (BNL) by using GridFTP. The transferred data are first stored in the hierarchical storage management system (HPSS) prior to the user analysis. Since the size of the data grows steadily year by year, concentration of access requests to the data servers has become one of the serious bottlenecks. To eliminate this I/O-bound problem, 18 calculating nodes with a total of 180 TB of local disks were introduced to store the data a priori. We added some setup to the batch job scheduler (LSF) so that users can specify the required data already distributed to the local disks. The locations of the data are automatically obtained from a database, and jobs are dispatched to the appropriate node which has the required data. To avoid multiple accesses to a local disk from several jobs on a node, techniques of lock files and access control lists are employed. As a result, each job can handle a local disk exclusively. Indeed, the total throughput was improved drastically compared to the preexisting nodes in CCJ, and users can analyze about 150 TB of data within 9 hours. We report this successful job submission scheme and the features of the PC cluster.
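
    The lock-file technique mentioned above can be illustrated with a short Python sketch: a job takes an exclusive, non-blocking lock on a per-dataset lock file before touching the locally staged copy, and exits if another job already holds it. The file name is hypothetical and this is not the CCJ implementation.

    ```python
    # Exclusive, non-blocking lock on a per-dataset lock file (Unix only).
    import fcntl
    import sys

    lock_path = "run7_dst.lock"   # hypothetical per-dataset lock file on the local disk

    with open(lock_path, "w") as lock_file:
        try:
            fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except BlockingIOError:
            sys.exit("local copy busy; resubmit the job to another node")
        # ... read and analyse the locally staged files here ...
        fcntl.flock(lock_file, fcntl.LOCK_UN)
    ```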

  3. Transportation Routing Analysis Geographic Information System (TRAGIS) User's Manual

    Johnson, PE

    2003-09-18

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model is used to calculate highway, rail, or waterway routes within the United States. TRAGIS is a client-server application with the user interface and map data files residing on the user's personal computer and the routing engine and network data files on a network server. The user's manual provides documentation on installation and the use of the many features of the model.

  4. User satisfaction-based quality evaluation model and survey analysis of network information service

    LEI; Xue; JIAO; Yuying

    2009-01-01

    On the basis of user satisfaction, the authors formulated research hypotheses by learning from relevant e-service quality evaluation models. A questionnaire survey was then conducted on some content-based websites in terms of their convenience, information quality, personalization and site aesthetics, which may affect the overall satisfaction of users. Statistical analysis was also carried out to build a user satisfaction-based quality evaluation system for network information services.

  5. THE COMPARATIVE ANALYSIS OF ECOLOGICAL INDICATORS

    Bikova E.V.

    2006-04-01

    Full Text Available This work presents a comparative analysis of the ecological indicators designed and specified for Moldova and similar indicators for the CIS countries. Some general information about the power systems of the CIS countries (installed capacities, electric power production) is given, the dynamics of GHG emissions (CO2, NOx, SO2) in Moldova are analyzed, and a comparison is made with the emission levels in other CIS countries.

  6. Comparative analysis of black carbon in soils

    Schmidt, Michael W I; Skjemstad, Jan O.; Czimczik, Claudia I.; Glaser, Bruno; Prentice, Ken M; Gelinas, Yves; Thomas A.J. Kuhlbusch

    2001-01-01

    Black carbon (BC), produced by incomplete combustion of fossil fuels and vegetation, occurs ubiquitously in soils and sediments. BC exists as a continuum from partly charred material to highly graphitized soot particles, with no general agreement on clear-cut boundaries of definition or analysis. In a comparative analysis, we measured BC forms in eight soil samples by six established methods. All methods involved removal of the non-BC components from the sample by thermal or chemical means or...

  7. Creating Awareness Through Personel Informatics Systems User Expectations Analysis

    Armağan Kuru

    2013-12-01

    Full Text Available In many publications, a healthy diet and regular exercise are reported to be the biggest inhibitors of health problems. In recent years, with developments in technology, health research has become one of the focal topics of interdisciplinary studies. Most of this research and design work aims to encourage people to be healthy by creating awareness and behavioral change. The technologies involved, which are new for users and improving rapidly, are open to several unanswered questions. The aim of this article is to determine the criteria required to change people's behavior through technology and to emphasize the importance of related work. Given the potential of these systems, which are new in our country, this article focuses on current studies, the current state of technology and user expectations, all of which are discussed from the point of view of designers and medical doctors.

  8. Creating Awareness Through Personel Informatics Systems User Expectations Analysis

    Armağan Kuru; Çiğdem Erbuğ; Mehmet Tümer

    2013-01-01

    In many publications, healthy diet and re-gular exercise are reported to be the biggest inhibitors of health problems. In recent years, with the developments in technology, health research has been one of the focal topics of interdisciplinary studies. Most of these are the research and design aim to encourage people to be healthy through creating awareness and behavioral change. The technologies that are new for users, and are in rapid improvement, are open to several unanswered questions. Th...

  9. User Movement Behavior Analysis in Mobile Service Environment

    Mohbey Krishna K.; Thakur G.S.

    2013-01-01

    Mobile services are very important and useful for users to access desired information and to perform transactions. There are different kinds of mobile services which are accessed using mobile devices, and they make it possible to mine the required knowledge from huge amounts of data. In this paper, a mobile access pattern generation approach is proposed, which has the capability to generate strong patterns between four different parameters called mobile use...

  10. Analysis and Improvement of a User Authentication Improved Protocol

    Zuowen Tan

    2010-05-01

    Full Text Available Remote user authentication usually adopts password-based login to the server within insecure network environments. Recently, Peyravian and Jeffries proposed a practical authentication scheme based on one-way collision-resistant hash functions. However, Shim and Munilla independently showed that the scheme is vulnerable to off-line guessing attacks. In order to remove this weakness, Hölbl, Welzer and Brumen presented improved secure password-based protocols for remote user authentication, password change and session key establishment. Unfortunately, the remedies of their improved scheme do not work: the improved scheme still suffers from off-line attacks, and the password change protocol is insecure against denial-of-service attacks. A scheme is proposed which overcomes these weaknesses. Detailed cryptanalysis shows that the proposed password-based protocols for remote user authentication, password change and session key establishment are immune against man-in-the-middle attacks, replay attacks, password guessing attacks, outsider attacks, denial-of-service attacks and impersonation attacks.
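
    To make the off-line guessing threat concrete, the sketch below shows a generic hash-based challenge-response exchange and how an eavesdropper who records the (challenge, response) pair can test candidate passwords at leisure; it illustrates the attack class in general, not the specific protocols analysed in the paper.

    ```python
    # Off-line dictionary attack against a recorded challenge-response pair.
    import hashlib
    import secrets

    def respond(password: str, challenge: bytes) -> str:
        return hashlib.sha256(challenge + password.encode()).hexdigest()

    # Legitimate exchange, observed by a passive attacker
    challenge = secrets.token_bytes(16)
    observed = respond("sunshine", challenge)

    # The attacker replays candidate passwords against the recorded pair
    dictionary = ["123456", "password", "sunshine", "qwerty"]
    for guess in dictionary:
        if respond(guess, challenge) == observed:
            print("password recovered:", guess)
            break
    ```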

  11. Risk analysis and user satisfaction after implementation of computerized physician order entry in Dutch hospitals

    van der Veen, Willem; de Gier, Han J. J.; van der Schaaf, Tjerk; Taxis, Katja; van den Bemt, Patricia M. L. A.

    2013-01-01

    Background Computerized physician order entry (CPOE) in hospitals is widely considered to be important for patient safety, but implementation is lagging behind and user satisfaction is often low. Risk analysis methods may improve the implementation process and thus user satisfaction. Objective The a

  12. It Leaks More Than You Think: Fingerprinting Users from Web Traffic Analysis

    Xujing Huang

    2015-12-01

    Full Text Available We show how, in real-world web applications, confidential information about user identities can be leaked through "non-intuitive communications", in particular web traffic which appears to be unrelated to the user information. Our experiments on Google users demonstrate that even Google accounts are vulnerable to traffic-analysis attacks against user identities that exploit packet sizes and directions, and this work shows that such non-intuitive communication can leak even more information about user identities than the traffic explicitly carrying confidential information. Our work highlights possible side-channel leakage through cookies and, more generally, discovers fingerprints in web traffic which can improve the probability of correctly guessing a user identity. Our analysis uses Hidden Markov Models, distance metrics and guessing probability to analyse and evaluate these side-channel vulnerabilities.
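
    The sketch below gives a much-simplified flavour of such fingerprinting: traces of signed packet sizes are reduced to feature vectors and an unknown trace is matched to the nearest per-user centroid. The traces are invented and the nearest-centroid classifier is only a stand-in for the Hidden-Markov-Model analysis used in the paper.

    ```python
    # Fingerprint identities from packet sizes and directions with a
    # nearest-centroid classifier (toy data, illustrative only).
    import numpy as np

    def features(trace, bins=(0, 200, 600, 1200, 1600)):
        # trace: signed packet sizes (positive = outgoing, negative = incoming)
        sizes = np.abs(trace)
        hist, _ = np.histogram(sizes, bins=bins)
        out_ratio = np.mean(np.array(trace) > 0)
        return np.append(hist / len(trace), out_ratio)

    training = {
        "alice": [[520, -1400, 320, -1400, 180], [540, -1380, 300, -1400]],
        "bob":   [[90, -150, 80, -160, 100, -140], [85, -155, 95]],
    }
    centroids = {u: np.mean([features(t) for t in traces], axis=0)
                 for u, traces in training.items()}

    unknown = [510, -1390, 310, -1410, 200]
    guess = min(centroids, key=lambda u: np.linalg.norm(features(unknown) - centroids[u]))
    print("best guess for the user behind this trace:", guess)
    ```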

  13. Tracking and Analysis Framework (TAF) model documentation and user's guide

    Bloyd, C.; Camp, J.; Conzelmann, G. [and others

    1996-12-01

    With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives - specifically, by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO2). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this "grand experiment." Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF's objectives and its overall design.

  14. Sentiment Analysis of User-Generated Content on Drug Review Websites

    Na, Jin-Cheon

    2015-03-01

    Full Text Available This study develops an effective method for sentiment analysis of user-generated content on drug review websites, which has not been investigated extensively compared to other general domains, such as product reviews. A clause-level sentiment analysis algorithm is developed, since each sentence can contain multiple clauses discussing multiple aspects of a drug. The method adopts a pure linguistic approach of computing the sentiment orientation (positive, negative, or neutral) of a clause from the prior sentiment scores assigned to words, taking into consideration the grammatical relations and semantic annotation (such as disorder terms) of words in the clause. Experimental results with 2,700 clauses show the effectiveness of the proposed approach, which performed significantly better than baseline approaches using machine learning. Various challenging issues were identified and discussed through error analysis. The application of the proposed sentiment analysis approach will be useful not only for patients, but also for drug makers and clinicians to obtain valuable summaries of public opinion. Since sentiment analysis is domain specific, domain knowledge in drug reviews is incorporated into the sentiment analysis algorithm to provide more accurate analysis. In particular, MetaMap is used to map various health and medical terms (such as disease and drug names) to semantic types in the Unified Medical Language System (UMLS) Semantic Network.
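
    A minimal sketch of clause-level scoring from prior word sentiment with a simple negation flip is shown below; the tiny lexicon and example review are invented, and the paper's actual pipeline additionally uses grammatical relations and UMLS semantic types via MetaMap.

    ```python
    # Clause-level sentiment from prior word scores with a crude negation flip.
    PRIOR = {"effective": 1, "relief": 1, "nausea": -1, "worse": -1}   # toy lexicon
    NEGATORS = {"no", "not", "never", "without"}

    def clause_sentiment(clause: str) -> str:
        score, negated = 0, False
        for word in clause.lower().split():
            if word in NEGATORS:
                negated = True
                continue
            s = PRIOR.get(word.strip(".,"), 0)
            score += -s if negated else s
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"

    review = "this drug was effective, but I felt nausea without relief at night"
    for clause in review.split(","):
        print(clause.strip(), "->", clause_sentiment(clause))
    ```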

  15. National Launch System comparative economic analysis

    Prince, A.

    1992-01-01

    Results are presented from an analysis of economic benefits (or losses), in the form of the life cycle cost savings, resulting from the development of the National Launch System (NLS) family of launch vehicles. The analysis was carried out by comparing various NLS-based architectures with the current Shuttle/Titan IV fleet. The basic methodology behind this NLS analysis was to develop a set of annual payload requirements for the Space Station Freedom and LEO, to design launch vehicle architectures around these requirements, and to perform life-cycle cost analyses on all of the architectures. A SEI requirement was included. Launch failure costs were estimated and combined with the relative reliability assumptions to measure the effects of losses. Based on the analysis, a Shuttle/NLS architecture evolving into a pressurized-logistics-carrier/NLS architecture appears to offer the best long-term cost benefit.

  16. Chipster: user-friendly analysis software for microarray and other high-throughput data

    Scheinin Ilari

    2011-10-01

    Full Text Available Abstract Background The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Results Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Conclusions Chipster is a user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.

  17. Fundus camera systems: a comparative analysis

    DeHoog, Edward; Schwiegerling, James

    2009-01-01

    Retinal photography requires the use of a complex optical system, called a fundus camera, capable of illuminating and imaging the retina simultaneously. The patent literature shows two design forms but does not provide the specifics necessary for a thorough analysis of the designs to be performed. We have constructed our own designs based on the patent literature in optical design software and compared them for illumination efficiency, image quality, ability to accommodate for patient refract...

  18. Comparative Analysis of Terrorists’ Communication Strategies

    Denis Alexandrovich Zhuravliev

    2015-01-01

    There is a widespread approach in the research literature to regard terrorism as a communicative process. From this point of view, the author offers a comparative analysis of the three most common communication strategies of terrorist groups, including transforming the role of the mass media, the Internet, and a combined approach. The author also argues that a particular communication strategy determines the structure of a terrorist organization.

  19. Comparative Analysis of VNSA Complex Engineering Efforts

    Gary Ackerman

    2016-01-01

    The case studies undertaken in this special issue demonstrate unequivocally that, despite being forced to operate clandestinely and facing the pressures of security forces seeking to hunt them down and neutralize them, at least a subset of violent non-state actors (VNSAs) are capable of some genuinely impressive feats of engineering. At the same time, success in such endeavours is not guaranteed and VNSAs will undoubtedly face a number of obstacles along the way. A comparative analysis of the...

  20. Comparative analysis of twelve Dothideomycete plant pathogens

    Ohm, Robin; Aerts, Andrea; Salamov, Asaf; Goodwin, Stephen B.; Grigoriev, Igor

    2011-03-11

    The Dothideomycetes are one of the largest and most diverse groups of fungi. Many are plant pathogens and pose a serious threat to agricultural crops grown for biofuel, food or feed. Most Dothideomycetes have only a single host and related Dothideomycete species can have very diverse host plants. Twelve Dothideomycete genomes have currently been sequenced by the Joint Genome Institute and other sequencing centers. They can be accessed via Mycocosm which has tools for comparative analysis

  1. Comparative analysis of enterprise architecture frameworks

    Oblak, Danica

    2012-01-01

    Today's enterprises are facing competitive pressure in a dynamically changing business environment. With the increasing complexity of enterprises, enterprise architecture has become an important field. Creating an enterprise architecture can be a complex task, so enterprise architecture frameworks were created to simplify the process and guide an architect through all areas of architecture development. This study concentrates on the comparative analysis of enterprise architecture frameworks. T...

  2. ANALYSIS AND COMPARATIVE STUDY OF SEARCHING TECHNIQUES

    Yuvraj Singh Chandrawat*

    2015-01-01

    We live in the age of technology and it is quite obvious that it is advancing day by day. In this technical era researchers are focusing on the development of existing technologies. Software engineering is the dominant branch of Computer Science that deals with the development and analysis of software. The objective of this study is to analyze and compare the existing searching algorithms (linear search and binary search). In this paper, we will discuss both thes...
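
    For reference, the two algorithms compared in the paper can be sketched in a few lines of Python: linear search scans every element in O(n), while binary search repeatedly halves a sorted list in O(log n).

    ```python
    # Linear search: check each element in turn.
    def linear_search(items, target):
        for i, value in enumerate(items):
            if value == target:
                return i
        return -1

    # Binary search: repeatedly halve the search range of a sorted list.
    def binary_search(sorted_items, target):
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            if sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    data = [3, 8, 15, 23, 42, 57, 91]
    print(linear_search(data, 42), binary_search(data, 42))   # both print 4
    ```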

  3. Chronic illness and multimorbidity among problem drug users: a comparative cross sectional pilot study in primary care.

    Cullen, Walter

    2009-01-01

    BACKGROUND: Although multimorbidity has important implications for patient care in general practice, limited research has examined chronic illness and health service utilisation among problem drug users. This study aimed to determine chronic illness prevalence and health service utilisation among problem drug users attending primary care for methadone treatment, to compare these rates with matched 'controls' and to develop and pilot test a valid study instrument. METHODS: A cross-sectional study of patients attending three large urban general practices in Dublin, Ireland for methadone treatment was conducted, and this sample was compared with a control group matched by practice, age, gender and General Medical Services (GMS) status. RESULTS: Data were collected on 114 patients. Fifty-seven patients were on methadone treatment, of whom 52 (91%) had at least one chronic illness (other than substance use) and 39 (68%) were prescribed at least one regular medication. Frequent utilisation of primary care services and secondary care services in the previous six months was observed among patients on methadone treatment and controls, although the former had significantly higher chronic illness prevalence and primary care contact rates. The study instrument facilitated data collection that was feasible and with minimal inter-observer variation. CONCLUSION: Multimorbidity is common among problem drug users attending general practice for methadone treatment. Primary care may therefore have an important role in primary and secondary prevention of chronic illnesses among this population. This study offers a feasible study instrument for further work on this issue.

  4. Chronic illness and multimorbidity among problem drug users: a comparative cross sectional pilot study in primary care.

    Cullen, Walter

    2012-02-01

    BACKGROUND: Although multimorbidity has important implications for patient care in general practice, limited research has examined chronic illness and health service utilisation among problem drug users. This study aimed to determine chronic illness prevalence and health service utilisation among problem drug users attending primary care for methadone treatment, to compare these rates with matched 'controls' and to develop and pilot test a valid study instrument. METHODS: A cross-sectional study of patients attending three large urban general practices in Dublin, Ireland for methadone treatment was conducted, and this sample was compared with a control group matched by practice, age, gender and General Medical Services (GMS) status. RESULTS: Data were collected on 114 patients. Fifty-seven patients were on methadone treatment, of whom 52 (91%) had at least one chronic illness (other than substance use) and 39 (68%) were prescribed at least one regular medication. Frequent utilisation of primary care services and secondary care services in the previous six months was observed among patients on methadone treatment and controls, although the former had significantly higher chronic illness prevalence and primary care contact rates. The study instrument facilitated data collection that was feasible and with minimal inter-observer variation. CONCLUSION: Multimorbidity is common among problem drug users attending general practice for methadone treatment. Primary care may therefore have an important role in primary and secondary prevention of chronic illnesses among this population. This study offers a feasible study instrument for further work on this issue.

  5. Aviation System Analysis Capability Quick Response System Report Server User's Guide

    Roberts, Eileen R.; Villani, James A.; Wingrove, Earl R., III

    1996-01-01

    This report is a user's guide for the Aviation System Analysis Capability Quick Response System (ASAC QRS) Report Server. The ASAC QRS is an automated online capability to access selected ASAC models and data repositories. It supports analysis by the aviation community. This system was designed by the Logistics Management Institute for the NASA Ames Research Center. The ASAC QRS Report Server allows users to obtain information stored in the ASAC Data Repositories.

  6. WASP (Wavelet Analysis of Secondary Particles Angular Distributions) package. Version 1.0. User's guide

    The WASP package is a C++ program for analyzing the angular distributions of secondary particles generated in nuclear interactions. (WASP is designed for data analysis in the STAR and ALICE experiments.) It uses wavelet analysis for this purpose, with vanishing-momentum or Gaussian wavelets chosen for the transformations. WASP provides a user-friendly Graphical User Interface (GUI) which makes it quite simple to use. WASP's design, a brief description of the wavelet transformation algorithm used, and the GUI are presented in this user's guide.
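
    As a rough illustration of the kind of transform WASP applies, the sketch below runs a continuous wavelet transform with a Gaussian wavelet over a synthetic angular distribution, assuming the PyWavelets package; the data, wavelet choice and scales are illustrative and unrelated to the WASP implementation.

    ```python
    # Continuous wavelet transform of a synthetic angular distribution.
    import numpy as np
    import pywt

    # Synthetic pseudorapidity distribution: smooth background plus a narrow bump
    eta = np.linspace(-2, 2, 200)
    dn_deta = np.exp(-eta**2) + 0.3 * np.exp(-((eta - 0.7) ** 2) / 0.01)

    scales = np.arange(1, 32)
    coefficients, _ = pywt.cwt(dn_deta, scales, "gaus2")
    # Large |coefficients| at small scales near eta = 0.7 flag the localized structure
    print(coefficients.shape)   # (31, 200)
    ```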

  7. Dashboard Task Monitor for managing ATLAS user analysis on the Grid

    Sargsyan, L; The ATLAS collaboration; Jha, M; Karavakis, E; Kokoszkiewicz, L; Saiz, P; Schovancova, J; Tuckett, D

    2013-01-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is operating system and GRID environment independent. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  8. Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid

    Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration

    2014-06-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  10. Premo and Kansei: A Comparative Analysis

    Anitawati Mohd Lokman; Khairul Khalil Ishak; Ana Hadiana

    2013-01-01

    Kansei Engineering is a technology that enables the incorporation of human emotion into design requirements. Its perspective is that Kansei is unique to each domain and to each target user group, and its methodology relies mainly on verbal measurement instruments. The technology is seen to have a slight shortcoming when there is a need to build a universal design for a universal target user: it involves complexity when handling semantics, since people do not speak the same lan...

  11. User Behavior Analysis from Web Log using Log Analyzer Tool

    Brijesh Bakariya

    2013-11-01

    Full Text Available Nowadays, the internet plays the role of a huge database in which many websites, information sources and search engines are available. But due to the unstructured and semi-structured data in web pages, extracting relevant information has become a challenging task. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because it consists of many discovered patterns and contains a lot of noise and uncertainty. In this paper, web usage mining is carried out on web log data using the web log analyzer tool "Deep Log Analyzer" to extract summary information from a particular server and to find out user behavior, and an ontology is developed which captures the relations among the relevant parts of web usage mining.
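
    As a small illustration of extracting user behaviour from web log data, the sketch below parses a few invented Common Log Format lines and counts page requests per visitor; a real analysis (for example with the Deep Log Analyzer tool mentioned above) would of course run on the server's actual access log.

    ```python
    # Count page requests per visitor IP from Common Log Format lines.
    import re
    from collections import Counter, defaultdict

    LOG_LINE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+)')

    sample_log = [
        '10.0.0.1 - - [01/Jan/2013:10:00:01 +0000] "GET /index.html HTTP/1.1" 200 512',
        '10.0.0.1 - - [01/Jan/2013:10:00:09 +0000] "GET /products.html HTTP/1.1" 200 734',
        '10.0.0.2 - - [01/Jan/2013:10:01:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    ]

    pages_by_visitor = defaultdict(Counter)
    for line in sample_log:
        m = LOG_LINE.match(line)
        if m:
            pages_by_visitor[m.group("ip")][m.group("path")] += 1

    for ip, pages in pages_by_visitor.items():
        print(ip, dict(pages))
    ```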

  12. Inpatient care in Kazakhstan: A comparative analysis

    Ainur B Kumar

    2013-01-01

    Full Text Available Background: Reforms in inpatient care are critical for the enhancement of the efficiency of health systems. It still remains the most costly sector of the health system, accounting for more than 60% of all expenditures. Inappropriate and ineffective use of the hospital infrastructure is also a big issue. We aimed to analyze statistical data on health indices and dynamics of the hospital stock in Kazakhstan in comparison with those of developed countries. Materials and Methods: The study design is a comparative quantitative analysis of inpatient care indicators. We used information and analytical methods, content analysis, mathematical treatment, and comparative analysis of statistical data on the health system and dynamics of hospital stock in Kazakhstan and some other countries of the world [Organization for Economic Cooperation and Development (OECD), USA, Canada, Russia, China, Japan, and Korea] over the period 2001-2011. Results: Despite substantial and continuous reductions over the past 10 years, hospitalization rates in Kazakhstan still remain high compared to some developed countries, including those of the OECD. In fact, the hospital stay length for all patients in Kazakhstan in 2011 is around 9.9 days, the hospitalization ratio per 100 people is 16.3, and hospital bed capacity is 100 per 10,000 inhabitants. Conclusion: The decreased level of beds may adversely affect both medical organization and health system operations. Alternatives to the existing inpatient care are now being explored. The introduction of the unified national healthcare system allows shifting the primary focus to primary care organizations, which can decrease the demand for inpatient care as a result of improving the health status of people at the primary care level.

  13. Comparative analysis of black carbon in soils

    Schmidt, Michael W. I.; Skjemstad, Jan O.; Czimczik, Claudia I.; Glaser, Bruno; Prentice, Ken M.; Gelinas, Yves; Kuhlbusch, Thomas A. J.

    2001-03-01

    Black carbon (BC), produced by incomplete combustion of fossil fuels and vegetation, occurs ubiquitously in soils and sediments. BC exists as a continuum from partly charred material to highly graphitized soot particles, with no general agreement on clear-cut boundaries of definition or analysis. In a comparative analysis, we measured BC forms in eight soil samples by six established methods. All methods involved removal of the non-BC components from the sample by thermal or chemical means or a combination of both. The remaining carbon, operationally defined as BC, was quantified via mass balance, elemental composition or by exploiting benzenecarboxylic acids as molecular markers or applying 13C MAS NMR (magic angle spinning nuclear magnetic resonance) spectroscopy. BC concentrations measured for individual samples vary over 2 orders of magnitude (up to a factor of 571). One possible explanation for this wide range of results is that the individual BC methods rely on operational definitions with clear-cut but different boundaries and developed for specific scientific questions, whereas BC represents a continuum of materials with widely contrasting physicochemical properties. Thus the methods are inherently designed to analytically determine different parts of the continuum, and it is crucial to know how measurements made by different techniques relate to each other. It is clear from this preliminary comparative analysis that a collection of BC reference materials should be established as soon as possible 1 ) to ensure long-term intralaboratory and interlaboratory data quality and 2) to facilitate comparative analyses between different analytical techniques and scientific approaches

  14. A Comparative Study on Error Analysis

    Wu, Xiaoli; Zhang, Chun

    2015-01-01

    Analysis of errors in the written and spoken production of L2 learners has a long tradition in L2 pedagogy. Yet, in teaching and learning Chinese as a foreign language (CFL), only a handful of studies have been made either to define the ‘error’ in a pedagogically insightful way or to empirically investigate the occurrence of errors in either linguistic or pedagogical terms. The purpose of the current study is to demonstrate the theoretical and practical relevance of the error analysis approach in CFL by investigating two cases - (1) Belgian (L1) learners’ use of Chinese (L2) comparative sentences in written production...

  15. Comparative Analysis of Virtual Education Applications

    Kurt, Mehmet

    2006-01-01

    The research was conducted in order to make a comparative analysis of virtual education applications, using a survey model. The study group consists of a total of 300 institutes providing virtual education in the fall, spring and summer semesters of 2004: 246 in the USA, 10 in Australia, 3 in South Africa, 10 in India, 21 in the UK, 6 in Japan, and 4 in Turkey. The information was collected by an online questionnaire sent to the target group by e-mail. The questionnaire has been developed ...

  16. Vegetation Change Analysis User's Manual

    D. J. Hansen; W. K. Ostler

    2002-10-01

    Approximately 70 percent of all U.S. military training lands are located in arid and semi-arid areas. Training activities in such areas frequently adversely affect vegetation, damaging plants and reducing the resilience of vegetation to recover once disturbed. Fugitive dust resulting from a loss of vegetation creates additional problems for human health, increasing accidents due to decreased visibility, and increasing maintenance costs for roads, vehicles, and equipment. Diagnostic techniques are needed to identify thresholds of sustainable military use. A cooperative effort among U.S. Department of Energy, U.S. Department of Defense, and selected university scientists was undertaken to focus on developing new techniques for monitoring and mitigating military impacts in arid lands. This manual focuses on the development of new monitoring techniques that have been implemented at Fort Irwin, California. New mitigation techniques are described in a separate companion manual. This User's Manual is designed to address diagnostic capabilities needed to distinguish between various degrees of sustainable and nonsustainable impacts due to military training and testing and habitat-disturbing activities in desert ecosystems. Techniques described here focus on the use of high-resolution imagery and the application of image-processing techniques developed primarily for medical research. A discussion is provided about the measurement of plant biomass and shrub canopy cover in arid lands using conventional methods. Both semiquantitative methods and quantitative methods are discussed and reference to current literature is provided. A background about the use of digital imagery to measure vegetation is presented.

  17. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  18. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Bruce Weaver

    2014-09-01

    Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.

  19. Modelling User-Costs in Life Cycle Cost-Benefit (LCCB) analysis

    Thoft-Christensen, Palle

    2008-01-01

    The importance of including user costs in Life-Cycle Cost-Benefit analysis of structures is discussed in this paper. This is of particular importance for bridges: repair or failure of a bridge will usually result in user costs greater than the repair or replacement costs of the bridge. For society (and the users) it is therefore of great importance that maintenance or replacement of a bridge is performed in such a way that all costs are minimized - not only the owner's costs.
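
    A small worked example of the point made above - that user costs can exceed the owner's direct costs - is sketched below; all figures (traffic, delay, value of time, repair duration and repair cost) are assumptions for illustration, not data from the paper.

    ```python
    # Compare user delay cost during a repair with the owner's direct repair cost.
    adt = 20000            # average daily traffic (vehicles/day), assumed
    extra_delay_h = 0.25   # extra travel time per vehicle during the repair (hours), assumed
    value_of_time = 20.0   # monetary value of one vehicle-hour, assumed
    repair_days = 60       # duration of the repair works, assumed

    user_cost = adt * extra_delay_h * value_of_time * repair_days
    owner_cost = 2_000_000   # assumed direct repair cost

    print(f"user cost  : {user_cost:,.0f}")    # 6,000,000
    print(f"owner cost : {owner_cost:,.0f}")   # 2,000,000
    ```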

  20. Comparative genome analysis of Basidiomycete fungi

    Riley, Robert; Salamov, Asaf; Henrissat, Bernard; Nagy, Laszlo; Brown, Daren; Held, Benjamin; Baker, Scott; Blanchette, Robert; Boussau, Bastien; Doty, Sharon L.; Fagnan, Kirsten; Floudas, Dimitris; Levasseur, Anthony; Manning, Gerard; Martin, Francis; Morin, Emmanuelle; Otillar, Robert; Pisabarro, Antonio; Walton, Jonathan; Wolfe, Ken; Hibbett, David; Grigoriev, Igor

    2013-08-07

    Fungi of the phylum Basidiomycota (basidiomycetes) make up some 37 percent of the described fungi, and are important in forestry, agriculture, medicine, and bioenergy. This diverse phylum includes symbionts, pathogens, and saprotrophs, including the majority of wood-decaying and ectomycorrhizal species. To better understand the genetic diversity of this phylum we compared the genomes of 35 basidiomycetes, including 6 newly sequenced genomes. These genomes span extremes of genome size, gene number, and repeat content. Analysis of core genes reveals that some 48 percent of basidiomycete proteins are unique to the phylum, with nearly half of those (22 percent) found in only one organism. Correlations between lifestyle and certain gene families are evident. Phylogenetic patterns of plant biomass-degrading genes in Agaricomycotina suggest a continuum rather than a dichotomy between the white rot and brown rot modes of wood decay. Based on phylogenetically informed principal component analysis (PCA) of wood decay genes, we predict that Botryobasidium botryosum and Jaapia argillacea have properties similar to white rot species, although neither has typical ligninolytic class II fungal peroxidases (PODs). This prediction is supported by growth assays in which both fungi exhibit wood decay with white rot-like characteristics. Based on this, we suggest that the white/brown rot dichotomy may be inadequate to describe the full range of wood-decaying fungi. Analysis of the rate of discovery of proteins with no or few homologs suggests the value of continued sequencing of basidiomycete fungi.

  1. User's guide for 10 CFR 61 impact analysis codes

    This document explains how to use the Impact Analysis Codes used in the Draft Environmental Impact Statement (DEIS) (NUREG-0782, Vol. 1-4) supporting 10 CFR 61, Licensing Requirements for Land Disposal of Radioactive Waste. The mathematical development of the impact Analysis Codes and other information necessary to understand the results of using the Codes is contained in the DEIS, and in a supporting document, Data Base for Radioactive Waste Management (NUREG/CR-1759, Vol. 1-3). This document was prepared with the intention of accompanying a computer magnetic tape containing the Impact Analysis Codes. A form is included at the end of this document which can be used to obtain such a tape

  2. Understanding users

    Johannsen, Carl Gustav Viggo

    2014-01-01

    Segmentation of users can help libraries in the process of understanding user similarities and differences. Segmentation can also form the basis for selecting segments of target users and for developing tailored services for specific target segments. Several approaches and techniques have been tested in library contexts, and the aim of this article is to identify the main approaches and to discuss their perspectives, including their strengths and weaknesses in, especially, public library contexts. The purpose is also to present and discuss the results of a recent (2014) Danish library user segmentation project using computer-generated clusters. Compared to traditional marketing texts, this article also tries to identify user segments or images or metaphors by the library profession itself.

  3. HORECA. Hoger onderwijs reactor elementary core analysis system. User's manual

    HORECA was developed at IRI Delft for quick analysis of power distribution, burnup and safety for the HOR. It can be used to manually search for a better loading of the reactor. HORECA is based on the Penn State Fuel Management Package and uses the MCRAC code included in this package as a calculation engine. (orig./HP)

  4. ATLAS user analysis on private cloud resources at GoeGrid

    Glaser, F.; Nadal Serrano, J.; Grabowski, J.; Quadt, A.

    2015-12-01

    User analysis job demands can exceed available computing resources, especially before major conferences. ATLAS physics results can potentially be slowed down due to the lack of resources. For these reasons, cloud research and development activities are now included in the skeleton of the ATLAS computing model, which has been extended by using resources from commercial and private cloud providers to satisfy the demands. However, most of these activities are focused on Monte-Carlo production jobs, extending the resources at Tier-2. To evaluate the suitability of the cloud-computing model for user analysis jobs, we developed a framework to launch an ATLAS user analysis cluster in a cloud infrastructure on demand and evaluated two solutions. The first solution is entirely integrated in the Grid infrastructure by using the same mechanism, which is already in use at Tier-2: A designated Panda-Queue is monitored and additional worker nodes are launched in a cloud environment and assigned to a corresponding HTCondor queue according to the demand. Thereby, the use of cloud resources is completely transparent to the user. However, using this approach, submitted user analysis jobs can still suffer from a certain delay introduced by waiting time in the queue and the deployed infrastructure lacks customizability. Therefore, our second solution offers the possibility to easily deploy a totally private, customizable analysis cluster on private cloud resources belonging to the university.

  5. Interactive user's application to Genie 2000 spectroscopy system for automation of hair neutron activation analysis

    ... reporting. Genie 2000 software is available in several variations and with several layered optional packages. The Genie 2000 Basic Spectroscopy and Gamma Analysis Software available in RAC allows us to automatically obtain a nuclide identification report with all needed parameters. However, no Genie 2000 application can calculate the concentrations of the analyzed elements. To automate this step of INAA using the Canberra Genie 2000 Spectroscopy System, we developed the user 'Human Hair Analysis Application' software for the single-comparator standard method of hair INAA. Work with the developed Application for Genie 2000 begins with a menu containing four items: 1. Copying of the data. 2. Data input. 3. Viewing, editing and analyzing of the data. 4. Exit. The item 'Copying of the data' copies the entered values of special user parameters from one data source into another. It is very user-friendly: the user only has to enter the values of the necessary parameters (nuclide names, γ-line values, conversion factors for various irradiation and cooling times) once in one data source, and with the help of the 'Copying of the data' procedure can then transfer them to any other data source. The item 'Data input' is carried out with the help of the Graphical Batch Tools function GBTPARS and a specially developed set of Form Design Specification (FDS) files for this function. The developed Application works interactively as a dialogue system with the user and calculates the required nuclide concentrations in the analyzed samples, separately for long-lived, middle-lived and short-lived nuclides. Using the Nuclide Library Editor and the comprehensive standard libraries of the Genie package we created three custom libraries, Stdlib.HairL, Stdlib.HairM and Stdlib.HairS, for long-, middle- and short-lived nuclides respectively. After processing of each data source the Application returns the user to the menu. From here the user can continue data processing, having ...

  6. CMS Configuration Editor: GUI based application for user analysis job definition

    De Cosa, Annapaola

    2010-01-01

    We present the user interface and the software architecture of the Configuration Editor that is used by CMS physicists to configure their physics analysis tasks. Analysis workflows typically involve execution of a sequence of algorithms, and these are implemented as software modules that are integrated within the CMS software framework (CMSSW). In particular, a set of common analysis tools is provided in the so-called CMS Physics Analysis Toolkit (PAT) and these need to be steered and configured during the execution of an analysis job. The Python scripting language is used to define the job configuration that drives the analysis workflow. Configuring analysis jobs can be quite a challenging task, particularly for newcomers, and therefore a graphical tool, called the Configuration Editor, has been developed to facilitate the creation and inspection of these configuration files. Typically, a user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT ...

  7. A Methodology for Evaluating User Perceptions of the Delivery of ICT Services: a comparative study of six UK local authorities

    Les Worrall

    2000-11-01

    Evaluating and managing the effective delivery of ICT services is an issue that has recently been brought into sharper relief. This has been particularly prevalent in the UK public sector, where the growing emphasis on formalised client-contractor relationships, outsourcing and benchmarking (both between local authorities and between local authorities and private sector organisations) has meant that defining service standards and agreeing performance criteria have attracted considerable practitioner attention. This research is based on 295 interviews conducted in six UK local authorities. The investigation used both gap analysis and perceptual mapping techniques to develop an understanding of the aspects of ICT service delivery that users value most, in conjunction with an assessment of how well they perceive their ICT department is performing on these criteria. The paper exposes considerable differences in the relative performance of the six local authorities from both the gap analysis and the perceptual mapping elements of the investigation. The methodology is shown to provide an effective way of identifying key performance issues from the user perspective and benchmarking service performance across organisations.
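    As a rough illustration of the gap-analysis element described above, the sketch below compares how much users value each service attribute against how well they think it is delivered. The attribute names and ratings are invented for illustration; the original study used interview data from six authorities.

        import numpy as np

        # Hypothetical survey data: each row is one respondent, columns are ICT
        # service attributes; one matrix holds importance ratings, the other
        # perceived performance, both on a 1-7 scale.
        attributes = ["help-desk response", "system reliability", "training", "cost transparency"]
        importance = np.array([[7, 6, 5, 4],
                               [6, 7, 4, 5],
                               [7, 7, 5, 3]])
        performance = np.array([[4, 5, 5, 3],
                                [3, 6, 4, 4],
                                [5, 5, 4, 2]])

        # Gap analysis: mean performance minus mean importance per attribute.
        # Large negative gaps flag the aspects users value most but rate worst.
        gaps = performance.mean(axis=0) - importance.mean(axis=0)
        for name, imp, perf, gap in zip(attributes, importance.mean(0), performance.mean(0), gaps):
            print(f"{name:22s} importance={imp:.2f} performance={perf:.2f} gap={gap:+.2f}")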

  8. Nigerian Power Sector: Comparative Analysis of Productivity

    Iwuamadi ObiomaChidiebere

    2015-06-01

    Undoubtedly, power instability in the Nigerian power sector, despite several mitigative measures by the government, has created some chocks in the national socio-economic wheel of development. Unfortunately, the conceptual objective of the power reforms - to remedy inadequate power generation capacity, inefficient usage of capacity, ineffective regulation and high technical losses - is being achieved only slowly. This research comparatively analyzed the rate of productivity change in Nigeria's power sector from 2005 to 2013. The analysis reveals that privatization improved the productivity index by 89%. It is expected that this work may assist power policy makers and regulators to come up with a better framework for the full realization of the noble goals envisaged in this reform act.

  9. Comparative Genome Analysis of Basidiomycete Fungi

    Riley, Robert; Salamov, Asaf; Morin, Emmanuelle; Nagy, Laszlo; Manning, Gerard; Baker, Scott; Brown, Daren; Henrissat, Bernard; Levasseur, Anthony; Hibbett, David; Martin, Francis; Grigoriev, Igor

    2012-03-19

    Fungi of the phylum Basidiomycota (basidiomycetes) make up some 37 percent of the described fungi, and are important in forestry, agriculture, medicine, and bioenergy. This diverse phylum includes the mushrooms, wood rots, symbionts, and plant and animal pathogens. To better understand the diversity of phenotypes in basidiomycetes, we performed a comparative analysis of 35 basidiomycete fungi spanning the diversity of the phylum. Phylogenetic patterns of lignocellulose-degrading genes suggest a continuum rather than a sharp dichotomy between the white rot and brown rot modes of wood decay. Patterns of secondary metabolic enzymes give additional insight into the broad array of phenotypes found in the basidiomycetes. We suggest that the profile of an organism in lignocellulose-targeting genes can be used to predict its nutritional mode, and predict Dacryopinax sp. as a brown rot, and Botryobasidium botryosum and Jaapia argillacea as white rots.

  10. HERMES: A user-friendly connectivity analysis software

    Niso Galán, Julia Guiomar; Bruña Fernandez, Ricardo; Pereda, Ernesto; Gutierrez, Ricardo; Bajo Breton, Ricardo; Maestú, Fernando; Pozo Guerrero, Francisco del

    2012-01-01

    The analysis of the interdependence between time series has become an important field of research, mainly as a result of advances in the characterization of dynamical systems from the signals they produce, and the introduction of concepts such as Generalized (GS) and Phase (PS) synchronization. This increase in the number of approaches to tackle the existence of so-called functional (FC) and effective connectivity (EC) (Friston 1994) between two (or among many) neural networks, along wit...

  11. Wisdom of the Crowd or Wisdom of a Few? An Analysis of Users' Content Generation

    Baeza-Yates, Ricardo

    2016-01-01

    In this paper we analyze how user generated content (UGC) is created, challenging the well-known 'wisdom of crowds' concept. Although it is known that user activity in most settings follows a power law - that is, few people do a lot, while most do nothing - there are few studies that characterize this activity well. In our analysis of datasets from two different social networks, Facebook and Twitter, we find that a small percentage of active users, and an even smaller percentage of all users, represent 50% of the UGC. We also analyze the dynamic behavior of the generation of this content and find that the set of most active users is quite stable in time. Moreover, we study the social graph, finding that those active users are highly connected among themselves. This implies that most of the wisdom comes from a few users, challenging the independence assumption needed to have a wisdom of crowds. We also address the content that is never seen by any people, which we call the digital desert, that challenges the assumption that the content o...
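    The concentration claim above is easy to reproduce on any activity log. The sketch below computes the smallest fraction of users whose posts cover half of all content; the Zipf-distributed activity counts are synthetic stand-ins for the Facebook and Twitter datasets used in the paper.

        import numpy as np

        def fraction_for_half(contributions):
            """Smallest fraction of users whose combined contributions reach 50%
            of all user-generated content."""
            counts = np.sort(np.asarray(contributions))[::-1]      # most active first
            cumulative = np.cumsum(counts) / counts.sum()
            users_needed = np.searchsorted(cumulative, 0.5) + 1
            return users_needed / len(counts)

        # Synthetic heavy-tailed activity (Zipf-like), standing in for post counts
        rng = np.random.default_rng(1)
        activity = rng.zipf(a=2.0, size=100_000)
        print(f"{fraction_for_half(activity):.2%} of users account for half of the content")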

  12. Comparative Analysis of Students’ Media Competences Levels

    Alexander Fedorov

    2015-08-01

    This article analyzes the results of a survey of university students' media literacy competence (on the basis of a classification of indicators of the audience's media literacy competence) as an effective tool for comparative analysis of the levels of development of media competence of students in control and experimental groups: the level of media competence of students who completed a one-year training course within the framework of media literacy education was four times higher than the corresponding indicators in the control group. Analysis of the survey results also confirmed the general trend of the student audience's media contacts: an orientation toward entertainment genres of audiovisual media and toward visually appealing, positive, active, unmarried, childless, educated, highly qualified characters (primarily male characters aged 19 to 35 years). These heroes are characterized by optimism, independence, intelligence and emotion. They have an excellent command of the life situation and have a positive impact on the development of the media text's plot.

  13. Deductive Evaluation: Formal Code Analysis With Low User Burden

    Di Vito, Ben. L

    2016-01-01

    We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.

  14. Exploiting Formal Concept Analysis in a Customizing Recommendation for New User and Gray Sheep Problems

    Li, Xiaohui; Murata, Tomohiro

    Recommender systems are becoming an indispensable application and are re-shaping the world of e-commerce. This paper reviews the major problems in existing recommender systems and presents a tracking recommendation approach based on information about user behavior and a two-level property of items. A new recommendation model was constructed based on the synergistic use of knowledge from a repository that includes user behavior and item properties, and it utilizes Formal Concept Analysis (FCA) mapping to guide personalized recommendations for the user. We simulate a prototype recommender system that can make quality recommendations by tracking user behavior, in order to implement the proposed approach and test its performance. Experiments using two datasets show that our strategy is more robust against the known drawbacks and outperforms traditional recommendation approaches in cold-start conditions.

  15. CULTURE AND SOCIAL MEDIA USAGE: ANALYSIS OF JAPANESE TWITTER USERS

    Adam Acar

    2013-06-01

    Twitter, one of the most popular microblogging tools, has been used extensively all around the world. However, to date, no study has addressed how culture influences the use of this communication platform. In order to close the literature gap and promote cross-cultural understanding, this paper content-analyzed 4,000 tweets from 200 college students in Japan and the USA. The results showed that Japanese college students post more self-related messages and ask fewer questions compared to American college students. It was also found that tweets that refer to TV are more common in Japan, whereas sports and news tweets stand out in the USA. The evidence from this study suggests that there is a subtle and complicated relationship between culture and Twitter use.

  16. Following User Pathways: Cross Platform and Mixed Methods Analysis in Social Media Studies

    Hall, Margeret; Mazarakis, Athanasios; Peters, Isabella;

    2016-01-01

    Social media and the resulting tidal wave of available data have changed the ways and methods researchers analyze communities at scale. But the full potential for social scientists (and others) is not yet achieved. Despite the popularity of social media analysis in the past decade, few researchers invest in cross-platform analyses. Especially promising in support of such analyses is the mixed method approach (e.g. qualitative and quantitative methods), used in order to better understand how users and society interact online. The workshop 'Following User Pathways' brings together a community of researchers and professionals to address methodological, analytical, conceptual, and technological challenges and opportunities of cross-platform, mixed method analysis in social media ecosystems.

  17. Explicet: graphical user interface software for metadata-driven management, analysis and visualization of microbiome data.

    Robertson, Charles E; Harris, J Kirk; Wagner, Brandie D; Granger, David; Browne, Kathy; Tatem, Beth; Feazel, Leah M; Park, Kristin; Pace, Norman R; Frank, Daniel N

    2013-12-01

    Studies of the human microbiome, and microbial community ecology in general, have blossomed of late and are now a burgeoning source of exciting research findings. Along with the advent of next-generation sequencing platforms, which have dramatically increased the scope of microbiome-related projects, several high-performance sequence analysis pipelines (e.g. QIIME, MOTHUR, VAMPS) are now available to investigators for microbiome analysis. The subject of our manuscript, the graphical user interface-based Explicet software package, fills a previously unmet need for a robust, yet intuitive means of integrating the outputs of the software pipelines with user-specified metadata and then visualizing the combined data. PMID:24021386

  18. Which Users Should Be the Focus of Mobile Personal Health Records? Analysis of User Characteristics Influencing Usage of a Tethered Mobile Personal Health Record

    Lee, Guna; Park, Joong Yeol; Shin, Soo-Yong; Hwang, Jong Su; Ryu, Hyeon Jeong; Bates, David W.

    2016-01-01

    Background: This study was conducted to analyze the usage pattern of a hospital-tethered mobile personal health record (m-PHR) application named My Chart in My Hand (MCMH) and to identify user characteristics that influence m-PHR usage. Materials and Methods: Access logs to MCMH and its menus were collected for a total of 18 months, from August 2011 to January 2013. Usage patterns of users without a patient identification number (ID) and users with a patient ID were compared. Users with a patient ID were divided into light and heavy user groups by the median number of monthly accesses. Multiple linear regression models were used to assess MCMH usage patterns by the characteristics of MCMH users with a patient ID. Results: The total number of MCMH logins was 105,603, and the median number of accesses was 15 times. Users (n = 7,096) mostly accessed the “My Chart” menu, but “OPD [outpatient department] Service Support” and “Health Management” menus were also frequently used. Patients with chronic diseases, experience of hospital visits including the emergency room and OPD, and the age group of 0-19 years were more frequently found among users with a patient ID (n = 2,186) (p < 0.001). A similar trend was found in the heavy user group (n = 1,123). Submenus of laboratory results, online appointments, and medication lists that were accessed mostly by users with a patient ID were associated with OPD visits and chronic diseases. Conclusions: This study suggests that focusing a tethered m-PHR on patients with chronic diseases and more hospital visits, and on empowerment functions, would help promote its extensive use. PMID:26447775

  19. Comparative analysis of safety related site characteristics

    This document presents a comparative analysis of site characteristics related to long-term safety for the two candidate sites for a final repository for spent nuclear fuel in Forsmark (municipality of Oesthammar) and in Laxemar (municipality of Oskarshamn) from the point of view of site selection. The analyses are based on the updated site descriptions of Forsmark /SKB 2008a/ and Laxemar /SKB 2009a/, together with associated updated repository layouts and designs /SKB 2008b and SKB 2009b/. The basis for the comparison is thus two equally and thoroughly assessed sites. However, the analyses presented here are focussed on differences between the sites rather than evaluating them in absolute terms. The document serves as a basis for the site selection, from the perspective of long-term safety, in SKB's application for a final repository. A full evaluation of safety is made for a repository at the selected site in the safety assessment SR-Site /SKB 2011/, referred to as SR-Site main report in the following

  20. Comparative analysis of safety related site characteristics

    Andersson, Johan (ed.)

    2010-12-15

    This document presents a comparative analysis of site characteristics related to long-term safety for the two candidate sites for a final repository for spent nuclear fuel in Forsmark (municipality of Oesthammar) and in Laxemar (municipality of Oskarshamn) from the point of view of site selection. The analyses are based on the updated site descriptions of Forsmark /SKB 2008a/ and Laxemar /SKB 2009a/, together with associated updated repository layouts and designs /SKB 2008b and SKB 2009b/. The basis for the comparison is thus two equally and thoroughly assessed sites. However, the analyses presented here are focussed on differences between the sites rather than evaluating them in absolute terms. The document serves as a basis for the site selection, from the perspective of long-term safety, in SKB's application for a final repository. A full evaluation of safety is made for a repository at the selected site in the safety assessment SR-Site /SKB 2011/, referred to as SR-Site main report in the following

  1. NFAP: the nonlinear finite element analysis program. Users manual; Version 1977

    A brief outline of the analysis capability, together with the input instructions, is given for a nonlinear finite element analysis program called NFAP, which is an extended version of the NONSAP program. Extensions include additional element types, material models and several user features, as further described in the report. Like NONSAP, the NFAP program can be used for conducting linear or nonlinear analysis of various structures under static or dynamic loadings. Nonlinearities involve both nonlinear materials and large deformations.

  2. Neutron activation analysis at the Californium User Facility for Neutron Science

    The Californium User Facility (CUF) for Neutron Science has been established to provide 252Cf-based neutron irradiation services and research capabilities, including neutron activation analysis (NAA). A major advantage of the CUF is its accessibility and controlled experimental conditions compared with those of a reactor environment. The CUF maintains the world's largest inventory of compact 252Cf neutron sources. Neutron source intensities of ≤ 10^11 neutrons/s are available for irradiations within a contamination-free hot cell, capable of providing thermal and fast neutron fluxes exceeding 10^8 cm^-2 s^-1 at the sample. A total flux of ≥ 10^9 cm^-2 s^-1 is feasible for large-volume irradiation rabbits within the 252Cf storage pool. Neutron and gamma transport calculations have been performed using the Monte Carlo transport code MCNP to estimate irradiation fluxes available for sample activation within the hot cell and storage pool and to design and optimize a prompt gamma NAA (PGNAA) configuration for large sample volumes. Confirmatory NAA irradiations have been performed within the pool. Gamma spectroscopy capabilities, including PGNAA, are being established within the CUF for sample analysis.

  3. Exploratory analysis of user-generated photos and indicators that influence their appeal

    Urban Sedlar

    2014-07-01

    In this paper we analyze whether simple indicators related to photo quality (brightness, sharpness, color palette) and established content detection techniques (face detection) can predict the success of photos in obtaining more “likes” from other users of photo-sharing social networks. This provides a unique look into the habits of users of such networks. The analysis was performed on 394,000 images downloaded from the social photo-sharing site Instagram, paired with a de-identified dataset of user liking activity provided by a seller of a social-media mobile app. Two user groups were analyzed: all users in a two-month period (N = 122,260) and a highly selective group (N = 3,982) of users that only like <10% of what they view. No correlation was found with any of the indicators using the whole (non-selective) population, likely due to their bias towards earning virtual currency in exchange for liking. However, in the selective group, a small positive correlation was found between like ratio and image sharpness (r = 0.09, p < 0.0001) and a small negative correlation between like ratio and the number of faces (r = -0.10, p < 0.0001).
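    The reported effect sizes are plain Pearson correlations, which are straightforward to reproduce once like ratios and image indicators are available. The sketch below runs the same test on synthetic data; the indicator definitions, distributions and built-in effect sizes are assumptions, not the Instagram dataset from the study.

        import numpy as np
        from scipy.stats import pearsonr

        # Hypothetical per-photo records for a selective user group:
        # like_ratio = likes / views, sharpness = a variance-of-Laplacian style score,
        # n_faces = number of faces detected in the photo. Values are synthetic.
        rng = np.random.default_rng(42)
        n = 4000
        sharpness = rng.gamma(shape=2.0, scale=1.0, size=n)
        n_faces = rng.poisson(lam=1.2, size=n)
        # weak dependence built in to mimic the reported effect sizes
        like_ratio = 0.05 + 0.01 * sharpness - 0.01 * n_faces + rng.normal(0, 0.1, size=n)

        r_sharp, p_sharp = pearsonr(like_ratio, sharpness)
        r_faces, p_faces = pearsonr(like_ratio, n_faces)
        print(f"like ratio vs sharpness: r={r_sharp:+.2f}, p={p_sharp:.2g}")
        print(f"like ratio vs faces:     r={r_faces:+.2f}, p={p_faces:.2g}")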

  4. Performance Analysis of Mean Value-Based Power Allocation with Primary User Interference in Spectrum Sharing Systems

    Duan, Ruifeng; Jäntti, Riku; Elmusrati, Mohammed S.

    2014-01-01

    In this paper, we provide an exact expression for the ergodic capacity of the secondary user, and a unified closed-form expression for its bounds, taking into consideration the primary interference at the secondary receiver. In addition, a simple but accurate approximation of the ergodic capacity of the primary user is presented. Moreover, a power allocation scheme for the secondary user based on the capacity loss of the primary user is also proposed. Finally, we compare the performance of the tw...
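    The quantity being bounded can be checked numerically: the secondary user's ergodic capacity is the average of log2(1 + SINR) over the fading, with the primary transmission entering the SINR denominator as interference. The Monte Carlo sketch below assumes Rayleigh fading, unit noise power and unit transmit powers; the paper's closed-form expressions are not reproduced here.

        import numpy as np

        # Monte Carlo estimate of the secondary user's ergodic capacity in a spectrum
        # sharing system, with interference from the primary transmitter at the
        # secondary receiver. Channel statistics and powers are assumptions.
        rng = np.random.default_rng(7)
        n = 1_000_000
        P_s, P_p, N0 = 1.0, 1.0, 1.0            # secondary power, primary power, noise

        g_ss = rng.exponential(1.0, n)          # |h|^2, secondary Tx -> secondary Rx
        g_ps = rng.exponential(1.0, n)          # |h|^2, primary Tx  -> secondary Rx

        sinr = P_s * g_ss / (N0 + P_p * g_ps)
        ergodic_capacity = np.mean(np.log2(1.0 + sinr))    # bits/s/Hz
        print(f"ergodic capacity ~ {ergodic_capacity:.3f} bit/s/Hz")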

  5. Enabling Semantic Analysis of User Browsing Patterns in the Web of Data

    Hoxha, Julia; Agarwal, Sudhir

    2012-01-01

    A useful step towards better interpretation and analysis of usage patterns is to formalize the semantics of the resources that users access on the Web. We focus on this problem and present an approach for the semantic formalization of usage logs, which lays the basis for effective techniques of querying expressive usage patterns. We also present a query answering approach, which is useful for finding expressive patterns of usage behavior in the logs via the formulation of semantic and temporal-based constraints. We have processed over 30 thousand user browsing sessions extracted from usage logs of DBPedia and Semantic Web Dog Food. All these events are formalized semantically using respective domain ontologies and RDF representations of the Web resources being accessed. We show the effectiveness of our approach through experimental results, providing in this way an exploratory analysis of the way users browse the Web of Data.

  6. Comparative Analysis of Different LIDAR System Calibration Techniques

    Miller, M.; Habib, A.

    2016-06-01

    With light detection and ranging (LiDAR) now being a crucial tool for engineering products and on-the-fly spatial analysis, it is necessary for the user community to have standardized calibration methods. The three methods in this study were developed and proven by the Digital Photogrammetry Research Group (DPRG) for airborne LiDAR systems and are as follows: Simplified, Quasi-Rigorous, and Rigorous. In lieu of using expensive control surfaces for calibration, these methods compare overlapping LiDAR strips to estimate the systematic errors. The systematic errors quantified by these methods include the lever arm biases, boresight biases, range bias and scan angle scale bias. These three methods comprehensively represent all of the possible flight configurations and data availability, and this paper tests the limits of the method with the most assumptions, the simplified calibration, by using data that violates the assumptions its math model is based on and compares the results to the quasi-rigorous and rigorous techniques. The overarching goal is to provide a LiDAR system calibration that does not require raw measurements and that can be carried out with minimal control and flight lines to reduce costs. This testing is unique because the terrain used for calibration does not contain gable roofs; all other LiDAR system calibration testing and development has been done with terrain containing features of high geometric integrity such as gable roofs.

  7. Comparative analysis on flexibility requirements of typical cryogenic transfer lines

    Cryogenic systems and their applications, primarily in large fusion devices, utilize multiple cryogen transfer lines of various sizes and complexities of layout to transfer cryogenic fluids from the plant to the various users/applications. These transfer lines are composed of various critical sections such as tee sections, elbows and flexible components. The mechanical sustainability (under failure circumstances) of these transfer lines is a primary requirement for safe operation of the system and applications. The transfer lines need to be designed for multiple design constraint conditions such as line layout, support locations and space restrictions. The transfer lines are subjected to single loads and multiple load combinations, such as operational loads, seismic loads, and leaks in the insulation vacuum. Analytical calculations and flexibility analysis using the CAESAR II software were performed for a typical transfer line without any flexible components, and the results were analysed for functional and mechanical load conditions. The failure modes were identified along the critical sections. The same transfer line was then refurbished with the flexible components and analysed for failure modes. Inclusion of these components provides additional flexibility to the transfer line system and makes it safe. The optimization was performed by selecting appropriate flexible components to meet the design requirements as per the ASME B31.3/EN 13480 codes. This paper describes the results obtained from the analytical calculations, which are compared and validated with those obtained from the flexibility analysis software calculations. (author)

  8. User-centered design of a computerized HRQoL questionnaire : qualitative analysis of user needs and prototype evaluation

    Syropoulos, Nikolaos

    2014-01-01

    The digitalization of Health-Related Quality of Life instruments, as well as the design of computerized systems according to user needs, can improve usability and fulfil the expectations of end-users. The purpose of this study is to identify the features that a computerized Health-Related Quality of Life questionnaire, which has been designed through Questionnaire Service, should support when it is used by patients and healthcare providers. These features come from the ...

  9. AUDITOR ROTATION - A CRITICAL AND COMPARATIVE ANALYSIS

    Mocanu Mihaela

    2011-12-01

    The present paper starts out from the challenge regarding auditor tenure launched in 2010 by the European Commission's Green Paper Audit Policy: Lessons from the Crisis. According to this document, the European Commission speaks both in favor of the mandatory rotation of the audit firm and in favor of the mandatory rotation of audit partners. Rotation is considered a solution to mitigate threats to independence generated by familiarity, intimidation and self-interest in the context of a long-term audit-client relationship. At the international level, there are several studies on auditor rotation, both empirical (e.g. Lu and Sivaramakrishnan, 2009; Li, 2010; Kaplan and Mauldin, 2008; Jackson et al., 2008) and normative in nature (e.g. Marten et al., 2007; Muller, 2006; Gelter, 2004). The objective of the present paper is to perform a critical and comparative analysis of the regulations on internal and external rotation in force at the international level, in the European Union and in the United States of America. Moreover, arguments both in favor of and against mandatory rotation are brought into the discussion. With regard to the research design, the paper has a normative approach. The main findings are, first, that all regulatory authorities require internal rotation at least in the case of public interest entities, while external rotation is not in the focus of the regulators. In general, the strictest and most detailed requirements are those issued by the Securities and Exchange Commission of the United States of America. Second, in favor of mandatory rotation speaks the fact that the auditor becomes less resilient in case of divergence of opinions between him and company management, less stimulated to follow his own interest, and more scrupulous in conducting the audit. However, mandatory rotation may also have negative consequences, so the debate on the opportunity of this regulatory measure remains open-ended.

  10. A comparative analysis of influenza vaccination programs.

    Shweta Bansal

    2006-10-01

    BACKGROUND: The threat of avian influenza and the 2004-2005 influenza vaccine supply shortage in the United States have sparked a debate about optimal vaccination strategies to reduce the burden of morbidity and mortality caused by the influenza virus. METHODS AND FINDINGS: We present a comparative analysis of two classes of suggested vaccination strategies: mortality-based strategies that target high-risk populations and morbidity-based strategies that target high-prevalence populations. Applying the methods of contact network epidemiology to a model of disease transmission in a large urban population, we assume that vaccine supplies are limited and then evaluate the efficacy of these strategies across a wide range of viral transmission rates and for two different age-specific mortality distributions. We find that the optimal strategy depends critically on the viral transmission level (reproductive rate of the virus): morbidity-based strategies outperform mortality-based strategies for moderately transmissible strains, while the reverse is true for highly transmissible strains. These results hold for a range of mortality rates reported for prior influenza epidemics and pandemics. Furthermore, we show that vaccination delays and multiple introductions of disease into the community have a more detrimental impact on morbidity-based strategies than mortality-based strategies. CONCLUSIONS: If public health officials have reasonable estimates of the viral transmission rate and the frequency of new introductions into the community prior to an outbreak, then these methods can guide the design of optimal vaccination priorities. When such information is unreliable or not available, as is often the case, this study recommends mortality-based vaccination priorities.

  11. Premo and Kansei: A Comparative Analysis

    Anitawati Mohd Lokman

    2013-04-01

    Kansei Engineering is a technology that enables the incorporation of human emotion into design requirements. Its perspective is that Kansei is unique to each domain and to each target user group, and its methodology relies mainly on verbal measurement instruments. The technology is seen to have a slight shortcoming when there is a need to build a universal design for a universal target user, since handling semantics becomes complex when people do not speak the same language across the planet. Hence, a non-verbal emotion measurement tool is assumed to enhance the capability of Kansei Engineering in managing universal Kansei. This study aims to investigate the possibility of integrating PrEmo, a non-verbal self-reporting tool developed on the basis of studies across cultures and demographic settings, into Kansei Engineering. The objectives are to analyze the similarities and differences of the Kansei structures produced by two different measurement tools, a non-verbal (PrEmo) and a verbal (Kansei checklist) self-reporting instrument, to provide hypothetical evidence of the feasibility of PrEmo as a tool to measure Kansei. Ten websites with significant visual design differences were used as stimuli in the evaluation procedure, involving 30 respondents who provided their Kansei responses using both instruments. The results show that the Kansei structures from both instruments are mostly similar, thus providing hypothetical evidence that PrEmo could be used as a non-verbal self-reporting instrument to measure Kansei. The findings provide insights into future research on the integration of universal Kansei.

  12. The technique of creation of simulation systems with experiments design and analysis controlled by user's questions

    G. Magariu

    1999-01-01

    The article describes a technique for creating simulation systems with elements of intelligent control over the initiation of experiments and the analysis of their results. The proposed technique is based on analyzing the user's question, determining which simulation experiments are necessary to answer it, initiating the runs of those experiments, and obtaining the answer on the basis of the experiment results.

  13. Inventory of activation analysis facilities available in the European Community to Industrial users

    This inventory includes lists of activation equipment produced in the European Community, facilities available to industrial users, and activation laboratories existing in European companies. The aim of this inventory is to provide all information that may be useful to companies interested in activation analysis, as well as to give an idea of existing routine applications and of the European market in facilities.

  14. Digital Avionics Information System (DAIS): Training Requirements Analysis Model Users Guide. Final Report.

    Czuchry, Andrew J.; And Others

    This user's guide describes the functions, logical operations and subroutines, input data requirements, and available outputs of the Training Requirements Analysis Model (TRAMOD), a computerized analytical life cycle cost modeling system for use in the early stages of system design. Operable in a stand-alone mode, TRAMOD can be used for the…

  15. User-Centric Approach for Benchmark RDF Data Generator in Big Data Performance Analysis

    Purohit, Sumit; Paulson, Patrick R.; Rodriguez, Luke R.

    2016-02-05

    This research focuses on a user-centric approach to building such benchmark data generators and proposes a flexible, extensible, and easy-to-use framework to support performance analysis of Big Data systems. Finally, case studies from two different domains are presented to validate the framework.

  16. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  17. Three looks at users: a comparison of methods for studying digital library use. User studies, Digital libraries, Digital music libraries, Music, Information use, Information science, Contextual inquiry, Contextual design, User research, Questionnaires, Log file analysis

    Mark Notess

    2004-01-01

    Compares three user research methods of studying real-world digital library usage within the context of the Variations and Variations2 digital music libraries at Indiana University. After a brief description of both digital libraries, each method is described and illustrated with findings from the studies. User satisfaction questionnaires were used in two studies, one of Variations (n=30) and the other of Variations2 (n=12). Second, session activity log files were examined for 175 Variations2...

  18. A novel R-package graphic user interface for the analysis of metabonomic profiles

    Villa Palmira

    2009-10-01

    Background: Analysis of the plethora of metabolites found in the NMR spectra of biological fluids or tissues requires data complexity to be simplified. We present a graphical user interface (GUI) for NMR-based metabonomic analysis. The "Metabonomic Package" has been developed for metabonomics research as open-source software and uses the R statistical libraries. Results: The package offers the following options: raw 1-dimensional spectra processing (phase, baseline correction and normalization); importing processed spectra; including/excluding spectral ranges, optional binning and bucketing, and detection and alignment of peaks; sorting of metabolites based on their ability to discriminate, metabolite selection, and outlier identification; multivariate unsupervised analysis (principal components analysis, PCA); multivariate supervised analysis (partial least squares (PLS), linear discriminant analysis (LDA), k-nearest neighbor classification); neural networks; visualization and overlapping of spectra; and plotting of the chemical shift position values for different samples. Furthermore, the "Metabonomic" GUI includes a console to enable other kinds of analyses and to take advantage of all R statistical tools. Conclusion: We made complex multivariate analysis user-friendly for both experienced and novice users, which could help to expand the use of NMR-based metabonomics.
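    The package itself is R-based; as a rough, language-agnostic illustration of two of the listed steps (bucketing of 1-D spectra followed by unsupervised PCA), the Python sketch below processes synthetic spectra. The bin width, normalisation choice and toy peaks are assumptions of this sketch, not details taken from the package.

        import numpy as np

        # Bucketing (binning) of 1-D NMR-like spectra followed by PCA. Synthetic data.
        rng = np.random.default_rng(3)
        n_samples, n_points, bin_width = 20, 4096, 64

        spectra = rng.normal(0.0, 0.05, size=(n_samples, n_points))
        spectra[:10, 1000:1010] += 1.0          # a "metabolite" peak in one group
        spectra[10:, 3000:3010] += 1.0          # a different peak in the other group

        # Bucketing: integrate the signal over fixed-width chemical-shift bins,
        # then normalise each spectrum to constant total area.
        buckets = spectra.reshape(n_samples, n_points // bin_width, bin_width).sum(axis=2)
        buckets /= buckets.sum(axis=1, keepdims=True)

        # Unsupervised PCA via SVD of the mean-centred bucket table.
        centred = buckets - buckets.mean(axis=0)
        U, S, Vt = np.linalg.svd(centred, full_matrices=False)
        scores = U * S                           # sample scores on the principal components
        print("explained variance ratio (PC1, PC2):",
              np.round((S**2 / (S**2).sum())[:2], 3))
        print("PC1 scores:", np.round(scores[:, 0], 2))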

  19. E-learning interventions are comparable to user's manual in a randomized trial of training strategies for the AGREE II

    Durocher Lisa D

    2011-07-01

    Background: Practice guidelines (PGs) are systematically developed statements intended to assist in patient and practitioner decisions. The AGREE II is the revised tool for PG development, reporting, and evaluation, comprised of 23 items, two global rating scores, and a new User's Manual. In this study, we sought to develop, execute, and evaluate the impact of two internet interventions designed to accelerate the capacity of stakeholders to use the AGREE II. Methods: Participants were randomized to one of three training conditions. 'Tutorial' - participants proceeded through the online tutorial with a virtual coach and reviewed a PDF copy of the AGREE II. 'Tutorial + Practice Exercise' - in addition to the Tutorial, participants also appraised a 'practice' PG. For the practice PG appraisal, participants received feedback on how their scores compared to expert norms and formative feedback if scores fell outside the predefined range. 'AGREE II User's Manual PDF' (control condition) - participants reviewed a PDF copy of the AGREE II only. All participants evaluated a test PG using the AGREE II. Outcomes of interest were learners' performance, satisfaction, self-efficacy, mental effort, time-on-task, and perceptions of the AGREE II. Results: No differences emerged between training conditions on any of the outcome measures. Conclusions: We believe these results can be explained by the better-than-anticipated performance of the AGREE II PDF materials (control condition) or the participants' level of health methodology and PG experience, rather than by the failure of the online training interventions. Some data suggest the online tools may be useful for trainees new to this field; however, this requires further study.

  20. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.

  1. Advanced Techniques in Web Intelligence-2 Web User Browsing Behaviour and Preference Analysis

    Palade, Vasile; Jain, Lakhmi

    2013-01-01

    This research volume focuses on analyzing the web user browsing behaviour and preferences in traditional web-based environments, social  networks and web 2.0 applications,  by using advanced  techniques in data acquisition, data processing, pattern extraction and  cognitive science for modeling the human actions.  The book is directed to  graduate students, researchers/scientists and engineers  interested in updating their knowledge with the recent trends in web user analysis, for developing the next generation of web-based systems and applications.

  2. Code development and analysis program. RELAP4/MOD7 (Version 2): user's manual

    None

    1978-08-01

    This manual describes RELAP4/MOD7 (Version 2), which is the latest version of the RELAP4 LPWR blowdown code. Version 2 is a precursor to the final version of RELAP4/MOD7, which will address LPWR LOCA analysis in integral fashion (i.e., blowdown, refill, and reflood in continuous fashion). This manual describes the new code models and provides application information required to utilize the code. It must be used in conjunction with the RELAP4/MOD5 User's Manual (ANCR-NUREG-1335, dated September 1976), and the RELAP4/MOD6 User's Manual (CDAP-TR-003, dated January 1978).

  3. Comparative proteomic analysis of Clostridium difficile

    Chilton, Caroline Hazel

    2011-01-01

    The recent increase in availability of next generation sequencing methodologies has led to extensive analysis of the genome of Clostridium difficile. In contrast, protein expression analysis, crucial to the elucidation of mechanisms of disease, has severely lagged behind. In this study, in-depth proteomic analysis of three strains of varying virulence, demonstrated previously in an animal model, has been undertaken against a background of the sequenced genomes. Strain B-1 is ...

  4. Nestedness for Dummies (NeD): a User Friendly Web Interface for Exploratory Nestedness Analysis

    Giovanni Strona; Paolo Galli; Davide Seveso; Simone Montano; Simone Fattorini

    2014-01-01

    Recent theoretical advances in nestedness analysis have led to the introduction of several alternative metrics to overcome most of the problems biasing the use of matrix 'temperature' calculated by Atmar's Nestedness Temperature Calculator. However, all of the currently available programs for nestedness analysis lack the user-friendly appeal that has made the Nestedness Temperature Calculator one of the most popular community ecology programs. The software package NeD is an intuitive open sou...

  5. AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual

    AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves the linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features 100-external-node, 200-branch capability; a conversational, free-format input language; built-in junction, FET, MOS, and switch models; a sparse matrix algorithm with extended-precision H matrix and T vector calculations, for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation effects analysis; a special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETs, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table

  6. Human Capital Development: Comparative Analysis of BRICs

    Ardichvili, Alexandre; Zavyalova, Elena; Minina, Vera

    2012-01-01

    Purpose: The goal of this article is to conduct macro-level analysis of human capital (HC) development strategies, pursued by four countries commonly referred to as BRICs (Brazil, Russia, India, and China). Design/methodology/approach: This analysis is based on comparisons of macro indices of human capital and innovativeness of the economy and a…

  7. Performance Analysis of New Binary User Codes for DS-CDMA Communication

    Usha, Kamle; Jaya Sankar, Kottareddygari

    2016-03-01

    This paper analyzes new binary spreading codes through their correlation properties and also presents their performance over an additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using Gray and inverse Gray codes. In this paper, an n-bit Gray code appended by its n-bit inverse Gray code to construct 2n-length binary user codes is discussed. Like Walsh codes, these binary user codes are available in sizes that are powers of two; additionally, code sets of length 6 and its even multiples are also available. The simple construction technique and the generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and Gold codes are considered for comparison in this paper, as these are popularly used for synchronous and asynchronous multi-user communications, respectively. In the current work the auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and Gold codes. The performance of the proposed binary user codes for both synchronous and asynchronous direct-sequence CDMA communication over an AWGN channel is also discussed in this paper. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
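    The construction described above is simple enough to prototype. The sketch below builds codewords by concatenating the bits of gray(k) with the bits of its inverse Gray code and checks the zero-lag cross-correlations; the exact ordering and the {0,1} -> {+1,-1} chip mapping are this sketch's assumptions about the construction, not details taken from the paper.

        import numpy as np

        def gray(k):
            """Standard binary-reflected Gray code of integer k."""
            return k ^ (k >> 1)

        def inverse_gray(k):
            """Inverse Gray code: the integer whose Gray code is k."""
            b = 0
            while k:
                b ^= k
                k >>= 1
            return b

        def to_bits(value, n):
            return [(value >> (n - 1 - i)) & 1 for i in range(n)]

        def user_codes(n):
            """One reading of the construction: for index k the 2n-chip codeword is
            the n bits of gray(k) followed by the n bits of inverse_gray(k),
            mapped {0,1} -> {+1,-1}. The mapping is an assumption."""
            codes = []
            for k in range(2 ** n):
                bits = to_bits(gray(k), n) + to_bits(inverse_gray(k), n)
                codes.append([1 if b == 0 else -1 for b in bits])
            return np.array(codes)

        codes = user_codes(3)                      # eight codewords of length 6
        cross = codes @ codes.T / codes.shape[1]   # normalised zero-lag correlations
        print("peak cross-correlation magnitude:",
              np.abs(cross - np.eye(len(codes))).max())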

  8. Baby boomers as future care users--An analysis of expectations in print media.

    Jönson, Håkan; Jönsson, Anders

    2015-08-01

    The aim of the study was to investigate media presentations of baby boomers as future care users. The Swedish baby boomer generation, born in the 1940s, and known as the '40s generation, has been characterized as youthful and powerful, and a question investigated in the study was whether boomers are supposed to display these characteristics as care users. We analyzed 481 articles in Swedish newspapers, published between 1995 and 2012, with a qualitative content analysis. The results showed that the '40s generation was predicted to become a new breed of demanding, self-aware care users. These claims were supported by descriptions of the formative events and typical characteristics of these individuals, which were then projected onto their future behavior as care users. Such projections tended to portray contemporary care users as passive, submissive, and partly responsible for problems associated with elder care. Consequently, approaches that focus on differences between cohorts need to incorporate a constructionist dimension to highlight the problem of generationism. PMID:26162728

  9. Formal Model for Data Dependency Analysis between Controls and Actions of a Graphical User Interface

    SKVORC, D.

    2012-02-01

    End-user development is an emerging computer science discipline that provides programming paradigms, techniques, and tools suitable for users not trained in software engineering. One of the techniques that allows ordinary computer users to develop their own applications without the need to learn a classic programming language is GUI-level programming based on programming-by-demonstration. To build wizard-based tools that assist users in application development and to verify the correctness of user programs, a computer-supported method for GUI-level data dependency analysis is necessary. Therefore, a formal model for GUI representation is needed. In this paper, we present a finite state machine for modeling the data dependencies between GUI controls and GUI actions. Furthermore, we present an algorithm for automatic construction of the finite state machine for an arbitrary GUI application. We show that the proposed state aggregation scheme successfully manages state explosion in the state machine construction algorithm, which makes the model applicable to applications with complex GUIs.
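    As a toy illustration of the modeling idea (not the paper's actual construction or aggregation algorithm), the sketch below encodes states as the set of GUI controls that already hold data and enables an action only when the controls it depends on are filled. All control names, actions and dependency rules are invented.

        from itertools import chain, combinations

        CONTROLS = {"name_field", "email_field"}
        # an action is enabled only when the controls it depends on are filled;
        # each entry is (required controls, controls it fills)
        ACTION_DEPENDENCIES = {
            "fill_name":  (set(),                         {"name_field"}),
            "fill_email": (set(),                         {"email_field"}),
            "submit":     ({"name_field", "email_field"}, set()),
        }

        def powerset(s):
            s = list(s)
            return [frozenset(c)
                    for c in chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

        # build the transition relation of the finite state machine
        transitions = {}
        for state in powerset(CONTROLS):
            for action, (required, filled) in ACTION_DEPENDENCIES.items():
                if required <= state:                       # dependency satisfied
                    transitions[(state, action)] = frozenset(state | filled)

        # e.g. check that "submit" is not enabled before both fields are filled
        start = frozenset()
        print("submit" in {a for (s, a) in transitions if s == start})   # False
        print(transitions[(frozenset(CONTROLS), "submit")])              # stays filled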

  10. Discovery and Analysis of Intersecting Datasets: JMARS as a Comparative Science Platform

    Carter, S.; Christensen, P. R.; Dickenshied, S.; Anwar, S.; Noss, D.

    2014-12-01

    A great deal can be discovered by comparing and studying a chosen region or area on a planetary body. In this age, science has an enormous number of instruments and data to study; often the first obstacle can be finding the right information. Developed at Arizona State University, Java Mission-planning and Analysis for Remote Sensing (JMARS) enables users to easily find and study related datasets. JMARS supports a long list of planetary bodies in our solar system, including Earth, the Moon, Mars, and other planets, satellites, and asteroids. Within JMARS a user can start with a particular area and search for all datasets that have images/information intersecting that region of interest. Once users have found data they are interested in comparing, they can view the images at once and see the numeric information at that location. This information can be analyzed in a few powerful ways. If the dataset of interest varies with time but the location stays constant, then the user may want to compare specific locations through time. This can be done with the Investigate Tool in JMARS. Users can create a Data Spike and the information at that point will be plotted through time. If the region does not have a temporal dataset, then a different method would be suitable, one that involves a profile line. Also using the Investigate Tool, a user can create a Data Profile (a line which can contain as many vertices as necessary) and all numeric data underneath the line will be plotted on one graph for easy comparison. This can be used to compare differences between similar datasets - perhaps the same measurement but from different instruments - or to find correlations from one dataset to another. A third form of analysis is planned for future development. This method involves entire areas (polygons). Sampling of the different data sources beneath an area can reveal statistics like maximum, minimum, and average values, and standard deviation. These values can be compared to other data

  11. Following User Pathways: Cross Platform and Mixed Methods Analysis in Social Media Studies

    Hall, Margeret; Mazarakis, Athanasios; Peters, Isabella; Chorley, Martin; Simon, Simon; Mai, Jens-Erik; Strohmaier, Markus

    2016-01-01

    Social media and the resulting tidal wave of available data have changed the ways and methods researchers analyze communities at scale. But the full potential for social scientists (and others) is not yet achieved. Despite the popularity of social media analysis in the past decade, few researchers invest in cross-platform analyses. This is a major oversight, as 42% of online social media users have multiple social media accounts. Missing are the models and tools necessary to undertake analysis at scale across multiple platforms. Especially promising in support of cross-platform analysis is the ... challenges and opportunities of cross-platform, mixed-method analysis in social media ecosystems.

  12. CORCON-MOD3: An integrated computer model for analysis of molten core-concrete interactions. User's manual

    Bradley, D.R.; Gardner, D.R.; Brockmann, J.E.; Griffith, R.O. [Sandia National Labs., Albuquerque, NM (United States)]

    1993-10-01

    The CORCON-Mod3 computer code was developed to mechanistically model the important core-concrete interaction phenomena, including those phenomena relevant to the assessment of containment failure and radionuclide release. The code can be applied to a wide range of severe accident scenarios and reactor plants. The code represents the current state of the art for simulating core debris interactions with concrete. This document comprises the user's manual and gives a brief description of the models and the assumptions and limitations in the code. Also discussed are the input parameters and the code output. Two sample problems are also given.

  13. MANAGEMENT AND COMPARATIVE ANALYSIS OF DATASET ENSEMBLES

    Geveci, Berk [Senior Director, Scientific Computing]

    2010-05-17

    The primary Phase I technical objective was to develop a prototype that demonstrates the functionality of all components required for an end-to-end meta-data management and comparative visualization system.

  14. Comparative Analysis of Competitive Strategy Implementation

    Maina A. S. Waweru

    2011-01-01

    This paper presents research findings on Competitive Strategy Implementation which compared the levels of strategy implementation achieved by different generic strategy groups, comprising firms inclined towards low cost leadership, differentiation or dual strategic advantage. The study sought to determine the preferences for use of implementation armaments and compared how such armaments related to the level of implementation achieved. Respondents comprised 71 top executives from 59 companies...

  16. Integrated Reliability and Risk Analysis System (IRRAS) Version 2.0 user's guide

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Also provided in the system is an integrated full-screen editor for use when interfacing with remote mainframe computer systems. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.0 and is the subject of this user's guide. Version 2.0 of IRRAS provides all of the same capabilities as Version 1.0 and adds a relational database facility for managing the data, improved functionality, and improved algorithm performance. 9 refs., 292 figs., 4 tabs
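
    To make the cut-set terminology concrete, the small sketch below (our own illustration, not the IRRAS algorithm) expands a toy AND/OR fault tree into its minimal cut sets by top-down substitution.

        # Top-down expansion of a toy fault tree into minimal cut sets.
        # A gate is ("AND"|"OR", [children]); leaves are basic-event names.
        def cut_sets(node):
            if isinstance(node, str):                 # basic event
                return {frozenset([node])}
            gate, children = node
            child_sets = [cut_sets(c) for c in children]
            if gate == "OR":                          # union of children's cut sets
                result = set().union(*child_sets)
            elif gate == "AND":                       # cross-product of children's cut sets
                result = {frozenset()}
                for cs in child_sets:
                    result = {a | b for a in result for b in cs}
            else:
                raise ValueError(f"unknown gate {gate}")
            return {c for c in result if not any(o < c for o in result)}   # drop supersets

        # Hypothetical top event: (pump fails AND valve fails) OR power loss
        tree = ("OR", [("AND", ["pump_fails", "valve_fails"]), "power_loss"])
        for cs in sorted(cut_sets(tree), key=len):
            print(sorted(cs))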

  17. Alice Meets Bob: A Comparative Usability Study of Wireless Device Pairing Methods for a "Two-User" Setting

    Kumar, Arun; Uzun, Ersin

    2009-01-01

    When users want to establish wireless communication between or among their devices, the channel has to be bootstrapped first. To prevent malicious control of, or eavesdropping over, the communication, the channel should be authenticated and confidential. The process of setting up a secure communication channel between two previously unassociated devices is referred to as "Secure Device Pairing". When there is no prior security context, e.g., shared secrets, common key servers or public key certificates, device pairing requires user involvement in the process. The idea usually involves leveraging an auxiliary human-perceptible channel to authenticate the data exchanged over the insecure wireless channel. We observe that the focus of prior research has mostly been limited to pairing scenarios where a single user controls both devices. In this paper, we consider more general and emerging "two-user" scenarios, where two different users establish pairing between their respective devices. Although a num...
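
    A common way to use a human-perceptible auxiliary channel is to have both devices display a short code derived from the data exchanged over the radio, which the two users then compare out loud; the sketch below is our own illustration of that idea, not the protocol studied in the paper, and real pairing schemes add commitments and nonces.

        # Derive a short comparison code from each device's view of the key
        # exchange.  If an attacker altered the exchange, the two displayed
        # codes differ with high probability and the users abort the pairing.
        import hashlib

        def short_auth_string(initiator_key: bytes, responder_key: bytes, digits: int = 6) -> str:
            digest = hashlib.sha256(initiator_key + responder_key).digest()
            return f"{int.from_bytes(digest[:4], 'big') % 10**digits:0{digits}d}"

        alice_view = (b"alice-public-key", b"bob-public-key")   # hypothetical key material
        bob_view   = (b"alice-public-key", b"bob-public-key")
        print("Alice's device shows:", short_auth_string(*alice_view))
        print("Bob's device shows:  ", short_auth_string(*bob_view))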

  18. Comparison of three commercially available DIGE analysis software packages: minimal user intervention in gel-based proteomics.

    Kang, Yunyi; Techanukul, Tanasit; Mantalaris, Anthanasios; Nagy, Judit M

    2009-02-01

    The success of high-performance differential gel electrophoresis using fluorescent dyes (DIGE) depends on the quality of the digital image captured after electrophoresis, the DIGE-enabled image analysis software tool chosen for highlighting the differences, and the statistical analysis. This study compares three commonly available DIGE-enabled software packages for the first time: DeCyder V6.5 (GE Healthcare), Progenesis SameSpots V3.0 (Nonlinear Dynamics), and Dymension 3 (Syngene). DIGE gel images of cell culture media samples conditioned by HepG2 and END2 cell lines were used to evaluate the software packages both quantitatively and subjectively, considering ease of use with minimal user intervention. Consistency of spot matching across the three software packages was compared, focusing on the top fifty spots ranked statistically by each package. In summary, Progenesis SameSpots outperformed the other two software packages in matching accuracy, possibly benefiting from its new approach of using an identical spot outline across all the gels. Interestingly, the statistical analyses of the software packages were not consistent, on account of differences in workflow, algorithms, and default settings. Results obtained for protein fold changes were substantially different in each package, which indicates that, in spite of using internal standards, quantification is software dependent. A future research goal must be to reduce or eliminate user-controlled settings, either by automatic sample-to-sample optimization by intelligent software, or by alternative parameter-free segmentation methods. PMID:19133722

  19. Advertisement Analysis: A Comparative Critical Study

    Noureldin Mohamed Abdelaal

    2014-12-01

    This study aimed at analyzing two advertisements and investigating how advertisers use discourse and semiotics to make people and customers buy into their ideas, beliefs, or simply their products. The two advertisements analyzed are for beauty products and were selected from internet magazines. The methodology adopted in this study is qualitative. The first advertisement is analyzed qualitatively in terms of content, with no specific theoretical framework, while the analysis of the second advertisement is based on Fairclough's critical discourse analysis framework.

  20. Wellness Model of Supervision: A Comparative Analysis

    Lenz, A. Stephen; Sangganjanavanich, Varunee Faii; Balkin, Richard S.; Oliver, Marvarene; Smith, Robert L.

    2012-01-01

    This quasi-experimental study compared the effectiveness of the Wellness Model of Supervision (WELMS; Lenz & Smith, 2010) with alternative supervision models for developing wellness constructs, total personal wellness, and helping skills among counselors-in-training. Participants were 32 master's-level counseling students completing their…

  1. Comparative analysis of Orem's and King's theories.

    Hanucharurnkul, S

    1989-05-01

    Dorothea Orem and Imogene King are two nursing theorists who are contributing significantly to the development of nursing knowledge. This paper compares the similarities and differences in their strategies for theory development, their views of nursing metaparadigm concepts, and their theories of nursing system and goal attainment in terms of scope, usefulness, and their unique contribution to nursing science. PMID:2738232

  2. Corporate Social Responsibility: A Global Comparative Analysis

    Mikalsen, Maiken Foss

    2014-01-01

    The thesis examines the development of corporate social responsibility (CSR) in a global context and compares the practice of CSR in different countries around the world. Furthermore, it discusses whether or not CSR should be regulated by law.

  3. Comparative Distributions of Hazard Modeling Analysis

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison of the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We illustrate the flexibility of the hazard modeling distribution, which can approximate different distributions.
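
    For readers unfamiliar with the terminology, the hazard function of a lifetime distribution is h(t) = f(t)/S(t), the instantaneous failure rate given survival to time t. The sketch below (ours, not the paper's code) compares the hazards implied by exponential and Weibull models, the kind of behavior such a simulation study examines.

        # Hazard functions of two common lifetime models: the exponential
        # hazard is constant; the Weibull hazard rises (k > 1) or falls (k < 1).
        def exponential_hazard(t, rate):
            return rate                                         # memoryless

        def weibull_hazard(t, shape, scale):
            return (shape / scale) * (t / scale) ** (shape - 1)

        for t in (0.5, 1.0, 2.0, 4.0):
            print(f"t={t:3.1f}  exponential(rate=0.5): {exponential_hazard(t, 0.5):.3f}  "
                  f"Weibull(k=2, scale=2): {weibull_hazard(t, 2.0, 2.0):.3f}")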

  4. General Education Requirements: A Comparative Analysis

    Warner, Darrell B.; Koeppel, Katie

    2009-01-01

    While "general education" is a phrase heavily used in higher education, Leskes and Wright note that it has multiple meanings: it can refer to those courses that a college or university requires all of its students must pass as a condition for graduation, a common curriculum, a distribution requirement, or even core texts. This analysis of general…

  5. Analysis of chromosomal aberrations, micronuclei and hematological disorders among workers of wireless communication instruments and cell phone (Mobile) users

    This study was carried out to investigate the hazardous effects of electromagnetic radiation (EMR), such as chromosomal aberrations, disturbed micronucleus formation and hematological disorders, that may be detected among workers using wireless communication instruments and mobile phone users. Seven individuals (3 males and 4 females) working in the microwave unit of the central wireless station and 7 mobile phone users (4 males and 3 females) volunteered to give blood samples. Chromosome and micronucleus preparations were made for cytogenetic analysis, and blood films were prepared for differential counts. The results obtained in the microwave group indicated that the total of all types of aberrations (chromosome and chromatid aberrations) had a frequency of 6.14% for the exposed group, whereas the frequency in the control group amounted to 1.57%. In mobile phone users, the total of all types of aberrations (chromosome and chromatid aberrations) had a frequency of 4.43% for the exposed group and 1.71% for the control group. The incidence of the total number of micronuclei in the exposed microwave group was increased 4.3-fold compared with that of the control group. The incidence of the total number of micronuclei in the exposed mobile phone group was increased 2-fold compared with that in the control group. On the other hand, normal ranges of total white blood cell counts were determined for mobile phone users, but abnormalities in the differential counts of the different types of white blood cells, such as neutropenia, eosinophilia and lymphocytosis, were observed in individuals 1, 2, 3 and 7 in the microwave group.

  6. Industrialization Lessons from BRICS: A Comparative Analysis

    Naudé, Wim A.; Szirmai, Adam; Lavopa, Alejandro

    2013-01-01

    To date there have been few systematic and comparative empirical analyses of the nature of economic development in Brazil, Russia, India, China and South Africa (BRICS). We contribute to addressing this gap by exploring the patterns of structural change between 1980 and 2010, focusing on the manufacturing sector. We show that three of the BRICS are experiencing de-industrialization (Brazil, Russia and South Africa). China is the only country where an expanding manufacturing sector accounts for...

  7. Comparative Analysis of Frames with Varying Inertia

    Prerana Nampalli; Prakarsh Sangave

    2015-01-01

    This paper presents the elastic seismic response of reinforced concrete frames with three variations in height, i.e., (G+2), (G+4), and (G+6) storey models, compared for bare frames and frames with brick infill. The structures have been analyzed for gravity as well as seismic forces, and their response is studied as the geometric parameters vary, with a view to predicting the behavior of similar structures subjected to similar loads or load combinations. In this study, two different cas...

  8. Loss Given Default Modelling: Comparative Analysis

    Yashkir, Olga; Yashkir, Yuriy

    2013-01-01

    In this study we investigated several of the most popular Loss Given Default (LGD) models (LSM, Tobit, Three-Tiered Tobit, Beta Regression, Inflated Beta Regression, Censored Gamma Regression) in order to compare their performance. We show that for a given input data set, the quality of the model calibration depends mainly on the proper choice (and availability) of explanatory variables (model factors), but not on the fitting model. Model factors were chosen based on the amplitude of their correlati...

  9. Comparative study of FDMA, TDMA and hybrid 30/20 GHz satellite communications systems for small users

    Berk, G.; Jean, P. N.; Rotholz, E.

    1982-01-01

    This study compares several satellite uplink and downlink accessing schemes for a Customer Premises Service. Four conceptual system designs are presented: Satellite-Routed FDMA, Frequency-Routed TDMA, Satellite-Switched TDMA, and Processor-Routed TDMA, operating in the 30/20 GHz band. The designs are compared on the basis of estimated satellite weight, power consumption, and cost. The system capacities are analyzed for a fixed multibeam coverage of CONUS. Analysis shows that the system capacity is limited by the available satellite resources and by the terminal size and cost.

  10. Analysis and comparison of animation techniques

    Joštová, Barbora

    2015-01-01

    This thesis is focused on the analysis and comparison of animation techniques. In the theoretical part of the thesis I define key terms, the historical development and the basic principles of animation techniques. In the practical part I describe the comparison between classic and digital types of animation. Based on this research I chose the most suitable animations, which are then used to verify my hypothesis. The hypothesis concerns the ordering of animation techniques based on how demanding each is in terms of...

  11. Measuring Sustainability of Nations: a Comparative Analysis

    Priscilla Altili; Pietro Zoppoli

    2011-01-01

    Indicators are widely used tools to measure the sustainability of a nation. The definition of a concept strictly conditions its measurement; therefore, the definition of sustainable development determines the appropriateness of the variables selected for its measurement. The first part of this paper explores why it is impossible, at the present time, to have a univocal definition and measurement of sustainable development. The second part is devoted to the analysis of a set of synthetic i...

  12. Towards Analyzing Alternatives of Interaction Design Based on Verbal Decision Analysis of User Experience

    Marília Soares Mendes

    2010-04-01

    In domains (such as digital TV, smart homes, and tangible interfaces) that represent a new paradigm of interactivity, deciding on the most appropriate interaction design solution is a challenge. HCI researchers have promoted validating alternative design solutions with users before producing the final solution. User experience with technology is a subject that has also gained ground in these works as a way to analyze the appropriate solution(s). Following this concept, a study was conducted with the objective of finding a better interaction solution for a mobile TV application. Three executable mobile TV prototypes were built. A Verbal Decision Analysis model was applied to investigate the preferred characteristics of each prototype based on the users' experience and their intentions of use. This model guided a qualitative analysis that informed the design of a new prototype.

  13. [Amaranth flour: characteristics, comparative analysis, application possibilities].

    Zharkov, I M; Miroshnichenko, L A; Zviagin, A A; Bavykina, I A

    2014-01-01

    Amaranth flour, a product of amaranth seed processing, is a valuable industrial raw material that has a unique chemical composition and may be used for the nutrition of people suffering from intolerance to traditional cereal proteins, including celiac disease patients. The research aim was to study the composition of two types of amaranth flour compared with semolina, which is traditionally used for nutrition by the Russian population, as well as to compare the composition of milk amaranth flour porridge with milk semolina porridge. The composition of amaranth whole-ground flour and premium-grade amaranth flour processed from amaranth seeds grown in the Voronezh region has been researched. The protein content in amaranth flour was 10.8-24.3% higher than in semolina, and its biological value and NPU coefficient were higher by 22.65 and 46.51%, respectively; the lysine score in premium-grade amaranth flour protein came up to 107.54%, compared with only 40.95% in semolina protein. The level of digestible carbohydrates, including starch, was lower in amaranth flour than in semolina by 2.79-12.85 and 4.76-15.85%, respectively, while fiber content was 15.5-30-fold higher. Fat content in premium-grade amaranth flour was 2.4-fold lower than in whole-ground amaranth flour but 45% higher than in semolina. The main advantage of amaranth flour protein compared to wheat protein is the predominance of albumins and globulins, a minimal content of prolamins and a complete absence of alpha-gliadin. The specifics of the chemical composition allow amaranth flour to be recommended for inclusion in the nutrition of both healthy children and adults and also celiac disease patients. PMID:25059059

  14. Comparative analysis of some search engines

    Taiwo O. Edosomwan; Joseph Edosomwan

    2010-01-01

    We compared the information retrieval performances of some popular search engines (namely Google, Yahoo, AlltheWeb, Gigablast, Zworks, AltaVista and Bing/MSN) in response to a list of ten queries, varying in complexity. These queries were run on each search engine and the precision and response time of the retrieved results were recorded. The first ten documents on each retrieval output were evaluated as being ‘relevant’ or ‘non-relevant’ for evaluation of the search engine’s precision. T...

  15. Knowledge Level Assessment in e-Learning Systems Using Machine Learning and User Activity Analysis

    Nazeeh Ghatasheh

    2015-01-01

    Electronic learning has been one of the foremost trends in education so far. Such importance draws attention to an important shift in the educational paradigm. Due to the complexity of the evolving paradigm, the prospective dynamics of learning require an evolution of knowledge delivery and evaluation. This research work tries to put forward a futuristic design of an autonomous and intelligent e-Learning system, in which machine learning and user activity analysis play the role of an auto...
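
    As a toy illustration of inferring a knowledge level from user-activity data (the feature names, labels, and model choice are our assumptions, not the system described in this record), a small classifier could be trained as follows.

        # Predict a learner's knowledge level from synthetic activity features.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        # columns: quiz_score, time_on_task_minutes, forum_posts
        X = rng.random((200, 3)) * [100, 300, 20]
        # synthetic labelling rule, driven mostly by quiz score
        y = np.where(X[:, 0] > 70, "advanced",
                     np.where(X[:, 0] > 40, "intermediate", "novice"))

        model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print("predicted level:", model.predict([[65.0, 120.0, 4.0]])[0])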

  16. Shape Analysis of 3D Head Scan Data for U.S. Respirator Users

    Stephanie Lynch; Viscusi, Dennis J.; Stacey Benson; Slice, Dennis E.; Ziqing Zhuang

    2010-01-01

    In 2003, the National Institute for Occupational Safety and Health (NIOSH) conducted a head-and-face anthropometric survey of diverse, civilian respirator users. Of the 3,997 subjects measured using traditional anthropometric techniques, surface scans and 26 three-dimensional (3D) landmark locations were collected for 947 subjects. The objective of this study was to report the size and shape variation of the survey participants using the 3D data. Generalized Procrustes Analysis (GPA) was con...
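
    For readers unfamiliar with the method, Generalized Procrustes Analysis iteratively translates, scales, and rotates each landmark configuration onto a running mean shape; the simplified sketch below uses random 26-landmark configurations rather than the NIOSH scan data.

        # Simplified Generalized Procrustes Analysis for 3D landmark sets.
        import numpy as np

        def align(shape, reference):
            """Ordinary Procrustes fit: centre, scale, and rotate `shape` onto `reference`."""
            a = shape - shape.mean(axis=0)
            b = reference - reference.mean(axis=0)
            a = a / np.linalg.norm(a)
            b = b / np.linalg.norm(b)
            u, _, vt = np.linalg.svd(a.T @ b)
            return a @ (u @ vt)                       # optimal rotation (reflection not excluded)

        def gpa(shapes, iterations=10):
            aligned = [s - s.mean(axis=0) for s in shapes]
            reference = aligned[0]
            for _ in range(iterations):
                aligned = [align(s, reference) for s in aligned]
                reference = np.mean(aligned, axis=0)  # consensus (mean) shape
            return aligned, reference

        rng = np.random.default_rng(1)
        base = rng.random((26, 3))                    # 26 landmarks, as in the survey
        shapes = [base + rng.normal(scale=0.02, size=base.shape) for _ in range(5)]
        aligned, consensus = gpa(shapes)
        print("mean squared residual:", float(np.mean([(s - consensus) ** 2 for s in aligned])))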

  17. Crawling Ajax-based Web Applications through Dynamic Analysis of User Interface State Changes

    Mesbah, A.; Van Deursen, A.; Lenselink, S.

    2011-01-01

    Using JavaScript and dynamic DOM manipulation on the client-side of web applications is becoming a widespread approach for achieving rich interactivity and responsiveness in modern web applications. At the same time, such techniques, collectively known as Ajax, shatter the metaphor of web ‘pages’ with unique URLs, on which traditional web crawlers are based. This paper describes a novel technique for crawling Ajax-based applications through automatic dynamic analysis of user interface state c...
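
    The core idea, inferring a state-flow graph by firing user-interface events and recording the resulting DOM states, can be sketched abstractly as below; the browser driver and the toy application are stand-ins we invented, not the crawler described in the paper.

        # Breadth-first exploration of UI states reached by firing events.
        from collections import deque

        def crawl(initial_state, clickables, fire_event):
            """clickables(state) -> iterable of event ids available in `state`
            fire_event(state, event) -> resulting (hashable) DOM state"""
            seen, edges, queue = {initial_state}, [], deque([initial_state])
            while queue:
                state = queue.popleft()
                for event in clickables(state):
                    nxt = fire_event(state, event)
                    edges.append((state, event, nxt))
                    if nxt not in seen:               # new DOM state: schedule it
                        seen.add(nxt)
                        queue.append(nxt)
            return seen, edges

        # Toy client-side app: two views toggled by a button, plus a details panel.
        app = {"home":     {"open_details": "details", "toggle": "home_alt"},
               "home_alt": {"toggle": "home"},
               "details":  {"back": "home"}}
        states, edges = crawl("home", lambda s: app[s], lambda s, e: app[s][e])
        print(len(states), "states,", len(edges), "transitions")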

  18. Configuration Analysis Tool (CAT). System Description and users guide (revision 1)

    Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.

    1982-01-01

    A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.

  19. An analysis of search-based user interaction on the semantic web

    Hildebrand, M; Ossenbruggen, van, Jacco; Hardman, HL Lynda

    2007-01-01

    Many Semantic Web applications provide access to their resources through text-based search queries, using explicit semantics to improve the search results. This paper provides an analysis of the current state of the art in semantic search, based on 35 existing systems. We identify different types of semantic search features that are used during query construction, the core search process, the presentation of the search results and user feedback on query and results. For each of these, we cons...

  20. Comparative analysis of life insurance market

    Malynych, Anna Mykolayivna

    2011-05-01

    The article provides a comprehensive statistical analysis of the development of the world and regional life insurance markets on the basis of macroeconomic indicators. The author locates the domestic life insurance market on the global scale, analyzes its development and suggests a method for calculating a marketing life insurance index. The method is also tested on a database of 77 countries around the world, and the author defines a national rating on the basis of the marketing life insurance index.

  1. A comparative analysis of capacity adequacy policies

    In this paper a stochastic dynamic optimization model is used to analyze the effect of different generation adequacy policies in restructured power systems. The expansion decisions of profit-maximizing investors are simulated under a number of different market designs: Energy Only with and without a price cap, Capacity Payment, Capacity Obligation, Capacity Subscription, and Demand Elasticity. The results show that the overall social welfare is reduced compared to a centralized social welfare optimization for all policies except Capacity Subscription and Demand Elasticity. In particular, an energy only market with a low price cap leads to a significant increase in involuntary load shedding. Capacity payments and obligations give additional investment incentives and more generating capacity, but also result in a considerable transfer of wealth from consumers to producers due to the capacity payments. Increased demand elasticity increases social welfare, but also results in a transfer from producers to consumers, compared to the theoretical social welfare optimum. In contrast, the capacity subscription policy increases the social welfare, and both producers and consumers benefit. This is possible because capacity subscription explicitly utilizes differences in consumers' preferences for uninterrupted supply. This advantage must be weighed against the cost of implementation, which is not included in the model.

  2. Comparing structural decomposition analysis and index decomposition analysis

    To analyze and understand historical changes in economic, environmental, employment or other socio-economic indicators, it is useful to assess the driving forces or determinants that underlie these changes. Two techniques for decomposing indicator changes at the sector level are structural decomposition analysis (SDA) and index decomposition analysis (IDA). For example, SDA and IDA have been used to analyze changes in indicators such as energy use, CO2-emissions, labor demand and value added. The changes in these variables are decomposed into determinants such as technological, demand, and structural effects. SDA uses information from input-output tables while IDA uses aggregate data at the sector-level. The two methods have developed quite independently, which has resulted in each method being characterized by specific, unique techniques and approaches. This paper has three aims. First, the similarities and differences between the two approaches are summarized. Second, the possibility of transferring specific techniques and indices is explored. Finally, a numerical example is used to illustrate differences between the two approaches
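
    To make the decomposition idea concrete, the toy example below performs an additive, LMDI-style index decomposition of a change in emissions E = A * I (activity times intensity) per sector; the numbers are invented and the example is ours, not the paper's.

        # Additive LMDI decomposition: the activity and intensity effects sum
        # exactly to the observed change in emissions.
        import math

        def logmean(a, b):
            return a if a == b else (a - b) / (math.log(a) - math.log(b))

        sectors = {               # sector: (A0, I0, A1, I1), invented numbers
            "industry":  (100.0, 2.0, 120.0, 1.7),
            "transport": ( 80.0, 1.5,  90.0, 1.6),
        }

        activity_effect = intensity_effect = total_change = 0.0
        for a0, i0, a1, i1 in sectors.values():
            e0, e1 = a0 * i0, a1 * i1
            w = logmean(e1, e0)
            activity_effect  += w * math.log(a1 / a0)
            intensity_effect += w * math.log(i1 / i0)
            total_change     += e1 - e0

        print(f"total change     {total_change:8.2f}")
        print(f"activity effect  {activity_effect:8.2f}")
        print(f"intensity effect {intensity_effect:8.2f}")
        print(f"sum of effects   {activity_effect + intensity_effect:8.2f}")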

  3. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users Manual

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios and radiological sabotage events, and to evaluate safety basis accident consequences. This user's manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods
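
    One of the basic calculations such a code performs, decay of a nuclide during a transport delay, reduces to A(t) = A0 * exp(-lambda * t) with lambda = ln(2) / T_half. The minimal sketch below is ours, not RSAC code, and omits daughter ingrowth and filtration.

        # Activity remaining after a holdup time, A(t) = A0 * exp(-lambda * t).
        import math

        def activity_after(initial_activity_bq, half_life_s, delay_s):
            decay_constant = math.log(2) / half_life_s
            return initial_activity_bq * math.exp(-decay_constant * delay_s)

        # e.g. I-131 (half-life about 8.02 days) after a 24-hour holdup
        print(f"{activity_after(1.0e12, 8.02 * 86400, 86400):.3e} Bq remaining")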

  4. Resilience and electricity systems: A comparative analysis

    Electricity systems have generally evolved based on the natural resources available locally. Few metrics exist to compare the security of electricity supply of different countries despite the increasing likelihood of potential shocks to the power system like energy price increases and carbon price regulation. This paper seeks to calculate a robust measure of national power system resilience by analysing each step in the process of transformation from raw energy to consumed electricity. Countries with sizeable deposits of mineral resources are used for comparison because of the need for electricity-intensive metals processing. We find that shifts in electricity-intensive industry can be predicted based on countries' power system resilience. - Highlights: ► We establish a resilience index measure for major electricity systems. ► We examine a range of OECD and developing nations electricity systems and their ability to cope with shocks. ► Robustness measures are established to show resilience of electricity systems.

  5. Comparative analysis of some search engines

    Taiwo O. Edosomwan

    2010-10-01

    We compared the information retrieval performances of some popular search engines (namely Google, Yahoo, AlltheWeb, Gigablast, Zworks, AltaVista and Bing/MSN) in response to a list of ten queries, varying in complexity. These queries were run on each search engine and the precision and response time of the retrieved results were recorded. The first ten documents on each retrieval output were evaluated as being ‘relevant’ or ‘non-relevant’ for evaluation of the search engine’s precision. To evaluate response time, normalised recall ratios were calculated at various cut-off points for each query and search engine. This study shows that Google appears to be the best search engine in terms of both average precision (70%) and average response time (2 s). Gigablast and AlltheWeb performed the worst overall in this study.
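
    Both measures used here are simple to compute: precision at 10 is the fraction of the first ten results judged relevant, and a normalised recall ratio compares the ranks at which relevant documents actually appear with the best possible ranks. The sketch below uses invented relevance judgments for a single query.

        # Precision at 10 and normalised recall for one query.
        def precision_at_k(judgments, k=10):
            return sum(judgments[:k]) / k

        def normalised_recall(relevant_ranks, n_results):
            """R_norm = 1 - (sum of actual ranks - sum of ideal ranks) / (n * (N - n))."""
            n = len(relevant_ranks)
            ideal = n * (n + 1) // 2
            return 1.0 - (sum(relevant_ranks) - ideal) / (n * (n_results - n))

        judgments = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]        # 1 = relevant
        ranks = [i + 1 for i, rel in enumerate(judgments) if rel]
        print("P@10  :", precision_at_k(judgments))
        print("Rnorm :", round(normalised_recall(ranks, n_results=10), 3))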

  6. MONITORING POTENTIAL DRUG INTERACTIONS AND REACTIONS VIA NETWORK ANALYSIS OF INSTAGRAM USER TIMELINES.

    Correia, Rion Brattig; Li, Lang; Rocha, Luis M

    2016-01-01

    Much recent research aims to identify evidence for Drug-Drug Interactions (DDI) and Adverse Drug Reactions (ADR) from the biomedical scientific literature. In addition to this "Bibliome", the universe of social media provides a very promising source of large-scale data that can help identify DDI and ADR in ways that have not been hitherto possible. Given the large number of users, analysis of social media data may be useful to identify under-reported, population-level pathology associated with DDI, thus further contributing to improvements in population health. Moreover, tapping into this data allows us to infer drug interactions with natural products, including cannabis, which constitute an array of DDI very poorly explored by biomedical research thus far. Our goal is to determine the potential of Instagram for public health monitoring and surveillance for DDI, ADR, and behavioral pathology at large. Most social media analysis focuses on Twitter and Facebook, but Instagram is an increasingly important platform, especially among teens, with unrestricted access to public posts, high availability of posts with geolocation coordinates, and images to supplement textual analysis. Using drug, symptom, and natural product dictionaries for the identification of the various types of DDI and ADR evidence, we have collected close to 7000 user timelines spanning from October 2010 to June 2015. We report on 1) the development of a monitoring tool to easily observe user-level timelines associated with drug and symptom terms of interest, and 2) population-level behavior via the analysis of co-occurrence networks computed from user timelines at three different scales: monthly, weekly, and daily occurrences. Analysis of these networks further reveals 3) drug and symptom direct and indirect associations with greater support in user timelines, as well as 4) clusters of symptoms and drugs revealed by the collective behavior of the observed population. This demonstrates that Instagram
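
    A term co-occurrence network of the kind described can be assembled from dictionaries and user timelines in a few lines; the sketch below uses invented posts and a tiny dictionary, not the study's data, and counts co-occurrences within weekly windows.

        # Weekly co-occurrence network of drug/symptom/natural-product terms.
        from collections import Counter
        from itertools import combinations

        dictionary = {"fluoxetine", "ibuprofen", "headache", "insomnia", "cannabis"}

        timelines = {                      # user -> list of (iso_week, post_text)
            "user_a": [("2015-W01", "no sleep again #insomnia, started fluoxetine"),
                       ("2015-W01", "headache all day, took ibuprofen")],
            "user_b": [("2015-W02", "cannabis helps my insomnia")],
        }

        edges = Counter()
        for posts in timelines.values():
            weeks = {}
            for week, text in posts:
                weeks.setdefault(week, set()).update(t for t in dictionary if t in text.lower())
            for terms in weeks.values():                       # co-occurrence within one week
                for a, b in combinations(sorted(terms), 2):
                    edges[(a, b)] += 1

        for (a, b), weight in edges.most_common():
            print(f"{a} -- {b}  (weight {weight})")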

  7. Comparative Marketing: An Interdisciplinary Framework for Institutional Analysis

    Gopalkrishnan R Iyer

    1997-01-01

    Institutional analysis is proposed as an alternative theoretical methodology for the study of comparative marketing systems. This paper argues that institutional analysis offers considerable potential for understanding dynamic marketing systems and for the explicit study of change. Disciplinary insights of institutional analysis are reviewed and the richness of the conceptual apparatus of comparative institutional analysis, as applied to the study of comparative marketing systems, is explica...

  8. Comparative Modal Analysis of Sieve Hardware Designs

    Thompson, Nathaniel

    2012-01-01

    The CMTB Thwacker hardware operates as a testbed analogue for the Flight Thwacker and Sieve components of CHIMRA, a device on the Curiosity Rover. The sieve separates particles with a diameter smaller than 150 microns for delivery to onboard science instruments. The sieving behavior of the testbed hardware should be similar to that of the flight hardware for the results to be meaningful. The elastodynamic behavior of both sieves was studied analytically using the Rayleigh-Ritz method in conjunction with classical plate theory. Finite element models were used to determine the mode shapes of both designs, and comparisons between the natural frequencies and mode shapes were made. The analysis predicts that the performance of the CMTB Thwacker will closely resemble the performance of the Flight Thwacker within the expected steady state operating regime. Excitations of the testbed hardware that will mimic the flight hardware were recommended, as were those that will improve the efficiency of the sieving process.
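
    As background for the comparison, classical plate theory gives the natural frequencies of a simply supported rectangular plate in closed form: omega_mn = pi^2 * ((m/a)^2 + (n/b)^2) * sqrt(D / (rho * h)), with flexural rigidity D = E*h^3 / (12*(1 - nu^2)). The sketch below uses generic aluminium plate values, not the Thwacker or sieve geometry.

        # First few natural frequencies of a simply supported rectangular plate.
        import math

        E, nu, rho = 70e9, 0.33, 2700.0       # Pa, -, kg/m^3 (generic aluminium)
        a, b, h = 0.20, 0.15, 0.002           # plate dimensions and thickness, m

        D = E * h**3 / (12.0 * (1.0 - nu**2))
        for m, n in [(1, 1), (1, 2), (2, 1), (2, 2)]:
            omega = math.pi**2 * ((m / a)**2 + (n / b)**2) * math.sqrt(D / (rho * h))
            print(f"mode ({m},{n}): {omega / (2 * math.pi):7.1f} Hz")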

  9. Comparative analysis of cystatin superfamily in platyhelminths.

    Aijiang Guo

    The cystatin superfamily comprises cysteine proteinase inhibitors and encompasses at least 3 subfamilies: stefins, cystatins and kininogens. In this study, the platyhelminth cystatin superfamily was identified and grouped into stefin and cystatin subfamilies. The conserved domain of stefins (G, QxVxG) was observed in all members of the platyhelminth stefins. The three characteristics of cystatins, the cystatin-like domain (G, QxVxG, PW), a signal peptide, and one or two conserved disulfide bonds, were observed in platyhelminths, with the exception of cestodes, which lacked the conserved disulfide bond. However, it is noteworthy that cestode cystatins had two tandem repeated domains, although the second tandem repeated domain did not contain a cystatin-like domain, which has not been previously reported. Tertiary structure analysis of Taenia solium cystatin, one of the cestode cystatins, demonstrated that the N-terminus of T. solium cystatin formed a five-turn α-helix, a five-stranded β-pleated sheet and a hydrophobic edge, similar to the structure of chicken cystatin. Although no conserved disulfide bond was found in T. solium cystatin, the models of T. solium cystatin and chicken cystatin corresponded at the site of the first disulfide bridge of chicken cystatin. However, the two models were not similar regarding the location of the second disulfide bridge of chicken cystatin. These results showed that T. solium cystatin and chicken cystatin had similarities and differences, suggesting that the biochemistry of T. solium cystatin could be similar to that of chicken cystatin in its inhibitory function and that it may have further functional roles. The same results were obtained for other cestode cystatins. Phylogenetic analysis showed that cestode cystatins constituted an independent clade, implying that cestode cystatins should be considered to have formed a new clade during evolution.

  10. User's manual for the data analysis system for monitoring the fuel oil spill at the Sandia National Laboratories installation in Livermore, California

    Widing, M.A.; Leser, C.C.

    1995-04-01

    This report describes the use of the data analysis software developed by Argonne National Laboratory (ANL) and installed at the fuel oil spill site at Sandia National Laboratories. This software provides various programs for analyzing the data from physical and chemical sensors. This manual provides basic information on the design and use of these user interfaces. Analysts use these interfaces to evaluate the site data. Four software programs included in the data analysis software suite provide the following capabilities: physical data analysis, chemical data entry, chemical data analysis, and data management.