WorldWideScience

Sample records for empirical bayes analysis

  1. An Empirical Bayes Approach to Mantel-Haenszel DIF Analysis.

    Science.gov (United States)

    Zwick, Rebecca; Thayer, Dorothy T.; Lewis, Charles

    1999-01-01

    Developed an empirical Bayes enhancement to Mantel-Haenszel (MH) analysis of differential item functioning (DIF) in which it is assumed that the MH statistics are normally distributed and that the prior distribution of underlying DIF parameters is also normal. (Author/SLD)
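The normal-normal setup this record describes reduces to a simple shrinkage rule: estimate the prior mean and variance from the collection of item statistics, then pull each item's MH statistic toward the prior mean in proportion to its sampling error. A minimal method-of-moments sketch (item values and standard errors below are hypothetical, not from the study):

```python
import statistics

def eb_shrink(stats, ses):
    """Empirical Bayes normal-normal shrinkage.

    stats: observed MH DIF statistics, one per item
    ses:   their standard errors
    Returns posterior means of the underlying DIF parameters.
    """
    mu = statistics.fmean(stats)                     # prior mean, estimated from the data
    var_obs = statistics.variance(stats)             # total observed variance
    mean_se2 = statistics.fmean(s * s for s in ses)  # average sampling variance
    tau2 = max(var_obs - mean_se2, 0.0)              # method-of-moments prior variance
    post = []
    for x, s in zip(stats, ses):
        w = tau2 / (tau2 + s * s)                    # shrinkage weight for this item
        post.append(w * x + (1 - w) * mu)
    return post

dif = [-1.2, -0.3, 0.1, 0.4, 1.5]   # hypothetical MH DIF statistics
se = [0.5, 0.4, 0.4, 0.5, 0.6]      # hypothetical standard errors
print(eb_shrink(dif, se))           # estimates pulled toward the overall mean
```

Items measured with larger standard errors are shrunk hardest, which is what stabilizes DIF estimates for sparsely answered items.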

  2. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Camara Vincent A. R.

    1998-01-01

Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to changes in the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results. It is shown that empirical Bayes reliability functions are in general sensitive to the choice of the loss function, and that the squared error loss does not always yield the best empirical Bayes reliability estimate.

  3. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Vincent A. R. Camara

    1999-01-01

Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to changes in the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results.

  4. Developing a Clustering-Based Empirical Bayes Analysis Method for Hotspot Identification

    Directory of Open Access Journals (Sweden)

    Yajie Zou

    2017-01-01

Full Text Available Hotspot identification (HSID) is a critical part of network-wide safety evaluations. Typical methods for ranking sites are often rooted in using the Empirical Bayes (EB) method to estimate safety from both observed crash records and predicted crash frequency based on similar sites. The performance of the EB method is highly related to the selection of a reference group of sites (i.e., roadway segments or intersections similar to the target site) from which the safety performance functions (SPFs) used to predict crash frequency will be developed. As crash data often contain underlying heterogeneity that, in essence, can make them appear to be generated from distinct subpopulations, methods are needed to select similar sites in a principled manner. To overcome this possible heterogeneity problem, EB-based HSID methods that use common clustering methodologies (e.g., mixture models, K-means, and hierarchical clustering) to select “similar” sites for building SPFs are developed. Performance of the clustering-based EB methods is then compared using real crash data. Here, HSID results, when computed on Texas undivided rural highway crash data, suggest that all three clustering-based EB analysis methods are preferred over the conventional statistical methods. Thus, properly classifying the road segments for heterogeneous crash data can further improve HSID accuracy.
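The EB safety estimate underlying such HSID methods blends each site's observed crash count with its SPF prediction; with a negative binomial SPF, the weight on the prediction depends on the overdispersion parameter. A minimal sketch of that combination step (all values hypothetical; the clustering used to build the reference groups is omitted):

```python
def eb_expected_crashes(observed, mu, phi):
    """Empirical Bayes expected crash frequency for one site.

    observed: crash count recorded at the site
    mu:  SPF-predicted crash frequency from the site's reference group
    phi: overdispersion parameter of the negative binomial SPF
    """
    w = 1.0 / (1.0 + mu / phi)          # weight on the SPF prediction
    return w * mu + (1.0 - w) * observed

# Hypothetical segment: 9 observed crashes, SPF predicts 4, phi = 2
print(eb_expected_crashes(9, 4.0, 2.0))  # lands between 4 and 9
```

Sites are then ranked by this EB estimate (or by its excess over the SPF prediction), which tempers the regression-to-the-mean bias of ranking on raw counts.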

  5. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    Science.gov (United States)

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.

  6. Empirical Bayes Approaches to Multivariate Fuzzy Partitions.

    Science.gov (United States)

    Woodbury, Max A.; Manton, Kenneth G.

    1991-01-01

    An empirical Bayes-maximum likelihood estimation procedure is presented for the application of fuzzy partition models in describing high dimensional discrete response data. The model describes individuals in terms of partial membership in multiple latent categories that represent bounded discrete spaces. (SLD)

  7. Using Loss Functions for DIF Detection: An Empirical Bayes Approach.

    Science.gov (United States)

    Zwick, Rebecca; Thayer, Dorothy; Lewis, Charles

    2000-01-01

    Studied a method for flagging differential item functioning (DIF) based on loss functions. Builds on earlier research that led to the development of an empirical Bayes enhancement to the Mantel-Haenszel DIF analysis. Tested the method through simulation and found its performance better than some commonly used DIF classification systems. (SLD)

  8. Further Evaluation of Covariate Analysis using Empirical Bayes Estimates in Population Pharmacokinetics: the Perception of Shrinkage and Likelihood Ratio Test.

    Science.gov (United States)

    Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose

    2017-01-01

Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error for LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches from a previous report and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate impacting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause for decreased power or an inflated false positive rate, although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis compared to LRT. We proposed a three-step covariate modeling approach for population PK analysis to utilize the advantages of EBEs while overcoming their shortcomings, which not only markedly reduces the run time for population PK analysis but also provides more accurate covariate tests.

  9. Empirical Bayes conditional independence graphs for regulatory network recovery

    Science.gov (United States)

    Mahdi, Rami; Madduri, Abishek S.; Wang, Guoqing; Strulovici-Barel, Yael; Salit, Jacqueline; Hackett, Neil R.; Crystal, Ronald G.; Mezey, Jason G.

    2012-01-01

    Motivation: Computational inference methods that make use of graphical models to extract regulatory networks from gene expression data can have difficulty reconstructing dense regions of a network, a consequence of both computational complexity and unreliable parameter estimation when sample size is small. As a result, identification of hub genes is of special difficulty for these methods. Methods: We present a new algorithm, Empirical Light Mutual Min (ELMM), for large network reconstruction that has properties well suited for recovery of graphs with high-degree nodes. ELMM reconstructs the undirected graph of a regulatory network using empirical Bayes conditional independence testing with a heuristic relaxation of independence constraints in dense areas of the graph. This relaxation allows only one gene of a pair with a putative relation to be aware of the network connection, an approach that is aimed at easing multiple testing problems associated with recovering densely connected structures. Results: Using in silico data, we show that ELMM has better performance than commonly used network inference algorithms including GeneNet, ARACNE, FOCI, GENIE3 and GLASSO. We also apply ELMM to reconstruct a network among 5492 genes expressed in human lung airway epithelium of healthy non-smokers, healthy smokers and individuals with chronic obstructive pulmonary disease assayed using microarrays. The analysis identifies dense sub-networks that are consistent with known regulatory relationships in the lung airway and also suggests novel hub regulatory relationships among a number of genes that play roles in oxidative stress and secretion. Availability and implementation: Software for running ELMM is made available at http://mezeylab.cb.bscb.cornell.edu/Software.aspx. Contact: ramimahdi@yahoo.com or jgm45@cornell.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22685074

  10. Merging expert and empirical data for rare event frequency estimation: Pool homogenisation for empirical Bayes models

    International Nuclear Information System (INIS)

    Quigley, John; Hardman, Gavin; Bedford, Tim; Walls, Lesley

    2011-01-01

Empirical Bayes provides one approach to estimating the frequency of rare events as a weighted average of the frequencies of an event and a pool of events. The pool will draw upon, for example, events with similar precursors. The higher the degree of homogeneity of the pool, the more accurate the Empirical Bayes estimator will be. We propose and evaluate a new method using homogenisation factors under the assumption that events are generated from a Homogeneous Poisson Process. The homogenisation factors are scaling constants, which can be elicited through structured expert judgement and used to align the frequencies of different events, hence homogenising the pool. The estimation error relative to the homogeneity of the pool is examined theoretically, indicating that reduced error is associated with greater pool homogeneity. The effects of misspecified expert assessments of the homogenisation factors are examined theoretically and through simulation experiments. Our results show that the proposed Empirical Bayes method using homogenisation factors is robust under different degrees of misspecification.

  11. Empirical water depth predictions in Dublin Bay based on satellite EO multispectral imagery and multibeam data using spatially weighted geographical analysis

    Science.gov (United States)

    Monteys, Xavier; Harris, Paul; Caloca, Silvia

    2014-05-01

The coastal shallow water zone can be a challenging and expensive environment within which to acquire bathymetry and other oceanographic data using traditional survey methods. Dangers and limited swath coverage make some of these areas unfeasible to survey using ship-borne systems, and turbidity can preclude marine LIDAR. As a result, an extensive part of the coastline worldwide remains completely unmapped. Satellite EO multispectral data, after processing, allows timely, cost-efficient and quality-controlled information to be used for planning, monitoring, and regulating coastal environments. It has the potential to deliver repetitive derivation of medium resolution bathymetry, coastal water properties and seafloor characteristics in shallow waters. Over the last 30 years satellite passive imaging methods for bathymetry extraction, implementing analytical or empirical methods, have had limited success predicting water depths. Different wavelengths of solar light penetrate the water column to varying depths. They can provide acceptable results up to 20 m but become less accurate in deeper waters. The study area is located in the inner part of Dublin Bay, on the East coast of Ireland. The region investigated is a C-shaped inlet 10 km long and 5 km wide with water depths ranging from 0 to 10 m. The methodology employed in this research uses a ratio of reflectances from SPOT 5 satellite bands, differing from standard linear transform algorithms. High accuracy water depths were derived using multibeam data. The final empirical model uses spatially weighted geographical tools to retrieve predicted depths. The results of this paper confirm that SPOT satellite scenes are suitable to predict depths using empirical models in very shallow embayments. Spatial regression models show better adjustments in the predictions over non-spatial models. The spatial regression equation used provides realistic results down to 6 m below the water surface, with
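The band-ratio approach this record builds on can be sketched as an ordinary least-squares calibration of depth against the log ratio of two band reflectances; the paper's spatially weighted variant would instead let the coefficients vary across the bay. All band values and coefficients below are synthetic:

```python
import math

def fit_ratio_model(band1, band2, depths):
    """Fit a band-ratio depth model: depth ~ slope * ln(band1/band2) + intercept.

    band1, band2: reflectances in two spectral bands at calibration points
    depths:       co-located multibeam depths used for calibration
    Plain (non-spatial) least squares on the log band ratio.
    """
    x = [math.log(a / b) for a, b in zip(band1, band2)]
    n = len(x)
    mx, my = sum(x) / n, sum(depths) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, depths))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    return lambda a, b: slope * math.log(a / b) + intercept

# Synthetic calibration data: depth exactly linear in the log band ratio
band1 = [10.0, 8.0, 12.0, 9.0, 11.0]
band2 = [2.0, 2.5, 3.0, 4.0, 2.2]
depths = [5.0 * math.log(a / b) + 2.0 for a, b in zip(band1, band2)]
predict = fit_ratio_model(band1, band2, depths)
```

On real imagery the fit degrades with depth as the signal attenuates, which is why the record reports realistic predictions only in the shallowest few metres.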

  12. Bayes Empirical Bayes Inference of Amino Acid Sites Under Positive Selection

    DEFF Research Database (Denmark)

    Yang, Ziheng; Wong, Wendy Shuk Wan; Nielsen, Rasmus

    2005-01-01

The nonsynonymous/synonymous substitution rate ratio ω provides a measure of selective pressure at the protein level, with ω > 1 indicating positive selection. Statistical distributions are used to model the variation in ω among sites, allowing a subset of sites to have ω > 1 while the rest of the sequence may be under purifying selection with ω < 1. … probabilities that a site comes from the site class with ω > 1. Current implementations, however, use the naive EB (NEB) approach and fail to account for sampling errors in maximum likelihood estimates of model parameters, such as the proportions and ω ratios for the site classes. In small data sets lacking information, this approach may lead to unreliable posterior probability calculations. In this paper, we develop a Bayes empirical Bayes (BEB) approach to the problem, which assigns a prior to the model parameters and integrates over their uncertainties. We compare the new and old methods on real and simulated …

  13. Using empirical Bayes predictors from generalized linear mixed models to test and visualize associations among longitudinal outcomes.

    Science.gov (United States)

    Mikulich-Gilbertson, Susan K; Wagner, Brandie D; Grunwald, Gary K; Riggs, Paula D; Zerbe, Gary O

    2018-01-01

Medical research is often designed to investigate changes in a collection of response variables that are measured repeatedly on the same subjects. The multivariate generalized linear mixed model (MGLMM) can be used to evaluate random coefficient associations (e.g. simple correlations, partial regression coefficients) among outcomes that may be non-normal and differently distributed by specifying a multivariate normal distribution for their random effects and then evaluating the latent relationship between them. Empirical Bayes predictors are readily available for each subject from any mixed model and are observable and hence, plotable. Here, we evaluate whether second-stage association analyses of empirical Bayes predictors from a MGLMM provide a good approximation and visual representation of these latent association analyses using medical examples and simulations. Additionally, we compare these results with association analyses of empirical Bayes predictors generated from separate mixed models for each outcome, a procedure that could circumvent computational problems that arise when the dimension of the joint covariance matrix of random effects is large and prohibits estimation of latent associations. As has been shown in other analytic contexts, the p-values for all second-stage coefficients that were determined by naively assuming normality of empirical Bayes predictors provide a good approximation to p-values determined via permutation analysis. Analyzing outcomes that are interrelated with separate models in the first stage and then associating the resulting empirical Bayes predictors in a second stage results in different mean and covariance parameter estimates from the maximum likelihood estimates generated by a MGLMM. The potential for erroneous inference from using results from these separate models increases as the magnitude of the association among the outcomes increases. Thus if computable, scatterplots of the conditionally independent empirical Bayes

  14. Mixing Bayes and empirical Bayes inference to anticipate the realization of engineering concerns about variant system designs

    International Nuclear Information System (INIS)

    Quigley, John; Walls, Lesley

    2011-01-01

    Mixing Bayes and Empirical Bayes inference provides reliability estimates for variant system designs by using relevant failure data - observed and anticipated - about engineering changes arising due to modification and innovation. A coherent inference framework is proposed to predict the realization of engineering concerns during product development so that informed decisions can be made about the system design and the analysis conducted to prove reliability. The proposed method involves combining subjective prior distributions for the number of engineering concerns with empirical priors for the non-parametric distribution of time to realize these concerns in such a way that we can cross-tabulate classes of concerns to failure events within time partitions at an appropriate level of granularity. To support efficient implementation, a computationally convenient hypergeometric approximation is developed for the counting distributions appropriate to our underlying stochastic model. The accuracy of our approximation over first-order alternatives is examined, and demonstrated, through an evaluation experiment. An industrial application illustrates model implementation and shows how estimates can be updated using information arising during development test and analysis.

  15. Flexible Modeling of Epidemics with an Empirical Bayes Framework

    Science.gov (United States)

    Brooks, Logan C.; Farrow, David C.; Hyun, Sangwon; Tibshirani, Ryan J.; Rosenfeld, Roni

    2015-01-01

    Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic’s behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the “Predict the Influenza Season Challenge”, with the task of predicting key epidemiological measures for the 2013–2014 U.S. influenza season with the help of digital surveillance data. We developed a framework for in-season forecasts of epidemics using a semiparametric Empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctors visits for influenza-like illness, and the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of the seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to some other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013–2014 U.S. influenza season, and compare the framework’s cross-validated prediction error on historical data to

  16. EbayesThresh: R Programs for Empirical Bayes Thresholding

    Directory of Open Access Journals (Sweden)

    Iain Johnstone

    2005-04-01

Full Text Available Suppose that a sequence of unknown parameters is observed subject to independent Gaussian noise. The EbayesThresh package in the S language implements a class of Empirical Bayes thresholding methods that can take advantage of possible sparsity in the sequence, to improve the quality of estimation. The prior for each parameter in the sequence is a mixture of an atom of probability at zero and a heavy-tailed density. Within the package, this can be either a Laplace (double exponential) density or else a mixture of normal distributions with tail behavior similar to the Cauchy distribution. The mixing weight, or sparsity parameter, is chosen automatically by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold, and the package provides the posterior mean, and hard and soft thresholding, as additional options. This paper reviews the method, and gives details (far beyond those previously published) of the calculations needed for implementing the procedures. It explains and motivates both the general methodology, and the use of the EbayesThresh package, through simulated and real data examples. When estimating the wavelet transform of an unknown function, it is appropriate to apply the method level by level to the transform of the observed data. The package can carry out these calculations for wavelet transforms obtained using various packages in R and S-PLUS. Details, including a motivating example, are presented, and the application of the method to image estimation is also explored. The final topic considered is the estimation of a single sequence that may become progressively sparser along the sequence. An iterated least squares isotone regression method allows for the choice of a threshold that depends monotonically on the order in which the observations are made. An alternative
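The spirit of the method can be sketched in a few lines: put a spike-and-slab prior on each mean, pick the sparsity weight by marginal maximum likelihood, and shrink each observation by its posterior probability of being nonzero. The sketch below is a deliberately cruder analogue of EbayesThresh, using a normal slab and the posterior mean, whereas the package offers Laplace or Cauchy-like slabs and the posterior median:

```python
import math, random

def norm_pdf(x, var=1.0):
    """Density of N(0, var) at x."""
    return math.exp(-x * x / (2 * var)) / math.sqrt(2 * math.pi * var)

def eb_posterior_mean(xs, slab_var=9.0):
    """Simplified empirical Bayes shrinkage for a sparse normal-means sequence.

    Prior on each mean: (1 - w) * delta_0 + w * N(0, slab_var); noise variance 1.
    The sparsity weight w is chosen by marginal maximum likelihood on a grid.
    """
    def loglik(w):
        return sum(math.log((1 - w) * norm_pdf(x) + w * norm_pdf(x, 1 + slab_var))
                   for x in xs)
    w = max((i / 100 for i in range(1, 100)), key=loglik)
    shrink = slab_var / (1 + slab_var)       # slab posterior-mean factor
    out = []
    for x in xs:
        g = w * norm_pdf(x, 1 + slab_var)
        p_nonzero = g / (g + (1 - w) * norm_pdf(x))
        out.append(p_nonzero * shrink * x)   # posterior mean of the parameter
    return out

random.seed(1)
xs = [random.gauss(0, 1) for _ in range(90)] + [5.0, -6.0, 7.0]  # noise + 3 signals
est = eb_posterior_mean(xs)
```

Small observations are shrunk essentially to zero while large ones survive nearly intact, which is the thresholding behavior the package exploits level by level on wavelet coefficients.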

  17. Empirical Bayes ranking and selection methods via semiparametric hierarchical mixture models in microarray studies.

    Science.gov (United States)

    Noma, Hisashi; Matsui, Shigeyuki

    2013-05-20

The main purpose of microarray studies is screening of differentially expressed genes as candidates for further investigation. Because of limited resources in this stage, prioritizing genes is a relevant statistical task in microarray studies. For effective gene selection, parametric empirical Bayes methods for ranking and selection of genes with largest effect sizes have been proposed (Noma et al., 2010; Biostatistics 11: 281-289). The hierarchical mixture model incorporates the differential and non-differential components and allows information borrowing across differential genes with separation from nuisance, non-differential genes. In this article, we develop empirical Bayes ranking methods via a semiparametric hierarchical mixture model. A nonparametric prior distribution, rather than a parametric prior distribution, for effect sizes is specified and estimated using the "smoothing by roughening" approach of Laird and Louis (1991; Computational statistics and data analysis 12: 27-37). We present applications to childhood and infant leukemia clinical studies with microarrays for exploring genes related to prognosis or disease progression. Copyright © 2012 John Wiley & Sons, Ltd.

  18. Analyses of reliability characteristics of emergency diesel generator population using empirical Bayes methods

    International Nuclear Information System (INIS)

    Vesely, W.E.; Uryas'ev, S.P.; Samanta, P.K.

    1993-01-01

Emergency Diesel Generators (EDGs) provide backup power to nuclear power plants in case of failure of AC buses. The reliability of EDGs is important to assure response to loss-of-offsite-power accident scenarios, a dominant contributor to plant risk. The reliable performance of EDGs has been of concern both for regulators and plant operators. In this paper the authors present an approach and results from the analysis of failure data from a large population of EDGs. They used an empirical Bayes approach to obtain both the population distribution and the individual failure probabilities from EDG failure-to-start and load-run data over 4 years for 194 EDGs at 63 plant units.
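For failure-on-demand data of this kind, a standard empirical Bayes treatment fits a beta population prior to the pooled failure fractions and then gives each EDG a posterior mean failure probability. A minimal beta-binomial sketch (counts hypothetical, not the study's data; the moment fit assumes the raw fractions show some between-unit variability):

```python
import statistics

def eb_failure_probs(failures, demands):
    """Empirical Bayes (beta-binomial) failure-on-demand probabilities.

    failures: failures to start observed for each EDG
    demands:  start demands for each EDG
    A Beta(a, b) population prior is fitted by method of moments to the
    raw failure fractions; each EDG then gets the posterior mean
    (a + x) / (a + b + n).
    """
    raw = [x / n for x, n in zip(failures, demands)]
    m = statistics.fmean(raw)
    v = statistics.variance(raw)
    c = m * (1 - m) / v - 1          # implied prior "sample size" a + b
    a, b = m * c, (1 - m) * c
    return [(a + x) / (a + b + n) for x, n in zip(failures, demands)]

failures = [0, 1, 0, 2, 1]           # hypothetical counts
demands = [20, 25, 18, 30, 22]
print(eb_failure_probs(failures, demands))
```

EDGs with zero recorded failures still get a small positive probability, borrowed from the population, rather than the maximum likelihood estimate of exactly zero.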

  19. Estimating rate of occurrence of rare events with empirical bayes: A railway application

    International Nuclear Information System (INIS)

    Quigley, John; Bedford, Tim; Walls, Lesley

    2007-01-01

Classical approaches to estimating the rate of occurrence of events perform poorly when data are few. Maximum likelihood estimators result in overly optimistic point estimates of zero for situations where there have been no events. Alternative empirical-based approaches have been proposed based on median estimators or non-informative prior distributions. While these alternatives offer an improvement over point estimates of zero, they can be overly conservative. Empirical Bayes procedures offer an unbiased approach through pooling data across different hazards to support stronger statistical inference. This paper considers the application of Empirical Bayes to high consequence low-frequency events, where estimates are required for risk mitigation decision support such as “as low as reasonably possible”. A summary of empirical Bayes methods is given and the choices of estimation procedures to obtain interval estimates are discussed. The approaches illustrated within the case study are based on the estimation of the rate of occurrence of train derailments within the UK. The usefulness of empirical Bayes within this context is discussed.
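The pooling idea can be sketched with a gamma-Poisson model: fit a gamma prior to the pool of raw event rates by method of moments, then take each hazard's posterior mean rate as (α + x)/(β + T), which stays positive even for hazards with no observed events. All data below are hypothetical, and the paper's interval-estimation machinery is omitted:

```python
import statistics

def eb_rates(counts, exposures):
    """Empirical Bayes (gamma-Poisson) event-rate estimates for a pool of hazards.

    counts:    events observed for each hazard
    exposures: exposure (e.g. train-miles or years) for each hazard
    A Gamma(alpha, beta) prior is fitted to the pool by method of moments;
    each hazard's posterior mean rate is (alpha + x) / (beta + T).
    """
    raw = [x / t for x, t in zip(counts, exposures)]
    m = statistics.fmean(raw)
    v = statistics.variance(raw)
    beta = m / v             # gamma prior rate parameter
    alpha = m * m / v        # gamma prior shape parameter
    return [(alpha + x) / (beta + t) for x, t in zip(counts, exposures)]

counts = [0, 1, 2, 0, 5]                      # hypothetical event counts
exposures = [10.0, 12.0, 9.0, 15.0, 11.0]     # hypothetical exposures
rates = eb_rates(counts, exposures)
# zero-count hazards get small positive rates instead of the MLE's zero
```

This is exactly the behavior the abstract motivates: the pooled prior keeps zero-event hazards away from an overly optimistic point estimate of zero without being as conservative as a non-informative prior.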

  20. EMPIRICAL RESEARCH AND CONGREGATIONAL ANALYSIS ...

    African Journals Online (AJOL)

empirical research has made to the process of congregational analysis. Part of this … contextual congregational analysis – meeting social and divine desires”) at the IAPT … methodology of a congregational analysis should be regarded as a process. … essential to create space for a qualitative and quantitative approach.

  1. β-empirical Bayes inference and model diagnosis of microarray data

    Directory of Open Access Journals (Sweden)

    Hossain Mollah Mohammad

    2012-06-01

Full Text Available Background Microarray data enables the high-throughput survey of mRNA expression profiles at the genomic level; however, the data presents a challenging statistical problem because of the large number of transcripts with small sample sizes that are obtained. To reduce the dimensionality, various Bayesian or empirical Bayes hierarchical models have been developed. However, because of the complexity of the microarray data, no model can explain the data fully. It is generally difficult to scrutinize the irregular patterns of expression that are not expected by the usual statistical gene by gene models. Results As an extension of empirical Bayes (EB) procedures, we have developed the β-empirical Bayes (β-EB) approach based on a β-likelihood measure which can be regarded as an “evidence-based” weighted (quasi-)likelihood inference. The weight of a transcript t is described as a power function of its likelihood, fβ(yt|θ). Genes with low likelihoods have unexpected expression patterns and low weights. By assigning low weights to outliers, the inference becomes robust. The value of β, which controls the balance between the robustness and efficiency, is selected by maximizing the predictive β0-likelihood by cross-validation. The proposed β-EB approach identified six significant (p < 10⁻⁵) contaminated transcripts as differentially expressed (DE) in normal/tumor tissues from the head and neck of cancer patients. These six genes were all confirmed to be related to cancer; they were not identified as DE genes by the classical EB approach. When applied to the eQTL analysis of Arabidopsis thaliana, the proposed β-EB approach identified some potential master regulators that were missed by the EB approach. Conclusions The simulation data and real gene expression data showed that the proposed β-EB method was robust against outliers. The distribution of the weights was used to scrutinize the irregular patterns of expression and diagnose the model

  2. Empirically Testing Thematic Analysis (ETTA)

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Tingleff, Elllen B.

    2015-01-01

Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These tend not only to give answers to how to conduct an analysis, but also to provide the answer as to why it is conducted in the way that it is. The problem, however, may be that the link between tradition and tool is unclear. The main objective of this article is therefore to present Empirically Testing Thematic Analysis (ETTA), a step-by-step approach to thematic text analysis, discussing its strengths and weaknesses so that others might assess its potential as an approach that they might utilize/develop for themselves. The advantage of utilizing the presented analytic approach is argued to be the integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material.

  3. Empirical analysis of consumer behavior

    NARCIS (Netherlands)

    Huang, Yufeng

    2015-01-01

    This thesis consists of three essays in quantitative marketing, focusing on structural empirical analysis of consumer behavior. In the first essay, he investigates the role of a consumer's skill of product usage, and its imperfect transferability across brands, in her product choice. It shows that

  4. Empirical Bayes Credibility Models for Economic Catastrophic Losses by Regions

    Directory of Open Access Journals (Sweden)

    Jindrová Pavla

    2017-01-01

    Full Text Available Catastrophic events affect various regions of the world with increasing frequency and intensity. The number of catastrophic events and the amount of economic losses vary across world regions. Part of these losses is covered by insurance. Catastrophic events in recent years have been associated with increases in premiums for some lines of business. The article focuses on estimating the amount of net premiums that would be needed to cover the total or insured catastrophic losses in different world regions, using Bühlmann and Bühlmann-Straub empirical credibility models based on data from Sigma Swiss Re 2010-2016. The empirical credibility models have been developed to estimate insurance premiums for short-term insurance contracts using two ingredients: past data from the risk itself and collateral data from other sources considered to be relevant. In this article we apply these models to real data on the number of catastrophic events and the total economic and insured catastrophe losses in seven regions of the world in the time period 2009-2015. The estimated credible premiums by world region indicate how much money will be needed in the monitored regions to cover total and insured catastrophic losses in the next year.
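
    The Bühlmann model named above blends a region's own loss experience with the collective experience of all regions. The sketch below shows the basic (equal years per region) Bühlmann credibility premium; the function name and the toy data are illustrative, not the Sigma Swiss Re figures used in the article.

```python
from statistics import mean, variance

def buhlmann_premiums(losses_by_region):
    """Basic Buhlmann empirical credibility premiums.

    losses_by_region: dict mapping region -> list of annual losses,
    with the same number of past years for every region (the simple
    Buhlmann model; Buhlmann-Straub would allow unequal exposures).
    """
    regions = list(losses_by_region)
    n = len(losses_by_region[regions[0]])            # years of past data per region
    region_means = {r: mean(xs) for r, xs in losses_by_region.items()}
    m = mean(region_means.values())                  # collective (overall) mean
    # expected process variance: average within-region variance
    s2 = mean(variance(xs) for xs in losses_by_region.values())
    # variance of hypothetical means, corrected for sampling noise
    a = variance(region_means.values()) - s2 / n
    if a <= 0:                                       # no credible between-region variation
        return {r: m for r in regions}
    z = n / (n + s2 / a)                             # credibility factor in (0, 1)
    return {r: z * region_means[r] + (1 - z) * m for r in regions}
```

    Each region's premium is a weighted average of its own mean loss and the collective mean; the weight z grows with the number of years of data and with the between-region variance.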

  5. A nonparametric empirical Bayes framework for large-scale multiple testing.

    Science.gov (United States)

    Martin, Ryan; Tokdar, Surya T

    2012-07-01

    We propose a flexible and identifiable version of the 2-groups model, motivated by hierarchical Bayes considerations, that features an empirical null and a semiparametric mixture model for the nonnull cases. We use a computationally efficient predictive recursion (PR) marginal likelihood procedure to estimate the model parameters, even the nonparametric mixing distribution. This leads to a nonparametric empirical Bayes testing procedure, which we call PRtest, based on thresholding the estimated local false discovery rates. Simulations and real data examples demonstrate that, compared to existing approaches, PRtest's careful handling of the nonnull density can give a much better fit in the tails of the mixture distribution which, in turn, can lead to more realistic conclusions.
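
    The thresholding step of a two-groups procedure like the one described above can be sketched as follows. This is a simplified illustration assuming a theoretical N(0,1) null and a hand-supplied alternative density; in PRtest the null proportion and the nonnull mixture are instead estimated from the data by predictive recursion.

```python
import math

def local_fdr(z, pi0, f1):
    """Two-groups local false discovery rate at z-score z:
    lfdr(z) = pi0 * f0(z) / f(z), where f = pi0*f0 + (1-pi0)*f1.
    f0 is taken as the standard normal density; f1 is a caller-supplied
    alternative density (illustrative stand-in for the estimated mixture)."""
    f0 = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    f = pi0 * f0 + (1 - pi0) * f1(z)
    return pi0 * f0 / f
```

    Cases whose local false discovery rate falls below a chosen cutoff (e.g., 0.2) are flagged as nonnull.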

  6. An empirical Bayes safety evaluation of tram/streetcar signal and lane priority measures in Melbourne.

    Science.gov (United States)

    Naznin, Farhana; Currie, Graham; Sarvi, Majid; Logan, David

    2016-01-01

    Streetcars/tram systems are growing worldwide, and many are given priority to increase speed and reliability performance in mixed traffic conditions. Research related to the road safety impact of tram priority is limited. This study explores the road safety impacts of tram priority measures including lane and intersection/signal priority measures. A before-after crash study was conducted using the empirical Bayes (EB) method to provide more accurate crash impact estimates by accounting for wider crash trends and regression to the mean effects. Before-after crash data for 29 intersections with tram signal priority and 23 arterials with tram lane priority in Melbourne, Australia, were analyzed to evaluate the road safety impact of tram priority. The EB before-after analysis results indicated a statistically significant adjusted crash reduction rate of 16.4% after implementation of tram priority measures. Signal priority measures were found to reduce crashes by 13.9% and lane priority by 19.4%. A disaggregate level simple before-after analysis indicated reductions in total and serious crashes as well as vehicle-, pedestrian-, and motorcycle-involved crashes. In addition, reductions in on-path crashes, pedestrian-involved crashes, and collisions among vehicles moving in the same and opposite directions and all other specific crash types were found after tram priority implementation. Results suggest that streetcar/tram priority measures result in safety benefits for all road users, including vehicles, pedestrians, and cyclists. Policy implications and areas for future research are discussed.
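
    The core of the EB before-after method referenced above is a weighted average of a site's observed crash count and a safety performance function (SPF) prediction. The single-site sketch below is a simplification under a negative binomial model; the function names are illustrative, and the full method also adjusts for wider crash trends and propagates estimation variance.

```python
def eb_expected_crashes(observed_before, spf_prediction, overdispersion):
    """Empirical Bayes estimate of a site's expected crash count,
    combining the site's own record with an SPF prediction.
    overdispersion is the negative binomial inverse-dispersion parameter."""
    k = overdispersion
    w = 1.0 / (1.0 + spf_prediction / k)   # weight on the SPF prediction
    return w * spf_prediction + (1 - w) * observed_before

def crash_modification_factor(observed_after, eb_expected_after):
    """Index of effectiveness theta = observed / expected;
    theta < 1 indicates a crash reduction after treatment."""
    return observed_after / eb_expected_after
```

    The EB estimate corrects for regression to the mean: a site picked because of a high before-period count is pulled back toward what similar sites experience.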

  7. Generalized least squares and empirical Bayes estimation in regional partial duration series index-flood modeling

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan

    1997-01-01

    A regional estimation procedure that combines the index-flood concept with an empirical Bayes method for inferring regional information is introduced. The model is based on the partial duration series approach with generalized Pareto (GP) distributed exceedances. The prior information of the model parameters is inferred from regional data using generalized least squares (GLS) regression. Two different Bayesian T-year event estimators are introduced: a linear estimator that requires only some moments of the prior distributions to be specified, and a parametric estimator that is based on specified...

  8. Incorporating Functional Genomic Information in Genetic Association Studies Using an Empirical Bayes Approach.

    Science.gov (United States)

    Spencer, Amy V; Cox, Angela; Lin, Wei-Yu; Easton, Douglas F; Michailidou, Kyriaki; Walters, Kevin

    2016-04-01

    There is a large amount of functional genetic data available, which can be used to inform fine-mapping association studies (in diseases with well-characterised disease pathways). Single nucleotide polymorphism (SNP) prioritization via Bayes factors is attractive because prior information can inform the effect size or the prior probability of causal association. This approach requires the specification of the effect size. If the information needed to estimate a priori the probability density for the effect sizes of causal SNPs in a genomic region is inconsistent or unavailable, then specifying a prior variance for the effect sizes is challenging. We propose both an empirical method to estimate this prior variance, and a coherent approach to using SNP-level functional data to inform the prior probability of causal association. Through simulation we show that when ranking SNPs by our empirical Bayes factor in a fine-mapping study, the causal SNP rank is generally as high as or higher than the rank obtained using Bayes factors with other plausible values of the prior variance. Importantly, we also show that assigning SNP-specific prior probabilities of association based on expert prior functional knowledge of the disease mechanism can lead to improved causal SNP ranks compared to ranking with identical prior probabilities of association. We demonstrate the use of our methods by applying them to the fine mapping of the CASP8 region of chromosome 2 using genotype data from the Collaborative Oncological Gene-Environment Study (COGS) Consortium. The data we analysed included approximately 46,000 breast cancer case and 43,000 healthy control samples. © 2016 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
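
    A standard way to compute SNP-level Bayes factors of the kind discussed above is a Wakefield-style approximate Bayes factor, which requires exactly the prior effect-size variance that the authors estimate empirically. The sketch below assumes that form (the paper's exact formulation may differ) and combines it with a SNP-specific prior probability of causality, e.g., one informed by functional annotation.

```python
import math

def approx_bayes_factor(beta_hat, se, prior_sd):
    """Approximate Bayes factor in favor of association for one SNP,
    from a GWAS effect estimate beta_hat and its standard error se.
    prior_sd is the assumed prior standard deviation of the true effect
    size (the quantity estimated empirically in the paper)."""
    v, w = se ** 2, prior_sd ** 2
    z = beta_hat / se
    return math.sqrt(v / (v + w)) * math.exp(z ** 2 * w / (2 * (v + w)))

def posterior_prob_association(bf, prior_prob):
    """Combine the Bayes factor with a SNP-specific prior probability
    of causal association to get a posterior probability."""
    odds = bf * prior_prob / (1 - prior_prob)
    return odds / (1 + odds)
```

    A larger z-statistic or a better-matched prior variance raises the Bayes factor, and SNPs can then be ranked by posterior probability rather than by p-value alone.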

  9. Sources, fate, and transport of nitrogen and phosphorus in the Chesapeake Bay watershed-An empirical model

    Science.gov (United States)

    Ator, Scott W.; Brakebill, John W.; Blomquist, Joel D.

    2011-01-01

    Spatially Referenced Regression on Watershed Attributes (SPARROW) was used to provide empirical estimates of the sources, fate, and transport of total nitrogen (TN) and total phosphorus (TP) in the Chesapeake Bay watershed, and the mean annual TN and TP flux to the bay and in each of 80,579 nontidal tributary stream reaches. Restoration efforts in recent decades have been insufficient to meet established standards for water quality and ecological conditions in Chesapeake Bay. The bay watershed includes 166,000 square kilometers of mixed land uses, multiple nutrient sources, and variable hydrogeologic, soil, and weather conditions, and bay restoration is complicated by the multitude of nutrient sources and complex interacting factors affecting the occurrence, fate, and transport of nitrogen and phosphorus from source areas to streams and the estuary. Effective and efficient nutrient management at the regional scale in support of Chesapeake Bay restoration requires a comprehensive understanding of the sources, fate, and transport of nitrogen and phosphorus in the watershed, which is only available through regional models. The current models, Chesapeake Bay nutrient SPARROW models, version 4 (CBTN_v4 and CBTP_v4), were constructed at a finer spatial resolution than previous SPARROW models for the Chesapeake Bay watershed (versions 1, 2, and 3), and include an updated timeframe and modified sources and other explanatory terms.

  10. Gradient Analysis and Classification of Carolina Bay Vegetation: A Framework for Bay Wetlands Conservation and Restoration

    Energy Technology Data Exchange (ETDEWEB)

    Diane De Steven, Ph.D.; Maureen Tone, Ph.D.

    1997-10-01

    This report addresses four project objectives: (1) Gradient model of Carolina bay vegetation on the SRS--The authors use ordination analyses to identify environmental and landscape factors that are correlated with vegetation composition. Significant factors can provide a framework for site-based conservation of existing diversity, and they may also be useful site predictors for potential vegetation in bay restorations. (2) Regional analysis of Carolina bay vegetation diversity--They expand the ordination analyses to assess the degree to which SRS bays encompass the range of vegetation diversity found in the regional landscape of South Carolina's western Upper Coastal Plain. Such comparisons can indicate floristic status relative to regional potentials and identify missing species or community elements that might be re-introduced or restored. (3) Classification of vegetation communities in Upper Coastal Plain bays--They use cluster analysis to identify plant community-types at the regional scale, and explore how this classification may be functional with respect to significant environmental and landscape factors. An environmentally-based classification at the whole-bay level can provide a system of templates for managing bays as individual units and for restoring bays to desired plant communities. (4) Qualitative model for bay vegetation dynamics--They analyze present-day vegetation in relation to historic land uses and disturbances. The distinctive history of SRS bays provides the possibility of assessing pathways of post-disturbance succession. They attempt to develop a coarse-scale model of vegetation shifts in response to changing site factors; such qualitative models can provide a basis for suggesting management interventions that may be needed to maintain desired vegetation in protected or restored bays.

  11. Empirical direction in design and analysis

    CERN Document Server

    Anderson, Norman H

    2001-01-01

    The goal of Norman H. Anderson's new book is to help students develop skills of scientific inference. To accomplish this he organized the book around the "Experimental Pyramid"--six levels that represent a hierarchy of considerations in empirical investigation: conceptual framework, phenomena, behavior, measurement, design, and statistical inference. To facilitate conceptual and empirical understanding, Anderson de-emphasizes computational formulas and null hypothesis testing. Other features include an emphasis on visual inspection as a basic skill in experimental analysis to help students

  12. Gamma Activation Analysis in the Havana Bay superficial sediments

    International Nuclear Information System (INIS)

    Lopez, N.; Gelen, A.; Diaz Riso, O.; Manso, M.V.; Simon, M.J.; Maslov, A.G.; Gustova, M.V.; Beltran, J.; Soto, J.

    2003-01-01

    A preliminary study of 26 elements in Havana Bay superficial sediments was made using Gamma Activation Analysis. Samples from five zones of Havana Bay were analyzed. The results show a close interrelation between the concentration levels of the studied elements and the contaminant sources.

  13. Improved estimation of subject-level functional connectivity using full and partial correlation with empirical Bayes shrinkage.

    Science.gov (United States)

    Mejia, Amanda F; Nebel, Mary Beth; Barber, Anita D; Choe, Ann S; Pekar, James J; Caffo, Brian S; Lindquist, Martin A

    2018-05-15

    Reliability of subject-level resting-state functional connectivity (FC) is determined in part by the statistical techniques employed in its estimation. Methods that pool information across subjects to inform estimation of subject-level effects (e.g., Bayesian approaches) have been shown to enhance reliability of subject-level FC. However, fully Bayesian approaches are computationally demanding, while empirical Bayesian approaches typically rely on using repeated measures to estimate the variance components in the model. Here, we avoid the need for repeated measures by proposing a novel measurement error model for FC describing the different sources of variance and error, which we use to perform empirical Bayes shrinkage of subject-level FC towards the group average. In addition, since the traditional intra-class correlation coefficient (ICC) is inappropriate for biased estimates, we propose a new reliability measure denoted the mean squared error intra-class correlation coefficient (ICC_MSE) to properly assess the reliability of the resulting (biased) estimates. We apply the proposed techniques to test-retest resting-state fMRI data on 461 subjects from the Human Connectome Project to estimate connectivity between 100 regions identified through independent components analysis (ICA). We consider both correlation and partial correlation as the measure of FC and assess the benefit of shrinkage for each measure, as well as the effects of scan duration. We find that shrinkage estimates of subject-level FC exhibit substantially greater reliability than traditional estimates across various scan durations, even for the most reliable connections and regardless of connectivity measure. Additionally, we find partial correlation reliability to be highly sensitive to the choice of penalty term, and to be generally worse than that of full correlations except for certain connections and a narrow range of penalty values. This suggests that the penalty needs to be chosen carefully.
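
    The shrinkage idea above can be illustrated for a single connection measured across subjects: each subject's estimate is pulled toward the group average by an amount that grows with the measurement noise. This is a simplified sketch; in the paper the noise variance comes from the proposed measurement error model rather than being supplied directly, and shrinkage is applied connection by connection.

```python
from statistics import mean, variance

def eb_shrink(subject_estimates, noise_var):
    """Empirical Bayes shrinkage of per-subject estimates of one
    connectivity value toward the group average. noise_var is the
    assumed within-subject sampling variance."""
    g = mean(subject_estimates)
    total_var = variance(subject_estimates)
    # variance of the true subject-level signal across subjects
    between_var = max(total_var - noise_var, 0.0)
    denom = noise_var + between_var
    lam = noise_var / denom if denom > 0 else 0.0   # shrinkage weight
    return [lam * g + (1 - lam) * x for x in subject_estimates]
```

    With no measurement noise the estimates are returned unchanged; as noise dominates, every subject's value collapses to the group mean.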

  14. The Italian Footwear Industry: an Empirical Analysis

    OpenAIRE

    Pirolo, Luca; Giustiniano, Luca; Nenni, Maria Elena

    2013-01-01

    This paper aims to provide readers with a deep empirical analysis on the Italian footwear industry in order to investigate the evolution of its structure (trends in sales and production, number of firms and employees, main markets, etc.), together with the identification of the main drivers of competitiveness in order to explain the strategies implemented by local actors.

  15. The problem analysis for empirical studies

    NARCIS (Netherlands)

    Groenland, E.A.G.

    2014-01-01

    This article proposes a systematic methodology for the development of a problem analysis for cross-sectional, empirical research. This methodology is referred to as the 'Annabel approach'. It is suitable both for academic studies and applied (business) studies. In addition it can be used for both

  16. Empirical analysis of uranium spot prices

    International Nuclear Information System (INIS)

    Morman, M.R.

    1988-01-01

    The objective is to empirically test a market model of the uranium industry that incorporates the notion that, if the resource is viewed as an asset by economic agents, then its own rate of return, along with the own rate of return of a competing asset, would be a major factor in formulating the price of the resource. The model tested is based on a market model of supply and demand. The supply model incorporates the notion that the decision criterion used by uranium mine owners is to select the extraction rate that maximizes the net present value of their extraction receipts. The demand model uses a concept that allows for explicit recognition of the prospect of arbitrage between a natural-resource market and the market for other capital goods. The empirical approach used for estimation was a recursive or causal model. The empirical results were consistent with the theoretical models. The coefficients of the demand and supply equations had the appropriate signs. Tests for causality were conducted to validate the use of the causal model, and the results obtained were favorable. The implications of the findings for future studies of exhaustible resources are: (1) in some cases causal models are the appropriate specification for empirical analysis; (2) supply models should incorporate a measure to capture depletion effects.

  17. An Empirical Bayes before-after evaluation of road safety effects of a new motorway in Norway.

    Science.gov (United States)

    Elvik, Rune; Ulstein, Heidi; Wifstad, Kristina; Syrstad, Ragnhild S; Seeberg, Aase R; Gulbrandsen, Magnus U; Welde, Morten

    2017-11-01

    This paper presents an Empirical Bayes before-after evaluation of the road safety effects of a new motorway (freeway) in Østfold county, Norway. The before-period was 1996-2002. The after-period was 2009-2015. The road was rebuilt from an undivided two-lane road into a divided four-lane road. The number of killed or seriously injured road users was reduced by 75 percent, controlling for (downward) long-term trends and regression-to-the-mean (statistically significant at the 5 percent level; recorded numbers 71 before, 11 after). There were small changes in the number of injury accidents (185 before, 123 after; net effect -3%) and the number of slightly injured road users (403 before, 279 after; net effect +5%). Motorways appear to mainly reduce injury severity, not the number of accidents. The paper discusses challenges in implementing the Empirical Bayes design when less than ideal data are available. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Unemployment and Mental Disorders - An Empirical Analysis

    DEFF Research Database (Denmark)

    Agerbo, Esben; Eriksson, Tor Viking; Mortensen, Preben Bo

    1998-01-01

    The purpose of this paper is to analyze the importance of unemployment and other social factors as risk factors for impaired mental health. It departs from previous studies in that we make use of information about first admissions to a psychiatric hospital or ward as our measure of mental illness ... from the Psychiatric case register. Secondly, we estimate conditional logistic regression models for case-control data on first admissions to a psychiatric hospital. The explanatory variables in the empirical analysis include age, gender, education, marital status, income, wealth, and unemployment (and...

  19. Monterey Bay Aquarium Volunteer Guide Scheduling Analysis

    Science.gov (United States)

    2014-12-01

    Keywords: Monterey Bay Aquarium, linear programming, network design, multicommodity flow, resilience. Volunteers fill many roles that include Aquarium guides, information desk attendants, divers, and animal caregivers. Julie Packard, Executive Director of ... further analyze the resiliency of the shifts to changes in staffing levels caused by no-shows or drop-ins. While the guide program managers have ...

  20. An Empirical Analysis of the Budget Deficit

    Directory of Open Access Journals (Sweden)

    Ioan Talpos

    2007-11-01

    Full Text Available Economic policies and, particularly, fiscal policies are not designed and implemented in an “empty space”: the structural characteristics of the economic systems, the institutional architecture of societies, the cultural paradigm and the power relations between different social groups define the borders of these policies. This paper tries to deal with these borders, to describe their nature and the implications of their existence for the quality and impact of fiscal policies, at a theoretical level as well as at an empirical one. The main results of the proposed analysis support the idea that the mentioned variables matter both for the social mandate entrusted by society to the state, and thus for the role and functions of the state, and for economic growth as supported by the resources collected and distributed by the public authorities.

  1. Writing Indigenous women's lives in the Bay of Bengal: cultures of empire in the Andaman Islands, 1789-1906.

    Science.gov (United States)

    Anderson, Clare

    2011-01-01

    This article explores the lives of two Andamanese women, both of whom the British called “Topsy.” The first part of the article takes an indigenous and gendered perspective on early British colonization of the Andamans in the 1860s, and through the experiences of a woman called Topsy stresses the sexual violence that underpinned colonial settlement as well as the British reliance on women as cultural interlocutors. Second, the article discusses colonial naming practices, and the employment of Andamanese women and men as nursemaids and household servants during the 1890s–1910s. Using an extraordinary murder case in which a woman known as Topsy-ayah was a central witness, it argues that both reveal something of the enduring associations and legacies of slavery, as well as the cultural influence of the Atlantic in the Bay of Bengal. In sum, these women's lives present a kaleidoscope view of colonization, gender, networks of Empire, labor, and domesticity in the Bay of Bengal.

  2. Empirical analysis of online human dynamics

    Science.gov (United States)

    Zhao, Zhi-Dan; Zhou, Tao

    2012-06-01

    Patterns of human activities have attracted increasing academic interest, since the quantitative understanding of human behavior is helpful to uncover the origins of many socioeconomic phenomena. This paper focuses on behaviors of Internet users. Six large-scale systems are studied in our experiments, including movie-watching in Netflix and MovieLens, transactions in eBay, bookmark-collecting in Delicious, and posting in FriendFeed and Twitter. Empirical analysis reveals some common statistical features of online human behavior: (1) The total number of a user's actions, the user's activity, and the interevent time all follow heavy-tailed distributions. (2) There exists a strongly positive correlation between a user's activity and the total number of the user's actions, and a significantly negative correlation between the user's activity and the width of the interevent time distribution. We further study the rescaling method and show that this method can to some extent eliminate the differences in statistics among users caused by their different activities, yet its effectiveness depends on the data sets.

  3. An update on the "empirical turn" in bioethics: analysis of empirical research in nine bioethics journals.

    Science.gov (United States)

    Wangmo, Tenzin; Hauri, Sirin; Gennet, Eloise; Anane-Sarpong, Evelyn; Provoost, Veerle; Elger, Bernice S

    2018-02-07

    A review of literature published a decade ago noted a significant increase in empirical papers across nine bioethics journals. This study provides an update on the presence of empirical papers in the same nine journals. It first evaluates whether the empirical trend is continuing as noted in the previous study, and second, how it is changing, that is, what the characteristics of the empirical works published in these nine bioethics journals are. A review of the same nine journals (Bioethics; Journal of Medical Ethics; Journal of Clinical Ethics; Nursing Ethics; Cambridge Quarterly of Healthcare Ethics; Hastings Center Report; Theoretical Medicine and Bioethics; Christian Bioethics; and Kennedy Institute of Ethics Journal) was conducted for a 12-year period from 2004 to 2015. Data obtained were analysed descriptively and using a non-parametric Chi-square test. Of the total number of original papers (N = 5567) published in the nine bioethics journals, 18.1% (n = 1007) collected and analysed empirical data. Journal of Medical Ethics and Nursing Ethics led the empirical publications, accounting for 89.4% of all empirical papers. The former published significantly more quantitative papers than qualitative, whereas the latter published more qualitative papers. Our analysis reveals no significant difference (χ2 = 2.857; p = 0.091) between the proportion of empirical papers published in 2004-2009 and 2010-2015. However, the increasing empirical trend has continued in these journals, with the proportion of empirical papers increasing from 14.9% in 2004 to 17.8% in 2015. This study presents the current state of affairs regarding empirical research published in nine bioethics journals. In the quarter century of data that is available about the nine bioethics journals studied in two reviews, the proportion of empirical publications continues to increase, signifying a trend towards empirical research in bioethics. The growing volume is mainly attributable to two

  4. Sources of Currency Crisis: An Empirical Analysis

    OpenAIRE

    Weber, Axel A.

    1997-01-01

    Two types of currency crisis models coexist in the literature: first generation models view speculative attacks as being caused by economic fundamentals which are inconsistent with a given parity. Second generation models claim self-fulfilling speculation as the main source of a currency crisis. Recent empirical research in international macroeconomics has attempted to distinguish between the sources of currency crises. This paper adds to this literature by proposing a new empirical approach ...

  5. An empirical analysis of Diaspora bonds

    OpenAIRE

    AKKOYUNLU, Şule; STERN, Max

    2018-01-01

    Abstract. This study is the first to investigate theoretically and empirically the determinants of Diaspora Bonds for eight developing countries (Bangladesh, Ethiopia, Ghana, India, Lebanon, Pakistan, the Philippines, and Sri Lanka) and one developed country, Israel, for the period 1951 to 2008. Empirical results are consistent with the predictions of the theoretical model. The most robust variables are the closeness indicator and the sovereign rating, both on the demand side. The spread is ...

  6. Multilevel Empirical Bayes Modeling for Improved Estimation of Toxicant Formulations to Suppress Parasitic Sea Lamprey in the Upper Great Lakes

    Science.gov (United States)

    Hatfield, L.A.; Gutreuter, S.; Boogaard, M.A.; Carlin, B.P.

    2011-01-01

    Estimation of extreme quantal-response statistics, such as the concentration required to kill 99.9% of test subjects (LC99.9), remains a challenge in the presence of multiple covariates and complex study designs. Accurate and precise estimates of the LC99.9 for mixtures of toxicants are critical to ongoing control of a parasitic invasive species, the sea lamprey, in the Laurentian Great Lakes of North America. The toxicity of those chemicals is affected by local and temporal variations in water chemistry, which must be incorporated into the modeling. We develop multilevel empirical Bayes models for data from multiple laboratory studies. Our approach yields more accurate and precise estimation of the LC99.9 compared to alternative models considered. This study demonstrates that properly incorporating hierarchical structure in laboratory data yields better estimates of LC99.9 stream treatment values that are critical to larvae control in the field. In addition, out-of-sample prediction of the results of in situ tests reveals the presence of a latent seasonal effect not manifest in the laboratory studies, suggesting avenues for future study and illustrating the importance of dual consideration of both experimental and observational data. © 2011, The International Biometric Society.
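
    The LC99.9 discussed above is an extreme quantile of a fitted dose-response curve. The sketch below inverts a logistic fit on the log-concentration scale; the coefficient values are illustrative placeholders, not the paper's multilevel estimates, which additionally vary with water chemistry covariates.

```python
import math

def lethal_concentration(p, intercept, slope):
    """Invert a fitted logit dose-response curve,
        logit(P(kill)) = intercept + slope * log10(concentration),
    to obtain the concentration killing a fraction p of subjects
    (p = 0.999 gives the LC99.9)."""
    logit_p = math.log(p / (1 - p))
    return 10 ** ((logit_p - intercept) / slope)
```

    Because p = 0.999 sits far in the tail of the curve, small errors in the slope translate into large errors in the LC99.9, which is why pooling information hierarchically across studies helps.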

  7. Determining individual variation in growth and its implication for life-history and population processes using the empirical Bayes method.

    Directory of Open Access Journals (Sweden)

    Simone Vincenzi

    2014-09-01

    Full Text Available The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish.
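
    The growth model above is the von Bertalanffy function with individual random effects on its parameters. A minimal sketch follows; the multiplicative log-scale random effects are one common parameterization that keeps L∞ and k positive, assumed here for illustration rather than taken from the paper.

```python
import math

def von_bertalanffy(age, l_inf, k, t0=0.0):
    """Von Bertalanffy growth function: expected length at a given age,
    approaching asymptotic size l_inf at rate k."""
    return l_inf * (1.0 - math.exp(-k * (age - t0)))

def individual_growth(age, mean_l_inf, mean_k, u_i=0.0, v_i=0.0):
    """Growth trajectory for one fish: shared (population or cohort)
    parameters perturbed by individual random effects u_i, v_i acting
    on the log scale."""
    return von_bertalanffy(age, mean_l_inf * math.exp(u_i), mean_k * math.exp(v_i))
```

    In the empirical Bayes fit, the u_i and v_i are treated as draws from a shared distribution whose variance is estimated from all tagged fish, so a fish with few recaptures borrows strength from the rest of the population.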

  8. Determining individual variation in growth and its implication for life-history and population processes using the empirical Bayes method.

    Science.gov (United States)

    Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J; Munch, Stephan; Skaug, Hans J

    2014-09-01

    The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish.

  9. Empirical Bayes Estimation of Semi-parametric Hierarchical Mixture Models for Unbiased Characterization of Polygenic Disease Architectures

    Directory of Open Access Journals (Sweden)

    Jo Nishino

    2018-04-01

    Full Text Available Genome-wide association studies (GWAS) suggest that the genetic architecture of complex diseases consists of unexpectedly numerous variants with small effect sizes. However, the polygenic architectures of many diseases have not been well characterized due to a lack of simple and fast methods for unbiased estimation of the underlying proportion of disease-associated variants and their effect-size distribution. Applying empirical Bayes estimation of semi-parametric hierarchical mixture models to GWAS summary statistics, we confirmed that schizophrenia was extremely polygenic [~40% of independent genome-wide SNPs are risk variants, most with odds ratio (OR) = 1.03], whereas rheumatoid arthritis was less polygenic (~4 to 8% risk variants, with a significant portion reaching OR = 1.05 to 1.1). For rheumatoid arthritis, stratified estimations revealed that expression quantitative trait loci in blood explained large genetic variance, and that low- and high-frequency derived alleles were prone to be risk and protective, respectively, suggesting a predominance of deleterious-risk and advantageous-protective mutations. Despite genetic correlation, effect-size distributions for schizophrenia and bipolar disorder differed across allele frequency. These analyses distinguished disease polygenic architectures and provided clues for etiological differences in complex diseases.

  10. Bayes factor design analysis: Planning for compelling evidence.

    Science.gov (United States)

    Schönbrodt, Felix D; Wagenmakers, Eric-Jan

    2018-02-01

    A sizeable literature exists on the use of frequentist power analysis in the null-hypothesis significance testing (NHST) paradigm to facilitate the design of informative experiments. In contrast, there is almost no literature that discusses the design of experiments when Bayes factors (BFs) are used as a measure of evidence. Here we explore Bayes Factor Design Analysis (BFDA) as a useful tool to design studies for maximum efficiency and informativeness. We elaborate on three possible BF designs, (a) a fixed-n design, (b) an open-ended Sequential Bayes Factor (SBF) design, where researchers can test after each participant and can stop data collection whenever there is strong evidence for either [Formula: see text] or [Formula: see text], and (c) a modified SBF design that defines a maximal sample size where data collection is stopped regardless of the current state of evidence. We demonstrate how the properties of each design (i.e., expected strength of evidence, expected sample size, expected probability of misleading evidence, expected probability of weak evidence) can be evaluated using Monte Carlo simulations and equip researchers with the necessary information to compute their own Bayesian design analyses.
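    The Monte Carlo evaluation that BFDA relies on can be illustrated with a toy case. The sketch below simulates an open-ended Sequential Bayes Factor design truncated at a maximal sample size, for a binomial test (point null θ = 0.5 vs. a uniform prior under H1, where the Bayes factor has a closed form); the true effect, stopping bound, and maximum n are illustrative assumptions, not values from the paper:

```python
import math
import random

def bf10_binomial(successes, n):
    """BF10 for H1: theta ~ Uniform(0, 1) vs H0: theta = 0.5, given
    `successes` out of `n` Bernoulli trials (Beta-Binomial marginal)."""
    failures = n - successes
    log_m1 = (math.lgamma(successes + 1) + math.lgamma(failures + 1)
              - math.lgamma(n + 2))
    log_m0 = n * math.log(0.5)
    return math.exp(log_m1 - log_m0)

def run_sbf(theta, n_max=500, bound=10.0):
    """SBF design truncated at n_max: test after each participant,
    stop once BF10 > bound or BF10 < 1/bound."""
    successes = 0
    for n in range(1, n_max + 1):
        successes += random.random() < theta
        bf = bf10_binomial(successes, n)
        if bf > bound or bf < 1.0 / bound:
            break
    return n, bf

random.seed(42)
results = [run_sbf(theta=0.7) for _ in range(200)]
# With a true effect (theta = 0.7), most runs should stop early with
# strong evidence for H1; the complement estimates misleading/weak evidence.
hit_h1 = sum(bf > 10.0 for _, bf in results) / len(results)
```

    Tabulating the stopping sample sizes and terminal Bayes factors over many such runs yields exactly the design properties listed in the abstract: expected strength of evidence, expected sample size, and the probabilities of misleading and weak evidence.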

  11. Insurability of Cyber Risk: An Empirical Analysis

    OpenAIRE

    Biener, Christian; Eling, Martin; Wirfs, Jan Hendrik

    2015-01-01

    This paper discusses the adequacy of insurance for managing cyber risk. To this end, we extract 994 cases of cyber losses from an operational risk database and analyse their statistical properties. Based on the empirical results and recent literature, we investigate the insurability of cyber risk by systematically reviewing the set of criteria introduced by Berliner (1982). Our findings emphasise the distinct characteristics of cyber risks compared with other operational risks and bring to li...

  12. Compassion: An Evolutionary Analysis and Empirical Review

    OpenAIRE

    Goetz, Jennifer L.; Keltner, Dacher; Simon-Thomas, Emiliana

    2010-01-01

    What is compassion? And how did it evolve? In this review, we integrate three evolutionary arguments that converge on the hypothesis that compassion evolved as a distinct affective experience whose primary function is to facilitate cooperation and protection of the weak and those who suffer. Our empirical review reveals compassion to have distinct appraisal processes attuned to undeserved suffering, distinct signaling behavior related to caregiving patterns of touch, posture, and vocalization...

  13. DORIAN, Bayes Method Plant Age Risk Analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    2002-01-01

    1 - Description of program or function: DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g. for identifying trends in component failure rates and/or outage durations as a function of time. The user must specify several alternative hypothesized 'aging models' (i.e., possible trends) along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities. 2 - Method of solution: DORIAN carries out a Bayesian analysis of failure data and a prior distribution on a time-dependent failure rate to obtain a posterior distribution on the failure rate. The form of the time-dependent failure rate is arbitrary, because DORIAN approximates it by a step function, constant within specified time intervals. Similarly, the parameters may have any prior distribution, because DORIAN uses a discrete distribution to approximate it. Likewise, the database file produced by DORIAN approximates the entire range of possible failure rates or outage durations by means of a discrete probability distribution containing no more than 20 distinct values with their probabilities. 3 - Restrictions on the complexity of the problem: The prior distribution is discrete, with up to 25 values. Up to 60 times are accommodated in the discrete time history.
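    The discrete Bayesian updating DORIAN performs can be sketched in a few lines. The aging models, priors, and failure data below are hypothetical illustrations, not DORIAN's actual inputs or code; each candidate model is a step-function failure rate, updated with a Poisson likelihood for the observed failure counts:

```python
import math

# Three candidate aging models: step-function failure rates (per
# component-year) over three consecutive time intervals.
# All numbers here are hypothetical illustrations.
models = {
    "constant":  [0.02, 0.02, 0.02],
    "aging":     [0.01, 0.02, 0.04],
    "improving": [0.04, 0.02, 0.01],
}
prior = {name: 1.0 / len(models) for name in models}

# Observed (failure count, exposure in component-years) per interval.
data = [(1, 100.0), (3, 100.0), (6, 100.0)]

def log_poisson_like(rates):
    """Poisson log-likelihood of the observed counts under given rates."""
    return sum(k * math.log(r * t) - r * t - math.lgamma(k + 1)
               for (k, t), r in zip(data, rates))

log_post = {m: math.log(prior[m]) + log_poisson_like(r)
            for m, r in models.items()}
shift = max(log_post.values())                 # for numerical stability
weights = {m: math.exp(lp - shift) for m, lp in log_post.items()}
total = sum(weights.values())
posterior = {m: w / total for m, w in weights.items()}
# Failure counts that rise over time shift posterior mass to "aging".
```

    Posterior mean, median, and percentile trends then follow by weighting each model's rate curve by its posterior probability.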

  14. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies

    DEFF Research Database (Denmark)

    Thompson, Wesley K.; Wang, Yunpeng; Schork, Andrew J.

    2015-01-01

    In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, and the local false discovery rate, and we examine the impact of linkage disequilibrium on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates, while it underestimates the tails of the CD effect size distribution.

  15. Contemporary statistical procedures (Parametric Empirical Bayes) and nuclear plant event rates

    International Nuclear Information System (INIS)

    Gaver, D.P.; Worledge, D.H.

    1985-01-01

    The conduct of a nuclear power plant probabilistic risk assessment (PRA) recognizes that each of a great many vital components and systems is subject to failure. One aspect of the PRA procedure is to quantify individual item failure propensity, often in terms of the failure rate parameter of an exponential distribution or Poisson process, and then to combine rates so as to effectively infer the probability of plant failure, e.g., core damage. The formal method of combination of such rates involves use of fault-tree analysis. The defensibility of the final fault-tree result depends both upon the adequacy of the failure representations of its components, and upon the correctness and inclusiveness of the fault tree logic. This paper focuses upon the first issue, in particular, upon contemporary proposals for deriving estimates of individual rates. The purpose of the paper is to present, in basically non-mathematical terms, the essential nature of some of these proposals, and an assessment of how they might fit into, and contribute positively to, a more defensible or trustworthy PRA process

  16. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies.

    Directory of Open Access Journals (Sweden)

    Wesley K Thompson

    2015-12-01

    Full Text Available Characterizing the distribution of effects from genome-wide genotyping data is crucial for understanding important aspects of the genetic architecture of complex traits, such as number or proportion of non-null loci, average proportion of phenotypic variance explained per non-null effect, power for discovery, and polygenic risk prediction. To this end, previous work has used effect-size models based on various distributions, including the normal and normal mixture distributions, among others. In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local false discovery rate, and power for discovery of a specified proportion of phenotypic variance explained from additive effects of loci surpassing a given significance threshold. We also examine the crucial issue of the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While capturing the general behavior of the data, this mixture model underestimates the tails of the CD effect size distribution. We discuss the

  17. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies.

    Science.gov (United States)

    Thompson, Wesley K; Wang, Yunpeng; Schork, Andrew J; Witoelar, Aree; Zuber, Verena; Xu, Shujing; Werge, Thomas; Holland, Dominic; Andreassen, Ole A; Dale, Anders M

    2015-12-01

    Characterizing the distribution of effects from genome-wide genotyping data is crucial for understanding important aspects of the genetic architecture of complex traits, such as number or proportion of non-null loci, average proportion of phenotypic variance explained per non-null effect, power for discovery, and polygenic risk prediction. To this end, previous work has used effect-size models based on various distributions, including the normal and normal mixture distributions, among others. In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local false discovery rate, and power for discovery of a specified proportion of phenotypic variance explained from additive effects of loci surpassing a given significance threshold. We also examine the crucial issue of the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While capturing the general behavior of the data, this mixture model underestimates the tails of the CD effect size distribution. We discuss the implications of
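    One quantity this model yields, the local false discovery rate, follows directly from the two-component mixture density. A minimal sketch under assumed parameter values (π0 and the non-null variance below are illustrative, not the paper's fitted CD/SZ estimates):

```python
import math

def normal_pdf(z, sigma):
    """Density at z of a zero-mean normal with standard deviation sigma."""
    return math.exp(-0.5 * (z / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def local_fdr(z, pi0, sigma1):
    """Local false discovery rate at test statistic z under a scale
    mixture of two zero-mean normals: null N(0, 1) with weight pi0,
    non-null N(0, sigma1^2) with weight 1 - pi0."""
    f0 = normal_pdf(z, 1.0)
    f1 = normal_pdf(z, sigma1)
    return pi0 * f0 / (pi0 * f0 + (1.0 - pi0) * f1)

# Illustrative parameters: 95% of SNPs null, non-null z-scores with
# standard deviation 2 (hypothetical values for demonstration).
fdr_small = local_fdr(0.5, pi0=0.95, sigma1=2.0)  # modest z: likely null
fdr_large = local_fdr(5.0, pi0=0.95, sigma1=2.0)  # extreme z: likely non-null
```

    The same mixture density also gives the non-null proportion and replication probabilities described in the abstract, since all are functionals of (π0, σ1).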

  18. Time-frequency analysis : mathematical analysis of the empirical mode decomposition.

    Science.gov (United States)

    2009-01-01

    Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...

  19. Elemental analysis of Uranouchi bay seabed sludge using PIXE

    International Nuclear Information System (INIS)

    Kabir, M. Hasnat; Narusawa, Tadashi; Nishiyama, Fumitaka; Sumi, Katsuhiro

    2006-01-01

    Elemental analyses were carried out on seabed sludge collected from Uranouchi bay (Kochi, Japan) using Particle Induced X-ray Emission (PIXE). Seabed-sludge contamination with heavy metals as well as toxic elements has become one of the most serious environmental problems. The aim of the present study is to identify areas of the bay polluted by heavy and toxic elements. As a result of the analyses of samples collected from eleven different places in the bay, seventeen elements, including toxic ones, were detected. The results suggest that the center region of the bay is seriously contaminated by heavy and toxic elements in comparison with the other areas of the bay. (author)

  20. Empirical and theoretical analysis of complex systems

    Science.gov (United States)

    Zhao, Guannan

    structures evolve on a similar timescale to individual level transmission, we investigated the process of transmission through a model population comprising of social groups which follow simple dynamical rules for growth and break-up, and the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power law test algorithm, we have developed a fast testing procedure using parallel computation.

  1. Government debt in Greece: An empirical analysis

    Directory of Open Access Journals (Sweden)

    Gisele Mah

    2014-06-01

    Full Text Available Greek government debt has been increasing above the percentage stated in the growth and stability path, from 112.9% in 2008 to 175.6% in 2013. This paper investigates the determinants of general government debt in Greece by means of a Vector Error Correction Model framework, Variance Decomposition, and Generalized Impulse Response Function Analysis. The analysis showed a significant negative relationship between general government debt and the government deficit, and between general government debt and inflation. Shocks to the government deficit and to inflation will cause general government debt to increase. The government deficit should be increased, since gross capital formation is included in its calculation and could be invested in income-generating projects. The current account balance should be reduced by improving the net trade balance.

  2. Microscopic saw mark analysis: an empirical approach.

    Science.gov (United States)

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2015-01-01

    Microscopic saw mark analysis is a well-published and generally accepted qualitative analytical method. However, little research has focused on identifying and mitigating potential sources of error associated with the method. The presented study proposes the use of classification trees and random forest classifiers as an optimal, statistically sound approach to mitigating observer variability and outcome error in microscopic saw mark analysis. The statistical model was applied to 58 experimental saw marks created with four types of saws. The saw marks were made in fresh human femurs obtained through anatomical gift and were analyzed using a Keyence digital microscope. The statistical approach weighted the variables based on discriminatory value and produced decision trees with an associated outcome error rate of 8.62-17.82%. © 2014 American Academy of Forensic Sciences.

  3. THE LISBON STRATEGY: AN EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Silvestri Marcello

    2010-07-01

    Full Text Available This paper investigates European economic integration within the framework of the 2000 Lisbon Council, with the aim of studying the dynamics affecting the social and economic life of European Countries. Such a descriptive investigation focuses on certain significant variables of the new theories highlighting the importance of technological innovation and human capital. To this end, the multivariate statistical technique of Principal Component Analysis has been applied in order to classify Countries with regard to the investigated phenomenon.

  4. Some connections for manuals of empirical logic to functional analysis

    International Nuclear Information System (INIS)

    Cook, T.A.

    1981-01-01

    In this informal presentation, the theory of manuals of operations is connected with some familiar concepts in functional analysis; namely, base normed and order unit normed spaces. The purpose of this discussion is to present several general open problems which display the interplay of empirical logic with functional analysis. These are mathematical problems with direct physical interpretation. (orig./HSI)

  5. Empirical analysis of industrial operations in Montenegro

    Directory of Open Access Journals (Sweden)

    Galić Jelena

    2012-12-01

    Full Text Available Since the start of the transition process, industrial production in Montenegro has faced serious problems, and its share in GDP is constantly decreasing. The global financial crisis has to a large extent negatively influenced industry. Analysis of financial indicators showed that industry had significant losses, a problem of undercapitalisation, and liquidity problems. Looking at industry sectors, the situation is more favourable in the production of electricity, gas and water than in the extracting industry and mining. The paper proposes economic policy measures to improve the situation in industry.

  6. Empirical models based on the universal soil loss equation fail to predict sediment discharges from Chesapeake Bay catchments.

    Science.gov (United States)

    Boomer, Kathleen B; Weller, Donald E; Jordan, Thomas E

    2008-01-01

    The Universal Soil Loss Equation (USLE) and its derivatives are widely used for identifying watersheds with a high potential for degrading stream water quality. We compared sediment yields estimated from regional application of the USLE, the automated revised RUSLE2, and five sediment delivery ratio algorithms to measured annual average sediment delivery in 78 catchments of the Chesapeake Bay watershed. We did the same comparisons for another 23 catchments monitored by the USGS. Predictions exceeded observed sediment yields by more than 100% and were highly correlated with USLE erosion predictions (Pearson r range, 0.73-0.92; p USLE estimates (r = 0.87; p USLE model did not change the results. In ranked comparisons between observed and predicted sediment yields, the models failed to identify catchments with higher yields (r range, -0.28-0.00; p > 0.14). In a multiple regression analysis, soil erodibility, log (stream flow), basin shape (topographic relief ratio), the square-root transformed proportion of forest, and occurrence in the Appalachian Plateau province explained 55% of the observed variance in measured suspended sediment loads, but the model performed poorly (r(2) = 0.06) at predicting loads in the 23 USGS watersheds not used in fitting the model. The use of USLE or multiple regression models to predict sediment yields is not advisable despite their present widespread application. Integrated watershed models based on the USLE may also be unsuitable for making management decisions.
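    For context, the USLE itself is a simple multiplicative model, and catchment-scale predictions of the kind evaluated above combine it with a sediment delivery ratio. The factor values below are hypothetical illustrations, not data from the study:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Gross soil loss A (t/ha/yr): rainfall erosivity R, soil
    erodibility K, slope length-steepness LS, cover C, practice P."""
    return R * K * LS * C * P

def sediment_yield(A, area_ha, sdr):
    """Catchment sediment yield: gross erosion scaled by a sediment
    delivery ratio (SDR) between 0 and 1 -- the step the study found
    to systematically over-predict measured loads."""
    return A * area_ha * sdr

# Hypothetical factor values for an illustrative catchment.
A = usle_soil_loss(R=170.0, K=0.30, LS=1.2, C=0.10, P=1.0)
yield_t = sediment_yield(A, area_ha=2500.0, sdr=0.25)  # tonnes per year
```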

  7. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of the Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, have been presented in the first part of the paper. In this part, they are applied for testing modelling hypothesis in the framework of the thermal analysis of an actual building. Sensitivity analysis tools have been first used to identify the parts of the model that can be really tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for model behaviour improvement has been finally obtained by optimisation techniques. This example of application shows how model parameters space analysis is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  8. Neutron activation analysis to the profile surface sediments from several sites on the Havana Bay

    International Nuclear Information System (INIS)

    Diaz Riso, O.; Gelen, A.; Lopez, N.; Gonzalez, H.; Manso, M.V.; Graciano, A.M.; Nogueira, C.A.; Beltran, J.; Soto, J.

    2003-01-01

    The instrumental neutron activation analysis (INAA) technique was employed to analyze surface sediments from several sites in Havana Bay, Cuba. Measurements of heavy and trace elements in the sediments are reported. The results show that the concentration of the elements is site dependent. The data suggest that an anthropogenic input into the bay from domestic sewage and industries occurred.

  9. An Empirical Analysis of the Relationship between Minimum Wage ...

    African Journals Online (AJOL)

    An Empirical Analysis of the Relationship between Minimum Wage, Investment and Economic Growth in Ghana. ... In addition, the ratio of public investment to tax revenue must increase as minimum wage increases since such complementary changes are more likely to lead to economic growth. Keywords: minimum wage ...

  10. FY 2016 Grant Announcement: FY 2016 Technical Analysis and Programmatic Evaluation Support to the Chesapeake Bay Program Partnership

    Science.gov (United States)

    The U.S. Environmental Protection Agency’s Chesapeake Bay Program Office is announcing a Request for Proposals for applicants to provide the Chesapeake Bay Program partners with technical analysis and programmatic evaluation support.

  11. Victim countries of transnational terrorism: an empirical characteristics analysis.

    Science.gov (United States)

    Elbakidze, Levan; Jin, Yanhong

    2012-12-01

    This study empirically investigates the association between country-level socioeconomic characteristics and risk of being victimized in transnational terrorism events. We find that a country's annual financial contribution to the U.N. general operating budget has a positive association with the frequency of being victimized in transnational terrorism events. In addition, per capita GDP, political freedom, and openness to trade are nonlinearly related to the frequency of being victimized in transnational terrorism events. © 2012 Society for Risk Analysis.

  12. Islamic banks and profitability: an empirical analysis of Indonesian banking

    OpenAIRE

    Jordan, Sarah

    2013-01-01

    This paper provides an empirical analysis of the factors that determine the profitability of Indonesian banks between the years 2006-2012. In particular, it investigates whether there are any significant differences in terms of profitability between Islamic banks and commercial banks. The results, obtained by applying the system-GMM estimator to the panel of 54 banks, indicate that the high bank profitability during these years were determined mainly by the size of the banks, the market share...

  13. Explaining Innovation. An Empirical Analysis of Industry Data from Norway

    Directory of Open Access Journals (Sweden)

    Torbjørn Lorentzen

    2016-01-01

    Full Text Available The objective of the paper is to analyse why some firms innovate while others do not. The paper combines different theories of innovation by relating innovation to internal, firm-specific assets and external, regional factors. Hypotheses are derived from theories and tested empirically by using logistic regression. The empirical analysis indicates that internal funding of R&D and the size of the firm are the most important firm-specific attributes for successful innovation. External, regional factors are also important. The analysis shows that firms located in large urban regions have significantly higher innovation rates than firms located in the periphery, and firms involved in regional networking are more likely to innovate than firms not involved in networking. The analysis contributes to a theoretical and empirical understanding of the factors that influence innovation and of the role innovation plays in the market economy. Innovation policy should be targeted at developing a tax system and building infrastructure that give firms incentives to invest and to allocate internal resources to R&D activities and to collaborate with others in innovation. From an economic policy perspective, consideration should be given to allocating more public resources to rural areas in order to compensate for the asymmetric distribution of resources between the centre and the periphery. The paper contributes to the scientific literature on innovation by combining the firm-oriented perspective, with its weight on firm-specific internal resources, and a system perspective, which focuses on external resources and networking as the most important determinants of innovation in firms.

  14. Multielemental analysis of surface sediments in Havana bay (Cuba) using X-ray fluorescence

    International Nuclear Information System (INIS)

    Gelen, A.; Corrales, Y.; Lopez, N.; Manso Guevara, M. V.; Casanova, A. O.; Alessandro, K. D'; Diaz, O.; Espen, P. Van; Beltran, J.; Soto, J.

    2006-01-01

    Multielemental analysis was performed on surface sediments from Havana Bay. Twenty-one samples were analysed by energy-dispersive X-ray fluorescence using a spectrometer based on a Si(Li) semiconductor detector and a 109Cd source. The results showed levels of contamination similar to those obtained by neutron activation analysis. The data suggest that an anthropogenic input into the bay from domestic sewage and industries occurred. (Full text)

  15. Integration of least angle regression with empirical Bayes for multi-locus genome-wide association studies

    Science.gov (United States)

    Multi-locus genome-wide association studies have become the state-of-the-art procedure to identify quantitative trait loci (QTL) associated with traits simultaneously. However, implementation of the multi-locus model is still difficult. In this study, we integrated least angle regression with empirical B...

  16. EMPIRICAL ANALYSIS OF REMITTANCE INFLOW: THE CASE OF NEPAL

    Directory of Open Access Journals (Sweden)

    Karan Singh Thagunna

    2013-01-01

    Full Text Available This paper analyzes nine years of remittance inflow and macroeconomic data for Nepal and studies the effect of remittances on each of those macroeconomic variables. We used the Unit Root Test, Least Squares Regression Analysis, and the Granger Causality Test. The empirical results suggest that remittances have a stronger causal effect on consumption and import patterns and less of an effect on investment. Furthermore, drawing on the available literature, this paper discusses the importance of channeling remittance funds into productive capital, mainly public infrastructure, in comparison with the South Korean case study.

  17. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    Full Text Available This paper aims at analysing which kinds of currently labelled information are of interest to and actually used by consumers, and which additional kinds could improve consumer choices. We investigate the attitude of consumers towards innovative strategies for the diffusion of product information, such as smart labels for mobile phones. The empirical analysis was organised in focus groups followed by a survey of 240 consumers. Results show that the most important nutritional claims are vitamins, energy and fat content. Consumers show a high interest in the origin of the products, GMOs, environmental impact, animal welfare and type of breeding.

  18. Prominent feature extraction for review analysis: an empirical study

    Science.gov (United States)

    Agarwal, Basant; Mittal, Namita

    2016-05-01

    Sentiment analysis (SA) research has increased tremendously in recent times. SA aims to determine the sentiment orientation of a given text as positive or negative polarity. The motivation for SA research is the need for industry to know the opinion of users about their products from online portals, blogs, discussion boards, reviews and so on. Efficient features need to be extracted for machine-learning algorithms to achieve better sentiment classification. In this paper, initially various features are extracted, such as unigrams, bi-grams and dependency features from the text. In addition, new bi-tagged features are also extracted that conform to predefined part-of-speech patterns. Furthermore, various composite features are created using these features. Information gain (IG) and minimum redundancy maximum relevancy (mRMR) feature selection methods are used to eliminate noisy and irrelevant features from the feature vector. Finally, machine-learning algorithms are used to classify the review document into the positive or negative class. The effects of different categories of features are investigated on four standard data-sets, namely, movie review and product (book, DVD and electronics) review data-sets. Experimental results show that composite features created from prominent unigram and bi-tagged features perform better than other features for sentiment classification. mRMR is a better feature selection method than IG for sentiment classification. The Boolean Multinomial Naïve Bayes algorithm performs better than the support vector machine classifier for SA in terms of accuracy and execution time.
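    The Boolean (binarized) multinomial naive Bayes classifier mentioned in the abstract can be sketched compactly: each word is counted at most once per document, and class-conditional presence probabilities are Laplace-smoothed. The toy corpus below is invented for illustration:

```python
import math
from collections import defaultdict

def train(docs):
    """Boolean naive Bayes training: count, per class, the number of
    documents in which each word appears (presence, not frequency)."""
    doc_freq = {"pos": defaultdict(int), "neg": defaultdict(int)}
    class_counts = {"pos": 0, "neg": 0}
    vocab = set()
    for words, label in docs:
        class_counts[label] += 1
        for w in set(words):                # binarize within each document
            doc_freq[label][w] += 1
            vocab.add(w)
    return doc_freq, class_counts, vocab

def predict(words, model):
    doc_freq, class_counts, vocab = model
    total = sum(class_counts.values())
    scores = {}
    for c in class_counts:
        score = math.log(class_counts[c] / total)
        for w in set(words):
            if w in vocab:                  # Laplace-smoothed presence prob.
                score += math.log((doc_freq[c][w] + 1) / (class_counts[c] + 2))
        scores[c] = score
    return max(scores, key=scores.get)

# Invented toy review corpus, for illustration only.
corpus = [
    (["great", "acting", "loved", "it"], "pos"),
    (["wonderful", "plot", "great", "cast"], "pos"),
    (["boring", "plot", "hated", "it"], "neg"),
    (["terrible", "acting", "boring", "cast"], "neg"),
]
model = train(corpus)
label = predict(["great", "plot", "loved"], model)
```

    In a real pipeline the word lists would be replaced by the unigram, bi-gram, or bi-tagged features the paper describes, after mRMR or IG feature selection.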

  19. Contribution of Online Trading of Used Goods to Resource Efficiency: An Empirical Study of eBay Users

    Directory of Open Access Journals (Sweden)

    Jens Clausen

    2010-06-01

    Full Text Available This paper discusses the sustainability impact (contribution to sustainability, reduction of adverse environmental impacts of online second-hand trading. A survey of eBay users shows that a relationship between the trading of used goods and the protection of natural resources is hardly realized. Secondly, the environmental motivation and the willingness to act in a sustainable manner differ widely between groups of consumers. Given these results from a user perspective, the paper tries to find some objective hints of online second-hand trading’s environmental impact. The greenhouse gas emissions resulting from the energy used for the trading transactions seem to be considerably lower than the emissions due to the (avoided production of new goods. The paper concludes with a set of recommendations for second-hand trade and consumer policy. Information about the sustainability benefits of purchasing second-hand goods should be included in general consumer information, and arguments for changes in behavior should be targeted to different groups of consumers.

  20. An analysis of the number of parking bays and checkout counters for a supermarket using SAS simulation studio

    Science.gov (United States)

    Kar, Leow Soo

    2014-07-01

    Two important factors that influence customer satisfaction in large supermarkets or hypermarkets are adequate parking facilities and short waiting times at the checkout counters. This paper describes the simulation analysis of a large supermarket to determine the optimal levels of these two factors. SAS Simulation Studio is used to model a large supermarket in a shopping mall with a car park facility. In order to make the simulation model more realistic, a number of complexities are introduced into the model. For example, arrival patterns of customers vary with the time of day (morning, afternoon and evening) and with the day of the week (weekdays or weekends), the transport mode of arriving customers (by car or other means), the mode of payment (cash or credit card), customer shopping pattern (leisurely, normal, exact) or choice of checkout counters (normal or express). In this study, we focus on two important components of the simulation model, namely the parking area and the normal and express checkout counters. The parking area is modeled using a Resource Pool block where one resource unit represents one parking bay. A customer arriving by car seizes a unit of the resource from the Pool block (parks the car) and only releases it when he exits the system. Cars arriving when the Resource Pool is empty (no more parking bays) leave without entering the system. The normal and express checkouts are represented by Server blocks with appropriate service time distributions. As a case study, a supermarket in a shopping mall with a limited number of parking bays in Bangsar was chosen for this research. Empirical data on arrival patterns, arrival modes, payment modes, shopping patterns and service times of the checkout counters were collected and analyzed to validate the model. Sensitivity analysis was also performed with different simulation scenarios to identify the optimal number of parking bays and checkout counters.
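    The parking component described above is a classic loss system: if cars arrive at random and balk when no bay is free, the long-run fraction of turned-away cars is given by the Erlang-B formula, which gives a quick analytical check on the simulated Resource Pool. The arrival rate and mean stay below are hypothetical, not the Bangsar data:

```python
def erlang_b(bays, offered_load):
    """Erlang-B blocking probability: fraction of arriving cars that
    find every bay occupied in an M/G/c/c loss system, computed with
    the standard stable recursion."""
    b = 1.0
    for n in range(1, bays + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# Hypothetical demand: 60 cars/hour arriving, mean stay 45 minutes,
# so the offered load is 60 * 0.75 = 45 erlangs.
load = 60 * 0.75
blocking = {bays: erlang_b(bays, load) for bays in (40, 50, 60)}
# More bays => fewer cars turned away (blocking decreases monotonically).
```

    The simulation is still needed for the time-varying arrival rates and the checkout queues, but the analytical blocking probability is a useful validation target for the steady-state parking results.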

  1. Energy efficiency determinants: An empirical analysis of Spanish innovative firms

    International Nuclear Information System (INIS)

    Costa-Campi, María Teresa; García-Quevedo, José; Segarra, Agustí

    2015-01-01

    This paper examines the extent to which innovative Spanish firms pursue improvements in energy efficiency (EE) as an objective of innovation. The increase in energy consumption and its impact on greenhouse gas emissions justifies the greater attention being paid to energy efficiency and especially to industrial EE. The ability of manufacturing companies to innovate and improve their EE has a substantial influence on attaining objectives regarding climate change mitigation. Despite the effort to design more efficient energy policies, the EE determinants in manufacturing firms have been little studied in the empirical literature. From an exhaustive sample of Spanish manufacturing firms and using a logit model, we examine the energy efficiency determinants for those firms that have innovated. To carry out the econometric analysis, we use panel data from the Community Innovation Survey for the period 2008–2011. Our empirical results underline the role of size among the characteristics of firms that facilitate energy efficiency innovation. Regarding company behaviour, firms that consider the reduction of environmental impacts to be an important objective of innovation and that have introduced organisational innovations are more likely to innovate with the objective of increasing energy efficiency. -- Highlights: •Drivers of innovation in energy efficiency at firm-level are examined. •Tangible investments have a greater influence on energy efficiency than R&D. •Environmental and energy efficiency innovation objectives are complementary. •Organisational innovation favors energy efficiency innovation. •Public policies should be implemented to improve firms’ energy efficiency

  2. Environmental pressure group strength and air pollution. An empirical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Binder, Seth; Neumayer, Eric [Department of Geography and Environment and Center for Environmental Policy and Governance (CEPG), London School of Economics and Political Science, Houghton Street, London WC2A 2AE (United Kingdom)

    2005-12-01

    There is an established theoretical and empirical case-study literature arguing that environmental pressure groups have a real impact on pollution levels. Our original contribution to this literature is to provide the first systematic quantitative test of the strength of environmental non-governmental organizations (ENGOs) on air pollution levels. We find that ENGO strength exerts a statistically significant impact on sulfur dioxide, smoke and heavy particulates concentration levels in a cross-country time-series regression analysis. This result holds true both for ordinary least squares and random-effects estimation. It is robust to controlling for the potential endogeneity of ENGO strength with the help of instrumental variables. The effect is also substantively important. Strengthening ENGOs represents an important strategy by which aid donors, foundations, international organizations and other stakeholders can try to achieve lower pollution levels around the world.

  3. Environmental pressure group strength and air pollution. An empirical analysis

    International Nuclear Information System (INIS)

    Binder, Seth; Neumayer, Eric

    2005-01-01

    There is an established theoretical and empirical case-study literature arguing that environmental pressure groups have a real impact on pollution levels. Our original contribution to this literature is to provide the first systematic quantitative test of the strength of environmental non-governmental organizations (ENGOs) on air pollution levels. We find that ENGO strength exerts a statistically significant impact on sulfur dioxide, smoke and heavy particulates concentration levels in a cross-country time-series regression analysis. This result holds true both for ordinary least squares and random-effects estimation. It is robust to controlling for the potential endogeneity of ENGO strength with the help of instrumental variables. The effect is also substantively important. Strengthening ENGOs represents an important strategy by which aid donors, foundations, international organizations and other stakeholders can try to achieve lower pollution levels around the world

  4. Auto-correlation analysis of wave heights in the Bay of Bengal

    Indian Academy of Sciences (India)

    Time series observations of significant wave heights in the Bay of Bengal were subjected to auto-correlation analysis to determine the temporal variability scale. The analysis indicates an exponential fall of auto-correlation in the first few hours with a decorrelation time scale of about six hours. A similar figure was found earlier ...
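
    The decorrelation time scale described above can be estimated by locating the lag at which the sample autocorrelation first falls below 1/e. A minimal pure-Python sketch on a synthetic AR(1) series (the data and function names are illustrative, not the Bay of Bengal observations):

```python
import math
import random

def autocorr(x, lag):
    """Sample autocorrelation of a series at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

def decorrelation_time(x, max_lag=48):
    """First lag (in hours, for hourly data) where autocorrelation drops below 1/e."""
    for lag in range(1, max_lag + 1):
        if autocorr(x, lag) < 1.0 / math.e:
            return lag
    return max_lag

# Synthetic hourly AR(1) series with a theoretical e-folding time of 6 hours.
random.seed(0)
phi = math.exp(-1.0 / 6.0)
x = [0.0]
for _ in range(5000):
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))

print(decorrelation_time(x))
```

    For an AR(1) process with coefficient exp(-1/6), the theoretical e-folding time is six hours, so the estimate should land near that value.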

  5. Refined discrete and empirical horizontal gradients in VLBI analysis

    Science.gov (United States)

    Landskron, Daniel; Böhm, Johannes

    2018-02-01

    Missing or incorrect consideration of azimuthal asymmetry of troposphere delays is a considerable error source in space geodetic techniques such as Global Navigation Satellite Systems (GNSS) or Very Long Baseline Interferometry (VLBI). So-called horizontal troposphere gradients are generally utilized for modeling such azimuthal variations and are particularly required for observations at low elevation angles. Apart from estimating the gradients within the data analysis, which has become common practice in space geodetic techniques, there is also the possibility to determine the gradients beforehand from data sources other than the actual observations. Using ray-tracing through Numerical Weather Models (NWMs), we determined discrete gradient values, referred to as GRAD, for VLBI observations, based on the standard gradient model by Chen and Herring (J Geophys Res 102(B9):20489-20502, 1997. https://doi.org/10.1029/97JB01739) and also for new, higher-order gradient models. These gradients are produced on the same data basis as the Vienna Mapping Functions 3 (VMF3) (Landskron and Böhm in J Geod, 2017. https://doi.org/10.1007/s00190-017-1066-2), so they can also be regarded as the VMF3 gradients, as they are fully consistent with each other. From VLBI analyses with the Vienna VLBI and Satellite Software (VieVS), it becomes evident that baseline length repeatabilities (BLRs) are improved on average by 5% when using the a priori gradients GRAD instead of estimating the gradients. The reason for this improvement is that gradient estimation yields poor results for VLBI sessions with a small number of observations, while the GRAD a priori gradients are unaffected by this. We also developed a new empirical gradient model applicable for any time and location on Earth, which is included in the Global Pressure and Temperature 3 (GPT3) model. Although being able to describe only the systematic component of azimuthal asymmetry and no short-term variations at all, even these

  6. Multielemental analysis of profile sediments from several sites in Havana Bay by neutron activation analysis

    International Nuclear Information System (INIS)

    Diaz Riso, O.; Gelen, A.; Lopez, N.; Manso, M.V.; Graciano, A.M.; Nogueira, C.A.; Beltran, J.

    2003-01-01

    The instrumental neutron activation analysis (INAA) technique was employed to analyze profile sediments from several sites in Havana Bay, Cuba. Measurements of 24 heavy and trace elements in the sediments are reported. The results show an increase in sediment pollution during the 1970s and 1980s. The data confirm that an anthropogenic input into the bay from domestic sewage and industries occurred.

  7. pKWmEB: integration of Kruskal-Wallis test with empirical Bayes under polygenic background control for multi-locus genome-wide association study.

    Science.gov (United States)

    Ren, Wen-Long; Wen, Yang-Jun; Dunwell, Jim M; Zhang, Yuan-Ming

    2018-03-01

    Although nonparametric methods in genome-wide association studies (GWAS) are robust in quantitative trait nucleotide (QTN) detection, the absence of polygenic background control in single-marker association in genome-wide scans results in a high false positive rate. To overcome this issue, we proposed an integrated nonparametric method for multi-locus GWAS. First, a new model transformation was used to whiten the covariance matrix of the polygenic matrix K and the environmental noise. Using the transformed model, the Kruskal-Wallis test, along with least angle regression, was then used to select all the markers potentially associated with the trait. Finally, all the selected markers were placed into a multi-locus model, their effects were estimated by empirical Bayes, and all the nonzero effects were further identified by a likelihood ratio test for true QTN detection. This method, named pKWmEB, was validated by a series of Monte Carlo simulation studies. As a result, pKWmEB effectively controlled the false positive rate, even though a less stringent significance criterion was adopted. More importantly, pKWmEB retained the high power of the Kruskal-Wallis test and provided QTN effect estimates. To further validate pKWmEB, we re-analyzed four flowering-time-related traits in Arabidopsis thaliana, and detected some previously reported genes that were not identified by the other methods.
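
    The screening step of the method rests on the Kruskal-Wallis test applied marker by marker. A minimal sketch of the H statistic for three genotype groups (pure Python; the data and function names are illustrative, and this is not the pKWmEB implementation):

```python
def ranks(values):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic for a list of sample groups (no tie correction)."""
    pooled = [v for g in groups for v in g]
    n = len(pooled)
    r = ranks(pooled)
    h, start = 0.0, 0
    for g in groups:
        rsum = sum(r[start:start + len(g)])
        h += rsum ** 2 / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)

# Three hypothetical genotype classes with a clear shift in trait values:
aa = [1.0, 1.2, 0.9, 1.1]
ab = [2.0, 2.1, 1.9, 2.2]
bb = [3.0, 3.1, 2.9, 3.2]
print(kruskal_wallis_h([aa, ab, bb]))
```

    For the perfectly separated groups above, H = 635/13 - 39 ≈ 9.85, which would be compared against a chi-square distribution with 2 degrees of freedom.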

  8. A chi-square goodness-of-fit test for non-identically distributed random variables: with application to empirical Bayes

    International Nuclear Information System (INIS)

    Conover, W.J.; Cox, D.D.; Martz, H.F.

    1997-12-01

    When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs which are provided
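
    As context for the goodness-of-fit question, the parametric empirical Bayes step the paper diagnoses can be sketched in a few lines: fit a beta prior to binomial failure counts by the method of moments, then shrink each unit's raw rate toward the pooled mean. The data and helper names below are hypothetical, and equal sample sizes are assumed for simplicity (the paper's tests address the harder unequal-size case):

```python
def fit_beta_prior(counts, n):
    """Method-of-moments beta prior for binomial counts with common size n.

    Assumes the observed spread of rates exceeds pure binomial noise,
    as it does for the data below."""
    p = [c / n for c in counts]
    m = sum(p) / len(p)
    s2 = sum((v - m) ** 2 for v in p) / (len(p) - 1)
    tau2 = s2 - m * (1 - m) / n      # between-unit variance beyond binomial noise
    ab = m * (1 - m) / tau2 - 1      # implied alpha + beta
    return m * ab, (1 - m) * ab      # alpha, beta

def shrink(count, n, alpha, beta):
    """Posterior mean (empirical Bayes estimate) for a single unit."""
    return (count + alpha) / (n + alpha + beta)

# Hypothetical demand-failure counts out of n = 200 demands at eight plants:
counts = [2, 5, 1, 8, 3, 4, 6, 2]
n = 200
a, b = fit_beta_prior(counts, n)
raw = counts[3] / n                  # 0.040
eb = shrink(counts[3], n, a, b)      # pulled toward the pooled mean (~0.019)
print(raw, eb)
```

    The raw estimate 0.040 is pulled toward the pooled mean of about 0.019, illustrating the shrinkage whose validity the chi-square test is designed to check.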

  9. Multilevel empirical Bayes modeling for improved estimation of toxicant formulations to suppress parasitic sea lamprey in the Upper Great Lakes

    Science.gov (United States)

    Hatfield, Laura A.; Gutreuter, Steve; Boogaard, Michael A.; Carlin, Bradley P.

    2011-01-01

    Estimation of extreme quantal-response statistics, such as the concentration required to kill 99.9% of test subjects (LC99.9), remains a challenge in the presence of multiple covariates and complex study designs. Accurate and precise estimates of the LC99.9 for mixtures of toxicants are critical to ongoing control of a parasitic invasive species, the sea lamprey, in the Laurentian Great Lakes of North America. The toxicity of those chemicals is affected by local and temporal variations in water chemistry, which must be incorporated into the modeling. We develop multilevel empirical Bayes models for data from multiple laboratory studies. Our approach yields more accurate and precise estimation of the LC99.9 compared to alternative models considered. This study demonstrates that properly incorporating hierarchical structure in laboratory data yields better estimates of LC99.9 stream treatment values that are critical to larvae control in the field. In addition, out-of-sample prediction of the results of in situ tests reveals the presence of a latent seasonal effect not manifest in the laboratory studies, suggesting avenues for future study and illustrating the importance of dual consideration of both experimental and observational data.

  10. Numerical Analysis of Storm Surge and Seiche at Tokyo Bay caused by the 2 Similar Typhoons, Typhoon Phanphon and Vongfong in 2014

    Science.gov (United States)

    Iwamoto, T.; Takagawa, T.

    2017-12-01

    A long-period damped oscillation, or seiche, sometimes occurs inside a harbor after a typhoon passes. In some cases the maximum sea level results from the superposition of the astronomical tide and the seiche rather than from the peak of the storm surge. Since clarifying the factors behind seiches is important for reducing disaster potential, a long-period seiche with a fundamental period of 5.46 hours in Tokyo Bay (Konishi, 2008) was investigated through numerical simulations and analyses. We examined the cases of Typhoon Phanphon and Typhoon Vongfong in 2014 (hereafter Case P and Case V). Their intensities and translation speeds were similar, and both best-tracks were arc-shaped, typical of typhoons approaching Tokyo Bay. Although the track of Case V lay about 1.5 degrees higher in latitude than that of Case P, only Typhoon Phanphon caused a significant seiche. Firstly, numerical simulations of the two storm surges in Tokyo Bay were conducted with the Regional Ocean Modeling System (ROMS) and Meso-Scale Model Grid Point Values (MSM-GPV). MSM-GPV provided the 10 m wind speed and Sea Level Pressure (SLP); the Mean Error (ME) and Root Mean Square Error (RMSE) of SLP were low compared to data from the 12 JMA observation points (Case P: ME -0.303 hPa, RMSE 1.87 hPa; Case V: ME -0.285 hPa, RMSE 0.74 hPa). The computational results underestimated the maximum storm surge, but the difference was less than 20 cm at 5 observation points in Tokyo Bay (Fig. 1, 2). Then, power spectra, coherences and phase differences of the storm surges at the 5 observation points were obtained by spectral analysis of the observed and simulated waveforms. For Case P, the phase difference between the bay mouth and the innermost part of Tokyo Bay was small, and the coherence was almost 1 (Fig. 3, 4). For Case V, however, the coherence was small around the fundamental period of 5.46 hours. Furthermore, Empirical Orthogonal Function (EOF) analyses of the storm surge, SLP and sea surface stress were conducted. The contributions of EOF1 were above 90% for all variables, the
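
    The EOF step in such an analysis amounts to an eigen-decomposition of the station covariance matrix; the leading eigenvalue divided by the trace gives the variance fraction attributed to EOF1. A self-contained sketch on synthetic multi-station data (illustrative; not the Tokyo Bay fields):

```python
import math
import random

random.seed(0)
# Synthetic 5-station "storm surge" field dominated by one shared mode.
stations, steps = 5, 500
loadings = [1.0, 0.9, 0.8, 0.7, 0.6]
field = [[loadings[j] * math.sin(2 * math.pi * t / 50) + 0.1 * random.gauss(0, 1)
          for j in range(stations)] for t in range(steps)]

# Remove station means and form the station-by-station covariance matrix.
means = [sum(row[j] for row in field) / steps for j in range(stations)]
anom = [[row[j] - means[j] for j in range(stations)] for row in field]
cov = [[sum(a[i] * a[j] for a in anom) / steps for j in range(stations)]
       for i in range(stations)]

# Leading EOF by power iteration; its eigenvalue over the covariance trace
# is the fraction of variance explained by EOF1.
v = [1.0] * stations
for _ in range(200):
    w = [sum(cov[i][j] * v[j] for j in range(stations)) for i in range(stations)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]
lam = sum(v[i] * sum(cov[i][j] * v[j] for j in range(stations))
          for i in range(stations))
trace = sum(cov[i][i] for i in range(stations))
print(lam / trace)
```

    Because one shared mode dominates the synthetic field, EOF1 captures well over 90% of the variance, mirroring the EOF1 contributions reported above.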

  11. AutoBayes: A System for Generating Data Analysis Programs from Statistical Models

    OpenAIRE

    Fischer, Bernd; Schumann, Johann

    2003-01-01

    Data analysis is an important scientific task which is required whenever information needs to be extracted from raw data. Statistical approaches to data analysis, which use methods from probability theory and numerical analysis, are well-founded but difficult to implement: the development of a statistical data analysis program for any given application is time-consuming and requires substantial knowledge and experience in several areas. In this paper, we describe AutoBayes, a program synthesis...

  12. BayesLCA: An R Package for Bayesian Latent Class Analysis

    Directory of Open Access Journals (Sweden)

    Arthur White

    2014-11-01

    The BayesLCA package for R provides tools for performing latent class analysis within a Bayesian setting. Three methods for fitting the model are provided, incorporating an expectation-maximization algorithm, Gibbs sampling and a variational Bayes approximation. The article briefly outlines the methodology behind each of these techniques and discusses some of the technical difficulties associated with them. Methods to remedy these problems are also described. Visualization methods for each of these techniques are included, as well as criteria to aid model selection.
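
    BayesLCA itself is an R package; as an illustration of the expectation-maximization variant it offers, here is a minimal two-class latent class EM on synthetic binary item responses (Python, written from the standard LCA updates, not from the package source):

```python
import random

random.seed(1)

# Synthetic responses: two latent classes with different item probabilities.
true_theta = [[0.9, 0.8, 0.85], [0.2, 0.1, 0.25]]
data = []
for _ in range(400):
    cls = 0 if random.random() < 0.5 else 1
    data.append([1 if random.random() < p else 0 for p in true_theta[cls]])

# EM for a two-class latent class model with three binary items.
pi = 0.6                                   # class-0 mixing weight (initial guess)
theta = [[0.7, 0.6, 0.7], [0.3, 0.4, 0.3]]
for _ in range(100):
    # E-step: posterior probability of class 0 for each respondent.
    resp = []
    for x in data:
        l0, l1 = pi, 1.0 - pi
        for j, xj in enumerate(x):
            l0 *= theta[0][j] if xj else 1.0 - theta[0][j]
            l1 *= theta[1][j] if xj else 1.0 - theta[1][j]
        resp.append(l0 / (l0 + l1))
    # M-step: re-estimate mixing weight and item probabilities.
    pi = sum(resp) / len(resp)
    for j in range(3):
        theta[0][j] = sum(r * x[j] for r, x in zip(resp, data)) / sum(resp)
        theta[1][j] = (sum((1 - r) * x[j] for r, x in zip(resp, data))
                       / (len(resp) - sum(resp)))

print(round(pi, 2), [round(t, 2) for t in theta[0]])
```

    The Gibbs-sampling and variational Bayes alternatives mentioned in the abstract replace the E-step's point posterior with sampled or approximate distributions over the same quantities.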

  13. Analysis of the Daya Bay Reactor Antineutrino Flux Changes with Fuel Burnup

    Science.gov (United States)

    Hayes, A. C.; Jungman, Gerard; McCutchan, E. A.; Sonzogni, A. A.; Garvey, G. T.; Wang, X. B.

    2018-01-01

    We investigate the recent Daya Bay results on the changes in the antineutrino flux and spectrum with the burnup of the reactor fuel. We find that the discrepancy between current model predictions and the Daya Bay results can be traced to the originally measured 235U/239Pu ratio of the fission β spectra that were used as a base for the expected antineutrino fluxes. An analysis of the antineutrino spectra that is based on a summation over all fission fragment β decays, using nuclear database input, explains all of the features seen in the Daya Bay evolution data. However, this summation method still allows for an anomaly. We conclude that there is currently not enough information to use the antineutrino flux changes to rule out the possible existence of sterile neutrinos.

  14. Analysis of payload bay magnetic fields due to dc power multipoint and single point ground configurations

    Science.gov (United States)

    Lawton, R. M.

    1976-01-01

    An analysis of magnetic fields in the Orbiter Payload Bay resulting from the present grounding configuration (structure return) is presented, and the amount of improvement that would result from installing wire returns for the three dc power buses is determined. Ac and dc magnetic fields at five points in a cross-section of the bay are calculated for both grounding configurations. Y and Z components of the field at each point are derived in terms of a constant coefficient and the current amplitude of each bus. The dc loads assumed are 100 Amperes for each bus. The ac noise current used is a spectrum 6 dB higher than the Orbiter equipment limit for narrowband conducted emissions. It was concluded that installing return wiring to provide a single point ground for the dc buses in the Payload Bay would reduce the ac and dc magnetic field intensity by approximately 30 dB.

  15. A sensitivity analysis of centrifugal compressors' empirical models

    International Nuclear Information System (INIS)

    Yoon, Sung Ho; Baek, Je Hyun

    2001-01-01

    The mean-line method using empirical models is the most practical method of predicting off-design performance. To gain insight into the empirical models, the influence of empirical models on the performance prediction results is investigated. We found that, in the two-zone model, the secondary flow mass fraction has a considerable effect at high mass flow-rates on the performance prediction curves. In the TEIS model, the first element changes the slope of the performance curves as well as the stable operating range. The second element makes the performance curves move up and down as it increases or decreases. It is also discovered that the slip factor affects the pressure ratio, but it has little effect on efficiency. Finally, this study reveals that the skin friction coefficient has a significant effect on both the pressure ratio curve and the efficiency curve. These results show the limitations of the present empirical models, and more reasonable empirical models are needed.

  16. Trend analysis of stressors and ecological responses, particularly nutrients, in the Narragansett Bay Watershed.

    Science.gov (United States)

    Current and historic impacts of nitrogen on water quality were evaluated and relationships between nutrients and ecosystem structure and function were developed for Narragansett Bay, RI. Land use land cover change analysis from 1985 thru 2005 resulted in a 7% increase in urban la...

  17. An Empirical Analysis of Human Performance and Nuclear Safety Culture

    International Nuclear Information System (INIS)

    Jeffrey Joe; Larry G. Blackwood

    2006-01-01

    The purpose of this analysis, which was conducted for the US Nuclear Regulatory Commission (NRC), was to test whether an empirical connection exists between human performance and nuclear power plant safety culture. This was accomplished by analyzing the relationship between a measure of human performance and a plant's Safety Conscious Work Environment (SCWE). SCWE is an important component of the safety culture concept the NRC has developed, but it is not synonymous with it. SCWE is an environment in which employees are encouraged to raise safety concerns both to their own management and to the NRC without fear of harassment, intimidation, retaliation, or discrimination. Because the relationship between human performance and allegations is intuitively reciprocal, both directions of the relationship need exploration, so two series of analyses were performed. First, because human performance data could be indicative of safety culture, regression analyses were performed using human performance data to predict SCWE. Second, because safety culture likely contributes to human performance issues at a plant, a second set of regressions was performed using allegations to predict HFIS results

  18. Empirically characteristic analysis of chaotic PID controlling particle swarm optimization.

    Science.gov (United States)

    Yan, Danping; Lu, Yongzhong; Zhou, Min; Chen, Shiping; Levy, David

    2017-01-01

    Since chaos systems generally have the intrinsic properties of sensitivity to initial conditions, topological mixing and density of periodic orbits, they may tactfully use the chaotic ergodic orbits to achieve the global optimum, or a good approximation of it, for given cost functions with high probability. During the past decade, such methods have increasingly received attention from the academic community and industry throughout the world. To improve the performance of particle swarm optimization (PSO), we herein propose a chaotic proportional integral derivative (PID) controlling PSO algorithm by the hybridization of chaotic logistic dynamics and hierarchical inertia weight. The hierarchical inertia weight coefficients are determined in accordance with the present fitness values of the local best positions so as to adaptively expand the particles' search space. Moreover, the chaotic logistic map is not only used as a substitution for the two random parameters affecting the convergence behavior, but also used in the chaotic local search for the global best position so as to easily avoid the particles' premature behaviors across the whole search space. Thereafter, the convergence of chaotic PID controlling PSO is analyzed in depth. Empirical simulation results demonstrate that, compared with several other chaotic PSO algorithms such as chaotic PSO with the logistic map, chaotic PSO with the tent map and chaotic catfish PSO with the logistic map, chaotic PID controlling PSO exhibits much better search efficiency and quality when solving optimization problems. Additionally, the parameter estimation of a nonlinear dynamic system further clarifies its superiority to chaotic catfish PSO, genetic algorithm (GA) and PSO.
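
    The core substitution the abstract describes, replacing the two stochastic coefficients of PSO with a logistic-map sequence, can be sketched as follows (a minimal illustration on the sphere function; the parameter values and names are ours, not the paper's):

```python
import random

def logistic_stream(seed):
    """Logistic-map sequence x_{k+1} = 4 x_k (1 - x_k); values stay in (0, 1)."""
    x = seed
    while True:
        x = 4.0 * x * (1.0 - x)
        yield x

def sphere(p):
    """Benchmark cost function with minimum 0 at the origin."""
    return sum(v * v for v in p)

random.seed(2)
dim, n, iters = 5, 20, 200
chaos = logistic_stream(0.7)
pos = [[random.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n)]
vel = [[0.0] * dim for _ in range(n)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=sphere)[:]
w, c1, c2 = 0.72, 1.49, 1.49
for _ in range(iters):
    for i in range(n):
        r1, r2 = next(chaos), next(chaos)   # chaotic, not pseudo-random, coefficients
        for d in range(dim):
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest):
                gbest = pbest[i][:]
print(sphere(gbest))
```

    Because the logistic map with r = 4 is ergodic on (0, 1), the coefficients still cover the unit interval, while the deterministic orbit gives the search the chaotic character the paper exploits.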

  19. DDT Analysis of Wetland Sediments in Upper Escambia Bay, Florida

    Science.gov (United States)

    Hopko, M. N.; Wright, J.; Liebens, J.; Vaughan, P.

    2017-12-01

    Dichlorodiphenyltrichloroethane (DDT) was a commonly used pesticide from World War II through the 1960s, applied to control mosquito populations and as an agricultural insecticide. The pesticide and its degradation products (DDD and DDE) can bioaccumulate within ecosystems, with negative implications for animal and human health. Consequently, DDT usage was banned in the United States in 1973. In a contaminant study performed in Escambia Bay, Florida, in 2009, DDT was present in 25% of study sites, most of which were located in the upper bay wetlands. Concentrations were well above the Florida Department of Environmental Protection's (FDEP) Probable Effect Level (PEL), and ratios of DDT and its metabolites indicated a recent introduction to the system. A follow-up study performed in 2016 found no DDT, but did show DDE at several sites. The current study repeated sampling in May 2017 at sites from the 2009 and 2016 studies. Sediment samples were collected in triplicate using a ponar sampler, and DDT, DDD and DDE were extracted using EPA methods 3540c and 3620c. Extracts were analyzed using a gas chromatograph with electron capture detection (GC-ECD) as per EPA method 8081c. Sediment was also analyzed for organic carbon and particle size using an elemental NC analyzer and a laser diffraction particle sizer. Results show the presence of the breakdown products DDE and DDD at multiple sites, but no detectable levels of DDT at any site. Sampling sites with high levels of DDT contamination in 2009 show only breakdown products in both 2016 and 2017. Particle size has little influence on DDD or DDE concentrations, but OC is a controlling factor, as indicated for contaminated sites by Pearson correlations between OC and DDE and DDD of 0.82 and 0.92, respectively. The presence of only DDD and/or DDE in the 2016 and 2017 studies indicates that the parent, DDT, has not been re-introduced into the watershed since 2009 but is degrading in the environment.

  20. Empirical Analysis of Closed-Loop Duopoly Advertising Strategies

    OpenAIRE

    Gary M. Erickson

    1992-01-01

    Closed-loop (perfect) equilibria in a Lanchester duopoly differential game of advertising competition are used as the basis for empirical investigation. Two systems of simultaneous nonlinear equations are formed, one from a general Lanchester model and one from a constrained model. Two empirical applications are conducted. In one involving Coca-Cola and Pepsi-Cola, a formal statistical testing procedure is used to detect whether closed-loop equilibrium advertising strategies are used by the c...

  1. Differences in Dynamic Brand Competition Across Markets: An Empirical Analysis

    OpenAIRE

    Jean-Pierre Dubé; Puneet Manchanda

    2005-01-01

    We investigate differences in the dynamics of marketing decisions across geographic markets empirically. We begin with a linear-quadratic game involving forward-looking firms competing on prices and advertising. Based on the corresponding Markov perfect equilibrium, we propose estimable econometric equations for demand and marketing policy. Our model allows us to measure empirically the strategic response of competitors along with economic measures such as firm profitability. We use a rich da...

  2. HOOPER BAY HOUSING ANALYSIS AND ENERGY FEASIBILITY REPORT

    Energy Technology Data Exchange (ETDEWEB)

    SEA LION CORPORATION; COLD CLIMATE HOUSING RESEARCH CENTER; SOLUTIONS FOR HEALTHY BREATHING; WHITNEY CONSTRUCTION

    2012-12-30

    Sea Lion applied for and received a grant from the Department of Energy (DOE) toward this end, titled Energy Efficiency Development and Deployment in Indian Country. The initial objectives of the Hooper Bay Energy Efficiency Feasibility Study were to demonstrate a 30% reduction in residential/commercial energy usage and identify the economic benefits of implementing energy efficiency measures to the Tribe through: (1) partnering with Whitney Construction and Solutions for Healthy Breathing to train and hire 2 local energy assessors to conduct energy audits of 9 representative housing models and 2 commercial units in the community (these homes are representative of 52 homes constructed across different eras); (2) partnering with Cold Climate Housing Research Center to document current electrical and heating energy consumption and analyze the data for a final feasibility report; (3) assessing the economics of electricity and heating fuel usage; and (4) projecting energy savings or fossil fuel reduction by modeling improvement scenarios and cost feasibility. The following two objectives will be completed after the publication of this report: (5) the development of materials lists for energy efficiency improvements; and (6) identifying financing options for the follow-up energy efficiency implementation phase.

  3. Competition in the German pharmacy market: an empirical analysis.

    Science.gov (United States)

    Heinsohn, Jörg G; Flessa, Steffen

    2013-10-10

    Pharmaceutical products are an important component of expenditure on public health insurance in the Federal Republic of Germany. For years, German policy makers have regulated public pharmacies in order to limit the increase in costs. One reform has followed another, the main objective being to increase competition in the pharmacy market. It is generally assumed that an increase in competition would reduce healthcare costs. However, there is a lack of empirical proof of a stronger orientation of German public pharmacies towards competition thus far. This paper analyses the self-perceptions of owners of German public pharmacies and their orientation towards competition in the pharmacy market. It is based on a cross-sectional survey (N = 289) and distinguishes between successful and less successful pharmacies, the location of the pharmacies (e.g. West German States and East German States) and the gender of the pharmacy owner. The data are analysed descriptively by survey items and employing bivariate and structural equation modelling. The analysis reveals that the majority of owners of public pharmacies in Germany do not currently perceive very strong competitive pressure in the market. However, the innovativeness of the pharmacist is confirmed as most relevant for net revenue development and the profit margin. Some differences occur between regions, e.g. public pharmacies in West Germany have a significantly higher profit margin. This study provides evidence that the German healthcare reforms aimed at increasing the competition between public pharmacies in Germany have not been completely successful. Many owners of public pharmacies disregard instruments of active customer-orientated management (such as customer loyalty or an offensive position and economies of scale), which could give them a competitive advantage. However, it is clear that those pharmacists who strive for systematic and innovative management and adopt an offensive and competitive stance are quite

  4. An empirical analysis of the hydropower portfolio in Pakistan

    International Nuclear Information System (INIS)

    Siddiqi, Afreen; Wescoat, James L.; Humair, Salal; Afridi, Khurram

    2012-01-01

    The Indus Basin of Pakistan with 800 hydropower project sites and a feasible hydropower potential of 60 GW, 89% of which is undeveloped, is a complex system poised for large-scale changes in the future. Motivated by the need to understand future impacts of hydropower alternatives, this study conducted a multi-dimensional, empirical analysis of the full hydropower portfolio. The results show that the full portfolio spans multiple scales of capacity from mega (>1000 MW) to micro (<0.1 MW) projects with a skewed spatial distribution within the provinces, as well as among rivers and canals. Of the total feasible potential, 76% lies in two (out of six) administrative regions and 68% lies in two major rivers (out of more than 125 total channels). Once projects currently under implementation are commissioned, there would be a five-fold increase from a current installed capacity of 6720 MW to 36759 MW. It is recommended that the implementation and design decisions should carefully include spatial distribution and environmental considerations upfront. Furthermore, uncertainties in actual energy generation, and broader hydrological risks due to expected climate change effects should be included in the current planning of these systems that are to provide service over several decades into the future. - Highlights: ► Pakistan has a hydropower potential of 60 GW distributed across 800 projects. ► Under-development projects will realize 36.7 GW of this potential by 2030. ► Project locations are skewed towards some sub-basins and provinces. ► Project sizes are very diverse and have quite limited private sector ownership. ► Gaps in data prevent proper risk assessment for Pakistan's hydropower development.

  5. Empirically characteristic analysis of chaotic PID controlling particle swarm optimization

    Science.gov (United States)

    Yan, Danping; Lu, Yongzhong; Zhou, Min; Chen, Shiping; Levy, David

    2017-01-01

    Since chaotic systems generally have the intrinsic properties of sensitivity to initial conditions, topological mixing and density of periodic orbits, their chaotic ergodic orbits can be exploited to reach the global optimum of a given cost function, or a good approximation to it, with high probability. During the past decade, such methods have received increasing attention from the academic and industrial communities throughout the world. To improve the performance of particle swarm optimization (PSO), we herein propose a chaotic proportional integral derivative (PID) controlling PSO algorithm that hybridizes chaotic logistic dynamics and a hierarchical inertia weight. The hierarchical inertia weight coefficients are determined in accordance with the current fitness values of the local best positions so as to adaptively expand the particles’ search space. Moreover, the chaotic logistic map is used both to substitute for the two random parameters affecting the convergence behavior and in the chaotic local search around the global best position, helping the particles avoid premature convergence across the whole search space. The convergence of chaotic PID controlling PSO is then analyzed in depth. Empirical simulation results demonstrate that, compared with several other chaotic PSO algorithms such as chaotic PSO with the logistic map, chaotic PSO with the tent map and chaotic catfish PSO with the logistic map, chaotic PID controlling PSO exhibits much better search efficiency and quality when solving optimization problems. Additionally, the parameter estimation of a nonlinear dynamic system further confirms its superiority to chaotic catfish PSO, genetic algorithm (GA) and PSO. PMID:28472050
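    The core substitution described above can be sketched in a few lines: the fully chaotic logistic map (mu = 4) generates the two coefficient sequences that a standard PSO would draw uniformly at random. This is an illustrative reduction, not the authors' full algorithm; the hierarchical inertia weight and the chaotic local search are omitted, and the sphere cost function, swarm size and coefficients w, c1, c2 are arbitrary assumptions.

```python
import numpy as np

def logistic(x, mu=4.0):
    """Fully chaotic logistic map (mu = 4); values stay in (0, 1)."""
    return mu * x * (1.0 - x)

def chaotic_pso(f, dim=2, n_particles=20, iters=200, seed=1):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest, pbest_val = pos.copy(), np.apply_along_axis(f, 1, pos)
    gbest = pbest[np.argmin(pbest_val)].copy()
    # chaotic sequences replacing the usual uniform random r1, r2
    r1 = rng.uniform(0.01, 0.99, (n_particles, dim))
    r2 = rng.uniform(0.01, 0.99, (n_particles, dim))
    w, c1, c2 = 0.7, 1.5, 1.5          # arbitrary but stable parameter choices
    for _ in range(iters):
        r1, r2 = logistic(r1), logistic(r2)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        val = np.apply_along_axis(f, 1, pos)
        better = val < pbest_val
        pbest[better], pbest_val[better] = pos[better], val[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

sphere = lambda x: float(np.sum(x ** 2))
best, best_val = chaotic_pso(sphere)
```

    On the sphere function the chaotic variant behaves much like standard PSO; the advantages claimed in the record concern harder multimodal problems.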

  6. Modeling gallic acid production rate by empirical and statistical analysis

    Directory of Open Access Journals (Sweden)

    Bratati Kar

    2000-01-01

    Full Text Available For predicting the rate of an enzymatic reaction, empirical correlations based on experimental results obtained under various operating conditions have been developed. The models represent both the activation and the deactivation conditions of enzymatic hydrolysis, and the results have been analyzed by analysis of variance (ANOVA). The tannase activity was found to be maximal at an incubation time of 5 min, reaction temperature 40ºC, pH 4.0, initial enzyme concentration 0.12 v/v, initial substrate concentration 0.42 mg/ml and ionic strength 0.2 M; under these optimal conditions, the maximum rate of gallic acid production was 33.49 μmoles/ml/min.

  7. Empirically characteristic analysis of chaotic PID controlling particle swarm optimization.

    Directory of Open Access Journals (Sweden)

    Danping Yan

    Full Text Available Since chaotic systems generally have the intrinsic properties of sensitivity to initial conditions, topological mixing and density of periodic orbits, their chaotic ergodic orbits can be exploited to reach the global optimum of a given cost function, or a good approximation to it, with high probability. During the past decade, such methods have received increasing attention from the academic and industrial communities throughout the world. To improve the performance of particle swarm optimization (PSO), we herein propose a chaotic proportional integral derivative (PID) controlling PSO algorithm that hybridizes chaotic logistic dynamics and a hierarchical inertia weight. The hierarchical inertia weight coefficients are determined in accordance with the current fitness values of the local best positions so as to adaptively expand the particles' search space. Moreover, the chaotic logistic map is used both to substitute for the two random parameters affecting the convergence behavior and in the chaotic local search around the global best position, helping the particles avoid premature convergence across the whole search space. The convergence of chaotic PID controlling PSO is then analyzed in depth. Empirical simulation results demonstrate that, compared with several other chaotic PSO algorithms such as chaotic PSO with the logistic map, chaotic PSO with the tent map and chaotic catfish PSO with the logistic map, chaotic PID controlling PSO exhibits much better search efficiency and quality when solving optimization problems. Additionally, the parameter estimation of a nonlinear dynamic system further confirms its superiority to chaotic catfish PSO, genetic algorithm (GA) and PSO.

  8. An empirical analysis of cigarette demand in Argentina.

    Science.gov (United States)

    Martinez, Eugenio; Mejia, Raul; Pérez-Stable, Eliseo J

    2015-01-01

    To estimate the long-term and short-term effects on cigarette demand in Argentina based on changes in cigarette price and income per person >14 years old. Public data from the Ministry of Economics and Production were analysed based on monthly time series data between 1994 and 2010. The econometric analysis used cigarette consumption per person >14 years of age as the dependent variable and the real income per person >14 years old and the real average price of cigarettes as independent variables. Empirical analyses were done to verify the order of integration of the variables, to test for cointegration to capture the long-term effects and to capture the short-term dynamics of the variables. The demand for cigarettes in Argentina was affected by changes in real income and the real average price of cigarettes. The long-term income elasticity was equal to 0.43, while the own-price elasticity was equal to -0.31, indicating that a 10% increase in real income led to a 4.3% increase in cigarette consumption and a 10% price increase produced a 3.1% fall in cigarette consumption. The vector error correction model estimated that the short-term income elasticity was 0.25 and the short-term own-price elasticity of cigarette demand was -0.15. A simulation exercise showed that increasing the price of cigarettes by 110% would maximise revenues and result in a potentially large decrease in total cigarette consumption. Econometric analyses of cigarette consumption and their relationship with cigarette price and income can provide valuable information for developing cigarette price policy. Published by the BMJ Publishing Group Limited.
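    Under a constant-elasticity approximation Q1/Q0 = (P1/P0)^e (a simplification; the study itself uses cointegration and a vector error correction model), the reported elasticities translate price changes into consumption changes. The function name below is illustrative:

```python
def demand_change(price_change, elasticity):
    """Constant-elasticity demand: fractional change in consumption
    for a given fractional change in price."""
    return (1.0 + price_change) ** elasticity - 1.0

# the simulated 110% price increase, with the reported elasticities
long_run = demand_change(1.10, -0.31)    # roughly a 20% fall in consumption
short_run = demand_change(1.10, -0.15)   # roughly a 10% fall in consumption
```

    Because the own-price elasticity is below 1 in absolute value, revenue rises with price in this simple approximation, consistent with the revenue-maximising simulation reported above.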

  9. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    Full Text Available The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable to be applied to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied on the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators to identify differences in standing posture between groups.
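    The circular-trace indicators described above can be sketched once an IMF is available (the EMD sifting itself is omitted here). The analytic signal is built with an FFT-based discrete Hilbert transform; the synthetic 1.5 Hz unit-amplitude IMF is an assumed stand-in for a real COP component:

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (discrete Hilbert transform)."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[1:n // 2] = 2.0
        h[n // 2] = 1.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(np.fft.fft(x) * h)

fs = 100.0                              # sampling rate (Hz), assumed
t = np.arange(0.0, 10.0, 1.0 / fs)
imf = np.sin(2 * np.pi * 1.5 * t)       # stand-in for one extracted IMF
z = analytic_signal(imf)                # its trace is a circle in the complex plane

phase = np.unwrap(np.angle(z))
duration = (len(t) - 1) / fs
rot_freq = (phase[-1] - phase[0]) / (2 * np.pi * duration)  # average rotation frequency (Hz)
area = np.pi * np.abs(z).mean() ** 2                        # area of the circular trace
```

    For this synthetic IMF the estimated rotation frequency recovers 1.5 Hz and the trace area is close to pi; for real COP components both quantities vary with postural stability.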

  10. Ownership dynamics with large shareholders : An empirical analysis

    NARCIS (Netherlands)

    Donelli, M.; Urzua Infante, F.; Larrain, B.

    2013-01-01

    We study the empirical determinants of corporate ownership dynamics in a market where large shareholders are prevalent. We use a unique, hand-collected 20-year dataset on the ownership structure of Chilean companies. Controllers’ blockholdings are on average high - as in continental Europe, for

  11. WHAT FACTORS INFLUENCE QUALITY SERVICE IMPROVEMENT IN MONTENEGRO: EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Djurdjica Perovic

    2013-03-01

    Full Text Available In this paper, using an Ordinary Least Squares regression (OLS), we investigate whether intangible elements influence tourists' perception of service quality. Our empirical results, based on a tourist survey in Montenegro, indicate that intangible elements of the tourism product have a positive impact on tourists' overall perception of service quality in Montenegro.

  12. Opposing the nuclear threat: The convergence of moral analysis and empirical data

    International Nuclear Information System (INIS)

    Hehir, J.B.

    1986-01-01

    This paper examines the concept of nuclear winter from the perspective of religious and moral values. The objective is to identify points of intersection between the empirical arguments about nuclear winter and ethical perspectives on nuclear war. The analysis moves through three steps: (1) the context of the nuclear debate; (2) the ethical and empirical contributions to the nuclear debate; and (3) implications for policy drawn from the ethical-empirical data

  13. Use of Principal Components Analysis to Explain Controls on Nutrient Fluxes to the Chesapeake Bay

    Science.gov (United States)

    Rice, K. C.; Mills, A. L.

    2017-12-01

    The Chesapeake Bay watershed, on the east coast of the United States, encompasses about 166,000 square kilometers (km2) of diverse land use, which includes a mixture of forested, agricultural, and developed land. The watershed is now managed under a Total Maximum Daily Load (TMDL), which requires implementation of management actions by 2025 that are sufficient to reduce nitrogen, phosphorus, and suspended-sediment fluxes to the Chesapeake Bay and restore the bay's water quality. We analyzed nutrient and sediment data along with land-use and climatic variables in nine sub watersheds to better understand the drivers of flux within the watershed and to provide relevant management implications. The nine sub watersheds range in area from 300 to 30,000 km2, and the analysis period was 1985-2014. The 31 variables specific to each sub watershed showed strong, statistically significant correlations, so Principal Components Analysis was used to reduce the dimensionality of the dataset. The analysis revealed that about 80% of the variability in the whole dataset can be explained by discharge, flux, and concentration of nutrients and sediment. The first two principal components (PCs) explained about 68% of the total variance. PC1 loaded strongly on discharge and flux, and PC2 loaded on concentration. The PC scores of both PC1 and PC2 varied by season. Subsequent analysis of PC1 scores versus PC2 scores, broken out by sub watershed, revealed management implications. Some of the largest sub watersheds are largely driven by discharge, and consequently large fluxes. In contrast, some of the smaller sub watersheds are more variable in nutrient concentrations than discharge and flux. Our results suggest that, given no change in discharge, a reduction in nutrient flux to the streams in the smaller watersheds could result in a proportionately larger decrease in fluxes of nutrients down the river to the bay than in the larger watersheds.
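    The dimensionality-reduction step can be illustrated with standardized synthetic data in place of the monitoring records (the two latent drivers below mimic the discharge/flux and concentration components the study reports; every number is made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_vars = 30, 6
latent = rng.normal(size=(n_years, 2))        # two underlying drivers
loadings = rng.normal(size=(2, n_vars))
X = latent @ loadings + 0.3 * rng.normal(size=(n_years, n_vars))

Xs = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each variable
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)           # fraction of variance per PC
scores = Xs @ Vt.T                            # PC scores (one row per year)
```

    Because the synthetic data have a two-factor structure, the first two PCs dominate, mirroring the roughly 68% of variance the study attributes to PC1 and PC2.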

  14. Analysis of Marketing and Customer Satisfaction in Base Housing Communities of the Monterey Bay Area

    Science.gov (United States)

    2011-06-01


  15. Economic Growth and the Environment. An empirical analysis

    Energy Technology Data Exchange (ETDEWEB)

    De Bruyn, S.M.

    1999-12-21

    A number of economists have claimed that economic growth benefits environmental quality as it raises political support and financial means for environmental policy measures. Since the early 1990s this view has increasingly been supported by empirical evidence that has challenged the traditional belief held by environmentalists that economic growth degrades the environment. This study investigates the relationship between economic growth and environmental quality and elaborates on the question of whether economic growth can be combined with a reduced demand for natural resources. Various hypotheses on this relationship are described and empirically tested for a number of indicators of environmental pressure. The test results support the use of alternative estimation models, which alter conclusions about the relationship between economic growth and the environment and give insight into the driving forces of emission reduction in developed economies.

  16. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    Energy Technology Data Exchange (ETDEWEB)

    Georgiev, E.

    2008-09-15

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, while for CO2 the curve is monotonically increasing. For GHG there is indication of an inverted U-shaped relationship between income and pollution, but most countries are still on the increasing path of the curve and hence its future development is uncertain. For SOx it was found that emissions follow a U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.

  17. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    International Nuclear Information System (INIS)

    Georgiev, E.

    2008-09-01

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, while for CO2 the curve is monotonically increasing. For GHG there is indication of an inverted U-shaped relationship between income and pollution, but most countries are still on the increasing path of the curve and hence its future development is uncertain. For SOx it was found that emissions follow a U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.

  18. Does gender equality promote social trust? An empirical analysis

    OpenAIRE

    Seo-Young Cho

    2015-01-01

    Fairness can be an important factor that promotes social trust among people. In this paper, I investigate empirically whether fairness between men and women increases social trust. Using the data of the World Value Survey from 91 countries, I find that gender discriminatory values negatively affect the trust level of both men and women, while actual conditions on gender equality, measured by labor and educational attainments and political participation, are not a significant determinant of so...

  19. Tax morale : theory and empirical analysis of tax compliance

    OpenAIRE

    Torgler, Benno

    2003-01-01

    Tax morale is puzzling in our society. Observations show that tax compliance cannot be satisfactorily explained by the level of enforcement. Other factors may well be relevant. This paper contains a short survey of important theoretical and empirical findings in the tax morale literature, focussing on personal income tax morale. The following three key topics are discussed: moral sentiments, fairness and the relationship between taxpayer and government. The survey stresses the ...

  20. Empirical fractal geometry analysis of some speculative financial bubbles

    Science.gov (United States)

    Redelico, Francisco O.; Proto, Araceli N.

    2012-11-01

    Empirical evidence of a multifractal signature during the growth of a financial bubble leading to a crash is presented. The April 2000 crash in the NASDAQ composite index and a time series from the discrete Chakrabarti-Stinchcombe model for earthquakes are analyzed using a geometric approach, and some common patterns are identified. These patterns relate the geometry of the rising period of a financial bubble to the non-concave entropy problem.

  1. Analysis of Air Traffic Track Data with the AutoBayes Synthesis System

    Science.gov (United States)

    Schumann, Johann Martin Philip; Cate, Karen; Lee, Alan G.

    2010-01-01

    The Next Generation Air Traffic System (NGATS) aims to provide substantial computer support for air traffic controllers. Algorithms for the accurate prediction of aircraft movements are of central importance for such software systems, but trajectory prediction has to work reliably in the presence of unknown parameters and uncertainties. We use the AutoBayes program synthesis system to generate customized data analysis algorithms that process large sets of aircraft radar track data in order to estimate parameters and uncertainties. In this paper, we present how the tasks of finding structure in track data, estimating important parameters in climb trajectories, and detecting continuous descent approaches can be accomplished with compact task-specific AutoBayes specifications. We give an overview of the AutoBayes architecture and describe how its schema-based approach generates customized analysis algorithms, documented C/C++ code, and detailed mathematical derivations. Results of experiments with actual air traffic control data are discussed.

  2. CytoBayesJ: software tools for Bayesian analysis of cytogenetic radiation dosimetry data.

    Science.gov (United States)

    Ainsbury, Elizabeth A; Vinnikov, Volodymyr; Puig, Pedro; Maznyk, Nataliya; Rothkamm, Kai; Lloyd, David C

    2013-08-30

    A number of authors have suggested that a Bayesian approach may be most appropriate for analysis of cytogenetic radiation dosimetry data. In the Bayesian framework, probability of an event is described in terms of previous expectations and uncertainty. Previously existing, or prior, information is used in combination with experimental results to infer probabilities or the likelihood that a hypothesis is true. It has been shown that the Bayesian approach increases both the accuracy and quality assurance of radiation dose estimates. New software entitled CytoBayesJ has been developed with the aim of bringing Bayesian analysis to cytogenetic biodosimetry laboratory practice. CytoBayesJ takes a number of Bayesian or 'Bayesian like' methods that have been proposed in the literature and presents them to the user in the form of simple user-friendly tools, including testing for the most appropriate model for distribution of chromosome aberrations and calculations of posterior probability distributions. The individual tools are described in detail and relevant examples of the use of the methods and the corresponding CytoBayesJ software tools are given. In this way, the suitability of the Bayesian approach to biological radiation dosimetry is highlighted and its wider application encouraged by providing a user-friendly software interface and manual in English and Russian. Copyright © 2013 Elsevier B.V. All rights reserved.
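    As a minimal sketch of the Bayesian idea (not one of CytoBayesJ's actual tools), a conjugate gamma prior on the Poisson dicentric yield per cell combines prior expectations with scored-cell counts in closed form; all counts and hyperparameters below are hypothetical:

```python
# Prior: yield ~ Gamma(a0, b0), i.e. prior mean a0/b0 dicentrics per cell
a0, b0 = 2.0, 10.0        # hypothetical prior: mean 0.2 dicentrics/cell
k, n = 12, 100            # hypothetical data: 12 dicentrics in 100 scored cells

# Conjugate update for Poisson counts: posterior is Gamma(a0 + k, b0 + n)
a, b = a0 + k, b0 + n
post_mean = a / b         # posterior mean yield
post_var = a / b ** 2     # posterior variance
```

    In practice the posterior yield would then be mapped to dose through a calibration curve, which is where the distribution-testing and posterior-probability tools described above come in.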

  3. Economic Growth and Transboundary Pollution in Europe. An Empirical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ansuategi, A. [Ekonomi Analisiaren Oinarriak I Saila, Ekonomi Zientzien Fakultatea, Lehendakari Agirre Etorbidea, 83, 48015 Bilbao (Spain)

    2003-10-01

    The existing empirical evidence suggests that environmental Kuznets curves only exist for pollutants with semi-local and medium term impacts. Ansuategi and Perrings (2000) have considered the behavioral basis for the correlation observed between different spatial incidence of environmental degradation and the relation between economic growth and environmental quality. They show that self-interested planners following a Nash-type strategy tend to address environmental effects sequentially: addressing those with the most immediate costs first, and those whose costs are displaced in space later. This paper tests such behavioral basis in the context of sulphur dioxide emissions in Europe.

  4. Economic Growth and Transboundary Pollution in Europe. An Empirical Analysis

    International Nuclear Information System (INIS)

    Ansuategi, A.

    2003-01-01

    The existing empirical evidence suggests that environmental Kuznets curves only exist for pollutants with semi-local and medium term impacts. Ansuategi and Perrings (2000) have considered the behavioral basis for the correlation observed between different spatial incidence of environmental degradation and the relation between economic growth and environmental quality. They show that self-interested planners following a Nash-type strategy tend to address environmental effects sequentially: addressing those with the most immediate costs first, and those whose costs are displaced in space later. This paper tests such behavioral basis in the context of sulphur dioxide emissions in Europe

  5. Development of an empirical model of turbine efficiency using the Taylor expansion and regression analysis

    International Nuclear Information System (INIS)

    Fang, Xiande; Xu, Yu

    2011-01-01

    The empirical model of turbine efficiency is necessary for the control- and/or diagnosis-oriented simulation and useful for the simulation and analysis of dynamic performances of the turbine equipment and systems, such as air cycle refrigeration systems, power plants, turbine engines, and turbochargers. Existing empirical models of turbine efficiency are insufficient because there is no suitable form available for air cycle refrigeration turbines. This work performs a critical review of empirical models (called mean value models in some literature) of turbine efficiency and develops an empirical model in the desired form for air cycle refrigeration, the dominant cooling approach in aircraft environmental control systems. The Taylor series and regression analysis are used to build the model, with the Taylor series being used to expand functions with the polytropic exponent and the regression analysis to finalize the model. The measured data of a turbocharger turbine and two air cycle refrigeration turbines are used for the regression analysis. The proposed model is compact and able to present the turbine efficiency map. Its predictions agree with the measured data very well, with the corrected coefficient of determination R_c^2 ≥ 0.96 and the mean absolute percentage deviation = 1.19% for the three turbines. -- Highlights: → Performed a critical review of empirical models of turbine efficiency. → Developed an empirical model in the desired form for air cycle refrigeration, using the Taylor expansion and regression analysis. → Verified the method for developing the empirical model. → Verified the model.
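    The expand-then-regress idea can be sketched as an ordinary least-squares fit of a second-order polynomial (a truncated Taylor series) to synthetic operating points; the choice of pressure ratio and corrected speed as inputs, and all coefficients, are illustrative assumptions rather than the paper's model form:

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic operating points: pressure ratio (pr) and corrected speed (u)
pr = rng.uniform(1.5, 4.0, 80)
u = rng.uniform(0.4, 1.0, 80)
eta = (0.85 - 0.10 * (u - 0.7) ** 2 - 0.02 * (pr - 2.5) ** 2
       + rng.normal(0.0, 0.005, 80))          # "measured" efficiency with noise

# second-order (Taylor-style) polynomial basis in (pr, u)
A = np.column_stack([np.ones_like(pr), pr, u, pr ** 2, u ** 2, pr * u])
coef, *_ = np.linalg.lstsq(A, eta, rcond=None)

pred = A @ coef
r2 = 1.0 - np.sum((eta - pred) ** 2) / np.sum((eta - eta.mean()) ** 2)
```

    Evaluating the fitted polynomial over a grid of (pr, u) points reproduces an efficiency map of the kind the paper describes.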

  6. EMPIRICAL ANALYSIS OF THE ROLE OF THE FIRMS’ VALUE DRIVERS

    Directory of Open Access Journals (Sweden)

    Anita KISS

    2015-12-01

    Full Text Available This paper focuses on value creation and value drivers. One objective of this paper is to present the concept of maximizing shareholder value. The main goal is to categorize the most important value drivers and their role in firm value. This study proceeds as follows. The first section presents the value chain and its primary and support activities. The second part describes the theoretical background of maximizing shareholder value. The third part illustrates the key value drivers. The fourth, empirical section of the study analyses a database featuring data from 18 European countries, 10 sectors and 1553 firms in the period between 2004 and 2011. Finally, the fifth section includes concluding remarks. Based on the literature reviewed and the empirical research conducted, it can be concluded that EBIT, reinvestment, invested capital, the return on invested capital, the net margin and the sales growth rate all have a positive effect on firm value, while the tax rate and the market value of return on assets (MROA) have a negative one.

  7. An empirical test of the 'shark nursery area concept' in Texas bays using a long-term fisheries-independent data set

    Science.gov (United States)

    Froeschke, John T.; Stunz, Gregory W.; Sterba-Boatwright, Blair; Wildhaber, Mark L.

    2010-01-01

    Using a long-term fisheries-independent data set, we tested the 'shark nursery area concept' proposed by Heupel et al. (2007) with the suggested working assumptions that a shark nursery habitat would: (1) have an abundance of immature sharks greater than the mean abundance across all habitats where they occur; (2) be used by sharks repeatedly through time (years); and (3) see immature sharks remaining within the habitat for extended periods of time. We tested this concept using young-of-the-year (age 0) and juvenile (age 1+ yr) bull sharks Carcharhinus leucas from gill-net surveys conducted in Texas bays from 1976 to 2006 to estimate the potential nursery function of 9 coastal bays. Of the 9 bay systems considered as potential nursery habitat, only Matagorda Bay satisfied all 3 criteria for young-of-the-year bull sharks. Both Matagorda and San Antonio Bays met the criteria for juvenile bull sharks. Through these analyses we examined the utility of this approach for characterizing nursery areas and we also describe some practical considerations, such as the influence of the temporal or spatial scales considered when applying the nursery role concept to shark populations.
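    The first two working criteria lend themselves to a mechanical screen. The sketch below applies them to synthetic catch-per-unit-effort (CPUE) data; the bays, years and the 80% occupancy threshold are hypothetical, and criterion 3 (extended residency) would require tagging or recapture data rather than gill-net CPUE:

```python
import numpy as np

rng = np.random.default_rng(7)
n_bays, n_years = 3, 10
# synthetic CPUE of immature sharks, one row per bay
cpue = rng.gamma(2.0, 0.5, size=(n_bays, n_years))
cpue[1] *= 3.0          # bay 1: heavily used
cpue[2, ::2] = 0.0      # bay 2: occupied only in alternating years

crit1 = cpue.mean(axis=1) > cpue.mean()     # abundance above the all-habitat mean
crit2 = (cpue > 0).mean(axis=1) >= 0.8      # used in at least 80% of years
candidate_nursery = crit1 & crit2           # criteria 1 and 2 only
```

    Bays passing both screens would then be examined for residency before being designated nursery habitat, as Matagorda and San Antonio Bays were in the study.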

  8. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  9. An empirical analysis of the corporate call decision

    International Nuclear Information System (INIS)

    Carlson, M.D.

    1998-01-01

    An economic study of the behaviour of financial managers of utility companies was presented. The study examined whether or not an option-pricing-based model of the call decision does a better job of explaining callable preferred share prices and call decisions than other models. In this study, the Rust (1987) empirical technique was extended to include information from preferred share prices in addition to the call decisions. Reasonable estimates of the transaction costs associated with a call were obtained from data on shares of the Pacific Gas and Electric Company (PGE). It was concluded that the managers of PGE clearly take into account the value of the option to delay the call when making their call decisions

  10. Empirical Analysis of Using Erasure Coding in Outsourcing Data Storage With Provable Security

    Science.gov (United States)

    2016-06-01

    Naval Postgraduate School thesis, Monterey, California. As computing and communication technologies become powerful and advanced, people are exchanging a huge amount of data and demanding more storage...

  11. Population, internal migration, and economic growth: an empirical analysis.

    Science.gov (United States)

    Moreland, R S

    1982-01-01

    The role of population growth in the development process has received increasing attention during the last 15 years, as manifested in the literature in 3 broad categories. In the 1st category, the effects of rapid population growth on the growth of income have been studied with the use of simulation models, which sometimes include endogenous population growth. The 2nd category of the literature is concerned with theoretical and empirical studies of the economic determinants of various demographic rates--most usually fertility. Internal migration and dualism is the 3rd population development category to receive attention. An attempt is made to synthesize developments in these 3 categories by estimating from a consistent set of data a 2 sector economic demographic model in which the major demographic rates are endogenous. Because the interactions between economic and demographic variables are nonlinear and complex, the indirect effects of changes in a particular variable may depend upon the balance of numerical coefficients. For this reason it was felt that the model should be empirically grounded. A brief overview of the model is provided, and the model is compared to some similar existing models. Estimation of the model's 9 behavior equations is discussed, followed by a "base run" simulation of a developing country "stereotype" and a report of a number of policy experiments. The relatively new field of economic determinants of demographic variables was drawn upon in estimating equations to endogenize demographic phenomena that are frequently left exogenous in simulation models. The fertility and labor force participation rate functions are fairly standard, but a step beyond existing literature was taken in the life expectancy and intersectoral migration equations. On the economic side, sectoral savings functions were estimated, and it was found that the marginal propensity to save is lower in agriculture than in nonagriculture. Testing to see the

  12. A Temporal Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    (EOF) analysis, which provides a non-temporal analysis of one variable over time. The temporal extension proves its strength in separating the signals at different periods in an analysis of relevant oceanographic properties related to one of the largest El Niño events ever recorded....

  13. Data envelopment analysis in service quality evaluation: an empirical study

    Science.gov (United States)

    Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid

    2015-09-01

    Service quality is often conceptualized as the comparison between service expectations and the actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the method proposed in this study. A large number of studies have used DEA as a benchmarking tool to measure service quality. These models do not propose a coherent performance evaluation construct and consequently fail to deliver improvement guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.

  14. Regime switching model for financial data: Empirical risk analysis

    Science.gov (United States)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, the HMM is used to classify data into crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to remove the delay between regime switches and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns, for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable distributions, power laws and GARCH models. The empirical results show that the regime switching model increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of the power-law model while remaining practical to implement for VaR measurement.
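
    The two-stage logic described above (first classify regimes, then estimate tail risk within each regime) can be sketched in miniature. The sketch below is not the authors' model: it replaces the HMM classifier with a crude rolling-volatility threshold and the EVT tail fit with an empirical quantile, and all function names and parameter values are illustrative.

```python
import statistics

def classify_regimes(returns, window=20, k=1.5):
    """Label each day 'crisis' or 'steady' by comparing rolling volatility
    to k times the full-sample volatility (a crude stand-in for the HMM step)."""
    overall = statistics.pstdev(returns)
    labels = []
    for i in range(len(returns)):
        recent = returns[max(0, i - window + 1): i + 1]
        vol = statistics.pstdev(recent)
        labels.append("crisis" if vol > k * overall else "steady")
    return labels

def empirical_var(returns, alpha=0.95):
    """Historical Value-at-Risk: the loss exceeded with probability 1 - alpha."""
    losses = sorted(-r for r in returns)
    idx = int(alpha * len(losses))
    return losses[min(idx, len(losses) - 1)]

def regime_var(returns, alpha=0.95):
    """Per-regime VaR estimates computed on the labelled subsamples."""
    labels = classify_regimes(returns)
    out = {}
    for regime in ("steady", "crisis"):
        sub = [r for r, lab in zip(returns, labels) if lab == regime]
        if sub:
            out[regime] = empirical_var(sub, alpha)
    return out
```

    On a series with a calm stretch followed by a volatile one, the crisis-regime VaR comes out far above the steady-regime VaR, which is the qualitative point of splitting the sample before estimating the tail.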

  15. The Twin Deficits Hypothesis: An Empirical Analysis for Tanzania

    Directory of Open Access Journals (Sweden)

    Manamba Epaphra

    2017-09-01

    Full Text Available This paper examines the relationship between current account and government budget deficits in Tanzania. The paper tests the validity of the twin deficits hypothesis, using annual time series data for the 1966-2015 period. The paper is thought to be significant because the concept of the twin deficit hypothesis is fraught with controversy. Some studies support the hypothesis that there is a positive relationship between current account deficits and fiscal deficits in the economy, while others do not. In this paper, the empirical tests fail to reject the twin deficits hypothesis, indicating that rising budget deficits put more strain on the current account deficits in Tanzania. Specifically, the Vector Error Correction Model results support the conventional theory of a positive relationship between fiscal and external balances, with a relatively high speed of adjustment toward the equilibrium position. This evidence is consistent with a small open economy. To address the problem that may result from this kind of relationship, appropriate policy measures for reducing budget deficits, such as reducing non-development expenditure, enhancing domestic revenue collection, and actively fighting corruption and tax evasion, should be adopted. The government should also target export-oriented firms and encourage an import substitution industry by creating favorable business environments.

  16. Environmentalism and elitism: a conceptual and empirical analysis

    Science.gov (United States)

    Morrison, Denton E.; Dunlap, Riley E.

    1986-09-01

    The frequent charge that environmentalism is “elitist” is examined conceptually and empirically. First, the concept of elitism is analyzed by distinguishing between three types of accusations made against the environmental movement: (a) compositional elitism suggests that environmentalists are drawn from privileged socioeconomic strata, (b) ideological elitism suggests that environmental reforms are a subterfuge for distributing benefits to environmentalists and/or costs to others, and (c) impact elitism suggests that environmental reforms, whether intentionally or not, do in fact have regressive social impacts. The evidence bearing on each of the three types of elitism is examined in some detail, and the following conclusions are drawn: Compositional elitism is an exaggeration, for although environmentalists are typically above average in socioeconomic status (as are most sociopolitical activists), few belong to the upper class. Ideological elitism may hold in some instances, but environmentalists have shown increasing sensitivity to equity concerns and there is little evidence of consistent pursuit of self-interest. Impact elitism is the most important issue, and also the most difficult to assess. It appears that there has been a general tendency for environmental reforms to have regressive impacts. However, it is increasingly recognized that problems such as workplace pollution and toxic waste contamination disproportionately affect the lower socioeconomic strata, and thus reforms aimed at such problems will likely have more progressive impacts.

  17. Production functions for climate policy modeling. An empirical analysis

    International Nuclear Information System (INIS)

    Van der Werf, Edwin

    2008-01-01

    Quantitative models for climate policy modeling differ in the production structure used and in the sizes of the elasticities of substitution. The empirical foundation for both is generally lacking. This paper estimates the parameters of 2-level CES production functions with capital, labour and energy as inputs, and is the first to systematically compare all nesting structures. Using industry-level data from 12 OECD countries, we find that the nesting structure where capital and labour are combined first fits the data best, but for most countries and industries we cannot reject that all three inputs can be put into one single nest. These two nesting structures are used by most climate models. However, while several climate policy models use a Cobb-Douglas function for (part of) the production function, we reject elasticities equal to one, in favour of considerably smaller values. Finally we find evidence for factor-specific technological change. With lower elasticities and with factor-specific technological change, some climate policy models may find a bigger effect of endogenous technological change on mitigating the costs of climate policy. (author)
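
    The 2-level CES structure estimated in the paper can be written out directly. Below is a minimal sketch of the (KL)E nesting (capital and labour combined first, then energy), with purely illustrative share and elasticity values rather than the paper's estimates.

```python
def ces(x1, x2, share, sigma):
    """Two-input CES aggregate with substitution elasticity sigma.
    rho = (sigma - 1) / sigma; sigma -> 1 recovers Cobb-Douglas."""
    if abs(sigma - 1.0) < 1e-9:          # Cobb-Douglas limit
        return x1 ** share * x2 ** (1.0 - share)
    rho = (sigma - 1.0) / sigma
    return (share * x1 ** rho + (1.0 - share) * x2 ** rho) ** (1.0 / rho)

def klem_output(K, L, E, share_kl=0.6, share_top=0.8,
                sigma_kl=0.5, sigma_top=0.4):
    """(KL)E nesting: capital and labour enter an inner nest whose
    composite is then combined with energy. Parameters are illustrative."""
    kl = ces(K, L, share_kl, sigma_kl)
    return ces(kl, E, share_top, sigma_top)
```

    With sigma below one, as the paper estimates, output is pulled toward the scarcer input, which is exactly why replacing Cobb-Douglas (sigma = 1) with smaller elasticities changes a climate model's mitigation costs.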

  18. Determination of heavy metals and other elements in sediments from Sepetiba Bay (RJ, Brazil) by neutron activation analysis

    International Nuclear Information System (INIS)

    Pellegatti, Fabio

    2000-01-01

    Sepetiba Bay, located about 60 km south of the city of Rio de Janeiro, Brazil, is one of the most important fishery areas in the State of Rio de Janeiro. A large harbor attracted considerable industrial investment to the area. Since the 1970's, the Sepetiba region has undergone fast industrial expansion, leading to high levels of pollution by metals. Over the last two decades, an industrial park composed of about 400 industrial plants, basically metallurgical, was established in the Sepetiba Bay basin, releasing its industrial waste either straight into the bay or through local rivers. Metal contamination in the bay for some metals, such as Zn, has already exceeded acceptable levels. Many authors have studied the distribution and behavior of heavy metals and other elements in the bay, but only a few elements have been considered (Cd, Cr, Cu, Fe, Mn, Ni, Pb and Zn). This is probably because the analytical technique most employed has been atomic absorption spectrometry, which is not a multi-elemental technique. In this work, Instrumental Neutron Activation Analysis (INAA) was applied to the determination of the elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Hf, La, Lu, Nd, Rb, Sc, Sm, Ta, Tb, Th, U, Yb and Zn in 28 bottom sediment samples and four sediment cores from Sepetiba Bay. The elements Co, Cr, Cs, Fe, Sc, Ta and Zn presented similar behavior in the bottom sediments, showing higher concentrations along the northern coast of the bay, where most of the fluvial water flows out to the bay. The contamination of Sepetiba Bay was also assessed by the analysis of four sediment cores. Two of them were sampled in the eastern part of the bay, where the industrial park is located, whereas the other two were sampled in the western part of the bay, a more preserved region. In each region, one core was sampled within the mangrove trees and the other at the edge of the tidal flat. The results showed that the sediments displayed higher metal concentration within the

  19. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlations analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both w...... as for example an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also the analysis indicates a phase lag of approximately one month between the SST and SSH fields....

  20. Empirical Comparison of Publication Bias Tests in Meta-Analysis.

    Science.gov (United States)

    Lin, Lifeng; Chu, Haitao; Murad, Mohammad Hassan; Hong, Chuan; Qu, Zhiyong; Cole, Stephen R; Chen, Yong

    2018-04-16

    Decision makers rely on meta-analytic estimates to trade off benefits and harms. Publication bias impairs the validity and generalizability of such estimates. The performance of various statistical tests for publication bias has been largely compared using simulation studies and has not been systematically evaluated in empirical data. This study compares seven commonly used publication bias tests (i.e., Begg's rank test, trim-and-fill, Egger's, Tang's, Macaskill's, Deeks', and Peters' regression tests) based on 28,655 meta-analyses available in the Cochrane Library. Egger's regression test detected publication bias more frequently than other tests (15.7% in meta-analyses of binary outcomes and 13.5% in meta-analyses of non-binary outcomes). The proportion of statistically significant publication bias tests was greater for larger meta-analyses, especially for Begg's rank test and the trim-and-fill method. The agreement among Tang's, Macaskill's, Deeks', and Peters' regression tests for binary outcomes was moderately strong (most κ's were around 0.6). Tang's and Deeks' tests had fairly similar performance (κ > 0.9). The agreement among Begg's rank test, the trim-and-fill method, and Egger's regression test was weak or moderate (κ < 0.5). Given the relatively low agreement between many publication bias tests, meta-analysts should not rely on a single test and may apply multiple tests with various assumptions. Non-statistical approaches to evaluating publication bias (e.g., searching clinical trials registries, records of drug approving agencies, and scientific conference proceedings) remain essential.
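
    Egger's regression test, the test that flagged publication bias most often in this comparison, regresses the standardized effect (effect divided by its standard error) on precision (one over the standard error) and asks whether the intercept differs from zero. A minimal ordinary-least-squares sketch (the effect and SE inputs in any real use would come from the meta-analysis at hand):

```python
import math

def egger_test(effects, ses):
    """Egger's regression asymmetry test: regress effect/SE on 1/SE.
    A non-zero intercept signals small-study (funnel-plot) asymmetry.
    Returns (intercept, t_statistic) with n - 2 degrees of freedom."""
    y = [e / s for e, s in zip(effects, ses)]   # standardized effects
    x = [1.0 / s for s in ses]                  # precisions
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    rss = sum((yi - intercept - slope * xi) ** 2 for xi, yi in zip(x, y))
    s2 = rss / (n - 2)                          # residual variance
    se_int = math.sqrt(s2 * (1.0 / n + xbar ** 2 / sxx))
    return intercept, intercept / se_int
```

    A roughly symmetric funnel yields an intercept near zero, while a dataset in which the least precise studies report the largest effects yields a large positive intercept, which is the asymmetry the test is designed to catch.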

  1. Empirical Green's function analysis: Taking the next step

    Science.gov (United States)

    Hough, S.E.

    1997-01-01

    An extension of the empirical Green's function (EGF) method is presented that involves determination of source parameters using standard EGF deconvolution, followed by inversion for a common attenuation parameter for a set of colocated events. Recordings of three or more colocated events can thus be used to constrain a single path attenuation estimate. I apply this method to recordings from the 1995-1996 Ridgecrest, California, earthquake sequence; I analyze four clusters consisting of 13 total events with magnitudes between 2.6 and 4.9. I first obtain corner frequencies, which are used to infer Brune stress drop estimates. I obtain stress drop values of 0.3-53 MPa (with all but one between 0.3 and 11 MPa), with no resolved increase of stress drop with moment. With the corner frequencies constrained, the inferred attenuation parameters are very consistent; they imply an average shear wave quality factor of approximately 20-25 for alluvial sediments within the Indian Wells Valley. Although the resultant spectral fitting (using corner frequency and κ) is good, the residuals are consistent among the clusters analyzed. Their spectral shape is similar to the theoretical one-dimensional response of a layered low-velocity structure in the valley (an absolute site response cannot be determined by this method, because of an ambiguity between absolute response and source spectral amplitudes). I show that even this subtle site response can significantly bias estimates of corner frequency and κ, if it is ignored in an inversion for only source and path effects. The multiple-EGF method presented in this paper is analogous to a joint inversion for source, path, and site effects; the use of colocated sets of earthquakes appears to offer significant advantages in improving resolution of all three estimates, especially if data are from a single site or sites with similar site response.
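
    The Brune stress drops reported above follow from corner frequency and seismic moment through standard closed-form relations. A sketch, assuming the Brune (1970) source-radius coefficient and the Hanks-Kanamori moment-magnitude relation; this illustrates the conversion, not the paper's exact processing chain.

```python
import math

def moment_from_mw(mw):
    """Seismic moment [N*m] from moment magnitude (Hanks & Kanamori):
    log10(M0) = 1.5 * Mw + 9.05."""
    return 10.0 ** (1.5 * mw + 9.05)

def brune_stress_drop(m0, fc, beta):
    """Brune stress drop [Pa] from moment m0 [N*m], corner frequency
    fc [Hz], and shear-wave speed beta [m/s]. The source radius is
    r = 2.34 * beta / (2 * pi * fc); stress drop = 7 * m0 / (16 * r^3)."""
    r = 2.34 * beta / (2.0 * math.pi * fc)
    return 7.0 * m0 / (16.0 * r ** 3)
```

    For a magnitude-4 event with a 2 Hz corner frequency in 3.5 km/s crust this gives a stress drop of order 1 MPa, comfortably inside the 0.3-53 MPa range quoted in the abstract; at fixed moment, a higher corner frequency implies a smaller source and hence a larger stress drop.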

  2. Empirical analysis on the runners' velocity distribution in city marathons

    Science.gov (United States)

    Lin, Zhenquan; Meng, Fan

    2018-01-01

    In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have examined the features of the velocity distributions of human mobility patterns. In this paper, we investigated empirically the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. Through statistical analyses of the finish-time records, we captured some statistical features of human behavior in marathons: (1) The velocity distributions of all finishers and of partial finishers in the fastest age group both follow a log-normal distribution; (2) In the New York City marathon, the velocity distribution of all male runners in eight 5-kilometer internal timing courses undergoes two transitions: from a log-normal distribution at the initial stage (several initial courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (several last courses); (3) The intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition of the velocity distribution from log-normal to Gaussian, and when the competition grows stronger in the last course of the middle stage, a transition from the Gaussian back to the log-normal distribution occurs at the last stage. This study may enrich research on human mobility patterns and attract attention to the velocity features of human mobility.
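
    Fitting the log-normal velocity distributions reported here amounts to estimating the mean and standard deviation of the log-transformed velocities. A minimal maximum-likelihood sketch (function names are illustrative):

```python
import math
import statistics

def fit_lognormal(samples):
    """Maximum-likelihood log-normal fit: mu and sigma are simply the
    mean and standard deviation of the log-transformed data."""
    logs = [math.log(v) for v in samples]
    return statistics.fmean(logs), statistics.pstdev(logs)

def lognormal_summary(mu, sigma):
    """Median, mode, and mean of a log-normal(mu, sigma) distribution;
    mode < median < mean whenever sigma > 0 (right skew)."""
    return {
        "median": math.exp(mu),
        "mode": math.exp(mu - sigma ** 2),
        "mean": math.exp(mu + sigma ** 2 / 2.0),
    }
```

    The ordering mode < median < mean is the right-skew signature that distinguishes a log-normal velocity distribution from the symmetric Gaussian observed in the middle stage of the race.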

  3. 188 An Empirical Investigation of Value-Chain Analysis and ...

    African Journals Online (AJOL)

    User

    Analysis has a positive but insignificant impact on Competitive Advantage of a manufacturing firm in ... chain analysis is a means of increasing customer satisfaction and managing costs more ... The linkages express the relationships between the ... margins, return on assets, benchmarking, and capital budgeting. When a.

  4. Multi-elemental analysis of marine sediments of Sorsogon Bay using x-ray fluorescence spectroscopy

    International Nuclear Information System (INIS)

    Gonzales, Ralph Roly A.; Quirit, Leni L.; Rosales, Colleen Marciel F.; Pabroa, Preciosa Corazon B.; Sta Maria, Efren J.

    2011-01-01

    Metal composition and nutrient loadings of our bodies of water, when uncontrolled, may cause harmful bacterial contamination and pose threats to aquatic and human life. In this study, toxic and trace element inputs in Sorsogon Bay sediments were determined using nuclear analytical techniques, more specifically x-ray fluorescence spectrometry. Pre-treated marine sediment samples from Sorsogon Bay were homogenized using a SPEX #8000 mixer/mill and an agate mortar and pestle, pelletized into 31-mm flat discs using a SPEX 3630 X-Press, and analyzed using a PANalytical Epsilon 5 EDX X-ray Fluorescence Spectrometer with the emission and transmission method using silver and germanium secondary targets. Spectrum fitting performed using AXIL (Analysis of X-ray Spectra by Iterative Least-Squares Fitting), a subprogram in the Quantitative X-ray Analysis System developed by the International Atomic Energy Agency, and the Quantitative Analysis of Environmental Samples program were used for quantification of results. Results indicate generally moderate to high metal enrichment, specifically manganese, lead, cadmium, zinc and copper. Mercury and iron enrichment levels are found to be low, marking an improvement from previous studies indicating high enrichment of these metals. (author)

  5. Empirical Risk Analysis of Severe Reactor Accidents in Nuclear Power Plants after Fukushima

    OpenAIRE

    Kaiser, Jan Christian

    2012-01-01

    Many countries are reexamining the risks connected with nuclear power generation after the Fukushima accidents. To provide updated information for the corresponding discussion a simple empirical approach is applied for risk quantification of severe reactor accidents with International Nuclear and Radiological Event Scale (INES) level ≥5. The analysis is based on worldwide data of commercial nuclear facilities. An empirical hazard of 21 (95% confidence intervals (CI) 4; 62) severe accidents am...

  6. Urban Noise Modelling in Boka Kotorska Bay

    Directory of Open Access Journals (Sweden)

    Aleksandar Nikolić

    2014-04-01

    Full Text Available Traffic is the most significant noise source in urban areas. The village of Kamenari in Boka Kotorska Bay is a site where, in a relatively small area, road traffic and sea (ferry) traffic take place at the same time. Due to the specificity of the location, i.e. the very rare synergy of sound effects of road and sea traffic in an urban area, as well as the expressed need for assessment of noise levels in a simple and quick way, research was conducted, using empirical methods and statistical analysis methods, which led to the creation of an acoustic model for the assessment of the equivalent noise level (Leq). The developed model for noise assessment in the village of Kamenari in Boka Kotorska Bay quite realistically provides data on possible noise levels at the observed site, with very little deviation in relation to empirically obtained values.
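
    The equivalent noise level Leq that the model estimates is, by definition, the energy average of the sampled sound levels. A minimal sketch of that standard definition (this is the target quantity only, not the paper's empirical regression model):

```python
import math

def leq(levels_db):
    """Equivalent continuous sound level from equal-duration samples:
    Leq = 10 * log10(mean of 10^(L_i / 10)).
    Energy averaging means loud intervals dominate the result."""
    mean_energy = sum(10.0 ** (level / 10.0) for level in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_energy)
```

    Note the asymmetry of the energy average: half an hour at 60 dB plus half an hour at 70 dB gives about 67.4 dB, not the arithmetic mean of 65 dB.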

  7. Multi-elemental analysis of marine sediments of Manila Bay using x-ray fluorescence spectrometry

    International Nuclear Information System (INIS)

    Rosales, Colleen Marciel Fontelera

    2011-04-01

    An analysis of the marine sediments of Manila Bay was done by employing X-ray fluorescence spectrometry. The general trends observed in sediments are increasing (Ca and Sr), decreasing (Zr), or constant (Cl, Na, S, K) with respect to depth; sometimes no trend can be observed. These trends are further explained by correlations present among these elements, plus all the other elements. The two XRF data-analysis methods, AutoQuantify and AXIL, were also compared on the basis of the correlation plots obtained. AutoQuantify gave clearer correlations; thus, results from this method were used for constructing correlation plots. Correlations using Microsoft Excel and Statgraphics Centurion XV show that there are naturally occurring [lithogenic (Si, Ti, Al, Mg, Rb, Zn and Fe), biogenic (Ca, Mg), and conservative (Na, Cl)] and non-naturally occurring [mostly anthropogenic, brought to the bodies of water by aeolian or fluvial input (heavy metals Pb-Cu-Zn and Ni-Cr)] correlations present in the sediments. Moreover, pairs of elements that may coexist in one source and not coexist in another (Cr and Mg, Cr and Ni) have also been observed. The heavy metal enrichment was attributed to the burning of fossil fuels, iron and steel manufacturing (present in the Valenzuela-Bulacan area), ferry and fishing services, and other industrialization activities present in Manila Bay. Marine organisms are affected by the presence of these heavy metals by means of bioaccumulation, and humans may later be affected because of trophic transfer and biomagnification. (author)

  8. Analysis and evaluation of seismic response of reactor building for Daya Bay Nuclear Power Plant

    International Nuclear Information System (INIS)

    Li Zhongcheng; China Guangdong Nuclear Power Company, Shenzhen; Li Zhongxian

    2005-01-01

    Daya Bay NPP has been operating safely and stably for over 10 years since 1994, and its seismic analysis of the nuclear island was in accordance with the approaches in the RCC-G standard for the model M310, in which the Simplified Impedance Matrix Method (SIMM) was employed for the consideration of SSI. Thanks to the rapid progress being made in upgrading evaluation technology and the capability of data processing systems, methods and software tools for SSI analysis have experienced significant development all over the world. Focused on the model of the reactor building of the Daya Bay NPP, in this paper the more sophisticated 3D half-space continuum impedance method based on Green's functions is used to analyze the impedance functions of the soil, and then the seismic responses of the coupled SSI system are calculated and compared with the corresponding design values. It demonstrates that the design method provides a set of conservatively safe results. The conclusions from this study will hopefully provide some important references for the assessment of seismic safety margin for operating NPPs. (authors)

  9. Empirical analysis of scaling and fractal characteristics of outpatients

    International Nuclear Information System (INIS)

    Zhang, Li-Jiang; Liu, Zi-Xian; Guo, Jin-Li

    2014-01-01

    The paper uses power-law frequency distribution, power spectrum analysis, detrended fluctuation analysis, and surrogate data testing to evaluate outpatient registration data of two hospitals in China and to investigate the human dynamics of systems that use “first come, first served” (FCFS) protocols. The research results reveal that outpatient behavior follows scaling laws. The results also suggest that the inter-arrival time series exhibits 1/f noise and positive long-range correlation. Our research may contribute to operational optimization and resource allocation in hospitals based on FCFS admission protocols.
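
    Of the methods listed, detrended fluctuation analysis (DFA) is the one that delivers the long-range-correlation conclusion: the root-mean-square fluctuation of the integrated, window-detrended series scales as a power of the window size, with exponent alpha ≈ 0.5 for uncorrelated noise and alpha > 0.5 for positive long-range correlation. A self-contained sketch (window sizes and function names are illustrative, not the paper's settings):

```python
import math

def _linfit(x, y):
    """Least-squares line y = a + b*x; returns (a, b)."""
    n = len(x)
    xb, yb = sum(x) / n, sum(y) / n
    sxx = sum((xi - xb) ** 2 for xi in x)
    sxy = sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y))
    b = sxy / sxx
    return yb - b * xb, b

def dfa_exponent(series, scales=(8, 16, 32, 64, 128)):
    """DFA scaling exponent: integrate the series, detrend it linearly in
    non-overlapping windows of each scale, and fit log F(s) vs log s."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for v in series:                      # cumulative sum (the "profile")
        acc += v - mean
        profile.append(acc)
    log_s, log_f = [], []
    for s in scales:
        sq, count = 0.0, 0
        xs = list(range(s))
        for start in range(0, len(profile) - s + 1, s):
            seg = profile[start:start + s]
            a, b = _linfit(xs, seg)       # local linear trend
            sq += sum((seg[i] - (a + b * i)) ** 2 for i in range(s))
            count += s
        log_s.append(math.log(s))
        log_f.append(0.5 * math.log(sq / count))
    return _linfit(log_s, log_f)[1]       # slope = alpha
```

    White noise comes out near alpha = 0.5, while an integrated (random-walk) series comes out well above 1, so an estimate clearly above 0.5 on inter-arrival data is the signature of positive long-range correlation reported here.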

  10. Empirical Analysis of Pneumatic Tire Friction on Ice

    OpenAIRE

    Holley, Troy Nigel

    2010-01-01

    Pneumatic tire friction on ice is an under-researched area of tire mechanics. This study covers the design and analysis of a series of pneumatic tire tests on a flat, level ice road surface. The terramechanics rig of the Advanced Vehicle Dynamics Lab (AVDL) is a single-wheel test rig that allows for the experimental analysis of the forces and moments on a tire, directly providing the drawbar-pull data for that tire and thus supporting the calculation of friction from these data. This...

  11. Formal analysis of empirical traces in incident management

    International Nuclear Information System (INIS)

    Hoogendoorn, Mark; Jonker, Catholijn M.; Maanen, Peter-Paul van; Sharpanskykh, Alexei

    2008-01-01

    Within the field of incident management split second decisions have to be made, usually on the basis of incomplete and partially incorrect information. As a result of these conditions, errors occur in such decision processes. In order to avoid repetition of such errors, historic cases, disaster plans, and training logs need to be thoroughly analysed. This paper presents a formal approach for such an analysis that pays special attention to spatial and temporal aspects, to information exchange, and to organisational structure. The formal nature of the approach enables automation of analysis, which is illustrated by case studies of two disasters

  12. Empirical Analysis of Religiosity as Predictor of Social Media Addiction

    OpenAIRE

    Jamal J Almenayes

    2015-01-01

    This study sought to examine the dimensions of social media addiction and its relationship to religiosity. To investigate the matter, the present research utilized a well-known Internet addiction scale and modified it to fit social media (Young, 1996). In a factor analysis of items generated by a sample of 1326 participants, three addiction factors were apparent. These factors were later regressed on a scale of religiosity. This scale contained a single factor based on factor analysis. Result...

  13. Empirical analysis of scaling and fractal characteristics of outpatients

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Li-Jiang, E-mail: zljjiang@gmail.com [College of Management and Economics, Tianjin University, Tianjin 300072 (China); Management Institute, Xinxiang Medical University, Xinxiang 453003, Henan (China); Liu, Zi-Xian, E-mail: liuzixian@tju.edu.cn [College of Management and Economics, Tianjin University, Tianjin 300072 (China); Guo, Jin-Li, E-mail: phd5816@163.com [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China)

    2014-01-31

    The paper uses power-law frequency distribution, power spectrum analysis, detrended fluctuation analysis, and surrogate data testing to evaluate outpatient registration data of two hospitals in China and to investigate the human dynamics of systems that use “first come, first served” (FCFS) protocols. The research results reveal that outpatient behavior follows scaling laws. The results also suggest that the inter-arrival time series exhibits 1/f noise and positive long-range correlation. Our research may contribute to operational optimization and resource allocation in hospitals based on FCFS admission protocols.

  14. Physical Violence between Siblings: A Theoretical and Empirical Analysis

    Science.gov (United States)

    Hoffman, Kristi L.; Kiecolt, K. Jill; Edwards, John N.

    2005-01-01

    This study develops and tests a theoretical model to explain sibling violence based on the feminist, conflict, and social learning theoretical perspectives and research in psychology and sociology. A multivariate analysis of data from 651 young adults generally supports hypotheses from all three theoretical perspectives. Males with brothers have…

  15. Unemployment and Labor Market Institutions : An Empirical Analysis

    NARCIS (Netherlands)

    Belot, M.V.K.; van Ours, J.C.

    2001-01-01

    The development of the unemployment rate differs substantially between OECD countries. In this paper we investigate to what extent these differences are related to labor market institutions. In our analysis we use data of eighteen OECD countries over the period 1960-1994 and show that the way in which

  16. A Bayes Factor Meta-Analysis of Recent Extrasensory Perception Experiments: Comment on Storm, Tressoldi, and Di Risio (2010)

    Science.gov (United States)

    Rouder, Jeffrey N.; Morey, Richard D.; Province, Jordan M.

    2013-01-01

    Psi phenomena, such as mental telepathy, precognition, and clairvoyance, have garnered much recent attention. We reassess the evidence for psi effects from Storm, Tressoldi, and Di Risio's (2010) meta-analysis. Our analysis differs from Storm et al.'s in that we rely on Bayes factors, a Bayesian approach for stating the evidence from data for…
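
    For binomial psi data (k hits in n forced-choice trials with chance rate theta0), a Bayes factor of the kind Rouder et al. advocate has a closed form when the alternative places a Beta prior on the hit rate. The sketch below is a generic illustration of that computation, not the specific priors or models used in their reanalysis.

```python
import math

def log_beta(a, b):
    """log of the Beta function, via log-gamma for numerical stability."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def bayes_factor_01(k, n, theta0=0.25, a=1.0, b=1.0):
    """BF_01 for k hits in n trials: point null theta = theta0 against a
    Beta(a, b) prior on theta. The binomial coefficient cancels in the
    ratio of marginal likelihoods. BF_01 > 1 favors the null."""
    log_m0 = k * math.log(theta0) + (n - k) * math.log(1.0 - theta0)
    log_m1 = log_beta(k + a, n - k + b) - log_beta(a, b)
    return math.exp(log_m0 - log_m1)
```

    The behavior this captures is the core of the Bayes-factor argument: data exactly at chance actively support the null (BF_01 well above 1) rather than merely failing to reject it, while a clear excess of hits produces a very small BF_01.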

  17. An Empirical Study of Precise Interprocedural Array Analysis

    Directory of Open Access Journals (Sweden)

    Michael Hind

    1994-01-01

    Full Text Available In this article we examine the role played by the interprocedural analysis of array accesses in the automatic parallelization of Fortran programs. We use the PTRAN system to provide measurements of several benchmarks to compare different methods of representing interprocedurally accessed arrays. We examine issues concerning the effectiveness of automatic parallelization using these methods and the efficiency of a precise summarization method.

  18. Empirical Analysis of Religiosity as Predictor of Social Media Addiction

    Directory of Open Access Journals (Sweden)

    Jamal J Almenayes

    2015-10-01

    Full Text Available This study sought to examine the dimensions of social media addiction and its relationship to religiosity. To investigate the matter, the present research utilized a well-known Internet addiction scale and modified it to fit social media (Young, 1996). In a factor analysis of items generated by a sample of 1326 participants, three addiction factors were apparent. These factors were later regressed on a scale of religiosity. This scale contained a single factor based on factor analysis. Results indicated that social media addiction had three factors: "Social Consequences", "Time Displacement" and "Compulsive Feelings". Religiosity, on the other hand, contained a single factor. Both of these results were arrived at using factor analysis of the respective scales. The relationship between religiosity and social media addiction was then examined using linear regression. The results indicated that only two of the addiction factors were significantly related to religiosity. Future research should address the operationalization of the concept of religiosity to account for multiple dimensions.

  19. Exploring Advertising in Higher Education: An Empirical Analysis in North America, Europe, and Japan

    Science.gov (United States)

    Papadimitriou, Antigoni; Blanco Ramírez, Gerardo

    2015-01-01

    This empirical study explores higher education advertising campaigns displayed in five world cities: Boston, New York, Oslo, Tokyo, and Toronto. The study follows a mixed-methods research design relying on content analysis and multimodal semiotic analysis and employs a conceptual framework based on the knowledge triangle of education, research,…

  20. Tourism Competitiveness Index – An Empirical Analysis Romania vs. Bulgaria

    Directory of Open Access Journals (Sweden)

    Mihai CROITORU

    2011-09-01

Full Text Available In the conditions of the current economic downturn, many specialists consider tourism one of the sectors with the greatest potential to provide worldwide economic growth and development. A growing tourism sector can contribute effectively to employment, increase national income and make a decisive mark on the balance of payments. Thus, tourism can be an important driving force for growth and prosperity, especially in emerging economies, and a key element in reducing poverty and regional disparities. Despite its contribution to economic growth, the development of the tourism sector can be undermined by a series of economic and legislative barriers that affect the competitiveness of this sector. In this context, the World Economic Forum proposes, via the Tourism Competitiveness Index (TCI), both a methodology to identify the key factors that contribute to increasing tourism competitiveness and tools for the analysis and evaluation of these factors. Against this background, this paper aims to analyze the underlying determinants of the TCI from the perspective of two directly competing states, Romania and Bulgaria, in order to highlight the effects of communication on the competitiveness of the tourism sector. The purpose of this analysis is to provide some answers, especially in terms of communication strategies, which may explain the completely different performances of the two national economies in the tourism sector.

  1. IFRS and Stock Returns: An Empirical Analysis in Brazil

    Directory of Open Access Journals (Sweden)

    Rodrigo F. Malaquias

    2016-09-01

Full Text Available In recent years, the convergence of accounting standards has been an issue that motivated new studies in the accounting field. It is expected that the convergence provides users, especially external users of accounting information, with comparable reports among different economies. Considering this scenario, this article was developed in order to compare the effect of accounting numbers on the stock market before and after the accounting convergence in Brazil. The sample of the study involved Brazilian companies listed on the BM&FBOVESPA that had American Depository Receipts (levels II and III) at the New York Stock Exchange (NYSE). For data analysis, descriptive statistics and graphic analysis were employed in order to analyze the behavior of stock returns around the publication dates. The main results indicate that the stock market reacts to the accounting reports. Therefore, the accounting numbers contain relevant information for the decision making of investors in the stock market. Moreover, it is observed that after the accounting convergence, the stock returns of the companies seem to present lower volatility.

  2. Hybrid modeling and empirical analysis of automobile supply chain network

    Science.gov (United States)

    Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying

    2017-05-01

Based on the connection mechanism of nodes which automatically select upstream and downstream agents, a simulation model for the dynamic evolutionary process of a consumer-driven automobile supply chain is established by integrating agent-based modeling (ABM) and discrete modeling on a GIS-based map. First, the rationality of the model is proved by analyzing the consistency of sales and changes in various agent parameters between the simulation model and a real automobile supply chain. Second, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate various characteristic parameters such as mean distance, mean clustering coefficients, and degree distributions. By doing so, it is verified that the model is a typical scale-free and small-world network. Finally, the motion law of this model is analyzed from the perspective of complex adaptive systems. The chaotic state of the simulation system is verified, which suggests that this system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of the complex networks of an automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, the construction and simulation of the model by combining CAS theory and complex networks supply a novel method for supply chain analysis, as well as theoretical bases and experience for the supply chain analysis of auto companies.
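The network metrics named in the abstract (mean distance, mean clustering coefficient, degree distribution) can be computed with standard graph tooling. A minimal sketch, assuming the networkx library and using a Barabási–Albert random graph as a stand-in for the paper's supply-chain network, whose data are not available:

```python
import networkx as nx
from collections import Counter

# Hypothetical stand-in for the supply-chain network: a Barabasi-Albert
# graph, whose degree distribution is scale-free by construction.
G = nx.barabasi_albert_graph(n=200, m=2, seed=42)

mean_distance = nx.average_shortest_path_length(G)  # mean distance
mean_clustering = nx.average_clustering(G)          # mean clustering coeff.
degree_counts = Counter(d for _, d in G.degree())   # degree distribution

print(f"mean distance:   {mean_distance:.3f}")
print(f"mean clustering: {mean_clustering:.3f}")
print("most common degrees:", degree_counts.most_common(3))
```

A short mean distance together with non-negligible clustering is the usual small-world signature the paper checks for.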

  3. Hyperspectral Biofilm Classification Analysis for Carrying Capacity of Migratory Birds in the South Bay Salt Ponds

    Science.gov (United States)

Hsu, Wei-Chen; Kuss, Amber Jean; Ketron, Tyler; Nguyen, Andrew; Remar, Alex Covello; Newcomer, Michelle; Fleming, Erich; Debout, Leslie; Debout, Brad; Detweiler, Angela

    2011-01-01

Tidal marshes are highly productive ecosystems that support migratory birds as roosting and over-wintering habitats on the Pacific Flyway. Microphytobenthos, more commonly 'biofilms', contribute significantly to the primary productivity of wetland ecosystems and provide a substantial food source for macroinvertebrates and avian communities. In this study, biofilms were characterized based on taxonomic classification, density differences, and spectral signatures. These techniques were then applied to remotely sensed images to map biofilm densities and distributions in the South Bay Salt Ponds and predict the carrying capacity of these newly restored ponds for migratory birds. The GER-1500 spectroradiometer was used to obtain in situ spectral signatures for each density class of biofilm. The spectral variation and taxonomic classification between high-, medium-, and low-density biofilm cover types were mapped using in situ spectral measurements and classification of EO-1 Hyperion and Landsat TM 5 images. Biofilm samples were also collected in the field for laboratory analyses including chlorophyll-a, taxonomic classification, and energy content. Comparison of the spectral signatures between the three density groups shows distinct variations useful for classification. Also, analysis of chlorophyll-a concentrations shows statistically significant differences between each density group, using the Tukey-Kramer test at an alpha level of 0.05. The potential carrying capacity of the South Bay Salt Ponds is estimated to be 250,000 birds.
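The pairwise comparison of group means described above can be reproduced with a Tukey HSD test. A minimal sketch, assuming SciPy 1.8+ and invented chlorophyll-a values (the study's measurements are not reproduced here):

```python
from scipy.stats import tukey_hsd

# Illustrative chlorophyll-a concentrations for the three biofilm
# density classes; values are invented for this sketch.
low    = [4.1, 3.8, 4.5, 4.0, 4.3]
medium = [7.9, 8.4, 7.6, 8.1, 8.0]
high   = [12.2, 11.8, 12.5, 12.0, 12.4]

result = tukey_hsd(low, medium, high)
# result.pvalue[i][j] is the p-value for the pairwise comparison of
# groups i and j; compare it against alpha = 0.05 as in the study.
print(result)
```

With well-separated group means like these, every pairwise p-value falls below 0.05, mirroring the study's finding of significant differences between density classes.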

  4. The Analysis Performance Method Naive Bayes Andssvm Determine Pattern Groups of Disease

    Science.gov (United States)

    Sitanggang, Rianto; Tulus; Situmorang, Zakarias

    2017-12-01

Information is a very important element of daily life, but obtaining precise and accurate information is not easy; this research can help decision makers by providing a comparison. The researchers apply data mining techniques to analyze the performance of the naïve Bayes method and the Smooth Support Vector Machine (SSVM) method in grouping diseases. Disease patterns frequently suffered by people in an area can be detected from the information contained in medical records, in which patients' diseases are coded according to the WHO standard. To find patterns of disease groups that often occur in the community, the processing of the medical record data uses the attributes address, sex, type of disease, and age; the analysis then groups the records on these four attributes. From experiments conducted on a fever and diabetes mellitus dataset, the naïve Bayes method produces an average accuracy of 99%, while the SSVM method produces an average accuracy of 93%.
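An accuracy comparison of the two classifier families can be sketched with scikit-learn. The data below are synthetic (the medical-record attributes are not available), and `SVC` is used as a stand-in for the paper's Smooth SVM, which is a smoothed reformulation of the standard SVM objective:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Synthetic stand-in for the medical-record data (address, sex,
# disease type, age are not reproduced here).
X, y = make_classification(n_samples=400, n_features=4, n_informative=3,
                           n_redundant=0, random_state=0)

# SVC is a stand-in for the paper's Smooth SVM (SSVM).
results = {}
for name, clf in [("naive Bayes", GaussianNB()), ("SVM", SVC())]:
    results[name] = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean cross-validated accuracy = {results[name]:.3f}")
```

Cross-validated accuracy, as used here, is a fairer basis for the 99% vs 93% style of comparison than a single train/test split.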

  5. Analysis and dating of Sediments in Montevideo Bay and Silver river

    International Nuclear Information System (INIS)

    Odino, R.; Suarez Antola, R.; Cabral, W.

    1999-01-01

The present work includes generalities on the Río de la Plata; antecedents from studies carried out during 1992-1997 at the Punta Carretas underwater outfall; studies of sediment contamination carried out during 1993-1996; preliminary studies of sediment contamination in the Bay; the results obtained; the treatment of the data; and enrichment factors. The project seeks to obtain reliable data on the state of heavy-metal contamination, the sedimentation velocity and the age of the sediments of Montevideo Bay and the Río de la Plata, and to describe geological aspects of the area involved in the study. The sediment sampling is planned with its purpose in mind, namely to take samples that represent the real characteristics of the sediments in the sampling area. Energy-dispersive X-ray fluorescence is used in the analysis of the samples. Finally, results and discussion are presented

  6. Cause, setting and ownership analysis of dog bites in Bay County, Florida from 2009 to 2010.

    Science.gov (United States)

    Matthias, J; Templin, M; Jordan, M M; Stanek, D

    2015-02-01

Emergency room and hospital discharge data have been used to describe the risk factors and public health impact of dog bites. These data sets are based on financial charges for severe bites and underestimate the dog bite burden within communities. This study expands both the sources of information and the risk factor data collected to provide a demographic analysis of dog bite injury risk factors reported in Bay County, Florida in 2009-2010. Extended data for dog bites reported by various sources from January 1, 2009 to December 31, 2010 were collected by the Florida Department of Health in Bay County. Data collected included the bite victim's age and gender, the primary reported cause of the bite, the setting, the dog's restraint status and the relationship between the victim and the dog. A total of 799 bites were reported. Most bites (55%) were reported first by healthcare practitioners, particularly bites involving children. Management of the dog was the most common reported cause of bites (26%), followed by protective behaviour (24%). Bites of unknown cause were 2.5 times more likely in children. Identifying risks by age group or gender provides an opportunity to implement targeted interventions to prevent dog bites. © 2014 Blackwell Verlag GmbH.

  7. GO-Bayes: Gene Ontology-based overrepresentation analysis using a Bayesian approach.

    Science.gov (United States)

    Zhang, Song; Cao, Jing; Kong, Y Megan; Scheuermann, Richard H

    2010-04-01

    A typical approach for the interpretation of high-throughput experiments, such as gene expression microarrays, is to produce groups of genes based on certain criteria (e.g. genes that are differentially expressed). To gain more mechanistic insights into the underlying biology, overrepresentation analysis (ORA) is often conducted to investigate whether gene sets associated with particular biological functions, for example, as represented by Gene Ontology (GO) annotations, are statistically overrepresented in the identified gene groups. However, the standard ORA, which is based on the hypergeometric test, analyzes each GO term in isolation and does not take into account the dependence structure of the GO-term hierarchy. We have developed a Bayesian approach (GO-Bayes) to measure overrepresentation of GO terms that incorporates the GO dependence structure by taking into account evidence not only from individual GO terms, but also from their related terms (i.e. parents, children, siblings, etc.). The Bayesian framework borrows information across related GO terms to strengthen the detection of overrepresentation signals. As a result, this method tends to identify sets of closely related GO terms rather than individual isolated GO terms. The advantage of the GO-Bayes approach is demonstrated with a simulation study and an application example.
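The standard ORA baseline that GO-Bayes improves on is the hypergeometric test the abstract describes. A minimal sketch of that baseline with invented toy numbers (the GO-Bayes model itself is not reproduced here):

```python
from math import comb

def hypergeom_pvalue(k, n, K, N):
    """P(X >= k): overrepresentation p-value for observing k annotated
    genes in a selected set of n, when K of the N genes in the genome
    carry the GO term (one-sided hypergeometric tail)."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# Toy numbers: 40 of 1000 genes carry a GO term, and 10 of the 50
# differentially expressed genes carry it (expected under chance: ~2).
p = hypergeom_pvalue(k=10, n=50, K=40, N=1000)
print(f"standard ORA p-value: {p:.2e}")
```

GO-Bayes replaces this term-by-term test with a Bayesian model that shares evidence across parent, child and sibling terms in the GO graph.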

  8. Empirical analysis of the effects of cyber security incidents.

    Science.gov (United States)

    Davis, Ginger; Garcia, Alfredo; Zhang, Weide

    2009-09-01

    We analyze the time series associated with web traffic for a representative set of online businesses that have suffered widely reported cyber security incidents. Our working hypothesis is that cyber security incidents may prompt (security conscious) online customers to opt out and conduct their business elsewhere or, at the very least, to refrain from accessing online services. For companies relying almost exclusively on online channels, this presents an important business risk. We test for structural changes in these time series that may have been caused by these cyber security incidents. Our results consistently indicate that cyber security incidents do not affect the structure of web traffic for the set of online businesses studied. We discuss various public policy considerations stemming from our analysis.
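A test for a structural change at a known incident date can be sketched as a Chow-type F test on the series mean. This is a simplified stand-in for the paper's actual procedure, and the traffic numbers are invented:

```python
# Minimal Chow-type test for a structural break in the mean of a
# web-traffic series at a known incident date (assumed setup).
def rss(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def chow_f(series, break_idx):
    """F statistic for a mean shift at break_idx (k = 1 parameter)."""
    pooled = rss(series)
    split = rss(series[:break_idx]) + rss(series[break_idx:])
    k, n = 1, len(series)
    return (pooled - split) / k / (split / (n - 2 * k))

traffic = [100, 102, 98, 101, 99, 103,   # before the incident
           80, 78, 82, 79, 81, 77]       # after the incident
print(f"Chow F statistic: {chow_f(traffic, 6):.1f}")
```

A large F statistic rejects the no-break hypothesis; the paper's finding is the opposite pattern, with no significant break around the incidents studied.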

  9. Modeling for Determinants of Human Trafficking: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Seo-Young Cho

    2015-02-01

    Full Text Available This study aims to identify robust push and pull factors of human trafficking. I test for the robustness of 70 push and 63 pull factors suggested in the literature. In doing so, I employ an extreme bound analysis, running more than two million regressions with all possible combinations of variables for up to 153 countries during the period of 1995–2010. My results show that crime prevalence robustly explains human trafficking both in destination and origin countries. Income level also has a robust impact, suggesting that the cause of human trafficking shares that of economic migration. Law enforcement matters more in origin countries than destination countries. Interestingly, a very low level of gender equality may have constraining effects on human trafficking outflow, possibly because gender discrimination limits female mobility that is necessary for the occurrence of human trafficking.
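The extreme bound analysis described above boils down to re-estimating the focus coefficient under every admissible combination of control variables and checking whether its sign survives. A small-scale sketch with synthetic data (numpy assumed; the study's 70 push and 63 pull factors are not reproduced):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 153  # maximum country sample, as in the study

# Synthetic data: a trafficking index driven by crime prevalence,
# plus five irrelevant candidate controls.
crime = rng.normal(size=n)
controls = {f"z{i}": rng.normal(size=n) for i in range(5)}
y = 1.5 * crime + rng.normal(size=n)

coefs = []
for r in range(3):  # all combinations of up to two controls
    for combo in itertools.combinations(controls, r):
        X = np.column_stack([np.ones(n), crime] + [controls[c] for c in combo])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        coefs.append(beta[1])  # coefficient on the focus variable

# Simplified robustness criterion: the focus variable is "robust" if
# its coefficient keeps one sign across every specification.
print(f"{len(coefs)} regressions; coef range "
      f"[{min(coefs):.2f}, {max(coefs):.2f}]")
```

Scaling the same loop to all combinations of dozens of candidate variables is what produces the "more than two million regressions" of the study.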

  10. Generalisability of an online randomised controlled trial: an empirical analysis.

    Science.gov (United States)

    Wang, Cheng; Mollan, Katie R; Hudgens, Michael G; Tucker, Joseph D; Zheng, Heping; Tang, Weiming; Ling, Li

    2018-02-01

    Investigators increasingly use online methods to recruit participants for randomised controlled trials (RCTs). However, the extent to which participants recruited online represent populations of interest is unknown. We evaluated how generalisable an online RCT sample is to men who have sex with men in China. Inverse probability of sampling weights (IPSW) and the G-formula were used to examine the generalisability of an online RCT using model-based approaches. Online RCT data and national cross-sectional study data from China were analysed to illustrate the process of quantitatively assessing generalisability. The RCT (identifier NCT02248558) randomly assigned participants to a crowdsourced or health marketing video for promotion of HIV testing. The primary outcome was self-reported HIV testing within 4 weeks, with a non-inferiority margin of -3%. In the original online RCT analysis, the estimated difference in proportions of HIV tested between the two arms (crowdsourcing and health marketing) was 2.1% (95% CI, -5.4% to 9.7%). The hypothesis that the crowdsourced video was not inferior to the health marketing video to promote HIV testing was not demonstrated. The IPSW and G-formula estimated differences were -2.6% (95% CI, -14.2 to 8.9) and 2.7% (95% CI, -10.7 to 16.2), with both approaches also not establishing non-inferiority. Conducting generalisability analysis of an online RCT is feasible. Examining the generalisability of online RCTs is an important step before an intervention is scaled up. NCT02248558. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
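The IPSW idea can be illustrated with a single stratifying covariate: trial participants are reweighted so that the weighted sample matches the target population's composition. All numbers below are invented for the sketch; the study estimates the sampling probabilities from a model rather than from known shares:

```python
# Minimal IPSW sketch: reweight a trial sample so it matches a target
# population on one covariate (age group). Numbers are invented.

pop_share   = {"young": 0.70, "old": 0.30}  # target population
trial_share = {"young": 0.50, "old": 0.50}  # online trial sample
test_rate   = {"young": 0.60, "old": 0.40}  # HIV-testing rate by group

# w(g) is proportional to P(group in population) / P(group in sample)
weights = {g: pop_share[g] / trial_share[g] for g in pop_share}
ipsw_rate = sum(weights[g] * trial_share[g] * test_rate[g] for g in pop_share)

print(f"unweighted rate: {0.5 * 0.60 + 0.5 * 0.40:.2f}")
print(f"IPSW-adjusted rate: {ipsw_rate:.2f}")
```

The gap between the unweighted and weighted estimates is exactly the generalisability correction the study quantifies for each trial arm.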

  11. Analysis of trends in aviation maintenance risk: An empirical approach

    International Nuclear Information System (INIS)

    Marais, Karen B.; Robichaud, Matthew R.

    2012-01-01

    Safety is paramount in the airline industry. A significant amount of effort has been devoted to reducing mechanical failures and pilot errors. Recently, more attention has been devoted to the contribution of maintenance to accidents and incidents. This study investigates and quantifies the contribution of maintenance, both in terms of frequency and severity, to passenger airline risk by analyzing three different sources of data from 1999 to 2008: 769 NTSB accident reports, 3242 FAA incident reports, and 7478 FAA records of fines and other legal actions taken against airlines and associated organizations. We analyze several safety related metrics and develop an aviation maintenance risk scorecard that collects these metrics to synthesize a comprehensive track record of maintenance contribution to airline accidents and incidents. We found for example that maintenance-related accidents are approximately 6.5 times more likely to be fatal than accidents in general, and that when fatalities do occur, maintenance accidents result in approximately 3.6 times more fatalities on average. Our analysis of accident trends indicates that this contribution to accident risk has remained fairly constant over the past decade. Our analysis of incidents and FAA fines and legal actions also revealed similar trends. We found that at least 10% of incidents involving mechanical failures such as ruptured hydraulic lines can be attributed to maintenance, suggesting that there may be issues surrounding both the design of and compliance with maintenance plans. Similarly 36% of FAA fines and legal actions involve inadequate maintenance, with recent years showing a decline to about 20%, which may be a reflection of improved maintenance practices. Our results can aid industry and government in focusing resources to continue improving aviation safety.
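The "6.5 times more likely to be fatal" figure is a relative risk, i.e. a ratio of fatality proportions. The counts below are invented to reproduce the order of magnitude only; the paper's 769 NTSB reports are not reproduced here:

```python
# Hypothetical counts chosen for illustration, not the paper's data.
maint_accidents, maint_fatal = 80, 26   # maintenance-related accidents
all_accidents, all_fatal = 769, 38      # all accidents in the sample

p_fatal_maint = maint_fatal / maint_accidents
p_fatal_all = all_fatal / all_accidents
ratio = p_fatal_maint / p_fatal_all
print(f"relative fatality risk: {ratio:.1f}x")
```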

  12. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

The author applies the GO methodology to a reliability analysis of the offsite power supply system of the Daya Bay NPP. Direct quantitative calculation formulas for the steady-state reliability of the system with shared signals, and dynamic calculation formulas of the state probability for a unit with two states, are derived. A method to solve the fault event sets of the system is also presented, and all the fault event sets of the offsite power supply system and their failure probabilities are obtained. The recovery reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The result shows that the GO methodology is simple and useful for the steady-state and dynamic reliability analysis of repairable systems
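For a unit with two states (up/down), the standard Markov result gives the dynamic state probability in closed form; the paper's derived formulas may differ in detail, so this is a generic sketch with assumed rates:

```python
from math import exp

def availability(t, lam, mu):
    """Probability that a two-state repairable unit (failure rate lam,
    repair rate mu) is up at time t, starting from the up state:
    A(t) = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu)*t)."""
    s = lam + mu
    return mu / s + (lam / s) * exp(-s * t)

lam, mu = 0.01, 0.5   # illustrative rates per hour (assumed values)
for t in (0.0, 10.0, 1000.0):
    print(f"A({t:g} h) = {availability(t, lam, mu):.4f}")
# As t grows, A(t) decays to the steady-state availability mu/(lam+mu).
```

The steady-state value mu/(lam+mu) is the "stable" figure, while the exponential term captures the dynamic behaviour after a disturbance such as a grid stability failure.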

  13. Aspect-based sentiment analysis to review products using Naïve Bayes

    Science.gov (United States)

Mubarok, Mohamad Syahrul; Adiwijaya; Aldhi, Muhammad Dwi

    2017-08-01

Product reviews can provide great benefits for consumers and producers. The number of reviews can range from hundreds to thousands, containing various opinions, which makes the process of analyzing and extracting information from existing reviews increasingly difficult. In this research, sentiment analysis was used to analyze and extract sentiment polarity in product reviews with respect to a specific aspect of the product. The research was conducted in three phases: data preprocessing, which involves part-of-speech (POS) tagging; feature selection using Chi-square; and classification of the sentiment polarity of aspects using Naïve Bayes. Based on the evaluation results, the system is able to perform aspect-based sentiment analysis with a highest F1-measure of 78.12%.
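The Chi-square selection and Naïve Bayes classification stages can be sketched as a scikit-learn pipeline. The reviews and labels below are invented (the paper's corpus and its POS-tagging step are not reproduced):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny invented reviews labelled by sentiment toward one aspect
# ("battery"); 1 = positive, 0 = negative.
reviews = ["battery lasts long", "battery dies fast",
           "great battery life", "terrible battery drain",
           "battery is excellent", "battery is awful"]
labels = [1, 0, 1, 0, 1, 0]

model = make_pipeline(CountVectorizer(),
                      SelectKBest(chi2, k=8),   # Chi-square selection
                      MultinomialNB())          # Naive Bayes classifier
model.fit(reviews, labels)
print(model.predict(["battery life is great"]))
```

On a real corpus the F1-measure would be computed on held-out data, e.g. with `sklearn.metrics.f1_score`, matching the 78.12% figure's role in the paper.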

  14. The Effect of Shocks: An Empirical Analysis of Ethiopia

    Directory of Open Access Journals (Sweden)

    Yilebes Addisu Damtie

    2015-07-01

Full Text Available Besides striving for the increase of production and development, it is also necessary to reduce the losses created by shocks. The people of Ethiopia are exposed to the impact of both natural and man-made shocks. Accordingly, policy makers, governmental and non-governmental organizations need to identify the important shocks and their effects and use them as an input. This study was conducted to identify the food insecurity shocks and to estimate their effects, based on the conceptual framework developed, in Libo Kemkem District, Amhara National Regional State, Ethiopia. Descriptive statistical analysis, multiple regression, binary logistic regression, chi-squared and independent sample t-tests were used as data analysis techniques. The results showed eight shocks affecting households: weather variability, weed infestation, plant insect and pest infestation, soil fertility problems, animal disease and epidemics, human disease and epidemics, price fluctuation problems and conflict. Weather variability, plant insect and pest infestation, weed infestation, and animal disease and epidemics created mean losses of 3,821.38, 886.06, 508.04 and 1,418.32 Birr, respectively. In addition, human disease and epidemics, price fluctuation problems and conflict affected 68.11%, 88.11% and 14.59% of households, respectively. Among the sample households, 28.1% were not able to meet their food needs throughout the year while 71.9% could. The results of the multiple regression models revealed that weed existence (β = –0.142, p < 0.05), plant insect and pest infestation (β = –0.279, p < 0.01) and soil fertility problems (β = –0.321, p < 0.01) had significant effects on income. Asset was found to be significantly affected by plant insect and pest infestation (β = –0.229, p < 0.01), human disease and epidemics (β = 0.145, p < 0.05), and soil fertility problems (β = –0.317, p < 0.01), while food production was affected by soil fertility problems (β = –0.314, p < 0.01). Binary logistic

  15. An empirical study of tourist preferences using conjoint analysis

    Directory of Open Access Journals (Sweden)

    Tripathi, S.N.

    2010-01-01

Full Text Available Tourism and hospitality have become key global economic activities as expectations with regard to our use of leisure time have evolved, attributing greater meaning to our free time. While the growth in tourism has been impressive, India's share in total global tourism arrivals and earnings is quite insignificant. It is an accepted fact that India has tremendous potential for the development of tourism, and this anomaly and the various underlying factors responsible for it are the focus of our study. The objective is the determination of customer preferences for multi-attribute hybrid services like tourism, so as to enable the state tourism board to deliver a desired combination of intrinsic attributes, helping it to create a sustainable competitive advantage, leading to greater customer satisfaction and positive word of mouth. Conjoint analysis has been used for this purpose; it estimates the structure of a consumer's preferences, given his/her overall evaluations of a set of alternatives that are pre-specified in terms of levels of different attributes.
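Ratings-based conjoint analysis estimates part-worth utilities by regressing overall ratings on dummy-coded attribute levels. A minimal sketch assuming numpy, with two invented attributes (price: low/high; comfort: basic/luxury) and invented ratings:

```python
import numpy as np

# One rating per profile of a 2x2 full-factorial design.
# Columns: intercept, price=high, comfort=luxury (dummy coding).
X = np.array([[1, 0, 0],
              [1, 0, 1],
              [1, 1, 0],
              [1, 1, 1]], dtype=float)
ratings = np.array([6.0, 9.0, 3.0, 6.5])

beta = np.linalg.lstsq(X, ratings, rcond=None)[0]
print(f"baseline utility:          {beta[0]:.2f}")
print(f"part-worth price=high:     {beta[1]:.2f}")
print(f"part-worth comfort=luxury: {beta[2]:.2f}")
```

The signs and magnitudes of the part-worths show which attribute levels drive preference, which is exactly the information a tourism board would use to assemble an attractive service bundle.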

  16. Empirical Analysis on CSR Communication in Romania: Transparency and Participation

    Directory of Open Access Journals (Sweden)

    Irina-Eugenia Iamandi

    2012-12-01

Full Text Available In the specific field of corporate social responsibility (CSR), the participation of companies in supporting social and environmental issues is mainly analysed and/or measured based on their CSR communication policy; in this way, the transparency of CSR reporting procedures is one of the most pressing challenges for researchers and practitioners in the field. The main research objective of the present paper is to distinguish between different types of CSR participation by identifying the reasons behind CSR communication for a series of companies acting on the Romanian market. The descriptive analysis – conducted both at the integrated and the corporate level for the Romanian companies – took into account five main CSR communication related issues: CSR site, CSR report, CSR listing, CSR budget and CSR survey. The results highlight both the declarative/prescriptive and the practical/descriptive perspectives of CSR communication in Romania, showing that the Romanian CSR market is reaching its full maturity. In more specific terms, the majority of the investigated companies are already using different types of CSR participation, marking the transition from CSR just for commercial purposes to CSR for long-term strategic use. The achieved results are broadly analysed in the paper and specific conclusions are emphasized.

  17. Political determinants of social expenditures in Greece: an empirical analysis

    Directory of Open Access Journals (Sweden)

    Ebru Canikalp

    2017-09-01

Full Text Available A view prominently expounded is that the interaction between the composition and the volume of public expenditures is directly affected by political, institutional, psephological and ideological indicators. A crucial component of public expenditures, social expenditures play an important role in the economy as they directly and indirectly affect the distribution of income and wealth. Social expenditures aim at correcting the unequal distribution of income and wealth, and comprise cash benefits, direct in-kind provision of goods and services, and tax breaks with social purposes. The aim of this study is to determine the relationship between political structure, i.e. government fragmentation, ideological composition, elections and so on, and social expenditures in Greece. Employing data from the Comparative Political Data Set (CPDS) and the OECD Social Expenditure Database (SOCX), a time series analysis was conducted for Greece for the 1980-2014 period. The findings of the study indicate that voter turnout, spending on the elderly population and the number of government changes have positive and statistically significant effects on social expenditures in Greece, while debt stock and cabinet composition have negative effects.

  18. Empirical Analysis on The Existence of The Phillips Curve

    Directory of Open Access Journals (Sweden)

    Shaari Mohd Shahidan

    2018-01-01

Full Text Available The Phillips curve shows the trade-off relationship between the inflation and unemployment rates: when inflation rises on the back of high economic growth, more jobs are available and unemployment therefore falls. However, the existence of the Phillips curve in high-income countries has not been much discussed. Countries with high income should have a low unemployment rate, suggesting high inflation. Yet some high-income countries, for example the United States in the 1970s, could not avert stagflation, whereby a high unemployment rate and high inflation occurred at the same time. This situation is contrary to the Phillips curve. Therefore, this study aims to investigate the existence of the Phillips curve in high-income countries for the period 1990-2014 using panel data analysis. The most interesting finding of this study is the existence of a bidirectional relationship between the unemployment rate and the inflation rate in both the long and short runs. Therefore, governments should choose whether to stabilize the inflation rate or to reduce the unemployment rate.

  19. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' d between the logarithm of the model output and the logarithm of the experimental data, defined by d^2 = (1/n) sum_{i=1..n} (ln M_i - ln O_i)^2, where M_i is the i-th experimental value, O_i the corresponding model evaluation and n the number of couplets 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this undesired effect may, in some circumstances, be corrected by means of simple formulae
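The functional distance defined in the abstract is straightforward to compute once the couplets of measured and modelled values are available. A minimal sketch with invented radiocaesium levels:

```python
from math import log, sqrt

def functional_distance(measured, modelled):
    """Return d, where d^2 = (1/n) * sum_i (ln M_i - ln O_i)^2 over the
    couplets (M_i = measured value, O_i = model output)."""
    n = len(measured)
    d2 = sum((log(m) - log(o)) ** 2 for m, o in zip(measured, modelled)) / n
    return sqrt(d2)

# Invented radiocaesium levels (Bq/kg): measurement vs model output.
measured = [120.0, 95.0, 60.0, 33.0]
modelled = [100.0, 110.0, 55.0, 30.0]
d = functional_distance(measured, modelled)
print(f"functional distance d = {d:.3f}")
```

A perfect model gives d = 0; the statistical distribution of d over many validation data sets is what EBUA uses to set confidence limits on the model predictions.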

  20. Analysis of the Energy Performance of the Chesapeake Bay Foundation's Philip Merrill Environmental Center

    Energy Technology Data Exchange (ETDEWEB)

Griffith, B.; Deru, M.; Torcellini, P.; Ellis, P.

    2005-04-01

The Chesapeake Bay Foundation designed its new headquarters building to minimize its environmental impact on the already highly polluted Chesapeake Bay by incorporating numerous high-performance energy-saving features into the building design. CBF then contacted NREL to perform an unbiased energy evaluation of the building. Because the building attracted much attention in the sustainable design community, an unbiased evaluation was necessary to help designers replicate successes and identify and correct problem areas. This report focuses on NREL's monitoring and analysis of the overall energy performance of the building.

  1. Analysis of multi-channel seismic reflection and magnetic data along 13 degrees N latitude across the Bay of Bengal

    Digital Repository Service at National Institute of Oceanography (India)

    Rao, D.G.; Bhattacharya, G.C.; Ramana, M.V.; Subrahmanyam, V.; Ramprasad, T.; Krishna, K.S.; Chaubey, A.K.; Murty, G.P.S.; Srinivas, K.; Desa, M.; Reddy, S.I.; Ashalata, B.; Subrahmanyam, C.; Mital, G.S.; Drolia, R.K.; rai, S.N.; Ghosh, S.K.; Singh, R.N.; Majumdar, M.

Analysis was made of the multi-channel seismic reflection, magnetic and bathymetric data collected along a transect 1110 km long, parallel to 13 degrees N latitude, across the Bay of Bengal. The transect is from the continental shelf off Madras...

  2. An Empirical Analysis of the Impact of Capital Market Activities on ...

    African Journals Online (AJOL)

    An Empirical Analysis of the Impact of Capital Market Activities on the Nigerian Economy. ... Others include the expansion of the stock market in terms of depth and breadth and the attraction of foreign direct investment and foreign portfolio investment into the Nigerian economic landscape. Keywords: Nigeria, Market ...

  3. Perception of urban retailing environments : an empirical analysis of consumer information and usage fields

    NARCIS (Netherlands)

    Timmermans, H.J.P.; vd Heijden, R.E.C.M.; Westerveld, J.

    1982-01-01

    This article reports on an empirical analysis of consumer information and usage fields in the city of Eindhoven. The main purposes of this study are to investigate the distance, sectoral and directional biases of these fields, to analyse whether the degree of biases is related to personal

  4. Steering the Ship through Uncertain Waters: Empirical Analysis and the Future of Evangelical Higher Education

    Science.gov (United States)

    Rine, P. Jesse; Guthrie, David S.

    2016-01-01

    Leaders of evangelical Christian colleges must navigate a challenging environment shaped by public concern about college costs and educational quality, federal inclinations toward increased regulation, and lingering fallout from the Great Recession. Proceeding from the premise that empirical analysis empowers institutional actors to lead well in…

  5. Deriving Multidimensional Poverty Indicators: Methodological Issues and an Empirical Analysis for Italy

    Science.gov (United States)

    Coromaldi, Manuela; Zoli, Mariangela

    2012-01-01

    Theoretical and empirical studies have recently adopted a multidimensional concept of poverty. There is considerable debate about the most appropriate degree of multidimensionality to retain in the analysis. In this work we add to the received literature in two ways. First, we derive indicators of multiple deprivation by applying a particular…

  6. Extended Analysis of Empirical Citations with Skinner's "Verbal Behavior": 1984-2004

    Science.gov (United States)

    Dixon, Mark R.; Small, Stacey L.; Rosales, Rocio

    2007-01-01

    The present paper comments on and extends the citation analysis of verbal operant publications based on Skinner's "Verbal Behavior" (1957) by Dymond, O'Hora, Whelan, and O'Donovan (2006). Variations in population parameters were evaluated for only those studies that Dymond et al. categorized as empirical. Preliminary results indicate that the…

  7. Does risk management contribute to IT project success? A meta-analysis of empirical evidence

    NARCIS (Netherlands)

    de Bakker, K.F.C.; Boonstra, A.; Wortmann, J.C.

    The question of whether risk management contributes to IT project success has long been considered relevant by both the academic and practitioner communities. This paper presents a meta-analysis of the empirical evidence that either supports or opposes the claim that risk

  8. Critical Access Hospitals and Retail Activity: An Empirical Analysis in Oklahoma

    Science.gov (United States)

    Brooks, Lara; Whitacre, Brian E.

    2011-01-01

    Purpose: This paper takes an empirical approach to determining the effect that a critical access hospital (CAH) has on local retail activity. Previous research on the relationship between hospitals and economic development has primarily focused on single-case, multiplier-oriented analysis. However, as the efficacy of federal and state-level rural…

  9. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    Science.gov (United States)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the underlying factors contributing to the default rate of informal lending. The paper adopts snowball-sampling interviews to collect data and uses a logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how informal lending differs from commercial loans. Factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit-scoring indices. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate as to what this may imply for the roles of the formal and informal financial sectors.
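    The logistic-regression step described in this record can be sketched as follows. This is an illustrative toy fit on synthetic data with hypothetical predictors, not the study's Yiwu survey data or model specification.

```python
import numpy as np

# Toy logistic regression fit by gradient descent on synthetic "loan" records.
# Predictors and coefficients are hypothetical, for illustration only.
rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 2))          # e.g. scaled loan amount, familiarity score
true_w, true_b = np.array([1.5, -2.0]), -0.5
p = 1.0 / (1.0 + np.exp(-(X @ true_w + true_b)))
y = (rng.uniform(size=n) < p).astype(float)   # 1 = default, 0 = repaid

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Minimize the negative log-likelihood by full-batch gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        pred = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (pred - y) / len(y)
        b -= lr * np.mean(pred - y)
    return w, b

w, b = fit_logistic(X, y)   # sign of each coefficient indicates a factor's effect on default odds
```

    In a study of this kind, the sign and magnitude of each fitted coefficient indicate whether, and how strongly, a factor raises or lowers the odds of default.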

  10. Metagenomic analysis of lysogeny in Tampa Bay: implications for prophage gene expression.

    Directory of Open Access Journals (Sweden)

    Lauren McDaniel

    Full Text Available Phage integrase genes often play a role in the establishment of lysogeny in temperate phage by catalyzing the integration of the phage into one of the host's replicons. To investigate temperate phage gene expression, an induced viral metagenome from Tampa Bay was sequenced by 454/Pyrosequencing. The sequencing yielded 294,068 reads with 6.6% identifiable. One hundred three sequences had significant similarity to integrases by BLASTX analysis (e-value ≤ 0.001). Four sequences with the strongest amino-acid-level similarity to integrases were selected, and real-time PCR primers and probes were designed. Initial testing with microbial fraction DNA from Tampa Bay revealed 1.9 × 10^7 and 1300 gene copies L^-1 of Vibrio-like integrase and Oceanicola-like integrase, respectively. The other two integrases were not detected. The integrase assay was then tested on microbial fraction RNA extracted from 200 ml of Tampa Bay water sampled biweekly over a 12-month time series. Vibrio-like integrase gene expression was detected in three samples, with estimated copy numbers of 2.4-1280 L^-1. Clostridium-like integrase gene expression was detected in 6 samples, with estimated copy numbers of 37 to 265 L^-1. In all cases, detection of integrase gene expression corresponded to the occurrence of lysogeny as detected by prophage induction. Investigation of the environmental distribution of the two expressed integrases in the Global Ocean Survey Database found the Vibrio-like integrase was present in genome equivalents of 3.14% of microbial libraries and all four viral metagenomes. There were two similar genes in the library from British Columbia and one similar gene was detected in both the Gulf of Mexico and Sargasso Sea libraries. In contrast, in the Arctic library eleven similar genes were observed. The Clostridium-like integrase was less prevalent, being found in 0.58% of the microbial and none of the viral libraries.
These results underscore the value of metagenomic data

  11. Study of environmental pollution by heavy metals in Sepetiba Bay and Paraiba do Sul River - Guandu River by analysis of critical parameters

    International Nuclear Information System (INIS)

    Pfeiffer, W.C.; Fiszman, M.; Malm, O.; Lima, N.R.W.; Azcue, J.M.

    Heavy metal pollution in Sepetiba Bay and the Paraiba do Sul River - Guandu River system is studied by analysis of critical parameters, which are employed in determining the environmental impact of nuclear installations. Three critical metals (Cr, Zn, Cd) and four metals (Pb, Cu, Zn, Cr) are discharged by the industrial parks of Sepetiba Bay and the Paraiba Valley, respectively. (M.A.C.) [pt

  12. Elemental analysis in bed sediment samples of Karnafuli estuarine zone in the Bay of Bengal by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Molla, N.I.; Hossain, S.M.; Basunia, S.; Miah, R.U.; Rahman, M.; Sikder, D.H.; Chowdhury, M.I.

    1997-01-01

    The concentrations of rare earths and other elements have been determined in bed sediment samples of the Karnafuli estuarine zone in the Bay of Bengal by instrumental neutron activation analysis (INAA). The samples and the standards soil-5, soil-7, coal fly ash and pond sediment were prepared and simultaneously irradiated for short and long durations at the TRIGA Mark-II research reactor facility of the Atomic Energy Research Establishment, Savar, Dhaka. The maximum thermal neutron flux was of the order of 10^13 n·cm^-2·s^-1. After irradiation, the radioactivity of the product nuclides was measured using a high-resolution high-purity germanium detector system. Analysis of γ-ray spectra and quantitative analysis of the elemental concentrations were done with the software GANAAS. It was possible to determine the concentration levels of 27 elements, including the rare earths La, Ce, Sm, Eu, Tb, Dy and Yb, as well as uranium and thorium. (author)

  13. Pharmacoeconomic analysis of voriconazole vs. caspofungin in the empirical antifungal therapy of febrile neutropenia in Australia.

    Science.gov (United States)

    Al-Badriyeh, Daoud; Liew, Danny; Stewart, Kay; Kong, David C M

    2012-05-01

    In two major clinical trials, voriconazole and caspofungin were recommended as alternatives to liposomal amphotericin B for empirical use in febrile neutropenia. This study investigated the health economic impact of using voriconazole vs. caspofungin in patients with febrile neutropenia. A decision analytic model was developed to measure downstream consequences of empirical antifungal therapy. Clinical outcomes measured were success, breakthrough infection, persistent base-line infection, persistent fever, premature discontinuation and death. Treatment transition probabilities and patterns were directly derived from data in two relevant randomised controlled trials. Resource use was estimated using an expert clinical panel. Cost inputs were obtained from the latest Australian sources. The analysis adopted the perspective of the Australian hospital system. The use of caspofungin led to a lower expected mean cost per patient than voriconazole (AU$40,558 vs. AU$41,356), with a net cost saving of AU$798 (1.9%) per patient. Results were most sensitive to the duration of therapy and the alternative therapy used post-discontinuation. In uncertainty analysis, the cost associated with caspofungin was less than that with voriconazole in 65.5% of cases. This is the first economic evaluation of voriconazole vs. caspofungin for empirical therapy. Caspofungin appears to have a higher probability of yielding cost savings than voriconazole for empirical therapy; however, the difference between the two medications does not appear to be statistically significant. © 2011 Blackwell Verlag GmbH.
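    At its core, the decision-analytic comparison described above weights each downstream outcome's cost by its probability. The following is a minimal sketch with hypothetical branch probabilities and costs, not the trial-derived transition probabilities or Australian cost inputs used in the study.

```python
# Toy expected-cost comparison for a decision-analytic model. Probabilities and
# costs below are hypothetical placeholders, not the study's inputs.

def expected_cost(branches):
    """branches: list of (probability, downstream cost in AU$) per outcome."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9  # branches must cover all outcomes
    return sum(p * c for p, c in branches)

# Hypothetical outcome branches: success / breakthrough infection / discontinuation
caspofungin  = [(0.40, 30000.0), (0.10, 80000.0), (0.50, 40000.0)]
voriconazole = [(0.38, 31000.0), (0.11, 82000.0), (0.51, 41000.0)]

saving = expected_cost(voriconazole) - expected_cost(caspofungin)
```

    A sensitivity or uncertainty analysis then re-evaluates the same expected-cost difference while varying the probabilities and costs over plausible ranges.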

  14. L1-norm kernel discriminant analysis via Bayes error bound optimization for robust feature extraction.

    Science.gov (United States)

    Zheng, Wenming; Lin, Zhouchen; Wang, Haixian

    2014-04-01

    A novel discriminant analysis criterion is derived in this paper under the theoretical framework of Bayes optimality. In contrast to the conventional Fisher discriminant criterion, the major novelty of the proposed one is the use of the L1 norm rather than the L2 norm, which makes it less sensitive to outliers. With the L1-norm discriminant criterion, we propose a new linear discriminant analysis (L1-LDA) method for the linear feature extraction problem. To solve the L1-LDA optimization problem, we propose an efficient iterative algorithm in which a novel surrogate convex function is introduced, such that the optimization problem in each iteration reduces to a convex programming problem with a guaranteed closed-form solution. Moreover, we generalize the L1-LDA method to nonlinear robust feature extraction problems via the kernel trick, yielding the proposed L1-norm kernel discriminant analysis (L1-KDA) method. Extensive experiments on simulated and real data sets are conducted to evaluate the effectiveness of the proposed method in comparison with state-of-the-art methods.
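    For contrast with the proposed L1 criterion, the conventional L2-norm Fisher discriminant that the paper takes as its baseline can be sketched in a few lines; the L1-LDA surrogate-convex iteration itself is not reproduced here, and the two-class toy data are made up.

```python
import numpy as np

# Conventional L2-norm Fisher discriminant on two synthetic Gaussian classes;
# the paper's L1-norm criterion and its iterative surrogate solver are not shown.
rng = np.random.default_rng(1)
A = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(100, 2))
B = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(100, 2))

def fisher_direction(A, B):
    """w ∝ Sw^{-1}(mA - mB): maximize between-class over within-class scatter."""
    mA, mB = A.mean(axis=0), B.mean(axis=0)
    Sw = np.cov(A.T) * (len(A) - 1) + np.cov(B.T) * (len(B) - 1)  # within-class scatter
    w = np.linalg.solve(Sw, mA - mB)
    return w / np.linalg.norm(w)

w = fisher_direction(A, B)
proj_A, proj_B = A @ w, B @ w   # 1-D projections; the class means separate along w
```

    The L1 variant replaces the squared (L2) scatter terms in this criterion with absolute deviations, which is what reduces the influence of outlying samples on the direction w.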

  15. Environmental And Area Support Capability Analysis For Seaweed Mariculture Development In Hading Bay Of East Flores Regency

    Directory of Open Access Journals (Sweden)

    Dominikus K. Da Costa

    2015-08-01

    Full Text Available Abstract East Flores regency has adequate marine resource potential to develop a seaweed aquaculture area. An important aspect of seaweed aquaculture is site selection, which is based on the marine area extent and its ecological quality. The objectives of the study were to analyze the water ecology and its support capability and to determine the best site for continuous seaweed mariculture in Hading Bay of East Flores Regency. The study used a descriptive method. It was conducted in Hading Bay, Lewolema District, East Flores Regency in March 2015. Data analysis was done using GIS based on area suitability values, and the mariculture method applied was the long-line method. The total Hading Bay water territory was 864676 ha. Site A was 135345 ha, site B was 474222 ha and site C was 255108 ha. The area in the S1 category was 729331 ha, extending over sites B and C. The area in the S2 category was 135345 ha, extending over site A. Water territory support capability was 778208 ha. The number of seaweed mariculture units was 194552 units and seaweed territory capacity was 99. Hading Bay water has the capacity and area support capability for a K. alvarezii seaweed mariculture site. Site A was categorized S2 on suitability class, and sites B and C were categorized S1. The results showed that the difference in water quality among the three sites was not significant and remained within the normal range for K. alvarezii seaweed mariculture development.

  16. The effect of marketing expenses on car sales – an empirical analysis

    Directory of Open Access Journals (Sweden)

    Tudose Mihaela Brînduşa

    2017-01-01

    Full Text Available The paper empirically assesses the relationship between marketing expenditures and sales in a highly competitive industry, namely automotive, by analyzing the marketing spending of Automobile Dacia S.A. The first part of the paper presents the state of the art and discusses previous studies on the structure, dynamics and impact of marketing expenses, while the second part consists of an empirical analysis of Automobile Dacia S.A.'s marketing spending. The results of the study show that the company managed to increase its market share by adopting differentiated marketing for each geographical area. Although the research revealed that the percentage of sales allocated to marketing spending is relatively low (5-6%), the analysis of the cost per unit sold reveals a share of 3% for marketing spending.

  17. Life Cycle Inventory Analysis for a Small-Scale Trawl Fishery in Sendai Bay, Japan

    Directory of Open Access Journals (Sweden)

    Kazuhito Watanabe

    2016-04-01

    Full Text Available A reduced environmental burden, while maintaining high quality and low cost, has become an important factor for achieving sustainability in the fisheries sector. The authors performed life cycle inventory (LCI analysis targeting the fish production for a small-scale trawl fishery including small trawlers operating in Sendai Bay, Japan. The average annual cumulative CO2 emissions for the small trawlers were 4.7 ton-CO2/ton-product and 8.3 ton-CO2/million Japanese yen (JPN. Total fuel consumption contributed to 97% of the global warming potential. The range of variation in the basic unit of CO2 for each small trawler was also elucidated. Energy conservation through lower fuel consumption is shown to be an effective measure for reducing CO2 in a small trawler fishery. Moreover, the authors examined the system boundary, the determination of the functional unit, and the allocation method of applying LCI analysis to fisheries. Finally, the economy and environment of small trawler fisheries are discussed as important factors for sustainable fisheries, and the life cycle approach is applied to a new fishery type in Japan.

  18. Seismic analysis and design of spent subassembly storage bay (SSSB) pool

    International Nuclear Information System (INIS)

    Abdul Gani, H.I.; Ramanjaneyulu, K.V.S.; Pillai, C.S.; Chetal, S.C.

    2003-01-01

    Fuel bundles, after their specified stay in the reactor core, are replaced by fresh fuel to sustain power generation at rated levels. The irradiated fuel subassembly removed from the core, known as a spent fuel subassembly, is radioactive and generates decay heat. It needs to be cooled before it becomes amenable to handling, either for reprocessing or for immobilisation. For this purpose, it is immersed in a pool of water retained in a concrete structure referred to as the Spent Subassembly Storage Bay (SSSB) pool. The height of the water column above the fuel bundles is arrived at from shielding considerations. The SSSB pool is one of the nuclear-safety-related structures and warrants rigorous analysis and design. The SSSB pool, in the case of PFBR 500 MW(e), is located in the fuel building. It is a stainless-steel-lined, water-retaining rectangular R.C.C. open tank of size 7.5 x 29.0 m, with a height of 11.0 m. This structure is analysed for two levels of site-specific earthquakes, taking into account liquid-structure interaction as per ASCE-4, 1998. The design of the walls and bottom slab is carried out satisfying the AERB code for nuclear-safety-related structures. The analysis and design of the SSSB pool of PFBR are presented in this paper. (author)

  19. Empowering Kanban through TPS-Principles - An Empirical Analysis of the Toyota Production System

    OpenAIRE

    Thun , Jörn-Henrik; Drüke , Martin; Grübner , Andre

    2010-01-01

    Abstract The purpose of this paper is the empirical investigation of the Toyota Production System in order to test existing relationships as they are proposed in theory. The underlying model consists of seven factors reflecting the key practices of the Toyota Production System. Using data from 188 manufacturing plants participating in the High Performance Manufacturing research project, the model's measurement characteristics were validated through confirmatory factor analysis. Pat...

  20. Risk and Protective Factors of Internet Addiction: A Meta-Analysis of Empirical Studies in Korea

    OpenAIRE

    Koo, Hoon Jung; Kwon, Jung-Hye

    2014-01-01

    Purpose A meta-analysis of empirical studies performed in Korea was conducted to systematically investigate the associations between the indices of Internet addiction (IA) and psychosocial variables. Materials and Methods Systematic literature searches were carried out using the Korean Studies Information Service System, Research Information Sharing Service, Science Direct, Google Scholar, and references in review articles. The key words were Internet addiction, (Internet) game addiction, and...

  1. Empirical evidence from an inter-industry descriptive analysis of overall materiality measures

    OpenAIRE

    N. Pecchiari; C. Emby; G. Pogliani

    2013-01-01

    This study presents an empirical cross-industry descriptive analysis of overall quantitative materiality measures. We examine the behaviour of four commonly used quantitative materiality measures within and across industries with respect to their size, relative size and stability, over ten years. The sample consists of large- and medium-sized European companies, representing 24 different industry categories for the years 1998 through 2007 (a total sample of over 36,000 data points). Our resul...

  2. Characterizing Social Interaction in Tobacco-Oriented Social Networks: An Empirical Analysis

    OpenAIRE

    Liang, Yunji; Zheng, Xiaolong; Zeng, Daniel Dajun; Zhou, Xingshe; Leischow, Scott James; Chung, Wingyan

    2015-01-01

    Social media is becoming a new battlefield for tobacco "wars". Evaluating the current situation is crucial for the advocacy of tobacco control in the age of social media. To reveal the impact of tobacco-related user-generated content, this paper characterizes user interaction and social influence utilizing social network analysis and information-theoretic approaches. Our empirical studies demonstrate that the exploding pro-tobacco content has long-lasting effects, with more active users a...

  3. Establishment of Grain Farmers' Supply Response Model and Empirical Analysis under Minimum Grain Purchase Price Policy

    OpenAIRE

    Zhang, Shuang

    2012-01-01

    Based on farmers' supply behavior theory and price expectations theory, this paper establishes grain farmers' supply response model of two major grain varieties (early indica rice and mixed wheat) in the major producing areas, to test whether the minimum grain purchase price policy can have price-oriented effect on grain production and supply in the major producing areas. Empirical analysis shows that the minimum purchase price published annually by the government has significant positive imp...

  4. An ACE-based Nonlinear Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Nielsen, Allan Aasbjerg; Andersen, Ole

    2001-01-01

    This paper shows the application of the empirical orthogonal functions/principal component transformation on global sea surface height and temperature data from 1996 and 1997. A nonlinear correlation analysis of the transformed data is proposed and performed by applying the alternating conditional expectations algorithm. New canonical variates are found that indicate that the highest correlation between ocean temperature and height is associated with the build-up of the El Niño during the last half of 1997.
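    The linear EOF/principal-component step can be sketched as an SVD of the space-time anomaly matrix. This toy example uses a synthetic field with one dominant mode; the paper's nonlinear ACE step is not reproduced.

```python
import numpy as np

# EOF/PCA of a toy space-time field via SVD of the anomaly matrix.
# The synthetic field has one dominant spatial mode plus noise.
rng = np.random.default_rng(2)
t = np.linspace(0, 4 * np.pi, 120)            # 120 time steps
pattern = np.sin(np.linspace(0, np.pi, 30))   # one spatial mode over 30 grid points
field = np.outer(np.sin(t), pattern) + 0.05 * rng.normal(size=(120, 30))

anom = field - field.mean(axis=0)             # remove the time mean at each grid point
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
eof1 = Vt[0]                                  # leading spatial pattern (EOF 1)
explained = s**2 / np.sum(s**2)               # variance fraction per mode
```

    The rows of Vt are the spatial EOF patterns and the columns of U (scaled by s) are the corresponding principal-component time series; the ACE algorithm would then seek nonlinear transformations of these variates that maximize correlation.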

  5. An Empirical Study Based on the SPSS Variance Analysis of College Teachers' Sports Participation and Satisfaction

    OpenAIRE

    Yunqiu Liang

    2013-01-01

    This empirical study examines the relationship between college teachers' sports participation and their job satisfaction, based on survey data about participation in sports activities analyzed with SPSS variance analysis. The results show that teachers who participate in sports report higher job satisfaction than those who do not, and that job satisfaction also differs with the form of sports participation. Recommendations for college teachers to address...

  6. An empirical analysis of the relationship between the consumption of alcohol and liver cirrhosis mortality

    DEFF Research Database (Denmark)

    Bentzen, Jan Børsen; Smith, Valdemar

    The question of whether intake of alcohol is associated with liver cirrhosis mortality is analyzed using aggregate data on alcohol consumption, alcohol-related diseases and alcohol policies of 16 European countries. The empirical analysis supports a close association between cirrhosis mortality and intake of alcohol - and the latter also holds for each of the specific beverages, i.e. spirits, wine and beer, whereas other studies usually only find evidence of spirits and wine being related to liver cirrhosis mortality. ...

  7. Fouha Bay Moving Window Analysis, Benthic Quadrat Surveys at Guam in 2014

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — PIRO Fishery Biologist gathered benthic cover data using a 1m2 quadrat with 25 intersecting points every five meters along a transect running from the inner bay to...

  8. Analysis of Level of Technogenic Impact on Water Area of Uglovoy Bay

    Science.gov (United States)

    Petukhov, V. I.; Petrova, E. A.; Losev, O. V.

    2017-11-01

    Industrial effluent discharge and man-induced soil fills play a decisive role in increased pollutant concentrations. Several areas which are unfavorable in terms of the heavy metal and oil product content have been identified by the environmental monitoring results in the Uglovoy Bay in February 2015. Maximum permissible concentrations (MPC) of heavy metals and oil products were exceeded in the northeastern part of the Uglovoy Bay in locations where the Peschanka River and the Aerodromnaya River drain into the sea. Integral heavy-metal index calculations showed that this area is the most polluted in the Uglovoy Bay. Other significantly polluted areas were identified off the Zima Yuzhnaya settlement in the mouth of the bay and in vicinity of the low-level bridge.

  9. Transcriptome analysis deciphers evolutionary mechanisms underlying genetic differentiation between coastal and offshore anchovy populations in the Bay of Biscay

    KAUST Repository

    Montes, Iratxe; Zarraonaindia, Iratxe; Iriondo, Mikel; Grant, W. Stewart; Manzano, Carmen; Cotano, Unai; Conklin, Darrell; Irigoien, Xabier; Estonba, Andone

    2016-01-01

    Morphometry and otolith microchemistry point to the existence of two populations of the European anchovy (Engraulis encrasicolus) in the Bay of Biscay: one in open seawaters, and a yet unidentified population in coastal waters. To test this hypothesis, we assembled a large number of samples from the region, including 587 juveniles and spawning adults from offshore and coastal waters, and 264 fish from other locations covering most of the species’ European range. These samples were genotyped for 456 exonic SNPs that provide a robust way to decipher adaptive processes in these populations. Two genetically differentiated populations of anchovy inhabit the Bay of Biscay with different population dynamics: (1) a large offshore population associated with marine waters included in the wide-shelf group, and (2) a coastal metapopulation adapted to estuarine environments in the Bay of Biscay and North Sea included in the narrow-shelf group. Transcriptome analysis identified neutral and adaptive evolutionary processes underlying differentiation between these populations. Reduced gene flow between offshore and coastal populations in the Bay of Biscay appears to result from divergence between two previously isolated gene pools adapted to contrasting habitats and now in secondary contact. Eleven molecular markers appear to mark divergent selection between the ecotypes, and a majority of these markers are associated with salinity variability. Ecotype differences at two outlier genes, TSSK6 and basigin, may hinder gamete compatibility between the ecotypes and reinforce reproductive isolation. Additionally, possible convergent evolution between offshore and coastal populations in the Bay of Biscay has been detected for the syntaxin1B-otoferlin gene system, which is involved in the control of larval buoyancy. Further study of exonic markers opens the possibility of understanding the mechanisms of adaptive divergence between European anchovy populations. © 2016, Springer

  11. Analysis of a viral metagenomic library from 200 m depth in Monterey Bay, California constructed by direct shotgun cloning

    OpenAIRE

    Preston Christina M; Steward Grieg F

    2011-01-01

    Abstract Background Viruses have a profound influence on both the ecology and evolution of marine plankton, but the genetic diversity of viral assemblages, particularly those in deeper ocean waters, remains poorly described. Here we report on the construction and analysis of a viral metagenome prepared from below the euphotic zone in a temperate, eutrophic bay of coastal California. Methods We purified viruses from approximately one cubic meter of seawater collected from 200m depth in Montere...

  12. Optimization of maintenance programme at Daya Bay Nuclear Power Station based on RCM analysis results

    International Nuclear Information System (INIS)

    Liu Min

    2003-01-01

    This paper begins with an introduction to Guangdong Daya Bay Nuclear Power Station (GNPS) and a brief introduction to its operations and maintenance documentation system. It then reviews the maintenance program guidelines and the associated problems prior to the application of reliability centered maintenance (RCM). The paper shows how RCM was implemented at GNPS, how the results of RCM analysis were used to optimize the maintenance program and test program, and what the interface is between RCM and the existing maintenance program. The successful implementation of RCM at GNPS resulted in the following changes: a new understanding of equipment failure challenged operations and maintenance beliefs; maintenance concepts underwent a large change; the maintenance program and periodic testing program are continuously modified and optimized; new on-condition maintenance technologies were introduced; non-productive scheduled overhauls were discarded; maintenance costs are effectively controlled; maintenance appropriateness has improved; and management of hidden failures is more effective and timely. The paper shows the benefit of greater equipment reliability brought about by all of these changes, which in turn increases the reliability and safety of the entire power station. (author)

  13. Morphological variation and phylogenetic analysis of the dinoflagellate Gymnodinium aureolum from a tributary of Chesapeake Bay.

    Science.gov (United States)

    Tang, Ying Zhong; Egerton, Todd A; Kong, Lesheng; Marshall, Harold G

    2008-01-01

    Cultures of four strains of the dinoflagellate Gymnodinium aureolum (Hulburt) G. Hansen were established from the Elizabeth River, a tidal tributary of the Chesapeake Bay, USA. Light microscopy, scanning electron microscopy, nuclear-encoded large sub-unit rDNA sequencing, and culturing observations were conducted to further characterize this species. Observations of morphology included: a multiple structured apical groove; a peduncle located between the emerging points of the two flagella; pentagonal and hexagonal vesicles on the amphiesma; production and germination of resting cysts; variation in the location of the nucleus within the center of the cell; a longitudinal ventral concavity; and considerable variation in cell width/length and overall cell size. A fish bioassay using juvenile sheepshead minnows detected no ichthyotoxicity from any of the strains over a 48-h period. Molecular analysis confirmed the dinoflagellate was conspecific with G. aureolum strains from around the world, and formed a cluster along with several other Gymnodinium species. Morphological evidence suggests that further research is necessary to examine the relationship between G. aureolum and a possibly closely related species Gymnodinium maguelonnense.

  14. Calibrating a combined energy systems analysis and controller design method with empirical data

    International Nuclear Information System (INIS)

    Murphy, Gavin Bruce; Counsell, John; Allison, John; Brindley, Joseph

    2013-01-01

    The drive towards low carbon constructions has seen buildings increasingly utilise many different energy systems simultaneously to control the human comfort of the indoor environment; such as ventilation with heat recovery, various heating solutions and applications of renewable energy. This paper describes a dynamic modelling and simulation method (IDEAS – Inverse Dynamics based Energy Assessment and Simulation) for analysing the energy utilisation of a building and its complex servicing systems. The IDEAS case study presented in this paper is based upon small perturbation theory and can be used for the analysis of the performance of complex energy systems and also for the design of smart control systems. This paper presents a process of how any dynamic model can be calibrated against a more empirical based data model, in this case the UK Government's SAP (Standard Assessment Procedure). The research targets of this work are building simulation experts for analysing the energy use of a building and also control engineers to assist in the design of smart control systems for dwellings. The calibration process presented is transferable and has applications for simulation experts to assist in calibrating any dynamic building simulation method with an empirical based method. - Highlights: • Presentation of an energy systems analysis method for assessing the energy utilisation of buildings and their complex servicing systems. • An inverse dynamics based controller design method is detailed. • Method of how a dynamic model can be calibrated with an empirical based model

  15. An empirical analysis of the relationship between cost of control activities and project management success

    Directory of Open Access Journals (Sweden)

    Al-Tmeemy Samiaah

    2018-01-01

    Full Text Available To achieve the objectives of continuous improvement programs, construction managers must link the achievement of quality with cost. This paper aims to associate project management success (PMS) with the cost of control (COC) activities in an empirical manner; the purpose is to determine the extent to which COC activities impact PMS. A quantitative method was adopted to collect data from Malaysian building companies using postal and email surveys. The hypothesis is tested using correlation and simple linear regression analysis. The findings of this study indicate that COC activities are positively associated with PMS. The empirical evidence obtained from this research provides financial justification for all quality improvement efforts. This can assist building contractors to enhance the success of project management by reducing the level of business failures due to poor quality, cost overruns, and delays.

  16. Tools for Empirical and Operational Analysis of Mobile Offloading in Loop-Based Applications

    Directory of Open Access Journals (Sweden)

    Alexandru-Corneliu OLTEANU

    2013-01-01

    Full Text Available Offloading for mobile devices is an increasingly popular research topic, matching the popularity mobile devices have in the general population. Studying mobile offloading is challenging because of device and application heterogeneity. However, we believe that focusing on a specific type of application can bring advances in offloading for mobile devices while still keeping a wide range of applicability. In this paper we focus on loop-based applications, in which most of the functionality is given by iterating an execution loop. We model the main loop of the application with a graph that consists of a cycle and propose an operational analysis to study offloading on this model. We also propose a testbed based on a real-world application to empirically evaluate offloading. We conduct performance evaluation using both tools and compare the analytical and empirical results.

  17. Joint production and corporate pricing: An empirical analysis of joint products in the petroleum industry

    International Nuclear Information System (INIS)

    Karimnejad, H.

    1990-01-01

    This dissertation investigates the pricing mechanism of joint products in large multi-plant and multi-product corporations. The primary objective of this dissertation is to show the consistency of classical theories of production with corporate pricing of joint products. This dissertation has two major parts. Part One provides a theoretical framework for joint production and corporate pricing. In this part, joint production is defined and its historical treatment by classical and contemporary economists is analyzed. Part Two conducts an empirical analysis of joint products in the US petroleum industry. Methods of cost allocation are used in the pricing of each individual petroleum product. Three methods are employed to distribute joint production costs to individual petroleum products: the sales value method, the barrel gravity method, and the average unit cost method. The empirical findings of the dissertation provide useful guidelines for pricing policies of large multi-product corporations

  18. Canonical Least-Squares Monte Carlo Valuation of American Options: Convergence and Empirical Pricing Analysis

    Directory of Open Access Journals (Sweden)

    Xisheng Yu

    2014-01-01

    Full Text Available The paper by Liu (2010) introduces a method termed the canonical least-squares Monte Carlo (CLM), which combines a martingale-constrained entropy model and a least-squares Monte Carlo algorithm to price American options. In this paper, we first provide the convergence results of CLM and numerically examine the convergence properties. Then, a comparative analysis is empirically conducted using a large sample of S&P 100 Index (OEX) puts and IBM puts. The results on convergence show that choosing the shifted Legendre polynomials with four regressors is more appropriate considering pricing accuracy and computational cost. With this choice, the CLM method is empirically demonstrated to be superior to the benchmark methods of binomial tree and finite difference with historical volatilities.

  19. Regulatory reforms and productivity: An empirical analysis of the Japanese electricity industry

    International Nuclear Information System (INIS)

    Nakano, Makiko; Managi, Shunsuke

    2008-01-01

    The Japanese electricity industry has experienced regulatory reforms since the mid-1990s. This article measures productivity in Japan's steam power-generation sector and examines the effect of reforms on the productivity of this industry over the period 1978-2003. We estimate the Luenberger productivity indicator, which is a generalization of the commonly used Malmquist productivity index, using a data envelopment analysis approach. Factors associated with productivity change are investigated through dynamic generalized method of moments (GMM) estimation of panel data. Our empirical analysis shows that the regulatory reforms have contributed to productivity growth in the steam power-generation sector in Japan

  20. A Price Index Model for Road Freight Transportation and Its Empirical analysis in China

    Directory of Open Access Journals (Sweden)

    Liu Zhishuo

    2017-01-01

    Full Text Available The aim of a price index for road freight transportation (RFT) is to reflect changes of price in the road transport market. First, a price index model for RFT is built using sample data from the Alibaba logistics platform. The model is a three-level index system comprising a total index, classification indices and individual indices, and the Laspeyres method is applied to calculate these indices. Finally, an empirical analysis of the price index for the RFT market in Zhejiang Province is performed. To demonstrate the correctness and validity of the index model, a comparative analysis with port throughput and the PMI index is carried out.
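    The Laspeyres aggregation named above can be sketched as follows; all category names, prices and quantities below are hypothetical, not the paper's Zhejiang data:

```python
def laspeyres(p0, q0, pt):
    """Laspeyres price index: current prices weighted by base-period
    quantities, 100 * sum(pt*q0) / sum(p0*q0)."""
    base = sum(p0[i] * q0[i] for i in p0)
    current = sum(pt[i] * q0[i] for i in p0)
    return 100.0 * current / base

# Hypothetical freight categories: base-period prices and quantities,
# plus current-period prices (e.g. yuan per tonne-km, shipped tonnes).
p0 = {"parcel": 10.0, "bulk": 8.0}
q0 = {"parcel": 100, "bulk": 50}
pt = {"parcel": 11.0, "bulk": 8.4}
print(round(laspeyres(p0, q0, pt), 1))  # → 108.6
```

    In a three-level system, individual indices computed this way would be weighted up into classification indices and a total index with the same base-period-quantity weights.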

  1. Fast multidimensional ensemble empirical mode decomposition for the analysis of big spatio-temporal datasets.

    Science.gov (United States)

    Wu, Zhaohua; Feng, Jiaxin; Qiao, Fangli; Tan, Zhe-Min

    2016-04-13

    In this big data era, it is more urgent than ever to solve two major issues: (i) fast data transmission methods that can facilitate access to data from non-local sources and (ii) fast and efficient data analysis methods that can reveal the key information from the available data for particular purposes. Although approaches in different fields to address these two questions may differ significantly, the common part must involve data compression techniques and a fast algorithm. This paper introduces the recently developed adaptive and spatio-temporally local analysis method, namely the fast multidimensional ensemble empirical mode decomposition (MEEMD), for the analysis of a large spatio-temporal dataset. The original MEEMD uses ensemble empirical mode decomposition to decompose time series at each spatial grid and then pieces together the temporal-spatial evolution of climate variability and change on naturally separated timescales, which is computationally expensive. By taking advantage of the high efficiency of the expression using principal component analysis/empirical orthogonal function analysis for spatio-temporally coherent data, we design a lossy compression method for climate data to facilitate its non-local transmission. We also explain the basic principles behind the fast MEEMD through decomposing principal components instead of original grid-wise time series to speed up computation of MEEMD. Using a typical climate dataset as an example, we demonstrate that our newly designed methods can (i) compress data at a compression rate of one to two orders of magnitude; and (ii) speed up the MEEMD algorithm by one to two orders of magnitude. © 2016 The Authors.
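    The PCA/EOF compression step underlying the fast MEEMD can be sketched on a synthetic field; the EEMD decomposition itself (which would be applied to the retained principal components rather than to every grid-point series) is omitted, and the field, mode count and sizes are all assumptions:

```python
import numpy as np

# Toy spatio-temporal field: 200 time steps x 500 grid points, one
# coherent oscillation plus weak noise (all values synthetic).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 20.0, 200)
field = (np.sin(t)[:, None] * rng.standard_normal(500)
         + 0.1 * rng.standard_normal((200, 500)))

# PCA/EOF compression: keep only the leading modes of the anomaly field.
mean = field.mean(axis=0)
U, s, Vt = np.linalg.svd(field - mean, full_matrices=False)
k = 5                              # number of retained modes (assumed)
pcs = U[:, :k] * s[:k]             # temporal principal components (200 x k)
eofs = Vt[:k]                      # spatial patterns / EOFs (k x 500)
recon = pcs @ eofs + mean

# The fast-MEEMD idea: run (E)EMD on the k principal components instead
# of on all 500 grid-point series, then map the decomposed modes back
# through the EOFs.
compression = field.size / (pcs.size + eofs.size + mean.size)
print(round(compression, 1))  # → 25.0
```

    Because the field is dominated by a few coherent modes, the truncated reconstruction stays close to the original while storing far fewer numbers, which is the source of both the compression rate and the speed-up.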

  2. Tracking polychlorinated biphenyls (PCBs) congener patterns in Newark Bay surface sediment using principal component analysis (PCA) and positive matrix factorization (PMF).

    Science.gov (United States)

    Saba, Tarek; Su, Steave

    2013-09-15

    PCB congener data for Newark Bay surface sediments were analyzed using PCA and PMF, and relationships between the outcomes from these two techniques were explored. The PCA scores plot separated the Lower Passaic River Mouth samples from North Newark Bay, thus indicating dissimilarity. Although PCA was able to identify subareas in the Bay system with specific PCB congener patterns (e.g., higher chlorinated congeners in Elizabeth River), further conclusions regarding potential PCB source profiles or potential upland source areas were not clear from the PCA scores plot. PMF identified five source factors, and explained the Bay sample congener profiles as a mix of these factors. This PMF solution was equivalent to (1) defining an envelope that encompasses all samples on the PCA scores plot, (2) defining source factors that plot on that envelope, and (3) explaining the congener profile for each Bay sediment sample (inside the scores plot envelope) as a mix of factors. PMF analysis identified characteristic features in the source factor congener distributions that allowed tracking of source factors to shoreline areas where PCB inputs to the Bay may have originated. The combined analysis from PCA and PMF showed that direct discharges to the Bay are likely the dominant sources of PCBs to the sediment. Review of historical upland activities and regulatory files will be needed, in addition to the PCA and PMF analysis, to fully reconstruct the history of operations and PCB releases around the Newark Bay area that impacted the Bay sediment. Copyright © 2013 Elsevier B.V. All rights reserved.
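    The factorization at the heart of PMF can be illustrated with a minimal nonnegative matrix factorization via Lee–Seung multiplicative updates; this is a simplified stand-in (EPA PMF additionally weights residuals by measurement uncertainties), and the profiles and sample matrix below are synthetic, not the Newark Bay data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy "samples x congeners" matrix built from two hypothetical source
# profiles; each sample is a nonnegative mix of the profiles plus noise.
profiles = np.array([[0.6, 0.3, 0.1, 0.0],
                     [0.0, 0.1, 0.4, 0.5]])
contrib = rng.random((30, 2))            # per-sample source contributions
X = contrib @ profiles + 0.01 * rng.random((30, 4))

# PMF-style factorization X ~= W @ H with nonnegativity constraints,
# via Lee-Seung multiplicative updates.
k = 2
W, H = rng.random((30, k)), rng.random((k, 4))
for _ in range(500):
    H *= (W.T @ X) / (W.T @ W @ H + 1e-12)
    W *= (X @ H.T) / (W @ H @ H.T + 1e-12)

# Small relative residual: the two recovered factors explain the data.
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(round(err, 3))
```

    The rows of `H` play the role of the source-factor congener profiles, and `W` the per-sample factor contributions that the study maps back to shoreline areas.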

  3. Lagrangian structure of flows in the Chesapeake Bay: challenges and perspectives on the analysis of estuarine flows

    Directory of Open Access Journals (Sweden)

    M. Branicki

    2010-03-01

    Full Text Available In this work we discuss applications of Lagrangian techniques to study transport properties of flows generated by shallow water models of estuarine flows. We focus on the flow in the Chesapeake Bay generated by Quoddy (see Lynch and Werner, 1991), a finite-element (shallow water) model adapted to the bay by Gross et al. (2001). The main goal of this analysis is to outline the potential benefits of using Lagrangian tools for both understanding transport properties of such flows, and for validating the model output and identifying model deficiencies. We argue that the currently available 2-D Lagrangian tools, including the stable and unstable manifolds of hyperbolic trajectories and techniques exploiting 2-D finite-time Lyapunov exponent fields, are of limited use in the case of partially mixed estuarine flows. A further development and efficient implementation of three-dimensional Lagrangian techniques, as well as improvements in the shallow-water modelling of 3-D velocity fields, are required for reliable transport analysis in such flows. Some aspects of the 3-D trajectory structure in the Chesapeake Bay, based on the Quoddy output, are also discussed.

  4. Biofuels policy and the US market for motor fuels: Empirical analysis of ethanol splashing

    Energy Technology Data Exchange (ETDEWEB)

    Walls, W.D., E-mail: wdwalls@ucalgary.ca [Department of Economics, University of Calgary, 2500 University Drive NW, Calgary, Alberta, T2N 1N4 (Canada); Rusco, Frank; Kendix, Michael [US GAO (United States)

    2011-07-15

    Low ethanol prices relative to the price of gasoline blendstock, and tax credits, have resulted in discretionary blending at wholesale terminals of ethanol into fuel supplies above required levels-a practice known as ethanol splashing in industry parlance. No one knows precisely where or in what volume ethanol is being blended with gasoline and this has important implications for motor fuels markets: Because refiners cannot perfectly predict where ethanol will be blended with finished gasoline by wholesalers, they cannot know when to produce and where to ship a blendstock that when mixed with ethanol at 10% would create the most economically efficient finished motor gasoline that meets engine standards and has comparable evaporative emissions as conventional gasoline without ethanol blending. In contrast to previous empirical analyses of biofuels that have relied on highly aggregated data, our analysis is disaggregated to the level of individual wholesale fuel terminals or racks (of which there are about 350 in the US). We incorporate the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal. The empirical analysis illustrates how ethanol and gasoline prices affect ethanol usage, controlling for fuel specifications, blend attributes, and city-terminal-specific effects that, among other things, control for differential costs of delivering ethanol from bio-refinery to wholesale rack. - Research Highlights: > Low ethanol prices and tax credits have resulted in discretionary blending of ethanol into fuel supplies above required levels. > This has important implications for motor fuels markets and vehicular emissions. > Our analysis incorporates the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal.

  5. Biofuels policy and the US market for motor fuels: Empirical analysis of ethanol splashing

    International Nuclear Information System (INIS)

    Walls, W.D.; Rusco, Frank; Kendix, Michael

    2011-01-01

    Low ethanol prices relative to the price of gasoline blendstock, and tax credits, have resulted in discretionary blending at wholesale terminals of ethanol into fuel supplies above required levels-a practice known as ethanol splashing in industry parlance. No one knows precisely where or in what volume ethanol is being blended with gasoline and this has important implications for motor fuels markets: Because refiners cannot perfectly predict where ethanol will be blended with finished gasoline by wholesalers, they cannot know when to produce and where to ship a blendstock that when mixed with ethanol at 10% would create the most economically efficient finished motor gasoline that meets engine standards and has comparable evaporative emissions as conventional gasoline without ethanol blending. In contrast to previous empirical analyses of biofuels that have relied on highly aggregated data, our analysis is disaggregated to the level of individual wholesale fuel terminals or racks (of which there are about 350 in the US). We incorporate the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal. The empirical analysis illustrates how ethanol and gasoline prices affect ethanol usage, controlling for fuel specifications, blend attributes, and city-terminal-specific effects that, among other things, control for differential costs of delivering ethanol from bio-refinery to wholesale rack. - Research highlights: → Low ethanol prices and tax credits have resulted in discretionary blending of ethanol into fuel supplies above required levels. → This has important implications for motor fuels markets and vehicular emissions. → Our analysis incorporates the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal.

  6. Stable-isotope analysis of canvasback winter diet in upper Chesapeake Bay

    Science.gov (United States)

    Haramis, G.M.; Jorde, Dennis G.; Macko, S.A.; Walker, J.L.

    2001-01-01

    A major decline in submerged aquatic vegetation (SAV) in Chesapeake Bay has altered the diet of wintering Canvasbacks (Aythya valisineria) from historically plant to a combination of benthic animal foods, especially the ubiquitous Baltic clam (Macoma balthica), supplemented with anthropogenic corn (Zea mays). Because the isotopic signature of corn is readily discriminated from bay benthos, but not SAV, we used stable-isotope methodology to investigate the corn–SAV component of the winter diet of Canvasbacks. Feeding trials with penned Canvasbacks were conducted to establish turnover rates and fractionation end-point loci of δ13C and δ15N signatures of whole blood for individual ducks fed ad libitum diets of (1) Baltic clams, (2) Baltic clams and corn, and (3) tubers of wild celery (Vallisneria americana). Turnover time constants averaged 4.5 weeks, indicating that signatures of wild ducks would be representative of bay diets by late February. Isotopic signatures of wild Canvasbacks sampled in February fell on a continuum between end-point loci for the Baltic clam and the combination Baltic clam and corn diet. Although that finding verifies a clear dependence on corn–SAV for wintering Canvasbacks, it also reveals that not enough corn–SAV is available to establish ad libitum consumption for the 15,000+ Canvasbacks wintering in the upper bay. On the basis of mean δ13C signature of bay Canvasbacks (n = 59) and ingestion rates from feeding trials, we estimated that 258 kg corn per day would account for the observed δ13C enrichment and supply 18% of daily energetic needs for 15,000 Canvasbacks. That level of corn availability is so realistic that we conclude that SAV is likely of little dietary importance to Canvasbacks in that portion of the bay.

  7. The demand for gasoline in South Africa. An empirical analysis using co-integration techniques

    International Nuclear Information System (INIS)

    Akinboade, Oludele A.; Ziramba, Emmanuel; Kumo, Wolassa L.

    2008-01-01

    Using the recently developed Autoregressive Distributed Lag (ARDL) bound testing approach to co-integration, suggested by Pesaran et al. (Pesaran, M.H., Shin, Y., Smith, R.J. Bounds Testing Approaches to the Analysis of Level Relationships. Journal of Applied Econometrics 2001; 16(3): 289-326), we empirically analyzed the long-run relationship among the variables in the aggregate gasoline demand function over the period 1978-2005. Our study confirms the existence of a co-integrating relationship. The estimated price and income elasticities of -0.47 and 0.36 imply that gasoline demand in South Africa is price and income inelastic. (author)

  8. Federalism and regional health care expenditures: an empirical analysis for the Swiss cantons.

    Science.gov (United States)

    Crivelli, Luca; Filippini, Massimo; Mosca, Ilaria

    2006-05-01

    Switzerland (7.2 million inhabitants) is a federal state composed of 26 cantons. The autonomy of cantons and a particular health insurance system create strong heterogeneity in terms of regulation and organisation of health care services. In this study we use a single-equation approach to model the per capita cantonal expenditures on health care services and postulate that per capita health expenditures depend on some economic, demographic and structural factors. The empirical analysis demonstrates that a larger share of old people tends to increase health costs and that physicians paid on a fee-for-service basis swell expenditures, thus highlighting a possible phenomenon of supply-induced demand.

  9. What is a Leading Case in EU law? An empirical analysis

    DEFF Research Database (Denmark)

    Sadl, Urska; Panagis, Yannis

    2015-01-01

    Lawyers generally explain legal development by looking at explicit amendments to statutory law and modifications in judicial practice. As far as the latter are concerned, leading cases occupy a special place. This article empirically studies the process in which certain cases become leading cases....... Our analysis focuses on Les Verts, a case of considerable fame in EU law, closely scrutinising whether it contains inherent leading case material. We show how the legal relevance of a case can become “embedded” in a long process of reinterpretation by legal actors, and we demonstrate that the actual...

  10. The demand for gasoline in South Africa. An empirical analysis using co-integration techniques

    Energy Technology Data Exchange (ETDEWEB)

    Akinboade, Oludele A.; Ziramba, Emmanuel; Kumo, Wolassa L. [Department of Economics, University of South Africa, P.O.Box 392, Pretoria 0003 (South Africa)

    2008-11-15

    Using the recently developed Autoregressive Distributed Lag (ARDL) bound testing approach to co-integration, suggested by Pesaran et al. (Pesaran, M.H., Shin, Y., Smith, R.J. Bounds Testing Approaches to the Analysis of Level Relationships. Journal of Applied Econometrics 2001; 16(3): 289-326), we empirically analyzed the long-run relationship among the variables in the aggregate gasoline demand function over the period 1978-2005. Our study confirms the existence of a co-integrating relationship. The estimated price and income elasticities of -0.47 and 0.36 imply that gasoline demand in South Africa is price and income inelastic. (author)

  11. Population density and efficiency in energy consumption: An empirical analysis of service establishments

    International Nuclear Information System (INIS)

    Morikawa, Masayuki

    2012-01-01

    This study, using novel establishment-level microdata from the Energy Consumption Statistics, empirically analyzes the effect of urban density on energy intensity in the service sector. According to the analysis, the efficiency of energy consumption in service establishments is higher in densely populated cities. Quantitatively, after controlling for differences among industries, energy efficiency increases by approximately 12% when municipal population density doubles. This result suggests that, given the structural transformation toward the service economy, deregulation of excessive restrictions hindering urban agglomeration and investment in infrastructure in city centers would contribute to environmentally friendly economic growth.

  12. Empirical Requirements Analysis for Mars Surface Operations Using the Flashline Mars Arctic Research Station

    Science.gov (United States)

    Clancey, William J.; Lee, Pascal; Sierhuis, Maarten; Norvig, Peter (Technical Monitor)

    2001-01-01

    Living and working on Mars will require model-based computer systems for maintaining and controlling complex life support, communication, transportation, and power systems. This technology must work properly on the first three-year mission, augmenting human autonomy, without adding yet more complexity to be diagnosed and repaired. One design method is to work with scientists in an analog (Mars-like) setting to understand how they prefer to work, what constraints will be imposed by the Mars environment, and how to ameliorate difficulties. We describe how we are using empirical requirements analysis to prototype model-based tools at a research station in the High Canadian Arctic.

  13. An Empirical Analysis of the Changing Role of the German Bundesbank after 1983

    DEFF Research Database (Denmark)

    Juselius, Katarina

    A cointegrated VAR model describing a small macroeconomic system consisting of money, income, prices, and interest rates is estimated on split sample data before and after 1983. The monetary mechanisms were found to be significantly different. Before 1983, the money supply seemed controllable, and expansion or contraction of money supply had the expected effect on prices, income, and interest rates. After 1983, the conventional mechanisms no longer seemed to work. The empirical analysis pointed to the crucial role of the bond rate in the system, particularly for the more recent period......

  14. Opinion mining feature-level using Naive Bayes and feature extraction based analysis dependencies

    Science.gov (United States)

    Sanda, Regi; Baizal, Z. K. Abdurahman; Nhita, Fhira

    2015-12-01

    The development of the internet and technology has had a major impact, giving rise to a new kind of business called e-commerce. Many e-commerce sites provide convenience in transactions, and consumers can also provide reviews or opinions on the products they purchased. These opinions can be used by both consumers and producers: consumers learn the advantages and disadvantages of particular features of a product, while producers can analyse the strengths and weaknesses of their own products as well as those of competitors. With so many opinions, a method is needed that lets the reader grasp the point of the opinions as a whole. The idea emerged from review summarization, which summarizes the overall opinion based on the sentiment and features it contains. In this study, the domain of main focus is digital cameras. The research consisted of four steps: 1) giving the system the knowledge to recognize the semantic orientation of an opinion; 2) identifying the features of the product; 3) identifying whether an opinion is positive or negative; 4) summarizing the result. The methods discussed include Naïve Bayes for sentiment classification and a feature extraction algorithm based on dependency analysis, one of the tools in Natural Language Processing (NLP), together with a knowledge-based dictionary that is useful for handling implicit features. The end result of the research is a summary that contains a set of consumer reviews organized by feature and sentiment. With the proposed method, the accuracy of sentiment classification reaches 81.2% for positive test data and 80.2% for negative test data, and the accuracy of feature extraction reaches 90.3%.
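    The Naïve Bayes sentiment-classification step can be sketched as a minimal multinomial model with add-one smoothing; the tiny camera-review corpus below is invented for illustration, and the paper's dependency-based feature extraction is not reproduced:

```python
import math
from collections import Counter, defaultdict

# Tiny hypothetical camera-review corpus (text, sentiment label).
train = [("great lens sharp photos", "pos"),
         ("battery life is great", "pos"),
         ("blurry photos poor battery", "neg"),
         ("poor lens terrible zoom", "neg")]

counts = defaultdict(Counter)   # per-class word counts
docs = Counter()                # per-class document counts
for text, label in train:
    docs[label] += 1
    counts[label].update(text.split())
vocab = {w for c in counts.values() for w in c}

def classify(text):
    """Multinomial Naive Bayes with Laplace (add-one) smoothing."""
    best, best_lp = None, -math.inf
    for label in docs:
        lp = math.log(docs[label] / sum(docs.values()))   # log prior
        total = sum(counts[label].values())
        for w in text.split():
            lp += math.log((counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

print(classify("sharp great zoom"))  # → pos
```

    A real feature-level system would first extract the product feature each opinion targets (the paper uses dependency analysis for this) and then run a classifier like the above per feature.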

  15. The incident of repetitive demands resolution in consumer affairs: empirical analysis of legal feasibility

    Directory of Open Access Journals (Sweden)

    Lucas do Monte Silva

    2017-05-01

    Full Text Available Faced with a scenario of mass litigation, this article analyzes the main arguments and questions raised in claims related to moral damages and health plans before Santa Catarina's Court of Justice, in order to assess the possible application of the incident of repetitive demands resolution of the new Civil Procedure Code. To do so, it first analyzes the current context of the Brazilian judiciary, presenting the context of repetitive demands and mass contracts together with introductory aspects of the incident of repetitive demands resolution. It then carries out an empirical judicial analysis, quantitative and qualitative, through a case study of Santa Catarina's Court of Justice, conducting a cross-sectional descriptive analysis of the claims related to the issue highlighted above, in order to produce an 'argumentative radiography' of the judgments of that Court. The results confirmed the possibility of applying the IRDR to repetitive demands on the subjects of this study, with due legal caution, taking into account the high number of 'issues of fact' involved in lawsuits that include, among their claims, compensation for moral damages.

  16. A Bayes linear Bayes method for estimation of correlated event rates.

    Science.gov (United States)

    Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim

    2013-12-01

    Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
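    The method-of-moments empirical prior mentioned above can be sketched for the simplest case of a gamma prior over homogeneous Poisson rates; this is a simplified stand-in for the paper's Bayes linear Bayes model (no homogenization factors, no correlation structure), and the event counts are hypothetical:

```python
# Hypothetical event counts for six comparable units, one exposure
# period each (not data from the paper).
counts = [1, 9, 2, 12, 3, 8]
n = len(counts)
mean = sum(counts) / n
var = sum((c - mean) ** 2 for c in counts) / (n - 1)

# For X | lam ~ Poisson(lam) and lam ~ Gamma(a, b):
#   E[X] = a/b  and  Var[X] = a/b + a/b**2,
# so matching moments gives a/b**2 = Var - E[X].
excess = max(var - mean, 1e-9)   # guard against under-dispersed samples
b = mean / excess
a = mean * b

def posterior_mean(x, t=1.0):
    """Posterior mean rate for a unit with x events over exposure t,
    via the conjugate update Gamma(a + x, b + t)."""
    return (a + x) / (b + t)

# A unit with 9 observed events is shrunk toward the pooled mean (~5.8).
print(round(posterior_mean(9), 2))  # → 8.07
```

    The shrinkage toward the pooled mean is the empirical Bayes effect; ignoring the between-unit information (as the abstract's supply-chain example warns) would leave the raw rate of 9 unadjusted.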

  17. Cause trending analysis for licensing operational events in Daya Bay Nuclear Power Station

    International Nuclear Information System (INIS)

    Wang Dewei

    2005-01-01

    The human causal factors for all human-error licensing operational events at Daya Bay Nuclear Power Station from 1993 to 2003 are categorized, and the trend of these causal factors is analyzed. Emphasis is placed on analyzing deficiencies in complying with and executing regulations and procedures. The results provide directional reference for the nuclear power station to improve human performance. (author)

  18. Analysis of timber and coating material on an iron anchor recovered off Aguada Bay, Goa

    Digital Repository Service at National Institute of Oceanography (India)

    Tripati, S.; Rao, B.R.; Shashikala, S.; Rao, R.V.; Khedekar, V.D.

    A shanked iron anchor measuring 3.30 m long with a 4.37 m wooden stock was recovered off Aguada Bay, Goa, at a water depth of 11 m. The anchor has been tentatively dated as contemporary with the maritime history of Goa and Portugal between the 16th and 17th...

  19. Analysis of Civilian Employee Attrition at the Naval Postgraduate School and Naval Support Activity - Monterey Bay

    National Research Council Canada - National Science Library

    Valverde, Xavier

    1997-01-01

    ...) and Naval Support Activity-Monterey Bay (NSA-MB) to determine what civilian non-faculty employee jobs are likely to be left vacant in the next three years due to attrition and to identify what training and skills will be needed by personnel whose...

  20. Minimizing the trend effect on detrended cross-correlation analysis with empirical mode decomposition

    International Nuclear Information System (INIS)

    Zhao Xiaojun; Shang Pengjian; Zhao Chuang; Wang Jing; Tao Rui

    2012-01-01

    Highlights: ► Investigate the effects of linear, exponential and periodic trends on DCCA. ► Apply empirical mode decomposition to extract the trend term. ► Strong and monotonic trends are successfully eliminated. ► Obtain the cross-correlation exponent in a persistent behavior without crossover. - Abstract: Detrended cross-correlation analysis (DCCA) is a scaling method commonly used to estimate long-range power-law cross-correlation in non-stationary signals. However, the susceptibility of DCCA to trends makes the scaling results difficult to analyze due to spurious crossovers. We artificially generate long-range cross-correlated signals and systematically investigate the effects of linear, exponential and periodic trends. To address the crossovers raised by trends, we apply the empirical mode decomposition method, which decomposes underlying signals into several intrinsic mode functions (IMFs) and a residual trend. After removal of the residual term, strong and monotonic trends such as linear and exponential trends are successfully eliminated. A periodic trend, however, cannot be separated out according to the IMF criterion; it can instead be eliminated by a Fourier transform. As a special case of DCCA, detrended fluctuation analysis presents similar results.
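    The DCCA fluctuation function itself can be sketched as follows; the EMD detrending step is omitted, the two signals are synthetic white-noise series sharing a common component, and the window sizes are arbitrary choices:

```python
import numpy as np

def dcca_cov(x, y, n):
    """Average detrended covariance F2(n) of the integrated profiles of
    x and y, using overlapping boxes of size n with linear detrending."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    t = np.arange(n)
    covs = []
    for i in range(len(X) - n + 1):
        xs, ys = X[i:i + n], Y[i:i + n]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)  # local detrending
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        covs.append(np.mean(rx * ry))
    return np.mean(covs)

rng = np.random.default_rng(2)
shared = rng.standard_normal(2000)            # common white-noise part
x = shared + 0.5 * rng.standard_normal(2000)
y = shared + 0.5 * rng.standard_normal(2000)

# F2(n) ~ n^(2*lambda); for white-noise-like signals lambda is near 0.5.
ns = np.array([8, 16, 32, 64, 128])
f2 = np.array([dcca_cov(x, y, n) for n in ns])
lam = np.polyfit(np.log(ns), np.log(f2), 1)[0] / 2
print(round(lam, 2))
```

    A strong additive trend in `x` or `y` would distort the log-log slope and create a spurious crossover, which is exactly what the EMD-based trend removal in the paper is designed to prevent.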

  1. Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry

    Science.gov (United States)

    Luthra, S.; Garg, D.; Haleem, A.

    2014-04-01

    Environmental sustainability and green environmental issues enjoy increasing popularity among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in the Indian automobile industry. Six main GSCM practices (comprising 37 sub-practices) and four expected performance outcomes (comprising 16 performance measures) of implementing GSCM practices have been identified from a literature review. A questionnaire-based survey was conducted to validate these practices and performance outcomes. 123 complete questionnaires were collected from Indian automobile organizations and used for the empirical analysis of GSCM practices in the Indian automobile industry. Descriptive statistics have been used to assess the current implementation status of GSCM practices, and multiple regression analysis has been carried out to assess the impact of the GSCM practices currently adopted by the Indian automobile industry on the expected organizational performance outcomes. The results of the study suggest that environmental, economic, social and operational performances improve with the implementation of GSCM practices. This paper may play an important role in understanding various GSCM implementation issues and help practicing managers to improve their performance in the supply chain.

  2. Partnership effectiveness in primary community care networks: A national empirical analysis of partners' coordination infrastructure designs.

    Science.gov (United States)

    Lin, Blossom Yen-Ju; Lin, Yung-Kai; Lin, Cheng-Chieh

    2010-01-01

    Previous empirical and managerial studies have ignored the effectiveness of integrated health networks. It has been argued that the varying definitions and strategic imperatives of integrated organizations may have complicated the assessment of the outcomes/performance of the various models, particularly when their market structures and contexts differ. This study aimed to empirically verify a theoretical perspective on the coordination infrastructure designs and the effectiveness of the primary community care networks (PCCNs) formed and funded by the Bureau of National Health Insurance since March 2003. The PCCNs present a model to replace the traditionally fragmented providers in Taiwan's health care. The study used a cross-sectional mailed survey designed to ascertain partnership coordination infrastructure and integration of governance, clinical care, bonding, finances, and information. The outcome indicators were PCCNs' perceived performance and willingness to remain within the network. Structural equation modeling examined the causal relationships, controlling for organizational and environmental factors. Primary data collection occurred from February through December 2005, via structured questionnaires sent to 172 PCCNs. Using the individual PCCN as the unit of analysis, the results found that a network's efforts regarding coordination infrastructures were positively related to the PCCN's perceived performance and willingness to remain within the network. In addition, PCCNs practicing in rural areas and in areas with a higher density of medical resources had better perceived effectiveness and willingness to cooperate in the network. Practical Implication: The lack of both an operational definition and information about system-wide integration may have obstructed understanding of integrated health networks' organizational dynamics. This study empirically examined individual PCCNs and offers new insights on how to improve networks' organizational design and

  3. Review of the human reliability analysis performed for Empire State Electric Energy Research Corporation

    International Nuclear Information System (INIS)

    Swart, D.; Banz, I.

    1985-01-01

    The Empire State Electric Energy Research Corporation (ESEERCO) commissioned Westinghouse to conduct a human reliability analysis to identify and quantify human error probabilities associated with operator actions for four specific events which may occur in light water reactors: loss of coolant accident, steam generator tube rupture, steam/feed line break, and stuck open pressurizer spray valve. Human Error Probabilities (HEPs) derived from Swain's Technique for Human Error Rate Prediction (THERP) were compared to data obtained from simulator exercises. A correlation was found between the HEPs derived from Swain and the results of the simulator data. The results of this study provide a unique insight into human factors analysis. The HEPs obtained from such probabilistic studies can be used to prioritize scenarios for operator training situations, and thus improve the correlation between simulator exercises and real control room experiences

  4. Decoupling Economic Growth and Energy Use. An Empirical Cross-Country Analysis for 10 Manufacturing Sectors

    Energy Technology Data Exchange (ETDEWEB)

    Mulder, P. [International Institute for Applied Systems Analysis, Laxenburg (Austria); De Groot, H.L.F. [Faculty of Economics and Business Administration, Vrije Universiteit, Amsterdam (Netherlands)

    2004-07-01

    This paper provides an empirical analysis of decoupling economic growth and energy use and its various determinants by exploring trends in energy- and labour productivity across 10 manufacturing sectors and 14 OECD countries for the period 1970-1997. We explicitly aim to trace back aggregate developments in the manufacturing sector to developments at the level of individual subsectors. A cross-country decomposition analysis reveals that in some countries structural changes contributed considerably to aggregate manufacturing energy-productivity growth and, hence, to decoupling, while in other countries they partly offset energy-efficiency improvements. In contrast, structural changes only play a minor role in explaining aggregate manufacturing labour-productivity developments. Furthermore, we find labour-productivity growth to be higher on average than energy-productivity growth. Over time, this bias towards labour-productivity growth is increasing in the aggregate manufacturing sector, while it is decreasing in most manufacturing subsectors.

  5. Denoising of chaotic signal using independent component analysis and empirical mode decomposition with circulate translating

    International Nuclear Information System (INIS)

    Wang Wen-Bo; Zhang Xiao-Dong; Chang Yuchan; Wang Xiang-Li; Wang Zhao; Chen Xi; Zheng Lei

    2016-01-01

    In this paper, a new method to reduce noise within chaotic signals based on ICA (independent component analysis) and EMD (empirical mode decomposition) is proposed. The basic idea is, firstly, to decompose chaotic signals and construct multidimensional input vectors on the basis of EMD and its translation invariance. Secondly, independent component analysis is performed on the input vectors, which means that an adaptive denoising is carried out for the intrinsic mode functions (IMFs) of the chaotic signals. Finally, all IMFs compose the new denoised chaotic signal. Experiments were carried out on the Lorenz chaotic signal contaminated with different levels of Gaussian noise and on the monthly observed chaotic sunspot sequence. The results show that the method proposed in this paper is effective in the denoising of chaotic signals. Moreover, it can effectively correct the center point in the phase space, making it approach the real track of the chaotic attractor. (paper)

  6. Linear and nonlinear determinants of the performance of informal venture capitalists’ investments. An empirical analysis

    Directory of Open Access Journals (Sweden)

    Vincenzo Capizzi

    2013-05-01

    Full Text Available This paper aims to identify and analyze the contribution of the major drivers of the performance of informal venture capitalists' investments. This study analyzes data on Italian transactions and personal features of Italian Business Angels gathered during 2007-2011 with the support of IBAN (Italian Business Angels Network). The econometric analysis investigates the returns of business angels' investments and their major determinants (industry, exit strategy, experience, holding period, rejection rate, and year of divestiture). The major results are the following: 1) differently from previous literature, the relationship between experience and IRR is quadratic and significant; 2) for the first time, quantitative data confirm that short holding periods (below 3 years) earn a lower IRR; 3) the rejection rate effect is logarithmic, and its impact on IRR is positive and significant. Finally, the outcomes of the empirical analysis performed in this study allow the identification of new and concrete insights on possible policy interventions.

  7. Updating an empirical analysis on the proton’s central opacity and asymptotia

    International Nuclear Information System (INIS)

    Fagundes, D A; Menon, M J; Silva, P V R G

    2016-01-01

    We present an updated empirical analysis on the ratio of the elastic (integrated) to the total cross section in the c.m. energy interval from 5 GeV to 8 TeV. As in a previous work, we use a suitable analytical parametrization for that ratio (depending on only four free fit parameters) and investigate three asymptotic scenarios: either the black disk limit or scenarios above or below that limit. The dataset includes now the datum at 7 TeV, recently reported by the ATLAS Collaboration. Our analysis favors, once more, a scenario below the black disk, providing an asymptotic ratio consistent with the rational value 1/3, namely a gray disk limit. Upper bounds for the ratio of the diffractive (dissociative) to the inelastic cross section are also presented. (paper)

  8. The Impact of Tourism on Economic Growth in the Western Balkan Countries: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Prof. Dr Nasir Selimi

    2017-06-01

    Full Text Available Purpose: The purpose of this research paper is to empirically analyse the effects of tourism on economic growth in the Western Balkan countries (Albania, Bosnia and Herzegovina, Croatia, FYROM, Montenegro and Serbia). Design/Methodology/Approach: The empirical analysis uses 17-year panel data for the 6 countries over the period 1998 to 2014. Several models are analysed using panel regression econometric techniques. The study investigates random and fixed effects, as well as individual heterogeneity across those countries. Also, the Hausman Taylor IV estimator is used as the most appropriate model for this analysis. The real income per capita of the sample countries is modelled as dependent on the lagged income per capita, tourist arrivals, tourism receipts, FDI stock, exports and government expenditures. Findings: The estimation results in all types of models indicate that tourism has a positive and significant impact on economic growth in the Western Balkan countries. The Hausman Taylor IV model suggests that for every 1% increase in tourist arrivals, output will increase by approximately 0.08%. Research limitations/implications: Although the Hausman Taylor IV model performs well, the results should be interpreted with caution. The analysis has its limitations: firstly, the total number of observations is relatively small for a panel regression analysis; secondly, the problem of endogeneity is not completely avoided. However, the study implies that these countries should enhance efforts for joint tourism sector policies to engender economic sustainability. Originality/Value: To the best of our knowledge, this is the first attempt at estimating the effects of tourism on economic growth in the Western Balkan countries using the Hausman Taylor IV model.

  9. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
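
    The paper's idea of reduced, feature-based representations can be illustrated with a toy sketch: summarize each series by a few interpretable properties and retrieve similar series by distance in feature space. The three features and the synthetic series below are assumptions for illustration, not the actual feature library used in the study.

```python
import numpy as np

def features(ts):
    """A toy reduced representation: mean, standard deviation, lag-1 autocorrelation."""
    ts = np.asarray(ts, dtype=float)
    z = ts - ts.mean()
    ac1 = np.dot(z[:-1], z[1:]) / np.dot(z, z)  # lag-1 autocorrelation
    return np.array([ts.mean(), ts.std(), ac1])

rng = np.random.default_rng(1)
t = np.arange(500)
smooth = np.sin(0.1 * t) + 0.1 * rng.standard_normal(500)  # strongly autocorrelated
noisy = rng.standard_normal(500)                           # near-white noise
query = np.sin(0.1 * t + 1.0) + 0.1 * rng.standard_normal(500)

# Organize a tiny "library" by feature vectors and retrieve the nearest match.
library = {"smooth": features(smooth), "noisy": features(noisy)}
q = features(query)
nearest = min(library, key=lambda k: np.linalg.norm(library[k] - q))
```

    With tens of thousands of series and thousands of methods, as in the paper, the same distance-in-feature-space idea supports automatic organization and method retrieval.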

  10. The root cause analysis of 9DVN002ZV fan failure in Daya Bay Nuclear Power Station

    International Nuclear Information System (INIS)

    Guan Jianjun; Zhang Mingjia

    2005-01-01

    Extensive investigations and detailed analysis of the failure of the 9DVN002ZV fan in Daya Bay Nuclear Power Station showed that the fan was destroyed by the failure of its non-drive-end bearing. The direct cause of the bearing's failure was improper assembly resulting from an inadequate maintenance procedure, and the root cause was too small an internal radial clearance after mounting. The factors affecting the bearing's internal radial clearance, the relationship between clearance and operating lifetime, and the fan failure process are discussed. (authors)

  11. Failure mode and effects analysis: an empirical comparison of failure mode scoring procedures.

    Science.gov (United States)

    Ashley, Laura; Armitage, Gerry

    2010-12-01

    To empirically compare 2 different commonly used failure mode and effects analysis (FMEA) scoring procedures with respect to their resultant failure mode scores and prioritization: a mathematical procedure, where scores are assigned independently by FMEA team members and averaged, and a consensus procedure, where scores are agreed on by the FMEA team via discussion. A multidisciplinary team undertook a Healthcare FMEA of chemotherapy administration. This included mapping the chemotherapy process, identifying and scoring failure modes (potential errors) for each process step, and generating remedial strategies to counteract them. Failure modes were scored using both an independent mathematical procedure and a team consensus procedure. Almost three-fifths of the 30 failure modes generated were scored differently by the 2 procedures, and for just more than one-third of cases, the score discrepancy was substantial. Using the Healthcare FMEA prioritization cutoff score, almost twice as many failure modes were prioritized by the consensus procedure than by the mathematical procedure. This is the first study to empirically demonstrate that different FMEA scoring procedures can score and prioritize failure modes differently. It found considerable variability in individual team members' opinions on scores, which highlights the subjective and qualitative nature of failure mode scoring. A consensus scoring procedure may be most appropriate for FMEA as it allows variability in individuals' scores and rationales to become apparent and to be discussed and resolved by the team. It may also yield team learning and communication benefits unlikely to result from a mathematical procedure.
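
    The difference between the two scoring procedures can be illustrated with a toy calculation. The failure modes, the individual severity and occurrence scores, and the agreed consensus scores below are hypothetical, invented for the sketch, not the study's data; the criticality is taken as severity times occurrence, as in Healthcare FMEA hazard scoring.

```python
# Hypothetical (severity, occurrence) scores from three team members
# for two chemotherapy-administration failure modes.
individual = {
    "wrong dose calculated": [(4, 2), (4, 2), (3, 2)],
    "allergy not checked":   [(4, 1), (2, 4), (3, 2)],
}
# Scores the team agreed on after discussion (also illustrative).
consensus = {
    "wrong dose calculated": (4, 2),
    "allergy not checked":   (4, 3),
}

def mathematical_score(scores):
    # Mathematical procedure: average the members' independent criticality products.
    return sum(s * o for s, o in scores) / len(scores)

def consensus_score(pair):
    # Consensus procedure: a single agreed severity x occurrence product.
    s, o = pair
    return s * o

math_scores = {k: mathematical_score(v) for k, v in individual.items()}
cons_scores = {k: consensus_score(v) for k, v in consensus.items()}
# The two procedures can rank the same failure modes differently.
math_top = max(math_scores, key=math_scores.get)
cons_top = max(cons_scores, key=cons_scores.get)
```

    Here the averaged scores prioritize one failure mode while the consensus scores prioritize the other, which is exactly the kind of prioritization discrepancy the study observed empirically.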

  12. Identifying ideal brow vector position: empirical analysis of three brow archetypes.

    Science.gov (United States)

    Hamamoto, Ashley A; Liu, Tiffany W; Wong, Brian J

    2013-02-01

    Surgical browlifts counteract the effects of aging, correct ptosis, and optimize forehead aesthetics. While surgeons have control over brow shape, the metrics defining ideal brow shape are subjective. This study aims to empirically determine whether three expert brow design strategies are aesthetically equivalent by using expert focus group analysis and relating these findings to brow surgery. Comprehensive literature search identified three dominant brow design methods (Westmore, Lamas and Anastasia) that are heavily cited, referenced or internationally recognized in either medical literature or by the lay media. Using their respective guidelines, brow shape was modified for 10 synthetic female faces, yielding 30 images. A focus group of 50 professional makeup artists ranked the three images for each of the 10 faces to generate ordinal attractiveness scores. The contemporary methods employed by Anastasia and Lamas produce a brow arch more lateral than Westmore's classic method. Although the more laterally located brow arch is considered the current trend in facial aesthetics, this style was not empirically supported. No single method was consistently rated most or least attractive by the focus group, and no significant difference in attractiveness score for the different methods was observed (p = 0.2454). Although each method of brow placement has been promoted as the "best" approach, no single brow design method achieved statistical significance in optimizing attractiveness. Each can be used effectively as a guide in designing eyebrow shape during browlift procedures, making it possible to use the three methods interchangeably.

  13. Ancient DNA analysis suggests negligible impact of the Wari Empire expansion in Peru's central coast during the Middle Horizon

    OpenAIRE

    Valverde, G.; Romero, M.; Espinoza, I.; Cooper, A.; Fehren-Schmitz, L.; Llamas, B.; Haak, W.

    2016-01-01

    The analysis of ancient human DNA from South America allows the exploration of pre-Columbian population history through time and direct testing of hypotheses about cultural and demographic evolution. The Middle Horizon (650-1100 AD) represents a major transitional period in the Central Andes, which is associated with the development and expansion of ancient Andean empires such as Wari and Tiwanaku. These empires facilitated a series of interregional interactions and socio-political changes, wh...

  14. Evaluation and analysis of underground brine resources in the southern coastal area of Laizhou Bay

    Science.gov (United States)

    Tian, M.; Zhu, H. T.; Feng, J.; Zhao, Q. S.

    2016-08-01

    The southern coastal districts of Laizhou Bay are among the most important areas for underground brine exploitation in Shandong Province. Recently, these areas have been gradually developed by the underground brine mining industry, and this economic interest has driven exploitation to the point that underground brine resources are running out. Against this background, this study describes the supply, runoff and drainage conditions of the area by collecting and organizing background information on the studied area. Hydrogeological parameters are then calculated from pumping tests, and the amount of sustainable resources in the coastal areas of the southern bank of Laizhou Bay is calculated based on a uniform distribution of wells. Under the current conditions of underground brine mining, the exploitation potential of the underground brine is evaluated in accordance with the calculated exploitation quantity. Finally, suggestions are provided for the sustainable exploitation of underground brine in the area.

  15. Hybrid Wing Body Multi-Bay Test Article Analysis and Assembly Final Report

    Science.gov (United States)

    Velicki, Alexander; Hoffman, Krishna; Linton, Kim A.; Baraja, Jaime; Wu, Hsi-Yung T.; Thrash, Patrick

    2017-01-01

    This report summarizes work performed by The Boeing Company, through its Boeing Research & Technology organization located in Huntington Beach, California, under the Environmentally Responsible Aviation (ERA) project. The report documents work performed to structurally analyze and assemble a large-scale Multi-bay Box (MBB) Test Article capable of withstanding bending and internal pressure loadings representative of a Hybrid Wing Body (HWB) aircraft. The work included fabrication of tooling elements for use in the fabrication and assembly of the test article.

  16. Cádiz bay waters turbidity variations from Landsat TM images analysis

    OpenAIRE

    Gutiérrez Mas, José Manuel; Luna del Barco, A.; Parrado Román, J. M.; Sánchez, E.; Fernández Palacios, A.; Ojeda, J.

    1999-01-01

    Landsat TM images have been analysed to obtain extent and direction data on turbidity plumes in Cadiz bay waters under several synoptic hydrodynamic situations. The results are compared with data from water samples. Five turbidity levels have been differentiated: very high, high, middle, low and very low turbidity, along with three geographic sectors: a) an inner zone, close to the coast, with shallow waters of high and very high turbidity. This sector is strongly affected by littoral processes (tides, surge and contine...

  17. Humboldt Bay Wetlands Review and Baylands Analysis. Volume I. Summary and Findings.

    Science.gov (United States)

    1980-08-01

    the best and most compatible economic, environmental, and social uses of the Humboldt Bay area. Such data includes inventories of uses and conditions... personal communication). Since 1973, 22 general permits have been granted by the Harbor District; of these, there were 3 for submarine pipeline/cable...other water characteristics determine the type and abundance. Distribution: Water habitats are distributed throughout the study area. Deep and shallow

  18. Analysis of Naïve Bayes Algorithm for Email Spam Filtering across Multiple Datasets

    Science.gov (United States)

    Fitriah Rusland, Nurul; Wahid, Norfaradilla; Kasim, Shahreen; Hafit, Hanayanti

    2017-08-01

    E-mail spam continues to be a problem on the Internet. Spam e-mail may contain many copies of the same message, commercial advertisements or other irrelevant posts such as pornographic content. In previous research, different filtering techniques have been used to detect these e-mails, such as Random Forest, Naïve Bayes, Support Vector Machine (SVM) and Neural Network classifiers. In this research, we test the Naïve Bayes algorithm for e-mail spam filtering on two datasets, the Spam Data and SPAMBASE datasets [8], and evaluate its performance. Performance on the datasets is evaluated in terms of accuracy, recall, precision and F-measure. Our research uses the WEKA tool for the evaluation of the Naïve Bayes algorithm for e-mail spam filtering on both datasets. The results show that the type of email and the number of instances in the dataset have an influence on the performance of Naïve Bayes.
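
    The classifier evaluated in the paper can be sketched in a few lines: a multinomial Naïve Bayes with Laplace smoothing over word counts. The toy corpus below is an illustrative assumption; the study itself ran WEKA's implementation on the Spam Data and SPAMBASE datasets.

```python
import math
from collections import Counter

# Toy training corpus (illustrative, not the datasets used in the paper).
train = [
    ("buy cheap pills now", "spam"),
    ("cheap replica watches buy", "spam"),
    ("limited offer buy now", "spam"),
    ("meeting agenda attached", "ham"),
    ("lunch tomorrow with the team", "ham"),
    ("project report draft attached", "ham"),
]

counts = {"spam": Counter(), "ham": Counter()}
docs = Counter()
for text, label in train:
    docs[label] += 1
    counts[label].update(text.split())

vocab = set(w for c in counts.values() for w in c)

def predict(text):
    """Multinomial Naive Bayes with Laplace (add-one) smoothing, in log space."""
    scores = {}
    for label in counts:
        logp = math.log(docs[label] / sum(docs.values()))  # class prior
        total = sum(counts[label].values())
        for w in text.split():
            # Unseen words still get probability mass thanks to smoothing.
            logp += math.log((counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = logp
    return max(scores, key=scores.get)

label = predict("buy cheap watches now")
```

    Accuracy, recall, precision and F-measure, the metrics used in the paper, are then computed by comparing such predictions against held-out labels.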

  19. Empirical Analysis and Characterization of Indoor GPS Signal Fading and Multipath Conditions

    DEFF Research Database (Denmark)

    Blunck, Henrik; Kjærgaard, Mikkel Baun; Godsk, Torben

    2009-01-01

    of earlier measurement campaigns to characterize GNSS signal conditions in indoor environments have been published prominently in the GNSS research literature, see, e.g. [1,2,3]. To allow for in-depth signal analysis, these campaigns use a variety of measuring machinery such as channel sounders, mobile...... signal generators and spectrum analyzers. Furthermore, the use-case-specific usability of GPS as an indoor positioning technology has been evaluated empirically on a higher level, see, e.g. [4]. In this paper we present results of a measurement campaign designed to characterize indoor GNSS signal...... conditions. The work presented can therefore be seen as an effort complementing the campaigns mentioned above. As the focus of our work lies on the real-world usability of current GNSS technology for indoor use, we employ in our measurement campaign foremost commercial receivers with features typical for the use cases

  20. Competencies in Higher Education System: an Empirical Analysis of Employers' Perceptions

    Directory of Open Access Journals (Sweden)

    Adela Deaconu

    2014-08-01

    Full Text Available This study offers insight into the European Qualifications Framework (EQF), as agreed and detailed by the Romanian qualifications framework, applied to the economic sector. By means of a survey conducted on 92 employing companies, it validates the importance of competencies for the Romanian labour market and employers' degree of satisfaction with the competencies of business graduates. In terms of typology, employers attach more importance to transversal competencies than to professional competencies, both at the conceptual level and in the degree of acquirement following higher education. The empirical analysis provides data on employers' ranking of transversal and professional competencies and identifies the classes of competencies deemed to be in short supply on the labour market. Through its results, the study enhances the relationship between the higher education system and the labour market, providing key information for an efficient implementation of the competence-based education system.

  1. Empirical Analysis of Retirement Pension and IFRS Adoption Effects on Accounting Information: Glance at IT Industry

    Directory of Open Access Journals (Sweden)

    JeongYeon Kim

    2014-01-01

    Full Text Available This study reviews new pension accounting with K-IFRS and provides empirical changes in liability for retirement allowances with adoption of K-IFRS. It will help to understand the effect of pension accounting on individual firm’s financial report and the importance of public announcement of actuarial assumptions. Firms that adopted K-IFRS had various changes in retirement liability compared to the previous financial report not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only few of them were published. Data analysis shows that the small differences of the actuarial assumption may result in a big change of retirement related liability. Firms within IT industry also have similar behaviors, which means that additional financial regulations for pension accounting are recommended.

  2. Empirical Analysis of Intonation Activities in EFL Student’s Books

    Directory of Open Access Journals (Sweden)

    Dušan Nikolić

    2018-05-01

    Full Text Available Intonation instruction has repeatedly proved a challenge for EFL teachers, who avoid getting involved in intonation teaching more than their EFL textbooks demand from them. Since a great number of teachers rely on EFL textbooks when implementing intonation practice, the intonation activities in EFL materials are often central to their classroom. Even though the research on intonation instruction has been well-documented, few papers have explored intonation activities in EFL materials. The present study thus provides an empirical analysis of intonation activities in five EFL student’s books series by exploring the overall coverage of intonation activities across the series and the quality of these activities. The results reveal that intonation activities are underrepresented in the EFL student’s books, and that discourse intonation deserves more attention in the activities. Considerations for EFL teachers and publishers are also discussed.

  3. An empirical analysis on logistics performance and the global competitiveness

    Directory of Open Access Journals (Sweden)

    Turkay Yildiz

    2017-05-01

    Full Text Available Logistics has been identified as an area in which to build cost and service advantages. Therefore, companies are more focused on customer needs and are trying to find ways to reduce costs, improve quality and meet the growing expectations of their clients. Indeed, global competition has led managers to begin to address the issue of providing more efficient logistics services. In this regard, this paper presents an empirical study on logistics performance and global competitiveness. The paper also identifies the associations between logistics performance and global competitiveness. The analysis indicates that some variables in the global competitiveness scores contribute much more to logistics performance than the others; the variables with the highest contributions are shown.

  4. Niche Overlap and Discrediting Acts: An Empirical Analysis of Informing in Hollywood

    Directory of Open Access Journals (Sweden)

    Giacomo Negro

    2015-06-01

    Full Text Available This article examines informing on others as a discrediting act between individual agents in a labor market. We conduct an empirical analysis of artists called to testify during the 1950s Congressional hearings into Communism in Hollywood, and multi-level regression models reveal that the odds of an artist informing on another increase when their career histories are more similar. The similarity reflects levels of niche overlap in the labor market. The finding that similarity contributes to discredit in the context of resource competition is compatible with a social comparison process, whereby uncertainty about performance leads more similar people to attend to and exclude one another to a greater extent.

  5. Empirical mode decomposition and Hilbert transforms for analysis of oil-film interferograms

    International Nuclear Information System (INIS)

    Chauhan, Kapil; Ng, Henry C H; Marusic, Ivan

    2010-01-01

    Oil-film interferometry is rapidly becoming the preferred method for direct measurement of wall shear stress in studies of wall-bounded turbulent flows. Although being widely accepted as the most accurate technique, it does have inherent measurement uncertainties, one of which is associated with determining the fringe spacing. This is the focus of this paper. Conventional analysis methods involve a certain level of user input and thus some subjectivity. In this paper, we consider empirical mode decomposition (EMD) and the Hilbert transform as an alternative tool for analyzing oil-film interferograms. In contrast to the commonly used Fourier-based techniques, this new method is less subjective and, as it is based on the Hilbert transform, is superior for treating amplitude and frequency modulated data. This makes it particularly robust to wide differences in the quality of interferograms
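
    The fringe-spacing idea can be sketched numerically: form the analytic signal of a fringe intensity profile and read the spacing from the instantaneous phase gradient. The FFT-based Hilbert construction and the synthetic interferogram below (with an assumed spacing of 64 pixels) are illustrative, not the authors' implementation.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (the discrete Hilbert-transform construction)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[1:N // 2] = 2.0          # double positive frequencies
    if N % 2 == 0:
        h[N // 2] = 1.0        # keep the Nyquist bin
    return np.fft.ifft(X * h)  # negative frequencies are zeroed

# Synthetic interferogram intensity profile with a known fringe spacing.
n = np.arange(2048)
true_spacing = 64.0            # pixels per fringe (assumed for this sketch)
signal = np.cos(2 * np.pi * n / true_spacing)

phase = np.unwrap(np.angle(analytic_signal(signal)))
# Fringe spacing = 2*pi divided by the mean local phase gradient.
est_spacing = 2 * np.pi / np.mean(np.diff(phase))
```

    Because the spacing is read from the phase of the analytic signal rather than from fringe peaks picked by eye, the procedure involves less user input, which is the advantage the paper claims for the Hilbert-based approach.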

  6. Empirical analysis of retirement pension and IFRS adoption effects on accounting information: glance at IT industry.

    Science.gov (United States)

    Kim, JeongYeon

    2014-01-01

    This study reviews new pension accounting with K-IFRS and provides empirical changes in liability for retirement allowances with adoption of K-IFRS. It will help to understand the effect of pension accounting on individual firm's financial report and the importance of public announcement of actuarial assumptions. Firms that adopted K-IFRS had various changes in retirement liability compared to the previous financial report not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only few of them were published. Data analysis shows that the small differences of the actuarial assumption may result in a big change of retirement related liability. Firms within IT industry also have similar behaviors, which means that additional financial regulations for pension accounting are recommended.

  7. Empirical Analysis Concerning the Correlation Fiscality Rate – Tax Incomes in Romania

    Directory of Open Access Journals (Sweden)

    Raluca Drãcea

    2009-08-01

Full Text Available In the specialized literature, taxation is reviewed from all points of view, and the question raised by analysts over the last decade is: what is the optimum level of taxation? The difficulty in answering this question lies in the opposing interests: the State wants a high level of taxation owing to the increasing trend of public expenses, while taxpayers want a low level in order to benefit from greater financial funds. Starting from Laffer's theory, the objective of this paper is the empirical analysis of the correlation between the fiscality rate and tax incomes in Romania, using the Matlab program and SPSS software. The paper is structured in three parts: the first part reviews the specialized literature, the second part describes the research methodology, and the third part comprises results and discussions. The paper ends with conclusions.

  8. An Empirical Analysis of Economic and Socio-Demographic Determinants of Entrepreneurship Across German Regions

    Directory of Open Access Journals (Sweden)

    Mrożewski Matthias

    2014-11-01

Full Text Available Entrepreneurship is fundamental for a country's economic development through its positive effect on innovation, productivity growth, and job creation. In entrepreneurial research, one of the most important problems is to define the factors that actually determine entrepreneurial action. This study analyzes that question in the case of Germany by taking an aggregated approach that focuses on socio-demographic and economic determinants of regional entrepreneurship. Based on a literature review of German and international regional-level research, six hypotheses are developed and empirically tested using the most recent available data on 385 German regions as units of analysis. The results are surprising: in the case of household income, unemployment, education and marital status, the relationship is significant but contrary to earlier research. Only regional age structure seems to be a stable predictor of regional entrepreneurship. The results indicate that in recent years there was a major shift in the determinants and characteristics of entrepreneurship in Germany.

  9. Energy Taxes as a Signaling Device: An Empirical Analysis of Consumer Preferences

    International Nuclear Information System (INIS)

    Ghalwash, Tarek

    2004-01-01

This paper presents an econometric study dealing with household demand in Sweden. The main objective is to empirically examine the differences in consumer reaction to the introduction of, or change in, environmental taxes. The main focus is on environmental taxes as a signaling device. The hypothesis is that the introduction of an environmental tax provides new information about the properties of the directly taxed goods. This in turn may affect consumer preferences for these goods, hence altering the consumption choice. The results from the econometric analysis show that all goods have negative own-price elasticities and positive income elasticities. Concerning the signalling effect of environmental taxes, the results are somewhat ambiguous. The tax elasticity for energy goods used for heating seems to be significantly higher than the traditional price elasticity, whereas the opposite seems to be the case for energy goods used for transportation.

  10. Energy taxes as a signaling device: An empirical analysis of consumer preferences

    International Nuclear Information System (INIS)

    Ghalwash, Tarek

    2007-01-01

This paper presents an econometric study dealing with household demand in Sweden. The main objective is to empirically examine the differences in consumer reaction to the introduction of, or change in, environmental taxes. The main focus is on environmental taxes as a signaling device. The hypothesis is that the introduction of an environmental tax provides new information about the properties of the directly taxed goods. This in turn may affect consumer preferences for these goods, hence altering the consumption choice. The results from the econometric analysis show that all goods have negative own-price elasticities and positive income elasticities. Concerning the signalling effect of environmental taxes, the results are somewhat ambiguous. The tax elasticity for energy goods used for heating seems to be significantly higher than the traditional price elasticity, whereas the opposite seems to be the case for energy goods used for transportation.

  11. A Meta-Analysis of Empirically Tested School-Based Dating Violence Prevention Programs

    Directory of Open Access Journals (Sweden)

    Sarah R. Edwards

    2014-05-01

Full Text Available Teen dating violence prevention programs implemented in schools and empirically tested were subjected to meta-analysis. Eight studies met the criteria for inclusion, consisting of both within- and between-group designs. Overall, the weighted mean effect size (ES) across studies was significant, ESr = .11, 95% confidence interval (CI) [.08, .15], p < .0001, showing an overall positive effect of the studied prevention programs. However, 25% of the studies showed an effect in the negative direction, meaning students appeared to be more supportive of dating violence after participating in a dating violence prevention program. This heightens the need for thorough program evaluation, as well as the need for decision makers to have access to data about the effectiveness of programs they are considering implementing. Further implications of the results and recommendations for future research are discussed.
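A weighted mean effect size of the kind reported in this record is conventionally computed by Fisher-transforming each study's correlation, weighting by n − 3 (the inverse of the transform's variance), and back-transforming. A minimal sketch with invented per-study values, not the data of the eight included studies:

```python
import math

# Hypothetical (r, n) pairs per study; weights n - 3 follow from the
# variance of Fisher's z transform, which is 1 / (n - 3).
studies = [(0.15, 120), (0.08, 300), (-0.05, 80), (0.20, 150)]

# Fisher z-transform each r, average with weights n - 3.
num = sum((n - 3) * 0.5 * math.log((1 + r) / (1 - r)) for r, n in studies)
den = sum(n - 3 for r, n in studies)
z_bar = num / den
se = 1 / math.sqrt(den)

# 95% CI in z-space, then back-transform everything to the r metric.
lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
to_r = lambda z: (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)
print(round(to_r(z_bar), 3), round(to_r(lo), 3), round(to_r(hi), 3))
```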

  12. Windfall profit in portfolio diversification? An empirical analysis of the potential benefits of renewable energy investments

    Energy Technology Data Exchange (ETDEWEB)

    Bruns, Frederik

    2013-05-01

Modern Portfolio Theory, introduced by Markowitz, suggests building a portfolio with assets that have low or, in the best case, negative correlation. In times of financial crisis, however, the positive diversification effect of a portfolio can fail when Traditional Assets are highly correlated. Therefore, many investors search for Alternative Asset classes, such as Renewable Energies, that tend to perform independently of capital market performance. 'Windfall Profit in Portfolio Diversification?' discusses the potential role of Renewable Energy investments in an institutional investor's portfolio by applying the main concepts of Modern Portfolio Theory. The empirical analysis uses a unique data set from one of the largest institutional investors in the field of Renewable Energies, including several wind and solar parks. The study received the Science Award 2012 of the German Alternative Investments Association ('Bundesverband Alternative Investments e.V.').

  13. Empirical evidence about inconsistency among studies in a pair‐wise meta‐analysis

    Science.gov (United States)

    Turner, Rebecca M.; Higgins, Julian P. T.

    2015-01-01

This paper investigates how inconsistency (as measured by the I² statistic) among studies in a meta‐analysis may differ according to the type of outcome data and effect measure. We used hierarchical models to analyse data from 3873 binary, 5132 continuous and 880 mixed outcome meta‐analyses within the Cochrane Database of Systematic Reviews. Predictive distributions for inconsistency expected in future meta‐analyses were obtained, which can inform priors for between‐study variance. Inconsistency estimates were highest on average for binary outcome meta‐analyses of risk differences and continuous outcome meta‐analyses. For a planned binary outcome meta‐analysis in a general research setting, the predictive distribution for inconsistency among log odds ratios had median 22% and 95% CI: 12% to 39%. For a continuous outcome meta‐analysis, the predictive distribution for inconsistency among standardized mean differences had median 40% and 95% CI: 15% to 73%. Levels of inconsistency were similar for binary data measured by log odds ratios and log relative risks. Fitted distributions for inconsistency expected in continuous outcome meta‐analyses using mean differences were almost identical to those using standardized mean differences. The empirical evidence on inconsistency gives guidance on which outcome measures are most likely to be consistent in particular circumstances and facilitates Bayesian meta‐analysis with an informative prior for heterogeneity. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons, Ltd. PMID:26679486
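The I² statistic at the centre of this record is derived from Cochran's Q, the weighted sum of squared deviations of study effects from their pooled mean. A minimal sketch with hypothetical trial data (log odds ratios and variances are invented):

```python
# I^2 = max(0, (Q - df) / Q): the proportion of total variability across
# studies attributable to heterogeneity rather than sampling error.
def i_squared(effects, variances):
    w = [1 / v for v in variances]
    mean = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - mean) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    return max(0.0, (q - df) / q) if q > 0 else 0.0

# Hypothetical log odds ratios and their variances from five trials.
log_or = [0.10, 0.45, -0.20, 0.60, 0.05]
var = [0.04, 0.09, 0.05, 0.12, 0.03]
print(f"I2 = {100 * i_squared(log_or, var):.0f}%")
```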

  14. Network Analysis Approach to Stroke Care and Assistance Provision: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Szczygiel Nina

    2017-06-01

Full Text Available The aim is to model and analyse stroke care and assistance provision in the Portuguese context from the network perspective. We used network theory as the theoretical foundation for the study. The model proposed by Frey et al. (2006) was used to elicit and comprehend possible interactions and relations between organisations expected to be involved in the provision of care and assistance to stroke patients in their pathway to rehabilitation. Providers were identified and contacted to evaluate the nature and intensity of relationships. Network analysis was performed with the NodeXL software package. Analysis of 509 entities based on about 260,000 entries indicates that stroke care provision in the evaluated context is best captured in the coalition-collaboration setting, which appears to best demonstrate the character of the network. Information from the analysis of the collaboration stage was not sufficient to determine the network dynamics. The study applies network theory to understand the interorganisational dynamics of a complex health care context, empirically validates the model proposed by Frey et al. (2006) in terms of its operationalisation and the way it actually reflects the practical context, and examines interorganisational relationships and their contribution to the management of a complex health care context involving actors from various sectors.

  15. Empirical Bayes methods in road safety research.

    NARCIS (Netherlands)

    Vogelesang, R.A.W.

    1997-01-01

Road safety research is a wonderful combination of counting fatal accidents and using a toolkit containing prior, posterior, overdispersed Poisson, negative binomial and Gamma distributions, together with positive and negative regression effects, shrinkage estimators and fierce debates concerning
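The shrinkage estimator named in this record is, in its simplest gamma-Poisson form, a weighted average of a site's observed crash count and the mean of a reference group of similar sites. A minimal sketch with invented prior parameters (the Gamma(a, b) prior would in practice be fitted from the reference sites):

```python
# Gamma-Poisson empirical Bayes: a site's long-run crash rate has a
# Gamma(shape=a, rate=b) prior fitted from similar sites; observing x
# crashes in one period gives posterior mean (a + x) / (b + 1), i.e. a
# shrinkage of the raw count toward the group mean a / b.
def eb_estimate(x, a, b):
    w = b / (b + 1.0)               # weight placed on the group mean
    return w * (a / b) + (1 - w) * x

# Hypothetical prior: similar intersections average a/b = 2 crashes/year.
for x in [0, 2, 10]:
    print(x, round(eb_estimate(x, a=4.0, b=2.0), 2))
```

The effect is exactly the "regression to the mean" correction used in before-after safety studies: an extreme observed count of 10 is pulled well below 10, while a count equal to the group mean is left unchanged.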

  16. Wood identification of Dalbergia nigra (CITES Appendix I) using quantitative wood anatomy, principal components analysis and naive Bayes classification.

    Science.gov (United States)

    Gasson, Peter; Miller, Regis; Stekel, Dov J; Whinder, Frances; Zieminska, Kasia

    2010-01-01

Dalbergia nigra is one of the most valuable timber species of its genus, having been traded for over 300 years. Due to over-exploitation it is facing extinction and trade has been banned under CITES Appendix I since 1992. Current methods, primarily comparative wood anatomy, are inadequate for conclusive species identification. This study aims to find a set of anatomical characters that distinguish the wood of D. nigra from other commercially important species of Dalbergia from Latin America. Qualitative and quantitative wood anatomy, principal components analysis and naïve Bayes classification were conducted on 43 specimens of Dalbergia: eight D. nigra and 35 from six other Latin American species. Dalbergia cearensis and D. miscolobium can be distinguished from D. nigra on the basis of vessel frequency for the former and ray frequency for the latter. Principal components analysis was unable to provide any further basis for separating the species. Naïve Bayes classification using the four characters minimum vessel diameter, frequency of solitary vessels, mean ray width, and frequency of axially fused rays classified all eight D. nigra correctly with no false negatives, but there was a false positive rate of 36.36%. Wood anatomy alone cannot distinguish D. nigra from all other commercially important Dalbergia species likely to be encountered by customs officials, but can be used to reduce the number of specimens that would need further study.
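A naïve Bayes classifier over a handful of continuous anatomical characters, as used in this study, can be hand-rolled with class-conditional Gaussians. The sketch below is illustrative only: the four feature columns mirror the characters named in the abstract, but every number (means, spreads, measurements) is invented, not taken from the 43 specimens:

```python
import numpy as np

# Invented training measurements for two classes: 8 "nigra" and 35 "other"
# specimens, four characters each (vessel diameter, % solitary vessels,
# mean ray width, fused-ray frequency).
rng = np.random.default_rng(0)
nigra = rng.normal([90, 60, 30, 5], [10, 8, 4, 1], size=(8, 4))
other = rng.normal([130, 40, 45, 2], [15, 10, 6, 1], size=(35, 4))

def fit(X):
    # Per-feature mean and standard deviation; epsilon avoids zero variance.
    return X.mean(axis=0), X.std(axis=0) + 1e-9

def log_likelihood(x, mu, sd):
    # Sum of independent Gaussian log-densities (the "naive" assumption).
    return np.sum(-0.5 * np.log(2 * np.pi * sd**2) - (x - mu)**2 / (2 * sd**2))

params = {"nigra": fit(nigra), "other": fit(other)}
priors = {"nigra": 8 / 43, "other": 35 / 43}

def classify(x):
    scores = {c: np.log(priors[c]) + log_likelihood(x, *params[c]) for c in params}
    return max(scores, key=scores.get)

print(classify(np.array([88, 58, 31, 5])))   # a nigra-like test specimen
```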

  17. Speciation of heavy metals in different grain sizes of Jiaozhou Bay sediments: Bioavailability, ecological risk assessment and source analysis on a centennial timescale.

    Science.gov (United States)

    Kang, Xuming; Song, Jinming; Yuan, Huamao; Duan, Liqin; Li, Xuegang; Li, Ning; Liang, Xianmeng; Qu, Baoxiao

    2017-09-01

Heavy metal contamination is an essential indicator of environmental health. In this work, one sediment core was used to analyse the speciation of heavy metals (Cr, Mn, Ni, Cu, Zn, As, Cd, and Pb) in Jiaozhou Bay sediments with different grain sizes. The bioavailability, sources and ecological risk of heavy metals were also assessed on a centennial timescale. Heavy metal enrichment across the grain sizes followed the order Pb > Cd > Zn > Cu > Ni > Cr > As. Enrichment factors (EF) indicated that heavy metals in Jiaozhou Bay presented no enrichment to minor enrichment. The potential ecological risk index (RI) indicated that Jiaozhou Bay had been subject to low ecological risk, with an increasing trend since the 1940s owing to the increase in anthropogenic activities. The source analysis indicated that natural sources were the primary sources of heavy metals in Jiaozhou Bay and that anthropogenic sources had presented an increasing trend since the 1940s. Principal component analysis (PCA) indicated that Cr, Mn, Ni, Cu and Pb were primarily derived from natural sources and that Zn and Cd were influenced by the shipbuilding industry. Mn, Cu, Zn and Pb may originate from both natural and anthropogenic sources, and As may be influenced by agricultural activities. Moreover, heavy metals in sediments of Jiaozhou Bay were clearly influenced by atmospheric deposition and river input. Copyright © 2017. Published by Elsevier Inc.
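The enrichment factor used in this record is the sample metal-to-reference-element ratio divided by the corresponding background (crustal) ratio. A minimal sketch with Al as the conservative reference element; all concentrations below are illustrative, not the Jiaozhou Bay data:

```python
# EF = (M / Al)_sample / (M / Al)_background.  EF near 1 suggests a purely
# crustal source; progressively larger values suggest anthropogenic input.
def enrichment_factor(m_sample, al_sample, m_background, al_background):
    return (m_sample / al_sample) / (m_background / al_background)

# Hypothetical Pb concentrations (mg/kg); background values are invented.
ef_pb = enrichment_factor(m_sample=35.0, al_sample=60000.0,
                          m_background=20.0, al_background=80000.0)
print(round(ef_pb, 2))  # 2.33
```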

  18. Characterization of organic matter in sediment cores of the Todos os Santos Bay, Bahia, Brazil, by elemental analysis and 13C NMR

    International Nuclear Information System (INIS)

    Costa, A.B.; Novotny, E.H.; Bloise, A.C.; Azevedo, E.R. de; Bonagamba, T.J.; Zucchi, M.R.; Santos, V.L.C.S.; Azevedo, A.E.G.

    2011-01-01

Highlights: → The impact of human activity on the sediments of Todos os Santos Bay in Brazil was evaluated by elemental analysis and 13C NMR. → This article reports a study of six sediment cores collected at different depths and regions. → The elemental profiles of the cores suggest an abrupt change in the sedimentation regime about 50 years ago, coinciding with the implantation of major onshore industrial projects. → The results presented illustrate several important aspects of the environmental impact of human activity on this bay. - Abstract: The impact of human activity on the sediments of Todos os Santos Bay in Brazil was evaluated by elemental analysis and 13C Nuclear Magnetic Resonance (13C NMR). This article reports a study of six sediment cores collected at different depths and regions of Todos os Santos Bay. The elemental profiles of cores collected on the eastern side of Frades Island suggest an abrupt change in the sedimentation regime. Autoregressive Integrated Moving Average (ARIMA) analysis corroborates this result. The range of depths of the cores corresponds to about 50 years ago, coinciding with the implantation of major onshore industrial projects in the region. Principal Component Analysis of the 13C NMR spectra clearly differentiates sediment samples closer to the Subae estuary, which have high contents of terrestrial organic matter, from those closer to a local oil refinery. The results presented in this article illustrate several important aspects of the environmental impact of human activity on this bay.

  19. THE RESPONSE OF MONTEREY BAY TO THE 2010 CHILEAN EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    Laurence C. Breaker

    2011-01-01

Full Text Available The primary frequencies contained in the arrival sequence produced by the tsunami from the Chilean earthquake of 2010 in Monterey Bay were extracted to determine the seiche modes that were produced. Singular Spectrum Analysis (SSA) and Ensemble Empirical Mode Decomposition (EEMD) were employed to extract the primary frequencies of interest. The wave train from the Chilean tsunami lasted for at least four days due to multipath arrivals that may not have included reflections from outside the bay but most likely did include secondary undulations, and energy trapping in the form of edge waves, inside the bay. The SSA decomposition resolved oscillations with periods of 52-57, 34-35, 26-27, and 21-22 minutes, all frequencies that have been predicted and/or observed in previous studies. The EEMD decomposition detected oscillations with periods of 50-55 and 21-22 minutes. Periods in the range of 50-57 minutes varied due to measurement uncertainties but almost certainly correspond to the first longitudinal mode of oscillation for Monterey Bay; periods of 34-35 minutes correspond to the first transverse mode of oscillation that assumes a nodal line across the entrance of the bay; a period of 26-27 minutes, although previously observed, may not represent a fundamental oscillation; and a period of 21-22 minutes has been predicted and observed previously. A period of ~37 minutes, close to the period of 34-35 minutes, was generated by the Great Alaskan Earthquake of 1964 in Monterey Bay and most likely represents the same mode of oscillation. The tsunamis associated with the Great Alaskan Earthquake and the Chilean Earthquake both entered Monterey Bay but initially arrived outside the bay from opposite directions. Unlike the Great Alaskan Earthquake, however, which excited only one resonant mode inside the bay, the Chilean Earthquake excited several modes, suggesting that the asymmetric shape of the entrance to Monterey Bay was an important factor and that the
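Extracting the dominant seiche periods from a sea-level record can be sketched on synthetic data. The record studied here used SSA and EEMD, which handle non-stationary arrival sequences better; for a stationary illustrative signal, picking the two largest FFT peaks suffices (the 55- and 34-minute modes and all amplitudes below are invented stand-ins for the observed modes):

```python
import numpy as np

# Synthetic 4-day sea-level record sampled once per minute, containing two
# seiche modes (55 min and 34 min periods) plus white noise.
rng = np.random.default_rng(1)
dt = 1.0                             # sampling interval, minutes
t = np.arange(0, 4 * 24 * 60, dt)    # four days of record
sig = (np.sin(2 * np.pi * t / 55) + 0.6 * np.sin(2 * np.pi * t / 34)
       + 0.2 * rng.standard_normal(t.size))

# Recover the dominant periods from the two largest spectral peaks.
spec = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(t.size, d=dt)
order = np.argsort(spec[1:])[::-1] + 1      # skip the zero-frequency bin
periods = sorted(1 / freqs[order[:2]], reverse=True)
print([round(p, 1) for p in periods])       # close to [55, 34], minutes
```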

  20. Territorial analysis of the micro-basin and bay of the Cacaluta River, Santa María Huatulco, Oaxaca

    Directory of Open Access Journals (Sweden)

    Verónica Rosalía Gómez Rojo

    2012-02-01

Full Text Available This work is an integrated analysis of the following: the physical and bio-geographic elements, the chronological history of the population of the place, and the types of land ownership and land use that are part of the micro-basin and bay of the Cacaluta River, Santa Maria Huatulco, Oaxaca. 55% of the study zone falls within the boundaries of the Huatulco National Park, and the remainder lies along the river basin adjacent to the park. This entire region harbors high biodiversity and is made up of scenic landscapes, where different interests come into play which dispute the use of the natural resources and the appropriation of lands. Among the analysis techniques employed in this investigation are map-like diagrams known as choremes, which demonstrate the relationship of the above mentioned aspects of the study.

  1. Geochemical analysis of sediments from a semi-enclosed bay (Dongshan Bay, southeast China) to determine the anthropogenic impact and source.

    Science.gov (United States)

    Xu, Yonghang; Sun, Qinqin; Ye, Xiang; Yin, Xijie; Li, Dongyi; Wang, Liang; Wang, Aijun; Li, Yunhai

    2017-05-01

The geochemical compositions of sediments in the Dongshan Bay, a semi-enclosed bay on the southeast coast of China, were obtained to identify pollutant sources and evaluate the anthropogenic impacts over the last 100 years. The results indicated that the metal flux had been increasing since the 1980s. Enrichment factor values (Pb, Zn and Cu) suggested only slight enrichment. The proportion of anthropogenic Pb changed from 9% to 15% during 2000-2014. Coal combustion might be an important contamination source in the Dongshan Bay. The historical variation in the metal flux reflected the economic development and urbanization in the Zhangjiang drainage area in the past 30 years. According to the Landsat satellite remote sensing data, the urbanization area expanded approximately three times from 1995 to 2010. The δ13C values (-21‰ to -23‰) of the organic matter (OM) in the sediments indicated that the OM was primarily sourced from aquatic, terrigenous and marsh C3 plants. Nitrogen was mainly derived from aquatic plants and terrigenous erosion before the 1980s. However, the total organic carbon (TOC) contents, total nitrogen (TN) contents and δ15N had been increasing since the 1980s, which suggested that the sources of nitrogen were soil erosion, fertilizer and sewage. In addition, the TOC and TN fluxes in the Dongshan Bay had significantly increased since the 1980s, which reflected the use of N fertilizer. However, the TOC and TN fluxes significantly decreased in the past decade because environmental awareness increased and environmental protection policies were implemented. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Sentimental analysis of Amazon reviews using naïve bayes on laptop products with MongoDB and R

    Science.gov (United States)

    Kamal Hassan, Mohan; Prasanth Shakthi, Sana; Sasikala, R.

    2017-11-01

E-commerce has been developing rapidly in recent years; buying products online has become more and more popular owing to its variety of options, low cost (high discounts) and quick supply systems, so many people choose to shop online. In the meantime, the quality and delivery of merchandise is uneven, and fake branded products are sometimes delivered. We use users' review comments about products and retailers from Amazon as the data set and classify review text by subjectivity/objectivity and the negative/positive attitude of the buyer. Such reviews are helpful to some extent, benefiting both shoppers and product makers. This paper presents an empirical study of the efficacy of classifying product reviews by tagging keywords. In the present study, we analyse the fundamentals of determining a positive or negative approach towards the product. We thereby propose different approaches that first remove the unstructured data and then classify comments employing the Naive Bayes algorithm.
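The Naive Bayes polarity classification described in this record can be sketched with a tiny multinomial model and add-one smoothing. The four training reviews below are invented for illustration, not drawn from the Amazon data set used in the study:

```python
import math
from collections import Counter

# Invented labelled reviews; real training data would be far larger.
train = [("great laptop fast delivery", "pos"),
         ("excellent screen great battery", "pos"),
         ("fake product terrible quality", "neg"),
         ("terrible packaging slow delivery", "neg")]

counts = {"pos": Counter(), "neg": Counter()}
for text, label in train:
    counts[label].update(text.split())
vocab = {w for c in counts.values() for w in c}

def predict(text):
    # log P(label) + sum of log P(word | label) with Laplace smoothing.
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        scores[label] = math.log(0.5) + sum(
            math.log((c[w] + 1) / (total + len(vocab)))
            for w in text.split() if w in vocab)
    return max(scores, key=scores.get)

print(predict("great battery"), predict("terrible fake quality"))  # pos neg
```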

  3. Comparative empirical analysis of temporal relationships between construction investment and economic growth in the United States

    Directory of Open Access Journals (Sweden)

    Navid Ahmadi

    2017-09-01

Full Text Available The majority of policymakers believe that investments in construction infrastructure would boost the economy of the United States (U.S.). They also assume that construction investment in infrastructure has a similar impact on the economies of different U.S. states. In contrast, there have been studies showing the negative impact of construction activities on the economy. However, there has not been any research attempt to empirically test the temporal relationships between construction investment and economic growth in the U.S. states, to determine the longitudinal impact of construction investment on the economy of each state. The objective of this study is to investigate whether Construction Value Added (CVA) is a leading (or lagging) indicator of real Gross Domestic Product (real GDP) for every individual state of the U.S. using empirical time series tests. The results of Granger causality tests showed that CVA is a leading indicator of state real GDP in 18 states and the District of Columbia; real GDP is a leading indicator of CVA in 10 states and the District of Columbia. There is a bidirectional relationship between CVA and real GDP in 5 states and the District of Columbia. In 8 states and the District of Columbia, not only do CVA and real GDP have leading/lagging relationships, but they are also cointegrated. These results highlight the important role of the construction industry in these states. The results also show that leading (or lagging) lengths vary for different states. The results of the comparative empirical analysis reject the hypothesis that CVA is a leading indicator of real GDP in the states with the highest shares of construction in real GDP. The findings of this research contribute to the state of knowledge by quantifying the temporal relationships between construction investment and economic growth in the U.S. states. It is expected that the results help policymakers better understand the impact of construction investment
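A Granger causality test of the kind used in this study asks whether adding p lags of x improves an autoregression of y, via an F-test on the restricted vs unrestricted residual sums of squares. A minimal hand-rolled sketch on synthetic series in which x leads y by one step (production work would use a dedicated econometrics package with lag selection and unit-root pretests):

```python
import numpy as np

def lags(v, p):
    # Design matrix of v lagged 1..p, row-aligned with targets v[p:].
    return np.column_stack([v[p - k: len(v) - k] for k in range(1, p + 1)])

def granger_f(y, x, p=2):
    # F = ((RSS_r - RSS_u)/p) / (RSS_u/(T - 2p - 1)); large F suggests that
    # x "Granger-causes" y.
    target = y[p:]
    ones = np.ones((len(target), 1))
    restricted = np.hstack([ones, lags(y, p)])
    unrestricted = np.hstack([restricted, lags(x, p)])
    rss = lambda X: np.sum(
        (target - X @ np.linalg.lstsq(X, target, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(restricted), rss(unrestricted)
    dof = len(target) - unrestricted.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / dof)

# Synthetic data: x is iid noise, y responds to x with a one-period lag.
rng = np.random.default_rng(2)
x = rng.standard_normal(400)
y = np.zeros(400)
for t in range(1, 400):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(round(granger_f(y, x), 1), round(granger_f(x, y), 1))
```

As expected for this construction, the F statistic is large in the x→y direction and small in the reverse direction.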

  4. Double-dividend analysis with SCREEN: an empirical study for Switzerland

    International Nuclear Information System (INIS)

    Frei, Christoph W.; Haldi, Pierre-Andre; Sarlos, Gerard

    2005-01-01

    This paper presents an empirical study that quantifies the effects of an ecological fiscal reform as recently rejected by the Swiss population. The measure aims to encourage employment and, at the same time, to dissuade from an excessive energy use and thereby decrease energy-induced external costs (CO 2 , etc.). The analysis is based on the model SCREEN, a general equilibrium model using the complementarity format for the hybrid description of economy-wide production possibilities where the electricity sector is represented by a bottom-up activity analysis and the other production sectors are characterised by top-down production functions. A dynamic formulation of the activity analysis of technologies allows for the reproduction of endogenous structural change (see Frei, C.W., Haldi, P.-A., Sarlos, G., 2003. Dynamic formulation of a top-down and bottom-up merging energy policy model. Energy Policy 31 (10), 1017-1031.). The labour market is formulated according to the microeconomically founded efficiency wages and calibrated for Switzerland. The study includes the development of a consistent set of top-down, bottom-up and labour data for Switzerland. The collection of bottom-up data on the electricity sector, just before liberalisation, was not easy. The electricity sector characterising data was prepared, based on original statistics about 140 Swiss electricity companies

  5. A new approach for crude oil price analysis based on empirical mode decomposition

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shou-Yang; Lai, K.K.

    2008-01-01

    The importance of understanding the underlying characteristics of international crude oil price movements attracts much attention from academic researchers and business practitioners. Due to the intrinsic complexity of the oil market, however, most of them fail to produce consistently good results. Empirical Mode Decomposition (EMD), recently proposed by Huang et al., appears to be a novel data analysis method for nonlinear and non-stationary time series. By decomposing a time series into a small number of independent and concretely implicational intrinsic modes based on scale separation, EMD explains the generation of time series data from a novel perspective. Ensemble EMD (EEMD) is a substantial improvement of EMD which can better separate the scales naturally by adding white noise series to the original time series and then treating the ensemble averages as the true intrinsic modes. In this paper, we extend EEMD to crude oil price analysis. First, three crude oil price series with different time ranges and frequencies are decomposed into several independent intrinsic modes, from high to low frequency. Second, the intrinsic modes are composed into a fluctuating process, a slowly varying part and a trend based on fine-to-coarse reconstruction. The economic meanings of the three components are identified as short term fluctuations caused by normal supply-demand disequilibrium or some other market activities, the effect of a shock of a significant event, and a long term trend. Finally, the EEMD is shown to be a vital technique for crude oil price analysis. (author)
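The sifting step at the heart of EMD, as described above, subtracts the mean of the upper and lower extremal envelopes until what remains oscillates about zero. A deliberately crude one-IMF sketch on synthetic "price" data (a fast oscillation plus a trend); full EMD adds a proper stopping criterion and repeats on the residual, and EEMD averages the result over many noise-perturbed copies of the input, neither of which is implemented here:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, n_iter=8):
    # Repeatedly subtract the mean of cubic-spline envelopes through the
    # local maxima and minima; the result approximates the first IMF.
    t = np.arange(len(x))
    h = x.astype(float).copy()
    for _ in range(n_iter):
        d = np.diff(h)
        maxima = np.where((d[:-1] > 0) & (d[1:] < 0))[0] + 1
        minima = np.where((d[:-1] < 0) & (d[1:] > 0))[0] + 1
        if len(maxima) < 4 or len(minima) < 4:
            break
        upper = CubicSpline(maxima, h[maxima])(t)
        lower = CubicSpline(minima, h[minima])(t)
        h -= (upper + lower) / 2
    return h

# Synthetic series: short-term fluctuation riding on a long-term trend.
t = np.linspace(0, 1, 2000)
fast = np.sin(2 * np.pi * 50 * t)
slow = 2 * t
x = fast + slow

imf1 = sift(x)
core = slice(200, -200)              # ignore spline edge effects
print(round(np.corrcoef(imf1[core], fast[core])[0, 1], 2))
```

The first extracted mode tracks the fast component, and the residual x − imf1 tracks the trend, which is exactly the fine-to-coarse separation the record describes.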

  6. Multivariate Empirical Mode Decomposition Based Signal Analysis and Efficient-Storage in Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Lu [University of Tennessee, Knoxville (UTK); Albright, Austin P [ORNL; Rahimpour, Alireza [University of Tennessee, Knoxville (UTK); Guo, Jiandong [University of Tennessee, Knoxville (UTK); Qi, Hairong [University of Tennessee, Knoxville (UTK); Liu, Yilu [University of Tennessee (UTK) and Oak Ridge National Laboratory (ORNL)

    2017-01-01

Wide-area measurement systems (WAMSs) are used in smart grid systems to enable the efficient monitoring of grid dynamics. However, the overwhelming amount of data and the severe contamination from noise often impede the effective and efficient analysis and storage of WAMS-generated measurements. To solve this problem, we propose a novel framework that takes advantage of Multivariate Empirical Mode Decomposition (MEMD), a fully data-driven approach to analyzing non-stationary signals, dubbed MEMD based Signal Analysis (MSA). The frequency measurements are considered as a linear superposition of different oscillatory components and noise. The low-frequency components, corresponding to the long-term trend and inter-area oscillations, are grouped and compressed by MSA using the mean shift clustering algorithm. Higher-frequency components, mostly noise and potentially part of high-frequency inter-area oscillations, are analyzed using Hilbert spectral analysis and delineated by their statistical behavior. By conducting experiments on both synthetic and real-world data, we show that the proposed framework can capture the characteristics, such as trends and inter-area oscillations, while reducing the data storage requirements.

  7. Business ethics and economic growth: An empirical analysis for Turkish economy

    Directory of Open Access Journals (Sweden)

    Ekrem Erdem

    2015-12-01

Full Text Available Purpose – The roots of the science of modern economics originate in the ideas of Adam Smith, who was not a pure economist but a moral philosopher. Basic concepts in the Wealth of Nations, which is perceived as the handbook of economics, depend on the arguments that Adam Smith suggests in his Theory of Moral Sentiments. In this theory, business ethics, as a part of the Law of Sympathy, appears as one of the factors that allow the invisible hand to operate properly. In light of this property, it is possible to regard business ethics as one of the components of the market mechanism. In this context, this study aims to analyse the link between business ethics and economic growth in the Turkish economy. Design/methodology/approach – The study employs bounced cheques and protested bonds as proxies for the degradation of business ethics and tries to show how this degradation affected economic growth in the Turkish economy over the period 1988-2013. Findings – Both illustrative and empirical results show that business ethics is an important determinant of economic growth in the Turkish economy and that damaging it negatively affects the growth rate of the economy. Research limitations/implications – One of the main limitations of the present empirical analysis is the lack of varied and longer data sets. Using different indicators of business ethics with a longer time span would definitely increase the reliability of the study. However, in the current form, the results imply that a policy capable of limiting failures of business ethics may boost the Turkish economy. Originality/value – The results tend to support the close link between business ethics and economic growth.

  8. Empirical Analysis for the Heat Exchange Effectiveness of a Thermoelectric Liquid Cooling and Heating Unit

    Directory of Open Access Journals (Sweden)

    Hansol Lim

    2018-03-01

    Full Text Available This study aims to estimate the performance of a thermoelectric module (TEM) heat pump for simultaneous liquid cooling and heating and to propose empirical models for predicting its heat exchange effectiveness. Experiments were conducted to collect performance data of the TEM heat pump with water as the working fluid. A total of 57 sets of experimental data were statistically analyzed to estimate the effects of each independent variable on the heat exchange effectiveness using analysis of variance (ANOVA). To develop the empirical model, six design parameters were measured: the number of transfer units (NTU) of the heat exchangers (i.e., water blocks), and the inlet water temperatures and temperatures of the water blocks at the cold and hot sides of the TEM. As a result, two polynomial equations predicting heat exchange effectiveness at the cold and hot sides of the TEM heat pump were derived as functions of the six selected design parameters. The proposed models and the theoretical model of a conventional condenser and evaporator were also compared against additional measurement data to validate the reliability of the proposed models. Two conclusions were drawn: (1) the feasibility of using the TEM heat pump for simultaneous cooling and heating was demonstrated, with a maximum temperature difference of 30 °C between the cold and hot sides of the TEM, and (2) the TEM heat pump differs from the conventional evaporator and condenser, as the comparison between the proposed and theoretical models reveals, owing to heat conduction and the Joule effect in the TEM.

  9. The impact of organizational factors on e-business adoption: An empirical analysis

    Directory of Open Access Journals (Sweden)

    Marta García-Moreno

    2018-06-01

    Full Text Available Purpose: To provide empirical validation of the model developed by García Moreno et al. (2016) on the factors influencing the adoption of e-business in firms. Design/methodology/approach: Consideration is given to the method for measuring each of the variables included in the model. Use has been made of the e-Business Watch database, which contains measures for the theoretical model’s three categories: firm, technology, and environment. Multinomial logistic regression models are estimated. Findings: The variables included reveal statistically significant relationships for the model in question, although the intensity of the relationships differs. The variables related to the environment also reveal statistically significant relationships, whereby the attitude of trading partners appears to have a relevant and growing impact on e-business adoption. Research limitations/implications: Data come from just one database, the e-Business Watch database; enriched data from alternative databases could be included. Practical implications: The infrastructure of information and communications technologies (ICTs) is confirmed to be a determining factor in e-business development. Nevertheless, competitor rivalry has a more erratic influence, encapsulated in a significant relationship in intermediate models, with a sharper increase in the likelihood of being in the category of customer-focused, rather than internally focused, firms. Social implications: The human capital linked to ICTs is a driving force behind the adoption of these practices. Albeit with a more moderate effect, note should also be taken of the capacity for entering into relationships with third parties within the scope of ICTs, with significant effects that become more robust when tested in models that seek to explain the probability of recording higher levels of e-business adoption. Originality/value: The article presents a first empirical analysis to

  10. Application of a latent class analysis to empirically define eating disorder phenotypes.

    Science.gov (United States)

    Keel, Pamela K; Fichter, Manfred; Quadflieg, Norbert; Bulik, Cynthia M; Baxter, Mark G; Thornton, Laura; Halmi, Katherine A; Kaplan, Allan S; Strober, Michael; Woodside, D Blake; Crow, Scott J; Mitchell, James E; Rotondo, Alessandro; Mauri, Mauro; Cassano, Giovanni; Treasure, Janet; Goldman, David; Berrettini, Wade H; Kaye, Walter H

    2004-02-01

    Diagnostic criteria for eating disorders influence how we recognize, research, and treat eating disorders, and empirically valid phenotypes are required for revealing their genetic bases. Objective: to empirically define eating disorder phenotypes. Data regarding eating disorder symptoms and features from 1179 individuals with clinically significant eating disorders were submitted to a latent class analysis, and the resulting latent classes were compared on non-eating disorder variables in a series of validation analyses. Design: multinational, collaborative study with cases ascertained through diverse clinical settings (inpatient, outpatient, and community). Participants: members of affected relative pairs recruited for participation in genetic studies of eating disorders in which probands met DSM-IV-TR criteria for anorexia nervosa (AN) or bulimia nervosa and had at least 1 biological relative with a clinically significant eating disorder. Main outcome measure: number and clinical characterization of latent classes. A 4-class solution provided the best fit. Latent class 1 (LC1) resembled restricting AN; LC2, AN and bulimia nervosa with the use of multiple methods of purging; LC3, restricting AN without obsessive-compulsive features; and LC4, bulimia nervosa with self-induced vomiting as the sole form of purging. Biological relatives were significantly likely to belong to the same latent class. Across validation analyses, LC2 demonstrated the highest levels of psychological disturbance and LC3 the lowest. The presence of obsessive-compulsive features differentiates among individuals with restricting AN. Similarly, the combination of low weight and multiple methods of purging distinguishes among individuals with binge eating and purging behaviors. These results support some of the distinctions drawn within the DSM-IV-TR among eating disorder subtypes, while introducing new features to define phenotypes.

  11. Humic Substances from Manila Bay and Bolinao Bay Sediments

    Directory of Open Access Journals (Sweden)

    Elma Llaguno

    1997-12-01

    Full Text Available The C, H, N composition of sedimentary humic acids (HA) extracted from three sites in Manila Bay and six sites in Bolinao Bay yielded H/C atomic ratios of 1.1-1.4 and N/C atomic ratios of 0.09-0.16. The Manila Bay HAs had lower H/C and N/C ratios than those from Bolinao Bay. The IR spectra showed prominent aliphatic C-H and amide I and II bands. The Manila Bay HAs also had a less diverse molecular composition, based on GC-MS analysis of the CuO and alkaline permanganate oxidation products of the humic acids.

  12. Why do electricity utilities cooperate with coal suppliers? A theoretical and empirical analysis from China

    International Nuclear Information System (INIS)

    Zhao Xiaoli; Lyon, Thomas P.; Wang Feng; Song Cui

    2012-01-01

    The asymmetry of Chinese coal and electricity pricing reforms leads to serious conflict between coal suppliers and electricity utilities. Electricity utilities experience significant losses as a result of conflict: severe coal price fluctuations, and uncertainty in the quantity and quality of coal supplies. This paper explores whether establishing cooperative relationships between coal suppliers and electricity utilities can resolve conflicts. We begin with a discussion of the history of coal and electricity pricing reforms, and then conduct a theoretical analysis of relational contracting to provide a new perspective on the drivers behind the establishment of cooperative relationships between the two parties. Finally, we empirically investigate the role of cooperative relationships and the establishment of mine-mouth power plants on the performance of electricity utilities. The results show that relational contracting between electricity utilities and coal suppliers improves the market performance of electricity utilities; meanwhile, the transportation cost savings derived from mine-mouth power plants are of importance in improving the performance of electricity utilities. - Highlights: ► We discuss the history of coal and electricity pricing reforms. ► The roots of conflicts between electricity and coal firms are presented. ► We conduct a theoretical analysis of relational contracting. ► The role of mine-mouth power plants on the performance of power firms is examined.

  13. Promoting Sustainability Transparency in European Local Governments: An Empirical Analysis Based on Administrative Cultures

    Directory of Open Access Journals (Sweden)

    Andrés Navarro-Galera

    2017-03-01

    Full Text Available Nowadays, the transparency of governments with respect to the sustainability of public services is an issue of great interest to stakeholders and academics. It has led previous research and international organisations (EU, IMF, OECD, United Nations, IFAC, G-20, World Bank) to recommend promoting the online dissemination of economic, social and environmental information. Based on previous studies of e-government and the influence of administrative cultures on governmental accountability, this paper seeks to identify political actions useful for improving practices of transparency on economic, social and environmental sustainability in European local governments. We perform a comparative analysis of sustainability information published on the websites of 72 local governments in 10 European countries grouped into three main cultural contexts (Anglo-Saxon, Southern European and Nordic). Using international sustainability reporting guidelines, our results reveal significant differences in local government transparency across contexts. The most transparent local governments are the Anglo-Saxon ones, followed by the Southern European and Nordic governments. Based on individualized empirical results for each administrative style, our conclusions propose useful policy interventions to enhance sustainability transparency within each cultural tradition, such as the development of legal rules on transparency and sustainability, tools to motivate local managers to disseminate sustainability information online, and analysis of the information needs of stakeholders.

  14. Combination of canonical correlation analysis and empirical mode decomposition applied to denoising the labor electrohysterogram.

    Science.gov (United States)

    Hassan, Mahmoud; Boudaoud, Sofiane; Terrien, Jérémy; Karlsson, Brynjar; Marque, Catherine

    2011-09-01

    The electrohysterogram (EHG) is often corrupted by electronic and electromagnetic noise as well as movement artifacts, skeletal electromyogram, and ECGs from both mother and fetus. The interfering signals are sporadic and/or have spectra overlapping the spectra of the signals of interest, rendering classical filtering ineffective. In the absence of efficient methods for denoising the monopolar EHG signal, bipolar methods are usually used. In this paper, we propose a novel combination of blind source separation using canonical correlation analysis (BSS_CCA) and empirical mode decomposition (EMD) to denoise the monopolar EHG. We first extract the uterine bursts using BSS_CCA, then remove the largest part of any residual noise from the bursts by EMD. Our algorithm, called CCA_EMD, was compared with wavelet filtering and independent component analysis. We also compared CCA_EMD with the corresponding bipolar signals to demonstrate that the method does not degrade the signals. The proposed method successfully removed artifacts from the signal without altering the underlying uterine activity as observed by bipolar methods. The CCA_EMD algorithm performed considerably better than the comparison methods.
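The BSS_CCA stage described above can be illustrated by computing canonical correlations between a multichannel recording and a delayed copy of itself, which orders the extracted components by temporal autocorrelation. A minimal numpy sketch under that assumption (the synthetic signals are invented stand-ins, not EHG data):

```python
import numpy as np

def _inv_sqrt(S, eps=1e-10):
    # Inverse matrix square root of a symmetric PSD matrix via eigendecomposition.
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(np.maximum(w, eps))) @ V.T

def cca_bss(X, lag=1):
    # Blind source separation by CCA between the signal and a delayed copy:
    # components come out ordered by temporal autocorrelation, most
    # structured first.
    A = X[:-lag] - X[:-lag].mean(axis=0)
    B = X[lag:] - X[lag:].mean(axis=0)
    n = len(A)
    Wa = _inv_sqrt(A.T @ A / n)
    M = Wa @ (A.T @ B / n) @ _inv_sqrt(B.T @ B / n)
    U, rho, _ = np.linalg.svd(M)
    return A @ (Wa @ U), rho  # canonical components, canonical correlations

t = np.linspace(0, 1, 2000)
rng = np.random.default_rng(1)
slow = np.sin(2 * np.pi * 3 * t)        # structured "burst" stand-in
noise = rng.standard_normal(t.size)     # broadband artifact stand-in
X = np.column_stack([slow + 0.5 * noise, 0.7 * slow - 0.3 * noise])
S, rho = cca_bss(X)
```

The most autocorrelated component (here the slow oscillation) comes out first, mirroring how BSS_CCA separates structured uterine bursts from broadband interference.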

  15. Denoising of chaotic signal using independent component analysis and empirical mode decomposition with circulate translating

    Science.gov (United States)

    Wen-Bo, Wang; Xiao-Dong, Zhang; Yuchan, Chang; Xiang-Li, Wang; Zhao, Wang; Xi, Chen; Lei, Zheng

    2016-01-01

    In this paper, a new method to reduce noise in chaotic signals based on ICA (independent component analysis) and EMD (empirical mode decomposition) is proposed. The basic idea is to first decompose the chaotic signals and construct multidimensional input vectors, on the basis of EMD and its translation invariance; second, to perform independent component analysis on the input vectors, which amounts to a self-adapting denoising of the intrinsic mode functions (IMFs) of the chaotic signals; and finally, to compose the new denoised chaotic signal from all the IMFs. Experiments were carried out on the Lorenz chaotic signal contaminated with different Gaussian noises and on the monthly observed chaotic sunspot sequence. The results show that the proposed method is effective in denoising chaotic signals. Moreover, it can correct the center point in the phase space effectively, bringing it closer to the real track of the chaotic attractor. Project supported by the National Science and Technology, China (Grant No. 2012BAJ15B04), the National Natural Science Foundation of China (Grant Nos. 41071270 and 61473213), the Natural Science Foundation of Hubei Province, China (Grant No. 2015CFB424), the State Key Laboratory Foundation of Satellite Ocean Environment Dynamics, China (Grant No. SOED1405), the Hubei Provincial Key Laboratory Foundation of Metallurgical Industry Process System Science, China (Grant No. Z201303), and the Hubei Key Laboratory Foundation of Transportation Internet of Things, Wuhan University of Technology, China (Grant No. 2015III015-B02).
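A minimal sketch of the EMD stage such methods rely on, using linear envelope interpolation in place of the cubic splines of standard EMD (an approximation for illustration, not the authors' code). By construction, the extracted IMFs plus the final residue sum back to the input exactly:

```python
import numpy as np

def _extrema(x):
    # Indices of interior local maxima and minima of a 1-D signal.
    d = np.diff(x)
    maxima = np.where((d[:-1] > 0) & (d[1:] < 0))[0] + 1
    minima = np.where((d[:-1] < 0) & (d[1:] > 0))[0] + 1
    return maxima, minima

def emd(x, max_imfs=6, n_sift=12):
    # Simplified EMD: repeatedly sift off the fastest local oscillation
    # (an IMF) by subtracting the mean of linearly interpolated envelopes.
    t = np.arange(len(x))
    imfs, r = [], x.astype(float)
    for _ in range(max_imfs):
        mx, mn = _extrema(r)
        if len(mx) < 2 or len(mn) < 2:
            break  # residue is monotonic-ish: stop
        h = r.copy()
        for _ in range(n_sift):
            mx, mn = _extrema(h)
            if len(mx) < 2 or len(mn) < 2:
                break
            mean_env = (np.interp(t, mx, h[mx]) + np.interp(t, mn, h[mn])) / 2
            h = h - mean_env
        imfs.append(h)
        r = r - h
    return imfs, r

t = np.linspace(0, 1, 1024)
x = np.sin(2 * np.pi * 2 * t) + 0.4 * np.sin(2 * np.pi * 40 * t)
imfs, residue = emd(x)
```

Denoising schemes like the one above then process or discard selected IMFs before summing the decomposition back together.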

  16. Analysis of empirical determinants of credit risk in the banking sector of the Republic of Serbia

    Directory of Open Access Journals (Sweden)

    Račić Željko

    2016-01-01

    Full Text Available The aim of this paper is the detection and analysis of empirical determinants of credit risk in the banking sector of the Republic of Serbia. The paper is based on an analysis of the results of applying a linear regression model over the period from the third quarter of 2008 to the third quarter of 2014. There are three main findings. Firstly, higher lending activity of banks contributes to an increasing share of high-risk loans in total loans (with a delayed effect of 3 years). Secondly, the growth of loans relative to deposits increases the exposure of banks to credit risk. Thirdly, the factors that reduce the exposure of banks to credit risk are increased profitability, growth of the interest rate spread and real GDP growth. Bearing in mind the overall market conditions and the dynamics of the country's economic recovery, the general conclusion based on these results is that in the coming period the question of non-performing loans (NPLs) in the Republic of Serbia will present a challenge for both lenders and borrowers.

  17. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    Science.gov (United States)

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  18. Empirical Hamiltonians

    International Nuclear Information System (INIS)

    Peggs, S.; Talman, R.

    1987-01-01

    As proton accelerators get larger, and include more magnets, the conventional tracking programs which simulate them run slower. The purpose of this paper is to describe a method, still under development, in which element-by-element tracking around one turn is replaced by a single map, which can be processed far faster. It is assumed for this method that a conventional program exists which can perform faithful tracking in the lattice under study for some hundreds of turns, with all lattice parameters held constant. An empirical map is then generated by comparison with the tracking program. A procedure has been outlined for determining an empirical Hamiltonian, which can represent motion through many nonlinear kicks, by taking data from a conventional tracking program. Though derived by an approximate method, this Hamiltonian is analytic in form and can be subjected to further analysis of varying degrees of mathematical rigor. Even though the empirical procedure has only been described in one transverse dimension, there is good reason to hope that it can be extended to include two transverse dimensions, so that it can become a practical tool in realistic cases.

  19. Modeling ionospheric foF2 by using empirical orthogonal function analysis

    Directory of Open Access Journals (Sweden)

    E. A

    2011-08-01

    Full Text Available A similar-parameters interpolation method and an empirical orthogonal function (EOF) analysis are used to construct empirical models for the ionospheric foF2, using observational data from three ground-based ionosonde stations in Japan: Wakkanai (geographic 45.4° N, 141.7° E), Kokubunji (geographic 35.7° N, 140.1° E) and Yamagawa (geographic 31.2° N, 130.6° E), during the years 1971–1987. The impact of different drivers on ionospheric foF2 can be well indicated by choosing appropriate proxies. It is shown that missing data in the original foF2 can be optimally refilled using the similar-parameters method. The characteristics of the base functions and associated coefficients of the EOF model are analyzed. The diurnal variation of the base functions reflects the essential nature of ionospheric foF2, while the coefficients represent the long-term alteration tendency. The 1st-order EOF coefficient A1 reflects the components with solar cycle variation. A1 also contains an evident semi-annual variation component as well as a relatively weak annual fluctuation component, neither of which is as pronounced as the solar cycle variation. The 2nd-order coefficient A2 contains mainly annual variation components. The 3rd-order coefficient A3 and 4th-order coefficient A4 contain both annual and semi-annual variation components. The seasonal variation, solar rotation oscillation and small-scale irregularities are also included in the 4th-order coefficient A4. The amplitude range and developing tendency of all these coefficients depend on the level of solar and geomagnetic activity. The reliability and validity of the EOF model are verified by comparison with observational data and with the International Reference Ionosphere (IRI). The agreement between observations and the EOF model is quite good, indicating that the EOF model can reflect the major changes and the temporal distribution characteristics of the mid-latitude ionosphere of the
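The EOF construction described above, fixed diurnal base functions multiplied by time-varying coefficients, amounts to an SVD of the mean-removed data matrix. A toy illustration on a synthetic foF2-like matrix (the data, shapes and variable names are invented for illustration, not the stations' records):

```python
import numpy as np

# Hypothetical foF2 data matrix: rows = days, columns = local-time hours.
rng = np.random.default_rng(0)
hours = np.arange(24)
days = np.arange(365)
diurnal = np.sin(2 * np.pi * hours / 24)             # diurnal base shape
annual = 1.0 + 0.3 * np.sin(2 * np.pi * days / 365)  # annual modulation
F = np.outer(annual, diurnal) + 0.01 * rng.standard_normal((365, 24))

# EOF decomposition: SVD of the mean-removed data matrix.
mean = F.mean(axis=0)
U, s, Vt = np.linalg.svd(F - mean, full_matrices=False)
base_functions = Vt            # E_k: fixed diurnal patterns
coefficients = U * s           # A_k(day): time-varying amplitudes
var_explained = s**2 / np.sum(s**2)
```

The first coefficient series tracks the imposed annual modulation, analogous to how the leading coefficients in the record above track solar-cycle and seasonal variation.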

  20. Research on Browsing Behavior in the Libraries: An Empirical Analysis of Consequences, Success and Influences

    Directory of Open Access Journals (Sweden)

    Shan-Ju L. Chang

    2000-12-01

    Full Text Available Browsing, as an important part of human information behavior, has been observed and investigated in the context of information seeking in the library in general, and has assumed greater importance in human-machine interaction in particular. However, the nature and consequences of browsing are not well understood, and little is known of the success rate of such behavior. In this research, exploratory empirical case studies from three types of libraries were conducted, using questionnaires, observation logs, interviews, and computer search logs, to derive empirical evidence on, from the user point of view, the consequences of browsing, what constitutes successful browsing, and what factors influence the extent of browsing. Content analysis and statistical analysis were conducted to analyze and synthesize the data. The research results show: (1) There are nine categories of consequences of browsing, including accidental findings, modification of information need, finding the desired information, learning, relaxation/recreation, information gathering, keeping updated, satisfying curiosity, and not finding what is needed. (2) Four factors produce successful browsing: intention, the amount or quality of information, the utility of what is found, and help with solving a problem or making a judgment. (3) There are three types of reasons for an unsuccessful browsing experience: not finding what one wanted, inadequate volume or quality of information, and not finding anything useful or interesting. (4) There are three types of reasons for partial success: finding the intended object but being unhappy with the quality or amount of information in it; not finding what one wanted but discovering new or potentially useful information; and not accomplishing one purpose but achieving another, given multiple purposes. (5) The influential factors that affect the extent to which one engages in browsing include the browser’s time, the scheme of information organization, and proximity to

  1. Mex Bay

    African Journals Online (AJOL)

    user

    2015-02-23

    Surveys to assess the vulnerability of the most important physical and eutrophication parameters along the El-Mex Bay coast. As a result of increasing population and industrial development, poorly treated industrial waste, domestic sewage, the shipping industry and agricultural runoff are being released to the.

  2. 6 essays about auctions: a theoretical and empirical analysis. Application to power markets

    International Nuclear Information System (INIS)

    Lamy, L.

    2007-06-01

    This thesis is devoted to a theoretical and empirical analysis of auction mechanisms. Motivated by allocation issues in network industries, in particular by the liberalization of the electricity sector, it focuses on auctions with externalities (either allocative or informational) and on multi-object auctions. After an introduction which provides a survey of the use and analysis of auctions in power markets, six chapters make up this thesis. The first considers standard auctions in Milgrom-Weber's model with interdependent valuations when the seller cannot commit not to participate in the auction. The second and third chapters study the combinatorial auction mechanism proposed by Ausubel and Milgrom. The first of these two studies proposes a modification of this format with a final discount stage and clarifies the theoretical status of those formats, in particular the conditions under which truthful reporting is a dominant strategy. Motivated by the robustness issues of the generalizations of the Ausubel-Milgrom and Vickrey combinatorial auctions to environments with allocative externalities between joint purchasers, the second characterizes the buyer-submodularity condition in a general model with allocative identity-dependent externalities between purchasers. In a complete-information setup, the fourth chapter analyses the optimal design problem when the commitment abilities of the principal are reduced, namely when she cannot commit to a simultaneous participation game. The fifth chapter is devoted to the structural analysis of the private value auction model for a single unit when the econometrician cannot observe bidders' identities. The asymmetric independent private value (IPV) model is identified. A multi-step kernel-based estimator is proposed and shown to be asymptotically optimal. Using auction data for the Anglo-French electricity Interconnector, the last chapter analyses multi-unit ascending auctions through reduced forms. (author)

  3. Motivational factors influencing the homeowners’ decisions between residential heating systems: An empirical analysis for Germany

    International Nuclear Information System (INIS)

    Michelsen, Carl Christian; Madlener, Reinhard

    2013-01-01

    Heating demand accounts for a large fraction of the overall energy demand of private households in Germany. A better understanding of the adoption and diffusion of energy-efficient and renewables-based residential heating systems (RHS) is of high policy relevance, particularly against the background of climate change, security of energy supply and increasing energy prices. In this paper, we explore the multi-dimensionality of the homeowners’ motivation to decide between competing RHS. A questionnaire survey (N=2440) conducted in 2010 among homeowners who had recently installed a RHS provides the empirical foundation. Principal component analysis shows that 25 items capturing different adoption motivations can be grouped around six dimensions: (1) cost aspects, (2) general attitude towards the RHS, (3) government grant, (4) reactions to external threats (i.e., environmental or energy supply security considerations), (5) comfort considerations, and (6) influence of peers. Moreover, a cluster analysis with the identified motivational factors as segmentation variables reveals three adopter types: (1) the convenience-oriented, (2) the consequences-aware, and (3) the multilaterally-motivated RHS adopter. Finally, we show that the influence of the motivational factors on the adoption decision also differs by certain characteristics of the homeowner and features of the home. - Highlights: ► Study of the multi-dimensionality of the motivation to adopt residential heating systems (RHS). ► Principal component and cluster analysis are applied to representative survey data for Germany. ► Motivation has six dimensions, including rational decision-making and emotional factors. ► Adoption motivation differs by certain characteristics of the homeowner and of the home. ► Many adopters are driven by existing habits and perceptions about the convenience of the RHS

  4. Liquidity Risk Management: An Empirical Analysis on Panel Data Analysis and ISE Banking Sector

    OpenAIRE

    Sibel ÇELİK; Yasemin Deniz AKARIM

    2012-01-01

    In this paper, we test the factors affecting liquidity risk management in the banking sector in Turkey by using panel regression analysis. We use data for 9 commercial banks traded on the Istanbul Stock Exchange for the period 1998-2008. In conclusion, we find that the risky liquid assets and return on equity variables are negatively related to liquidity risk, whereas the external financing and return on assets variables are positively related to liquidity risk. This finding is of importance for banks s...

  5. On the relationship between fiscal plans in the European Union: An empirical analysis based on real-time data

    NARCIS (Netherlands)

    Giuliodori, M.; Beetsma, R.M.W.J.

    2007-01-01

    We investigate the interdependence of fiscal policies, and in particular deficits, in the European Union using an empirical analysis based on real-time fiscal data. There are many potential reasons why fiscal policies could be interdependent, such as direct externalities due to cross-border public

  6. On the relationship between fiscal plans in the European Union: an empirical analysis based on real-time data

    NARCIS (Netherlands)

    Giuliodori, M.; Beetsma, R.

    2008-01-01

    We investigate the interdependence of fiscal policies, and in particular deficits, in the European Union using an empirical analysis based on real-time fiscal data. There are many potential reasons why fiscal policies could be interdependent, such as direct externalities due to cross-border public

  7. Antecedents and Consequences of Individual Performance Analysis of Turnover Intention Model (Empirical Study of Public Accountants in Indonesia)

    OpenAIRE

    Raza, Hendra; Maksum, Azhar; Erlina; Lumban Raja, Prihatin

    2014-01-01

    This study aims to examine empirically the antecedents of individual performance and its consequences for turnover intention in public accounting firms. Eight variables are measured, consisting of auditors' empowerment, innovation, professionalism, role ambiguity, role conflict, organizational commitment, individual performance and turnover intention. Data analysis is based on 163 public accountants using Structural Equation Modeling assisted with an appli...

  8. Analysis of whistles produced by the Tucuxi Dolphin Sotalia fluviatilis from Sepetiba Bay, Brazil

    Directory of Open Access Journals (Sweden)

    Erber Claudia

    2004-01-01

    Full Text Available From July 2001 to June 2002, we recorded a total of 2 h 55 min of Tucuxi Dolphin Sotalia fluviatilis vocalizations from Sepetiba Bay, Brazil (22º35'S, 44º03'W). A total of 3350 whistles were analyzed quantitatively and qualitatively and were divided into 124 types by visual inspection of sonograms. The following parameters were measured: initial frequency, final frequency, minimum frequency, maximum frequency, duration, number of inflections, frequency at the inflection points, frequency at 1/2, 1/4, and 3/4 of whistle duration, and presence of frequency modulation and harmonics. Ascending whistles (N=2719) were most common, representing 82% of the total. Dolphin behavior and the average group size observed during recording influenced the whistles' quantitative and qualitative parameters. The results demonstrate the great diversity of whistles emitted and indicate a functional role for these vocalizations during the observed behaviors.

  9. Sentiment analysis system for movie review in Bahasa Indonesia using naive bayes classifier method

    Science.gov (United States)

    Nurdiansyah, Yanuar; Bukhori, Saiful; Hidayat, Rahmad

    2018-04-01

    There are many ways to make use of the sentiments often found in documents, one of which is analyzing the sentiments found in product or service reviews. It is therefore important to be able to process and extract textual data from such documents. We propose a system that classifies sentiments in review documents into two classes: positive and negative. The system we build uses the Naive Bayes Classifier method for document classification. We chose Movienthusiast, a website of movie reviews in Bahasa Indonesia, as the source of our review documents. From it, we collected 1201 movie reviews: 783 positive and 418 negative, which we use as the dataset for this machine learning classifier. The classification accuracy averages 88.37% over five accuracy-measurement runs on the aforementioned dataset.
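A minimal multinomial Naive Bayes classifier with Laplace smoothing, of the kind the record describes; the tiny Bahasa Indonesia reviews below are invented stand-ins for the Movienthusiast dataset:

```python
import math
from collections import Counter

class NaiveBayes:
    # Multinomial Naive Bayes with add-one (Laplace) smoothing.
    def fit(self, docs, labels):
        self.classes = sorted(set(labels))
        self.priors = {c: math.log(labels.count(c) / len(labels))
                       for c in self.classes}
        self.counts = {c: Counter() for c in self.classes}
        for doc, lab in zip(docs, labels):
            self.counts[lab].update(doc.split())
        self.vocab = set(w for c in self.classes for w in self.counts[c])
        self.totals = {c: sum(self.counts[c].values()) for c in self.classes}

    def predict(self, doc):
        v = len(self.vocab)

        def log_posterior(c):
            # log P(c) + sum of smoothed log P(word | c) over the document.
            return self.priors[c] + sum(
                math.log((self.counts[c][w] + 1) / (self.totals[c] + v))
                for w in doc.split())

        return max(self.classes, key=log_posterior)

# Invented toy reviews ("bagus" = good, "buruk" = bad).
train = ["film bagus sekali", "cerita menarik dan bagus",
         "film buruk", "alur membosankan dan buruk"]
y = ["pos", "pos", "neg", "neg"]
nb = NaiveBayes()
nb.fit(train, y)
```

Log-probabilities avoid floating-point underflow on longer documents, and add-one smoothing handles words unseen in a class.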

  10. Cause analysis and preventives for human error events in Daya Bay NPP

    International Nuclear Information System (INIS)

    Huang Weigang; Zhang Li

    1998-01-01

    Daya Bay Nuclear Power Plant was put into commercial operation in 1994. Up to 1996, there were 368 human error events in the operating and maintenance area, accounting for 39% of total events. These events occurred mainly in the processes of maintenance, testing, equipment isolation and bringing systems on-line, in particular during refuelling and maintenance. The author analyses the root causes of human error events, which are mainly: operator omission or error; procedure deficiency; procedure not followed; lack of training; communication failures; and inadequate work management. The protective measures and treatment principles for human error events are also discussed, and several examples of their application are given. Finally, it is argued that the key to preventing human error events lies in coordination and management, the person in charge of the work, and the good work habits of staff.

  11. Characterizing Social Interaction in Tobacco-Oriented Social Networks: An Empirical Analysis.

    Science.gov (United States)

    Liang, Yunji; Zheng, Xiaolong; Zeng, Daniel Dajun; Zhou, Xingshe; Leischow, Scott James; Chung, Wingyan

    2015-06-19

    Social media is becoming a new battlefield in the tobacco "wars". Evaluating the current situation is crucial for the advocacy of tobacco control in the age of social media. To reveal the impact of tobacco-related user-generated content, this paper characterizes user interaction and social influence using social network analysis and information-theoretic approaches. Our empirical studies demonstrate that the exploding volume of pro-tobacco content has long-lasting effects, with more active users and broader influence, and reveal a shortage of social media resources devoted to global tobacco control. We find that user interaction in the pro-tobacco group is more active and that user-generated content promoting tobacco is more successful in attracting user attention. Furthermore, we construct three tobacco-related social networks and investigate their topological patterns. The size of the pro-tobacco network overwhelms the others, which suggests that a huge number of users are exposed to pro-tobacco content. These results indicate that the gap between tobacco promotion and tobacco control is widening and that tobacco control may be losing ground to tobacco promotion in social media.
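
    The information-theoretic side of such an analysis often reduces to comparing how evenly attention is distributed across users or posts; a small sketch using Shannon entropy, with made-up interaction counts rather than the paper's data:

```python
import math

def shannon_entropy(counts):
    """Entropy (bits) of an interaction-count distribution; higher values
    mean attention is spread more evenly across items."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical retweet counts per post in two groups.
pro_tobacco = [50, 45, 40, 38, 35]   # attention spread across many posts
control = [120, 3, 2, 1, 1]          # attention concentrated on one post
print(shannon_entropy(pro_tobacco) > shannon_entropy(control))  # True
```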

  12. An empirical analysis of the impact of renewable energy deployment on local sustainability

    Energy Technology Data Exchange (ETDEWEB)

    Del Rio, Pablo [Institute for Public Goods and Policies (IPP), Centro de Ciencias Humanas y Sociales, Consejo Superior de Investigaciones Cientificas (CSIC), C/Albasanz 26-28, 28037 Madrid (Spain); Burguillo, Mercedes [Facultad de Ciencias Economicas y Empresariales, Universidad de Alcala, Pza. de la Victoria 3, 28802 Alcala de Henares, Madrid (Spain)

    2009-08-15

    It is usually mentioned that renewable energy sources (RES) have a large potential to contribute to the sustainable development of specific territories by providing them with a wide variety of socioeconomic benefits, including diversification of energy supply, enhanced regional and rural development opportunities, creation of a domestic industry and employment opportunities. The analysis of these benefits has usually been too general (i.e., mostly at the national level) and a focus on the regional and especially the local level has been lacking. This paper empirically analyses those benefits, by applying a conceptual and methodological framework previously developed by the authors to three renewable energy technologies in three different places in Spain. With the help of case studies, the paper shows that the contribution of RES to the economic and social dimensions of sustainable development might be significant. Particularly important is employment creation in these areas. Although, in absolute terms, the number of jobs created may not be high, it may be so with respect to the existing jobs in the areas considered. Socioeconomic benefits depend on several factors, and not only on the type of renewable energy, as has usually been mentioned. The specific socioeconomic features of the territories, including the productive structure of the area, the relationships between the stakeholders and the involvement of the local actors in the renewable energy project may play a relevant role in this regard. Furthermore, other local (socioeconomic) sustainability aspects beyond employment creation should be considered. (author)

  13. Dynamic analysis of interhospital collaboration and competition: empirical evidence from an Italian regional health system.

    Science.gov (United States)

    Mascia, Daniele; Di Vincenzo, Fausto; Cicchetti, Americo

    2012-05-01

    Policymakers stimulate competition in universalistic health-care systems while encouraging the formation of service provision networks among hospital organizations. This article addresses a gap in the extant literature by empirically analyzing simultaneous collaboration and competition between hospitals within the Italian National Health Service, where important procompetition reforms have been implemented. To explore how rising competition between hospitals relates to their propensity to collaborate with other local providers. Longitudinal data on interhospital collaboration and competition collected in an Italian region from 2003 to 2007 are analyzed. Social network analysis techniques are applied to study the structure and dynamics of interhospital collaboration. Negative binomial regressions are employed to explore how interhospital competition relates to the collaborative network over time. Competition among providers does not hinder interhospital collaboration. Collaboration is primarily local, with resource complementarity and differentials in the volume of activity and hospital performance explaining the propensity to collaborate. Formation of collaborative networks among hospitals is not hampered by reforms aimed at fostering market forces. Because procompetition reforms elicit peculiar forms of managed competition in universalistic health systems, studies are needed to clarify whether the positive association between interhospital competition and collaboration can be generalized to other health-care settings. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  14. Review of US ESCO industry market trends: an empirical analysis of project data

    Energy Technology Data Exchange (ETDEWEB)

    Goldman, C.A.; Hopper, N.C.; Osborn, J.G. [Lawrence Berkeley National Laboratory, Berkeley, CA (United States). Energy Analysis

    2005-02-01

    This comprehensive empirical analysis of US energy service company (ESCO) industry trends and performance employs two parallel analytical approaches: a survey of firms to estimate total industry size, and a database of ~1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m2/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time (SPT) is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling 'energy solutions', with energy efficiency part of a package. We conclude that appropriate policy support - both financial and non-financial - can 'jump-start' a viable private-sector energy-efficiency services industry that targets large institutional and commercial/industrial customers. (author)
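
    The two customer-economics metrics reported here, simple payback time and benefit/cost ratio, are straightforward to compute; a sketch with illustrative numbers that are not drawn from the project database:

```python
def simple_payback_years(installed_cost, annual_savings):
    """Years for undiscounted annual savings to repay the installed cost."""
    return installed_cost / annual_savings

def benefit_cost_ratio(annual_savings, years, discount_rate, installed_cost):
    """Present value of savings over the project life divided by its cost."""
    pv = sum(annual_savings / (1 + discount_rate) ** t for t in range(1, years + 1))
    return pv / installed_cost

# Illustrative institutional project: $700k cost, $100k/year savings,
# 15-year life, 5% discount rate (numbers invented for the example).
print(simple_payback_years(700_000, 100_000))  # 7.0
print(round(benefit_cost_ratio(100_000, 15, 0.05, 700_000), 2))
```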

  15. The Determinants of the Global Mobile Telephone Deployment: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Sheikh Taher ABU

    2010-01-01

    Full Text Available This study analyzes the global diffusion of mobile phones by examining the instruments stimulating the diffusion pattern. A rigorous demand model is estimated using a global mobile telecommunications panel dataset comprising 51 countries, classified according to World Bank income categories, from 1990-2007. In particular, the paper examines which factors contribute the most to the deployment of mobile telephones worldwide. To construct an econometric model, the number of mobile phone subscribers per 100 inhabitants is taken as the dependent variable, while the following groups of variables are selected as independent variables: (1) GDP per capita income and charges, (2) competition policies, (3) telecom infrastructure, (4) technological innovations, and (5) others. Estimation results reveal substantial disparity among the income groups. Additionally, GDP per capita income elasticity and own-price elasticities with respect to call rates and subscription charges are reported. The analysis of impulse responses for price, competition policies, and technological innovations such as digitalization of the mobile network and mobile network coverage indicates that substantial mobile telephone growth is yet to be realized, especially in developing countries. A new and important empirical finding is that many opportunities remain for mobile phone development in the world's poorer nations through the provision of better telecom infrastructure.
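
    Own-price elasticities of the kind reported are conventionally estimated as the price coefficient in a log-log demand regression; a synthetic-data sketch (the variable names and the true elasticity of -0.6 are invented for illustration, and this is not the paper's panel specification):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic cross-section: subscribers per 100 inhabitants vs. call price,
# generated with a known own-price elasticity of -0.6.
price = rng.uniform(0.1, 2.0, size=500)
gdp = rng.uniform(1_000, 40_000, size=500)
subs = 5.0 * price ** -0.6 * gdp ** 0.3 * np.exp(rng.normal(0, 0.05, 500))

# Log-log OLS: ln(subs) = b0 + b1*ln(price) + b2*ln(gdp);
# b1 is the own-price elasticity.
X = np.column_stack([np.ones(500), np.log(price), np.log(gdp)])
beta, *_ = np.linalg.lstsq(X, np.log(subs), rcond=None)
print(round(beta[1], 2))  # close to the true value of -0.6
```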

  16. Operational Practices and Financial Performance: an Empirical Analysis of Brazilian Manufacturing Companies

    Directory of Open Access Journals (Sweden)

    André Luís de Castro Moura Duarte

    2011-10-01

    Full Text Available In the operations management field, operational practices like total quality management or just in time have been seen as a way to improve operational performance and, ultimately, financial performance. Empirical support for this effect of operational practices on financial performance has, however, been limited, due to research design issues and the inherent difficulty of using performance as a dependent variable. In this paper, we tested the relationship between selected operational practices (quality management, just in time, ISO certification and services outsourcing) and the financial performance outcomes of profitability and growth. A sample of 1200 firms operating in São Paulo, Brazil, was used. Analysis using multiple regression explored the direct effect of practices and their interaction with industry dummies. Results did not support the existence of a positive relationship with financial performance. A negative relationship of outsourcing with both profitability and growth was found, supporting some critical views of the outsourcing practice. A weaker negative relationship between ISO certification and growth was also found. Some interactions between practices and industries were also significant, with mixed results, indicating that the effect of practices on performance might be context dependent.

  17. Investigating properties of the cardiovascular system using innovative analysis algorithms based on ensemble empirical mode decomposition.

    Science.gov (United States)

    Yeh, Jia-Rong; Lin, Tzu-Yu; Chen, Yun; Sun, Wei-Zen; Abbod, Maysam F; Shieh, Jiann-Shing

    2012-01-01

    The cardiovascular system is known to be nonlinear and nonstationary. Traditional linear algorithms for assessing the arterial stiffness and systemic resistance of the cardiac system suffer from this nonstationarity or are inconvenient in practical applications. In this pilot study, two new assessment methods were developed: the first is an ensemble empirical mode decomposition based reflection index (EEMD-RI), while the second is based on the phase shift between ECG and BP within the cardiac oscillation. Both methods utilise the EEMD algorithm, which is suitable for nonlinear and nonstationary systems. These methods were used to investigate the arterial stiffness and systemic resistance of a pig's cardiovascular system via ECG and blood pressure (BP). The experiment simulated a sequence of continuous blood pressure changes, from a steady condition to high blood pressure by clamping the artery, and the inverse by relaxing the artery. The hypothesis was that arterial stiffness and systemic resistance should vary with the blood pressure as the artery is clamped and relaxed. The results show statistically significant correlations between BP, the EEMD-based RI, and the phase shift between ECG and BP within the cardiac oscillation. The two assessments demonstrate the merits of the EEMD for signal analysis.
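
    One simple way to estimate a phase shift between ECG and BP at the cardiac frequency is via the cross-spectrum; a sketch on synthetic sinusoids (the sampling rate, cardiac frequency, and lag are assumed, and this is a stand-in rather than the paper's EEMD-based pipeline):

```python
import numpy as np

fs = 250.0                      # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)    # 10 s of data
f_cardiac = 1.2                 # ~72 beats/min, assumed
ecg = np.sin(2 * np.pi * f_cardiac * t)
bp = np.sin(2 * np.pi * f_cardiac * t - 0.8)   # BP lags ECG by 0.8 rad

# Phase shift at the cardiac frequency from the cross-spectrum:
spec = np.fft.rfft(bp) * np.conj(np.fft.rfft(ecg))
k = np.argmax(np.abs(np.fft.rfft(ecg)))        # dominant cardiac bin
phase_shift = np.angle(spec[k])
print(round(phase_shift, 2))  # -0.8, the imposed lag
```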

  18. Review of US ESCO industry market trends: an empirical analysis of project data

    International Nuclear Information System (INIS)

    Goldman, Charles A.; Hopper, Nicole C.; Osborn, Julie G.

    2005-01-01

    This comprehensive empirical analysis of US energy service company (ESCO) industry trends and performance employs two parallel analytical approaches: a survey of firms to estimate total industry size, and a database of ∼1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m2/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time (SPT) is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling 'energy solutions', with energy efficiency part of a package. We conclude that appropriate policy support - both financial and non-financial - can 'jump-start' a viable private-sector energy-efficiency services industry that targets large institutional and commercial/industrial customers.

  19. Review of US ESCO industry market trends: an empirical analysis of project data

    International Nuclear Information System (INIS)

    Goldman, C.A.; Hopper, N.C.; Osborn, J.G.

    2005-01-01

    This comprehensive empirical analysis of US energy service company (ESCO) industry trends and performance employs two parallel analytical approaches: a survey of firms to estimate total industry size, and a database of ∼1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m2/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time (SPT) is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling 'energy solutions', with energy efficiency part of a package. We conclude that appropriate policy support - both financial and non-financial - can 'jump-start' a viable private-sector energy-efficiency services industry that targets large institutional and commercial/industrial customers. (author)

  20. An empirical analysis of strategy implementation process and performance of construction companies

    Science.gov (United States)

    Zaidi, F. I.; Zawawi, E. M. A.; Nordin, R. M.; Ahnuar, E. M.

    2018-02-01

    Strategy implementation is known as the action stage, considered the most difficult stage in strategic planning. Strategy implementation can influence the whole texture of a company, including its performance. The aim of this research is to provide empirical evidence of the relationship between the strategy implementation process and the performance of construction companies. This research establishes the strategy implementation process and how it influences the performance of construction companies. A quantitative method was used, via a questionnaire survey. Respondents were G7 construction companies in Klang Valley, Selangor. Pearson correlation analysis indicates a strong positive relationship between the strategy implementation process and construction companies' performance. The most important part of the strategy implementation process is providing sufficient training for employees, which directly influences the companies' profit growth and employee growth. These results will benefit top management in construction companies conducting strategy implementation. This research may not reflect the whole construction industry in Malaysia; future research may extend to small- and medium-grade contractors and to other areas in Malaysia.
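
    The Pearson correlation used in this kind of survey analysis is simple to compute directly; a sketch with hypothetical Likert-style scores (not the study's data):

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores: implementation-process quality vs. performance.
process = [1, 2, 3, 4, 5]
performance = [2, 3, 4, 4, 6]
print(round(pearson_r(process, performance), 2))  # strong positive correlation
```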

  1. Empirical analysis of online social networks in the age of Web 2.0

    Science.gov (United States)

    Fu, Feng; Liu, Lianghuan; Wang, Long

    2008-01-01

    Today the World Wide Web is undergoing a subtle but profound shift to Web 2.0, becoming more of a social web. The use of collaborative technologies such as blogs and social networking sites (SNSs) leads to instant online communities in which people communicate rapidly and conveniently with each other. Moreover, there is growing interest in and concern regarding the topological structure of these new online social networks. In this paper, we present an empirical analysis of the statistical properties of two important Chinese online social networks: a blogging network and an SNS open to college students, both emerging in the age of Web 2.0. We demonstrate that both networks possess the small-world and scale-free features already observed in real-world and artificial networks. In addition, we investigate the distribution of topological distance. Furthermore, we study the correlations between in- and out-degree, between clustering coefficient and degree, and, for the blogging network, between popularity (in terms of number of page views) and in-degree. We find that the blogging network shows a disassortative mixing pattern, whereas the SNS network is assortative. Our research may help to elucidate the self-organizing structural characteristics of these online social networks embedded in technical forms.
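
    Assortative versus disassortative mixing is commonly measured as the Pearson correlation of the degrees at either end of each edge; a minimal sketch on a toy hub-and-spoke graph, which is maximally disassortative (the graph is invented for illustration):

```python
from collections import Counter

def degree_assortativity(edges):
    """Pearson correlation of the degrees at either end of each
    undirected edge (each edge is counted in both directions)."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    pairs = [(deg[u], deg[v]) for u, v in edges] + \
            [(deg[v], deg[u]) for u, v in edges]
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    cov = sum((a - mx) * (b - my) for a, b in pairs)
    var_x = sum((a - mx) ** 2 for a, _ in pairs)
    var_y = sum((b - my) ** 2 for _, b in pairs)
    return cov / (var_x * var_y) ** 0.5

star = [(0, i) for i in range(1, 6)]   # hub-and-spoke: high-degree hub
print(degree_assortativity(star))      # -1.0, perfectly disassortative
```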

  2. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    Science.gov (United States)

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, the use of an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean-envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original algorithmic version of EMD, was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed. Despite some effort, these 2-D versions perform poorly and are very time consuming. So in this paper, an extension of the PDE-based approach to the 2-D space is extensively described. The approach has been applied to both signal and image decomposition, and the obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data, including images. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
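
    The core idea, replacing the algorithmic envelope-mean step of sifting with a diffusion-based filter, can be caricatured in 1-D with plain linear heat-equation smoothing (the paper's operator is a nonlinear PDE; this is only a stand-in on synthetic data):

```python
import numpy as np

def diffusion_smooth(signal, steps=200, dt=0.2):
    """Explicit heat-equation smoothing; a linear stand-in for the
    nonlinear-diffusion mean-envelope estimator described in the paper."""
    u = signal.astype(float).copy()
    for _ in range(steps):
        lap = np.roll(u, 1) + np.roll(u, -1) - 2.0 * u
        lap[0] = lap[-1] = 0.0          # hold the endpoints fixed
        u += dt * lap                   # one explicit diffusion step
    return u

t = np.linspace(0, 1, 512)
slow = np.sin(2 * np.pi * 2 * t)        # local mean / trend component
fast = 0.3 * np.sin(2 * np.pi * 40 * t) # oscillatory mode to extract
x = slow + fast
mean_est = diffusion_smooth(x)          # diffusion kills the fast mode first
imf_est = x - mean_est                  # one sifting-style subtraction
```

Diffusion damps high wavenumbers far faster than low ones, so after a moderate number of steps the smoothed signal approximates the local mean and the residual approximates the oscillatory mode.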

  3. Security Vulnerability Profiles of Mission Critical Software: Empirical Analysis of Security Related Bug Reports

    Science.gov (United States)

    Goseva-Popstojanova, Katerina; Tyo, Jacob

    2017-01-01

    While some prior research exists on the characteristics of software faults (i.e., bugs) and failures, very little work has been published on the analysis of software application vulnerabilities. This paper aims to help fill that gap by presenting an empirical investigation of application vulnerabilities. The results are based on data extracted from the issue tracking systems of two NASA missions. These data were organized in three datasets: Ground mission IVV issues, Flight mission IVV issues, and Flight mission Developers issues. In each dataset, we identified security related software bugs and classified them into specific vulnerability classes. Then, we created the security vulnerability profiles, i.e., determined where and when the security vulnerabilities were introduced and what were the dominating vulnerability classes. Our main findings include: (1) In the IVV issues datasets the majority of vulnerabilities were code related and were introduced in the Implementation phase. (2) For all datasets, around 90% of the vulnerabilities were located in two to four subsystems. (3) Out of 21 primary classes, five dominated: Exception Management, Memory Access, Other, Risky Values, and Unused Entities. Together, they contributed 80% to 90% of the vulnerabilities in each dataset.

  4. Review of U.S. ESCO industry market trends: An empirical analysis of project data

    Energy Technology Data Exchange (ETDEWEB)

    Goldman, Charles A.; Hopper, Nicole C.; Osborn, Julie G.; Singer, Terry E.

    2003-03-01

    This article summarizes a comprehensive empirical analysis of U.S. Energy Service Company (ESCO) industry trends and performance. We employ two parallel analytical approaches: a comprehensive survey of firms to estimate total industry size and a database of ~1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m2/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling 'energy solutions', with energy efficiency part of a package. We conclude that a private sector energy-efficiency services industry that targets large commercial and industrial customers is viable and self-sustaining with appropriate policy support, both financial and non-financial.

  5. Stakeholders of Voluntary Forest Carbon Offset Projects in China: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Derong Lin

    2015-01-01

    Full Text Available Climate change is one of the defining challenges facing the planet. Voluntary forest carbon offset projects, which have the potential to boost forest carbon storage and mitigate global warming, have aroused global concern. The objective of this paper is to model the game situation and analyze the game behaviors of stakeholders in voluntary forest carbon offset projects in China. A stakeholder model and a Power-Benefit Matrix are constructed to analyze the roles, behaviors, and conflicts of stakeholders, including farmers, planting entities, communities, government, and the China Green Carbon Foundation. The empirical analysis shows that although the stakeholders have diverse interests and different goals, a win-win solution is still possible through their joint participation and compromise in a voluntary forest carbon offset project. A broad governance structure has been constructed that emphasizes benefit balance, equality, and information exchange and is regulated by all stakeholders. It facilitates agreement among stakeholders with conflicting or differing interests. The joint participation of stakeholders in voluntary forest carbon offset projects might transform government-dominated afforestation/reforestation into a market in which all participants, including government, are encouraged to cooperate with each other to improve the condition of fund shortage and low efficiency.

  6. Identifying the oil price-macroeconomy relationship. An empirical mode decomposition analysis of US data

    International Nuclear Information System (INIS)

    Oladosu, Gbadebo

    2009-01-01

    This paper employs the empirical mode decomposition (EMD) method to filter cyclical components of US quarterly gross domestic product (GDP) and quarterly average oil price (West Texas Intermediate - WTI). The method is adaptive and applicable to non-linear and non-stationary data. A correlation analysis of the resulting components is performed and examined for insights into the relationship between oil and the economy. Several components of this relationship are identified. However, the principal one is that the medium-run component of the oil price has a negative relationship with the main cyclical component of the GDP. In addition, weak correlations suggesting a lagging, demand-driven component and a long-run component of the relationship were also identified. Comparisons of these findings with significant oil supply disruption and recession dates were supportive. The study identifies a number of lessons applicable to recent oil market events, including the eventuality of persistent oil price and economic decline following a long oil price run-up. In addition, it was found that oil market related exogenous events are associated with short- to medium-run price implications regardless of whether they lead to actual supply losses. (author)
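
    The spirit of the analysis, correlating comparable-timescale components of GDP and the oil price, can be sketched with a crude moving-average decomposition standing in for EMD (all series below are synthetic, constructed to be counter-cyclical by design):

```python
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

rng = np.random.default_rng(1)
n = 200                                        # quarters of synthetic data
cycle = np.sin(2 * np.pi * np.arange(n) / 24)  # ~6-year business cycle
gdp = 2.0 + 0.01 * np.arange(n) + 0.5 * cycle + rng.normal(0, 0.05, n)
oil = 20.0 - 8.0 * cycle + rng.normal(0, 0.5, n)  # counter-cyclical by design

# Crude stand-in for EMD components: series minus a 24-quarter moving
# average, with the edge-affected ends trimmed before correlating.
gdp_cyc = (gdp - moving_average(gdp, 24))[24:-24]
oil_cyc = (oil - moving_average(oil, 24))[24:-24]
r = np.corrcoef(gdp_cyc, oil_cyc)[0, 1]
print(r < 0)  # True: negative medium-run relationship, as the paper finds
```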

  7. Anterior temporal face patches: A meta-analysis and empirical study

    Directory of Open Access Journals (Sweden)

    Rebecca J. Von Der Heide

    2013-02-01

    Full Text Available Studies of nonhuman primates have reported face-sensitive patches in the ventral anterior temporal lobes (ATL). In humans, ATL resection or damage causes an associative prosopagnosia in which face perception is intact but face memory is compromised. Some fMRI studies have extended these findings using famous and familiar faces. However, it is unclear whether these regions in the human ATL are in locations comparable to those reported in non-human primates, typically using unfamiliar faces. We present the results of two studies of person memory: a meta-analysis of existing fMRI studies and an empirical fMRI study using optimized imaging parameters. Both studies showed left-lateralized ATL activations to familiar individuals, while novel faces activated the right ATL. Activations to famous faces were quite ventral, similar to what has been reported in monkeys. These findings suggest that face memory-sensitive patches in the human ATL are in the ventral/polar ATL.

  8. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Forester, John A.; Bye, Andreas; Dang, Vinh N.; Lois, Erasmia

    2010-01-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to 'translate' the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  9. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; John A. Forester; Andreas Bye; Vinh N. Dang; Erasmia Lois

    2010-06-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to “translate” the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  10. Analysis of MABEL Bathymetry in Keweenaw Bay and Implications for ICESat-2 ATLAS

    Directory of Open Access Journals (Sweden)

    Nicholas A. Forfinski-Sarkozi

    2016-09-01

    Full Text Available In 2018, the National Aeronautics and Space Administration (NASA) is scheduled to launch the Ice, Cloud, and land Elevation Satellite-2 (ICESat-2), with a new six-beam, green-wavelength, photon-counting lidar system, the Advanced Topographic Laser Altimeter System (ATLAS). The primary objectives of the ICESat-2 mission are to measure ice-sheet elevations, sea-ice thickness, and global biomass. However, if bathymetry can be reliably retrieved from ATLAS data, this could assist in addressing a key data need in many coastal and inland water body areas, including areas that are poorly mapped and/or difficult to access. Additionally, ATLAS-derived bathymetry could be used to constrain bathymetry derived from complementary data, such as passive multispectral imagery and synthetic aperture radar (SAR). As an important first step in evaluating the ability to map bathymetry from ATLAS, this study involves a detailed assessment of bathymetry from the Multiple Altimeter Beam Experimental Lidar (MABEL), NASA's airborne ICESat-2 simulator, flown on the Earth Resources 2 (ER-2) high-altitude aircraft. An interactive web interface, MABEL Viewer, was developed and used to identify bottom returns in Keweenaw Bay, Lake Superior. After applying corrections for refraction and channel-specific elevation biases, MABEL bathymetry was compared against National Oceanic and Atmospheric Administration (NOAA) data acquired two years earlier. The results indicate that MABEL reliably detected bathymetry in depths of up to 8 m, with a root mean square (RMS) difference of 0.7 m with respect to the reference data. Additionally, a version of the lidar equation was developed for predicting bottom-return signal levels in MABEL and tested using the Keweenaw Bay data. Future work will entail extending these results to ATLAS as the technical specifications of the sensor become available.
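
    The refraction correction mentioned here rescales the subsurface portion of the range, because raw lidar ranging assumes the in-air speed of light; a minimal near-nadir sketch that ignores off-nadir Snell ray bending (the refractive index of 1.33 and the example depth are assumptions for illustration):

```python
def refraction_corrected_depth(apparent_depth, n_water=1.33):
    """Light travels slower in water, so the raw (in-air) range
    overestimates depth; dividing the subsurface segment by the
    refractive index corrects it (near-nadir approximation)."""
    return apparent_depth / n_water

# An apparent depth of 10.64 m corresponds to an actual depth of 8 m,
# the deepest reliable retrieval reported above (depth value invented).
print(round(refraction_corrected_depth(10.64), 2))  # 8.0
```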

  11. Impact of period and timescale of FDDA analysis nudging on the numerical simulation of tropical cyclones in the Bay of Bengal

    KAUST Repository

    Viswanadhapalli, Yesubabu; Srinivas, C. V.; Ramakrishna, S. S V S; Hari Prasad, K. B R R

    2014-01-01

    In this study, the impact of four-dimensional data assimilation (FDDA) analysis nudging is examined on the prediction of tropical cyclones (TC) in the Bay of Bengal to determine the optimum period and timescale of nudging. Six TCs (SIDR: November 13

  12. An Empirical Analysis of Stakeholders' Influence on Policy Development: the Role of Uncertainty Handling

    Directory of Open Access Journals (Sweden)

    Rianne M. Bijlsma

    2011-03-01

    Full Text Available Stakeholder participation is advocated widely, but there is little structured, empirical research into its influence on policy development. We aim to further insight into the characteristics of participatory policy development by comparing it to expert-based policy development for the same case. We describe the process of problem framing and analysis, as well as the knowledge base used. We apply an uncertainty perspective to reveal differences between the approaches and speculate about possible explanations. We view policy development as a continuous handling of substantive uncertainty and process uncertainty, and investigate how actors' methods of handling uncertainty influence policy development. Our findings suggest that the wider frame adopted in the participatory approach was the result of a more active handling of process uncertainty. The stakeholders handled institutional uncertainty by broadening the problem frame, and they handled strategic uncertainty by negotiating commitment and by including all important stakeholder criteria in the frame. In the expert-based approach, we observed a more passive handling of uncertainty, apparently to avoid complexity. The experts handled institutional uncertainty by reducing the scope and by anticipating windows of opportunity in other policy arenas. Strategic uncertainty was handled by assuming stakeholders' acceptance of noncontroversial measures that balanced benefits and sacrifices. Three other observations are of interest to the scientific debate on participatory policy processes. Firstly, the participatory policy was less adaptive than the expert-based policy. The participants' observed low tolerance for process uncertainty made them opt for a rigorous "once and for all" settling of the conflict. Secondly, in the participatory approach, actors preferred procedures of traceable knowledge acquisition over controversial topics to handle substantive uncertainty. This

  13. EMPIRICAL ANALYSIS OF CRISIS MANAGEMENT PRACTICES IN TOURISM ENTERPRISES IN TERMS OF ORGANIZATIONAL LEARNING

    Directory of Open Access Journals (Sweden)

    Gülsel ÇİFTÇİ

    2017-04-01

    Full Text Available This study empirically analyzes the crisis management practices of tourism enterprises in terms of their effects on organizational learning. It aims to determine how hotels in Turkey responded to previous crises, what kinds of precautions were taken, and whether these crises taught anything regarding the operation and performance of the enterprise. It also aims to contribute to the related literature and to offer suggestions that will guide businesses and future studies. Within this context, based on the 2016 (October) data of the Ministry of Culture and Tourism of Turkey, the study covers Antalya, which has the most certified 5-star hotels and the highest bed capacity and number of rooms in the resort category, and Istanbul, which has the most certified 5-star hotels and the highest bed capacity and number of rooms in the urban hotel category. It was decided to conduct this study on hotels in view of the effects of the tourism industry on the world economy. A comprehensive literature review was conducted in the first and second parts of this three-part study: the concept of crisis management in enterprises was examined and applications in tourism enterprises were discussed. The last part of the study contains information on testing and analyzing the hypotheses. The data obtained from the questionnaires were analyzed in SPSS (Statistical Package for the Social Sciences) and LISREL (LInear Structural RELationships). A Pearson correlation analysis was conducted to examine the relationship between

  14. Site suitability analysis for Bay scallop aquaculture and implications for sustainable fisheries management in the Ha Long Bay archipelago, northern Vietnam

    Directory of Open Access Journals (Sweden)

    Pham Thi Khanh

    2013-01-01

    Full Text Available Mollusc culture, if properly managed, may help decrease the over-exploitation of capture fisheries in Vietnam and possibly become an alternative income for local fishermen. The definition and characterization of zones suitable for aquaculture is pivotal for its success and sustainable development, and this study aims at determining the suitability of Argopecten irradians (Bay scallop) culture in the Ha Long Bay Archipelago. Temperature, salinity, chlorophyll-a, total suspended solids and bathymetry were compiled in an environmental suitability model. Distances of culture sites from landing points and fish markets were instead grouped in an infrastructural suitability model. In both models, developed with Geographic Information Systems, the suitability scores were ranked on a scale from 1 (unsuitable) to 6 (very highly suitable). Results showed that 98% of the studied area is environmentally suitable for such culture. However, overlaying the infrastructural factors, the suitable zone decreases to 38%. The advantages and disadvantages of two management options were then discussed: (a) strengthening fisheries infrastructure or (b) developing post-harvest processing plants.
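The two-model GIS ranking described above can be sketched as a raster overlay. The grids below and the minimum-rank ("most limiting factor") combination rule are illustrative assumptions, not the study's actual layers or scoring scheme.

```python
import numpy as np

# Hedged sketch of a suitability overlay: each cell holds a rank from
# 1 (unsuitable) to 6 (very highly suitable); the combined map takes
# the minimum rank across the environmental and infrastructural layers,
# so a cell is only as suitable as its most limiting factor.
# The 3x3 grids are invented for illustration.

environmental = np.array([[6, 5, 6],
                          [4, 6, 2],
                          [6, 6, 5]])
infrastructural = np.array([[2, 5, 6],
                            [6, 3, 6],
                            [1, 4, 5]])

combined = np.minimum(environmental, infrastructural)

# Share of cells that remain at least moderately suitable (rank >= 4)
# after the overlay -- this mimics how a 98% environmentally suitable
# area can shrink once infrastructure is considered.
share = (combined >= 4).mean()
print(combined.tolist())
print(round(float(share), 2))
```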

  15. Antibiotics in the coastal environment of the Hailing Bay region, South China Sea: Spatial distribution, source analysis and ecological risks.

    Science.gov (United States)

    Chen, Hui; Liu, Shan; Xu, Xiang-Rong; Zhou, Guang-Jie; Liu, Shuang-Shuang; Yue, Wei-Zhong; Sun, Kai-Feng; Ying, Guang-Guo

    2015-06-15

    In this study, the occurrence and spatial distribution of 38 antibiotics in surface water and sediment samples of the Hailing Bay region, South China Sea, were investigated. Twenty-one, 16 and 15 of the 38 antibiotics were detected. The concentrations of antibiotics in the water phase correlated positively with chemical oxygen demand and nitrate. The source analysis indicated that untreated domestic sewage was the primary source of antibiotics in the study region. Fluoroquinolones showed strong sorption onto sediments due to their high pseudo-partitioning coefficients. Risk assessment indicated that oxytetracycline, norfloxacin and erythromycin-H2O posed high risks to aquatic organisms.

  16. Marine ecology conditions at Weda Bay, North Maluku based on statistical analysis on distribution of recent foraminifera

    Directory of Open Access Journals (Sweden)

    Kurniasih Anis

    2017-01-01

    Full Text Available In geology, foraminifera are usually used to determine the age of rocks/sediments and the depositional environment. In this study, recent foraminifera were used not only to determine the sedimentary environment, but also to estimate the ecological condition of the water through a statistical approach. The analysis was performed quantitatively on 10 surface seabed sediment samples from Weda Bay, North Maluku. It includes dominance (Simpson index), diversity and evenness (Shannon index), and the planktonic-benthic ratio. The results were plotted in an M-R-T (Miliolid-Rotalid-Textularid) diagram to determine the depositional environment. The quantitative analysis was performed using the PAST (PAleontological STatistics) software, version 1.29. The analysis showed no domination by a particular taxon, a moderate degree of evenness, stable communities, and considerably moderate diversity. These results indicate that the research area has stable water conditions with optimum levels of carbonate content, oxygen supply, salinity, and temperature. The ratio of planktonic to benthic foraminifera indicates relative depth: the deeper the water, the higher the percentage of planktonic foraminifera. The M-R-T diagram showed that the sediment was deposited in an exposed carbonate (carbonate platform) environment with normal salinity.
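The indices named above (Simpson dominance, Shannon diversity and evenness, planktonic-benthic ratio) have standard formulas. A minimal sketch with invented taxon counts, not the Weda Bay data:

```python
import math

# Hedged sketch of the standard diversity indices used in such studies,
# computed from an invented list of benthic foraminifera counts plus a
# planktonic count (the real data are the 10 Weda Bay samples).

benthic_counts = [40, 25, 15, 10, 10]   # individuals per benthic taxon
planktonic = 55                          # planktonic individuals

n = sum(benthic_counts)
p = [c / n for c in benthic_counts]      # relative abundances

simpson_dominance = sum(x * x for x in p)             # Simpson index D
shannon_h = -sum(x * math.log(x) for x in p)          # Shannon diversity H'
evenness = shannon_h / math.log(len(benthic_counts))  # Pielou's J = H'/ln(S)
pb_ratio = planktonic / (planktonic + n)              # planktonic fraction

print(round(simpson_dominance, 3), round(shannon_h, 3),
      round(evenness, 3), round(pb_ratio, 3))
```

Low dominance with high evenness, as here, is the pattern the abstract interprets as a stable community; a higher planktonic fraction indicates deeper water.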

  17. Modeling fates and impacts for bio-economic analysis of hypothetical oil spill scenarios in San Francisco Bay

    International Nuclear Information System (INIS)

    French McCay, D.; Whittier, N.; Sankaranarayanan, S.; Jennings, J.; Etkin, D.S.

    2002-01-01

    The oil spill risks associated with four submerged rock pinnacles near Alcatraz Island in San Francisco Bay are being evaluated by the United States Army Corps of Engineers. Oil spill modeling has been conducted for hypothetical oil spills to determine biological impacts, damages to natural resources, and response costs. The scenarios are hypothetical vessel groundings on the pinnacles. The SIMAP modeling software from Applied Science Associates was used to model 3 spill sizes (20th, 50th and 95th percentile by volume) and 4 types of oil (gasoline, diesel, heavy fuel oil, and crude oil). The frequency distribution of oil fates and impacts was determined by first running each scenario in stochastic mode. The oil fates and biological effects of the spills were the focus of this paper. It was shown that diesel and crude oil spills would have greater impacts in the water column than heavy fuel oil or gasoline, because gasoline is more volatile and less toxic and because heavy oil spills would be small in volume. It was determined that the major impacts and damage to birds would be low due to the high dilution potential of the bay. It was also noted that dispersants would be very effective in reducing impacts on wildlife and the shoreline. These results are being used to evaluate the cost-benefit of removing the rocks versus the risk of an oil spill. The work demonstrates a statistically quantifiable method to estimate potential impacts that could be used in ecological risk assessment and cost-benefit analysis. 15 refs., 13 tabs., 11 figs

  18. Cycling empirical antibiotic therapy in hospitals: meta-analysis and models.

    Directory of Open Access Journals (Sweden)

    Pia Abel zur Wiesch

    2014-06-01

    Full Text Available The rise of resistance, together with the shortage of new broad-spectrum antibiotics, underlines the urgency of optimizing the use of available drugs to minimize disease burden. Theoretical studies suggest that coordinating the empirical usage of antibiotics in a hospital ward can contain the spread of resistance. However, theoretical and clinical studies have come to different conclusions regarding the usefulness of rotating first-line therapy (cycling). Here, we performed a quantitative pathogen-specific meta-analysis of clinical studies comparing cycling to standard practice. We searched PubMed and Google Scholar and identified 46 clinical studies addressing the effect of cycling on nosocomial infections, of which 11 met our selection criteria. We employed a method for multivariate meta-analysis using incidence rates as endpoints and found that cycling reduced the incidence rate per 1000 patient days of both total infections, by 4.95 [9.43-0.48], and resistant infections, by 7.2 [14.00-0.44]. This positive effect was observed in most pathogens, despite a large variance between individual species. Our findings remain robust in uni- and multivariate meta-regressions. We used theoretical models that reflect various infections and hospital settings to compare cycling to random assignment to different drugs (mixing). We make the realistic assumption that therapy is changed when first-line treatment is ineffective, which we call "adjustable cycling/mixing". In concordance with earlier theoretical studies, we find that in strict regimens, cycling is detrimental. However, in adjustable regimens, single resistance is suppressed and cycling is successful in most settings. Both a meta-regression and our theoretical model indicate that "adjustable cycling" is especially useful for suppressing the emergence of multiple resistance. While our model predicts that cycling periods of one month perform well, we expect that overly long cycling periods are detrimental. Our results suggest that
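At its core, the pooling step of such a meta-analysis is inverse-variance weighting of study-level effects. A one-endpoint sketch with invented rate differences and variances (the paper's actual method is multivariate):

```python
import math

# Hedged sketch of inverse-variance pooling: study-level incidence-rate
# differences (per 1000 patient days) are combined with weights 1/v_i,
# giving more influence to more precise studies. The effect sizes and
# variances below are invented, not taken from the 11 included studies.

effects = [-6.1, -2.4, -8.0, -4.3]   # rate difference per study
variances = [4.0, 2.3, 6.5, 3.1]     # their sampling variances

weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval for the pooled rate difference.
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(round(pooled, 2), [round(x, 2) for x in ci])
```

A random-effects variant would add a between-study variance term to each `v_i`; the fixed-effect version shown here is the simplest case.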

  19. Risk and protective factors of internet addiction: a meta-analysis of empirical studies in Korea.

    Science.gov (United States)

    Koo, Hoon Jung; Kwon, Jung-Hye

    2014-11-01

    A meta-analysis of empirical studies performed in Korea was conducted to systematically investigate the associations between the indices of Internet addiction (IA) and psychosocial variables. Systematic literature searches were carried out using the Korean Studies Information Service System, Research Information Sharing Service, Science Direct, Google Scholar, and references in review articles. The key words were Internet addiction, (Internet) game addiction, and pathological, problematic, and excessive Internet use. Only original research papers using Korean samples published from 1999 to 2012 and officially reviewed by peers were included for analysis. Ninety-five studies meeting the inclusion criteria were identified. The magnitude of the overall effect size of the intrapersonal variables associated with internet addiction was significantly higher than that of interpersonal variables. Specifically, IA demonstrated a medium to strong association with "escape from self" and "self-identity" as self-related variables. "Attention problem", "self-control", and "emotional regulation" as control and regulation-related variables; "addiction and absorption traits" as temperament variables; "anger" and "aggression" as emotion and mood variables; and "negative stress coping" as coping variables were also associated with comparably large effect sizes. Contrary to our expectation, the magnitude of the correlations between relational ability and quality, parental relationships and family functionality, and IA were found to be small. The strength of the association between IA and the risk and protective factors was found to be higher in younger age groups. The findings highlight a need for closer examination of psychosocial factors, especially intrapersonal variables, when assessing high-risk individuals and designing intervention strategies for both general IA and Internet game addiction.

  20. Risk and Protective Factors of Internet Addiction: A Meta-Analysis of Empirical Studies in Korea

    Science.gov (United States)

    Koo, Hoon Jung

    2014-01-01

    Purpose A meta-analysis of empirical studies performed in Korea was conducted to systematically investigate the associations between the indices of Internet addiction (IA) and psychosocial variables. Materials and Methods Systematic literature searches were carried out using the Korean Studies Information Service System, Research Information Sharing Service, Science Direct, Google Scholar, and references in review articles. The key words were Internet addiction, (Internet) game addiction, and pathological, problematic, and excessive Internet use. Only original research papers using Korean samples published from 1999 to 2012 and officially reviewed by peers were included for analysis. Ninety-five studies meeting the inclusion criteria were identified. Results The magnitude of the overall effect size of the intrapersonal variables associated with internet addiction was significantly higher than that of interpersonal variables. Specifically, IA demonstrated a medium to strong association with "escape from self" and "self-identity" as self-related variables. "Attention problem", "self-control", and "emotional regulation" as control and regulation-related variables; "addiction and absorption traits" as temperament variables; "anger" and "aggression" as emotion and mood variables; and "negative stress coping" as coping variables were also associated with comparably large effect sizes. Contrary to our expectation, the magnitude of the correlations between relational ability and quality, parental relationships and family functionality, and IA were found to be small. The strength of the association between IA and the risk and protective factors was found to be higher in younger age groups. Conclusion The findings highlight a need for closer examination of psychosocial factors, especially intrapersonal variables, when assessing high-risk individuals and designing intervention strategies for both general IA and Internet game addiction. PMID:25323910

  1. Self-image and Missions of Universities: An Empirical Analysis of Japanese University Executives

    Directory of Open Access Journals (Sweden)

    Masataka Murasawa

    2014-05-01

    Full Text Available As universities in Japan gain institutional autonomy in managing internal organizations, independent of governmental control as a result of deregulation and decentralizing reforms, it is becoming increasingly important that the executives and administrators of each institution demonstrate a clear and strategic vision to external stakeholders in order to maintain financially robust operations and the attractiveness of their institutions. This paper considers whether and how the self-image, mission, and vision of universities are perceived and internalized by the management of Japanese universities and empirically examines the determinants shaping such individual perceptions. The result of our descriptive analysis indicates that the recent government policy to internationalize domestic universities has not shown much progress in the view of university executives in Japan. An increasing emphasis on the role of serving local needs in research and teaching is instead pursued by these universities. Individual perceptions among Japanese university executives with regard to the missions and functional roles to be played by their institutions are influenced by managerial rank as well as the field of their academic training. A multiple regression analysis reveals that the economy of scale brought about by expanded undergraduate student enrollment gradually slows and dampens executive perceptions with regard to establishing a globally recognized status in research and teaching. Moreover, Japanese universities with a small proportion of graduate student enrollment likely opted out of competition for greater recognition in the global community of higher education between 2005 and 2012. Finally, the management of universities granted the same amount of external research funds in both studied years responded more passively in 2012 than they did in 2005 on the self-assessment of whether they had established a status as a global

  2. Distribution and behavior of major and trace elements in Tokyo Bay, Mutsu Bay and Funka Bay marine sediments

    International Nuclear Information System (INIS)

    Honda, Teruyuki; Kimura, Ken-ichiro

    2003-01-01

    Fourteen major and trace elements in marine sediment core samples collected from the coasts along eastern Japan, i.e. Tokyo Bay (II) (the recess), Tokyo Bay (IV) (the mouth), Mutsu Bay and Funka Bay, and from the Northwest Pacific basin as a comparative subject, were determined by instrumental neutron activation analysis (INAA). The sedimentation rates and sedimentary ages were calculated for the coastal sediment cores by the 210Pb method. The results obtained in this study are summarized as follows: (1) Lanthanoid abundance patterns suggested that the major origin of the sediments was terrigenous material. La*/Lu* and Ce*/La* ratios revealed that the sediments from Tokyo Bay (II) and Mutsu Bay more directly reflected the contribution from rivers than those of other regions. In addition, the Th/Sc ratio indicated that the coastal sediments mainly originated in materials from the volcanic island-arcs, the Japanese islands, whereas those from the Northwest Pacific came mainly from the continent. (2) The correlation between the Ce/U and Th/U ratios, with high correlation coefficients of 0.920 to 0.991, indicated that all the sediments from Tokyo Bay (II) and Funka Bay were in reducing conditions, while at least the upper sediments from Tokyo Bay (IV) and Mutsu Bay were in oxidizing conditions. (3) It became quite obvious that the sedimentation mechanism and the sedimentation environment at Tokyo Bay (II) were different from those at Tokyo Bay (IV), since the sedimentation rate at Tokyo Bay (II) was approximately twice as large as that at Tokyo Bay (IV). The sedimentary age of the 5th layer (8∼10 cm in depth) from Funka Bay was calculated at approximately 1940∼50, which agrees with the period 1943∼45, when Showa-shinzan was formed by the eruption of the Usu volcano. (author)

  3. What is the carrying capacity for fish in the ocean? A meta analysis of population dynamics of North Atlantic cod

    DEFF Research Database (Denmark)

    Myers, R.A.; MacKenzie, Brian; Bowen, K.G.

    2001-01-01

    used empirical Bayes techniques to estimate the maximum reproductive rate and carrying capacity of each stock. In all cases, the empirical Bayes estimates were biologically reasonable, whereas a stock by stock analysis occasionally yielded nonsensical parameter estimates (e.g., infinite values). Our...... analysis showed that the carrying capacity per unit area varied by more than 20-fold among populations and that much of this variation was related to temperature. That is, the carrying capacity per square kilometre declines as temperature increases....
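The shrinkage behaviour described above, where empirical Bayes keeps stock-level estimates biologically reasonable, can be sketched with a generic normal-normal shrinkage estimator. The numbers and the known-sampling-variance assumption below are illustrative, not the authors' actual model:

```python
import numpy as np

# Hedged sketch of the empirical Bayes idea: noisy stock-by-stock
# estimates are shrunk toward a common prior mean estimated from the
# ensemble, so an outlying (possibly nonsensical) estimate is pulled
# back toward the group while well-behaved estimates barely move.

stock_estimates = np.array([1.2, 15.0, 2.3, 1.8, 2.1])  # per-stock MLEs (invented)
sampling_var = 1.0            # assumed known sampling variance per stock

prior_mean = stock_estimates.mean()
# Method-of-moments estimate of the between-stock (prior) variance:
prior_var = max(stock_estimates.var(ddof=1) - sampling_var, 0.0)

shrinkage = prior_var / (prior_var + sampling_var)   # weight on the data
eb_estimates = prior_mean + shrinkage * (stock_estimates - prior_mean)

# The outlier (15.0) moves toward the ensemble mean; the rest move little.
print([round(float(x), 2) for x in eb_estimates])
```

When the sampling variance is large relative to the between-stock variance, the shrinkage weight falls and estimates are pulled more strongly toward the prior mean, which is exactly what prevents infinite or nonsensical single-stock values.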

  4. Voluntary Participation in Community Economic Development in Canada: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Laura Lamb

    2011-01-01

    Full Text Available This article is an empirical analysis of an individual's decision to participate in community economic development (CED) initiatives in Canada. The objective of the analysis is to better understand how individuals make decisions to volunteer time toward CED initiatives and to determine whether the determinants of participation in CED are unique when compared to those of participation in volunteer activities in general. The dataset employed is Statistics Canada's 2004 Canada Survey of Giving, Volunteering and Participating (CSGVP). To date, there has been no prior econometric analysis of the decision to participate in community economic development initiatives in Canada. Results suggest a role for both public policymakers and practitioners in influencing participation in CED.

  5. Analysis of surface soil moisture patterns in agricultural landscapes using Empirical Orthogonal Functions

    Directory of Open Access Journals (Sweden)

    W. Korres

    2010-05-01

    Full Text Available Soil moisture is one of the fundamental variables in hydrology, meteorology and agriculture. Nevertheless, its spatio-temporal patterns in agriculturally used landscapes that are affected by multiple natural (rainfall, soil, topography, etc.) and agronomic (fertilisation, soil management, etc.) factors are often not well known. The aim of this study is to determine the dominant factors governing the spatio-temporal patterns of surface soil moisture in a grassland and an arable test site that are located within the Rur catchment in Western Germany. Surface soil moisture (0–6 cm) was measured in an approx. 50×50 m grid during 14 and 17 measurement campaigns (May 2007 to November 2008) in the two test sites. To analyse the spatio-temporal patterns of surface soil moisture, an Empirical Orthogonal Function (EOF) analysis was applied and the results were correlated with parameters derived from topography, soil, vegetation and land management to link the patterns to related factors and processes. For the grassland test site, the analysis resulted in one significant spatial structure (the first EOF), which explained 57.5% of the spatial variability, connected to soil properties and topography. The statistical weight of the first spatial EOF is stronger on wet days. The highest temporal variability can be found in locations with a high percentage of soil organic carbon (SOC). For the arable test site, the analysis resulted in two significant spatial structures: the first EOF, which explained 38.4% of the spatial variability, showed a highly significant correlation to soil properties, namely soil texture and soil stone content; the second EOF, which explained 28.3% of the spatial variability, is linked to differences in land management. The soil moisture in the arable test site varied more strongly during dry and wet periods at locations with low porosity. The method applied is capable of identifying the dominant parameters controlling spatio-temporal patterns of
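EOF analysis of a space-time field like these soil moisture grids reduces to a singular value decomposition of the anomaly matrix. A minimal sketch with random stand-in data (the campaign-by-gridpoint dimensions loosely follow the study's setup, but nothing here is the actual field data):

```python
import numpy as np

# Hedged sketch of an EOF analysis: rows are measurement campaigns,
# columns are grid points. The EOFs are the right singular vectors of
# the temporal-anomaly matrix, and each EOF's explained-variance
# fraction comes from the squared singular values.

rng = np.random.default_rng(0)
campaigns, points = 14, 50
data = rng.normal(size=(campaigns, points))   # stand-in soil moisture field

anomalies = data - data.mean(axis=0)          # remove the temporal mean
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)

explained = s**2 / np.sum(s**2)               # variance fraction per EOF
first_eof = vt[0]                             # dominant spatial pattern
first_pc = anomalies @ first_eof              # its time coefficients

print(round(float(explained[0]), 3))
```

With real data, the leading EOF's pattern would then be correlated against soil, topography and management variables, as in the study; with random data the leading fraction is small and carries no physical meaning.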

  6. An empirical analysis of ERP adoption by oil and gas firms

    Science.gov (United States)

    Romero, Jorge

    2005-07-01

    Despite the growing popularity of enterprise-resource-planning (ERP) systems for the information technology infrastructure of large and medium-sized businesses, there is limited empirical evidence on the competitive benefits of ERP implementations. Case studies of individual firms provide insights but do not provide sufficient evidence to draw reliable inferences and cross-sectional studies of firms in multiple industries provide a broad-brush perspective of the performance effects associated with ERP installations. To narrow the focus to a specific competitive arena, I analyze the impact of ERP adoption on various dimensions of performance for firms in the Oil and Gas Industry. I selected the Oil and Gas Industry because several companies installed a specific type of ERP system, SAP R/3, during the period from 1990 to 2002. In fact, SAP was the dominant provider of enterprise software to oil and gas companies during this period. I evaluate performance of firms that implemented SAP R/3 relative to firms that did not adopt ERP systems in the pre-implementation, implementation and post-implementation periods. My analysis takes two different approaches, the first from a financial perspective and the second from a strategic perspective. Using the Sloan (General Motors) model commonly applied in financial statement analysis, I examine changes in performance for ERP-adopting firms versus non-adopting firms along the dimensions of asset utilization and return on sales. Asset utilization is more closely aligned with changes in leanness of operations, and return on sales is more closely aligned with customer-value-added. I test hypotheses related to the timing and magnitude of the impact of ERP implementation with respect to leanness of operations and customer value added. I find that SAP-adopting companies performed relatively better in terms of asset turnover than non-SAP-adopting companies during both the implementation and post-implementation periods and that SAP

  7. Empirical analysis of vegetation dynamics and the possibility of a catastrophic desertification transition.

    Science.gov (United States)

    Weissmann, Haim; Kent, Rafi; Michael, Yaron; Shnerb, Nadav M

    2017-01-01

    The process of desertification in the semi-arid climatic zone is considered by many as a catastrophic regime shift, since the positive feedback of vegetation density on growth rates yields a system that admits alternative steady states. Some support to this idea comes from the analysis of static patterns, where peaks of the vegetation density histogram were associated with these alternative states. Here we present a large-scale empirical study of vegetation dynamics, aimed at identifying and quantifying directly the effects of positive feedback. To do that, we have analyzed vegetation density across 2.5 × 10⁶ km² of the African Sahel region, with spatial resolution of 30 × 30 meters, using three consecutive snapshots. The results are mixed. The local vegetation density (measured at a single pixel) moves towards the average of the corresponding rainfall line, indicating a purely negative feedback. On the other hand, the chance of spatial clusters (of many "green" pixels) to expand in the next census is growing with their size, suggesting some positive feedback. We show that these apparently contradicting results emerge naturally in a model with positive feedback and strong demographic stochasticity, a model that allows for a catastrophic shift only in a certain range of parameters. Static patterns, like the double peak in the histogram of vegetation density, are shown to vary between censuses, with no apparent correlation with the actual dynamical features. Our work emphasizes the importance of dynamic response patterns as indicators of the state of the system, while the usefulness of static modality features appears to be quite limited.

  8. A comparative empirical analysis of statistical models for evaluating highway segment crash frequency

    Directory of Open Access Journals (Sweden)

    Bismark R.D.K. Agbelie

    2016-08-01

    Full Text Available The present study conducted an empirical highway segment crash frequency analysis using fixed-parameters negative binomial and random-parameters negative binomial models. Using 4 years of data from a total of 158 highway segments, with a total of 11,168 crashes, the results from both models are presented, discussed, and compared. About 58% of the selected variables produced normally distributed parameters across highway segments, while the remainder produced fixed parameters. The presence of a noise barrier along a highway segment would increase mean annual crash frequency by 0.492 for 88.21% of the highway segments and decrease crash frequency for the remaining 11.79%. Likewise, the number of vertical curves per mile along a segment would increase mean annual crash frequency by 0.006 for 84.13% of the highway segments and decrease crash frequency for the remaining 15.87%. Thus, constraining the parameters to be fixed across all highway segments would lead to inaccurate conclusions. Although the estimated parameters from both models were consistent in direction, their magnitudes differed significantly. Of the two models, the random-parameters negative binomial model was found to be statistically superior for evaluating highway segment crashes. On average, the marginal effects from the fixed-parameters negative binomial model were significantly overestimated compared with those from the random-parameters negative binomial model.
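The negative binomial specification underlying both models captures overdispersion in crash counts via Var(y) = μ + αμ². A sketch that recovers the dispersion parameter α by method of moments from simulated segment counts (the study estimated full regression models, which this does not attempt to reproduce):

```python
import numpy as np

# Hedged sketch of negative binomial overdispersion: crash counts are
# drawn as a gamma-Poisson mixture, which is exactly NB with
# Var(y) = mu + alpha * mu^2, and alpha is then recovered by method of
# moments. mu and alpha below are assumed values for illustration.

rng = np.random.default_rng(42)
mu, alpha = 5.0, 0.4                      # assumed true mean and dispersion

# NB as gamma-Poisson: rate ~ Gamma(shape=1/alpha, scale=alpha*mu),
# so E[rate] = mu and Var(rate) = alpha * mu^2.
rates = rng.gamma(shape=1.0 / alpha, scale=alpha * mu, size=20000)
crashes = rng.poisson(rates)

m = crashes.mean()
v = crashes.var()
alpha_hat = (v - m) / m**2                # method-of-moments dispersion

print(round(float(m), 2), round(float(alpha_hat), 2))
```

A Poisson model would force Var(y) = μ; the excess variance recovered here is what the negative binomial's α absorbs, and the random-parameters extension additionally lets regression coefficients vary across segments.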

  9. Financial incentives and psychiatric services in Australia: an empirical analysis of three policy changes.

    Science.gov (United States)

    Doessel, D P; Scheurer, Roman W; Chant, David C; Whiteford, Harvey

    2007-01-01

    Australia has a national, compulsory and universal health insurance scheme, called Medicare. In 1996 the Government changed the Medicare Benefit Schedule Book in such a way as to create different financial incentives for consumers or producers of out-of-hospital private psychiatric services, once an individual consumer had received 50 such services in a 12-month period. The Australian Government introduced a new Item (319) to cover some special cases that were affected by the policy change. At the same time, the Commonwealth introduced a 'fee-freeze' for all medical services. The purpose of this study is two-fold. First, it is necessary to describe the three policy interventions (the constraints on utilization, the operation of the new Item, and the general 'fee-freeze'). The new Item policy was essentially a mechanism to 'dampen' the effect of the 'constraint' policy, and these two policy changes will consequently be analysed as a single intervention. The second objective is to evaluate the policy intervention in terms of the (stated) Australian purpose of reducing utilization of psychiatric services, and thus reducing financial outlays. Thus, it is important to separate out the different effects of the three policies that were introduced at much the same time in November 1996 and January 1997. The econometric results indicate that the composite policy change (constraining services and the new 319 Item) had a statistically significant effect. The analysis of the Medicare Benefit (in constant prices) indicates that the 'fee-freeze' policy also had a statistically significant effect. This enables separate determination of the several policy changes. In fact, the empirical results indicate that the Commonwealth Government underestimated the 'savings' that would arise from the 'constraint' policy.

  10. Cloak of compassion, or evidence of elitism? An empirical analysis of white coat ceremonies.

    Science.gov (United States)

    Karnieli-Miller, Orit; Frankel, Richard M; Inui, Thomas S

    2013-01-01

    White coat ceremonies (WCCs) are widely prevalent as a celebration of matriculation in medical schools. Critics have questioned whether these ceremonies can successfully combine the themes of professionalism and humanism, as well as whether the white coat is an appropriate symbol. This study aimed to add a process of empirical assessment to the discussion of these criticisms by analysing the content and messages communicated during these ceremonies. Multiple qualitative methods were used to discern the core meanings expressed in a sample of 18 ceremonies through the analysis of artefacts, words, phrases, statements and narratives. Out of a stratified random sample of 25 US schools of medicine conducting WCCs in 2009, 18 schools submitted video, audio and written materials. All ceremonies followed the same general format, but varied in their content, messages and context. Ceremonies included five principal descriptions of what is symbolised by the white coat, including: commitment to humanistic professional care; a reminder of obligations and privileges; power; the student's need to 'grow', and the white coat as a mantle. Statements about obligations were made three times more frequently than statements about privileges. Key words or phrases in WCCs mapped to four domains: professionalism; morality; humanism, and spirituality. Spoken narratives focused on humility and generosity. The WCCs studied did not celebrate the status of an elite class, but marked the beginning of educational, personal and professional formation processes and urged matriculants to develop into doctors 'worthy of trust'. The ceremonies centred on the persons entering the vocation, who were invited to affirm its calling and obligations by donning a symbolic garb, and to join an ancient and modern tradition of healing and immersion in their community. The schools' articulated construct of the white coat situated it as a symbol of humanism. This study's findings may clarify and guide schools

  11. Agglomeration effects in the labour market: an empirical analysis for Italy

    Directory of Open Access Journals (Sweden)

    Marusca De Castris

    2013-05-01

    Full Text Available Extensive and persistent geographic variability of the unemployment rate within the same region has been attributed to various causes. Some theories identify the “thickness” of markets as the source of positive externalities affecting the labour market, by improving the ability to match the skills requested by firms with those offered by workers. A recent paper by Gan and Zhang (2006) empirically confirms this hypothesis for US labour markets. Agglomeration can be defined as aggregation of people, basically measured by city size, or as aggregation of firms, measured by cluster size (employment or number of plants). However, population location and industrial location are far more similar in the United States than in Europe and in Italy. Our paper aims to evaluate the effects of agglomeration on the local unemployment rate. The new methodological contribution of the study is the identification of both urban and industrial cluster agglomeration effects, using a wide set of control variables. Adjusting the system for the effects of sectoral and size shocks, as well as those relating to geographic structure and policy interventions, the results of our analysis differ from those for the United States. The study stresses the presence of negative and significant urbanisation externalities. We obtain, instead, positive effects from the geographic agglomeration of firms, and their thickness, in a specific area. Furthermore, positive and significant effects can be found in local systems with features of a district. Finally, the model distinguishes the negative effects of urban agglomeration (in terms of population density) from the positive effects of firm agglomeration (in terms of density of local units).

  12. Temporal associations between weather and headache: analysis by empirical mode decomposition.

    Directory of Open Access Journals (Sweden)

    Albert C Yang

    Full Text Available BACKGROUND: Patients frequently report that weather changes trigger headache or worsen existing headache symptoms. Recently, the method of empirical mode decomposition (EMD has been used to delineate temporal relationships in certain diseases, and we applied this technique to identify intrinsic weather components associated with headache incidence data derived from a large-scale epidemiological survey of headache in the Greater Taipei area. METHODOLOGY/PRINCIPAL FINDINGS: The study sample consisted of 52 randomly selected headache patients. The weather time-series parameters were detrended by the EMD method into a set of embedded oscillatory components, i.e. intrinsic mode functions (IMFs. Multiple linear regression models with forward stepwise methods were used to analyze the temporal associations between weather and headaches. We found no associations between the raw time series of weather variables and headache incidence. For decomposed intrinsic weather IMFs, temperature, sunshine duration, humidity, pressure, and maximal wind speed were associated with headache incidence during the cold period, whereas only maximal wind speed was associated during the warm period. In analyses examining all significant weather variables, IMFs derived from temperature and sunshine duration data accounted for up to 33.3% of the variance in headache incidence during the cold period. The association of headache incidence and weather IMFs in the cold period coincided with the cold fronts. CONCLUSIONS/SIGNIFICANCE: Using EMD analysis, we found a significant association between headache and intrinsic weather components, which was not detected by direct comparisons of raw weather data. Contributing weather parameters may vary in different geographic regions and different seasons.
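
    The decomposition step is conceptually simple: repeatedly subtract the mean of the upper and lower extrema envelopes until an oscillatory component (IMF) remains, then peel it off and continue on the residue. Below is a toy sift in this spirit, with a fixed number of sifting passes, cubic-spline envelopes, and crude end handling; it is far simpler than the production EMD used in the study, but shows how a fast component is separated from a slow one:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_once(x):
    """One sifting pass: subtract the mean of the upper/lower envelopes."""
    t = np.arange(len(x))
    maxima = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1
    minima = np.where((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1
    if len(maxima) < 2 or len(minima) < 2:
        return None  # too few extrema: this is a residue/trend, not an IMF
    upper = CubicSpline(maxima, x[maxima])(t)
    lower = CubicSpline(minima, x[minima])(t)
    return x - 0.5 * (upper + lower)

def emd(x, n_imfs=2, n_sift=10):
    """Peel off n_imfs intrinsic mode functions; return (imfs, residue)."""
    imfs, residue = [], x.astype(float).copy()
    for _ in range(n_imfs):
        h = residue.copy()
        for _ in range(n_sift):
            h_next = sift_once(h)
            if h_next is None:
                break
            h = h_next
        imfs.append(h)
        residue = residue - h
    return imfs, residue

# Two-tone test signal: a fast oscillation riding on a slow one
t = np.arange(500)
fast, slow = np.sin(2 * np.pi * t / 10), np.sin(2 * np.pi * t / 100)
imfs, residue = emd(fast + slow)
print(len(imfs))  # → 2; imfs[0] approximates the fast tone
```

    By construction the IMFs plus the residue reconstruct the signal exactly, which is what lets the regression step treat each IMF as a separate "intrinsic weather component".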

  13. Capital structure and value firm: an empirical analysis of abnormal returns

    Directory of Open Access Journals (Sweden)

    Faris Nasif AL-SHUBIRI

    2010-12-01

    Full Text Available This study investigates whether capital structure is value relevant for the equity investor. In this sense, the paper links empirical corporate finance issues with investment analysis. This study also integrates the Miller-Modigliani (MM) framework (1958) into an investment approach by estimating abnormal returns on leverage portfolios in the time-series for different risk classes. For most risk classes, abnormal returns decline in firm leverage. Descriptive statistics and simple and multiple regressions are used to test the significance of the designed indicators. The results indicate a negative relationship between returns and leverage, which could also be due to the market’s pricing of the firm’s ability to raise funds if need be. Further avenues for research in this area include examining the stock return performance of companies based on changes in the leverage of firms relative to their risk classes. It would be particularly noteworthy to examine the rate at which the information content of such changes is incorporated in the share prices of companies, as well as in their long-run returns. This study encompasses all non-financial firms across the five sectors that cover the various classes of risk. It investigates neither the determinants of multiple capital structure choices nor changes in capital structures over time. Our main goal is to explore the effect of capital structure on cumulative abnormal returns. The study also examines a firm’s cumulative average abnormal returns by measuring leverage at the firm level and at the average level for the firm’s industry, and examines other factors, such as size, price-earnings ratio, market-to-book ratio, and beta.

  14. Microbiome analysis and detection of pathogenic bacteria of Penaeus monodon from Jakarta Bay and Bali.

    Science.gov (United States)

    Oetama, Vincensius S P; Hennersdorf, Philipp; Abdul-Aziz, Muslihudeen A; Mrotzek, Grit; Haryanti, Haryanti; Saluz, Hans Peter

    2016-09-30

    Penaeus monodon, the Asian black tiger shrimp, is one of the most widely consumed marine crustaceans worldwide. In this study, we examine and compare the fecal microbiota of P. monodon from the highly polluted waters of Jakarta Bay with those from the less polluted waters of Bali. Using next generation sequencing techniques, we identified potential bacterial pathogens and common viral diseases of shrimp. Proteobacteria (96.08%) was found to be the most predominant phylum, followed by Bacteroidetes (2.32%), Fusobacteria (0.96%), and Firmicutes (0.53%). At the order level, Vibrionales (66.20%) and Pseudoalteromonadales (24.81%) were detected as predominant taxa. qPCR profiling was used as a confirmatory step and further revealed Vibrio alginolyticus and Photobacterium damselae as two potential pathogenic species present in most of the samples. In addition, shrimp viral diseases were discovered among the samples: WSSV in Jakarta free-living samples, YHV in Bali free-living samples, and IHHNV in both.

  15. Myocardial imaging with 201Tl: an analysis of clinical usefulness based on Bayes' theorem

    International Nuclear Information System (INIS)

    Hamilton, G.W.; Trobaugh, G.B.; Ritchie, J.L.; Gould, K.L.; DeRouen, T.A.; Williams, D.L.

    1978-01-01

    Rest-exercise thallium-201 (201Tl) myocardial imaging and rest-exercise electrocardiography were performed in 137 patients with suspected coronary artery disease (CAD). The final diagnosis of coronary disease was made by arteriography. Sensitivity and specificity for the ECG and thallium studies, alone or combined, were then determined. Based on these data, the posttest probability of CAD with a normal or abnormal test was calculated using Bayes' theorem for disease prevalences ranging from 1% to 99%. The difference between the probability of disease with a normal test and the probability of disease with an abnormal test was also calculated for each prevalence range. The results demonstrate that 201Tl imaging discriminates between disease absence or presence better than does the ECG. However, both the ECG and thallium studies provide rather poor discrimination between disease and no disease when the disease prevalence is low (less than 0.20) or high (greater than 0.70). Because of this characteristic, it is unlikely that screening tests for CAD will prove useful unless the disease prevalence in the group under study is in the moderate (0.20 to 0.70) range.
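
    The posttest calculation the authors describe is a direct application of Bayes' theorem to sensitivity, specificity, and prevalence. A sketch with hypothetical test characteristics (the 0.80/0.90 values below are illustrative, not the study's measured ones), showing how the gap between abnormal-test and normal-test probabilities collapses at low and high prevalence:

```python
def posttest_probability(sens: float, spec: float,
                         prevalence: float, positive: bool) -> float:
    """P(disease | test result) via Bayes' theorem."""
    if positive:
        tp = sens * prevalence                  # true positives
        fp = (1.0 - spec) * (1.0 - prevalence)  # false positives
        return tp / (tp + fp)
    fn = (1.0 - sens) * prevalence              # false negatives
    tn = spec * (1.0 - prevalence)              # true negatives
    return fn / (fn + tn)

# Hypothetical sensitivity 0.80, specificity 0.90
for prev in (0.05, 0.50, 0.90):
    p_pos = posttest_probability(0.80, 0.90, prev, True)
    p_neg = posttest_probability(0.80, 0.90, prev, False)
    print(f"prev={prev:.2f}  abnormal={p_pos:.3f}  "
          f"normal={p_neg:.3f}  gap={p_pos - p_neg:.3f}")
```

    The gap (the test's discriminating power) peaks at moderate prevalence, which is the abstract's argument against screening low-prevalence populations.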

  16. The money creation process: A theoretical and empirical analysis for the US

    OpenAIRE

    Levrero, Enrico Sergio; Deleidi, Matteo

    2017-01-01

    The aim of this paper is to assess – on both theoretical and empirical grounds – the two main views regarding the money creation process, namely the endogenous and exogenous money approaches. After analysing the main issues and the related empirical literature, we will apply a VAR and VECM methodology to the United States in the period 1959-2016 to assess the causal relationship between a number of critical variables that are supposed to determine the money supply, i.e., the monetary base, ban...
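
    The VAR step of such an exercise reduces to regressing each variable on lags of all variables. A toy bivariate VAR(1) estimated by least squares on simulated data, purely to show the mechanics (this is not the paper's VECM on US monetary series):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary bivariate VAR(1): y_t = A @ y_{t-1} + noise
A_true = np.array([[0.5, 0.2],
                   [0.1, 0.4]])
T = 5000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# OLS estimate of A: regress y_t (rows of Y) on y_{t-1} (rows of X)
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 2))
```

    Granger-causality tests of the kind used in the paper then amount to testing whether blocks of such coefficients are jointly zero.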

  17. Comparative Phytochemical Analysis of Chinese and Bay Starvine (Schisandra spp.): Potential for Development as a New Dietary Supplement Ingredient.

    Science.gov (United States)

    Lyles, James T; Tyler, Paula; Bradbury, E Jane; Nelson, Kate; Brown, Carl F; Pierce, Stefanie T; Quave, Cassandra L

    2017-11-02

    Schisandra chinensis (Chinese starvine) is a popular dietary supplement with a rich history of use in traditional Chinese medicine. Schisandra glabra (bay starvine) is the only North American representative of the genus, and little is known about its history of traditional use, chemistry, and potential biological activity. In this study, we conducted comparative high-performance liquid chromatography-diode array detector (HPLC-DAD) analysis on S. glabra and S. chinensis fruits. Additional characterization of S. glabra was performed by liquid chromatography-Fourier transform mass spectrometry (LC-FTMS). Quantitative analysis of four bioactive marker compounds revealed that S. glabra does not have statistically higher levels of schisandrin A or schisandrol B than S. chinensis. S. glabra has lower levels of schisandrol A and γ-schisandrin. Total phenolic contents of the two species' fruits were not statistically different. S. glabra had higher total tannin content than S. chinensis. We discuss the relevance of this analysis to the study of S. glabra as a potential dietary supplement ingredient and give specific consideration to the conservation challenges involved in commercially developing a regionally threatened species, even in semicultivated conditions.

  18. Living network meta-analysis compared with pairwise meta-analysis in comparative effectiveness research: empirical study.

    Science.gov (United States)

    Nikolakopoulou, Adriani; Mavridis, Dimitris; Furukawa, Toshi A; Cipriani, Andrea; Tricco, Andrea C; Straus, Sharon E; Siontis, George C M; Egger, Matthias; Salanti, Georgia

    2018-02-28

    To examine whether the continuous updating of networks of prospectively planned randomised controlled trials (RCTs) ("living" network meta-analysis) provides strong evidence against the null hypothesis in comparative effectiveness of medical interventions earlier than the updating of conventional, pairwise meta-analysis. Empirical study of the accumulating evidence about the comparative effectiveness of clinical interventions. Database of network meta-analyses of RCTs identified through searches of Medline, Embase, and the Cochrane Database of Systematic Reviews until 14 April 2015. Network meta-analyses published after January 2012 that compared at least five treatments and included at least 20 RCTs. Clinical experts were asked to identify in each network the treatment comparison of greatest clinical interest. Comparisons were excluded for which direct and indirect evidence disagreed, based on a side, or node, splitting test. Meta-analyses were performed for each selected comparison. Monitoring boundaries of statistical significance were constructed and the evidence against the null hypothesis was considered to be strong when the monitoring boundaries were crossed. A significance level was defined as α=5%, power of 90% (β=10%), and an anticipated treatment effect to detect equal to the final estimate from the network meta-analysis. The frequency of and time to strong evidence against the null hypothesis were compared between pairwise and network meta-analyses. 49 comparisons of interest from 44 networks were included; most (n=39, 80%) were between active drugs, mainly from the specialties of cardiology, endocrinology, psychiatry, and rheumatology. 29 comparisons were informed by both direct and indirect evidence (59%), 13 by indirect evidence (27%), and 7 by direct evidence (14%). Both network and pairwise meta-analysis provided strong evidence against the null hypothesis for seven comparisons, but for an additional 10 comparisons only network meta-analysis provided strong evidence against the null hypothesis (P=0.002).
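
    The monitoring idea is that the cumulative z statistic is checked, at each update, against a boundary that is very strict early on and relaxes toward the nominal critical value as information accrues. A simplified O'Brien-Fleming-style sketch (the cumulative z values and the 1.96 final threshold are illustrative; the study's actual design used α=5% and 90% power with proper alpha-spending machinery):

```python
import math

def obrien_fleming_boundary(info_frac: float, z_final: float = 1.96) -> float:
    """Approximate O'Brien-Fleming-type boundary at information fraction t."""
    return z_final / math.sqrt(info_frac)

# Hypothetical cumulative z statistics as trials accumulate in the network
info_fracs = [0.2, 0.4, 0.6, 0.8, 1.0]
z_stats    = [1.0, 2.2, 3.5, 3.8, 4.0]

crossed_at = None
for t, z in zip(info_fracs, z_stats):
    if abs(z) >= obrien_fleming_boundary(t):
        crossed_at = t  # evidence is declared "strong" at this fraction
        break
print(crossed_at)
```

    A network meta-analysis accumulates information faster than the corresponding pairwise comparison (indirect evidence counts too), so its cumulative z tends to cross such a boundary earlier, which is the paper's central finding.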

  19. Living network meta-analysis compared with pairwise meta-analysis in comparative effectiveness research: empirical study

    Science.gov (United States)

    Nikolakopoulou, Adriani; Mavridis, Dimitris; Furukawa, Toshi A; Cipriani, Andrea; Tricco, Andrea C; Straus, Sharon E; Siontis, George C M; Egger, Matthias

    2018-01-01

    Abstract Objective To examine whether the continuous updating of networks of prospectively planned randomised controlled trials (RCTs) (“living” network meta-analysis) provides strong evidence against the null hypothesis in comparative effectiveness of medical interventions earlier than the updating of conventional, pairwise meta-analysis. Design Empirical study of the accumulating evidence about the comparative effectiveness of clinical interventions. Data sources Database of network meta-analyses of RCTs identified through searches of Medline, Embase, and the Cochrane Database of Systematic Reviews until 14 April 2015. Eligibility criteria for study selection Network meta-analyses published after January 2012 that compared at least five treatments and included at least 20 RCTs. Clinical experts were asked to identify in each network the treatment comparison of greatest clinical interest. Comparisons were excluded for which direct and indirect evidence disagreed, based on a side, or node, splitting test. The frequency of and time to strong evidence against the null hypothesis were compared between pairwise and network meta-analyses. Results 49 comparisons of interest from 44 networks were included; most (n=39, 80%) were between active drugs, mainly from the specialties of cardiology, endocrinology, psychiatry, and rheumatology. 29 comparisons were informed by both direct and indirect evidence (59%), 13 by indirect evidence (27%), and 7 by direct evidence (14%). Both network and pairwise meta-analysis provided strong evidence against the null hypothesis for seven comparisons, but for an additional 10 comparisons only network meta-analysis provided strong evidence against the null hypothesis (P=0.002). The median time to strong evidence against the null hypothesis was 19 years with living network meta-analysis and 23 years with living pairwise meta-analysis (hazard ratio 2.78, 95% confidence interval 1.00 to 7.72, P=0.05). Studies directly comparing

  20. Essays in economics of energy efficiency in residential buildings - An empirical analysis [Dissertation 17157]

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, M.

    2007-07-01

    Energy efficiency in the building sector is a key element of cost-effective climate change and energy policies in most countries throughout the world. (...) However, a gap became increasingly evident between the cost-effectiveness of energy efficiency measures, their benefits, and the necessities from a societal point of view on the one hand, and the actual investments in the building stock - particularly at the moment of re-investment and refurbishment - on the other. The research questions that arose against this background were whether this gap and the low energy efficiency levels and rates could be confirmed empirically and, if so, how the gap could be explained and how it could be overcome by adequate policy measures. To address these questions, the multi-functional character of buildings (i.e. well-conditioned and quiet living rooms and working space) had to be considered. Associated benefits arise at the societal level (ancillary benefits) and at the private level (co-benefits), the latter being increasingly addressed by building labels such as 'Minergie', 'Passive House', and others. It was assumed that these co-benefits are of economic relevance, but empirical evidence regarding their economic value was missing. Thus, these benefits had to be put into an appropriate economic appraisal framework to make use of them in market information and policy instruments, preventing uninformed and biased cost-benefit analyses and decisions at the private and societal levels. The research presented in this PhD thesis aimed to provide a sound empirical basis for the costs and benefits of energy efficiency investments in residential buildings, with a special emphasis on the economic valuation of their co-benefits from a building user perspective (owner-occupiers, purchasers and tenants). In view of the long time horizons in the building sector, techno-economic dynamics were also addressed. The results should be useful

  1. Riverine discharges to Chesapeake Bay: Analysis of long-term (1927–2014) records and implications for future flows in the Chesapeake Bay basin

    Science.gov (United States)

    Rice, Karen; Moyer, Douglas; Mills, Aaron L.

    2017-01-01

    The Chesapeake Bay (CB) basin is under a total maximum daily load (TMDL) mandate to reduce nitrogen, phosphorus, and sediment loads to the bay. Identifying shifts in the hydro-climatic regime may help explain observed trends in water quality. To identify potential shifts, hydrologic data (1927–2014) for 27 watersheds in the CB basin were analyzed to determine the relationships among long-term precipitation and stream discharge trends. The amount, frequency, and intensity of precipitation increased from 1910 to 1996 in the eastern U.S., with the observed increases greater in the northeastern U.S. than the southeastern U.S. The CB watershed spans the north-to-south gradient in precipitation increases, and hydrologic differences have been observed in watersheds north relative to watersheds south of the Pennsylvania—Maryland (PA-MD) border. Time series of monthly mean precipitation data specific to each of 27 watersheds were derived from the Precipitation-elevation Regression on Independent Slopes Model (PRISM) dataset, and monthly mean stream-discharge data were obtained from U.S. Geological Survey streamgage records. All annual precipitation trend slopes in the 18 watersheds north of the PA-MD border were greater than or equal to those of the nine south of that border. The magnitude of the trend slopes for 1927–2014 in both precipitation and discharge decreased in a north-to-south pattern. Distributions of the monthly precipitation and discharge datasets were assembled into percentiles for each year for each watershed. Multivariate correlation of precipitation and discharge within percentiles among the groups of northern and southern watersheds indicated only weak associations. Regional-scale average behaviors of trends in the distribution of precipitation and discharge annual percentiles differed between the northern and southern watersheds. In general, the linkage between precipitation and discharge was weak, with the linkage weaker in the northern watersheds
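
    Trend slopes like those computed for the 27 watershed records are often estimated with the Theil-Sen estimator, the median of all pairwise slopes, which is robust to outlier years. A small sketch on synthetic annual data (the series below is hypothetical, not the PRISM or streamgage records):

```python
import numpy as np

def theil_sen_slope(t, y):
    """Median of pairwise slopes: a trend estimator robust to outlier years."""
    slopes = [(y[j] - y[i]) / (t[j] - t[i])
              for i in range(len(t)) for j in range(i + 1, len(t))]
    return float(np.median(slopes))

rng = np.random.default_rng(7)
years = np.arange(1927, 2015)
# Hypothetical annual discharge: weak upward trend (+0.15/yr) plus noise
discharge = 100 + 0.15 * (years - 1927) + rng.normal(scale=3.0, size=len(years))
print(round(theil_sen_slope(years, discharge), 2))
```

    Comparing such slopes between northern and southern watershed groups is the kind of north-to-south contrast the study reports.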

  2. A Systematic Analysis and Synthesis of the Empirical MOOC Literature Published in 2013-2015

    Science.gov (United States)

    Veletsianos, George; Shepherdson, Peter

    2016-01-01

    A deluge of empirical research became available on MOOCs in 2013-2015 and this research is available in disparate sources. This paper addresses a number of gaps in the scholarly understanding of MOOCs and presents a comprehensive picture of the literature by examining the geographic distribution, publication outlets, citations, data collection and…

  3. Libor and Swap Market Models for the Pricing of Interest Rate Derivatives : An Empirical Analysis

    NARCIS (Netherlands)

    de Jong, F.C.J.M.; Driessen, J.J.A.G.; Pelsser, A.

    2000-01-01

    In this paper we empirically analyze and compare the Libor and Swap Market Models, developed by Brace, Gatarek, and Musiela (1997) and Jamshidian (1997), using panel data on prices of US caplets and swaptions. A Libor Market Model can directly be calibrated to observed prices of caplets, whereas a

  4. An Empirical Analysis of Differences in GDP per Capita and the Role of Human Capital

    Science.gov (United States)

    Sfakianakis, George; Magoutas, Anastasios I.; Georgopoulos, Demosthenes

    2010-01-01

    Using a generalized production function approach and insights from empirical research on the determinants of growth, this paper assesses the relative importance of specific factors in explaining differences in the levels of per capita GDP. Emphasis is placed on education, physical capital accumulation, the share of the public sector in economic…

  5. The impact of category prices on store price image formation : An empirical analysis

    NARCIS (Netherlands)

    Da Silva Lourenço, C.J.; Gijsbrechts, E.; Paap, R.

    2015-01-01

    The authors empirically explore how consumers update beliefs about a store's overall expensiveness. They estimate a learning model of store price image (SPI) formation with the impact of actual prices linked to category characteristics, on a unique dataset combining store visit and purchase

  6. An Empirical Analysis of the Role of the Trading Intensity in Information Dissemination on the NYSE

    NARCIS (Netherlands)

    Spierdijk, L.

    2002-01-01

    Asymmetric information models predict comovements among trade characteristics such as returns, bid-ask spread, and trade volume on one hand and the trading intensity on the other hand.In this paper we investigate empirically the two-sided causality between trade characteristics and trading

  7. Academic Staff Quality in Higher Education: An Empirical Analysis of Portuguese Public Administration Education

    Science.gov (United States)

    Sarrico, Cláudia S.; Alves, André A.

    2016-01-01

    Higher education accreditation frameworks typically consider academic staff quality a key element. This article embarks on an empirical study of what academic staff quality means, how it is measured, and how different aspects of staff quality relate to each other. It draws on the relatively nascent Portuguese experience with study programme…

  8. How Certain are Dutch Households about Future Income? An Empirical Analysis

    NARCIS (Netherlands)

    Das, J.W.M.; Donkers, A.C.D.

    1997-01-01

    The growing literature on precautionary saving clearly indicates the need for measurement of income uncertainty. In this paper we empirically analyze subjective income uncertainty in the Netherlands. Data come from the Dutch VSB panel. We measure income uncertainty directly by asking questions on

  9. Managing Human Resource Capabilities for Sustainable Competitive Advantage: An Empirical Analysis from Indian Global Organisations

    Science.gov (United States)

    Khandekar, Aradhana; Sharma, Anuradha

    2005-01-01

    Purpose: The purpose of this article is to examine the role of human resource capability (HRC) in organisational performance and sustainable competitive advantage (SCA) in Indian global organisations. Design/Methodology/Approach: To carry out the present study, an empirical research on a random sample of 300 line or human resource managers from…

  10. Repatriation Readjustment of International Managers: An Empirical Analysis of HRD Interventions

    Science.gov (United States)

    Osman-Gani, A Ahad M.; Hyder, Akmal S.

    2008-01-01

    Purpose: With increasing interest in overseas business expansion, particularly in the Asia-Pacific region, expatriate management, including repatriation readjustments, has become a critical international human resource development (HRD) issue for multinational enterprises (MNEs). This empirical study therefore aims to investigate the use of HRD…

  11. Empirical analysis of an in-car speed, headway and lane use Advisory system

    NARCIS (Netherlands)

    Schakel, W.J.; Van Arem, B.; Van Lint, J.W.C.

    2014-01-01

    For a recently developed in-car speed, headway and lane use advisory system, this paper investigates empirically advice validity (advice given in correct traffic circumstances), credibility (advice logical to drivers) and frequency. The system has been developed to optimize traffic flow by giving

  12. The value of replicating the data analysis of an empirical evaluation

    African Journals Online (AJOL)

    The aim of this research was to determine whether the results of an empirical evaluation could be confirmed using a different evaluation method. In this investigation the Qualitative Weight and Sum method used by the researchers Graf and List to evaluate several free and open source e-learning software platforms, were ...

  13. SENSITIVITY ANALYSIS IN FLEXIBLE PAVEMENT PERFORMANCE USING MECHANISTIC EMPIRICAL METHOD (CASE STUDY: CIREBON–LOSARI ROAD SEGMENT, WEST JAVA)

    Directory of Open Access Journals (Sweden)

    E. Samad

    2012-02-01

    Full Text Available The Cirebon–Losari flexible pavement, located on the north coast of Java, Indonesia, is severely damaged by overloaded vehicles passing along the road. Improved pavement design and analysis methods are therefore needed. The increase in loads and the quality of material properties can be evaluated through the Mechanistic-Empirical (M-E) method. M-E software such as KENLAYER has been developed to facilitate the transition from empirical to mechanistic design methods. From the KENLAYER analysis, it can be concluded that the effect of overloading on pavement structure performance is difficult to minimize even though the first two layers have relatively high moduli of elasticity. Overloading of 150%, 200%, and 250% has a very significant effect, reducing the pavement design life by 84%, 95%, and 98%, respectively. To increase the pavement service life, it is more effective to manage the allowable load.
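
    The scale of the reported life losses can be sanity-checked with the classical fourth-power approximation, in which pavement damage grows roughly as the fourth power of axle load. The sketch below (an 8.16-tonne standard axle is assumed) gives life reductions of about 80%, 94%, and 97% for 150%, 200%, and 250% loading, close to the 84%, 95%, and 98% that the mechanistic KENLAYER analysis reports:

```python
def load_equivalency_factor(axle_load: float, standard_load: float = 8.16) -> float:
    """Classical fourth-power approximation of relative pavement damage."""
    return (axle_load / standard_load) ** 4

standard = 8.16  # tonnes, assumed standard single-axle load
for overload in (1.5, 2.0, 2.5):  # 150%, 200%, 250% of the standard load
    lef = load_equivalency_factor(overload * standard)
    life_remaining = 1.0 / lef    # design life scales inversely with damage
    print(f"{overload:.1f}x load -> life reduced by {1 - life_remaining:.0%}")
```

    The rule of thumb slightly understates the mechanistic result, but it makes clear why no realistic layer stiffening can offset sustained overloading.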

  14. Empirical mode decomposition and long-range correlation analysis of sunspot time series

    International Nuclear Information System (INIS)

    Zhou, Yu; Leung, Yee

    2010-01-01

    Sunspots, which are the best known and most variable features of the solar surface, affect our planet in many ways. The number of sunspots during a period of time is highly variable and arouses strong research interest. When multifractal detrended fluctuation analysis (MF-DFA) is employed to study the fractal properties and long-range correlation of the sunspot series, some spurious crossover points might appear because of the periodic and quasi-periodic trends in the series. However, many cycles of solar activity are reflected in the sunspot time series. The 11-year cycle is perhaps the most famous cycle of sunspot activity. These cycles pose problems for the investigation of the scaling behavior of sunspot time series. Using different methods to handle the 11-year cycle generally produces totally different results. Using MF-DFA, Movahed and co-workers employed Fourier truncation to deal with the 11-year cycle and found that the series is long-range anti-correlated with a Hurst exponent, H, of about 0.12. However, Hu and co-workers proposed an adaptive detrending method for the MF-DFA and discovered long-range correlation characterized by H≈0.74. To get to the bottom of the problem, in the present paper empirical mode decomposition (EMD), a data-driven adaptive method, is first applied to extract the components with different dominant frequencies. MF-DFA is then employed to study the long-range correlation of the sunspot time series under the influence of these components. On removing the effects of these periods, the natural long-range correlation of the sunspot time series can be revealed. With the removal of the 11-year cycle, a crossover point located at around 60 months is discovered to be a reasonable point separating two different time scale ranges, H≈0.72 and H≈1.49. On removing all cycles longer than 11 years, we have H≈0.69 and H≈0.28. The three cycle-removing methods—Fourier truncation, adaptive detrending and the
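
    The monofractal core of DFA, the machinery underlying the Hurst estimates quoted above, fits the scaling of detrended fluctuations against window size. A compact DFA-1 sketch (not the full MF-DFA of the paper, and with no cycle removal), checked on white noise, where the exponent should be near 0.5:

```python
import numpy as np

def dfa_hurst(x, scales=(8, 16, 32, 64, 128)):
    """Estimate the scaling exponent by detrended fluctuation analysis (DFA-1)."""
    y = np.cumsum(x - np.mean(x))  # integrated profile of the series
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coeffs = np.polyfit(t, seg, 1)  # linear detrend in each window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        flucts.append(np.mean(rms))
    # Exponent = slope of log(fluctuation) versus log(window size)
    h, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return h

rng = np.random.default_rng(42)
h_white = dfa_hurst(rng.normal(size=10000))
print(round(h_white, 2))
```

    Undetrended periodic components bend this log-log line and create the spurious crossovers the abstract describes, which is precisely why the authors remove the EMD cycles first.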

  15. Bioactive conformational generation of small molecules: A comparative analysis between force-field and multiple empirical criteria based methods

    Directory of Open Access Journals (Sweden)

    Jiang Hualiang

    2010-11-01

    Full Text Available Abstract Background Conformational sampling for small molecules plays an essential role in the drug discovery research pipeline. Based on a multi-objective evolution algorithm (MOEA), we developed a conformational generation method called Cyndi in a previous study. In this work, in addition to the Tripos force field of the previous version, Cyndi was updated by incorporating the MMFF94 force field to assess conformational energy more rationally. With two force fields against a larger dataset of 742 bioactive conformations of small ligands extracted from the PDB, a comparative analysis was performed between the pure force field based method (FFBM) and the multiple empirical criteria based method (MECBM) hybridized with different force fields. Results Our analysis reveals that incorporating multiple empirical rules can significantly improve the accuracy of conformational generation. MECBM, which takes both empirical and force field criteria as the objective functions, can reproduce about 54% (within 1 Å RMSD) of the bioactive conformations in the 742-molecule testset, much higher than the pure force field method (FFBM, about 37%). On the other hand, MECBM achieved a more complete and efficient sampling of the conformational space, because the average size of the unique-conformation ensemble per molecule is about 6 times larger than that of FFBM, while the time scale for conformational generation is nearly the same as FFBM. Furthermore, as a complementary comparison between the methods with and without empirical biases, we also tested the performance of the three conformational generation methods in MacroModel in combination with different force fields. Compared with the methods in MacroModel, MECBM is more competitive in retrieving the bioactive conformations in terms of accuracy but has much lower computational cost. Conclusions By incorporating different energy terms with several empirical criteria, the MECBM method can produce more reasonable conformational
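The "within 1 Å RMSD" success criterion used above can be sketched as follows, assuming the conformations are already optimally superposed (real pipelines first align them, e.g. with the Kabsch algorithm); the coordinates here are invented:

```python
import math

def rmsd(conf_a, conf_b):
    """Root-mean-square deviation between two pre-aligned conformations,
    each a list of (x, y, z) atom coordinates in angstroms."""
    assert len(conf_a) == len(conf_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(conf_a, conf_b))
    return math.sqrt(sq / len(conf_a))

# hypothetical reference (bioactive) pose and a generated conformer
ref = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.5, 1.5, 0.0)]
gen = [(0.1, 0.0, 0.0), (1.4, 0.1, 0.0), (1.6, 1.5, 0.1)]
hit = rmsd(ref, gen) <= 1.0  # counts as reproducing the bioactive pose
```

Under this criterion, a generated ensemble "reproduces" a bioactive conformation if any of its members scores an RMSD of 1 Å or less against the crystallographic pose.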

  16. USGS Tampa Bay Pilot Study

    Science.gov (United States)

    Yates, K.K.; Cronin, T. M.; Crane, M.; Hansen, M.; Nayeghandi, A.; Swarzenski, P.; Edgar, T.; Brooks, G.R.; Suthard, B.; Hine, A.; Locker, S.; Willard, D.A.; Hastings, D.; Flower, B.; Hollander, D.; Larson, R.A.; Smith, K.

    2007-01-01

    predictive modeling tools for effective ecosystem adaptive management. As a multidisciplinary organization, the USGS possesses the capability of developing and coordinating an integrated science strategy for estuarine research founded on partnerships and collaborative efforts, multidisciplinary teams of scientists, and integrated field work, data analysis and interpretation, and product development. The primary role of the USGS in Tampa Bay research was defined with our partners based upon this capability to address estuarine issues using an integrated science approach with a regional perspective and within a national context, to complement the numerous ongoing science efforts by state and local agencies that address local issues within Tampa Bay. Six primary components of the USGS Tampa Bay Study address critical gaps within each of the four estuarine system components and focus on: 1.) Examining how natural and man-made physical changes affect ecosystem health through mapping and modeling.

  17. Collection and analysis of remotely sensed data from the Rhode River Estuary Watershed. [ecological parameters of Chesapeake Bay

    Science.gov (United States)

    Jenkins, D. W.

    1972-01-01

    NASA chose the watershed of Rhode River, a small sub-estuary of the Bay, as a representative test area for intensive studies of remote sensing, the results of which could be extrapolated to other estuarine watersheds around the Bay. A broad program of ecological research was already underway within the watershed, conducted by the Smithsonian Institution's Chesapeake Bay Center for Environmental Studies (CBCES) and cooperating universities. This research program offered a unique opportunity to explore potential applications for remote sensing techniques. This led to a joint NASA-CBCES project with two basic objectives: to evaluate remote sensing data for the interpretation of ecological parameters, and to provide essential data for ongoing research at the CBCES. A third objective, dependent upon realization of the first two, was to extrapolate photointerpretive expertise gained at the Rhode River watershed to other portions of the Chesapeake Bay.

  18. Atmospheric Nitrogen Deposition Loadings to the Chesapeake Bay: An Initial Analysis of the Cost Effectiveness of Control Options (1996)

    Science.gov (United States)

    This report examines the cost effectiveness of control options which reduce nitrate deposition to the Chesapeake watershed and to the tidal Bay. The report analyzes current estimates of the reductions expected in the ozone transport region.

  19. [Rationalization and rationing at the bedside. A normative and empirical status quo analysis].

    Science.gov (United States)

    Strech, D

    2014-02-01

    The topic of bedside rationing is increasingly discussed in Germany. Further clarification is needed on how bedside rationing (e.g., in the area of overcare) can be justified despite coexisting inefficiencies. This paper outlines and analyses the relationship between waste avoidance and rationing from an ethical perspective. Empirical findings regarding the status quo of bedside rationing and rationalization are presented. These normative and empirical explorations are then further specified with regard to opportunities for future physician-driven activities to tackle overuse. The self-government partners in Germany should communicate more explicitly, within their communities and to the public, how and with which benchmarks they aim to reduce inefficient health care (overuse) in an appropriate manner. Physician-driven activities such as the "Choosing Wisely®" initiative in the USA could provide a first step to raise awareness of overuse among physicians as well as in the public.

  20. Space evolution model and empirical analysis of an urban public transport network

    Science.gov (United States)

    Sui, Yi; Shao, Feng-jing; Sun, Ren-cheng; Li, Shu-jing

    2012-07-01

    This study explores the space evolution of an urban public transport network, using empirical evidence and a simulation model validated on that data. Public transport patterns primarily depend on the spatial distribution of traffic, the demands of passengers and the expected utility of investors. Evolution is an iterative process of satisfying the needs of passengers and investors based on a given traffic spatial distribution. The temporal change of the urban public transport network is evaluated using both topological and spatial measures. The simulation model is validated against empirical data from nine big cities in China. Statistical analyses of topological and spatial attributes suggest that an evolved network whose traffic demands follow a power-law distribution arranged in a pattern of concentric circles tallies well with these nine cities.
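As a hypothetical illustration of the power-law traffic demands mentioned above (not the authors' estimation procedure), the exponent of an exactly power-law histogram can be recovered by least squares on log-log axes:

```python
import math

def powerlaw_exponent(values, counts):
    """Fit alpha in counts ~ values**(-alpha) by ordinary least squares
    on the log-log histogram (a common first-pass scaling check)."""
    xs = [math.log(v) for v in values]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    xm, ym = sum(xs) / n, sum(ys) / n
    slope = sum((x - xm) * (y - ym) for x, y in zip(xs, ys)) / \
            sum((x - xm) ** 2 for x in xs)
    return -slope  # counts fall as values grow, so the slope is negative

# synthetic histogram following counts = 1000 * k**-2 exactly
ks = [1, 2, 4, 8, 16]
cs = [1000.0 * k ** -2 for k in ks]
alpha = powerlaw_exponent(ks, cs)
```

On real, noisy demand data a robust fit (e.g. maximum likelihood with a lower cutoff) is preferred over this naive log-log regression.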

  1. Ambiguity and Investment Decisions: An Empirical Analysis on Mutual Fund Investor Behaviour

    Directory of Open Access Journals (Sweden)

    Chao Tang

    2017-09-01

    Full Text Available The paper empirically studies the relationship between ambiguity and mutual fund investor behaviour. Theoretical models for investment decisions incorporating ambiguity motivate our analyses. While the models indicate that investors would be less likely to invest in financial markets when ambiguity increases, there is little empirical evidence in naturally occurring financial data with which to examine this hypothesis. In this paper, we test the hypothesis using equity fund flow data as a measure of investment decisions and the degree of disagreement in equity analysts’ predictions about asset returns as a measure of ambiguity. Our results support the hypothesis that increases in ambiguity lead to lower fund flows, and this result remains consistent when adding various control variables affecting fund flows. Besides, we find heterogeneous impacts of ambiguity: equity funds with high yield targets and an active management style are affected more than funds investing in stable stocks, and funds with a larger proportion of institutional investors are more sensitive to and more affected by ambiguity.
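A simple sketch of the ambiguity proxy described above, measuring ambiguity as the cross-sectional dispersion of analyst forecasts for the same asset; the forecast numbers are invented for illustration:

```python
import statistics

def ambiguity(forecasts):
    """Ambiguity proxy: cross-sectional dispersion (sample standard
    deviation) of analysts' return forecasts for one asset."""
    return statistics.stdev(forecasts)

calm = [0.05, 0.055, 0.05, 0.045]      # analysts roughly agree
disputed = [0.10, -0.02, 0.06, -0.08]  # analysts disagree sharply
```

The paper's hypothesis is that flows into funds holding assets like `disputed` should fall relative to funds holding assets like `calm`, other things equal.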

  2. Analysis of consumption behaviour concerning current income and lagged consumption: Empirical evidence from Pakistan

    Directory of Open Access Journals (Sweden)

    Abdul Qayyum Khan

    2014-10-01

    Full Text Available As in other economies, consumption expenditure is the largest component of the Gross Domestic Product (GDP) of the Pakistan economy. The figure has been estimated at around 80 percent of GDP, demonstrating that historically Pakistan’s economic growth is characterized as consumption-led growth. The present paper aims to explore the relationship between income and consumption using annual time series data for the period 1975 to 2012 in Pakistan. For the empirical investigation, the linear regression model and the method of least squares are used as analytical techniques. Empirical results support the existence of a significant positive relationship between income and consumption. The finding suggests that long-term committed planning is indispensable to enhance the productive capacity of the economy, increase employment opportunities and reduce poverty levels more effectively.
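The least-squares setup described above amounts to fitting a Keynesian consumption function C = a + bY, where the slope b is the marginal propensity to consume. A self-contained sketch with invented figures:

```python
def ols(income, consumption):
    """Closed-form least-squares fit of C = a + b * Y; returns (a, b),
    where b is the marginal propensity to consume."""
    n = len(income)
    ym = sum(income) / n
    cm = sum(consumption) / n
    b = sum((y - ym) * (c - cm) for y, c in zip(income, consumption)) / \
        sum((y - ym) ** 2 for y in income)
    return cm - b * ym, b

# illustrative data: consumption tracks roughly 80% of income
income = [100.0, 120.0, 150.0, 180.0, 200.0]
consumption = [85.0, 101.0, 125.0, 149.0, 165.0]
intercept, mpc = ols(income, consumption)
```

With annual time-series data like the paper's, the same closed form applies; lagged-consumption terms would simply add regressors to the right-hand side.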

  3. Endangering of Businesses by the German Inheritance Tax? – An Empirical Analysis

    OpenAIRE

    Houben, Henriette; Maiterth, Ralf

    2011-01-01

    This contribution addresses the substantial tax privilege for businesses introduced by the German Inheritance Tax Act 2009. Advocates of the vast or even entire tax exemption for businesses stress the potential damage of the inheritance tax on businesses, as those often lack liquidity to meet tax liability. This submission tackles this issue empirically based on data of the German Inheritance Tax Statistics and the SOEP. The results indicate that former German inheritance tax law has not enda...

  4. An empirical analysis of gasoline demand in Denmark using cointegration techniques

    International Nuclear Information System (INIS)

    Bentzen, Jan

    1994-01-01

    Danish time-series data covering the period 1948-91 are used in order to estimate short-run and long-run elasticities in gasoline demand. A cointegration test for a stable long-run relationship between the variables in the model proves to be positive, showing a smaller value of the long-run price elasticity than often quoted in empirical studies of gasoline demand. Finally, an error correction model is estimated. (author)
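A toy sketch of the Engle-Granger two-step idea behind such cointegration tests, on synthetic data; a real test would compute an augmented Dickey-Fuller statistic on the residuals and compare it against tabulated critical values rather than inspecting the AR(1) coefficient directly:

```python
import random

def slope_intercept(x, y):
    """Ordinary least squares for y = a + b * x; returns (a, b)."""
    n = len(x)
    xm = sum(x) / n
    ym = sum(y) / n
    b = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y)) / \
        sum((xi - xm) ** 2 for xi in x)
    return ym - b * xm, b

# Step 1: static long-run regression (think: log demand on log price).
rng = random.Random(1)
x = [0.01 * t for t in range(200)]
y = [2.0 - 0.4 * xi + rng.gauss(0.0, 0.02) for xi in x]
a, b = slope_intercept(x, y)
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Step 2: AR(1) coefficient of the residuals. A coefficient well below 1
# means the residuals mean-revert, consistent with cointegration.
_, rho = slope_intercept(resid[:-1], resid[1:])
```

When cointegration holds, the step-1 slope is read as the long-run elasticity, and short-run dynamics are then modelled with an error correction term, as in the abstract.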

  5. An Empirical Analysis of the Interrelationship between Motivation and Stress in the Computing Industry

    OpenAIRE

    Ó Cuirrín, Maitiú

    2007-01-01

    Although a great body of literature exists on the concepts of motivation and stress, no such study has examined the interrelationship between them. The objectives of this thesis are thus, to investigate the factors that motivate/demotivate and cause stress among recently employed computing graduates, as well as examining the implications of these factors both individually and interdependently for both the computing graduate and their employing organisation. An empirical quantitative appro...

  6. Inheritance tax-exempt transfer of German businesses: Imperative or unjustified subsidy? An empirical analysis

    OpenAIRE

    Houben, Henriette; Maiterth, Ralf

    2009-01-01

    This contribution addresses the substantial tax subsidies for businesses introduced by the German Inheritance Tax Act 2009. Advocates in favour of the vast or even entire tax exemption for businesses stress the potential damage of the inheritance tax on businesses, as those often lack liquid assets to meet tax liability. This submission tackles this issue empirically based on data of the German Inheritance Tax Statistics and the SOEP. The results indicate that former German inheritance tax la...

  7. Banking Fragility in Colombia: An Empirical Analysis Based on Balance Sheets

    OpenAIRE

    Ignacio Lozano; Alexander Guarín

    2014-01-01

    In this paper, we study the empirical relationship between credit funding sources and the financial vulnerability of the Colombian banking system. We propose a statistical model to measure and predict banking-fragility episodes associated with credit funding sources classified into retail deposits and wholesale funds. We compute the probability of financial fragility for both the aggregated banking system and the individual banks. Our approach performs a Bayesian averaging of estimated logit ...
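The logit-based fragility probability described above can be sketched as follows; the covariates and coefficients here are invented for illustration and are not estimates from the paper:

```python
import math

def fragility_prob(wholesale_share, credit_growth,
                   b0=-4.0, b1=5.0, b2=3.0):
    """Logit-style probability of a banking-fragility episode as a
    function of the wholesale funding share and credit growth; the
    coefficients are hypothetical placeholders."""
    z = b0 + b1 * wholesale_share + b2 * credit_growth
    return 1.0 / (1.0 + math.exp(-z))

low = fragility_prob(0.1, 0.02)   # deposit-funded bank, slow credit growth
high = fragility_prob(0.6, 0.15)  # wholesale-funded bank, credit boom
```

Bayesian model averaging, as used in the paper, would combine many such logit specifications weighted by their posterior model probabilities rather than rely on a single equation.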

  8. Empirical analysis of the constituent factors of internal marketing orientation at Spanish hotels

    OpenAIRE

    Robledo, José Luis Ruizalba; Arán, María Vallespín

    2014-01-01

    This study, which draws upon specialized literature and empirical evidence, aims to characterize the underlying structure of the construct ‘Internal Marketing Orientation’ (IMO). As a consequence six underlying factors are identified: exchange of values, segmentation of the internal market, internal communication, management concern, implementation of management concern and training. It also evaluates the degree of Spanish hotels’ IMO, classifying them into three different groups according to...

  9. THE PROVINCIALISM OF GLOBAL BRANDS AN EMPIRICAL ANALYSIS OF BRAND EQUITY DIFFERENCES IN MEXICO AND GERMANY

    OpenAIRE

    Thomas Cleff; Lena Fischer; Nadine Walter

    2010-01-01

    The term “global brand” has become widely used by the media and by consumers. Although media and consumers call these brands “global” and centralized marketing departments manage these brands globally – are these “global brands” really global? Can we talk about truly global brand equity? And if there were brand image differences between countries, which factors cause them? The authors conducted an empirical research during May and June 2009 with similarly aged University students in Germany (...

  10. Exchange rate policy and external debt in emerging economies: an empirical analysis

    OpenAIRE

    Cebir, Bilgen

    2012-01-01

    In this thesis, we empirically analyze the effects of exchange rate policy on external debt accumulation in emerging market economies with a sample of 15 countries over the period 1998-2010. The exchange rate policy is captured by the de facto exchange rate classification of Ilzetzki, Reinhart, and Rogoff (2008). This classification is based on the actual exchange rate behavior rather than the officially declared regimes. Therefore, it is expected to better reflect the exchange rate policies act...

  11. Micro-Credit and Rural Poverty: An Analysis of Empirical Evidence

    OpenAIRE

    Chavan, P.; Ramakumar, R.

    2003-01-01

    This paper reviews empirical evidence on NGO-led micro-credit programmes in several developing countries, and compares them with state-led poverty alleviation schemes in India. It shows that micro-credit programmes have brought about a marginal improvement in the beneficiaries' income, though technological improvements are lacking due to its emphasis on ‘survival skills'. Also, in Bangladesh the practice of repayment of Grameen Bank loans by making fresh loans from moneylenders has resulted ...

  12. An Empirical Analysis of Socio-Demographic Stratification in Sweetened Carbonated Soft-Drink Purchasing

    OpenAIRE

    Rhodes, Charles

    2012-01-01

    Caloric soft drinks are the number one source of added sugars in U.S. diets, and are associated with many health problems. Three recent years of household purchase, household demographic, and industry advertising data allow Heckit estimation to identify how specific demographic groups vary in their purchase response to marketing of sweetened carbonated soft drinks (sCSDs) at the product category level. Empirical results reveal unique non-linear patterns of household purchase response to sCSD-...

  13. Strategic Management Tools and Techniques: A Comparative Analysis of Empirical Studies

    Directory of Open Access Journals (Sweden)

    Albana Berisha Qehaja

    2017-01-01

    Full Text Available There is no doubt that strategic management tools and techniques are important parts of the strategic management process. Their use in organizations should be observed in a practice-based context. This paper analyzes the empirical studies on the usage of strategic management tools and techniques. Hence, the main aim of this study is to investigate and analyze which enterprises, according to their country development level, use more strategic management tools and techniques and which of these are used the most. Also, this paper investigates which strategic management tools and techniques are used globally according to the results of empirical studies. The study presents a summary of empirical studies for the period 1990–2015. The research results indicate that more strategic tools and techniques are used in developed countries, followed by developing countries and fewest in countries in transition. This study is likely to contribute to the field of strategic management because it summarizes the most used strategic tools and techniques at the global level according to varying stages of countries’ economic development. Also, the findings from this study may be utilized to maximize the full potential of enterprises and reduce the cases of entrepreneurship failures, through creating awareness of the importance of using strategic management tools and techniques.

  14. Corporate Social Responsibility Applied for Rural Development: An Empirical Analysis of Firms from the American Continent

    Directory of Open Access Journals (Sweden)

    Miguel Arato

    2016-01-01

    Full Text Available Corporate Social Responsibility has been recognized by policymakers and development specialists as a feasible driver for rural development. The present paper explores both theoretically and empirically how firms involved in CSR provide development opportunities to rural communities. The research first evaluates the applied literature on the implementation of CSR by private firms and policymakers as means to foster sustainable rural development. The empirical research analyses the CSR activities of 100 firms from a variety of industries, sizes, and countries to determine the type of companies who are involved in rural development and the kind of activities they deployed. Results from the empirical research show that although rural development initiatives are not relevant for all types of companies, a significant number of firms from a variety of industries have engaged in CSR programs supporting rural communities. Firms appear to be interested in stimulating rural development and seem to benefit from it. This paper also includes an exploration of the main challenges and constraints that firms encounter when encouraging rural development initiatives.

  15. Investigating complex patterns of blocked intestinal artery blood pressure signals by empirical mode decomposition and linguistic analysis

    International Nuclear Information System (INIS)

    Yeh, J-R; Lin, T-Y; Shieh, J-S; Chen, Y; Huang, N E; Wu, Z; Peng, C-K

    2008-01-01

    In this investigation, surgical operations blocking the intestinal artery were conducted on pigs to simulate the condition of acute mesenteric arterial occlusion. The empirical mode decomposition method and an algorithm of linguistic analysis were applied to examine the blood pressure signals in the simulated situation. We assumed that there was some information hidden in the high-frequency part of the blood pressure signal when an intestinal artery is blocked. The empirical mode decomposition method (EMD) was applied to decompose a complex time series into intrinsic mode functions (IMFs). However, end effects and the phenomenon of intermittency damage the consistency of each IMF. Thus, we proposed the complementary ensemble empirical mode decomposition method (CEEMD) to solve the problems of end effects and intermittency. The main wave of the blood pressure signals can be reconstructed from the main components, identified by Monte Carlo verification, and removed from the original signal to derive a riding wave. Furthermore, the concept of linguistic analysis was applied to design a blocking index that characterizes the pattern of the riding wave of blood pressure using measurements of dissimilarity. The blocking index works well to identify the situation in which the sampled time series of the blood pressure signal was recorded. Here, these two quite different algorithms are successfully integrated, and the existence of information hidden in the high-frequency part of the blood pressure signal has been proven

  16. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    Science.gov (United States)

    Xu, Maoqi; Chen, Liang

    2018-01-01

    The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about the read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved.

  17. Investigating complex patterns of blocked intestinal artery blood pressure signals by empirical mode decomposition and linguistic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, J-R; Lin, T-Y; Shieh, J-S [Department of Mechanical Engineering, Yuan Ze University, 135 Far-East Road, Chung-Li, Taoyuan, Taiwan (China); Chen, Y [Far Eastern Memorial Hospital, Taiwan (China); Huang, N E [Research Center for Adaptive Data Analysis, National Central University, Taiwan (China); Wu, Z [Center for Ocean-Land-Atmosphere Studies (United States); Peng, C-K [Beth Israel Deaconess Medical Center, Harvard Medical School (United States)], E-mail: s939205@mail.yzu.edu.tw

    2008-02-15

    In this investigation, surgical operations blocking the intestinal artery were conducted on pigs to simulate the condition of acute mesenteric arterial occlusion. The empirical mode decomposition method and an algorithm of linguistic analysis were applied to examine the blood pressure signals in the simulated situation. We assumed that there was some information hidden in the high-frequency part of the blood pressure signal when an intestinal artery is blocked. The empirical mode decomposition method (EMD) was applied to decompose a complex time series into intrinsic mode functions (IMFs). However, end effects and the phenomenon of intermittency damage the consistency of each IMF. Thus, we proposed the complementary ensemble empirical mode decomposition method (CEEMD) to solve the problems of end effects and intermittency. The main wave of the blood pressure signals can be reconstructed from the main components, identified by Monte Carlo verification, and removed from the original signal to derive a riding wave. Furthermore, the concept of linguistic analysis was applied to design a blocking index that characterizes the pattern of the riding wave of blood pressure using measurements of dissimilarity. The blocking index works well to identify the situation in which the sampled time series of the blood pressure signal was recorded. Here, these two quite different algorithms are successfully integrated, and the existence of information hidden in the high-frequency part of the blood pressure signal has been proven.

  18. Radiocarbon Analysis to Calculate New End-Member Values for Biomass Burning Source Samples Specific to the Bay Area

    Science.gov (United States)

    Yoon, S.; Kirchstetter, T.; Fairley, D.; Sheesley, R. J.; Tang, X.

    2017-12-01

    Elemental carbon (EC), also known as black carbon or soot, is an important particulate air pollutant that contributes to climate forcing through absorption of solar radiation and to adverse human health impacts through inhalation. Both fossil fuel combustion and biomass burning, via residential firewood burning, agricultural burning, wild fires, and controlled burns, are significant sources of EC. Our ability to successfully control ambient EC concentrations requires understanding the contribution of these different emission sources. Radiocarbon (14C) analysis has been increasingly used as an apportionment tool to distinguish between EC from fossil fuel and biomass combustion sources. However, there are uncertainties associated with this method, including: 1) uncertainty associated with the isolation of EC to be used for radiocarbon analysis (e.g., inclusion of organic carbon, blank contamination, recovery of EC, etc.); 2) uncertainty associated with the radiocarbon signature of the end member. The objective of this research project is to use laboratory experiments to evaluate some of these uncertainties, particularly for EC sources that significantly impact the San Francisco Bay Area. Source samples of EC only and of a mix of EC and organic carbon (OC) were produced for this study to represent known emission sources and to approximate the mixing of EC and OC that would be present in the atmosphere. These samples include a combination of methane flame soot, various wood smoke samples (i.e. cedar, oak, sugar pine, pine at various ages, etc.), meat cooking, and smoldering cellulose smoke. EC fractions were isolated using a Sunset Laboratory thermal optical transmittance carbon analyzer. For 14C analysis, samples were sent to Woods Hole Oceanographic Institution for isotope analysis using accelerator mass spectrometry. End-member values and uncertainties for the EC isolation using this method will be reported.
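The apportionment step that such end-member values feed into is a two-end-member isotope mass balance. A minimal sketch with assumed numbers (fossil carbon is 14C-dead, so its fraction modern is ~0; the biomass end-member value used here is hypothetical, being exactly the kind of number this study sets out to measure):

```python
def biomass_fraction(f14_sample, f14_fossil=0.0, f14_biomass=1.08):
    """Two-end-member mass balance: the biomass-derived fraction of EC
    from the sample's measured fraction modern. The default end-member
    values are illustrative assumptions, not results from this study."""
    return (f14_sample - f14_fossil) / (f14_biomass - f14_fossil)

fb = biomass_fraction(0.54)  # a hypothetical ambient EC measurement
```

Uncertainty in the end members propagates directly into `fb`, which is why the study's refinement of Bay Area-specific values matters for apportionment.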

  19. Consistency and Variability in Talk about "Diversity": An Empirical Analysis of Discursive Scope in Swiss Large Scale Enterprises

    Directory of Open Access Journals (Sweden)

    Anja Ostendorp

    2009-02-01

    Full Text Available Traditionally discussions of "diversity" in organizations either refer to an ideal "management" of a diverse workforce or to specific concerns of minorities. The term diversity, however, entails a growing number of translations. Highlighting this diversity of diversity, the concept cannot be merely conceived of as either social-normative or economic-functional. Therefore, the present study empirically scrutinizes the current scope of diversity-talk in Swiss large scale enterprises from a discursive psychological perspective. First, it provides five so-called interpretative repertoires which focus on: image, market, minorities, themes, and difference. Second, it discusses why and how persons oscillate between consistency and variability whenever they draw upon these different repertoires. Finally, it points out possibilities to combine them. This empirical approach to diversity in organizations offers new aspects to the current debate on diversity and introduces crucial concepts of a discursive psychological analysis. URN: urn:nbn:de:0114-fqs090218

  20. Marine electrical resistivity imaging of submarine groundwater discharge: Sensitivity analysis and application in Waquoit Bay, Massachusetts, USA

    Science.gov (United States)

    Henderson, Rory; Day-Lewis, Frederick D.; Abarca, Elena; Harvey, Charles F.; Karam, Hanan N.; Liu, Lanbo; Lane, John W.

    2010-01-01

    Electrical resistivity imaging has been used in coastal settings to characterize fresh submarine groundwater discharge and the position of the freshwater/salt-water interface because of the relation of bulk electrical conductivity to pore-fluid conductivity, which in turn is a function of salinity. Interpretation of tomograms for hydrologic processes is complicated by inversion artifacts, uncertainty associated with survey geometry limitations, measurement errors, and choice of regularization method. Variation of seawater over tidal cycles poses unique challenges for inversion. The capabilities and limitations of resistivity imaging are presented for characterizing the distribution of freshwater and saltwater beneath a beach. The experimental results provide new insight into fresh submarine groundwater discharge at Waquoit Bay National Estuarine Research Reserve, East Falmouth, Massachusetts (USA). Tomograms from the experimental data indicate that fresh submarine groundwater discharge may shut down at high tide, whereas temperature data indicate that the discharge continues throughout the tidal cycle. Sensitivity analysis and synthetic modeling provide insight into resolving power in the presence of a time-varying saline water layer. In general, vertical electrodes and cross-hole measurements improve the inversion results regardless of the tidal level, whereas the resolution of surface arrays is more sensitive to time-varying saline water layer.
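The resistivity-salinity link exploited above is commonly summarized by Archie's law for saturated sediments; a sketch with typical illustrative values (the constants a and m are empirical and site-dependent):

```python
def bulk_resistivity(fluid_resistivity, porosity, a=1.0, m=2.0):
    """Archie's law for fully saturated sediment:
    rho_bulk = a * porosity**(-m) * rho_fluid.
    a and m here are common textbook defaults, not site calibrations."""
    return a * porosity ** (-m) * fluid_resistivity

# illustrative pore fluids in a sandy beach aquifer (ohm-m)
fresh = bulk_resistivity(20.0, 0.35)   # fresh groundwater, ~20 ohm-m
saline = bulk_resistivity(0.25, 0.35)  # seawater, ~0.25 ohm-m
```

The roughly two-orders-of-magnitude contrast between the fresh and saline cases is what makes the freshwater/salt-water interface visible in the tomograms.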

  1. Antibiotics in the coastal environment of the Hailing Bay region, South China Sea: Spatial distribution, source analysis and ecological risks

    International Nuclear Information System (INIS)

    Chen, Hui; Liu, Shan; Xu, Xiang-Rong; Zhou, Guang-Jie; Liu, Shuang-Shuang; Yue, Wei-Zhong; Sun, Kai-Feng; Ying, Guang-Guo

    2015-01-01

    Highlights: • Thirty-eight antibiotics were systematically investigated in the marine environment. • The distribution of antibiotics was significantly correlated with COD and NO3–N. • Untreated domestic sewage was the primary source of antibiotics. • Fluoroquinolones showed a strong sorption capacity onto sediments. • Oxytetracycline, norfloxacin and erythromycin–H2O indicated high risks. - Abstract: In this study, the occurrence and spatial distribution of 38 antibiotics in surface water and sediment samples of the Hailing Bay region, South China Sea, were investigated. Twenty-one, 16 and 15 of the 38 antibiotics were detected, with concentrations ranging from <0.08 (clarithromycin) to 15,163 ng/L (oxytetracycline), 2.12 (methacycline) to 1318 ng/L (erythromycin–H2O), and <1.95 (ciprofloxacin) to 184 ng/g (chlortetracycline) in the seawater, discharged effluent and sediment samples, respectively. The concentrations of antibiotics in the water phase were correlated positively with chemical oxygen demand and nitrate. The source analysis indicated that untreated domestic sewage was the primary source of antibiotics in the study region. Fluoroquinolones showed strong sorption capacity onto sediments due to their high pseudo-partitioning coefficients. Risk assessment indicated that oxytetracycline, norfloxacin and erythromycin–H2O posed high risks to aquatic organisms
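The pseudo-partitioning coefficient invoked above is the ratio of sediment to water concentrations; a sketch with invented concentrations (not values from the study):

```python
def pseudo_partition_coeff(c_sediment_ng_g, c_water_ng_l):
    """Pseudo-partitioning coefficient Kd = Cs / Cw in L/kg, taking the
    sediment concentration in ng/g and water concentration in ng/L."""
    return c_sediment_ng_g * 1000.0 / c_water_ng_l  # ng/kg over ng/L

# hypothetical fluoroquinolone vs. macrolide pair: a higher Kd means
# stronger sorption onto sediment, as reported for fluoroquinolones
kd_fq = pseudo_partition_coeff(150.0, 50.0)
kd_mac = pseudo_partition_coeff(5.0, 200.0)
```

Compounds with high Kd accumulate in the sediment compartment, which is why fluoroquinolones can pose sediment-phase risks even when their water-column concentrations are modest.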

  2. Inglorious Empire

    DEFF Research Database (Denmark)

    Khair, Tabish

    2017-01-01

    Review of 'Inglorious Empire: What the British did to India' by Shashi Tharoor, London, Hurst Publishers, 2017, 296 pp., £20.00.

  3. An empirical analysis of risk-taking in car driving and other aspects of life

    DEFF Research Database (Denmark)

    Abay, Kibrom Araya; Mannering, Fred

    2016-01-01

    The link between risk-taking behavior in various aspects of life has long been an area of debate among economists and psychologists. Using an extensive data set from Denmark, this study provides an empirical investigation of the link between risky driving and risk-taking in other aspects of life… The results in this study suggest that risk-taking behavior in various aspects of life can be associated, and our results corroborate previous evidence on the link between individuals’ risk preferences across various aspects of life. This implies that individuals’ driving behavior, which is commonly…

  4. An Empirical Analysis of Romania’s Comovement with the Euro Zone

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2009-11-01

    Full Text Available In light of adopting the euro in the near future, it is important to assess to what extent the Romanian business cycle evolves in a similar fashion to that of the euro zone. The present study is an empirical investigation of the degree of business cycle synchronization between Romania and the euro area, based on macroeconomic series that capture the cyclical features of the two economies. The results indicate that the most recent period, characterized by major economic and financial turmoil, has led to an increase in the degree of comovement of the Romanian economy with that of the euro area.

  5. Ancient DNA Analysis Suggests Negligible Impact of the Wari Empire Expansion in Peru's Central Coast during the Middle Horizon.

    Science.gov (United States)

    Valverde, Guido; Barreto Romero, María Inés; Flores Espinoza, Isabel; Cooper, Alan; Fehren-Schmitz, Lars; Llamas, Bastien; Haak, Wolfgang

    2016-01-01

    The analysis of ancient human DNA from South America allows the exploration of pre-Columbian population history through time and to directly test hypotheses about cultural and demographic evolution. The Middle Horizon (650-1100 AD) represents a major transitional period in the Central Andes, which is associated with the development and expansion of ancient Andean empires such as Wari and Tiwanaku. These empires facilitated a series of interregional interactions and socio-political changes, which likely played an important role in shaping the region's demographic and cultural profiles. We analyzed individuals from three successive pre-Columbian cultures present at the Huaca Pucllana archaeological site in Lima, Peru: Lima (Early Intermediate Period, 500-700 AD), Wari (Middle Horizon, 800-1000 AD) and Ychsma (Late Intermediate Period, 1000-1450 AD). We sequenced 34 complete mitochondrial genomes to investigate the potential genetic impact of the Wari Empire in the Central Coast of Peru. The results indicate that genetic diversity shifted only slightly through time, ruling out a complete population discontinuity or replacement driven by the Wari imperialist hegemony, at least in the region around present-day Lima. However, we caution that the very subtle genetic contribution of Wari imperialism at the particular Huaca Pucllana archaeological site might not be representative for the entire Wari territory in the Peruvian Central Coast.

  6. Ancient DNA Analysis Suggests Negligible Impact of the Wari Empire Expansion in Peru's Central Coast during the Middle Horizon.

    Directory of Open Access Journals (Sweden)

    Guido Valverde

    Full Text Available The analysis of ancient human DNA from South America allows the exploration of pre-Columbian population history through time and to directly test hypotheses about cultural and demographic evolution. The Middle Horizon (650-1100 AD) represents a major transitional period in the Central Andes, which is associated with the development and expansion of ancient Andean empires such as Wari and Tiwanaku. These empires facilitated a series of interregional interactions and socio-political changes, which likely played an important role in shaping the region's demographic and cultural profiles. We analyzed individuals from three successive pre-Columbian cultures present at the Huaca Pucllana archaeological site in Lima, Peru: Lima (Early Intermediate Period, 500-700 AD), Wari (Middle Horizon, 800-1000 AD) and Ychsma (Late Intermediate Period, 1000-1450 AD). We sequenced 34 complete mitochondrial genomes to investigate the potential genetic impact of the Wari Empire in the Central Coast of Peru. The results indicate that genetic diversity shifted only slightly through time, ruling out a complete population discontinuity or replacement driven by the Wari imperialist hegemony, at least in the region around present-day Lima. However, we caution that the very subtle genetic contribution of Wari imperialism at the particular Huaca Pucllana archaeological site might not be representative for the entire Wari territory in the Peruvian Central Coast.

  7. The Validity of Chlorophyll-a Estimation by Sun-Induced Fluorescence in Estuarine Waters: An Analysis of Long-term (2003-2011) Water Quality Data from Tampa Bay, Florida (USA)

    Science.gov (United States)

    Moreno-Madrinan, Max Jacobo; Fischer, Andrew

    2012-01-01

    Phytoplankton concentration, or chlorophyll-a, is an important water characteristic, and its satellite observation is critical to monitoring coastal water quality. However, the optical properties of estuarine and coastal waters are highly variable and complex and pose a great challenge for accurate analysis. Constituents such as suspended solids and dissolved organic matter, and the overlapping and uncorrelated absorptions in the blue region of the spectrum, render the blue-green ratio algorithms for estimating chlorophyll-a inaccurate. Measurement of sun-induced chlorophyll fluorescence, on the other hand, which utilizes the near-infrared portion of the electromagnetic spectrum, may provide a better estimate of phytoplankton concentrations. While modelling and laboratory studies have illustrated both the utility and limitations of satellite baseline algorithms based on the sun-induced chlorophyll fluorescence signal, few have examined the empirical validity of these algorithms using a comprehensive long-term in situ data set. In an unprecedented analysis of long-term (2003-2011) in situ monitoring data from Tampa Bay, Florida (USA), we assess the validity of the FLH product from the Moderate Resolution Imaging Spectroradiometer (MODIS) against chlorophyll-a and a suite of water quality parameters taken in a variety of conditions throughout a large, optically complex estuarine system. A systematic analysis of sampling sites throughout the bay is undertaken to understand how the relationship between FLH and in situ chlorophyll-a responds to varying conditions within the estuary, including water depth, distance from shore and structures, and eight water quality parameters. Of the 39 stations for which data were derived, 22 showed significant correlations when the FLH product was matched with in situ chlorophyll-a data.
    The correlations (r2) for individual stations within Tampa Bay ranged between 0.67 (n=28, p<0.01) and −0.457 (n=12, p=0.016), indicating that…
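    The station-by-station match-up described above reduces to a correlation between satellite FLH and in situ chlorophyll-a. A minimal sketch with invented match-up values (the function computes Pearson's r; the reported r2 is its square):

```python
# Pearson correlation between satellite FLH and in situ chlorophyll-a for one
# hypothetical station. The five match-up pairs below are invented.
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

flh   = [0.010, 0.015, 0.022, 0.030, 0.041]   # satellite FLH (arbitrary units)
chl_a = [2.1, 3.0, 4.4, 6.2, 8.0]             # in situ chlorophyll-a (ug/L)
r = pearson_r(flh, chl_a)
print(r, r * r)  # r**2 is the per-station quantity reported in the abstract
```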

  8. Factors of economic growth in Palestine: an empirical analysis during the period 1994-2013

    Directory of Open Access Journals (Sweden)

    Omar Mahmoud Abu-Eideh

    2014-07-01

    Full Text Available This study aimed to analyze the impact of the size of the domestic working labour force, real gross domestic capital formation, real domestic exports and imports of goods and services, and political instability on real gross domestic product (RGDP) in Palestine during the period 1994-2013. To examine the empirical relationship between these explanatory variables and real GDP growth, the study adopted a standardized Cobb-Douglas production function, using the annual official data of the Palestinian Central Bureau of Statistics (PCBS) and applying the Ordinary Least Squares (OLS) method with second-order autocorrelation techniques. The empirical results of the model indicated a positive relationship between the size of the domestic working labour force, real gross domestic capital formation, real domestic exports and real gross domestic product (RGDP), and a negative relationship between real domestic imports of goods and services, political instability and real GDP growth. The study offered several recommendations that can boost growth, the most important of which is the urgent need for more investment in the economy, as it leads to more formation of domestic capital, which contributes to economic growth in many ways.
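    The estimation step described above, OLS on a log-linearized Cobb-Douglas production function, can be sketched on synthetic data (the series and coefficients below are illustrative, not the PCBS data):

```python
# OLS estimation of log Y = a + b1*log L + b2*log K + e, the log-linear form
# of a Cobb-Douglas production function. All series are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 20                                   # e.g. 20 annual observations
log_L = rng.normal(10.0, 0.2, n)         # log labour force
log_K = rng.normal(9.0, 0.3, n)          # log capital formation
log_Y = 1.0 + 0.6 * log_L + 0.3 * log_K + rng.normal(0.0, 0.01, n)

X = np.column_stack([np.ones(n), log_L, log_K])   # design matrix with constant
beta, *_ = np.linalg.lstsq(X, log_Y, rcond=None)  # OLS via least squares
print(beta)  # estimated [intercept, labour elasticity, capital elasticity]
```

    In the Cobb-Douglas form the slope coefficients are output elasticities, which is what makes the log-linear OLS regression a natural estimation strategy for growth accounting.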

  9. An empirical analysis of price expectations formation: Evidence from the crude oil reserves acquisitions market

    International Nuclear Information System (INIS)

    Vielhaber, L.M.

    1991-01-01

    Reasons for the recent scant empirical attention to price expectations theory are twofold. First, except for futures markets and the occasional expectations survey, price expectations are rarely documented. Second, results of empirical tests of rational expectations are fundamentally flawed by the subjective input of the researcher. Subjectivity taints the results of the test, first, in the form of model specification and, second, in the form of the identification of the relevant information set. This study addresses each of these shortcomings. First, crude oil price expectations are recovered in the market for reserves by using a standard engineering model commonly used in reserves evaluation. Second, the crude oil futures market is used to estimate an index of information. This index circumvents the need to subjectively identify the elements of the information set, removing a key source of subjective input. The results show that agents involved in the crude oil reserves acquisitions market form expectations of futures prices in a way that does not conform with the adaptive expectations model.

  10. Measuring health lifestyles in a comparative analysis: theoretical issues and empirical findings.

    Science.gov (United States)

    Abel, T

    1991-01-01

    The concept of lifestyle bears great potential for research in medical sociology. Yet weaknesses in current methods have restrained lifestyle research from realizing its full potential. The present focus is on the links between theoretical conceptions and their empirical application. The paper is divided into two parts. The first part provides a discussion of basic theoretical and methodological issues. In particular, selected lines of thought from Max Weber are presented and their usefulness in providing a theoretical frame of reference for health lifestyle research is outlined. Next, a theory-guided definition of the subject matter is introduced and basic problems in empirical applications of theoretical lifestyle concepts are discussed. In its second part the paper presents findings from comparative lifestyle analyses. Data from the U.S. and West Germany are utilized to explore issues of measurement equivalence and theoretical validity. Factor analyses indicate high conceptual equivalence for new measures of health lifestyle dimensions in both the U.S. and West Germany. Divisive cluster analyses detect three distinct lifestyle groups in both nations. Implications for future lifestyle research are discussed.

  11. Electronic contributions to the transport properties and specific heat of solid UO2: an empirical, self-consistent analysis

    International Nuclear Information System (INIS)

    Hyland, G.J.; Ralph, J.

    1982-07-01

    From an empirical, self-consistent analysis of new high-temperature data on the thermoelectric Seebeck coefficient and d.c. electrical conductivity, the value of the free energy controlling the equilibrium of the thermally induced reaction 2U4+ ⇌ U3+ + U5+ is determined (treating the U3+ and U5+ as small polarons) and used to calculate the contribution of the process to the high-temperature thermal conductivity and specific heat of UO2. It is found that the transport properties can be completely accounted for in this way, but not the anomalous rise in specific heat, the origin of which remains obscure. (U.K.)
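    For the disproportionation reaction 2U4+ ⇌ U3+ + U5+, mass action links the small-polaron fraction to the controlling free energy. A hedged sketch of that relation (the ΔG value below is illustrative, not the one fitted in the paper):

```python
# Mass action for 2U(4+) <=> U(3+) + U(5+): with x the fraction of U(3+)
# (= U(5+)) sites, x**2 / (1 - 2*x)**2 = K = exp(-dG/(kB*T)).
# Solving x/(1-2x) = sqrt(K) gives x = r/(1+2r) with r = sqrt(K), x < 0.5.
import math

K_B = 8.617333e-5  # Boltzmann constant, eV/K

def polaron_fraction(dG_eV, T_K):
    """Equilibrium fraction of disproportionated (small-polaron) sites."""
    r = math.sqrt(math.exp(-dG_eV / (K_B * T_K)))
    return r / (1.0 + 2.0 * r)

for T in (1500, 2000, 2500):
    print(T, polaron_fraction(2.0, T))  # fraction rises steeply with T
```

    The steep thermal activation of this fraction is what lets such a reaction contribute noticeably to high-temperature transport properties while being negligible at low temperature.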

  12. Manager’s decision-making in organizations – empirical analysis of bureaucratic vs. learning approach

    OpenAIRE

    Jana Frenová; Daniela Hrehová; Eva Bolfíková

    2010-01-01

    The paper is focused on the study of manager’s decision-making with respect to the basic model of learning organization, presented by P. Senge as a system model of management. On one hand, the empirical research was conducted in connection with key dimensions of organizational learning such as: 1. system thinking, 2. personal mastery, 3. mental models, 4. team learning, 5. building shared vision and 6. dynamics causes. On the other hand, the research was connected with the analysis of the bur...

  13. X-ray spectrum analysis of multi-component samples by a method of fundamental parameters using empirical ratios

    International Nuclear Information System (INIS)

    Karmanov, V.I.

    1986-01-01

    A variant of the fundamental-parameter method is suggested, based on empirical relations between the corrections for absorption and additional excitation (enhancement) and the absorbing characteristics of samples. The method is used for X-ray fluorescence analysis of multi-component samples of charges of welding electrodes. It is shown that application of the method is justified only for determination of titanium, calcium and silicon content in charges, taking into account only the corrections for absorption. Iron and manganese content can be calculated by the simpler external-standard method.

  14. Trace element and stable isotope analysis of fourteen species of marine invertebrates from the Bay of Fundy, Canada.

    Science.gov (United States)

    English, Matthew D; Robertson, Gregory J; Mallory, Mark L

    2015-12-15

    The Bay of Fundy, Canada, is a macrotidal bay with a highly productive intertidal zone, hosting a large abundance and diversity of marine invertebrates. We analysed trace element concentrations and stable isotopic values of δ15N and δ13C in 14 species of benthic marine invertebrates from the Bay of Fundy's intertidal zone to investigate bioaccumulation or biodilution of trace elements in the lower levels of this marine food web. Barnacles (Balanus balanus) consistently had significantly greater concentrations of trace elements than the other species studied, but otherwise we found low concentrations of non-essential trace elements. In the range of trophic levels that we studied, we found limited evidence of bioaccumulation or biodilution of trace elements across species, likely because the species examined occupy similar trophic levels in different food chains. Copyright © 2015 Elsevier Ltd. All rights reserved.
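    Trophic position in studies like this one is commonly estimated from δ15N with a fixed enrichment per trophic level. The formula and numbers below are a standard textbook sketch (baseline organism at trophic level 2, ~3.4‰ enrichment per level), not necessarily the authors' exact method or data:

```python
# Trophic level from nitrogen stable isotopes:
#   TL = base_TL + (d15N_consumer - d15N_baseline) / enrichment
# with ~3.4 per-mil enrichment per trophic step (an assumed standard value).

def trophic_level(d15N_consumer, d15N_baseline, base_tl=2.0, enrichment=3.4):
    """Estimated trophic level of a consumer relative to a baseline species."""
    return base_tl + (d15N_consumer - d15N_baseline) / enrichment

tl = trophic_level(10.2, 6.8)  # invented per-mil values for illustration
print(tl)  # ~ 3.0: one full trophic step above the baseline
```

    Species spanning only a narrow δ15N range occupy similar trophic levels, which is consistent with the limited bioaccumulation signal reported in this record.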

  15. Flood control construction of Shidao Bay nuclear power plant and safety analysis for hypothetical accident of HTR-PM

    International Nuclear Information System (INIS)

    Chen Yongrong; Zhang Keke; Zhu Li

    2014-01-01

    A series of events triggered by a tsunami eventually led to the Fukushima nuclear accident. To draw lessons from that accident and apply them to the flood-control construction of the Shidao Bay nuclear power plant, we compare the design of the Shidao Bay flood-control works against the state laws and regulations and verify it. By reviewing the history of domestic tsunamis and the national research conducted before and after the Fukushima accident, we assess the tsunami hazards at the Shidao Bay nuclear power plant. In addition, in order to verify the safety of the HTR-PM, we anticipate the accidents that could follow a 'superposition of an earthquake and an extreme flood', and analyse the abilities and measures of the HTR-PM to deal with these beyond-design-basis accidents (BDBA). (author)

  16. Comparative empirical analysis of flow-weighted transit route networks in R-space and evolution modeling

    Science.gov (United States)

    Huang, Ailing; Zang, Guangzhi; He, Zhengbing; Guan, Wei

    2017-05-01

    Urban public transit systems are typical mixed complex networks with dynamic flow, and their evolution should be a process coupling topological structure with flow dynamics, which has received little attention. This paper uses the R-space representation to make a comparative empirical analysis of Beijing's flow-weighted transit route networks (TRNs) and finds that Beijing's TRNs in both 2011 and 2015 exhibit scale-free properties. As such, we propose a flow-driven evolution model to simulate the development of TRNs, taking into account the passengers' dynamical behaviors triggered by topological change. The model treats the evolution of a TRN as an iterative process: at each time step, a certain number of new routes are generated, driven by travel demands, which leads to dynamical evolution of the new routes' flow and triggers perturbations in nearby routes that further impact the next round of opening new routes. We present a theoretical analysis based on mean-field theory, as well as a numerical simulation of this model. The results obtained agree well with our empirical analysis, indicating that the model can simulate TRN evolution with scale-free properties for the distributions of node strength and degree. The purpose of this paper is to illustrate the global evolutionary mechanism of transit networks, which can be used to develop planning and design strategies for real TRNs.
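    The scale-free claim above amounts to a power-law-like degree/strength distribution. A crude check, a linear fit in log-log space of a degree histogram, can be sketched on synthetic heavy-tailed data (the sampling scheme and exponent are illustrative, not the Beijing TRN data):

```python
# Rough scale-free check: draw a heavy-tailed degree sequence, histogram it,
# and fit a line to log(count) vs log(degree). A clearly negative slope over
# a wide range is the power-law-like signature; rigorous fitting would use
# maximum likelihood rather than this simple regression.
import math
import random
from collections import Counter

random.seed(1)
# Inverse-transform sampling of a Pareto-like degree sequence (tail index 1.5)
degrees = [int(random.random() ** (-1 / 1.5)) for _ in range(5000)]

counts = Counter(degrees)
pts = [(math.log(k), math.log(c)) for k, c in counts.items() if k >= 2]
n = len(pts)
mx = sum(x for x, _ in pts) / n
my = sum(y for _, y in pts) / n
slope = (sum((x - mx) * (y - my) for x, y in pts)
         / sum((x - mx) ** 2 for x, _ in pts))
print(slope)  # strongly negative: heavy-tailed, power-law-like distribution
```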

  17. Endangering of Businesses by the German Inheritance Tax? – An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Henriette Houben

    2011-04-01

    Full Text Available This contribution addresses the substantial tax privilege for businesses introduced by the German Inheritance Tax Act 2009. Advocates of the vast or even entire tax exemption for businesses stress the potential damage of the inheritance tax to businesses, as those often lack the liquidity to meet the tax liability. This contribution tackles the issue empirically, based on data from the German Inheritance Tax Statistics and the SOEP. The results indicate that former German inheritance tax law did not endanger transferred businesses. Hence, there is no need for the tremendous tax privilege for businesses in current German inheritance tax law. An alternative flat inheritance tax without tax privileges, which meets revenue neutrality per tax class according to current tax law, in some cases produces relatively high tax burdens that might strain businesses.

  18. Competition policies and environmental quality: Empirical analysis of the electricity sector in OECD countries

    International Nuclear Information System (INIS)

    Asane-Otoo, Emmanuel

    2016-01-01

    Over the last decades, electricity markets across OECD countries have been subjected to profound structural changes with far-reaching implications for the economy and the environment. This paper investigates the effect of restructuring – changes in entry regulations, the degree of vertical integration and ownership structure – on GHG emissions. The findings show that competition policies – particularly reducing the degree of vertical integration and increasing privatization – correlate negatively with emission intensity. However, the environmental effect of reducing market entry barriers is generally insignificant. The integration of competition and stringent environmental policies is required to reduce GHG emissions and improve environmental quality. - Highlights: •Empirical study on competition policies and GHG emissions from the electricity sector. •Product market regulation scores for OECD countries are used to measure the extent of competition. •Evidence of a positive relationship between competition policies and environmental quality. •Integration of competition and stringent environmental policies is recommended.

  19. The impact of e-ticketing technique on customer satisfaction: an empirical analysis

    Directory of Open Access Journals (Sweden)

    Mazen Kamal Qteishat

    2015-09-01

    Full Text Available Recently, internet technology has come to be considered the information and communication technology most used by organizations: it can ease the process of transactions and reinforce the relation between companies and customers. This investigation empirically examines the impact of the e-ticketing technique on customer satisfaction; a convenience sample of Jordanian airline passengers who had booked flights in the last 12 months through companies offering e-ticketing services was acquired. The findings indicate that customer satisfaction with e-ticketing services was influenced by all of the independent variables measured (data security, customer and technical support, and user-friendliness), each of which was noted to have a significant impact on customer satisfaction with e-ticketing services.

  20. Empirical correction of crosstalk in a low-background germanium γ-γ analysis system

    International Nuclear Information System (INIS)

    Keillor, M.E.; Erikson, L.E.; Aalseth, C.E.; Day, A.R.; Fuller, E.S.; Glasgow, B.D.; Hoppe, E.W.; Hossbach, T.W.; Mizouni, L.K.; Myers, A.W.

    2013-01-01

    The Pacific Northwest National Laboratory (PNNL) is currently developing a custom software suite capable of automating many of the tasks required to accurately analyze coincident signals within gamma spectrometer arrays. During the course of this work, significant crosstalk was identified in the energy determination for spectra collected with a new low-background intrinsic germanium (HPGe) array at PNNL. The HPGe array is designed for high detection efficiency, ultra-low-background performance, and sensitive γ-γ coincidence detection. The first half of the array, a single cryostat containing seven HPGe crystals, was recently installed in a new shallow underground laboratory facility. This update presents a brief review of the germanium array, describes the observed crosstalk, and presents a straightforward empirical correction that significantly reduces the impact of this crosstalk on the spectroscopic performance of the system. (author)
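    The record does not give the form of the empirical correction. One minimal linear scheme (an assumption for illustration, not PNNL's actual procedure) subtracts a small fitted fraction of the other channels' energies from each channel:

```python
# Linear crosstalk correction: E_corrected = E_measured - C @ E_measured,
# where C is a crosstalk matrix with zero diagonal whose small off-diagonal
# couplings would, in practice, be fitted from coincidence calibration data.
# The coupling values and energies below are invented.
import numpy as np

def correct_crosstalk(E_measured, C):
    """Subtract linear crosstalk between channels of a detector array."""
    return E_measured - C @ E_measured

C = np.array([[0.0, 0.002],
              [0.001, 0.0]])          # illustrative couplings, not fitted values
E = np.array([1332.5, 1173.2])        # keV, e.g. a 60Co cascade in two crystals
Ec = correct_crosstalk(E, C)
print(Ec)  # each channel shifted down by its neighbour's small contribution
```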

  1. DOES ENERGY CONSUMPTION VOLATILITY AFFECT REAL GDP VOLATILITY? AN EMPIRICAL ANALYSIS FOR THE UK

    Directory of Open Access Journals (Sweden)

    Abdul Rashid

    2013-10-01

    Full Text Available This paper empirically examines the relation between energy consumption volatility and unpredictable variations in real gross domestic product (GDP) in the UK. Estimating a Markov-switching ARCH model, we find significant regime switching in the behavior of both energy consumption and GDP volatility. The results from the Markov regime-switching model show that the variability of energy consumption has a significant role to play in determining the behavior of GDP volatility. Moreover, the results suggest that the impacts of unpredictable variations in energy consumption on GDP volatility are asymmetric, depending on the intensity of the volatility. In particular, we find that while there is no significant contemporaneous relationship between energy consumption volatility and GDP volatility in the first (low-volatility) regime, GDP volatility is significantly positively related to the volatility of energy utilization in the second (high-volatility) regime.
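    The two-regime structure described above can be illustrated by simulating a latent Markov chain that switches between low- and high-volatility states. The transition probabilities and volatilities below are invented, and this sketch only generates data; estimating such a model (as the paper does) requires a dedicated Markov-switching ARCH routine:

```python
# Simulate a two-state Markov-switching volatility process: a latent state
# persists with high probability and sets the standard deviation of returns.
import random
import statistics

random.seed(42)
stay = {0: 0.98, 1: 0.95}     # P(remain in current regime) -- assumed values
sigma = {0: 0.5, 1: 2.0}      # low- / high-volatility regimes -- assumed
state = 0
states, returns = [], []
for _ in range(10000):
    if random.random() > stay[state]:
        state = 1 - state     # switch regime
    states.append(state)
    returns.append(random.gauss(0.0, sigma[state]))

lo = statistics.stdev(r for r, s in zip(returns, states) if s == 0)
hi = statistics.stdev(r for r, s in zip(returns, states) if s == 1)
print(lo, hi)  # sample volatilities recover the two-regime difference
```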

  2. Beyond the use of food supplements: An empirical analysis in Italy

    Directory of Open Access Journals (Sweden)

    A. LOMBARDI

    2016-03-01

    Full Text Available This paper aims to profile Italian consumers of food supplements based upon their psychometric patterns and demographic characteristics. The FTNS scale is used to empirically assess and evaluate the role of technophobic/technophilic consumer traits in determining the decision whether or not to consume supplements and vitamins and the frequency of their consumption. An ad-hoc survey was carried out in 2012 involving 400 residents of a metropolitan area in southern Italy. Our results show that women have a higher consumption frequency of dietary supplements, while age, BMI and education influence the propensity to consume. As regards food habits, the propensity to use dietary supplements is positively associated with the consumption of bread and pasta, red meat and pulses, and negatively with the consumption of fruit and cheese. Finally, the research supports the role of technophobic traits as consistent and significant determinants of the consumption frequency of dietary supplements.

  3. An empirical analysis of gasoline price convergence for 20 OECD countries

    Energy Technology Data Exchange (ETDEWEB)

    Bentzen, J.

    2003-07-01

    Two decades have passed now since the oil price shocks of the 1970s, and since then energy prices have, apart from short periods of price instability, evolved relatively smoothly in the industrialized countries. Energy taxes in many countries differ markedly, thereby causing differences in final energy prices, but as similar tax levels are becoming more common, e.g. in the European Union, convergence of energy prices might be expected to appear. In the present paper, national gasoline price data covering the time period since the 1970s for a sample of OECD countries are used to test for this often addressed topic of convergence. The empirical part of the paper applies different time-series-based tests of convergence, where gasoline prices exhibit convergence for most OECD-Europe countries when US$ is used for measurement of the energy prices, indicating that a convergence or tax harmonization process is taking place for these countries. (au)
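    The time-series convergence tests mentioned above are unit-root tests on price differentials. A stripped-down Dickey-Fuller-style regression (no lags, no critical values; an illustration on simulated data, not the paper's procedure) looks like this:

```python
# Regress the change of a log-price gap on its lagged level. A clearly
# negative coefficient means the gap decays toward zero (mean reversion),
# which is the convergence signature; the gap series here is simulated AR(1).
import random

random.seed(0)
gap = [1.0]
for _ in range(500):                       # AR(1) gap with rho = 0.9
    gap.append(0.9 * gap[-1] + random.gauss(0.0, 0.05))

dy = [gap[t] - gap[t - 1] for t in range(1, len(gap))]
lag = gap[:-1]
n = len(dy)
ml, md = sum(lag) / n, sum(dy) / n
beta = (sum((l - ml) * (d - md) for l, d in zip(lag, dy))
        / sum((l - ml) ** 2 for l in lag))
print(beta)  # near rho - 1 = -0.1: the gap decays, consistent with convergence
```

    A proper test would compare this statistic to Dickey-Fuller critical values; the sketch only shows the regression at the core of such tests.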

  4. Factors Affecting the Adoption of Mobile Payment Systems: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    İkram Daştan

    2016-02-01

    Full Text Available The world has witnessed rapid growth in e-commerce in recent years. Widespread use of mobile devices in e-commerce has a role in this augmentation. Associated with the growth of trading volume and the introduction of new devices, new products and solutions concerning online payments emerge and diversify. Consumer attitudes and behaviors may change according to these developments. The purpose of this study is to investigate the factors affecting consumers' adoption of mobile payment systems. 225 individuals were surveyed online through a convenience sampling method. A research model was developed and the proposed relationships were tested using structural equation modeling. The empirical findings point out that perceived trust, perceived mobility and attitudes positively affect the adoption of MPS, while perceived usefulness and perceived ease of use have no effect. Furthermore, perceived reputation is positively related to perceived trust and, finally, environmental risk is negatively related to perceived trust.

  5. Empirical Analysis of Stochastic Volatility Model by Hybrid Monte Carlo Algorithm

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2013-01-01

    The stochastic volatility model is one of the volatility models which infer the latent volatility of asset returns. The Bayesian inference of the stochastic volatility (SV) model is performed by the hybrid Monte Carlo (HMC) algorithm, which is superior to other Markov chain Monte Carlo methods in sampling volatility variables. We perform HMC simulations of the SV model for two liquid stock returns traded on the Tokyo Stock Exchange and measure the volatilities of those stock returns. We then calculate the accuracy of the volatility measurement using the realized volatility as a proxy for the true volatility and compare the SV model with the GARCH model, another widely used volatility model. Using the accuracy calculated with the realized volatility, we find that the SV model empirically performs better than the GARCH model.

  6. An empirical analysis on the adoption of alternative fuel vehicles: The case of natural gas vehicles

    International Nuclear Information System (INIS)

    Yeh, Sonia

    2007-01-01

    The adoption of alternative fuel vehicles (AFVs) has been regarded as one of the most important strategies to address the issues of energy dependence, air quality, and, more recently, climate change. Despite decades of effort, we still face daunting challenges to promote wider acceptance of AFVs by the general public. More empirical analyses are needed to understand the technology adoption process associated with different market structures, the effectiveness of regulations and incentives, and the density of infrastructure adequate to reach sustainable commercial application. This paper compares the adoption of natural gas vehicles (NGVs) in eight countries: Argentina, Brazil, China, India, Italy, New Zealand, Pakistan, and the US. It examines the major policies aimed at promoting the use of NGVs, instruments for implementing those policies and targeting likely stakeholders, and a range of factors that influence the adoption of NGVs. The findings in this paper should be applicable to other AFVs.

  7. International Direct Investment and Transboundary Pollution: An Empirical Analysis of Complex Networks

    Directory of Open Access Journals (Sweden)

    Yuping Deng

    2015-04-01

    Full Text Available Using complex networks and spatial econometric methods, we empirically test the extent to which a country’s influence and its position in an international investment network affect environmental quality, as well as the country’s role in transboundary pollution transfer. The estimated results show that the ties connecting nodes together in an international investment network have significant impacts on global environmental pollution. Additionally, node linkages between developing countries have stronger negative effects on environmental quality than node linkages between developed countries. Moreover, greater node importance and node centrality accelerate the speed and scale of the growth of polluting industries, which allows developed countries to more easily transfer their pollution-intensive industries to developing countries that possess higher node dependency. We also find that the factor endowment effect coexists with the pollution haven effect, and that the effects of environmental regulation advantage in the international investment network are greater than the impact of factor endowment advantage.
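    Node centrality, one of the network measures used above, can be illustrated with eigenvector centrality computed by power iteration on a toy adjacency matrix (the four-node network below is invented, not the investment network from the study):

```python
# Eigenvector centrality by power iteration: repeatedly multiply a score
# vector by the adjacency matrix and renormalize; it converges to the
# dominant (Perron) eigenvector, whose entries rank node importance.
import numpy as np

A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)   # toy undirected network

v = np.ones(4)
for _ in range(100):
    v = A @ v
    v = v / np.linalg.norm(v)
print(v)  # node 0, linked to all others, scores highest; leaf node 3 lowest
```

    A node's eigenvector centrality depends not just on its degree but on the centrality of its neighbours, which is why such measures capture "influence" in a network better than raw link counts.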

  8. An empirical analysis of gasoline price convergence for 20 OECD countries

    International Nuclear Information System (INIS)

    Bentzen, J.

    2003-01-01

    Two decades have passed now since the oil price shocks of the 1970s and since then energy prices have - apart from short periods of price instability - evolved relatively smoothly in the industrialized countries. Energy taxes in many countries differ markedly thereby causing differences in final energy prices, but as similar tax levels are becoming more common, e.g. in the European Union, convergence concerning energy prices might be expected to appear. In the present paper national gasoline price data covering the time period since the 1970s for a sample of OECD countries are used in order to test for this often addressed topic of convergence. The empirical part of the paper applies different time series based tests of convergence, where gasoline prices exhibit convergence for most OECD-Europe countries in the case where US$ is used for measurement of the energy prices indicating a convergence or tax harmonization process is taking place for these countries. (au)

  9. A high-resolution, empirical approach to climate impact assessment for regulatory analysis

    Science.gov (United States)

    Delgado, M.; Simcock, J. G.; Greenstone, M.; Hsiang, S. M.; Kopp, R. E.; Carleton, T.; Hultgren, A.; Jina, A.; Rising, J. A.; Nath, I.; Yuan, J.; Rode, A.; Chong, T.; Dobbels, G.; Hussain, A.; Wang, J.; Song, Y.; Mohan, S.; Larsen, K.; Houser, T.

    2017-12-01

    Recent breakthroughs in computing, data availability, and methodology have precipitated significant advances in the understanding of the relationship between climate and socioeconomic outcomes [1]. And while the use of estimates of the global marginal costs of greenhouse gas emissions (e.g. the SCC) is a mandatory component of regulatory policy in many jurisdictions, existing SCC-IAMs have lagged advances in impact assessment and valuation [2]. Recent work shows that incorporating high spatial and temporal resolution can significantly affect the observed relationships of economic outcomes to climate and socioeconomic factors [3] and that maintaining this granularity is critical to understanding the sensitivity of aggregate measures of valuation to inequality and risk adjustment methodologies [4]. We propose a novel framework that decomposes uncertainty in the SCC across multiple sources, including aggregate climate response parameters, the translation of global climate into local weather, the effect of weather on physical and economic systems, human and macro-economic responses, and impact valuation methodologies. This work extends Hsiang et al. (2017) [4] to directly estimate local response functions for multiple sectors in each of 24,378 global regions and to estimate impacts at this resolution daily, incorporating endogenous, empirically-estimated adaptation and costs. The goal of this work is to provide insight into the heterogeneity of climate impacts and to work with other modeling teams to enhance the empirical grounding of integrated climate impact assessment in more complex energy-environment-economics models. [1] T. Carleton and S. Hsiang (2016), DOI: 10.1126/science.aad9837. [2] National Academies of Sciences, Engineering, and Medicine (2017), DOI: 10.17226/24651. [3] Burke, M., S. Hsiang, and E. Miguel (2015), DOI: 10.1038/nature15725. [4] S. Hsiang et al. (2017), DOI: 10.1126/science.aal4369.

  10. An empirical analysis of overlap publication in Chinese language and English research manuscripts.

    Directory of Open Access Journals (Sweden)

    Joseph D Tucker

    Full Text Available BACKGROUND: There are a number of sound justifications for publishing nearly identical information in Chinese and English medical journals, assuming several conditions are met. Although overlap publication is perceived as undesirable and ethically questionable in Europe and North America, it may serve an important function in some regions where English is not the native tongue. There are no empirical data on the nature and degree of overlap publication in English and Chinese language journals. METHODS/PRINCIPAL FINDINGS: A random sample of 100 English manuscripts from Chinese institutions was selected from PubMed. Key words and institutions were searched in the China National Knowledge Infrastructure, a comprehensive Chinese language research database. Unacknowledged overlap was a priori defined according to International Committee of Medical Journal Editors (ICMJE) guidelines following examination by two individuals. 19% (95% CI 11-27) of English manuscripts from Chinese institutions were found to have substantial overlap with Chinese published work based on full text examination. None of the manuscripts met all of the criteria established by the ICMJE for an acknowledged overlap publication. Individual-level, journal-level, and institutional factors seem to influence overlap publication. Manuscripts associated with an institution outside of China and with more than one institution were significantly less likely to have substantial overlap (p<0.05). CONCLUSIONS/SIGNIFICANCE: Overlap publication was common in this context, but instances of standard ICMJE notations to acknowledge this practice were rare. We did not cite the identified overlap manuscripts, in the hope that these empirical data will inform journal policy changes and structural initiatives to promote clearer policies and manuscripts.
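
    The reported interval can be reproduced with a normal-approximation (Wald) confidence interval for a binomial proportion; a minimal sketch, assuming 19 overlapping manuscripts out of the sample of 100 (the function name is ours):

```python
import math

def wald_ci(k, n, z=1.96):
    """Normal-approximation 95% confidence interval for a binomial proportion."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

p, lo, hi = wald_ci(19, 100)
print(f"{p:.0%} (95% CI {lo:.0%}-{hi:.0%})")  # → 19% (95% CI 11%-27%)
```

    An exact (Clopper-Pearson) interval would differ slightly; the Wald form is enough to match the rounded figures quoted.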

  11. PIXE analysis of Cu, Zn, Hg and Cd in mussel samples in the Bay of Algiers

    International Nuclear Information System (INIS)

    Benamar, M.A.; Tchantchane, A.; Benouali, N.; Azbouche, A.; Tobbeche, S.

    1995-01-01

    The purpose of our work is the elaboration of an absolute technique for the determination of trace elements in biological matrices by means of PIXE analysis. We are interested in the determination of heavy metals (Cu, Zn, Cd and Hg) in mussel samples taken from different sites of the Algiers coast. The reason for our choice is the toxicity of these elements and the possible contamination of the marine environment.

  12. Distribution characteristics of volatile methylsiloxanes in Tokyo Bay watershed in Japan: Analysis of surface waters by purge and trap method.

    Science.gov (United States)

    Horii, Yuichi; Minomo, Kotaro; Ohtsuka, Nobutoshi; Motegi, Mamoru; Nojiri, Kiyoshi; Kannan, Kurunthachalam

    2017-05-15

    Surface waters including river water and effluent from sewage treatment plants (STPs) were collected from the Tokyo Bay watershed, Japan, and analyzed for seven cyclic and linear volatile methylsiloxanes (VMSs), i.e., D3, D4, D5, D6, L3, L4, and L5, by an optimized purge and trap extraction method. The total concentrations of the seven VMSs (ΣVMS) in river water ranged from […]; the total amount in the watershed was estimated at 2300 kg. Our results indicate widespread distribution of VMSs in the Tokyo Bay watershed and the influence of domestic wastewater discharges as a source of VMSs in the aquatic environment. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Quasi-experimental Methods in Empirical Regional Science and Policy Analysis – Is there a Scope for Application?

    DEFF Research Database (Denmark)

    Mitze, Timo; Paloyo, Alfredo R.; Alecke, Björn

    Applied econometrics has recently emphasized the identification of causal parameters for policy analysis. This revolution has yet to fully propagate to the field of regional science. We examine the scope for application of the matching approach – part of the modern applied econometrics toolkit...... – in regional science and highlight special features of regional data that make such an application difficult. In particular, our analysis of the effect of regional subsidies on labor-productivity growth in Germany indicates that such policies are effective, but only up to a certain maximum treatment intensity...... to be interpreted with some caution. The matching approach nevertheless can be of great value for regional policy analysis and should be the subject of future research efforts in the field of empirical regional science....

  14. Bayes linear statistics, theory & methods

    CERN Document Server

    Goldstein, Michael

    2007-01-01

    Bayesian methods combine information available from data with any prior information available from expert knowledge. The Bayes linear approach follows this path, offering a quantitative structure for expressing beliefs, and systematic methods for adjusting these beliefs, given observational data. The methodology differs from the full Bayesian methodology in that it establishes simpler approaches to belief specification and analysis based around expectation judgements. Bayes Linear Statistics presents an authoritative account of this approach, explaining the foundations, theory, methodology, and practicalities of this important field. The text provides a thorough coverage of Bayes linear analysis, from the development of the basic language to the collection of algebraic results needed for efficient implementation, with detailed practical examples. The book covers: the importance of partial prior specifications for complex problems where it is difficult to supply a meaningful full prior probability specification...

  15. Comparative analysis of long-term chlorophyll data with generalized additive model - San Francisco Bay and St. Lucie Estuary

    Science.gov (United States)

    The health of estuarine ecosystems is often influenced by hydraulic and nutrient loading from upstream watersheds. We examined four decades of monitoring data of nutrient export into the Indian River Lagoon and San Francisco Bay, both of which have received considerable attentio...

  16. Anthropogenic effects on shoreface and shoreline changes: Input from a multi-method analysis, Agadir Bay, Morocco

    Science.gov (United States)

    Aouiche, Ismail; Daoudi, Lahcen; Anthony, Edward J.; Sedrati, Mouncef; Ziane, Elhassane; Harti, Abderrazak; Dussouillez, Philippe

    2016-02-01

    In many situations, the links between shoreline fluctuations and larger-scale coastal change embracing the shoreface are not always well understood. In particular, meso-scale (years to decades) sand exchanges between the shoreface and the shoreline, considered as important on many wave-dominated coasts, are rather poorly understood and difficult to identify. Coastal systems where sediment transport is perturbed by engineering interventions on the shoreline and shoreface commonly provide fine examples liable to throw light on these links. This is especially so where shoreface bathymetric datasets, which are generally lacking, are collected over time, enabling more or less fine resolution of the meso-scale coastal sediment budget. Agadir Bay and the city of Agadir together form one of the two most important economic development poles on the Atlantic coast of Morocco. Using a combined methodological approach based on wave-current modelling, bathymetric chart-differencing, determination of shoreline fluctuations, and beach topographic surveying, we highlight the close links between variations in the bed of the inner shoreface and the bay shoreline involving both cross-shore and longshore sand transport pathways, sediment budget variations and new sediment cell patterns. We show that the significant changes that have affected the bay shoreline and shoreface since 1978 clearly reflect anthropogenic impacts, notably blocking of alongshore sand transport by Agadir harbour, completed in 1988, and the foundations of which lie well beyond the depth of wave closure. Construction of the harbour has led to the creation of a rapidly accreting beach against an original portion of rocky shoreline updrift and to a net sand loss exceeding 145,000 m3/year between 1978 and 2012 over 8.5 km2 of the bay shoreface downdrift. Shoreline retreat has been further exacerbated by sand extraction from aeolian dunes and by flattening of these dunes to make space for tourist infrastructure.

  17. Potential of qualitative network analysis in migration studies - Reflections based on an empirical analysis of young researchers' mobility aspirations

    OpenAIRE

    Elisabeth Scheibelhofer

    2011-01-01

    Based on the example of an empirical research study, the paper examines the strengths and limitations of a qualitative network approach to migration and mobility. The method of graphic drawings produced by the respondents within an interview setting was applied. With this method, we argue to be able to analyse migrants’ specific social embeddedness and its influence on future mobility aspirations. Likewise, connections between the migratory biography and the individuals’ various social relati...

  18. The Wage Determination Process in Turkey: An Empirical Analysis in Kaleckian Perspective

    Directory of Open Access Journals (Sweden)

    Başak Gül AKTAKAS

    2014-05-01

    Full Text Available Orthodox economists generally hold that, in the labor market, price stability and low unemployment cannot be achieved at the same time. In this sense, the Orthodox argument is that a decline in aggregate demand will decrease money wages and real wages proportionally and increase the volume of employment. Michal Kalecki rejects a wage policy consistently determined by this idea. Wages reflect the price-money wage relation in real terms, and prices are set according to the degree of monopoly. In this context, real wages are assumed to be determined by the degree of monopoly, labor productivity and the price of imported goods. Kalecki likewise rejects the Orthodox view that relates a decrease in real wages to an increase in production, which rests on the assumption of increasing marginal cost. The Kaleckian Post-Keynesian labor market is discussed theoretically, and the determination process of the real wage is empirically analyzed for the Turkish economy over the period 1989:1 to 2012:4.

  19. Is CO2 emission a side effect of financial development? An empirical analysis for China.

    Science.gov (United States)

    Hao, Yu; Zhang, Zong-Yong; Liao, Hua; Wei, Yi-Ming; Wang, Shuo

    2016-10-01

    Based on panel data for 29 Chinese provinces from 1995 to 2012, this paper explores the relationship between financial development and environmental quality in China. A comprehensive framework is utilized to estimate both the direct and indirect effects of financial development on CO2 emissions in China using a carefully designed two-stage regression model. The first-difference and orthogonal-deviation Generalized Method of Moments (GMM) methods are used to control for potential endogeneity and introduce dynamics. To ensure the robustness of the estimations, two indicators of financial development, financial depth and financial efficiency, are used. The empirical results indicate that the direct effects of financial depth and financial efficiency on environmental quality are positive and negative, respectively. The indirect effects of both indicators are U shaped and dominate the shape of the total effects. These findings suggest that the influence of financial development on the environment depends on the level of economic development. At the early stage of economic growth, financial development is environmentally friendly. When the economy is highly developed, a higher level of financial development is harmful to the environmental quality.

  20. An Empirical Analysis of the Performance of Preconditioners for SPD Systems

    KAUST Repository

    George, Thomas

    2012-08-01

    Preconditioned iterative solvers have the potential to solve very large sparse linear systems with a fraction of the memory used by direct methods. However, the effectiveness and performance of most preconditioners is not only problem dependent, but also fairly sensitive to the choice of their tunable parameters. As a result, a typical practitioner is faced with an overwhelming number of choices of solvers, preconditioners, and their parameters. The diversity of preconditioners makes it difficult to analyze them in a unified theoretical model. A systematic empirical evaluation of existing preconditioned iterative solvers can help in identifying the relative advantages of various implementations. We present the results of a comprehensive experimental study of the most popular preconditioner and iterative solver combinations for symmetric positive-definite systems. We introduce a methodology for a rigorous comparative evaluation of various preconditioners, including the use of some simple but powerful metrics. The detailed comparison of various preconditioner implementations and a state-of-the-art direct solver gives interesting insights into their relative strengths and weaknesses. We believe that these results would be useful to researchers developing preconditioners and iterative solvers as well as practitioners looking for appropriate sparse solvers for their applications. © 2012 ACM.
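
    As an illustration of the kind of solver/preconditioner combination the study benchmarks, here is a minimal sketch, not the paper's evaluation harness, of conjugate gradients with a Jacobi (diagonal) preconditioner on a small SPD system (the 1-D Poisson matrix; since this matrix has a constant diagonal, the Jacobi step here is only a rescaling and illustrates the mechanics rather than a speedup):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, LinearOperator

# SPD test matrix: the tridiagonal 1-D Poisson operator.
n = 200
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Jacobi preconditioner: M approximates A^{-1} by 1 / diag(A).
inv_diag = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda v: inv_diag * v)

x, info = cg(A, b, M=M)  # info == 0 signals convergence
residual = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
print(info, residual)
```

    Swapping M for an incomplete-Cholesky or SSOR operator, and tuning its parameters, is exactly the choice space whose problem-dependence the study quantifies.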

  1. An Empirical Study on User-oriented Association Analysis of Library Classification Schemes

    Directory of Open Access Journals (Sweden)

    Hsiao-Tieh Pu

    2002-12-01

    Full Text Available Library classification schemes are mostly organized by discipline with a hierarchical structure. From the user point of view, some highly related yet non-hierarchical classes may not be easy to perceive in these schemes. This paper aims to discover hidden associations between classes by analyzing users’ usage of library collections. The proposed approach employs collaborative filtering techniques to discover associated classes based on the circulation patterns of similar users. Many associated classes scattered across different subject hierarchies could be discovered from the circulation patterns of similar users. The obtained association norms between classes were found to be useful in understanding users' subject preferences for a given class. Classification schemes can, therefore, be made more adaptable to changes in users and in the uses of different library collections. There are implications for applications in information organization and retrieval as well. For example, catalogers could refer to the ranked associated classes when they perform multi-classification, and users could also browse the associated classes for related subjects in an enhanced OPAC system. In future research, more empirical studies will be needed to validate the findings, and methods for obtaining user-oriented associations can still be improved. [Article content in Chinese]
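
    The collaborative-filtering idea, associating classification classes that similar users borrow from, can be sketched with a plain item-item cosine similarity; the circulation matrix and class labels below are entirely hypothetical:

```python
import numpy as np

# Hypothetical user-by-class circulation matrix: rows = borrowers,
# columns = classification classes, entries = number of loans.
circulation = np.array([
    [5, 3, 0, 1],
    [4, 2, 0, 0],
    [0, 0, 6, 4],
    [1, 0, 5, 3],
], dtype=float)

# Item-item cosine similarity between classes, based on shared borrowers.
norms = np.linalg.norm(circulation, axis=0)
sim = (circulation.T @ circulation) / np.outer(norms, norms)

# For each class, the most strongly associated *other* class.
np.fill_diagonal(sim, 0.0)
most_associated = sim.argmax(axis=1)
print(most_associated)  # → [1 0 3 2]
```

    Classes 0/1 and 2/3 pair up because the same (simulated) borrowers circulate them together, even though nothing in a hierarchical scheme would link them.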

  2. The purchasing power parity in emerging Europe: Empirical results based on two-break analysis

    Directory of Open Access Journals (Sweden)

    Mladenović Zorica

    2013-01-01

    Full Text Available The purpose of the paper is to evaluate the validity of purchasing power parity (PPP) for eight countries from Emerging Europe: Hungary, Czech Republic, Poland, Romania, Lithuania, Latvia, Serbia and Turkey. Monthly data for euro- and U.S. dollar-based real exchange rate time series are considered, covering the period January 2000 - August 2011. Given the significant changes in these economies over this sample, it seems plausible to assume that the real exchange rate time series are characterized by more than one structural break. In order to endogenously determine the number and type of breaks while testing for the presence of unit roots, we applied the Lee-Strazicich approach. For two euro-based real exchange rate time series (in Hungary and Turkey) the unit root hypothesis has been rejected. For the U.S. dollar-based real exchange rate time series in Poland, Romania and Turkey the presence of a unit root has been rejected. To assess the adjustment dynamics of those real exchange rates that were detected to be stationary with two breaks, the impulse response function is calculated and the half-life is estimated. Our overall conclusion is that the persistence of real exchange rates in Emerging Europe is still substantially high. The lack of strong empirical support for PPP suggests that careful policy actions are needed in this region to prevent serious exchange rate misalignment.
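
    The half-life, the time for a deviation from PPP to decay by half, follows from the AR(1) persistence parameter as ln(0.5)/ln(ρ); a sketch on a simulated real-exchange-rate deviation (ρ = 0.9 is an assumed value, not an estimate from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary AR(1) "real exchange rate" deviation with rho = 0.9.
rho, n = 0.9, 5000
q = np.zeros(n)
for t in range(1, n):
    q[t] = rho * q[t - 1] + rng.standard_normal()

# OLS estimate of rho from regressing q_t on q_{t-1}, then the half-life.
rho_hat = (q[:-1] @ q[1:]) / (q[:-1] @ q[:-1])
half_life = np.log(0.5) / np.log(rho_hat)
print(round(rho_hat, 3), round(half_life, 1))
```

    For ρ = 0.9 the true half-life is ln(0.5)/ln(0.9) ≈ 6.6 periods; "substantially high persistence" in the abstract corresponds to ρ close to one, which makes the half-life explode.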

  3. The effects of traited and situational impression management on a personality test: an empirical analysis

    Directory of Open Access Journals (Sweden)

    MICHAEL S. HENRY

    2006-09-01

    Full Text Available Studies examining impression management (IM) in self-report measures typically assume that impression management is either a (1) trait or (2) situational variable, which has led to often conflicting results (Stark, Chernyshenko, Chan, Lee and Drasgow, 2001). This study examined the item-level and scale-level responses on six empirically derived facets of conscientiousness from the California Psychological Inventory (CPI) between high and low IM groups. Subjects (N = 6,220) were participants in a management assessment conducted by an external consulting firm. Subjects participated in the assessment as part of either (1) a selection or promotional process, or (2) a feedback and development process, and two specific occupational groups (sales/marketing and accounting/finance) were examined. Using the IRT-based DFIT framework (Raju, Van der Linden & Fleer, 1995), the item-level and scale-level differences were examined for the situational IM and traited IM approaches. The results indicated that relatively little DIF/DTF was present and that the differences between the two approaches to examining IM may not be as great as previously suggested.

  4. Banking Competition and Efficiency: Empirical Analysis on the Bosnia and Herzegovina Using Panzar-Rosse Model

    Directory of Open Access Journals (Sweden)

    Memić Deni

    2015-03-01

    Full Text Available Background: Competition in the banking industry has been an important topic in the scientific literature, as researchers have tried to assess the level of competition in the banking sector. Objectives: This paper aims to investigate the market structure and long-term equilibrium of the banking market in Bosnia and Herzegovina, nationwide as well as in its constituent entities, and to evaluate the monopoly power of banks during the years 2008-2012. Methods/Approach: The paper examines the market structure using the most frequently applied measures of concentration, the k-bank concentration ratio (CRk) and the Herfindahl-Hirschman Index (HHI), and evaluates the monopoly power of banks by employing the Panzar-Rosse “H-statistic”. Results: The empirical results using CRk and HHI show that the Bosnia and Herzegovina banking market is moderately concentrated, with a decreasing concentration trend. The Panzar-Rosse “H-statistic” suggests that banks in Bosnia and Herzegovina operate under monopoly or monopolistic competition depending on the market segment. Conclusions: Banks operating on the banking market in Bosnia and Herzegovina seem to be earning their total and interest revenues under monopoly or perfectly collusive oligopoly.
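
    The Panzar-Rosse H-statistic is the sum of the elasticities of (log) revenue with respect to (log) input prices in a reduced-form revenue regression; a sketch on simulated bank data with assumed elasticities summing to 0.45 (0 < H < 1 would indicate monopolistic competition, H ≤ 0 monopoly, H = 1 long-run perfect competition):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500  # hypothetical bank-year observations

# Log input prices: funding, labor, physical capital (simulated).
log_w = rng.normal(size=(n, 3))
true_elast = np.array([0.20, 0.15, 0.10])  # assumed elasticities, H = 0.45

# Log revenue generated under those elasticities plus noise.
log_rev = 1.0 + log_w @ true_elast + 0.05 * rng.standard_normal(n)

# OLS of log revenue on log input prices (with intercept).
X = np.column_stack([np.ones(n), log_w])
beta, *_ = np.linalg.lstsq(X, log_rev, rcond=None)

# H-statistic: sum of the input-price elasticities.
H = beta[1:].sum()
print(round(H, 2))
```

    Empirical applications add bank-level controls and test H against 0 and 1; this sketch only shows how H is assembled from the regression coefficients.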

  5. Situation analysis of R & D activities: an empirical study in Iranian pharmaceutical companies.

    Science.gov (United States)

    Rasekh, Hamid Reza; Mehralian, Gholamhossein; Vatankhah-Mohammadabadi, Abbas Ali

    2012-01-01

    As global competition intensifies, research and development (R & D) organizations need to enhance their strategic management in order to become goal-directed communities for innovation and to allocate their resources consistent with their overall R & D strategy. The world pharmaceutical market has undergone fast, unprecedented, tremendous and complex changes in the last several years. The pharmaceutical industry is today still one of the most inventive, innovative and lucrative of the so-called "high-tech" industries. This industry serves a dual role in modern society. On one hand, it is a growing industry, and its output makes a direct contribution to gross domestic product (GDP). On the other hand, drugs, this industry's major output, are an input in the production of good health. The purpose of this study is to evaluate the R & D activities of pharmaceutical companies and to highlight critical factors that influence the results of these activities. To run this study, a valid questionnaire based on a literature review and experts' opinions was designed and delivered to 11 pharmaceutical companies. The empirical data show that the situation is not acceptable with respect to the factors that should be taken into account by managers, including management commitment, human resource management, information technology and financial management. Furthermore, we obtained some interesting results related to different aspects of R & D management. In conclusion, managers must be aware of their performance in R & D activities so that they are able to adopt a comprehensive policy at both the national and company level.

  6. Analysis of respiratory mechanomyographic signals by means of the empirical mode decomposition

    International Nuclear Information System (INIS)

    Torres, A; Jane, R; Fiz, J A; Laciar, E; Galdiz, J B; Gea, J; Morera, J

    2007-01-01

    The study of mechanomyographic (MMG) signals of respiratory muscles is a promising technique for evaluating respiratory muscle effort. A critical point in MMG studies is the selection of the cut-off frequency used to separate the low-frequency (LF) component (basically due to gross movement of the muscle or of the body) from the high-frequency (HF) component (related to the vibration of the muscle fibres during contraction). In this study, we propose to use the Empirical Mode Decomposition method to analyze the Intrinsic Mode Functions of MMG signals of the diaphragm muscle, acquired by means of a capacitive accelerometer applied to the costal wall. The method was tested on an animal model, with two incremental respiratory protocols performed by two non-anesthetized mongrel dogs. The proposed EMD-based method seems to be a useful tool for eliminating the low-frequency component of MMG signals. The correlation coefficients obtained between respiratory and MMG parameters were higher than those obtained with a wavelet multiresolution decomposition method utilized in a previous work.
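
    The core EMD operation, sifting out an intrinsic mode function by repeatedly subtracting the mean of the upper and lower spline envelopes, can be sketched as follows; the test signal (a 5 Hz oscillation plus a slow drift, standing in for the HF and LF components) and the fixed iteration count are illustrative simplifications of a full EMD implementation:

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift(x, n_iter=10):
    """Minimal EMD sifting loop: subtract the mean of the upper and lower
    cubic-spline envelopes until an IMF-like component remains."""
    t = np.arange(len(x))
    h = x.copy()
    for _ in range(n_iter):
        mx = argrelextrema(h, np.greater)[0]
        mn = argrelextrema(h, np.less)[0]
        if len(mx) < 2 or len(mn) < 2:
            break
        # Pin the endpoints so the splines do not extrapolate wildly.
        upper = CubicSpline(np.r_[0, mx, len(h) - 1], np.r_[h[0], h[mx], h[-1]])(t)
        lower = CubicSpline(np.r_[0, mn, len(h) - 1], np.r_[h[0], h[mn], h[-1]])(t)
        h = h - (upper + lower) / 2.0
    return h

# Fast oscillation (the "HF" vibration analogue) plus a slow trend
# (the "LF" movement-artifact analogue); parameters are illustrative only.
t = np.linspace(0, 2, 1000)
fast = np.sin(2 * np.pi * 5 * t)
signal = fast + 0.5 * t

imf = sift(signal)
core = slice(100, -100)  # ignore spline edge effects
r = np.corrcoef(imf[core], fast[core])[0, 1]
print(round(r, 2))
```

    The first extracted component tracks the fast oscillation while the slow drift is pushed into the residue, which is precisely how EMD removes the LF component without choosing a fixed cut-off frequency.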

  7. Improved Prediction of Preterm Delivery Using Empirical Mode Decomposition Analysis of Uterine Electromyography Signals.

    Directory of Open Access Journals (Sweden)

    Peng Ren

    Full Text Available Preterm delivery increases the risk of infant mortality and morbidity, and therefore developing reliable methods for predicting its likelihood is of great importance. Previous work using uterine electromyography (EMG) recordings has shown that they may provide a promising and objective way of predicting risk of preterm delivery. However, to date, attempts at utilizing computational approaches to achieve sufficient predictive confidence, in terms of area under the curve (AUC) values, have not achieved the high discrimination accuracy that a clinical application requires. In our study, we propose a new analytical approach for assessing the risk of preterm delivery using EMG recordings which first employs Empirical Mode Decomposition (EMD) to obtain their Intrinsic Mode Functions (IMFs). Next, the entropy values of both the instantaneous amplitude and the instantaneous frequency of the first ten IMF components are computed in order to derive ratios of these two distinct components as features. Discrimination accuracy of this approach compared to those proposed previously was then calculated using six representative classifiers. Finally, three different electrode positions were analyzed for their prediction accuracy of preterm delivery in order to establish which uterine EMG recording location provided the optimal signal data. Overall, our results show a clear improvement in prediction accuracy of preterm delivery risk compared with previous approaches, achieving an impressive maximum AUC value of 0.986 when using signals from an electrode positioned below the navel. In sum, this provides a promising new method for analyzing uterine EMG signals to permit accurate clinical assessment of preterm delivery risk.
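
    The feature construction described, entropies of the instantaneous amplitude and instantaneous frequency of an IMF obtained via the Hilbert transform, combined as a ratio, can be sketched on a synthetic component; the test signal and the histogram entropy estimator are our illustrative choices, not the paper's:

```python
import numpy as np
from scipy.signal import hilbert

def hist_entropy(x, bins=32):
    """Shannon entropy of a signal's histogram (an illustrative estimator)."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

fs = 500.0
t = np.arange(0, 4, 1 / fs)
# Amplitude-modulated 10 Hz tone standing in for one IMF component.
imf = (1 + 0.3 * np.sin(2 * np.pi * 0.5 * t)) * np.sin(2 * np.pi * 10 * t)

analytic = hilbert(imf)
inst_amp = np.abs(analytic)
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)

# Ratio-of-entropies feature, as in the study's design.
feature = hist_entropy(inst_amp) / hist_entropy(inst_freq)
print(round(inst_freq[200:-200].mean(), 1), round(feature, 2))
```

    In the actual pipeline one such ratio is computed per IMF (first ten components) and the resulting feature vector is passed to the classifiers.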

  8. An empirical analysis of the green paradox in China: From the perspective of fiscal decentralization

    International Nuclear Information System (INIS)

    Zhang, Kun; Zhang, Zong-Yong; Liang, Qiao-Mei

    2017-01-01

    While it is generally recognized that the introduction of environmental policy can effectively control carbon emissions, the green paradox hypothesis puts forth a new warning about the validity of this policy's implementation. This study uses panel data on 29 Chinese provinces from 1995 to 2012 to investigate the impact of fiscal decentralization on the functional mechanisms of environmental policy while controlling for the spatial correlations of carbon emission. The empirical results indicate that environmental policy alone can achieve the objective of reducing carbon emissions. However, the Chinese style fiscal decentralization makes the environmental policy significantly promote carbon emissions, leading to a green paradox. Moreover, we find that the impact of fiscal decentralization on environmental policy varies greatly among different geographical regions and the direct-controlled municipalities. In addition, our study confirms the spatial correlations in China's carbon emissions by using a spatial integration term. Finally, we recommend that emission reduction efforts should be incorporated into the local government's performance evaluation system to improve the institutional environment. Further, differentiated environmental policies and measures should be considered for different provinces to maximize the emission reduction potential. - Highlights: • We consider the spatial correlations of carbon emissions in neighboring provinces. • The impacts of environmental regulation on carbon emissions are examined. • Fiscal decentralization is not beneficial to environmental policy implementation. • The effects of fiscal decentralization vary greatly among different regions.

  9. Source analysis using regional empirical Green's functions: The 2008 Wells, Nevada, earthquake

    Science.gov (United States)

    Mendoza, C.; Hartzell, S.

    2009-01-01

    We invert three-component, regional broadband waveforms recorded for the 21 February 2008 Wells, Nevada, earthquake using a finite-fault methodology that prescribes subfault responses using eight MW∼4 aftershocks as empirical Green's functions (EGFs) distributed within a 20-km by 21.6-km fault area. The inversion identifies a seismic moment of 6.2 x 10^24 dyne-cm (5.8 MW) with slip concentrated in a compact 6.5-km by 4-km region updip from the hypocenter. The peak slip within this localized area is 88 cm and the stress drop is 72 bars, which is higher than expected for Basin and Range normal faults in the western United States. The EGF approach yields excellent fits to the complex regional waveforms, accounting for strong variations in wave propagation and site effects. This suggests that the procedure is useful for studying moderate-size earthquakes with limited teleseismic or strong-motion data and for examining uncertainties in slip models obtained using theoretical Green's functions.
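
    The heart of an EGF-based finite-fault inversion, representing the mainshock record as a nonnegatively weighted sum of small-event waveforms, reduces to a nonnegative least-squares problem; a toy sketch with synthetic Gaussian wavelets standing in for the aftershock Green's functions (all geometry and values are illustrative):

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)

# Synthetic "empirical Green's functions": one short wavelet per subfault,
# time-shifted to mimic rupture propagation across the fault.
n_t, n_sub = 400, 8
t = np.arange(n_t)
G = np.zeros((n_t, n_sub))
for j in range(n_sub):
    delay = 30 * j + 20
    G[:, j] = np.exp(-0.5 * ((t - delay) / 6.0) ** 2)  # Gaussian wavelet

# True subfault moments (slip), concentrated on a compact patch.
true_w = np.array([0.0, 0.2, 1.0, 0.8, 0.1, 0.0, 0.0, 0.0])
data = G @ true_w + 0.001 * rng.standard_normal(n_t)

# Nonnegative least squares recovers the subfault weights from the record.
w_hat, _ = nnls(G, data)
print(np.round(w_hat, 2))
```

    Real inversions stack many stations and components, impose smoothing, and convert the recovered weights into moment and slip; the compact-patch recovery above is the same positivity-constrained linear problem in miniature.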

  10. PERFORMANCE IN CROSS-BORDER MERGERS AND ACQUISITIONS: AN EMPIRICAL ANALYSIS OF THE BRAZILIAN CASE

    Directory of Open Access Journals (Sweden)

    Adriana Bruscato Bortoluzzo

    2014-10-01

    Full Text Available The purpose of this article is to investigate whether the cross-border acquisitions made by Brazilian companies over the past 15 years have improved their financial performance. Drawing on institutional, sociocultural, and organizational learning theories, this study develops and empirically tests several hypotheses on the determinants of M&A performance. The results demonstrate that the cross-border acquisition moves by Brazilian companies actually improve their financial performance. Financial performance tends to be positive when the cultural distance between the countries of the acquiring and acquired companies is low to medium and when the institutional context of the acquired company is a developed one. We also found an inverted-U-shaped relationship between acquiring companies’ previous international M&A experience and the performance of a new cross-border operation. These findings suggest that research on international M&As should include acquirers’ M&A experience as well as the institutional characteristics of their target countries.

  11. Organizational learning in commercial nuclear power plant safety: An empirical analysis

    International Nuclear Information System (INIS)

    Marcus, A.A.; Bromiley, P.; Nichols, M.L.

    1989-01-01

    The need for knowledge in organizations that manage and run high risk technologies is very high. The acquisition of useful knowledge is referred to as organizational learning. The theoretical roots of this concept are well established in the academic literature and in practice, especially in manufacturing industries. This paper focuses on organizational problem solving and learning as it relates to the safe and efficient management of commercial nuclear power plants. The authors are co-investigators on a larger team working under contract with the Nuclear Regulatory Commission to develop a logical framework that enables systematic examination of potential linkages between management and organizational factors and safety in nuclear power plant performance. Management and organizational factors that facilitate or impede organizational learning are only a part of the larger study, but are the major focus of this paper. In this paper, the theoretical roots of the concept of organizational learning are discussed, relationships to measures of safety and efficiency of commercial nuclear power plants are hypothesized, and empirical findings which provide partial tests of the hypotheses are discussed. This line of research appears promising; implications for further research, regulatory application, and nuclear power plant management are described

  12. Revisiting the causal nexus between savings and economic growth in India: An empirical analysis

    Directory of Open Access Journals (Sweden)

    Suresh Kumar Patra

    2017-09-01

Full Text Available This paper attempts to analyze the long-run association between savings and growth and investigates the causality issue in the Indian context for the period 1950–51 to 2011–12. Firstly, the study identifies the structural break in the year 1980 by employing the Bai-Perron test with an unknown break date. Further, it examines the association and the direction of causality between savings and real economic activity. The empirical evidence of the study suggests that savings boost real activity in both the pre- and post-break periods in the long run, while economic growth causes saving in the short run in the pre-break period. Thus, the present study brings evidence in favour of the neoclassical exogenous and the post-neoclassical endogenous growth models and suggests that both the incentive-based measures and the productivity-based measures would be useful to generate higher savings and reinforce the acceleration of income and growth. JEL classification: E21, O4, C22, Keywords: Savings, Economic growth, Structural break, Palavras-chave: Poupança, Crescimento econômico, Ruptura estrutural

  13. A 16-year examination of domestic violence among Asians and Asian Americans in the empirical knowledge base: a content analysis.

    Science.gov (United States)

    Yick, Alice G; Oomen-Early, Jody

    2008-08-01

Until recently, research studies have implied that domestic violence does not affect Asian American and immigrant communities, or even Asians abroad, because ethnicity or culture has not been addressed. In this content analysis, the authors examined trends in publications in leading scholarly journals on violence relating to Asian women and domestic violence. A coding schema was developed, with two raters coding the data with high interrater reliability. Sixty articles were published over the 16 years studied, most of them atheoretical and focused on individual levels of analysis. The terms used in discussing domestic violence reflected a feminist perspective. Three quarters of the studies were empirical, with most guided by logical positivism using quantitative designs. Most targeted specific Asian subgroups (almost a third focused on Asian Indians) rather than categorizing Asians as a general ethnic category. The concept of "Asian culture" was most often assessed by discussing Asian family structure. Future research is discussed in light of the findings.

  14. A differentiating empirical linguistic analysis of dreamer activity in reports of EEG-controlled REM-dreams and hypnagogic hallucinations.

    Science.gov (United States)

    Speth, Jana; Frenzel, Clemens; Voss, Ursula

    2013-09-01

We present Activity Analysis as a new method for the quantification of subjective reports of altered states of consciousness with regard to the indicated level of simulated motor activity. Empirical linguistic activity analysis was conducted with dream reports conceived immediately after EEG-controlled periods of hypnagogic hallucinations and REM-sleep in the sleep laboratory. Reports of REM-dreams exhibited a significantly higher level of simulated physical dreamer activity, while hypnagogic hallucinations appear to be experienced mostly from the point of view of a passive observer. This study lays the groundwork for clinical research on the level of simulated activity in pathologically altered states of subjective experience, for example in the REM-dreams of clinically depressed patients, or in intrusions and dreams of patients diagnosed with PTSD. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...

  16. Urban Greening Bay Area

    Science.gov (United States)

    Information about the San Francisco Bay Water Quality Project (SFBWQP) Urban Greening Bay Area, a large-scale effort to re-envision urban landscapes to include green infrastructure (GI) making communities more livable and reducing stormwater runoff.

  17. Empirical comparison of four baseline covariate adjustment methods in analysis of continuous outcomes in randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhang S

    2014-07-01

Full Text Available Shiyuan Zhang,1 James Paul,2 Manyat Nantha-Aree,2 Norman Buckley,2 Uswa Shahzad,2 Ji Cheng,2 Justin DeBeer,5 Mitchell Winemaker,5 David Wismer,5 Dinshaw Punthakee,5 Victoria Avram,5 Lehana Thabane1–4; 1Department of Clinical Epidemiology and Biostatistics, 2Department of Anesthesia, McMaster University, Hamilton, ON, Canada; 3Biostatistics Unit/Centre for Evaluation of Medicines, St Joseph's Healthcare - Hamilton, Hamilton, ON, Canada; 4Population Health Research Institute, Hamilton Health Science/McMaster University; 5Department of Surgery, Division of Orthopaedics, McMaster University, Hamilton, ON, Canada. Background: Although seemingly straightforward, the statistical comparison of a continuous variable in a randomized controlled trial that has both a pre- and posttreatment score presents an interesting challenge for trialists. We present here empirical application of four statistical methods (posttreatment scores with analysis of variance, analysis of covariance, change in scores, and percent change in scores), using data from a randomized controlled trial of postoperative pain in patients following total joint arthroplasty (the Morphine COnsumption in Joint Replacement Patients, With and Without GaBapentin Treatment, a RandomIzed ControlLEd Study [MOBILE] trial). Methods: Analysis of covariance (ANCOVA) was used to adjust for baseline measures and to provide an unbiased estimate of the mean group difference of the 1-year postoperative knee flexion scores in knee arthroplasty patients. Robustness tests were done by comparing ANCOVA with three comparative methods: the posttreatment scores, change in scores, and percentage change from baseline. Results: All four methods showed similar direction of effect; however, ANCOVA (-3.9; 95% confidence interval [CI]: -9.5, 1.6; P=0.15) and the posttreatment score (-4.3; 95% CI: -9.8, 1.2; P=0.12) method provided the highest precision of estimate compared with the change score (-3.0; 95% CI: -9.9, 3.8; P=0
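The adjustment methods compared in this trial can be illustrated on simulated pre/post data. The sketch below uses entirely invented numbers (the MOBILE data are not public) and omits percent change, which behaves like a rescaled change score; the point is that the estimators all target the same treatment effect, while ANCOVA additionally corrects for chance baseline imbalance via the pooled within-arm slope:

```python
import random

random.seed(7)

def mean(xs):
    return sum(xs) / len(xs)

def within_slope(groups):
    """Pooled within-group OLS slope of post on pre."""
    num = den = 0.0
    for x, y in groups:
        mx, my = mean(x), mean(y)
        num += sum((a - mx) * (b - my) for a, b in zip(x, y))
        den += sum((a - mx) ** 2 for a in x)
    return num / den

# simulate a two-arm trial with a true treatment effect of 5.0 points
# (all numbers invented; this is not the MOBILE trial data)
n = 2000
pre_t = [random.gauss(50, 10) for _ in range(n)]
pre_c = [random.gauss(50, 10) for _ in range(n)]
post_t = [0.6 * p + 5.0 + random.gauss(0, 5) for p in pre_t]
post_c = [0.6 * p + random.gauss(0, 5) for p in pre_c]

# 1) posttreatment scores: difference in post-score means
post_diff = mean(post_t) - mean(post_c)

# 2) change scores: difference in mean (post - pre)
change_diff = (mean([b - a for a, b in zip(pre_t, post_t)])
               - mean([b - a for a, b in zip(pre_c, post_c)]))

# 3) ANCOVA: adjust the post difference for chance baseline imbalance
b_pool = within_slope([(pre_t, post_t), (pre_c, post_c)])
ancova_diff = post_diff - b_pool * (mean(pre_t) - mean(pre_c))

for name, est in [("posttreatment", post_diff),
                  ("change score", change_diff),
                  ("ANCOVA", ancova_diff)]:
    print(f"{name:>13}: {est:.2f}")
```

All three estimators are unbiased under randomization; they differ in variance, which is why the abstract reports the tightest confidence intervals for ANCOVA and the posttreatment comparison.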

  18. Is bigger better? An empirical analysis of waste management in New South Wales

    International Nuclear Information System (INIS)

    Carvalho, Pedro; Marques, Rui Cunha; Dollery, Brian

    2015-01-01

Highlights: • We search for the most efficient cost structure for NSW household waste services. • We found that larger services are no longer efficient. • We found an optimal size in the range 12,000–20,000 inhabitants. • We found significant economies of output density for household waste collection. • We found economies of scope in joint provision of unsorted and recycling services. - Abstract: Across the world, rising demand for municipal solid waste services has seen an ongoing increase in the costs of providing these services. Moreover, municipal waste services have typically been provided through natural or legal monopolies, where few incentives exist to reduce costs. It is thus vital to examine empirically the cost structure of these services in order to develop effective public policies which can make these services more cost efficient. Accordingly, this paper considers economies of size and economies of output density in the municipal waste collection sector in the New South Wales (NSW) local government system in an effort to identify the optimal size of utilities from the perspective of cost efficiency. Our results show that – as presently constituted – NSW municipal waste services are not efficient in terms of costs, thereby demonstrating that ‘bigger is not better.’ The optimal size of waste utilities is estimated to fall in the range 12,000–20,000 inhabitants. However, significant economies of output density for unsorted (residual) municipal waste collection and recycling waste collection were found, which means it is advantageous to increase the amount of waste collected while holding the number of customers and the intervention area constant.

  19. Global Analysis of Empirical Relationships Between Annual Climate and Seasonality of NDVI

    Science.gov (United States)

    Potter, C. S.

    1997-01-01

This study describes the use of satellite data to calibrate a new climate-vegetation greenness function for global change studies. We examined statistical relationships between annual climate indexes (temperature, precipitation, and surface radiation) and seasonal attributes of the AVHRR Normalized Difference Vegetation Index (NDVI) time series for the mid-1980s in order to refine our empirical understanding of intraannual patterns and global abiotic controls on natural vegetation dynamics. Multiple linear regression results using global 1° gridded data sets suggest that three climate indexes (growing degree days, annual precipitation total, and an annual moisture index) together can account for 70-80 percent of the variation in the NDVI seasonal extremes (maximum and minimum values) for the calibration year 1984. Inclusion of the same climate index values from the previous year explained no significant additional portion of the global scale variation in NDVI seasonal extremes. The monthly timing of NDVI extremes was closely associated with seasonal patterns in maximum and minimum temperature and rainfall, with lag times of 1 to 2 months. We separated well-drained areas from 1° grid cells mapped as greater than 25 percent inundated coverage for estimation of both the magnitude and timing of seasonal NDVI maximum values. Predicted monthly NDVI, derived from our climate-based regression equations and Fourier smoothing algorithms, shows good agreement with observed NDVI at a series of ecosystem test locations from around the globe. Regions in which NDVI seasonal extremes were not accurately predicted are mainly high latitude ecosystems and other remote locations where climate station data are sparse.
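The kind of multiple linear regression described here can be sketched in a few lines. The data below are synthetic stand-ins for the three climate indexes, with coefficients and noise tuned so the fit explains roughly three quarters of the variance, as in the study; none of the numbers come from the actual NDVI data set:

```python
import random

random.seed(3)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# synthetic grid cells: three standardized climate indexes plus noise
rows, y = [], []
for _ in range(500):
    gdd, prec, moist = (random.gauss(0, 1) for _ in range(3))
    ndvi_max = 0.5 * gdd + 0.3 * prec + 0.2 * moist + random.gauss(0, 0.36)
    rows.append([1.0, gdd, prec, moist])
    y.append(ndvi_max)

# ordinary least squares via the normal equations X'X beta = X'y
k = 4
XtX = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
beta = solve(XtX, Xty)

# coefficient of determination, the "70-80 percent" figure in the abstract
ybar = sum(y) / len(y)
sse = sum((yi - sum(b * xi for b, xi in zip(beta, r))) ** 2
          for r, yi in zip(rows, y))
sst = sum((yi - ybar) ** 2 for yi in y)
r2 = 1.0 - sse / sst
print(f"R^2 = {r2:.2f}")
```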

  20. Empirical Analysis and Modeling of Stop-Line Crossing Time and Speed at Signalized Intersections

    Directory of Open Access Journals (Sweden)

    Keshuang Tang

    2016-12-01

Full Text Available In China, a flashing green (FG) indication of 3 s followed by a yellow (Y) indication of 3 s is commonly applied to end the green phase at signalized intersections. Stop-line crossing behavior of drivers during such a phase transition period significantly influences safety performance of signalized intersections. The objective of this study is thus to empirically analyze and model drivers' stop-line crossing time and speed in response to the specific phase transition period of FG and Y. High-resolution trajectories for 1465 vehicles were collected at three rural high-speed intersections with a speed limit of 80 km/h and two urban intersections with a speed limit of 50 km/h in Shanghai. With the vehicle trajectory data, statistical analyses were performed to look into the general characteristics of stop-line crossing time and speed at the two types of intersections. A multinomial logit model and a multiple linear regression model were then developed to predict the stop-line crossing patterns and speeds, respectively. It was found that the percentage of stop-line crossings during the Y interval is remarkably higher and the stop-line crossing time is approximately 0.7 s longer at the urban intersections, as compared with the rural intersections. In addition, approaching speed and distance to the stop-line at the onset of FG as well as area type significantly affect the percentages of stop-line crossings during the FG and Y intervals. Vehicle type and stop-line crossing pattern were found to significantly influence the stop-line crossing speed, in addition to the above factors. Red-light running seems to occur more frequently at large intersections with a long cycle length.
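A multinomial logit of the kind fitted in this study can be sketched as softmax regression trained by gradient descent. The three crossing patterns and the two predictors (speed and distance to the stop line at the onset of FG) follow the abstract, but the vehicles, thresholds, and coefficients below are simulated, since the Shanghai trajectory data are not public:

```python
import math
import random

random.seed(5)

PATTERNS = ["cross during FG", "cross during Y", "stop"]

def features(speed_kmh, dist_m):
    # intercept plus centered/scaled predictors at the onset of FG
    return [1.0, (speed_kmh - 60.0) / 15.0, (dist_m - 60.0) / 25.0]

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def scores(W, x):
    return [sum(w * v for w, v in zip(wc, x)) for wc in W]

# synthetic vehicles: fast and close tends to cross during FG,
# slow and far tends to stop (thresholds are invented)
data = []
for _ in range(400):
    speed = random.gauss(60, 15)   # km/h
    dist = random.gauss(60, 25)    # metres to the stop line
    s = 0.08 * speed - 0.06 * dist
    label = 0 if s > 1.0 else (1 if s > -1.0 else 2)
    data.append((speed, dist, label))

# multinomial logit: one weight vector per pattern, batch gradient descent
W = [[0.0] * 3 for _ in PATTERNS]
for _ in range(300):
    grad = [[0.0] * 3 for _ in PATTERNS]
    for speed, dist, label in data:
        x = features(speed, dist)
        p = softmax(scores(W, x))
        for c in range(3):
            err = p[c] - (1.0 if c == label else 0.0)
            for j in range(3):
                grad[c][j] += err * x[j]
    for c in range(3):
        for j in range(3):
            W[c][j] -= grad[c][j] / len(data)

def classify(speed_kmh, dist_m):
    p = softmax(scores(W, features(speed_kmh, dist_m)))
    return max(range(3), key=lambda c: p[c])

acc = sum(classify(sp, d) == lab for sp, d, lab in data) / len(data)
print(f"training accuracy: {acc:.2f}")
```

In practice such a model is fitted with a statistics package rather than hand-rolled gradient descent; the sketch only shows how the predictors map to probabilities over the three crossing patterns.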

  1. Is bigger better? An empirical analysis of waste management in New South Wales

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Pedro, E-mail: pedrotcc@gmail.com [LAMEMO Laboratory, Department of Civil Engineering, Federal University of Rio de Janeiro, COPPE/UFRJ, Av. Pedro Calmon – Ilha do Fundão, 21941-596 Rio de Janeiro (Brazil); CESUR – Center for Urban and Regional Systems, Instituto Superior Técnico, University of Lisboa, Av. Rovisco Pais, 1049-001 Lisbon (Portugal); Marques, Rui Cunha, E-mail: rui.marques@tecnico.ulisboa.pt [CESUR – Center for Urban and Regional Systems, Instituto Superior Técnico, University of Lisboa, Av. Rovisco Pais, 1049-001 Lisbon (Portugal); Dollery, Brian, E-mail: bdollery@une.edu.au [CLG – Centre for Local Government, University of New England, Armidale, NSW (Australia); Faculty of Economics, Yokohama National University (Japan)

    2015-05-15

Highlights: • We search for the most efficient cost structure for NSW household waste services. • We found that larger services are no longer efficient. • We found an optimal size in the range 12,000–20,000 inhabitants. • We found significant economies of output density for household waste collection. • We found economies of scope in joint provision of unsorted and recycling services. - Abstract: Across the world, rising demand for municipal solid waste services has seen an ongoing increase in the costs of providing these services. Moreover, municipal waste services have typically been provided through natural or legal monopolies, where few incentives exist to reduce costs. It is thus vital to examine empirically the cost structure of these services in order to develop effective public policies which can make these services more cost efficient. Accordingly, this paper considers economies of size and economies of output density in the municipal waste collection sector in the New South Wales (NSW) local government system in an effort to identify the optimal size of utilities from the perspective of cost efficiency. Our results show that – as presently constituted – NSW municipal waste services are not efficient in terms of costs, thereby demonstrating that ‘bigger is not better.’ The optimal size of waste utilities is estimated to fall in the range 12,000–20,000 inhabitants. However, significant economies of output density for unsorted (residual) municipal waste collection and recycling waste collection were found, which means it is advantageous to increase the amount of waste collected while holding the number of customers and the intervention area constant.

  2. An Empirical Analysis of the Effect of Stock Market Crisis on Economic Growth: The Nigerian Case

    Directory of Open Access Journals (Sweden)

    Omowunmi Felicia Olokoyo

    2011-08-01

Full Text Available Stock market crashes are social phenomena where external economic events combine with crowd behavior and psychology in a positive feedback loop where selling by some market participants drives more market participants to sell. This study empirically established the relationship between stock market crisis and Nigeria’s economic growth and also showed the relationship between stock market price crash and the crisis itself. In this light, this paper examined the interactive influence of movements in the major indicators of the performance of the Nigerian Stock Exchange Market such as the Market Capitalization (MK), All Share Index (ASI), Number of Deals (NOD), Volume and Value of Stock (VV), Total Number of New Issues (TNI) and Inflation (INFR) on the Nigerian Gross Domestic Product (GDP), using data from 1985-2009. To achieve the two objectives stated above, the Ordinary Least Square (OLS) method was employed. To correct for bias in the OLS results, the log was applied to GDP and MK and an AR(1) term was introduced to the first model. The result shows that stock market crisis has a highly significant effect on Nigeria’s economic growth. The result also shows a significant relationship between stock market price crash and the market crisis itself. It is therefore recommended that in the face of the ongoing crisis in the global stock market, the Nigerian stock market authorities should aim at making the market meet a world class standard. Also, all the sectors of the economy should act in a collaborative manner such that optimum benefits can be realized from their economic activities in the Nigerian market even in the hub of global crisis.

  3. A consensus microsatellite-based linkage map for the hermaphroditic bay scallop (Argopecten irradians and its application in size-related QTL analysis.

    Directory of Open Access Journals (Sweden)

    Hongjun Li

Full Text Available Bay scallop (Argopecten irradians) is one of the most economically important aquaculture species in China. In this study, we constructed a consensus microsatellite-based genetic linkage map with a mapping panel containing two hybrid backcross-like families involving two subspecies of bay scallop, A. i. irradians and A. i. concentricus. One hundred sixty-one microsatellite and one phenotypic (shell color) markers were mapped to 16 linkage groups (LGs), which corresponds to the haploid chromosome number of bay scallop. The sex-specific map was 779.2 cM and 781.6 cM long in female and male, respectively, whereas the sex-averaged map spanned 849.3 cM. The average resolution of the integrated map was 5.9 cM/locus and the estimated coverage was 81.3%. Distorted markers occurred more often in the hybrid parents, suggesting that the segregation distortion possibly resulted from heterospecific interaction between the genomes of the two subspecies of bay scallop. The overall female-to-male recombination rate was 1.13:1 across all linked markers common to both parents, and considerable differences in recombination also existed among different parents in both families. Four size-related traits, including shell length (SL), shell height (SH), shell width (SW) and total weight (TW), were measured for quantitative trait loci (QTL) analysis. Three significant and six suggestive QTL were detected on five LGs. Among the three significant QTL, two (qSW-10 and qTW-10, controlling SW and TW, respectively) were mapped on the same region near marker AiAD121 on LG10 and explained 20.5% and 27.7% of the phenotypic variance, while the third (qSH-7, controlling SH) was located on LG7 and accounted for 15.8% of the phenotypic variance. Six suggestive QTL were detected on four different LGs. The linkage map and size-related QTL obtained in this study may facilitate marker-assisted selection (MAS) in bay scallop.

  4. Seasonal Variation of Colored Dissolved Organic Matter in Barataria Bay, Louisiana, Using Combined Landsat and Field Data

    Directory of Open Access Journals (Sweden)

    Ishan Joshi

    2015-09-01

Full Text Available Coastal bays, such as Barataria Bay, are important transition zones between the terrigenous and marine environments that are also optically complex due to elevated amounts of particulate and dissolved constituents. Monthly field data collected over a period of 15 months in 2010 and 2011 in Barataria Bay were used to develop an empirical band ratio algorithm for the Landsat-5 TM that showed a good correlation with the Colored Dissolved Organic Matter (CDOM) absorption coefficient at 355 nm (ag355) (R2 = 0.74). Landsat-derived CDOM maps generally captured the major details of CDOM distribution and seasonal influences, suggesting the potential use of Landsat imagery to monitor biogeochemistry in coastal water environments. An investigation of the seasonal variation in ag355 conducted using Landsat-derived ag355 as well as field data suggested the strong influence of seasonality in the different regions of the bay, with the marine end members (lower bay) experiencing generally low but highly variable ag355 and the freshwater end members (upper bay) experiencing high ag355 with low variability. Barataria Bay experienced a significant increase in ag355 during the freshwater release at the Davis Pond Freshwater Diversion (DPFD) following the Deep Water Horizon oil spill in 2010 and following the Mississippi River (MR) flood conditions in 2011, resulting in a weak linkage to salinity in comparison to the other seasons. Tree-based statistical analysis showed the influence of high river flow conditions and high- and low-pressure systems, which appeared to control ag355 for ~28%, 29% and 43% of the time, respectively, over the study period at the marine end member just outside the bay. An analysis of CDOM variability in 2010 revealed the strong influence of the MR in controlling CDOM abundance in the lower bay during the high flow conditions, while strong winds associated with cold fronts significantly increase CDOM abundance in the upper bay, thus revealing the important
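An empirical band-ratio CDOM algorithm of this kind is, at its core, a one-variable regression of field-measured ag355 against a ratio of satellite reflectance bands, often in log-log space. The functional form, band choice, and coefficients below are purely illustrative placeholders, not the authors' published algorithm:

```python
import math
import random

random.seed(11)

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# synthetic calibration set: field-measured ag355 against a Landsat
# band ratio; the log-log relation and its coefficients are invented
ratio = [random.uniform(0.5, 2.0) for _ in range(60)]
ag355 = [math.exp(1.0 - 1.2 * math.log(r) + random.gauss(0, 0.25))
         for r in ratio]

# calibrate in log-log space, as band-ratio algorithms commonly do
a, b = fit_line([math.log(r) for r in ratio],
                [math.log(v) for v in ag355])

def predict_ag355(band_ratio):
    """Map a band ratio to a CDOM absorption estimate (1/m)."""
    return math.exp(a + b * math.log(band_ratio))
```

Applied pixel by pixel, a function like predict_ag355 turns a band-ratio image into the kind of CDOM map described above; the calibration fit quality (the study's R2 = 0.74) is what justifies that step.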

  5. A Survey of Undergraduate Marketing Programs: An Empirical Analysis of Knowledge Areas and Metaskills

    Science.gov (United States)

    Finch, David; Nadeau, John; O'Reilly, Norm

    2018-01-01

    Scholars suggest that the dynamic nature of marketing has put both the marketing profession and marketing education at a crossroads. This study is an analysis of marketing programs by conceptual knowledge and metaskills. In a content analysis of course descriptions for 523 undergraduate marketing courses in Canada from 40 universities, the…

  6. The Effects of Variability and Risk in Selection Utility Analysis: An Empirical Comparison.

    Science.gov (United States)

    Rich, Joseph R.; Boudreau, John W.

    1987-01-01

Investigated utility estimate variability for the selection utility of using the Programmer Aptitude Test to select computer programmers. Comparison of Monte Carlo results to other risk assessment approaches (sensitivity analysis, break-even analysis, algebraic derivation of the distribution) suggests that distribution information provided by Monte…

  7. Empirical Analysis on the Impact of Personality Traits on Different Categories of Mobile App Adoption

    OpenAIRE

    Gan Chunmei; Zhang Chunfu; Liang Xubin

    2017-01-01

    [Purpose/significance] This research attempts to investigate the impact of personality traits on the adoption and time spending of mobile photography apps, mobile game apps, mobile shopping apps and mobile video apps. [Method/process] 520 valid samples were collected by questionnaires and further analyzed by using the variance analysis and correlation analysis. [Result/conclusion] The result shows that agreeableness is posi...

  8. An Empirical Study on Needs Analysis of College Business English Course

    Science.gov (United States)

    Wu, Yan

    2012-01-01

Under the theoretical framework of needs analysis, this paper aims to give insights into college business English learners' needs (including target situation needs, learning situation needs and present situation needs). The analysis of the research data has provided teachers with insights into issues related to business English teaching.

  9. An Empirical Analysis of the Performance of Vietnamese Higher Education Institutions

    Science.gov (United States)

    Tran, Carolyn-Dung T. T.; Villano, Renato A.

    2017-01-01

    This article provides an analysis of the academic performance of higher education institutions (HEIs) in Vietnam with 50 universities and 50 colleges in 2011/12. The two-stage semiparametric data envelopment analysis is used to estimate the efficiency of HEIs and investigate the effects of various factors on their performance. The findings reveal…

  10. An empirical analysis of the importance of controlling for unobserved heterogeneity when estimating the income-mortality gradient

    Directory of Open Access Journals (Sweden)

    Adriaan Kalwij

    2014-10-01

Full Text Available Background: Statistical theory predicts that failing to control for unobserved heterogeneity in a Gompertz mortality risk model attenuates the estimated income-mortality gradient toward zero. Objective: I assess the empirical importance of controlling for unobserved heterogeneity in a Gompertz mortality risk model when estimating the income-mortality gradient. The analysis is carried out using individual-level administrative data from the Netherlands over the period 1996-2012. Methods: I estimate a Gompertz mortality risk model in which unobserved heterogeneity has a gamma distribution and left-truncation of life durations is explicitly taken into account. Results: I find that, despite a strong and significant presence of unobserved heterogeneity in both the male and female samples, failure to control for unobserved heterogeneity yields only a small and insignificant attenuation bias in the negative income-mortality gradient. Conclusions: The main finding, a small and insignificant attenuation bias in the negative income-mortality gradient when failing to control for unobserved heterogeneity, is positive news for the many empirical studies whose estimates of the income-mortality gradient ignore unobserved heterogeneity.
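The attenuation mechanism this study tests can be reproduced in a small simulation: individual hazards follow a Gompertz law scaled by a gamma-distributed frailty, and the observed population-level income-mortality hazard ratio drifts toward 1 at older ages as high-frailty individuals are selected out. All parameter values below are illustrative, not estimates from the Dutch data:

```python
import math
import random

random.seed(2)

A, B = 1e-4, 0.1    # Gompertz baseline hazard: h0(t) = A * exp(B * t)
BETA = -0.5         # true conditional log hazard ratio for high income
THETA = 0.5         # frailty variance (gamma frailty with mean 1)

def death_age(x_income):
    """Invert the cumulative hazard H(t) = Z*A*e^(BETA*x)*(e^(B*t)-1)/B."""
    z = random.gammavariate(1 / THETA, THETA)   # mean 1, variance THETA
    e = random.expovariate(1.0)                 # H(T) ~ Exponential(1)
    scale = z * A * math.exp(BETA * x_income) / B
    return math.log(1.0 + e / scale) / B

def window_hazard(ages, lo, hi):
    """Crude discrete hazard: deaths in [lo, hi) / persons alive at lo."""
    at_risk = [t for t in ages if t >= lo]
    return sum(1 for t in at_risk if t < hi) / len(at_risk)

n = 100_000
low = [death_age(0) for _ in range(n)]    # low-income group
high = [death_age(1) for _ in range(n)]   # high-income group

hr_early = window_hazard(high, 40, 45) / window_hazard(low, 40, 45)
hr_late = window_hazard(high, 75, 80) / window_hazard(low, 75, 80)

print(f"conditional HR: {math.exp(BETA):.2f}")
print(f"observed HR, ages 40-45: {hr_early:.2f}")
print(f"observed HR, ages 75-80: {hr_late:.2f}")
```

At young ages the observed ratio sits near exp(BETA) ≈ 0.61; by ages 75-80 it has moved toward 1, which is exactly the attenuation that an explicit frailty term in the estimated model is meant to absorb.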

  11. Determination of heavy metals and other elements in sediments from Sepetiba Bay (RJ, Brazil) by neutron activation analysis; Determinacao de metais pesados e outros elementos em sedimentos da Baia de Sepetiba (RJ) por ativacao neutronica

    Energy Technology Data Exchange (ETDEWEB)

    Pellegatti, Fabio

    2000-07-01

Sepetiba Bay, located about 60 km south of the city of Rio de Janeiro, Brazil, is one of the most important fishery areas in the State of Rio de Janeiro. A large harbor attracted substantial industrial investment to the area. Since the 1970s, the Sepetiba region has undergone fast industrial expansion, leading to high levels of pollution by metals. For the last two decades, an industrial park composed of about 400 industrial plants, basically metallurgical, was established in the Sepetiba Bay basin, releasing its industrial waste either straight into the bay or through local rivers. Metal contamination in the bay for some metals, such as Zn, has already exceeded acceptable levels. Many authors have studied the distribution and behavior of heavy metals and other elements in the bay, but only a few elements have been the focus (Cd, Cr, Cu, Fe, Mn, Ni, Pb and Zn). This is probably due to the fact that the analytical technique most employed has been atomic absorption spectrometry, which is not a multi-elemental technique. In this work, Instrumental Neutron Activation Analysis (INAA) was applied to the determination of the elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Hf, La, Lu, Nd, Rb, Sc, Sm, Ta, Tb, Th, U, Yb and Zn in 28 bottom sediment samples and four sediment cores from Sepetiba Bay. The elements Co, Cr, Cs, Fe, Sc, Ta and Zn presented similar behavior in the bottom sediments, showing higher concentration along the Northern coast of the bay, where most of the fluvial water flows out to the bay. The contamination of Sepetiba Bay was also assessed by the analysis of four sediment cores. Two of them were sampled in the Eastern part of the bay, where the industrial park is located, whereas the other two were sampled in the Western part of the bay, a more preserved region. For each region, two cores were sampled within the mangrove trees and the others at the edge of the tidal flat. The results showed that the sediments displayed higher metal concentration within

  12. State intervention causing inefficiency: an empirical analysis of the Norwegian Continental Shelf

    International Nuclear Information System (INIS)

    Kashani, Hossein A.

    2005-01-01

State intervention in the Norwegian Continental Shelf started with the establishment of Statoil as the medium of state ownership over discovered petroleum and as a tool to monitor oil companies' procurement behaviour. This paper tests the extent to which state intervention created inefficiencies in the Norwegian Continental Shelf (NCS) activities, as measured by data envelopment analysis, stochastic frontier analysis, Malmquist Indices, and standard regression analysis. Our results confirm such inefficiencies. Accordingly, the results provide an important insight into NCS production techniques and, more generally, into governments' abilities to influence private sector behaviour through contracts and tendering.

  13. Paleoenvironment interpretation of a 1760 years B.P. old sediment in a mangrove area of the Bay of Guanabara, using pollen analysis

    Directory of Open Access Journals (Sweden)

    Barth Ortrud M.

    2006-01-01

Full Text Available A sediment sample was obtained at 122 cm from the top of a drilling core in the Guapimirim mangrove, Bay of Guanabara, and analyzed using pollen analysis. This muddy core reached sandy ground at 133 cm. 14C dating yielded an age of 1760 ± 50 years B.P. The most frequent pollen grains were mangrove species of Rhizophora mangle, Laguncularia racemosa and Avicennia schaueriana. "Restinga" and tropical rain forest vegetation was recognized behind the mangrove. After the last sea transgression at 2500 years B.P., the water level fell to its present level, allowing this mangrove to become established.

  14. Paleoenvironment interpretation of a 1760 years B.P. old sediment in a mangrove area of the Bay of Guanabara, using pollen analysis

    OpenAIRE

    Barth, Ortrud M.; São-Thiago, Luiz E.U.; Barros, Marcia A.

    2006-01-01

    A sediment sample was obtained at 122 cm from the top of a drilling core in the Guapimirim mangrove, Bay of Guanabara, and analyzed using pollen analysis. This muddy core reached a sandy ground at 133 cm. 14C datation got the age of 1760 ± 50 years B.P. The most frequent pollen grains were mangrove species of Rhizophora mangle, Laguncularia racemosa and Avicennia schaueriana. "Restinga" and tropical rain forest vegetation was recognized behind the mangrove. After the last sea transgression at...

  15. A global weighted mean temperature model based on empirical orthogonal function analysis

    Science.gov (United States)

    Li, Qinzheng; Chen, Peng; Sun, Langlang; Ma, Xiaping

    2018-03-01

A global empirical orthogonal function (EOF) model of the tropospheric weighted mean temperature called GEOFM_Tm was developed using high-precision Global Geodetic Observing System (GGOS) Atmosphere Tm data during the years 2008-2014. Due to the quick convergence of EOF decomposition, it is possible to use the first four EOF series, which consist of basis functions Uk and associated coefficients Pk, to represent 99.99% of the overall variance of the original data sets and its spatial-temporal variations. Results show that U1 displays a prominent latitude distribution profile with positive peaks located in the low latitude region. U2 manifests an asymmetric pattern in which positive values occur above 30° in the Northern Hemisphere and negative values are observed in other regions. U3 and U4 display significant anomalies in Tibet and North America, respectively. Annual variation is the major component of the first and second associated coefficients P1 and P2, whereas P3 and P4 mainly reflect both annual and semi-annual variation components. Furthermore, the performance of the constructed GEOFM_Tm was validated by comparison with GTm_III and GTm_N using different kinds of data, including GGOS Atmosphere Tm data in 2015 and radiosonde data from the Integrated Global Radiosonde Archive (IGRA) in 2014. Generally speaking, GEOFM_Tm achieves the same accuracy and reliability as the GTm_III and GTm_N models on a global scale, and even improves on them in the Antarctic and Greenland regions. The MAE and RMS of GEOFM_Tm are 2.49 K and 3.14 K with respect to GGOS Tm data, respectively, and 3.38 K and 4.23 K with respect to IGRA sounding data, respectively. In addition, all three models have higher precision at low latitudes than at middle and high latitudes. The magnitude of Tm remains in the range 220-300 K and shows a high correlation with geographic latitude. In the Northern Hemisphere, there is a significant enhancement at high latitude regions, reaching 270 K during summer
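EOF analysis of the kind used to build a model like GEOFM_Tm amounts to an eigen-decomposition of the spatial covariance matrix of the anomaly field: the eigenvectors are the basis functions Uk and the projections of the field onto them are the coefficients Pk. The toy field below has two planted modes (an "annual" and a "semi-annual" coefficient series), recovered by power iteration with deflation; grid size, amplitudes, and noise level are arbitrary:

```python
import math
import random

random.seed(4)

M, N = 20, 240   # grid points, monthly time steps (20 years)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    n = math.sqrt(dot(u, u))
    return [a / n for a in u]

# plant two orthogonal spatial patterns with known time coefficients
u1 = norm([math.sin(2 * math.pi * i / M) for i in range(M)])
u2 = norm([math.cos(2 * math.pi * i / M) for i in range(M)])
field = []
for t in range(N):
    p1 = 10.0 * math.sin(2 * math.pi * t / 12)   # annual coefficient
    p2 = 3.0 * math.cos(2 * math.pi * t / 6)     # semi-annual coefficient
    field.append([p1 * a + p2 * b + random.gauss(0, 0.1)
                  for a, b in zip(u1, u2)])

# spatial covariance matrix of the anomaly field
mean = [sum(row[i] for row in field) / N for i in range(M)]
anom = [[row[i] - mean[i] for i in range(M)] for row in field]
C = [[sum(row[i] * row[j] for row in anom) / N for j in range(M)]
     for i in range(M)]

def leading_eof(C):
    """Power iteration for the top eigenpair of a symmetric PSD matrix."""
    v = norm([random.gauss(0, 1) for _ in C])
    for _ in range(300):
        v = norm([dot(row, v) for row in C])
    return dot(v, [dot(row, v) for row in C]), v

total_var = sum(C[i][i] for i in range(M))
lam1, e1 = leading_eof(C)
# deflation: remove the first mode, then extract the second
C2 = [[C[i][j] - lam1 * e1[i] * e1[j] for j in range(M)] for i in range(M)]
lam2, e2 = leading_eof(C2)

explained = (lam1 + lam2) / total_var
print(f"variance explained by the first two EOFs: {explained:.4f}")
```

The fraction of variance captured by the first few modes is what the abstract's "99.99% with four EOF series" refers to; with only weak noise on top of two planted modes, the toy example shows the same rapid convergence.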

  16. When complexity science meets implementation science: a theoretical and empirical analysis of systems change.

    Science.gov (United States)

    Braithwaite, Jeffrey; Churruca, Kate; Long, Janet C; Ellis, Louise A; Herkes, Jessica

    2018-04-30

    Implementation science has a core aim - to get evidence into practice. Early in the evidence-based medicine movement, this task was construed in linear terms, wherein the knowledge pipeline moved from evidence created in the laboratory through to clinical trials and, finally, via new tests, drugs, equipment, or procedures, into clinical practice. We now know that this straight-line thinking was naïve at best, and little more than an idealization, with multiple fractures appearing in the pipeline. The knowledge pipeline derives from a mechanistic and linear approach to science, which, while delivering huge advances in medicine over the last two centuries, is limited in its application to complex social systems such as healthcare. Instead, complexity science, a theoretical approach to understanding interconnections among agents and how they give rise to emergent, dynamic, systems-level behaviors, represents an increasingly useful conceptual framework for change. Herein, we discuss what implementation science can learn from complexity science, and tease out some of the properties of healthcare systems that enable or constrain the goals we have for better, more effective, more evidence-based care. Two Australian examples, one largely top-down, predicated on applying new standards across the country, and the other largely bottom-up, adopting medical emergency teams in over 200 hospitals, provide empirical support for a complexity-informed approach to implementation. The key lessons are that change can be stimulated in many ways, but a triggering mechanism is needed, such as legislation or widespread stakeholder agreement; that feedback loops are crucial to continue change momentum; that extended sweeps of time are involved, typically much longer than believed at the outset; and that taking a systems-informed, complexity approach, having regard for existing networks and socio-technical characteristics, is beneficial. Construing healthcare as a complex adaptive system

  17. Polymorphs and prodrugs and salts (oh my!): an empirical analysis of "secondary" pharmaceutical patents.

    Directory of Open Access Journals (Sweden)

    Amy Kapczynski

    Full Text Available BACKGROUND: While there has been much discussion by policymakers and stakeholders about the effects of "secondary patents" on the pharmaceutical industry, there is no empirical evidence on their prevalence or determinants. Characterizing the landscape of secondary patents is important in light of recent court decisions in the U.S. that may make them more difficult to obtain, and for developing countries considering restrictions on secondary patents. METHODOLOGY/PRINCIPAL FINDINGS: We read the claims of the 1304 Orange Book listed patents on all new molecular entities approved in the U.S. between 1988 and 2005, and coded the patents as including chemical compound claims (claims covering the active molecule itself) and/or one of several types of secondary claims. We distinguish between patents with any secondary claims, and those with only secondary claims and no chemical compound claims ("independent" secondary patents). We find that secondary claims are common in the pharmaceutical industry. We also show that independent secondary patents tend to be filed and issued later than chemical compound patents, and are also more likely to be filed after the drug is approved. When present, independent formulation patents add an average of 6.5 years of patent life (95% C.I.: 5.9 to 7.3 years), independent method of use patents add 7.4 years (95% C.I.: 6.4 to 8.4 years), and independent patents on polymorphs, isomers, prodrug, ester, and/or salt claims add 6.3 years (95% C.I.: 5.3 to 7.3 years). We also provide evidence that late-filed independent secondary patents are more common for higher sales drugs. CONCLUSIONS/SIGNIFICANCE: Policies and court decisions affecting secondary patenting are likely to have a significant impact on the pharmaceutical industry. Secondary patents provide substantial additional patent life in the pharmaceutical industry, at least nominally.
Evidence that they are also more common for best-selling drugs is consistent with accounts of

  18. An empirical orthogonal function analysis of ocean shoreline location on the Virginia barrier islands

    Science.gov (United States)

    Haluska, J. D.

    2017-12-01

    Shoreline change along the Eastern Atlantic shore of Virginia has been studied for the individual barrier islands but not as an integrated system. This study combines the Atlantic shoreline locations for eleven barrier islands obtained from LANDSAT 5, 7, and 8 images. Approximately 250 shoreline locations over a 24-year period from Jan 1990 to Dec 2014 were extracted from the digitized shoreline data at 338 transects. The resulting 338 by 250 matrix was analyzed by the empirical orthogonal function (EOF) technique. The first four principal components (PCs) explained 86 percent of the sample variance. Since the data were not detrended, the first PC captures the overall trend of the data, with a discontinuity in 2004-2005. The 2004-2005 interval included storm events and large shoreline changes. PCs 2 to 4 reflect the effects of El Niño events and tropical and non-tropical storms. Eigenvectors 1 to 4 all show the effects of the nine inlets in the island group. Eigenvector (EV) 1 explains 59 percent of the shoreline spatial variance and shows the largest changes at the northern and southern island ends. EVs 2 to 4 reflect the pattern of EV1 but at sequentially smaller percentages of the spatial variance. As a group, the eleven islands are losing ocean-side shoreline; the lone exception is Hog Island. Sea level had the strongest correlation with the shoreline-loss trend of PC1, with a coefficient of determination of 0.41. The NAO and MEI also correlated with PC1, with coefficients of determination of 0.05 and 0.12, respectively. The confidence level for all three factors was better than 99 percent. Sea level also correlated with PC3 and PC4. The PCs as a group show that the year intervals 2004-2005 and 2009-2010 had large effects on the shoreline-change pattern for the island group. EVs 1 to 4 had the highest range of shoreline change at the island ends, indicating the effect that changes in the inlets have on the adjacent islands.
The smaller islands as a group had a higher level
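
    The attribution step this record reports — correlating a retained PC against a forcing series and quoting a coefficient of determination — amounts to squaring a Pearson correlation. A sketch with synthetic stand-ins (these are not the Virginia shoreline or sea-level records):

```python
import numpy as np

# Synthetic stand-ins: a trending driver and a PC partly explained by it.
rng = np.random.default_rng(4)
n = 250                                            # ~250 monthly epochs
sea_level = np.cumsum(rng.normal(0.1, 1.0, n))     # hypothetical driver series
pc1 = 0.8 * sea_level + rng.normal(0.0, 3.0, n)    # hypothetical PC1

r = np.corrcoef(pc1, sea_level)[0, 1]              # Pearson correlation
r2 = r**2                                          # coefficient of determination
print(f"R^2 = {r2:.2f}")
```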

  19. Patents and scientific publications: an empirical analysis of the Italian system of academic professor recruitment

    Directory of Open Access Journals (Sweden)

    Bruno Marsigalia

    2014-01-01

    Full Text Available The recent increase in patenting by European and American university researchers has raised concerns among observers that increased patenting may be associated with less open publication of research results. This leads us to examine whether the propensity for academic patenting negatively affects publication of scientific research results and therefore results in less diffusion of knowledge resources, or, conversely, whether it could increase the quantity and quality of scientific publications and therefore improve academic performance. We propose a quantitative approach through which we aim to test whether academic researchers who both publish and patent are less productive than their peers who concentrate exclusively on scholarly publication in order to communicate their research results. More specifically, by using the statistical model of comparison between sample means, we analyse whether the average number of publications by academic inventors is lower than the average of non-academic ones. We use a panel dataset comprising Italian academic researchers who obtained the National Scientific Qualification as full professor in the sector "02/B3 - Applied Physics" in the 2012 session. With regard to the relationship between patenting and publishing by university researchers there is no unanimous doctrinal orientation. Additionally, there is only limited empirical evidence regarding the correlation between these two variables. Our study contributes to the existing literature by supporting the thesis that the open publication of university research results is not inhibited by patenting by university faculty members. The outcomes of the application suggest that it would be appropriate to encourage a greater use of patents by university researchers. It would seem in fact that, thanks to the financial support to academic research and in general to the incentives arising from contact with industry, the development of industrial

  20. Software project profitability analysis using temporal probabilistic reasoning; an empirical study with the CASSE framework

    CSIR Research Space (South Africa)

    Balikuddembe, JK

    2009-04-01

    Full Text Available Undertaking adequate risk management by understanding project requirements and ensuring that viable estimates are made on software projects require extensive application and sophisticated techniques of analysis and interpretation. Informative...

  1. GRID PRICING VERSUS AVERAGE PRICING FOR SLAUGHTER CATTLE: AN EMPIRICAL ANALYSIS

    OpenAIRE

    Fausti, Scott W.; Qasmi, Bashir A.

    1999-01-01

    The paper compares weekly producer revenue under grid pricing and average dressed weight pricing methods for 2560 cattle over a period of 102 weeks. Regression analysis is applied to identify factors affecting the revenue differential.

  2. Research on the factors of return on equity: empirical analysis in Chinese port industries from 2000-2008

    Science.gov (United States)

    Li, Wei

    2012-01-01

    Port industries are basic industries in the national economy, and have become among the most modernized sectors in every country. The development of the port industry is advantageous not only for promoting the optimal arrangement of social resources, but also for promoting the growth of foreign trade volume by enhancing transportation functions. Return on equity (ROE) is a direct indicator of the maximization of a company's wealth, and it makes up for the shortcomings of earnings per share (EPS). The aim of this paper is to establish the correlation between ROE and other financial indicators by choosing the listed port companies as the research objects and selecting the data of these companies from 2000 to 2008 as the empirical sample, with statistical analysis of the charted figures and coefficients. The detailed analysis method used in the paper combines trend analysis, comparative analysis, and the ratio-based factor analysis method. This paper analyzes and compares all these factors and draws the following conclusions. Firstly, ROE has a positive correlation with total assets turnover, main profit margin, and fixed asset ratio, while having a negative correlation with the assets-liabilities ratio, total assets growth rate, and DOL. Secondly, main profit margin has the greatest positive effect on ROE among all these factors; the second greatest factor is total assets turnover, which shows that operating capacity is also an important indicator after profitability. Thirdly, the assets-liabilities ratio has the greatest negative effect on ROE among all these factors.

  3. Development, calibration, and analysis of a hydrologic and water-quality model of the Delaware Inland Bays watershed

    Science.gov (United States)

    Gutierrez-Magness, Angelica L.; Raffensperger, Jeff P.

    2003-01-01

    Excessive nutrients and sediment are among the most significant environmental stressors in the Delaware Inland Bays (Rehoboth, Indian River, and Little Assawoman Bays). Sources of nutrients, sediment, and other contaminants within the Inland Bays watershed include point-source discharges from industries and wastewater-treatment plants, runoff and infiltration to ground water from agricultural fields and poultry operations, effluent from on-site wastewater disposal systems, and atmospheric deposition. To determine the most effective restoration methods for the Inland Bays, it is necessary to understand the relative distribution and contribution of each of the possible sources of nutrients, sediment, and other contaminants. A cooperative study involving the Delaware Department of Natural Resources and Environmental Control, the Delaware Geological Survey, and the U.S. Geological Survey was initiated in 2000 to develop a hydrologic and water-quality model of the Delaware Inland Bays watershed that can be used as a water-resources planning and management tool. The model code Hydrological Simulation Program - FORTRAN (HSPF) was used. The 719-square-kilometer watershed was divided into 45 model segments, and the model was calibrated using streamflow and water-quality data for January 1999 through April 2000 from six U.S. Geological Survey stream-gaging stations within the watershed. Calibration for some parameters was accomplished using PEST, a model-independent parameter estimator. Model parameters were adjusted systematically so that the discrepancies between the simulated values and the corresponding observations were minimized. Modeling results indicate that soil and aquifer permeability, ditching, dominant land-use class, and land-use practices affect the amount of runoff, the mechanism or flow path (surface flow, interflow, or base flow), and the loads of sediment and nutrients. In general, the edge-of-stream total suspended solids yields in the Inland Bays

  4. Different methods for ethical analysis in health technology assessment: an empirical study.

    Science.gov (United States)

    Saarni, Samuli I; Braunack-Mayer, Annette; Hofmann, Bjørn; van der Wilt, Gert Jan

    2011-10-01

    Ethical analysis can highlight important ethical issues related to implementing a technology, values inherent in the technology itself, and value-decisions underlying the health technology assessment (HTA) process. Ethical analysis is a well-acknowledged part of HTA, yet seldom included in practice. One reason for this is lack of knowledge about the properties and differences between the methods available. This study compares different methods for ethical analysis within HTA. Ethical issues related to bariatric (obesity) surgery were independently evaluated using axiological, casuist, principlist, and EUnetHTA models for ethical analysis within HTA. The methods and results are presented and compared. Despite varying theoretical underpinnings and practical approaches, the four methods identified similar themes: personal responsibility, self-infliction, discrimination, justice, public funding, and stakeholder involvement. The axiological and EUnetHTA models identified a wider range of arguments, whereas casuistry and principlism concentrated more on analyzing a narrower set of arguments deemed more important. Different methods can be successfully used for conducting ethical analysis within HTA. Although our study does not show that different methods in ethics always produce similar results, it supports the view that different methods of ethics can yield relevantly similar results. This suggests that the key conclusions of ethical analyses within HTA can be transferable between methods and countries. The systematic and transparent use of some method of ethics appears more important than the choice of the exact method.

  5. Application of empirical orthogonal functions or principal component analysis to environmental variability data

    International Nuclear Information System (INIS)

    Carvajal Escobar, Yesid; Marco Segura, Juan B

    2005-01-01

    An EOF analysis or principal component (PC) analysis was made for monthly precipitation (1972-1998) using 50 stations, and for monthly rate of flow (1951-2000) at 8 stations in the Valle del Cauca state, Colombia. Previously, we had applied five measures in order to verify the appropriateness of the analysis. These measures were: i) evaluation of the significance level of correlation between variables; ii) the Kaiser-Meyer-Olkin (KMO) test; iii) the Bartlett sphericity test; iv) the measurement of sample adequacy (MSA); and v) the percentage of non-redundant residues with absolute values > 0.05. For the selection of the significant PCs in every set of variables we applied seven criteria: the graphical method, the explained variance percentage, the mean root, the tests of Velicer, Bartlett, and Broken Stick, and the cross-validation test. We chose the latter as the best one; it is robust and quantitative. Precipitation stations were divided into three homogeneous groups by applying a hierarchical cluster analysis, which was verified through the geographic method and discriminant analysis for the first four EOFs of precipitation. There are many advantages to the EOF method: reduction of the dimensionality of multivariate data, calculation of missing data, evaluation and reduction of multicollinearity, building of homogeneous groups, and detection of outliers. With the first four principal components we can explain 60.34% of the total variance of monthly precipitation for the Valle del Cauca state, and 94% of the total variance for the selected records of rates of flow
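
    Of the adequacy checks listed in this record, Bartlett's sphericity test is simple to sketch: it tests H0 that the correlation matrix is the identity (no correlation between stations, hence nothing for a PCA to compress), using the statistic -(n-1-(2p+5)/6)·ln det R on p(p-1)/2 degrees of freedom. The data below are synthetic, not the Valle del Cauca station records:

```python
import numpy as np

def bartlett_sphericity(X):
    """Bartlett's test statistic for H0: corr(X) is the identity matrix."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    stat = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return stat, df

# Synthetic "stations" sharing a common signal, so columns are correlated
# and H0 should be rejected, i.e. an EOF/PCA analysis is worthwhile.
rng = np.random.default_rng(1)
common = rng.normal(size=(200, 1))
X = common + 0.3 * rng.normal(size=(200, 8))
stat, df = bartlett_sphericity(X)

# The 5% chi-square critical value for df = 28 is about 41.3, so a
# statistic far above that rejects sphericity.
print(f"chi2 = {stat:.1f} on {df} df")
```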

  6. Multivariate Analysis of Water Quality and Benthic Macrophyte Communities in Florida Bay, USA Reveals Hurricane Effects and Susceptibility to Seagrass Die-Off

    Directory of Open Access Journals (Sweden)

    Amanda M. Cole

    2018-05-01

    Full Text Available Seagrass communities, dominated by Thalassia testudinum, form the principal benthic ecosystem within Florida Bay, Florida, USA. The bay has had several large-scale seagrass die-offs in recent decades associated with drought and hypersaline conditions. In addition, three category-5 hurricanes passed in close proximity to the bay during the fall of 2005. This study investigated temporal and spatial trends in macrophyte abundance and water quality from 2006 to 2013 at 15 permanent transect sites, which were co-located with long-term water quality stations. Relationships, by year and by transect location (basin), between antecedent water quality (mean, minimum, and maximum for a 6-month period) and benthic macrophyte communities were examined using multivariate analyses. Total phosphorus, salinity, pH, turbidity, dissolved inorganic nitrogen (DIN), the DIN to phosphate ratio (DIN:PO4-3), chlorophyll a, and dissolved oxygen correlated with temporal and spatial variations in the macrophyte communities. Temporal analysis (MDS and LINKTREE) indicated that the fall 2005 hurricanes affected both water quality and macrophyte communities for approximately a 2-year period. Spatial analysis revealed that five basins, which subsequently exhibited a major seagrass die-off during summer 2015, significantly differed from the other ten basins in macrophyte community structure and water quality more than 2 years before this die-off event. High total phosphorus, high pH, low DIN, and low DIN:PO4-3, in combination with deep sediments and high seagrass cover, were characteristic of sites that subsequently exhibited severe die-off. Our results indicate basins with more mixed seagrass communities and higher macroalgae abundance are less susceptible to die-off, which is consistent with the management goals of promoting more heterogeneous benthic macrophyte communities.

  7. Short- and long-run elasticities of gasoline demand in India. An empirical analysis using cointegration techniques

    International Nuclear Information System (INIS)

    Ramanathan, R.

    1999-01-01

    In developing countries like India, consumption of petroleum products has implications for the balance of payments, economic growth, and the fiscal deficit. Gasoline is one of the prime petroleum products. In this paper, the relationship between gasoline demand, national income, and the price of gasoline is empirically examined using cointegration and error-correction techniques. The time frame of the analysis is from 1972-1973 to 1993-1994. It has been found that gasoline demand is likely to increase significantly for a given increase in gross domestic product. The increase will be larger in the long run (2.682) than in the short run (1.178). Gasoline demand is relatively inelastic to price changes, in both the long and short term. The error correction model shows that gasoline demand adjusts to its long-run equilibrium at a relatively slow rate, with about 28% of the adjustment taking place in the first year. 23 refs
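
    The two-step cointegration/error-correction machinery this record refers to can be sketched on simulated series. These are not the Indian data; the long-run elasticity and adjustment speed used in the simulation below are illustrative:

```python
import numpy as np

# Engle-Granger two-step sketch: (1) estimate the long-run (cointegrating)
# relation by OLS in levels; (2) regress first differences on the lagged
# residual to get the speed of adjustment toward equilibrium.
rng = np.random.default_rng(2)
T = 200
income = np.cumsum(rng.normal(0.02, 0.05, T))         # I(1) log-income proxy
demand = np.empty(T)
demand[0] = 2.0 * income[0]
for t in range(1, T):                                 # demand error-corrects toward 2*income
    demand[t] = demand[t - 1] + 0.3 * (2.0 * income[t - 1] - demand[t - 1]) \
                + rng.normal(0.0, 0.02)

# Step 1: long-run relation demand = a + b*income + u
Xlr = np.column_stack([np.ones(T), income])
a, b = np.linalg.lstsq(Xlr, demand, rcond=None)[0]
ect = demand - (a + b * income)                       # error-correction term

# Step 2: short-run dynamics with the lagged error-correction term
dy = np.diff(demand)
Xsr = np.column_stack([np.ones(T - 1), np.diff(income), ect[:-1]])
coefs = np.linalg.lstsq(Xsr, dy, rcond=None)[0]
print(f"long-run elasticity ~ {b:.2f}, adjustment speed ~ {coefs[2]:.2f}")
```

    A negative coefficient on the lagged residual is the "speed of adjustment" the abstract summarises as roughly 28% per year.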

  8. Multifractal features of EUA and CER futures markets by using multifractal detrended fluctuation analysis based on empirical mode decomposition

    International Nuclear Information System (INIS)

    Cao, Guangxi; Xu, Wei

    2016-01-01

    Based on daily price data of carbon emission rights in the futures markets for Certified Emission Reductions (CER) and European Union Allowances (EUA), we analyze the multiscale characteristics of the markets by using empirical mode decomposition (EMD) and multifractal detrended fluctuation analysis (MFDFA) based on EMD. The complexity of the daily returns of the CER and EUA futures markets changes with multiple time scales and multilayered features. The two markets also exhibit clear multifractal characteristics and long-range correlation. We employ shuffle and surrogate approaches to analyze the origins of the multifractality; the long-range correlations and fat-tail distributions contribute significantly to it. Furthermore, we analyze the influence of high returns on multifractality by using a threshold method. The multifractality of the two futures markets is related to the presence of high values of returns in the price series.
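
    A minimal MF-DFA sketch, following the standard recipe this record applies: build the profile, detrend within windows of each scale, form the q-order fluctuation function Fq(s), and read the generalized Hurst exponent h(q) off the log-log slope. A q-dependent h(q) signals multifractality; the input here is a synthetic fat-tailed series, not the CER/EUA returns:

```python
import numpy as np

def mfdfa(x, scales, qs, order=1):
    """Generalized Hurst exponents h(q) via multifractal DFA."""
    profile = np.cumsum(x - np.mean(x))
    Fq = np.zeros((len(qs), len(scales)))
    for j, s in enumerate(scales):
        n = len(profile) // s
        segs = profile[:n * s].reshape(n, s)
        t = np.arange(s)
        f2 = np.empty(n)
        for i in range(n):
            coef = np.polyfit(t, segs[i], order)      # local polynomial trend
            f2[i] = np.mean((segs[i] - np.polyval(coef, t)) ** 2)
        for k, q in enumerate(qs):
            if q == 0:                                # log-average for q = 0
                Fq[k, j] = np.exp(0.5 * np.mean(np.log(f2)))
            else:
                Fq[k, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
    # h(q) is the slope of log Fq(s) against log s.
    return np.array([np.polyfit(np.log(scales), np.log(Fq[k]), 1)[0]
                     for k in range(len(qs))])

rng = np.random.default_rng(3)
x = rng.standard_t(df=3, size=4096)                   # fat-tailed "returns"
h = mfdfa(x, scales=[16, 32, 64, 128, 256], qs=[-4, -2, 0, 2, 4])
print("h(q):", np.round(h, 3))
```

    For an uncorrelated fat-tailed series, h(2) stays near 0.5 while h(q) still varies across q, which is how shuffle/surrogate comparisons separate tail-driven from correlation-driven multifractality.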

  9. Tunnel support design by comparison of empirical and finite element analysis of the Nahakki tunnel in mohmand agency, pakistan

    Directory of Open Access Journals (Sweden)

    Riaz Asif

    2016-03-01

    Full Text Available The paper analyses the geological conditions of the study area and the rock mass strength parameters, with suitable support structure propositions for the under-construction Nahakki tunnel in Mohmand Agency. The geology of the study area varies from mica schist to graphitic marble/phyllite to schist. The tunnel ground is classified and divided by the empirical classification systems Rock Mass Rating (RMR), the Q system (Q), and the Geological Strength Index (GSI). Tunnel support measures are selected based on the RMR and Q classification systems. Computer-based finite element analysis (FEM) has given yet another dimension to the design approach. The FEM software Phase2 version 7.017 is used to calculate and compare deformations and stress concentrations around the tunnel, analyze the interaction of support systems with the excavated rock masses, and verify and check the validity of the empirically determined excavation and support systems.

  10. Illumina-based analysis the microbial diversity associated with Thalassia hemprichii in Xincun Bay, South China Sea.

    Science.gov (United States)

    Jiang, Yu-Feng; Ling, Juan; Dong, Jun-De; Chen, Biao; Zhang, Yan-Ying; Zhang, Yuan-Zhou; Wang, You-Shao

    2015-10-01

    In order to increase our understanding of the microbial diversity associated with the seagrass Thalassia hemprichii in Xincun Bay, South China Sea, the 16S rRNA gene was analyzed by high-throughput sequencing. Bacteria associated with the seagrass T. hemprichii belonged to 37 phyla and 99 classes. The diversity of bacteria associated with the seagrass was similar among the geographically linked coastal locations of Xincun Bay. Proteobacteria was the dominant phylum, and α-proteobacteria had adapted to the seagrass ecological niche. α-proteobacteria and Pseudomonadales were also associated microflora in the seagrass meadows, but the interaction between these bacteria and the plant requires further research. Burkholderiales and Verrucomicrobiae indicated the influence of anthropogenic activities on the bay. Further, Cyanobacteria could reflect differences in nutrient conditions among the sites. γ-proteobacteria, Desulfobacterales, and Pirellulales played roles in the sulfur cycle, organic mineralization, and the meadow ecosystem, respectively. In addition, low-abundance bacterial species have key functions in the seagrass meadows, but knowledge of their interaction with the seagrass is lacking. Microbial communities can respond to their surroundings and play key roles in the biogeochemical cycle.

  11. Regulation and efficiency: an empirical analysis of the United Kingdom continental shelf petroleum industry

    International Nuclear Information System (INIS)

    Kashani, H.A.

    2005-01-01

    The petroleum industry of the United Kingdom Continental Shelf (UKCS) has been subject to various degrees of regulation. Self-sufficiency, security of supply and developing offshore supply industry triggered government regulations that were seen as interventionary and protectionist. This paper tests the extent to which regulations targeting involvement of British offshore supply industry in the UKCS activity created inefficiencies. Data envelopment analysis (DEA), stochastic frontier analysis (SFA), Malmquist Indices, and standard regression analysis are used to measure the amount and address the source of inefficiencies. We will show that such inefficiencies could not be ruled out. The results provide an important insight into the UKCS production techniques and, more generally, into governments' abilities to influence private sector behaviour through contracts and tendering

  12. Information Technology, Human Resources Management Systems and Firm Performance: An Empirical Analysis from Spain

    Directory of Open Access Journals (Sweden)

    Pilar Ficapal-Cusí

    2011-04-01

    Full Text Available This research paper uses survey data on 1,518 Catalan firms (in Spain, with capital in Barcelona) to examine the relationship between IT use, innovative human resources management systems (IHRMS) and firm performance. Using factor and cluster analysis, we find that only one-third of Catalan firms use IHRMS. Using association analysis we find that firms that adopt IHRMS are more internationalised; show greater ability to adapt to a changing environment, to innovate and to collaborate; focus on a product/service differentiation strategy enhancing quality; apply a greater degree of new forms of work organization; use IT more intensively; and invest more in training their employees. Using regression analysis, we find that structural, technological, strategic, organisational and result-related features explain the adoption of IHRMS.

  13. AirAsia In The Malaysian Domestic Airline Market: Empirical Analysis Of Strategy

    OpenAIRE

    Mok Kim Man; Jainurin Bin Justine

    2011-01-01

    This paper will examine the results of the strategic actions of AirAsia in the Malaysian domestic airline market. Firstly, the paper will provide a general background of the airline industry, in particular the Malaysian domestic airline market, and a summary of an analysis of the industry using Michael Porter's Five Forces Analysis. Secondly, the paper will provide a brief background of AirAsia and Malaysia Airlines. Thirdly, the paper will analyse the results of AirAsia's strategy vis-à-vis oper...

  14. An empirical analysis of lumpy investment. The case of US petroleum refining industry

    International Nuclear Information System (INIS)

    Asano, Hirokatsu

    2002-01-01

    This paper employs five econometric models to examine lumpy investment and investigates the investment behavior of the US petroleum refining industry. Firms in the industry are classified into three groups by their size. All three groups show zero investment, disinvestment and investment in accordance with economic conditions. The analysis finds the minimum amount of investment and disinvestment for each group, which suggests that the size of fixed costs of investment is substantial, regardless of firm size. However, small firms adjust capital stock more slowly than medium or large firms. The analysis also suggests the existence of a convex adjustment cost

  15. DANWEC - Empirical Analysis of the Wave Climate at the Danish Wave Energy Centre

    DEFF Research Database (Denmark)

    Tetu, Amelie; Nielsen, Kim; Kofoed, Jens Peter

    information on the DanWEC wave and current climate. In this paper an analysis of the wave climate of the DanWEC test site will be presented. This includes a description of the data quality control and filtration for analysis and the observations and data analysis. Relevant characteristics of the test site...... site for several Danish WECs. In 2013 DanWEC has received Greenlab funding from the EUDP programme to establish the site including more detailed information on its wave climate and bathymetry and seabed conditions. The project “Resource Assessment, Forecasts and WECs O&M strategies at DanWEC and beyond......, as for example scatter diagram (Hm0, Tz) will be analysed and wave power distribution given. Based on the data gathered so far a preliminary analysis of extreme events at the DanWEC test site will be presented. Deployment, control strategies and O&M strategies of wave energy converters are sensitive to the wave...

  16. Advanced Infantry Training: An Empirical Analysis Of (0341) Mortarman Success While Attending Advanced Mortarman Course

    Science.gov (United States)

    2017-12-01

    ...to advanced level training, specifically, the Advanced Mortarman Course (AMC). Prospective students' success is predicated on an effective command...survival. It is evident through survival analysis that increased levels of cognitive ability have significant impacts on a Marine's probability to

  17. The Faculty Promotion Process. An Empirical Analysis of the Administration of Large State Universities.

    Science.gov (United States)

    Luthans, Fred

    One phase of academic management, the faculty promotion process, is systematically described and analyzed. The study encompasses three parts: (1) the justification of the use of management concepts in the analysis of academic administration; (2) a descriptive presentation of promotion policies and practices in 46 large state universities; and (3)…

  18. A Meta-Analysis of Motivational Interviewing: Twenty-Five Years of Empirical Studies

    Science.gov (United States)

    Lundahl, Brad W.; Kunz, Chelsea; Brownell, Cynthia; Tollefson, Derrik; Burke, Brian L.

    2010-01-01

    Objective: The authors investigated the unique contribution motivational interviewing (MI) has on counseling outcomes and how MI compares with other interventions. Method: A total of 119 studies were subjected to a meta-analysis. Targeted outcomes included substance use (tobacco, alcohol, drugs, marijuana), health-related behaviors (diet,…

  19. Factors of Economic growth in KSA: An empirical analysis from 2000-2014

    Directory of Open Access Journals (Sweden)

    Hanaa Abdelaty Hasan Esmail

    2015-11-01

    Full Text Available This paper aims to conduct an analysis of the impact of oil revenue and its relationship with the factors that affect growth in domestic product, and also focuses on investment opportunity and economic growth in Saudi Arabia, which has made the country an increasingly attractive destination for foreign investors.

  20. Factors of Economic growth in KSA: An empirical analysis from 2000-2014

    OpenAIRE

    Hanaa Abdelaty Hasan Esmail

    2015-01-01

    This paper aims to conduct an analysis of the impact of oil revenue and its relationship with the factors that affect growth in domestic product, and also focuses on investment opportunity and economic growth in Saudi Arabia, which has made the country an increasingly attractive destination for foreign investors.