WorldWideScience

Sample records for modeling studies empirical

  1. Salt intrusion study in Cochin estuary - Using empirical models

    Digital Repository Service at National Institute of Oceanography (India)

    Jacob, B.; Revichandran, C.; NaveenKumar, K.R.

    Empirical models have been applied to the Cochin estuary in the present study to identify the most suitable model for predicting the salt intrusion length. Comparison of the obtained results indicates that the model of Van der Burgh (1972) is the most suitable empirical model...

  2. An empirical and model study on automobile market in Taiwan

    Science.gov (United States)

    Tang, Ji-Ying; Qiu, Rong; Zhou, Yueping; He, Da-Ren

    2006-03-01

    We have done an empirical investigation of the automobile market in Taiwan, including the development of the possession rates of the companies in the market from 1979 to 2003, the development of the largest possession rate, and so on. A dynamic model for describing the competition between the companies is suggested based on the empirical study. In the model each company is given a long-term competition factor (such as technology, capital and scale) and a short-term competition factor (such as management, service and advertisement). Then the companies play games in order to obtain more possession rate in the market under certain rules. Numerical simulations based on the model display a competition development process, which qualitatively and quantitatively agrees with our empirical investigation results.

  3. A study on online monitoring system development using empirical models

    Energy Technology Data Exchange (ETDEWEB)

    An, Sang Ha

    2010-02-15

    Maintenance technologies have progressed from a time-based to a condition-based manner. The fundamental idea of condition-based maintenance (CBM) is built on the real-time diagnosis of impending failures and/or the prognosis of residual lifetime of equipment by monitoring health conditions using various sensors. The success of CBM, therefore, hinges on the capability to develop accurate diagnosis/prognosis models. Even though there may be an unlimited number of methods to implement models, the models can normally be classified into two categories in terms of their origins: using physical principles or historical observations. I have focused on the latter method (sometimes referred to as the empirical model based on statistical learning) because of some practical benefits such as context-free applicability, configuration flexibility, and customization adaptability. While several pilot-scale systems using empirical models have been applied to work sites in Korea, it should be noted that these do not seem to be generally competitive against conventional physical models. As a result of investigating the bottlenecks of previous attempts, I have recognized the need for a novel strategy for grouping correlated variables such that an empirical model can accept not only statistical correlation but also some extent of physical knowledge of a system. Detailed examples of problems are as follows: (1) missing important signals in a group because of a lack of observations, (2) problems with time-delayed signals, and (3) problems in selecting the optimal kernel bandwidth. In this study, an improved statistical learning framework including the proposed strategy and case studies illustrating the performance of the method are presented.

  4. Design Models as Emergent Features: An Empirical Study in Communication and Shared Mental Models in Instructional

    Science.gov (United States)

    Botturi, Luca

    2006-01-01

    This paper reports the results of an empirical study that investigated the instructional design process of three teams involved in the development of an e-learning unit. The teams declared they were using the same fast-prototyping design and development model, and were composed of the same roles (although with a different number of SMEs).…

  5. An Empirical Study of a Solo Performance Assessment Model

    Science.gov (United States)

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  6. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    Science.gov (United States)

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-06-24

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches.
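
    The record above classifies observed behavioral traces by likelihood under probabilistic finite automata. The sketch below is a minimal illustration of that idea, not the authors' implementation: it scores a symbol sequence under two invented deterministic PFAs (a "spiral" and a "wall-following" cleaning strategy) and recognizes the strategy with the higher likelihood. All automata, symbols and probabilities are made up for the example.

```python
import math

# A deterministic PFA: for each state, a map symbol -> (probability, next_state).
# Probabilities out of each state sum to <= 1; all automata here are invented toy examples.
PFA = dict[int, dict[str, tuple[float, int]]]

def log_likelihood(pfa: PFA, sequence: list[str], start_state: int = 0) -> float:
    """Log-probability of emitting `sequence` from `start_state` (stop probability ignored)."""
    state, logp = start_state, 0.0
    for symbol in sequence:
        if symbol not in pfa[state]:
            return float("-inf")          # sequence impossible under this automaton
        prob, state = pfa[state][symbol]
        logp += math.log(prob)
    return logp

# Two toy strategies for a cleaning robot: "spiral" favors turning, "wall" favors forward moves.
spiral: PFA = {0: {"turn": (0.7, 0), "forward": (0.3, 0)}}
wall:   PFA = {0: {"turn": (0.2, 0), "forward": (0.8, 0)}}

trace = ["forward", "forward", "turn", "forward"]
scores = {"spiral": log_likelihood(spiral, trace), "wall": log_likelihood(wall, trace)}
print(max(scores, key=scores.get), scores)   # behavioral recognition = argmax likelihood
```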

  7. Empirical study of the GARCH model with rational errors

    International Nuclear Information System (INIS)

    Chen, Ting Ting; Takaishi, Tetsuya

    2013-01-01

    We use the GARCH model with a fat-tailed error distribution described by a rational function and apply it to stock price data on the Tokyo Stock Exchange. To determine the model parameters we perform Bayesian inference on the model. Bayesian inference is implemented by the Metropolis-Hastings algorithm with an adaptive multi-dimensional Student's t-proposal density. In order to compare our model with the GARCH model with standard normal errors, we calculate the information criteria AIC and DIC, and find that both criteria favor the GARCH model with a rational error distribution. We also calculate the accuracy of the volatility by using the realized volatility and find that good accuracy is obtained for the GARCH model with a rational error distribution. Thus we conclude that the GARCH model with a rational error distribution is superior to the GARCH model with normal errors, and that it can be used as an alternative to GARCH models with other fat-tailed distributions.
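
    For orientation only, the sketch below shows the general Bayesian GARCH(1,1) workflow described in the record. It substitutes a Student's t error density for the paper's rational-function density and a plain random-walk Metropolis step for the adaptive multivariate t proposal, and runs on white-noise placeholder returns, so it is an assumption-laden sketch rather than the authors' method.

```python
import numpy as np
from scipy import stats

def garch_t_loglik(params, r):
    """Log-likelihood of a GARCH(1,1) with Student-t errors (stand-in for the
    paper's rational error density). params = (omega, alpha, beta, nu)."""
    omega, alpha, beta, nu = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1 or nu <= 2:
        return -np.inf
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    scale = np.sqrt(sigma2 * (nu - 2) / nu)          # unit-variance t errors
    return stats.t.logpdf(r, df=nu, scale=scale).sum()

def metropolis_hastings(r, n_iter=3000, seed=0):
    rng = np.random.default_rng(seed)
    step = np.array([0.02, 0.02, 0.02, 0.5])         # crude random-walk step sizes
    theta = np.array([0.1 * r.var(), 0.05, 0.90, 8.0])
    logp = garch_t_loglik(theta, r)
    draws = []
    for _ in range(n_iter):
        proposal = theta + step * rng.standard_normal(4)
        logp_new = garch_t_loglik(proposal, r)
        if np.log(rng.uniform()) < logp_new - logp:   # flat prior assumed
            theta, logp = proposal, logp_new
        draws.append(theta.copy())
    return np.array(draws)

returns = np.random.default_rng(1).standard_normal(750)   # placeholder daily returns (white noise)
posterior = metropolis_hastings(returns)
print(posterior[1500:].mean(axis=0))   # posterior means of (omega, alpha, beta, nu)
```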

  8. Neural networks in economic modelling : An empirical study

    NARCIS (Netherlands)

    Verkooijen, W.J.H.

    1996-01-01

    This dissertation addresses the statistical aspects of neural networks and their usability for solving problems in economics and finance. Neural networks are discussed in a framework of modelling which is generally accepted in econometrics. Within this framework a neural network is regarded as a

  9. Empirical study on entropy models of cellular manufacturing systems

    Institute of Scientific and Technical Information of China (English)

    Zhifeng Zhang; Renbin Xiao

    2009-01-01

    From the theoretical point of view, the states of manufacturing resources can be monitored and assessed through the amount of information needed to describe their technological structure and operational state. The amount of information needed to describe cellular manufacturing systems is investigated by two measures: the structural entropy and the operational entropy. Based on the Shannon entropy, the models of the structural entropy and the operational entropy of cellular manufacturing systems are developed, and the cognizance of the states of manufacturing resources is also illustrated. Scheduling is introduced to measure the entropy models of cellular manufacturing systems, and the feasible concepts of maximum schedule horizon and schedule adherence are advanced to quantitatively evaluate the effectiveness of schedules. Finally, an example is used to demonstrate the validity of the proposed methodology.
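
    Since the record builds its measures on the Shannon entropy of resource states, a tiny illustration of that building block is sketched below; the state labels and probabilities are invented and do not come from the paper.

```python
import math

def shannon_entropy(probabilities):
    """H = -sum p_i * log2(p_i), in bits; zero-probability states contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical fractions of time a manufacturing cell spends in each operational state.
state_probs = {"processing": 0.55, "setup": 0.15, "idle": 0.20, "breakdown": 0.10}
operational_entropy = shannon_entropy(state_probs.values())
print(f"Operational entropy of the cell: {operational_entropy:.3f} bits")
```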

  10. A Trade Study of Thermosphere Empirical Neutral Density Models

    Science.gov (United States)

    2014-08-01

    The solar radio F10.7 proxy and magnetic activity measurements are used to calculate the baseline orbit. The approach is to calculate along-track errors for these models and compare them against the baseline error based on the “ground truth” neutral density data.

  11. Assessing and improving the quality of modeling : a series of empirical studies about the UML

    NARCIS (Netherlands)

    Lange, C.F.J.

    2007-01-01

    Assessing and Improving the Quality of Modeling A Series of Empirical Studies about the UML This thesis addresses the assessment and improvement of the quality of modeling in software engineering. In particular, we focus on the Unified Modeling Language (UML), which is the de facto standard in

  12. Modeling Active Aging and Explicit Memory: An Empirical Study.

    Science.gov (United States)

    Ponce de León, Laura Ponce; Lévy, Jean Pierre; Fernández, Tomás; Ballesteros, Soledad

    2015-08-01

    The rapid growth of the population of older adults and their concomitant psychological status and health needs have captured the attention of researchers and health professionals. To help fill the void of literature available to social workers interested in mental health promotion and aging, the authors provide a model for active aging that uses psychosocial variables. Structural equation modeling was used to examine the relationships among the latent variables of the state of explicit memory, the perception of social resources, depression, and the perception of quality of life in a sample of 184 older adults. The results suggest that explicit memory is not a direct indicator of the perception of quality of life, but it could be considered an indirect indicator as it is positively correlated with perception of social resources and negatively correlated with depression. These last two variables influenced the perception of quality of life directly, the former positively and the latter negatively. The main outcome suggests that the perception of social support improves explicit memory and quality of life and reduces depression in active older adults. The findings also suggest that gerontological professionals should design memory training programs, improve available social resources, and offer environments with opportunities to exercise memory.

  13. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    Science.gov (United States)

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…

  14. Integrating technology readiness into the expectation-confirmation model: an empirical study of mobile services.

    Science.gov (United States)

    Chen, Shih-Chih; Liu, Ming-Ling; Lin, Chieh-Peng

    2013-08-01

    The aim of this study was to integrate technology readiness into the expectation-confirmation model (ECM) for explaining individuals' continuance of mobile data service usage. After reviewing the ECM and technology readiness, an integrated model was demonstrated via empirical data. Compared with the original ECM, the findings of this study show that the integrated model may offer an ameliorated way to clarify what factors and how they influence the continuous intention toward mobile services. Finally, the major findings are summarized, and future research directions are suggested.

  15. Antecedents of employee electricity saving behavior in organizations: An empirical study based on norm activation model

    International Nuclear Information System (INIS)

    Zhang, Yixiang; Wang, Zhaohua; Zhou, Guanghui

    2013-01-01

    China is one of the major energy-consuming countries, and is under great pressure to promote energy saving and reduce domestic energy consumption. Employees constitute an important target group for energy saving. However, only a few research efforts have been devoted to studying what drives employee energy saving behavior in organizations. To fill this gap, drawing on the norm activation model (NAM), we built a research model to study the antecedents of employee electricity saving behavior in organizations. The model was empirically tested using survey data collected from office workers in Beijing, China. Results show that personal norm positively influences employee electricity saving behavior. Organizational electricity saving climate negatively moderates the effect of personal norm on electricity saving behavior. Awareness of consequences, ascription of responsibility, and organizational electricity saving climate positively influence personal norm. Furthermore, awareness of consequences positively influences ascription of responsibility. This paper contributes to the energy saving behavior literature by building a theoretical model of employee electricity saving behavior, which is understudied in the current literature. Based on the empirical results, implications on how to promote employee electricity saving are discussed. - Highlights: • We studied employee electricity saving behavior based on the norm activation model. • The model was tested using survey data collected from office workers in China. • Personal norm positively influences employees' electricity saving behavior. • Electricity saving climate negatively moderates personal norm's effect. • This research enhances our understanding of employee electricity saving behavior
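
    The moderation finding in this record (electricity saving climate weakening the personal-norm effect) is the kind of hypothesis usually tested with an interaction term in a regression. The sketch below shows that generic test on synthetic survey-like data; the variable names and coefficients are placeholders, not the authors' measures or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 300
df = pd.DataFrame({
    "personal_norm": rng.normal(size=n),
    "climate": rng.normal(size=n),
})
# Synthetic outcome: a main effect of personal norm that is weakened by climate (negative moderation).
df["saving_behavior"] = (0.6 * df.personal_norm + 0.2 * df.climate
                         - 0.3 * df.personal_norm * df.climate
                         + rng.normal(scale=0.5, size=n))

# The personal_norm:climate interaction term carries the moderation hypothesis.
model = smf.ols("saving_behavior ~ personal_norm * climate", data=df).fit()
print(model.summary().tables[1])
```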

  16. IT-enabled dynamic capability on performance: An empirical study of BSC model

    Directory of Open Access Journals (Sweden)

    Adilson Carlos Yoshikuni

    2017-05-01

    Few studies have investigated the influence of “information capital,” through IT-enabled dynamic capability, on corporate performance, particularly in economic turbulence. Our study investigates the causal relationship between performance perspectives of the balanced scorecard using partial least squares path modeling. Using data on 845 Brazilian companies, we conduct a quantitative empirical study of firms during an economic crisis and observe the following interesting results. Operational and analytical IT-enabled dynamic capability had positive effects on business process improvement and corporate performance. Results pertaining to mediation (endogenous variables) and moderation (control variables) clarify IT’s role in and benefits for corporate performance.

  17. A Comprehensive Comparison Study of Empirical Cutting Transport Models in Inclined and Horizontal Wells

    Directory of Open Access Journals (Sweden)

    Asep Mohamad Ishaq Shiddiq

    2017-07-01

    In deviated and horizontal drilling, hole-cleaning issues are a common and complex problem. This study explored how various drilling parameters affect the flow rate required for effective cutting transport. Three models, developed following an empirical approach, were employed: Rudi-Shindu’s model, Hopkins’, and Tobenna’s model. Rudi-Shindu’s model needs iteration in the calculation. Firstly, the three models were compared using a sensitivity analysis of drilling parameters affecting cutting transport. The result shows that the models have similar trends but different values for minimum flow velocity. Analysis was conducted to examine the feasibility of using Rudi-Shindu’s, Hopkins’, and Tobenna’s models. The result showed that Hopkins’ model is limited by cutting size and revolutions per minute (RPM). The minimum flow rate from Tobenna’s model is affected only by well inclination, drilling fluid weight and drilling fluid rheological property. Meanwhile, Rudi-Shindu’s model is limited to inclinations above 45°. The study showed that the investigated models are not suitable for horizontal wells because they do not include the effect of the lateral section.

  18. Empirical Bayes ranking and selection methods via semiparametric hierarchical mixture models in microarray studies.

    Science.gov (United States)

    Noma, Hisashi; Matsui, Shigeyuki

    2013-05-20

    The main purpose of microarray studies is screening of differentially expressed genes as candidates for further investigation. Because of limited resources in this stage, prioritizing genes is a relevant statistical task in microarray studies. For effective gene selection, parametric empirical Bayes methods for ranking and selecting genes with the largest effect sizes have been proposed (Noma et al., 2010; Biostatistics 11: 281-289). The hierarchical mixture model incorporates the differential and non-differential components and allows information borrowing across differential genes with separation from nuisance, non-differential genes. In this article, we develop empirical Bayes ranking methods via a semiparametric hierarchical mixture model. A nonparametric prior distribution, rather than parametric prior distributions, for effect sizes is specified and estimated using the "smoothing by roughening" approach of Laird and Louis (1991; Computational Statistics and Data Analysis 12: 27-37). We present applications to childhood and infant leukemia clinical studies with microarrays for exploring genes related to prognosis or disease progression. Copyright © 2012 John Wiley & Sons, Ltd.

  19. A DISTANCE EDUCATION MODEL FOR JORDANIAN STUDENTS BASED ON AN EMPIRICAL STUDY

    Directory of Open Access Journals (Sweden)

    Ahmad SHAHER MASHHOUR

    2007-04-01

    Distance education is expanding worldwide. The numbers of students enrolled in distance education are increasing at very high rates. Distance education is said to be the future of education because it addresses the educational needs of the new millennium. This paper presents the findings of an empirical study of a sample of Jordanian distance education students, organized into a requirements model that addresses the need for such education at the national level. The responses of the sample show that distance education offers a viable and satisfactory alternative to those who cannot enroll in regular residential education. The study also shows that the shortcomings of the regular and the current form of distance education in Jordan can be overcome by the use of modern information technology.

  20. Model and Empirical Study on Several Urban Public Transport Networks in China

    Science.gov (United States)

    Ding, Yimin; Ding, Zhuo

    2012-07-01

    In this paper, we present empirical investigation results on urban public transport networks (PTNs) and propose a model to understand the results obtained. We investigate several urban public traffic networks in China, namely those of Beijing, Guangzhou, Wuhan and other cities. The empirical results for the big cities show that the accumulative act-degree distributions of PTNs take neither power-function forms nor exponential-function forms, but are described by a shifted power function; the accumulative act-degree distributions of PTNs in medium-sized or small cities follow the same law. In the end, we propose a model to show a possible evolutionary mechanism for the emergence of such networks. The analytic results obtained from this model are in good agreement with the empirical results.

  1. [A competency model of rural general practitioners: theory construction and empirical study].

    Science.gov (United States)

    Yang, Xiu-Mu; Qi, Yu-Long; Shne, Zheng-Fu; Han, Bu-Xin; Meng, Bei

    2015-04-01

    To perform theory construction and empirical study of the competency model of rural general practitioners. Through literature study, job analysis, interviews, and expert team discussion, a questionnaire of rural general practitioners' competency was constructed. A total of 1458 rural general practitioners were surveyed by the questionnaire in 6 central provinces. The common factors were constructed using the principal component method of exploratory factor analysis and confirmatory factor analysis. The influence of the competency characteristics on working performance was analyzed using regression analysis. The Cronbach's alpha coefficient of the questionnaire was 0.974. The model consisted of 9 dimensions and 59 items. The 9 competency dimensions included basic public health service ability, basic clinical skills, system analysis capability, information management capability, communication and cooperation ability, occupational moral ability, non-medical professional knowledge, personal traits and psychological adaptability. The explained cumulative total variance was 76.855%. The model fit indices were χ²/df = 1.88, GFI = 0.94, NFI = 0.96, NNFI = 0.98, PNFI = 0.91, RMSEA = 0.068, CFI = 0.97, IFI = 0.97, RFI = 0.96, suggesting a good model fit. Regression analysis showed that the competency characteristics had a significant effect on job performance. The rural general practitioners' competency model provides a reference for rural doctor training, order-oriented cultivation of rural medical students, and competency-based performance management of rural general practitioners.

  2. An Evaluation Model for Sustainable Development of China’s Textile Industry: An Empirical Study

    Science.gov (United States)

    Zhao, Hong; Lu, Xiaodong; Yu, Ting; Yin, Yanbin

    2018-04-01

    With the economy’s continuous rapid growth, the textile industry is required to search for new rules and adjust strategies in order to optimize its industrial structure and rationalize social spending. The sustainable development of China’s textile industry is a comprehensive research subject. This study analyzed the status of China’s textile industry and constructed an evaluation model based on economic, ecological, and social benefits. Analytic Hierarchy Process (AHP) and Data Envelopment Analysis (DEA) were used for an empirical study of the textile industry. The result of the evaluation model suggested that the current status of the industry has become the major problem in the sustainable development of China’s textile industry. It is nearly impossible to integrate into the global economy if no measures are taken. The enterprises concerned with the textile industry status should be reformed in terms of product design, raw material selection, technological reform, technological progress, and management, in accordance with the ideas and requirements of sustainable development. The results of this study are beneficial for 1) discovering the main elements restricting the industry’s sustainable development; 2) seeking corresponding solutions for policy formulation and implementation in the textile industry; and 3) providing references for enterprises’ development transformation in strategic deployment, fund allocation, and personnel assignment.

  3. Tests of Parameters Instability: Theoretical Study and Empirical Applications on Two Types of Models (ARMA Model and Market Model)

    Directory of Open Access Journals (Sweden)

    Sahbi FARHANI

    2012-01-01

    This paper considers tests of parameters instability and structural change with known, unknown or multiple breakpoints. The results apply to a wide class of parametric models that are suitable for estimation by strong rules for detecting the number of breaks in a time series. For that, we use the Chow, CUSUM, CUSUM of squares, Wald, likelihood ratio and Lagrange multiplier tests. Each test implicitly uses an estimate of a change point. We conclude with an empirical analysis on two different models (ARMA model and simple linear regression model).
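
    The Chow test named in this record has a compact closed form for a known breakpoint. The sketch below computes that classic F-statistic for a simple linear regression on synthetic data with a break; it illustrates only this one test (not the CUSUM, Wald, LR or LM variants) and assumes homoskedastic errors.

```python
import numpy as np
from scipy import stats

def rss(y, X):
    """Residual sum of squares of an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

def chow_test(y, X, break_idx):
    """Chow F-test for a structural break at a known index (homoskedastic errors)."""
    k = X.shape[1]
    rss_pooled = rss(y, X)
    rss1, rss2 = rss(y[:break_idx], X[:break_idx]), rss(y[break_idx:], X[break_idx:])
    df2 = len(y) - 2 * k
    F = ((rss_pooled - rss1 - rss2) / k) / ((rss1 + rss2) / df2)
    return F, 1 - stats.f.cdf(F, k, df2)

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.where(x < 5, 1.0 + 0.5 * x, 3.0 + 1.5 * x) + rng.normal(scale=0.5, size=x.size)
X = np.column_stack([np.ones_like(x), x])            # intercept + slope
F, p = chow_test(y, X, break_idx=100)
print(f"Chow F = {F:.2f}, p-value = {p:.4f}")
```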

  4. Using an Empirical Binomial Hierarchical Bayesian Model as an Alternative to Analyzing Data from Multisite Studies

    Science.gov (United States)

    Hardin, J. Michael; Anderson, Billie S.; Woodby, Lesa L.; Crawford, Myra A.; Russell, Toya V.

    2008-01-01

    This article explores the statistical methodologies used in demonstration and effectiveness studies when the treatments are applied across multiple settings. The importance of evaluating and how to evaluate these types of studies are discussed. As an alternative to standard methodology, the authors of this article offer an empirical binomial…

  5. Empirical Vector Autoregressive Modeling

    NARCIS (Netherlands)

    M. Ooms (Marius)

    1993-01-01

    Chapter 2 introduces the baseline version of the VAR model, with its basic statistical assumptions that we examine in the sequel. We first check whether the variables in the VAR can be transformed to meet these assumptions. We analyze the univariate characteristics of the series.

  6. Empirical Testing of a Theoretical Extension of the Technology Acceptance Model: An Exploratory Study of Educational Wikis

    Science.gov (United States)

    Liu, Xun

    2010-01-01

    This study extended the technology acceptance model and empirically tested the new model with wikis, a new type of educational technology. Based on social cognitive theory and the theory of planned behavior, three new variables, wiki self-efficacy, online posting anxiety, and perceived behavioral control, were added to the original technology…

  7. Time-varying volatility in Malaysian stock exchange: An empirical study using multiple-volatility-shift fractionally integrated model

    Science.gov (United States)

    Cheong, Chin Wen

    2008-02-01

    This article investigated the influences of structural breaks on the fractionally integrated time-varying volatility model in the Malaysian stock markets which included the Kuala Lumpur composite index and four major sectoral indices. A fractionally integrated time-varying volatility model combined with sudden changes is developed to study the possibility of structural change in the empirical data sets. Our empirical results showed substantial reduction in fractional differencing parameters after the inclusion of structural change during the Asian financial and currency crises. Moreover, the fractionally integrated model with sudden change in volatility performed better in the estimation and specification evaluations.

  8. Antecedents and Consequences of Individual Performance Analysis of Turnover Intention Model (Empirical Study of Public Accountants in Indonesia)

    OpenAIRE

    Raza, Hendra; Maksum, Azhar; Erlina; Lumban Raja, Prihatin

    2014-01-01

    This study aims to examine empirically the antecedents of individual performance and its consequences for turnover intention in public accounting firms. There are eight variables measured, consisting of auditors' empowerment, innovation professionalism, role ambiguity, role conflict, organizational commitment, individual performance and turnover intention. Data analysis is based on 163 public accountants using Structural Equation Modeling assisted with an appli...

  9. Comparative approaches from empirical to mechanistic simulation modelling in Land Evaluation studies

    Science.gov (United States)

    Manna, P.; Basile, A.; Bonfante, A.; Terribile, F.

    2009-04-01

    Land Evaluation (LE) comprises the evaluation procedures used to assess the aptitude of the land for a generic or specific use (e.g. biomass production). From the local to the regional and national scale, the approach to land use planning requires a deep knowledge of the processes that drive the functioning of the soil-plant-atmosphere system. According to the classical approaches, the assessment of aptitude is the result of a qualitative comparison between the land/soil physical properties and the land use requirements. These approaches are quick and inexpensive to apply; however, they are based on empirical and qualitative models with a basic knowledge structure built for a specific landscape and for the specific object of the evaluation (e.g. crop). The outcome of this situation is great difficulty in spatially extrapolating the LE results and rigidity of the system. Modern techniques instead rely on the application of mechanistic and quantitative simulation modelling that allows a dynamic characterisation of the interrelated physical and chemical processes taking place in the soil landscape. Moreover, the insertion of physically based rules in the LE procedure may make it less difficult both to extend the results spatially and to change the object (e.g. crop species, nitrate dynamics, etc.) of the evaluation. On the other hand, these modern approaches require input data of high quality and quantity, which causes a significant increase in costs. In this scenario the LE expert is nowadays asked to choose the best LE methodology considering costs, complexity of the procedure and benefits in handling a specific land evaluation. In this work we performed a forage maize land suitability study by comparing 9 different methods of increasing complexity and cost. The study area, of about 2000 ha, is located in North Italy in the Lodi plain (Po valley). The 9 employed methods ranged from standard LE approaches to

  10. An empirical model for the study of employee paticipation and its influence on job satisfaction

    Directory of Open Access Journals (Sweden)

    Lucas Joan Pujol Cols

    2015-12-01

    This article analyzes the factors that influence employees' perceived possibilities of triggering meaningful participation at three levels: the intra-group level, the institutional level, and directly in the leadership team of the organization. Twelve (12) interviews were conducted with teachers from the Social and Economic Sciences School of the University of Mar del Plata (Argentina), holding different positions, areas and working hours. Based on the qualitative evidence, an empirical model was constructed that connects the different factors for each manifestation of participation, establishing hypothetical relations between subgroups. Additionally, this article discusses the implications of participation, its relationship with job satisfaction, and the role of individual expectations regarding the participation opportunities that each employee receives. Keywords: Participation, Job satisfaction, University, Expectations, Qualitative Analysis.

  11. Empirical tests of the Chicago model and the Easterlin hypothesis: a case study of Japan.

    Science.gov (United States)

    Ohbuchi, H

    1982-05-01

    The objective of this discussion is to test the applicability of the economic theory of fertility with special reference to postwar Japan and to find a clue for forecasting the future trend of fertility. The theories examined are the "Chicago model" and the "Easterlin hypothesis." The major conclusion common among the leading economic theories of fertility, which have their origin with Gary S. Becker (1960, 1965) and Richard A. Easterlin (1966), is the positive income effect, i.e., that the relationship between income and fertility is positive despite the evidence that higher income families have fewer children and that fertility has declined with economic development. To bridge the gap between theory and fact is the primary purpose of the economic theory of fertility, and each offers a different interpretation for it. The point of the Chicago model, particularly of the household decision making model of the "new home economics," is the mechanism by which a positive effect of husband's income growth on fertility is offset by a negative price effect caused by the opportunity cost of wife's time. While the opportunity cost of wife's time is independent of the female wage rate for an unemployed wife, it is directly associated with the wage rate for a gainfully employed wife. Thus, the fertility response to female wages occurs only among families with an employed wife. The primary concern of empirical efforts to test the Chicago model has been with the determination of income and price elasticities. An attempt is made to test the relevance of the Chicago model and the Easterlin hypothesis in explaining the fertility movement in postwar Japan. In the case of the Chicago model, the statistical results appeared fairly successful but did not match the theory. The effect on fertility of a rise in women's real wage (and, therefore, in the opportunity cost of mother's time) and of a rise in the labor force participation rate of married women of childbearing age in recent years could not

  12. Empirical Model Building Data, Models, and Reality

    CERN Document Server

    Thompson, James R

    2011-01-01

    Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m

  13. Empirical questions for collective-behaviour modelling

    Indian Academy of Sciences (India)

    The collective behaviour of groups of social animals has been an active topic of study ... Models have been successful at reproducing qualitative features of ... quantitative and detailed empirical results for a range of animal systems. ... standard method [23], the redundant information recorded by the cameras can be used to.

  14. An Empirical Rate Constant Based Model to Study Capacity Fading in Lithium Ion Batteries

    Directory of Open Access Journals (Sweden)

    Srivatsan Ramesh

    2015-01-01

    A one-dimensional model based on solvent diffusion and kinetics is developed to study the formation of the SEI (solid electrolyte interphase) layer and its impact on the capacity of a lithium ion battery. The model uses the earlier work on silicon oxidation but studies the kinetic limitations of the SEI growth process. The rate constant of the SEI formation reaction at the anode is seen to play a major role in film formation. The kinetics of the capacity fading reactions for various battery systems are studied and the rate constants are evaluated. The model is used to fit the capacity fade in different battery systems.

  15. Reviewing the effort-reward imbalance model: drawing up the balance of 45 empirical studies

    NARCIS (Netherlands)

    Vegchel, van N.; Jonge, de J.; Bosma, H.; Schaufeli, W.B.

    2005-01-01

    The present paper provides a review of 45 studies on the Effort–Reward Imbalance (ERI) Model published from 1986 to 2003 (inclusive). In 1986, the ERI Model was introduced by Siegrist et al. (Biological and Psychological Factors in Cardiovascular Disease, Springer, Berlin, 1986, pp. 104–126; Social

  16. School leadership effects revisited: a review of empirical studies guided by indirect-effect models

    NARCIS (Netherlands)

    Hendriks, Maria A.; Scheerens, Jaap

    2013-01-01

    Fourteen leadership effect studies that used indirect-effect models were quantitatively analysed to explore the most promising mediating variables. The results indicate that total effect sizes based on indirect-effect studies appear to be low, quite comparable to the results of some meta-analyses of

  17. An Empirical Study of Kirkpatrick's Evaluation Model in the Hospitality Industry

    Science.gov (United States)

    Chang, Ya-Hui Elegance

    2010-01-01

    This study examined Kirkpatrick's training evaluation model (Kirkpatrick & Kirkpatrick, 2006) by assessing a sales training program conducted at an organization in the hospitality industry. The study assessed the employees' training outcomes of knowledge and skills, job performance, and the impact of the training upon the organization. By…

  18. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies

    DEFF Research Database (Denmark)

    Thompson, Wesley K.; Wang, Yunpeng; Schork, Andrew J.

    2015-01-01

    We propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, and the local false discovery rate. We examine the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn’s disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates, while this mixture model underestimates the tails of the CD effect size distribution.

  19. Customer orientation on online newspaper business models with paid content strategies: An empirical study

    OpenAIRE

    Goyanes, Manuel; Sylvie, George

    2014-01-01

    This study examines the transformations that trigger business models with paid content strategies in news organizations under the theoretical framework of market orientation. The results show three main factors: those related to competence, to the organizational culture, and to the understanding of the needs and wants of the audience. The findings also suggest that online newspapers' business models with paid content strategies are more like experiments or forays than definitive methods that monet...

  20. An Empirical Study on the Preference of Supermarkets with Analytic Hierarchy Process Model

    Science.gov (United States)

    Weng Siew, Lam; Singh, Ranjeet; Singh, Bishan; Weng Hoe, Lam; Kah Fai, Liew

    2018-04-01

    Large-scale retailers are very important to consumers in this fast-paced world. Selecting a desirable market in which to purchase products and services has become a major concern among consumers in their daily life due to the vast choices available. Therefore, the objective of this paper is to determine the most preferred supermarket among AEON, Jaya Grocer, Tesco, Giant and Econsave by undergraduate students in Malaysia with the Analytic Hierarchy Process (AHP) model. Besides that, this study also aims to determine the priority of decision criteria in the selection of supermarkets among the undergraduate students with the AHP model. The decision criteria employed in this study are product quality, competitive price, cleanliness, product variety, location, good price labelling, fast checkout and employee courtesy. The results of this study show that AEON is the most preferred supermarket, followed by Jaya Grocer, Tesco, Econsave and Giant among the students based on the AHP model. Product quality, cleanliness and competitive price are ranked as the top three influential factors in this study. This study is significant because it helps to determine the most preferred supermarket as well as the most influential decision criteria in the preference of supermarkets among undergraduate students with the AHP model.
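
    The AHP workflow used in this record reduces pairwise judgments to priority weights plus a consistency check. The sketch below shows that mechanism for three of the listed criteria, using the geometric-mean approximation of the principal eigenvector and Saaty's consistency ratio; the pairwise judgments are invented, not the study's data.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights via the geometric-mean (approximate eigenvector) method,
    plus Saaty's consistency ratio for an n x n reciprocal comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    w = np.prod(A, axis=1) ** (1.0 / n)
    w /= w.sum()
    lambda_max = (A @ w / w).mean()                 # estimate of the principal eigenvalue
    ci = (lambda_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 8: 1.41}.get(n, 1.45)   # Saaty random indices
    return w, ci / ri

# Hypothetical judgments: product quality vs. competitive price vs. cleanliness.
pairwise = [[1,   3,   2],
            [1/3, 1,   1/2],
            [1/2, 2,   1]]
weights, cr = ahp_weights(pairwise)
print("criterion weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```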

  1. Biomass viability: An experimental study and the development of an empirical mathematical model for submerged membrane bioreactor.

    Science.gov (United States)

    Zuthi, M F R; Ngo, H H; Guo, W S; Nghiem, L D; Hai, F I; Xia, S Q; Zhang, Z Q; Li, J X

    2015-08-01

    This study investigates the influence of key biomass parameters on specific oxygen uptake rate (SOUR) in a sponge submerged membrane bioreactor (SSMBR) to develop mathematical models of biomass viability. Extra-cellular polymeric substances (EPS) were considered as a lumped parameter of bound EPS (bEPS) and soluble microbial products (SMP). Statistical analyses of experimental results indicate that the bEPS, SMP, mixed liquor suspended solids and volatile suspended solids (MLSS and MLVSS) have functional relationships with SOUR and their relative influence on SOUR was in the order of EPS>bEPS>SMP>MLVSS/MLSS. Based on correlations among biomass parameters and SOUR, two independent empirical models of biomass viability were developed. The models were validated using results of the SSMBR. However, further validation of the models for different operating conditions is suggested. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Predictive time-series modeling using artificial neural networks for Linac beam symmetry: an empirical study.

    Science.gov (United States)

    Li, Qiongge; Chan, Maria F

    2017-01-01

    Over half of cancer patients receive radiotherapy (RT) as partial or full cancer treatment. Daily quality assurance (QA) of RT in cancer treatment closely monitors the performance of the medical linear accelerator (Linac) and is critical for continuous improvement of patient safety and quality of care. Cumulative longitudinal QA measurements are valuable for understanding the behavior of the Linac and allow physicists to identify trends in the output and take preventive actions. In this study, artificial neural networks (ANNs) and autoregressive moving average (ARMA) time-series prediction modeling techniques were both applied to 5-year daily Linac QA data. Verification tests and other evaluations were then performed for all models. Preliminary results showed that ANN time-series predictive modeling has more advantages over ARMA techniques for accurate and effective applicability in the dosimetry and QA field. © 2016 New York Academy of Sciences.
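
    As a rough illustration of the ANN time-series workflow described above (and not the authors' data or architecture), the sketch below trains a small neural network on lagged values of a synthetic daily QA-like series to predict the next day's value; the drift, seasonality and noise levels are made up.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(7)
# Synthetic daily "beam symmetry" series: slow drift plus seasonality and noise (placeholder data).
days = np.arange(1500)
series = 100 + 0.001 * days + 0.3 * np.sin(2 * np.pi * days / 365) + rng.normal(scale=0.1, size=days.size)

def lagged_matrix(y, n_lags=7):
    """Rows of n_lags consecutive values; target is the value on the following day."""
    X = np.column_stack([y[i:len(y) - n_lags + i] for i in range(n_lags)])
    return X, y[n_lags:]

X, y = lagged_matrix(series)
split = int(0.8 * len(y))
ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
ann.fit(X[:split], y[:split])
pred = ann.predict(X[split:])
print("one-step-ahead MAE:", round(mean_absolute_error(y[split:], pred), 4))
```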

  3. A Structural Model of Business Performance: An Empirical Study on Tobacco Farmers

    Directory of Open Access Journals (Sweden)

    Sony Heru Priyanto

    2006-01-01

    The results of the analysis indicate that factors like personal aspects, together with physical, economic and institutional environments, affect farmers’ entrepreneurship. Personal aspects turn out to be the dominant factor that determines entrepreneurship and farm performance. This study also shows that farmers’ entrepreneurship is affected by their management capacity, which, in turn, affects the farmers’ farm performance. While there is no doubt in the adequacy of the model to estimate farm performance, this finding invites further investigation to validate it in other fields and scale of business, such as in small and medium enterprises and other companies. Furthermore, in order to evaluate the goodness of fit of the model in various contexts, further research both in a cross-cultural context and cross-national contexts using this model should be conducted.

  4. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies.

    Directory of Open Access Journals (Sweden)

    Wesley K Thompson

    2015-12-01

    Characterizing the distribution of effects from genome-wide genotyping data is crucial for understanding important aspects of the genetic architecture of complex traits, such as number or proportion of non-null loci, average proportion of phenotypic variance explained per non-null effect, power for discovery, and polygenic risk prediction. To this end, previous work has used effect-size models based on various distributions, including the normal and normal mixture distributions, among others. In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local false discovery rate, and power for discovery of a specified proportion of phenotypic variance explained from additive effects of loci surpassing a given significance threshold. We also examine the crucial issue of the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While capturing the general behavior of the data, this mixture model underestimates the tails of the CD effect size distribution. We discuss the

  5. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies.

    Science.gov (United States)

    Thompson, Wesley K; Wang, Yunpeng; Schork, Andrew J; Witoelar, Aree; Zuber, Verena; Xu, Shujing; Werge, Thomas; Holland, Dominic; Andreassen, Ole A; Dale, Anders M

    2015-12-01

    Characterizing the distribution of effects from genome-wide genotyping data is crucial for understanding important aspects of the genetic architecture of complex traits, such as number or proportion of non-null loci, average proportion of phenotypic variance explained per non-null effect, power for discovery, and polygenic risk prediction. To this end, previous work has used effect-size models based on various distributions, including the normal and normal mixture distributions, among others. In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local false discovery rate, and power for discovery of a specified proportion of phenotypic variance explained from additive effects of loci surpassing a given significance threshold. We also examine the crucial issue of the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While capturing the general behavior of the data, this mixture model underestimates the tails of the CD effect size distribution. We discuss the implications of
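
    The mixture form described in these records (two zero-mean normals with different variances) can also be fit by a generic EM algorithm, which is simpler than the paper's discrepancy-minimization against nonparametric replication estimates. The sketch below uses that EM alternative on synthetic z-scores, so it illustrates the model family rather than the authors' fitting procedure.

```python
import numpy as np
from scipy import stats

def fit_scale_mixture(z, n_iter=200):
    """EM for z ~ (1 - pi1) * N(0, s0^2) + pi1 * N(0, s1^2), both components zero-mean.
    A generic fitting route; the paper instead matches nonparametric replication estimates."""
    pi1, s0, s1 = 0.05, 1.0, 3.0                      # starting values
    for _ in range(n_iter):
        f0 = (1 - pi1) * stats.norm.pdf(z, scale=s0)
        f1 = pi1 * stats.norm.pdf(z, scale=s1)
        gamma = f1 / (f0 + f1)                        # posterior probability of being non-null
        pi1 = gamma.mean()
        s1 = np.sqrt(np.sum(gamma * z**2) / np.sum(gamma))
        s0 = np.sqrt(np.sum((1 - gamma) * z**2) / np.sum(1 - gamma))
    return pi1, s0, s1, gamma

rng = np.random.default_rng(3)
z = np.concatenate([rng.normal(0, 1, 9000), rng.normal(0, 3, 1000)])   # 10% non-null, synthetic
pi1, s0, s1, gamma = fit_scale_mixture(z)
local_fdr = 1 - gamma                                 # local false discovery rate per test statistic
print(f"estimated non-null proportion {pi1:.3f}, s0 {s0:.2f}, s1 {s1:.2f}")
```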

  6. Blog acceptance model: An empirical study on exploring users’ acceptance and continual usage of blogs

    Institute of Scientific and Technical Information of China (English)

    ZHAO; Yuxiang; ZHU; Qinghua

    2009-01-01

    Blogs have permeated into our daily lives at a fast speed, and various kinds of blog spaces have attracted our attention. However, little effort has been made to study users' motivation to participate in blog activities. This paper aims to construct a theoretical model of blog adoption based on the technology acceptance model (TAM), social capital theory and social exchange theory, and puts forward 18 related hypotheses. The survey method is then adopted to analyze the data from 208 questionnaires using the SPSS and LISREL tools, and to examine the theoretical model and hypotheses. Finally, the paper discusses the results of the data analysis from five aspects: individual driving factors, group driving factors, community driving factors, technology acceptance factors and moderating variables. The results show that curiosity/enjoyment, user experience, social interaction and social identification greatly affect users' motivation to accept a blog; meanwhile, perceived ease of use, exchange cost and trust partially influence users' intention to participate in blog activities. The results also suggest that age and education degree have significant moderating effects on users' acceptance and updating of blogs.

  7. Empirically Based Composite Fracture Prediction Model From the Global Longitudinal Study of Osteoporosis in Postmenopausal Women (GLOW)

    Science.gov (United States)

    Compston, Juliet E.; Chapurlat, Roland D.; Pfeilschifter, Johannes; Cooper, Cyrus; Hosmer, David W.; Adachi, Jonathan D.; Anderson, Frederick A.; Díez-Pérez, Adolfo; Greenspan, Susan L.; Netelenbos, J. Coen; Nieves, Jeri W.; Rossini, Maurizio; Watts, Nelson B.; Hooven, Frederick H.; LaCroix, Andrea Z.; March, Lyn; Roux, Christian; Saag, Kenneth G.; Siris, Ethel S.; Silverman, Stuart; Gehlbach, Stephen H.

    2014-01-01

    Context: Several fracture prediction models that combine fractures at different sites into a composite outcome are in current use. However, to the extent individual fracture sites have differing risk factor profiles, model discrimination is impaired. Objective: The objective of the study was to improve model discrimination by developing a 5-year composite fracture prediction model for fracture sites that display similar risk profiles. Design: This was a prospective, observational cohort study. Setting: The study was conducted at primary care practices in 10 countries. Patients: Women aged 55 years or older participated in the study. Intervention: Self-administered questionnaires collected data on patient characteristics, fracture risk factors, and previous fractures. Main Outcome Measure: The main outcome is time to first clinical fracture of hip, pelvis, upper leg, clavicle, or spine, each of which exhibits a strong association with advanced age. Results: Of four composite fracture models considered, model discrimination (c index) is highest for an age-related fracture model (c index of 0.75, 47 066 women), and lowest for Fracture Risk Assessment Tool (FRAX) major fracture and a 10-site model (c indices of 0.67 and 0.65). The unadjusted increase in fracture risk for an additional 10 years of age ranges from 80% to 180% for the individual bones in the age-associated model. Five other fracture sites not considered for the age-associated model (upper arm/shoulder, rib, wrist, lower leg, and ankle) have age associations for an additional 10 years of age from a 10% decrease to a 60% increase. Conclusions: After examining results for 10 different bone fracture sites, advanced age appeared the single best possibility for uniting several different sites, resulting in an empirically based composite fracture risk model. PMID:24423345
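
    Model discrimination in this record is summarized by the c index. As a small, generic illustration of that metric (not the GLOW analysis), the sketch below computes a Harrell-style concordance index for right-censored time-to-fracture data with invented follow-up times, event indicators and risk scores.

```python
import numpy as np

def concordance_index(time, event, risk):
    """Harrell-style c index: among comparable pairs (the earlier time had an event),
    count pairs where the higher predicted risk failed first; risk ties count 0.5."""
    time, event, risk = map(np.asarray, (time, event, risk))
    concordant, comparable = 0.0, 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue
        for j in range(n):
            if time[i] < time[j]:                # i fractured before j was last observed
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

# Toy follow-up times (years), event indicators (1 = fracture), and model risk scores.
time = [1.2, 3.4, 0.8, 5.0, 2.2, 4.1]
event = [1, 0, 1, 0, 1, 1]
risk = [0.8, 0.2, 0.9, 0.1, 0.5, 0.4]
print("c index:", round(concordance_index(time, event, risk), 2))
```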

  8. An Empirical Study of Propagation Models for Wireless Communications in Open-pit Mines

    DEFF Research Database (Denmark)

    Portela Lopes de Almeida, Erika; Caldwell, George; Rodriguez Larrad, Ignacio

    2018-01-01

    In this paper, we investigate the suitability of the ITU-R 526, Okumura-Hata and COST-Hata models and the Standard Propagation Model (SPM) for predicting the path loss in open-pit mines. The models are evaluated by comparing the predicted data with measurements obtained in two operational

  9. An Empirical Study of Efficiency and Accuracy of Probabilistic Graphical Models

    DEFF Research Database (Denmark)

    Nielsen, Jens Dalgaard; Jaeger, Manfred

    2006-01-01

    In this paper we compare Naïve Bayes (NB) models, general Bayes Net (BN) models and Probabilistic Decision Graph (PDG) models w.r.t. accuracy and efficiency. As the basis for our analysis we use graphs of size vs. likelihood that show the theoretical capabilities of the models. We also measure...

  10. Fitting non-gaussian Models to Financial data: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Pablo Olivares

    2011-04-01

    This paper presents some experiences in modeling financial data with three classes of models as alternatives to Gaussian linear models. Dynamic Volatility, Stable Lévy and Diffusion with Jumps models are considered. The techniques are illustrated with some examples of financial series on currencies, futures and indexes.

  11. Modeling Zero-Inflated and Overdispersed Count Data: An Empirical Study of School Suspensions

    Science.gov (United States)

    Desjardins, Christopher David

    2016-01-01

    The purpose of this article is to develop a statistical model that best explains variability in the number of school days suspended. Number of school days suspended is a count variable that may be zero-inflated and overdispersed relative to a Poisson model. Four models were examined: Poisson, negative binomial, Poisson hurdle, and negative…
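
    The model comparison described above (Poisson vs. overdispersed alternatives) can be mimicked on synthetic counts. The sketch below fits Poisson and negative binomial regressions with statsmodels and compares AIC and the predicted share of zeros; hurdle and zero-inflated variants, which the record also examines, would be the next step. The data-generating parameters are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 1000
x = rng.normal(size=n)
mu = np.exp(0.3 + 0.8 * x)
# Overdispersed counts: negative binomial with shape 1.5 and mean mu (numpy parametrization).
y = rng.negative_binomial(n=1.5, p=1.5 / (1.5 + mu))
X = sm.add_constant(x)

poisson_fit = sm.Poisson(y, X).fit(disp=False)
negbin_fit = sm.NegativeBinomial(y, X).fit(disp=False)

print("AIC  Poisson:", round(poisson_fit.aic, 1), " NegBin:", round(negbin_fit.aic, 1))
print("observed share of zeros :", round(np.mean(y == 0), 3))
print("Poisson-predicted zeros :", round(np.mean(np.exp(-poisson_fit.predict(X))), 3))
```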

  12. Empirical study of long-range connections in a road network offers new ingredient for navigation optimization models

    International Nuclear Information System (INIS)

    Wang, Pu; Liu, Like; Li, Xiamiao; Li, Guanliang; González, Marta C

    2014-01-01

    Navigation problem in lattices with long-range connections has been widely studied to understand the design principles for optimal transport networks; however, the travel cost of long-range connections was not considered in previous models. We define long-range connection in a road network as the shortest path between a pair of nodes through highways and empirically analyze the travel cost properties of long-range connections. Based on the maximum speed allowed in each road segment, we observe that the time needed to travel through a long-range connection has a characteristic time T_h ∼ 29 min, while the time required when using the alternative arterial road path has two different characteristic times T_a ∼ 13 and 41 min and follows a power law for times larger than 50 min. Using daily commuting origin–destination matrix data, we additionally find that the use of long-range connections helps people to save about half of the travel time in their daily commute. Based on the empirical results, we assign a more realistic travel cost to long-range connections in two-dimensional square lattices, observing dramatically different minimum average shortest path 〈l〉 but similar optimal navigation conditions. (paper)
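
    To make the lattice experiment above concrete, the sketch below builds a square lattice in networkx, adds a few random long-range "highway" links carrying a fixed time cost (loosely echoing the ~29 min characteristic time reported in the record), and compares the average shortest travel time with and without them. Lattice size, counts and costs are illustrative choices, not the paper's setup.

```python
import networkx as nx
import random

def lattice_with_shortcuts(side=20, n_shortcuts=40, local_time=5.0, shortcut_time=29.0, seed=0):
    """Square lattice with fixed-cost local roads plus random long-range 'highway' links."""
    rng = random.Random(seed)
    G = nx.grid_2d_graph(side, side)
    nx.set_edge_attributes(G, local_time, "time")     # every local edge takes local_time minutes
    nodes = list(G.nodes)
    for _ in range(n_shortcuts):
        u, v = rng.sample(nodes, 2)
        G.add_edge(u, v, time=shortcut_time)          # long-range link with its own travel cost
    return G

base = lattice_with_shortcuts(n_shortcuts=0)
wired = lattice_with_shortcuts(n_shortcuts=40)
print("avg travel time, no shortcuts  :", round(nx.average_shortest_path_length(base, weight="time"), 1))
print("avg travel time, with shortcuts:", round(nx.average_shortest_path_length(wired, weight="time"), 1))
```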

  13. Empirical study of long-range connections in a road network offers new ingredient for navigation optimization models

    Science.gov (United States)

    Wang, Pu; Liu, Like; Li, Xiamiao; Li, Guanliang; González, Marta C.

    2014-01-01

    Navigation problem in lattices with long-range connections has been widely studied to understand the design principles for optimal transport networks; however, the travel cost of long-range connections was not considered in previous models. We define long-range connection in a road network as the shortest path between a pair of nodes through highways and empirically analyze the travel cost properties of long-range connections. Based on the maximum speed allowed in each road segment, we observe that the time needed to travel through a long-range connection has a characteristic time Th ˜ 29 min, while the time required when using the alternative arterial road path has two different characteristic times Ta ˜ 13 and 41 min and follows a power law for times larger than 50 min. Using daily commuting origin-destination matrix data, we additionally find that the use of long-range connections helps people to save about half of the travel time in their daily commute. Based on the empirical results, we assign a more realistic travel cost to long-range connections in two-dimensional square lattices, observing dramatically different minimum average shortest path but similar optimal navigation conditions.

  14. The social networking application success model : An empirical study of Facebook and Twitter

    NARCIS (Netherlands)

    Ou, Carol; Davison, R.M.; Huang, Q.

    2016-01-01

    Social networking applications (SNAs) are among the fastest growing web applications of recent years. In this paper, we propose a causal model to assess the success of SNAs, grounded on DeLone and McLean’s updated information systems (IS) success model. In addition to their original three dimensions

  15. The role of production and teamwork practices in construction safety: a cognitive model and an empirical case study.

    Science.gov (United States)

    Mitropoulos, Panagiotis Takis; Cupido, Gerardo

    2009-01-01

    In construction, the challenge for researchers and practitioners is to develop work systems (production processes and teams) that can achieve high productivity and high safety at the same time. However, construction accident causation models ignore the role of work practices and teamwork. This study investigates the mechanisms by which production and teamwork practices affect the likelihood of accidents. The paper synthesizes a new model for construction safety based on the cognitive perspective (Fuller's Task-Demand-Capability Interface model, 2005) and then presents an exploratory case study. The case study investigates and compares the work practices of two residential framing crews: a 'High Reliability Crew' (HRC)--that is, a crew with exceptional productivity and safety over several years, and an average performing crew from the same company. The model explains how the production and teamwork practices generate the work situations that workers face (the task demands) and affect the workers ability to cope (capabilities). The case study indicates that the work practices of the HRC directly influence the task demands and match them with the applied capabilities. These practices were guided by the 'principle' of avoiding errors and rework and included work planning and preparation, work distribution, managing the production pressures, and quality and behavior monitoring. The Task Demand-Capability model links construction research to a cognitive model of accident causation and provides a new way to conceptualize safety as an emergent property of the production practices and teamwork processes. The empirical evidence indicates that the crews' work practices and team processes strongly affect the task demands, the applied capabilities, and the match between demands and capabilities. The proposed model and the exploratory case study will guide further discovery of work practices and teamwork processes that can increase both productivity and safety in construction

  16. On Feature Relevance in Image-Based Prediction Models: An Empirical Study

    DEFF Research Database (Denmark)

    Konukoglu, E.; Ganz, Melanie; Van Leemput, Koen

    2013-01-01

    Determining disease-related variations of the anatomy and function is an important step in better understanding diseases and developing early diagnostic systems. In particular, image-based multivariate prediction models and the “relevant features” they produce are attracting attention from the co...

  17. Analyzing the business model for mobile payment from banks' perspective : An empirical study

    NARCIS (Netherlands)

    Guo, J.; Nikou, S.; Bouwman, W.A.G.A.

    2013-01-01

    The increasing number of smart phones presents a significant opportunity for the development of m-payment services. Despite the predicted success of m-payment, the market remains immature in most countries. This can be explained by the lack of agreement on standards and business models for all

  18. Testing an integrated model of operations capabilities An empirical study of Australian airlines

    NARCIS (Netherlands)

    Nand, Alka Ashwini; Singh, Prakash J.; Power, Damien

    2013-01-01

    Purpose - The purpose of this paper is to test the integrated model of operations strategy as proposed by Schmenner and Swink to explain whether firms trade-off or accumulate capabilities, taking into account their positions relative to their asset and operating frontiers.

  19. Developing a Model for Agile Supply: an Empirical Study from Iranian Pharmaceutical Supply Chain

    Science.gov (United States)

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

    Agility is the fundamental characteristic of a supply chain needed for survival in turbulent markets, where environmental forces create additional uncertainty resulting in higher risk in the supply chain management. In addition, agility helps providing the right product, at the right time to the consumer. The main goal of this research is therefore to promote supplier selection in pharmaceutical industry according to the formative basic factors. Moreover, this paper can configure its supply network to achieve the agile supply chain. The present article analyzes the supply part of supply chain based on SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). This methodology provides an analytical modeling; the model enables potential suppliers to be assessed against the multiple criteria using both quantitative and qualitative measures. In addition, for making priority of critical factors, TOPSIS algorithm has been used as a common technique of MADM model. Finally, several factors such as delivery speed, planning and reorder segmentation, trust development and material quantity adjustment are identified and prioritized as critical factors for being agile in supply of API. PMID:24250689

  20. Modeling code-interactions in bilingual word recognition: Recent empirical studies and simulations with BIA+

    NARCIS (Netherlands)

    Lam, K.J.Y.; Dijkstra, A.F.J.

    2010-01-01

    Daily conversations contain many repetitions of identical and similar word forms. For bilinguals, the words can even come from the same or different languages. How do such repetitions affect the human word recognition system? The Bilingual Interactive Activation Plus (BIA+) model provides a

  1. Developing a model for agile supply: an empirical study from Iranian pharmaceutical supply chain.

    Science.gov (United States)

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

    Agility is the fundamental characteristic of a supply chain needed for survival in turbulent markets, where environmental forces create additional uncertainty resulting in higher risk in the supply chain management. In addition, agility helps providing the right product, at the right time to the consumer. The main goal of this research is therefore to promote supplier selection in pharmaceutical industry according to the formative basic factors. Moreover, this paper can configure its supply network to achieve the agile supply chain. The present article analyzes the supply part of supply chain based on SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). This methodology provides an analytical modeling; the model enables potential suppliers to be assessed against the multiple criteria using both quantitative and qualitative measures. In addition, for making priority of critical factors, TOPSIS algorithm has been used as a common technique of MADM model. Finally, several factors such as delivery speed, planning and reorder segmentation, trust development and material quantity adjustment are identified and prioritized as critical factors for being agile in supply of API.
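
    Since this record names TOPSIS as the ranking technique for supplier selection, the following is a compact TOPSIS sketch over a hypothetical decision matrix. The supplier scores, weights and criteria directions are invented for illustration; they are not the paper's data.

```python
# Sketch: TOPSIS ranking of candidate API suppliers on hypothetical criteria scores.
# The decision matrix, weights, and criteria directions are illustrative only.
import numpy as np

X = np.array([[7, 9, 6, 3],      # rows: suppliers; cols: delivery speed, trust,
              [8, 6, 7, 4],      # quantity adjustment, cost
              [6, 8, 9, 2]], dtype=float)
w = np.array([0.35, 0.25, 0.25, 0.15])
benefit = np.array([True, True, True, False])   # cost is a "smaller is better" criterion

V = w * X / np.linalg.norm(X, axis=0)           # weighted, vector-normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_best  = np.linalg.norm(V - ideal, axis=1)
d_worst = np.linalg.norm(V - anti, axis=1)
closeness = d_worst / (d_best + d_worst)        # higher = closer to the ideal supplier
print("closeness scores:", closeness.round(3), "ranking:", closeness.argsort()[::-1])
```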

  2. Molecular models of zinc phthalocyanines: semi-empirical molecular orbital computations and physicochemical properties studied by molecular mechanics simulations

    International Nuclear Information System (INIS)

    Gantchev, Tsvetan G.; van Lier, Johan E.; Hunting, Darel J.

    2005-01-01

    To build 3D-molecular models of Zinc-phthalocyanines (ZnPc) and to study their diverse chemical and photosensitization properties, we performed quantum mechanical molecular orbital (MO) semi-empirical (AM1) computations of the ground, excited singlet and triplet states as well as free radical (ionic) species. RHF and UHF (open shell) geometry optimizations led to near-perfect symmetrical ZnPc. Predicted ionization potentials (IP), electron affinities (EA) and lowest electronic transitions of ZnPc are in good agreement with the published experimental and theoretical data. The computation-derived D4h/D2h-symmetry 3D-structures of ground and excited states and free radicals of ZnPc, together with the frontier orbital energies and Mulliken electron population analysis, enabled us to build robust molecular models. These models were used to predict important chemical-reactivity entities such as global electronegativity (χ), hardness (η) and local softness based on Fukui-function analysis. Examples of molecular mechanics (MM) applications of the 3D-molecular models are presented as approaches to evaluate the solvation free energy (ΔG0)solv and to estimate ground- and excited-state oxidation/reduction potentials as well as intermolecular interactions and stability of ground and excited state dimers (exciplexes) and radical ion-pairs.
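
    The global reactivity descriptors mentioned in this record follow, in the common finite-difference convention, directly from the computed IP and EA. A tiny sketch with placeholder values (not the paper's AM1 results) is given below; note that other conventions for hardness and softness differ by a factor of two.

```python
# Sketch: global reactivity descriptors from computed IP and EA
# (finite-difference approximation); the numbers below are placeholders,
# not the paper's AM1 results.
IP, EA = 6.4, 1.1                      # eV, hypothetical values for a ZnPc-like system

chi = (IP + EA) / 2.0                  # Mulliken electronegativity
eta = (IP - EA) / 2.0                  # chemical hardness (Parr-Pearson convention)
softness = 1.0 / (2.0 * eta)           # global softness in the same convention

print(f"chi = {chi:.2f} eV, eta = {eta:.2f} eV, S = {softness:.3f} eV^-1")
```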

  3. Can Environmental Regulations Promote Corporate Environmental Responsibility? Evidence from the Moderated Mediating Effect Model and an Empirical Study in China

    Directory of Open Access Journals (Sweden)

    Benhong Peng

    2018-02-01

    Full Text Available Based on the Stakeholder theory, a moderated mediating effect model is developed to reach the study objective, revealing an important connection that suggests environmental regulations (ERs) influence corporate environmental responsibility (CER) (Porter Hypothesis). In building the model, the validity of the questionnaire data was analyzed with factor analysis. By employing a two-step approach, a regression analysis is utilized to discuss the mediating effect of altruistic motivation and the moderating effect of green innovation, and a structural equation model is used to explore the interactive mechanism of the different variables. It is found that altruistic motivation plays a mediating role in the relationship between ERs and CER, and that green innovation positively moderates the relationship. The empirical study identifies factors affecting enterprises’ willingness to undertake environmental responsibility, including environmental policies, corporate culture, and personal characteristics, among others. It is also revealed that altruistic motivation is conducive to forming a community of interests among enterprises and enhancing their resistance to market risks, which explains and corroborates the Stakeholder theory; and the higher the level of green innovation, the more willing enterprises are to implement environmentally friendly operations.
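
    A hedged sketch of the two-step regression logic behind a moderated mediation of this kind is given below. The data are simulated and the variable names (er, am, gi, cer) are hypothetical; this is not the paper's model or data.

```python
# Sketch: the two-step regression logic for a moderated mediation
# (ER -> altruistic motivation -> CER, moderated by green innovation).
# Data are simulated and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
er = rng.normal(size=n)                           # perceived environmental regulation
gi = rng.normal(size=n)                           # green innovation level (moderator)
am = 0.5 * er + rng.normal(scale=0.8, size=n)     # altruistic motivation (mediator)
cer = 0.3 * er + 0.4 * am + 0.2 * am * gi + rng.normal(scale=0.8, size=n)
df = pd.DataFrame(dict(er=er, gi=gi, am=am, cer=cer))

step1 = smf.ols("am ~ er", data=df).fit()               # ER -> mediator
step2 = smf.ols("cer ~ er + am * gi", data=df).fit()    # mediator (x moderator) -> CER
print(step1.params["er"], step2.params["am"], step2.params["am:gi"])
```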

  4. Farm Household Economic Model of The Integrated Crop Livestock System: Conceptual and Empirical Study

    Directory of Open Access Journals (Sweden)

    Atien Priyanti

    2007-06-01

    Full Text Available An integrated approach to enhancing rice production in Indonesia is very promising through the implementation of adapted and reliable integrated programs. One of the challenges in the rice crop sub-sector is the stagnation of production due to the limited availability of organic matter. This provides an opportunity for livestock development to overcome problems of land fertility through the use of manure as a source of organic fertilizer. Since 2002 the Ministry of Agriculture has implemented a program on increasing integrated rice productivity, with an integrated crop-livestock system as one of its potential components. The integrated crop-livestock system program, with special reference to rice fields and beef cattle, is an alternative for enhancing the development potential of the agriculture sector in Indonesia. The program aims to enhance rice production and productivity through a system involving beef cattle, with the goal of increasing farmers’ income. A household economic model can be used to evaluate the success of the implemented crop-livestock system program. A specific feature of farm households is that they behave rationally in their dual role as production and consumption decision makers; part of production is used to meet home consumption, drawing directly on the household's own resources. Through this model, the economic analysis of farm households can be used to anticipate policy options, since the factors influencing farmers’ decisions interrelate production and consumption in ways that have complex implications for farmers’ welfare under the integrated crop-livestock system program.

  5. University staff adoption of iPads: An empirical study using an extended TAM model

    Directory of Open Access Journals (Sweden)

    Michael Steven Lane

    2014-11-01

    Full Text Available This research examined key factors influencing the adoption of iPads by university staff. An online survey collected quantitative data to test hypothesised relationships in an extended TAM model. The findings show that university staff consider iPads easy to use and useful, with a high level of compatibility with their work. Social status had no influence on their attitude to using an iPad. However, older university staff and staff with no previous experience of a similar technology, such as an iPhone or smartphone, found iPads less easy to use. Furthermore, a lack of formal end user ICT support impacted negatively on the use of iPads.

  6. Psychological Models of Art Reception must be Empirically Grounded

    DEFF Research Database (Denmark)

    Nadal, Marcos; Vartanian, Oshin; Skov, Martin

    2017-01-01

    We commend Menninghaus et al. for tackling the role of negative emotions in art reception. However, their model suffers from shortcomings that reduce its applicability to empirical studies of the arts: poor use of evidence, lack of integration with other models, and limited derivation of testable...... hypotheses. We argue that theories about art experiences should be based on empirical evidence....

  7. An Empirical Comparison of Different Models of Active Aging in Canada: The International Mobility in Aging Study.

    Science.gov (United States)

    Bélanger, Emmanuelle; Ahmed, Tamer; Filiatrault, Johanne; Yu, Hsiu-Ting; Zunzunegui, Maria Victoria

    2017-04-01

    Active aging is a concept that lacks consensus. The WHO defines it as a holistic concept that encompasses the overall health, participation, and security of older adults. Fernández-Ballesteros and colleagues propose a similar concept but omit security and include mood and cognitive function. To date, researchers attempting to validate conceptual models of active aging have obtained mixed results. The goal of this study was to examine the validity of existing models of active aging with epidemiological data from Canada. The WHO model of active aging and the psychological model of active aging developed by Fernández-Ballesteros and colleagues were tested with confirmatory factor analysis. The data used included 799 community-dwelling older adults between 65 and 74 years old, recruited from the patient lists of family physicians in Saint-Hyacinthe, Quebec and Kingston, Ontario. Neither model could be validated in the sample of Canadian older adults. Although a concept of healthy aging can be modeled adequately, social participation and security did not fit a latent factor model. A simple binary index indicated that 27% of older adults in the sample did not meet the active aging criteria proposed by the WHO. Our results suggest that active aging might represent a human rights policy orientation rather than an empirical measurement tool to guide research among older adult populations. Binary indexes of active aging may serve to highlight what remains to be improved about the health, participation, and security of growing populations of older adults. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. A sensitivity analysis of centrifugal compressors' empirical models

    International Nuclear Information System (INIS)

    Yoon, Sung Ho; Baek, Je Hyun

    2001-01-01

    The mean-line method using empirical models is the most practical method of predicting off-design performance. To gain insight into the empirical models, the influence of empirical models on the performance prediction results is investigated. We found that, in the two-zone model, the secondary flow mass fraction has a considerable effect at high mass flow-rates on the performance prediction curves. In the TEIS model, the first element changes the slope of the performance curves as well as the stable operating range. The second element makes the performance curves move up and down as it increases or decreases. It is also discovered that the slip factor affects the pressure ratio, but it has little effect on efficiency. Finally, this study reveals that the skin friction coefficient has a significant effect on both the pressure ratio curve and the efficiency curve. These results show the limitations of the present empirical models, and more reasonable empirical models are needed.

  9. A Categorization Model for Educational Values of the History of Mathematics: An Empirical Study

    Science.gov (United States)

    Wang, Xiao-qin; Qi, Chun-yan; Wang, Ke

    2017-01-01

    There is not a clear consensus on the categorization framework of the educational values of the history of mathematics. By analyzing 20 Chinese teaching cases on integrating the history of mathematics into mathematics teaching based on the relevant literature, this study examined a new categorization framework of the educational values of the…

  10. A Hybrid Forecasting Model Based on Empirical Mode Decomposition and the Cuckoo Search Algorithm: A Case Study for Power Load

    Directory of Open Access Journals (Sweden)

    Jiani Heng

    2016-01-01

    Full Text Available Power load forecasting always plays a considerable role in the management of a power system, as accurate forecasting provides a guarantee for the daily operation of the power grid. It has been widely demonstrated in forecasting that hybrid forecasts can improve forecast performance compared with individual forecasts. In this paper, a hybrid forecasting approach, comprising Empirical Mode Decomposition, CSA (Cuckoo Search Algorithm), and WNN (Wavelet Neural Network), is proposed. This approach constructs a more valid forecasting structure and more stable results than traditional ANN (Artificial Neural Network) models such as BPNN (Back Propagation Neural Network), GABPNN (Back Propagation Neural Network Optimized by Genetic Algorithm), and WNN. To evaluate the forecasting performance of the proposed model, a half-hourly power load in New South Wales of Australia is used as a case study in this paper. The experimental results demonstrate that the proposed hybrid model is not only simple but also able to satisfactorily approximate the actual power load and can be an effective tool in planning and dispatch for smart grids.
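
    As a simplified sketch of the decompose-forecast-recombine idea behind such hybrid models, the code below splits a simulated half-hourly load series into a slow and a fast component and forecasts each separately. A centred moving average stands in for EMD and a naive persistence rule stands in for the CSA-tuned wavelet network; everything is illustrative only.

```python
# Sketch of the decompose -> forecast components -> recombine idea.
# A centred moving average stands in for EMD, and a naive persistence
# forecast stands in for the CSA-tuned wavelet network.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(48 * 14)                                     # two weeks of half-hourly load
load = 50 + 10 * np.sin(2 * np.pi * t / 48) + rng.normal(scale=2, size=t.size)

k = 48
trend = np.convolve(load, np.ones(k) / k, mode="same")     # slow component
detail = load - trend                                      # fast component

# naive component forecasts for the next day (placeholder for WNN + CSA):
trend_fc = np.full(48, trend[-k:].mean())
detail_fc = detail[-48:]                                   # repeat yesterday's daily shape
forecast = trend_fc + detail_fc
print(forecast[:6].round(2))
```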

  11. A Categorization Model for Educational Values of the History of Mathematics. An Empirical Study

    Science.gov (United States)

    Wang, Xiao-qin; Qi, Chun-yan; Wang, Ke

    2017-11-01

    There is not a clear consensus on the categorization framework of the educational values of the history of mathematics. By analyzing 20 Chinese teaching cases on integrating the history of mathematics into mathematics teaching based on the relevant literature, this study examined a new categorization framework of the educational values of the history of mathematics by combining the objectives of high school mathematics curriculum in China. This framework includes six dimensions: the harmony of knowledge, the beauty of ideas or methods, the pleasure of inquiries, the improvement of capabilities, the charm of cultures, and the availability of moral education. The results show that this framework better explained the all-educational values of the history of mathematics that all teaching cases showed. Therefore, the framework can guide teachers to better integrate the history of mathematics into teaching.

  12. Empirically evaluating decision-analytic models.

    Science.gov (United States)

    Goldhaber-Fiebert, Jeremy D; Stout, Natasha K; Goldie, Sue J

    2010-08-01

    Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5 to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles with 67 meeting inclusion criteria and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.
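
    A minimal sketch of the consistency metric described above (does the model's uncertainty range overlap the study's confidence interval?), using the 30-year figures quoted in the abstract as example values.

```python
# Sketch: the "consistency" check described above -- does the model's uncertainty
# range overlap the study's confidence interval?  Values are the 30-year outcomes
# quoted in the abstract.
def overlaps(a, b):
    """True if intervals a=(lo, hi) and b=(lo, hi) overlap."""
    return a[0] <= b[1] and b[0] <= a[1]

comparisons = {
    "inadequately treated, 30 y": ((30.9, 49.7), (28.4, 48.3)),  # model range, study 95% CI
    "appropriately treated, 30 y": ((0.7, 1.3), (0.4, 3.3)),
}
for name, (model_rng, study_ci) in comparisons.items():
    print(name, "consistent" if overlaps(model_rng, study_ci) else "inconsistent")
```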

  13. The gravity model specification for modeling international trade flows and free trade agreement effects: a 10-year review of empirical studies

    OpenAIRE

    Kepaptsoglou, Konstantinos; Karlaftis, Matthew G.; Tsamboulas, Dimitrios

    2010-01-01

    The gravity model has been extensively used in international trade research for the last 40 years because of its considerable empirical robustness and explanatory power. Since their introduction in the 1960's, gravity models have been used for assessing trade policy implications and, particularly recently, for analyzing the effects of Free Trade Agreements on international trade. The objective of this paper is to review the recent empirical literature on gravity models, highlight best practic...
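
    The gravity specification reviewed here is usually estimated in log-linear form, T_ij = G * Y_i^b1 * Y_j^b2 / D_ij^b3, often with an FTA dummy. The sketch below fits that form by OLS on simulated bilateral flows; the data and coefficients are invented for illustration.

```python
# Sketch: log-linear estimation of a standard gravity equation
#   T_ij = G * Y_i^b1 * Y_j^b2 / D_ij^b3  (plus an FTA dummy),
# on simulated bilateral trade flows.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
gdp_i, gdp_j = rng.lognormal(3, 1, n), rng.lognormal(3, 1, n)
dist = rng.lognormal(6, 0.5, n)
fta = rng.integers(0, 2, n)
trade = np.exp(1.0 + 0.8 * np.log(gdp_i) + 0.7 * np.log(gdp_j)
               - 1.1 * np.log(dist) + 0.4 * fta + rng.normal(scale=0.3, size=n))

X = sm.add_constant(np.column_stack([np.log(gdp_i), np.log(gdp_j), np.log(dist), fta]))
fit = sm.OLS(np.log(trade), X).fit()
print(fit.params.round(2))   # roughly recovers [1.0, 0.8, 0.7, -1.1, 0.4]
```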

  14. Model uncertainty in growth empirics

    NARCIS (Netherlands)

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high
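
    A minimal sketch of the Bayesian model averaging idea used in this thesis, with posterior model weights approximated from BIC over a few candidate regressions; the growth data and regressor names are hypothetical.

```python
# Sketch: Bayesian model averaging with BIC-approximated posterior model weights
# over candidate regressions (simulated growth data, hypothetical regressors).
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))                       # e.g. investment, schooling, openness
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.5, size=n)

models, bics = [], []
for k in range(1, 4):
    for subset in itertools.combinations(range(3), k):
        fit = sm.OLS(y, sm.add_constant(X[:, subset])).fit()
        models.append(subset)
        bics.append(fit.bic)

bics = np.array(bics)
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()                          # approximate posterior model probabilities
for m, w in zip(models, weights):
    print(m, round(float(w), 3))
```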

  15. Measuring hospital service quality and its influence on patient satisfaction: An empirical study using structural equation modeling

    OpenAIRE

    Nasim Kazemi; Parisa Ehsani; Farshid Abdi; Mohammad Kazem Bighami

    2013-01-01

    This paper presents an empirical investigation to measure different dimensions of hospital service quality (HSQ) by gap analysis and patient satisfaction (PS). It also attempts to measure patients’ satisfaction with three dimensions extracted from exploratory factor analysis (EFA) by the principal component analysis method and confirmatory factor analysis (CFA). In addition, the study analyzes the relationship between HSQ and PS in the context of Iranian hospital services, using structural equation mod...

  16. An empirical study of the pathology of organizational communications based on three branches model: A case study

    Directory of Open Access Journals (Sweden)

    Mehdi Kheirandish

    2017-12-01

    Full Text Available Understanding the obstacles facing an organization's communication system has become a critical task for managers. The present study analyzes major vulnerabilities of organizational communication from structural, behavioral and contextual aspects. The statistical population includes employees and managers in the headquarters of the National Iranian Oil Company. After assessing the validity and reliability of a conceptual model, we used the Kolmogorov–Smirnov test, t-test and F-test to analyze our data. The results rank the communication barriers as follows: structural elements such as centrality and formality first, then contextual elements such as cultural and technical barriers, and finally behavioral elements such as perceptual and human barriers.

  17. Empirical Bayesian inference and model uncertainty

    International Nuclear Information System (INIS)

    Poern, K.

    1994-01-01

    This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distributions, represented by a point in the 3D space of secondary parameters, can be considered as a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method each individual model is assigned a posterior probability
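
    The core of the approach described here is the conjugate gamma-Poisson update of the uncertainty about a failure intensity. The sketch below shows only that single-stage update with hypothetical prior parameters and exposure data; the paper's contaminated-gamma mixture and the second-stage update of the secondary parameters are not reproduced.

```python
# Sketch: the basic gamma-Poisson Bayesian update that underlies the approach.
# The paper's contaminated-gamma mixture and its second-stage (empirical Bayes)
# update of the secondary parameters are not reproduced here.
from scipy import stats

alpha0, beta0 = 1.5, 2000.0        # hypothetical gamma prior on the intensity (per hour)
events, exposure = 3, 12000.0      # observed events and accumulated operating hours

alpha_post = alpha0 + events       # conjugate update
beta_post = beta0 + exposure

posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)
print("posterior mean intensity:", posterior.mean())
print("90% credible interval   :", posterior.ppf([0.05, 0.95]))
```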

  18. Empirical study of supervised gene screening

    Directory of Open Access Journals (Sweden)

    Ma Shuangge

    2006-12-01

    Full Text Available Abstract Background Microarray studies provide a way of linking variations of phenotypes with their genetic causations. Constructing predictive models using high dimensional microarray measurements usually consists of three steps: (1) unsupervised gene screening; (2) supervised gene screening; and (3) statistical model building. Supervised gene screening based on marginal gene ranking is commonly used to reduce the number of genes in the model building. Various simple statistics, such as the t-statistic or signal to noise ratio, have been used to rank genes in the supervised screening. Despite its extensive usage, statistical study of supervised gene screening remains scarce. Our study is partly motivated by the differences in gene discovery results caused by using different supervised gene screening methods. Results We investigate concordance and reproducibility of supervised gene screening based on eight commonly used marginal statistics. Concordance is assessed by the relative fractions of overlaps between top ranked genes screened using different marginal statistics. We propose a Bootstrap Reproducibility Index, which measures reproducibility of individual genes under the supervised screening. Empirical studies are based on four public microarray data sets. We consider the cases where the top 20%, 40% and 60% of genes are screened. Conclusion From a gene discovery point of view, the effect of supervised gene screening based on different marginal statistics cannot be ignored. Empirical studies show that (1) genes passed by different supervised screenings may be considerably different; (2) concordance may vary, depending on the underlying data structure and percentage of selected genes; (3) evaluated with the Bootstrap Reproducibility Index, genes passed by supervised screenings are only moderately reproducible; and (4) concordance cannot be improved by supervised screening based on reproducibility.
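
    A rough sketch of the two measures described in this record: overlap-based concordance between top lists ranked by two marginal statistics, and a crude bootstrap reproducibility index. The expression data are simulated and the details of the paper's index are simplified.

```python
# Sketch: marginal gene ranking with two statistics (t-statistic and a signal-to-noise
# ratio), overlap-based concordance of the top lists, and a crude bootstrap
# reproducibility index.  Expression data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
genes, n_per_group = 1000, 20
X = rng.normal(size=(genes, 2 * n_per_group))
X[:50, :n_per_group] += 1.0                           # 50 truly differential genes
g1, g2 = X[:, :n_per_group], X[:, n_per_group:]

def top(stat, frac=0.2):
    return set(np.argsort(-np.abs(stat))[: int(frac * genes)])

t_stat = stats.ttest_ind(g1, g2, axis=1).statistic
snr = (g1.mean(1) - g2.mean(1)) / (g1.std(1) + g2.std(1))
print("concordance of top 20%:", len(top(t_stat) & top(snr)) / (0.2 * genes))

# bootstrap reproducibility: how often does each gene stay in the top list?
hits = np.zeros(genes)
for _ in range(100):
    idx = rng.integers(0, n_per_group, n_per_group)   # resample samples with replacement
    tb = stats.ttest_ind(g1[:, idx], g2[:, idx], axis=1).statistic
    hits[list(top(tb))] += 1
print("mean reproducibility of the truly differential genes:", hits[:50].mean() / 100)
```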

  19. Empirically Derived Dehydration Scoring and Decision Tree Models for Children With Diarrhea: Assessment and Internal Validation in a Prospective Cohort Study in Dhaka, Bangladesh

    OpenAIRE

    Levine, Adam C; Glavis-Bloom, Justin; Modi, Payal; Nasrin, Sabiha; Rege, Soham; Chu, Chieh; Schmid, Christopher H; Alam, Nur H

    2015-01-01

    Introduction: Diarrhea remains one of the most common and most deadly conditions affecting children worldwide. Accurately assessing dehydration status is critical to determining treatment course, yet no clinical diagnostic models for dehydration have been empirically derived and validated for use in resource-limited settings. Methods: In the Dehydration: Assessing Kids Accurately (DHAKA) prospective cohort study, a random sample of children under 5 with acute diarrhea was enrolled between Feb...

  20. An Empirical Model for Energy Storage Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rosewater, David Martin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Scott, Paul [TransPower, Poway, CA (United States)

    2016-03-17

    Improved models of energy storage systems are needed to enable the electric grid’s adaptation to increasing penetration of renewables. This paper develops a generic empirical model of energy storage system performance agnostic of type, chemistry, design or scale. Parameters for this model are calculated using test procedures adapted from the US DOE Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage. We then assess the accuracy of this model for predicting the performance of the TransPower GridSaver – a 1 MW rated lithium-ion battery system that underwent laboratory experimentation and analysis. The developed model predicts a range of energy storage system performance based on the uncertainty of estimated model parameters. Finally, this model can be used to better understand the integration and coordination of energy storage on the electric grid.

  1. Empirical agent-based modelling challenges and solutions

    CERN Document Server

    Barreteau, Olivier

    2014-01-01

    This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications.  It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM.  In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modeling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes'  ready to be implemented. Agent-based modeling (AB...

  2. Combining empirical and theory-based land-use modelling approaches to assess economic potential of biofuel production avoiding iLUC: Argentina as a case study

    NARCIS (Netherlands)

    Diogo, V.; van der Hilst, F.; van Eijck, J.; Verstegen, J.A.; Hilbert, J.; Carballo, S.; Volante, J.; Faaij, A.

    2014-01-01

    In this paper, a land-use modelling framework is presented combining empirical and theory-based modelling approaches to determine economic potential of biofuel production avoiding indirect land-use changes (iLUC) resulting from land competition with other functions. The empirical approach explores

  3. An Empirical Study of Online Discussion Forums by Library and Information Science Postgraduate Students using Technology Acceptance Model 3

    Directory of Open Access Journals (Sweden)

    Airen Adetimirin

    2015-06-01

    Full Text Available E-learning is an important trend globally that is believed to enhance the acquisition of knowledge by students within and outside the classroom to improve their academic pursuit. The Online Discussion Forum (ODF) is one of the tools that are used for e-learning in Nigerian universities. It facilitates interaction among postgraduate students as they can communicate and share information sources with one another to promote learning. However, the optimum use of this forum is determined by anchor factors in TAM 3 such as computer self-efficacy, perceptions of external control, computer anxiety and computer playfulness. A conceptual model based on TAM 3 was proposed and empirically tested. Using a survey research design and an online questionnaire for 121 Library and Information Science (LIS) postgraduate students, the paper demonstrated that computer self-efficacy, perceptions of external control, computer anxiety and computer playfulness have significant influence on the use of ODF. The paper therefore proposes that Online Discussion Forums should be encouraged for learning in postgraduate education.

  4. Comparison of empirical models and laboratory saturated hydraulic ...

    African Journals Online (AJOL)

    Numerous methods for estimating soil saturated hydraulic conductivity exist, which range from direct measurement in the laboratory to models that use only basic soil properties. A study was conducted to compare laboratory saturated hydraulic conductivity (Ksat) measurement and that estimated from empirical models.

  5. Equifinality in empirical studies of cultural transmission.

    Science.gov (United States)

    Barrett, Brendan J

    2018-01-31

    Cultural systems exhibit equifinal behavior - a single final state may be arrived at via different mechanisms and/or from different initial states. Potential for equifinality exists in all empirical studies of cultural transmission, including controlled experiments, observational field research, and computational simulations. Acknowledging and anticipating the existence of equifinality is important in empirical studies of social learning and cultural evolution; it helps us understand the limitations of analytical approaches and can improve our ability to predict the dynamics of cultural transmission. Here, I illustrate and discuss examples of equifinality in studies of social learning, and how certain experimental designs might be prone to it. I then review examples of equifinality discussed in the social learning literature, namely the use of s-shaped diffusion curves to discern individual from social learning, and the operational definitions and analytical approaches used in studies of conformist transmission. While equifinality exists to some extent in all studies of social learning, I make suggestions for how to address instances of it, with an emphasis on using data simulation and methodological verification alongside modern statistical approaches that emphasize prediction and model comparison. In cases where evaluated learning mechanisms are equifinal due to non-methodological factors, I suggest that this is not always a problem if it helps us predict cultural change. In some cases, equifinal learning mechanisms might offer insight into how individual learning, social learning strategies and other endogenous social factors might be important in structuring cultural dynamics and within- and between-group heterogeneity. Copyright © 2018 Elsevier B.V. All rights reserved.
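
    As a hedged illustration of the s-shaped-diffusion-curve point raised above, the simulation below shows that two very different mechanisms, frequency-dependent social transmission and purely individual learning with heterogeneous latencies, can both produce sigmoidal cumulative adoption curves. All parameters are arbitrary assumptions.

```python
# Sketch of the equifinality point about diffusion curves: frequency-dependent
# social learning and purely individual learning with heterogeneous latencies
# can both yield s-shaped cumulative adoption curves.  Parameters are arbitrary.
import numpy as np

steps, N = 60, 1000

# (a) social transmission: per-capita adoption rate proportional to current frequency
social = np.zeros(steps)
social[0] = 5
for t in range(1, steps):
    p = 0.25 * social[t - 1] / N
    social[t] = social[t - 1] + p * (N - social[t - 1])

# (b) individual learning only, with normally distributed solve times -> also sigmoidal
rng = np.random.default_rng(0)
latency = rng.normal(30, 8, size=N)
individual = np.array([(latency <= t).sum() for t in range(steps)])

print("both curves are s-shaped; final adopters:", int(social[-1]), int(individual[-1]))
```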

  6. Empirical Study on Total Factor Productive Energy Efficiency in Beijing-Tianjin-Hebei Region-Analysis based on Malmquist Index and Window Model

    Science.gov (United States)

    Xu, Qiang; Ding, Shuai; An, Jingwen

    2017-12-01

    This paper studies the energy efficiency of the Beijing-Tianjin-Hebei region and identifies its trend in order to improve the quality of economic development in the region. Based on the Malmquist index and a window analysis model, this paper empirically estimates total factor energy efficiency in the Beijing-Tianjin-Hebei region using panel data for the region from 1991 to 2014, and provides the corresponding policy recommendations. The empirical results show that total factor energy efficiency in the Beijing-Tianjin-Hebei region increased from 1991 to 2014, relying mainly on advances in energy technology and innovation, and that obvious regional differences in energy efficiency exist. Over the 24-year window period, the regional differences in energy efficiency in the Beijing-Tianjin-Hebei region shrank, with a significant convergence trend after 2000 that depends mainly on the diffusion and spillover of energy technologies.

  7. Qualitative Case Study Research as Empirical Inquiry

    Science.gov (United States)

    Ellinger, Andrea D.; McWhorter, Rochell

    2016-01-01

    This article introduces the concept of qualitative case study research as empirical inquiry. It defines and distinguishes what a case study is, the purposes, intentions, and types of case studies. It then describes how to determine if a qualitative case study is the preferred approach for conducting research. It overviews the essential steps in…

  8. The investor behavior and futures market volatility: A theory and empirical study based on the OLG model and high-frequency data

    Institute of Scientific and Technical Information of China (English)

    Yun Wang; Renhai Hua; Zongcheng Zhang

    2011-01-01

    Purpose - The purpose of this paper is to examine whether futures volatility affects investor behavior and what trading strategies different investors adopt under different information conditions. Design/methodology/approach - This study introduces a two-period overlapping generations (OLG) model into the futures market and sets up an investor behavior model based on the futures contract price, which can also be extended to complete and incomplete information. It provides the equilibrium solution and uses copper tick data from the SHFE to conduct the empirical analysis. Findings - First, the two-period OLG model based on the futures market is consistent with the practical situation; second, investors with sufficient information, such as institutions, generally adopt reversal trading patterns; last, investors with insufficient information, such as individual investors, generally adopt momentum trading patterns. Research limitations/implications - Investor trading behavior is always an important issue in behavioral finance and market supervision, but related research is scarce. Practical implications - The conclusions show that investor behavior in the Chinese futures market differs from that in the Chinese stock market. Originality/value - This study empirically analyzes and verifies the different types of trading strategies investors adopt: investors such as institutional ones generally adopt reversal trading patterns, while investors such as individual investors generally adopt momentum trading patterns.

  9. Empirical atom model of Vegard's law

    International Nuclear Information System (INIS)

    Zhang, Lei; Li, Shichun

    2014-01-01

    Vegard's law seldom holds true for most binary continuous solid solutions. When two components form a solid solution, the atom radii of component elements will change to satisfy the continuity requirement of electron density at the interface between component atom A and atom B so that the atom with larger electron density will expand and the atom with the smaller one will contract. If the expansion and contraction of the atomic radii of A and B respectively are equal in magnitude, Vegard's law will hold true. However, the expansion and contraction of two component atoms are not equal in most situations. The magnitude of the variation will depend on the cohesive energy of corresponding element crystals. An empirical atom model of Vegard's law has been proposed to account for signs of deviations according to the electron density at Wigner–Seitz cell from Thomas–Fermi–Dirac–Cheng model
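
    Vegard's law itself is the linear interpolation a(x) = x*a_A + (1-x)*a_B of the component lattice parameters; the deviation discussed above is the departure of the observed parameter from that line. The sketch below uses hypothetical lattice parameters and a hypothetical bowing term purely for illustration.

```python
# Sketch: Vegard's law as the linear baseline, a(x) = x*a_A + (1-x)*a_B,
# and the deviation of a (here hypothetical) observed lattice parameter from it.
import numpy as np

a_A, a_B = 4.05, 3.61          # lattice parameters of the pure components (angstrom, illustrative)
x = np.linspace(0, 1, 11)      # mole fraction of component A

a_vegard = x * a_A + (1 - x) * a_B
bowing = 0.03                  # hypothetical bowing parameter capturing the deviation
a_obs = a_vegard - bowing * x * (1 - x)

for xi, av, ao in zip(x, a_vegard, a_obs):
    print(f"x={xi:.1f}  Vegard={av:.3f}  observed={ao:.3f}  deviation={ao - av:+.4f}")
```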

  10. Characterizing Student Expectations: A Small Empirical Study

    Science.gov (United States)

    Warwick, Jonathan

    2016-01-01

    This paper describes the results of a small empirical study (n = 130), in which undergraduate students in the Business Faculty of a UK university were asked to express views and expectations relating to the study of mathematics. Factor analysis is used to identify latent variables emerging from clusters of the measured variables and these are…
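
    A minimal sketch of extracting latent factors from clustered survey items, in the spirit of the factor analysis described above. The Likert-style responses are simulated and the two latent traits are hypothetical; this is not the study's instrument or data.

```python
# Sketch: extracting latent variables from clustered survey items.
# Responses are simulated; the two latent traits are hypothetical.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 130
anxiety = rng.normal(size=n)                    # hypothetical latent trait 1
utility = rng.normal(size=n)                    # hypothetical latent trait 2
items = np.column_stack([
    anxiety + rng.normal(scale=0.5, size=n),    # items loading on "maths anxiety"
    anxiety + rng.normal(scale=0.5, size=n),
    utility + rng.normal(scale=0.5, size=n),    # items loading on "perceived usefulness"
    utility + rng.normal(scale=0.5, size=n),
])

fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
print(np.round(fa.components_, 2))              # loadings recover the two item clusters
```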

  11. The problem analysis for empirical studies

    NARCIS (Netherlands)

    Groenland, E.A.G.

    2014-01-01

    This article proposes a systematic methodology for the development of a problem analysis for cross-sectional, empirical research. This methodology is referred to as the 'Annabel approach'. It is suitable both for academic studies and applied (business) studies. In addition it can be used for both

  12. Observation and empirical shell-model study of new yrast excited states in the nucleus ¹⁴²Ce

    CERN Document Server

    Liu Zhong; Guo Ying Xiang; Zhou Xiao Hong; Lei Xiang Guo; Liu Min Liang; Luo Wan Ju; He Jian Jun; Zheng Yong; Pan Qiang Yan; Gan Zai Guo; Luo Yi Xiao; Hayakawa, T; Oshima, M; Toh, Y; Shizima, T; Hatsukawa, Y; Osa, A; Ishii, T; Sugawara, M

    2002-01-01

    Excited states of ¹⁴²Ce, populated in deep inelastic reactions of ⁸²Se projectiles bombarding a ¹³⁹La target, have been studied to medium spins using in-beam gamma spectroscopy techniques. Three new levels have been identified at 2625, 2995 and 3834 keV, and assigned as 8⁺, 9⁽⁻⁾ and 11⁽⁻⁾, respectively, based on the analysis of the properties of gamma transitions. These new yrast states follow well the level systematics of N = 84 isotones. Their structures have been discussed with the help of empirical shell-model calculations.

  13. Predicting acid dew point with a semi-empirical model

    International Nuclear Information System (INIS)

    Xiang, Baixiang; Tang, Bin; Wu, Yuxin; Yang, Hairui; Zhang, Man; Lu, Junfu

    2016-01-01

    Highlights: • The previous semi-empirical models are systematically studied. • An improved thermodynamic correlation is derived. • A semi-empirical prediction model is proposed. • The proposed semi-empirical model is validated. - Abstract: Decreasing the temperature of exhaust flue gas in boilers is one of the most effective ways to further improve thermal efficiency and electrostatic precipitator efficiency and to decrease the water consumption of the desulfurization tower; however, when this temperature falls below the acid dew point, fouling and corrosion occur on the heating surfaces in the second pass of boilers. Accurate prediction of the acid dew point is therefore essential. By investigating the previous models for acid dew point prediction, an improved thermodynamic correlation between the acid dew point and its influencing factors is derived first. A semi-empirical prediction model is then proposed, validated against both field-test and experimental data, and compared with the previous models.
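
    Correlations of this kind are commonly written in terms of the logarithms of the water vapour and SO3 partial pressures. The sketch below shows only that generic functional form with placeholder coefficients (assumed, loosely Verhoff-Banchero-like, with pressures in mmHg); it does not reproduce the correlation derived in this paper.

```python
# Sketch of a common functional form for sulfuric acid dew point correlations,
#   1000/T_dp = c0 + c1*ln(p_H2O) + c2*ln(p_SO3) + c3*ln(p_H2O)*ln(p_SO3),
# with PLACEHOLDER coefficients -- the paper's fitted values are not reproduced here.
import math

def acid_dew_point_K(p_h2o_mmHg, p_so3_mmHg, c=(2.28, -0.029, -0.086, 0.0062)):
    """Return an illustrative dew-point temperature in kelvin (partial pressures in mmHg)."""
    lnw, lns = math.log(p_h2o_mmHg), math.log(p_so3_mmHg)
    inv_T = (c[0] + c[1] * lnw + c[2] * lns + c[3] * lnw * lns) / 1000.0
    return 1.0 / inv_T

print(round(acid_dew_point_K(76.0, 0.0038), 1), "K  (illustrative only)")
```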

  14. An empirical model for prediction of household solid waste generation rate - A case study of Dhanbad, India.

    Science.gov (United States)

    Kumar, Atul; Samadder, S R

    2017-10-01

    Accurate prediction of the quantity of household solid waste generation is very much essential for effective management of municipal solid waste (MSW). In actual practice, modelling methods are often found useful for precise prediction of MSW generation rate. In this study, two models have been proposed that established the relationships between the household solid waste generation rate and the socioeconomic parameters, such as household size, total family income, education, occupation and fuel used in the kitchen. Multiple linear regression technique was applied to develop the two models, one for the prediction of biodegradable MSW generation rate and the other for non-biodegradable MSW generation rate for individual households of the city Dhanbad, India. The results of the two models showed that the coefficient of determinations (R 2 ) were 0.782 for biodegradable waste generation rate and 0.676 for non-biodegradable waste generation rate using the selected independent variables. The accuracy tests of the developed models showed convincing results, as the predicted values were very close to the observed values. Validation of the developed models with a new set of data indicated a good fit for actual prediction purpose with predicted R 2 values of 0.76 and 0.64 for biodegradable and non-biodegradable MSW generation rate respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
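
    A hedged sketch of the kind of multiple linear regression described above (waste generation rate on household socioeconomic predictors). The data are simulated and the variable names and coefficients are illustrative, not the study's.

```python
# Sketch: multiple linear regression of household biodegradable-waste generation
# rate on socioeconomic predictors.  Data are simulated; variable names and
# coefficients are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "household_size": rng.integers(1, 8, n),
    "income": rng.lognormal(10, 0.4, n),
    "education_years": rng.integers(5, 18, n),
})
df["bio_waste_kg_day"] = (0.12 * df.household_size + 2e-5 * df.income
                          + 0.01 * df.education_years + rng.normal(scale=0.15, size=n))

model = smf.ols("bio_waste_kg_day ~ household_size + income + education_years", data=df).fit()
print(round(model.rsquared, 3))
print(model.params.round(5))
```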

  15. A multi-level model of emerging technology: An empirical study of the evolution of biotechnology from 1976 to 2003

    Science.gov (United States)

    van Witteloostuijn, Arjen

    2018-01-01

    In this paper, we develop an ecological, multi-level model that can be used to study the evolution of emerging technology. More specifically, by defining technology as a system composed of a set of interacting components, we can build upon the argument of multi-level density dependence from organizational ecology to develop a distribution-independent model of technological evolution. This allows us to distinguish between different stages of component development, which provides more insight into the emergence of stable component configurations, or dominant designs. We validate our hypotheses in the biotechnology industry by using patent data from the USPTO from 1976 to 2003. PMID:29795575

  16. Evolution of viral virulence: empirical studies

    Science.gov (United States)

    Kurath, Gael; Wargo, Andrew R.

    2016-01-01

    The concept of virulence as a pathogen trait that can evolve in response to selection has led to a large body of virulence evolution theory developed in the 1980-1990s. Various aspects of this theory predict increased or decreased virulence in response to a complex array of selection pressures including mode of transmission, changes in host, mixed infection, vector-borne transmission, environmental changes, host vaccination, host resistance, and co-evolution of virus and host. A fundamental concept is prediction of trade-offs between the costs and benefits associated with higher virulence, leading to selection of optimal virulence levels. Through a combination of observational and experimental studies, including experimental evolution of viruses during serial passage, many of these predictions have now been explored in systems ranging from bacteriophage to viruses of plants, invertebrates, and vertebrate hosts. This chapter summarizes empirical studies of viral virulence evolution in numerous diverse systems, including the classic models myxomavirus in rabbits, Marek's disease virus in chickens, and HIV in humans. Collectively these studies support some aspects of virulence evolution theory, suggest modifications for other aspects, and show that predictions may apply in some virus:host interactions but not in others. Finally, we consider how virulence evolution theory applies to disease management in the field.

  17. Empirical high-latitude electric field models

    International Nuclear Information System (INIS)

    Heppner, J.P.; Maynard, N.C.

    1987-01-01

    Electric field measurements from the Dynamics Explorer 2 satellite have been analyzed to extend the empirical models previously developed from dawn-dusk OGO 6 measurements (J.P. Heppner, 1977). The analysis embraces large quantities of data from polar crossings entering and exiting the high latitudes in all magnetic local time zones. Paralleling the previous analysis, the modeling is based on the distinctly different polar cap and dayside convective patterns that occur as a function of the sign of the Y component of the interplanetary magnetic field. The objective, which is to represent the typical distributions of convective electric fields with a minimum number of characteristic patterns, is met by deriving one pattern (model BC) for the northern hemisphere with a +Y interplanetary magnetic field (IMF) and southern hemisphere with a -Y IMF and two patterns (models A and DE) for the northern hemisphere with a -Y IMF and southern hemisphere with a +Y IMF. The most significant large-scale revisions of the OGO 6 models are (1) on the dayside where the latitudinal overlap of morning and evening convection cells reverses with the sign of the IMF Y component, (2) on the nightside where a westward flow region poleward from the Harang discontinuity appears under model BC conditions, and (3) magnetic local time shifts in the positions of the convection cell foci. The modeling above was followed by a detailed examination of cases where the IMF Z component was clearly positive (northward). Neglecting the seasonally dependent cases where irregularities obscure pattern recognition, the observations range from reasonable agreement with the new BC and DE models, to cases where different characteristics appeared primarily at dayside high latitudes

  18. A three-model comparison of the relationship between quality, satisfaction and loyalty: an empirical study of the Chinese healthcare system

    Science.gov (United States)

    2012-01-01

    Background Previous research has addressed the relationship between customer satisfaction, perceived quality and customer loyalty intentions in consumer markets. In this study, we test and compare three theoretical models of the quality–satisfaction–loyalty relationship in the Chinese healthcare system. Methods This research focuses on hospital patients as participants in the process of healthcare procurement. Empirical data were obtained from six Chinese public hospitals in Shanghai. A total of 630 questionnaires were collected in two studies. Study 1 tested the research instruments, and Study 2 tested the three models. Confirmatory factor analysis was used to assess the scales’ construct validity by testing convergent and discriminant validity. A structural equation model (SEM) specified the distinctions between each construct. A comparison of the three theoretical models was conducted via AMOS analysis. Results The results of the SEM demonstrate that quality and satisfaction are distinct concepts and that the first model (satisfaction mediates quality and loyalty) is the most appropriate one in the context of the Chinese healthcare environment. Conclusions In this study, we test and compare three theoretical models of the quality–satisfaction–loyalty relationship in the Chinese healthcare system. Findings show that perceived quality improvement does not lead directly to customer loyalty. The strategy of using quality improvement to maintain patient loyalty depends on the level of patient satisfaction. This implies that the measurement of patient experiences should include topics of importance for patients’ satisfaction with health care services. PMID:23198824

  19. A three-model comparison of the relationship between quality, satisfaction and loyalty: an empirical study of the Chinese healthcare system

    Directory of Open Access Journals (Sweden)

    Lei Ping

    2012-11-01

    Full Text Available Abstract Background Previous research has addressed the relationship between customer satisfaction, perceived quality and customer loyalty intentions in consumer markets. In this study, we test and compare three theoretical models of the quality–satisfaction–loyalty relationship in the Chinese healthcare system. Methods This research focuses on hospital patients as participants in the process of healthcare procurement. Empirical data were obtained from six Chinese public hospitals in Shanghai. A total of 630 questionnaires were collected in two studies. Study 1 tested the research instruments, and Study 2 tested the three models. Confirmatory factor analysis was used to assess the scales’ construct validity by testing convergent and discriminant validity. A structural equation model (SEM) specified the distinctions between each construct. A comparison of the three theoretical models was conducted via AMOS analysis. Results The results of the SEM demonstrate that quality and satisfaction are distinct concepts and that the first model (satisfaction mediates quality and loyalty) is the most appropriate one in the context of the Chinese healthcare environment. Conclusions In this study, we test and compare three theoretical models of the quality–satisfaction–loyalty relationship in the Chinese healthcare system. Findings show that perceived quality improvement does not lead directly to customer loyalty. The strategy of using quality improvement to maintain patient loyalty depends on the level of patient satisfaction. This implies that the measurement of patient experiences should include topics of importance for patients’ satisfaction with health care services.

  20. A three-model comparison of the relationship between quality, satisfaction and loyalty: an empirical study of the Chinese healthcare system.

    Science.gov (United States)

    Lei, Ping; Jolibert, Alain

    2012-11-30

    Previous research has addressed the relationship between customer satisfaction, perceived quality and customer loyalty intentions in consumer markets. In this study, we test and compare three theoretical models of the quality-satisfaction-loyalty relationship in the Chinese healthcare system. This research focuses on hospital patients as participants in the process of healthcare procurement. Empirical data were obtained from six Chinese public hospitals in Shanghai. A total of 630 questionnaires were collected in two studies. Study 1 tested the research instruments, and Study 2 tested the three models. Confirmatory factor analysis was used to assess the scales' construct validity by testing convergent and discriminant validity. A structural equation model (SEM) specified the distinctions between each construct. A comparison of the three theoretical models was conducted via AMOS analysis. The results of the SEM demonstrate that quality and satisfaction are distinct concepts and that the first model (satisfaction mediates quality and loyalty) is the most appropriate one in the context of the Chinese healthcare environment. In this study, we test and compare three theoretical models of the quality-satisfaction-loyalty relationship in the Chinese healthcare system. Findings show that perceived quality improvement does not lead directly to customer loyalty. The strategy of using quality improvement to maintain patient loyalty depends on the level of patient satisfaction. This implies that the measurement of patient experiences should include topics of importance for patients' satisfaction with health care services.

  1. An empirical survey on perceived value from tourism destination based on brand equity model: A case study of Qeshm Island

    Directory of Open Access Journals (Sweden)

    Vahid Qaemi

    2012-10-01

    Full Text Available Tourism destination brand equity is defined as all the assets (or liabilities) of a brand, linked to the name and symbol of a tourism destination, that add to or subtract from the value of the service experiences it provides. In many cases, brand equity is worth more than physical assets. This survey investigates the factors that shape tourism destination brand equity and their cause-and-effect relationships, and proposes a model of the perceived value of a tourism destination. The study is performed on Qeshm, one of Iran's free-trade islands. The preliminary results indicate positive relationships between destination brand equity and tourism destination brand awareness, brand image, and brand loyalty.

  2. PWR surveillance based on correspondence between empirical models and physical models

    International Nuclear Information System (INIS)

    Zwingelstein, G.; Upadhyaya, B.R.; Kerlin, T.W.

    1976-01-01

    An on-line surveillance method based on the correspondence between empirical models and physical models is proposed for pressurized water reactors. Two types of empirical models are considered, as well as the mathematical models defining the correspondence between the physical and empirical parameters. The efficiency of this method is illustrated for the surveillance of the Doppler coefficient for Oconee I (an 886 MWe PWR)

  3. A Model of Service Marketing in Port Services: Empirical Study in PT Pelabuhan Indonesia II (Persero, Tanjung Priok Branch

    Directory of Open Access Journals (Sweden)

    Rati Farini Srihadi

    2016-06-01

    Full Text Available The Port of Tanjung Priok is one of the ports in Indonesia with the potential to be developed into an international seaport, given its level of activity and its loading and unloading capacity. This study aims to understand the key variables that determine the service quality of the port, so as to achieve customer satisfaction and loyalty. By using Structural Equation Modeling, this study analyzes the effects of various dimensions of service quality on customer satisfaction, and the relationships formed between perceived value and customer loyalty in the port service industry in Indonesia. The sampling method used was stratified random sampling with a total of 406 respondents. The results show positive relationships between the variables. This implies that service quality is an important aspect to focus on in order for the Port of Tanjung Priok to achieve customer satisfaction and loyalty.

  4. EMPIRICAL STUDY OF DIFFERENT FACTORS EFFECTS ON ARTICLES PUBLICATION REGARDING SURVEY INTERVIEWER CHARACTERISTICS USING MULTILEVEL REGRESSION MODEL

    Directory of Open Access Journals (Sweden)

    Alina MOROŞANU

    2013-06-01

    Full Text Available The purpose of this research is to evaluate the effects that several factors may have on the publication of articles about survey interviewer characteristics. To this end, the author reviewed the existing literature from the various fields in which articles on survey interviewer characteristics have been published and which can be found in online article databases. The analysis was performed on 243 articles written by researchers between 1949 and 2012. Using the statistical software R and applying a multilevel regression model, the results showed that the period in which the articles were written and the interaction between the number of authors and the number of pages have the greatest effect on their publication in journals with a given level of impact factor.

  5. Assessing models of speciation under different biogeographic scenarios; An empirical study using multi-locus and RNA-seq analyses

    Science.gov (United States)

    Edwards, Taylor; Tollis, Marc; Hsieh, PingHsun; Gutenkunst, Ryan N.; Liu, Zhen; Kusumi, Kenro; Culver, Melanie; Murphy, Robert W.

    2016-01-01

    Evolutionary biology often seeks to decipher the drivers of speciation, and much debate persists over the relative importance of isolation and gene flow in the formation of new species. Genetic studies of closely related species can assess if gene flow was present during speciation, because signatures of past introgression often persist in the genome. We test hypotheses on which mechanisms of speciation drove diversity among three distinct lineages of desert tortoise in the genus Gopherus. These lineages offer a powerful system to study speciation, because different biogeographic patterns (physical vs. ecological segregation) are observed at opposing ends of their distributions. We use 82 samples collected from 38 sites, representing the entire species' distribution and generate sequence data for mtDNA and four nuclear loci. A multilocus phylogenetic analysis in *BEAST estimates the species tree. RNA-seq data yield 20,126 synonymous variants from 7665 contigs from two individuals of each of the three lineages. Analyses of these data using the demographic inference package ∂a∂i serve to test the null hypothesis of no gene flow during divergence. The best-fit demographic model for the three taxa is concordant with the *BEAST species tree, and the ∂a∂i analysis does not indicate gene flow among any of the three lineages during their divergence. These analyses suggest that divergence among the lineages occurred in the absence of gene flow and in this scenario the genetic signature of ecological isolation (parapatric model) cannot be differentiated from geographic isolation (allopatric model).

  6. Empirical particle transport model for tokamaks

    International Nuclear Information System (INIS)

    Petravic, M.; Kuo-Petravic, G.

    1986-08-01

    A simple empirical particle transport model has been constructed with the purpose of gaining insight into the L- to H-mode transition in tokamaks. The aim was to construct the simplest possible model which would reproduce the measured density profiles in the L-regime, and also produce a qualitatively correct transition to the H-regime without having to assume a completely different transport mode for the bulk of the plasma. Rather than using completely ad hoc constructions for the particle diffusion coefficient, we assume D = (1/5)χ_total, where χ_total ≅ χ_e is the thermal diffusivity, and then use the κ_e = n_e·χ_e values derived from experiments. The observed temperature profiles are then automatically reproduced, but nontrivially, the correct density profiles are also obtained, for realistic fueling rates and profiles. Our conclusion is that it is sufficient to reduce the transport coefficients within a few centimeters of the surface to produce the H-mode behavior. An additional simple assumption, concerning the particle mean-free path, leads to a convective transport term which reverses sign a few centimeters inside the surface, as required by the H-mode density profiles
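
    The assumed relations translate directly into a calculation: given an experimentally derived κ_e profile and the electron density n_e, the thermal diffusivity is χ_e = κ_e/n_e and the particle diffusivity follows as D = χ_total/5 ≈ χ_e/5. The sketch below uses invented profile values purely for illustration.

    ```python
    import numpy as np

    # Hypothetical radial profiles, illustrative values only.
    n_e = np.array([4.0e19, 3.5e19, 2.5e19, 1.0e19])      # electron density [m^-3]
    kappa_e = np.array([8.0e19, 7.0e19, 5.0e19, 2.0e19])  # kappa_e = n_e * chi_e [1/(m*s)]

    chi_e = kappa_e / n_e   # thermal diffusivity chi_e [m^2/s]
    D = chi_e / 5.0         # particle diffusivity under the model's D = chi_total/5 assumption
    print(D)
    ```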

  7. Conceptual Model of IT Infrastructure Capability and Its Empirical Justification

    Institute of Scientific and Technical Information of China (English)

    QI Xianfeng; LAN Boxiong; GUO Zhenwei

    2008-01-01

    Increasing importance has been attached to the value of information technology (IT) infrastructure in today's organizations. The development of efficacious IT infrastructure capability enhances business performance and brings sustainable competitive advantage. This study analyzed IT infrastructure capability in a holistic way and then presented a conceptual model of IT infrastructure capability. IT infrastructure capability was categorized into sharing capability, service capability, and flexibility. This study then empirically tested the model using a set of survey data collected from 145 firms. Three factors emerged from the factor analysis, namely IT flexibility, IT service capability, and IT sharing capability, which agree with those in the conceptual model built in this study.

  8. Empirical study of the metal-nitride-oxide-semiconductor device characteristics deduced from a microscopic model of memory traps

    International Nuclear Information System (INIS)

    Ngai, K.L.; Hsia, Y.

    1982-01-01

    A graded-nitride gate dielectric metal-nitride-oxide-semiconductor (MNOS) memory transistor exhibiting superior device characteristics is presented and analyzed based on a qualitative microscopic model of the memory traps. The model is further reviewed to interpret some generic properties of the MNOS memory transistors including memory window, erase-write speed, and the retention-endurance characteristic features

  9. Empirical modeling of dynamic behaviors of pneumatic artificial muscle actuators.

    Science.gov (United States)

    Wickramatunge, Kanchana Crishan; Leephakpreeda, Thananchai

    2013-11-01

    Pneumatic Artificial Muscle (PAM) actuators yield muscle-like mechanical actuation with a high force-to-weight ratio, a soft and flexible structure, and adaptable compliance, making them suitable for rehabilitation and prosthetic appliances for the disabled as well as for humanoid robots and machines. The present study develops empirical models of PAM actuators, that is, a PAM coupled with pneumatic control valves, in order to describe their dynamic behaviors for practical control design and usage. Empirical modeling is an efficient approach to computer-based modeling based on observations of real behaviors. The different dynamic behaviors of each PAM actuator are due not only to the structures of the PAM actuators themselves, but also to variations in their material properties introduced during manufacturing. To overcome these difficulties, the proposed empirical models are derived experimentally from the real physical behaviors of the PAM actuators being implemented. In case studies, the simulated results, which agree well with the experimental results, show that the proposed methodology can be applied to describe the dynamic behaviors of real PAM actuators. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  10. Semi-empirical corrosion model for Zircaloy-4 cladding

    International Nuclear Information System (INIS)

    Nadeem Elahi, Waseem; Atif Rana, Muhammad

    2015-01-01

    The Zircaloy-4 cladding tube in Pressurized Water Reactors (PWRs) undergoes corrosion due to fast neutron flux, coolant temperature, and water chemistry. The thickness of the Zircaloy-4 cladding tube may decrease as corrosion penetration increases, which may affect the integrity of the fuel rod. The tin content and intermetallic particle sizes have been found to significantly affect the magnitude of the oxide thickness. In the present study we have developed a semi-empirical corrosion model by modifying the Arrhenius equation for corrosion with acceleration factors for tin content and accumulative annealing. This developed model has been incorporated into a fuel performance computer code. The cladding oxide thickness data obtained from the semi-empirical corrosion model have been compared with experimental results, i.e., numerous cases of measured cladding oxide thickness from UO2 fuel rods irradiated in various PWRs. The results of both studies lie within an error band of 20 μm, which confirms the validity of the developed semi-empirical corrosion model. Key words: corrosion, Zircaloy-4, tin content, accumulative annealing factor, semi-empirical, PWR. (author)
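
    The modified-Arrhenius structure described here can be sketched generically: an Arrhenius oxide-growth rate scaled by multiplicative acceleration factors for tin content and accumulated annealing. The coefficients, factors, and temperature below are placeholders for illustration, not the values calibrated in the cited model.

    ```python
    import numpy as np

    R_GAS = 8.314  # universal gas constant [J/(mol*K)]

    def oxide_growth_rate(temp_k, pre_exp, activation_energy, f_tin=1.0, f_anneal=1.0):
        # Generic Arrhenius-type rate scaled by acceleration factors; all parameters
        # are placeholders, not the published model's fitted coefficients.
        return pre_exp * np.exp(-activation_energy / (R_GAS * temp_k)) * f_tin * f_anneal

    # Illustrative: accumulate oxide over one year at a constant clad surface temperature.
    temps = np.full(365, 620.0)  # [K], hypothetical
    rates = [oxide_growth_rate(T, pre_exp=1.0e6, activation_energy=1.2e5,
                               f_tin=1.1, f_anneal=0.95) for T in temps]
    thickness = np.cumsum(rates)  # arbitrary units per day
    print(f"illustrative oxide buildup after one year: {thickness[-1]:.3f} (arbitrary units)")
    ```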

  11. Educational Inequality and Income Inequality: An Empirical Study on China

    Science.gov (United States)

    Yang, Jun; Huang, Xiao; Li, Xiaoyu

    2009-01-01

    Based on the endogenous growth theory, this paper uses the Gini coefficient to measure educational inequality and studies the empirical relationship between educational inequality and income inequality through a simultaneous equation model. The results show that: (1) Income inequality leads to educational inequality while the reduction of…
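
    The Gini coefficient used in this record to quantify educational inequality can be computed directly from individual schooling data; the sample below is invented for illustration.

    ```python
    import numpy as np

    def gini(values):
        # Gini coefficient from sorted values: G = (n + 1 - 2 * sum(cumsum) / sum) / n
        x = np.sort(np.asarray(values, dtype=float))
        n = x.size
        cum = np.cumsum(x)
        return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

    years_of_schooling = [0, 4, 6, 6, 9, 9, 12, 12, 16, 16]  # hypothetical sample
    print(round(gini(years_of_schooling), 3))
    ```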

  12. Connecting theoretical and empirical studies of trait-mediated interactions

    Czech Academy of Sciences Publication Activity Database

    Bolker, B.; Holyoak, M.; Křivan, Vlastimil; Rowe, L.; Schmitz, O.

    2003-01-01

    Roč. 84, č. 5 (2003), s. 1101-1114 ISSN 0012-9658 Institutional research plan: CEZ:AV0Z5007907 Keywords : community models * competition * empirical study Subject RIV: EH - Ecology, Behaviour Impact factor: 3.701, year: 2003

  13. An Empirical Study on the Influence of PBL Teaching Model on College Students' Critical Thinking Ability

    Science.gov (United States)

    Zhou, Zhen

    2018-01-01

    The critical thinking ability is an indispensable ability of contemporary college students, and the PBL teaching model avoids the shortcomings of traditional teaching methods, making it more suitable for the development trend of university curriculum teaching reform in China. In order to understand the influence of the PBL teaching mode on college…

  14. An Empirical Investigation into a Subsidiary Absorptive Capacity Process Model

    DEFF Research Database (Denmark)

    Schleimer, Stephanie; Pedersen, Torben

    2011-01-01

    and empirically test a process model of absorptive capacity. The setting of our empirical study is 213 subsidiaries of multinational enterprises and the focus is on the capacity of these subsidiaries to successfully absorb best practices in marketing strategy from their headquarters. This setting allows us...... to explore the process model in its entirety, including different drivers of subsidiary absorptive capacity (organizational mechanisms and contextual drivers), the three original dimensions of absorptive capacity (recognition, assimilation, application), and related outcomes (implementation...... and internalization of the best practice). The study’s findings reveal that managers have discretion in promoting absorptive capacity through the application of specific organizational mechanism and that the impact of contextual drivers on subsidiary absorptive capacity is not direct, but mediated...

  15. Empirical questions for collective-behaviour modelling

    Indian Academy of Sciences (India)

    2015-02-04

    Feb 4, 2015 ... The collective behaviour of groups of social animals has been an active topic of study across many disciplines, and has a long history of modelling. Classical models have been successful in capturing the large-scale patterns formed by animal aggregations, but fare less well in accounting for details, ...

  16. Empirical studies on changes in oil governance

    Science.gov (United States)

    Kemal, Mohammad

    Regulation of the oil and gas sector is consequential to the economies of oil-producing countries. In the literature, there are two types of regulation: indirect regulation through taxes and tariffs or direct regulation through the creation of a National Oil Company (NOC). In the 1970s, many oil-producing countries nationalized their oil and gas sectors by creating NOCs and giving them ownership rights over oil and gas resources. In light of the success of Norway in regulating its oil and gas resources, over the past two decades several countries have changed their oil governance by changing the rights given to NOCs from ownership rights to mere access rights, like those of other oil companies. However, the empirical literature on these changes in oil governance is quite thin. Thus, this dissertation explores three research questions to investigate these changes in oil governance empirically. First, I investigate empirically the impact of the changes in oil governance on aggregate domestic income. By employing a difference-in-differences method, I show that a country which changed its oil governance increases its GDP per capita by 10%. However, the impact is different for different types of political institution. Second, by observing the changes in oil governance in Indonesia, I explore the impact of the changes on learning-by-doing and learning spillover effects in offshore exploration drilling. By employing an econometric model which includes interaction terms between various experience variables and an oil governance change dummy, I show that the change in oil governance in Indonesia enhances learning-by-doing by the rigs and learning spillover in a basin. Lastly, the impact of the changes in oil governance on expropriation risk and the extraction path is explored. By employing a difference-in-differences method, this essay shows that the changes in oil governance reduce expropriation risk, and that the impact differs with the size of the resource stock.
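
    The 10% GDP per-capita estimate comes from a difference-in-differences design; the sketch below reproduces that regression structure on simulated panel data with invented country, year, and treatment variables.

    ```python
    # Difference-in-differences sketch on simulated panel data (not the dissertation's dataset):
    # log GDP per capita on a governance-change indicator, a post-reform indicator, and their interaction.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    countries, years = 40, 20
    df = pd.DataFrame({
        "country": np.repeat(np.arange(countries), years),
        "year": np.tile(np.arange(years), countries),
    })
    df["changed_governance"] = (df["country"] < 15).astype(int)  # hypothetical treated group
    df["post"] = (df["year"] >= 10).astype(int)                  # hypothetical reform date
    true_effect = 0.10                                           # assumed effect on log GDP per capita
    df["log_gdp_pc"] = (8.0 + 0.02 * df["year"] + 0.3 * df["changed_governance"]
                        + true_effect * df["changed_governance"] * df["post"]
                        + rng.normal(scale=0.05, size=len(df)))

    did = smf.ols("log_gdp_pc ~ changed_governance * post", df).fit()
    print(round(did.params["changed_governance:post"], 3))  # recovers roughly +0.10, i.e. about +10%
    ```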

  17. Empirical Study on Sustainable Opportunities Recognition. A Polyvinyl Chloride (PVC Joinery Industry Analysis Using Augmented Sustainable Development Process Model

    Directory of Open Access Journals (Sweden)

    Eduard-Gabriel Ceptureanu

    2017-09-01

    Full Text Available This paper analyzes factors influencing the recognition of sustainable opportunities by using an augmented sustainable development process model. The conceptual model used two main factors, Knowledge and Motivation, and one moderating variable, Social embeddedness. We investigated entrepreneurs from the PVC joinery industry and concluded that while market orientation and sustainable entrepreneurial orientation clearly and positively influence sustainable opportunity recognition, other variables such as knowledge of the natural/communal environment, awareness of sustainable development or focus on success have less support. Among all variables analyzed, perception of the threat to the natural/communal environment and altruism toward others have the weakest impact on opportunity recognition. Finally, we concluded that social embeddedness has a moderating effect on sustainable opportunity recognition, even though the results were mixed.

  18. Developing a suitable model for supplier selection based on supply chain risks: an empirical study from Iranian pharmaceutical companies.

    Science.gov (United States)

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between new product development and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies of scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to fundamentally rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of a literature review, expert opinion acquisition, statistical analysis and the application of MADM models to data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined. Based on them, a framework is constructed. According to the results of the statistical analysis and the MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that managing delivery risk can make an important contribution to mitigating the risk in the pharmaceutical industry.

  19. Developing a Suitable Model for Supplier Selection Based on Supply Chain Risks: An Empirical Study from Iranian Pharmaceutical Companies

    Science.gov (United States)

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between new product development and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies of scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to fundamentally rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of a literature review, expert opinion acquisition, statistical analysis and the application of MADM models to data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined. Based on them, a framework is constructed. According to the results of the statistical analysis and the MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that managing delivery risk can make an important contribution to mitigating the risk in the pharmaceutical industry. PMID:24250442
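
    The abstract refers to MADM (multi-attribute decision-making) models without naming a specific method; a common choice is TOPSIS, sketched below on invented supplier risk scores and weights. This illustrates the class of methods, not the study's actual procedure or data.

    ```python
    import numpy as np

    # Illustrative TOPSIS ranking of three suppliers on risk-related criteria
    # (e.g. delivery, quality, financial risk); scores and weights are invented.
    scores = np.array([[7.0, 8.0, 6.0],   # supplier A
                       [9.0, 6.0, 7.0],   # supplier B
                       [6.0, 9.0, 8.0]])  # supplier C
    weights = np.array([0.5, 0.3, 0.2])
    benefit = np.array([True, True, True])  # higher score = lower risk here

    norm = scores / np.linalg.norm(scores, axis=0)       # vector-normalize each criterion
    weighted = norm * weights
    ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
    anti_ideal = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))
    d_plus = np.linalg.norm(weighted - ideal, axis=1)
    d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
    closeness = d_minus / (d_plus + d_minus)             # rank suppliers by closeness to the ideal
    print(dict(zip("ABC", closeness.round(3))))
    ```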

  20. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    Science.gov (United States)

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  1. Empirical Model for Mobile Learning and their Factors. Case Study: Universities Located in the Urban City of Guadalajara, México

    Directory of Open Access Journals (Sweden)

    Juan Mejía Trejo

    2015-10-01

    Full Text Available Information and communication technologies (ICT) are producing new and innovative teaching-learning processes. The research question we focused on is: What are the empirical model and the factors for mobile learning at universities located within the Metropolitan Zone of Guadalajara, in Jalisco, México? Our research is grounded in a documentary study that selected variables used by specialists in m-learning by means of the Analytic Hierarchy Process (AHP). Three factors were identified: Technology (TECH); Contents Teaching-Learning Management and Styles (CTLMS); and Professor and Student Role (PSR). We used 13 dimensions and 60 variables. 20 professors and 800 students in social sciences courses participated in the study; they came from 7 universities located in the Urban City of Guadalajara, during the 2013-2014 school cycles (24 months). We applied questionnaires and the data were analyzed by structural equation modeling (SEM), using the EQS 6.1 software. The results suggest that 9 of the 60 variables have the most influence on improving interaction with the m-learning model within the universities.

  2. Empirically Derived Dehydration Scoring and Decision Tree Models for Children With Diarrhea: Assessment and Internal Validation in a Prospective Cohort Study in Dhaka, Bangladesh.

    Science.gov (United States)

    Levine, Adam C; Glavis-Bloom, Justin; Modi, Payal; Nasrin, Sabiha; Rege, Soham; Chu, Chieh; Schmid, Christopher H; Alam, Nur H

    2015-08-18

    Diarrhea remains one of the most common and most deadly conditions affecting children worldwide. Accurately assessing dehydration status is critical to determining treatment course, yet no clinical diagnostic models for dehydration have been empirically derived and validated for use in resource-limited settings. In the Dehydration: Assessing Kids Accurately (DHAKA) prospective cohort study, a random sample of children under 5 with acute diarrhea was enrolled between February and June 2014 in Bangladesh. Local nurses assessed children for clinical signs of dehydration on arrival, and then serial weights were obtained as subjects were rehydrated. For each child, the percent weight change with rehydration was used to classify subjects with severe dehydration (>9% weight change), some dehydration (3-9%), or no dehydration (<3%); candidate clinical findings were then used to derive the DHAKA Dehydration Score and the DHAKA Dehydration Tree. Models were assessed for their accuracy using the area under their receiver operating characteristic curve (AUC) and for their reliability through repeat clinical exams. Bootstrapping was used to internally validate the models. A total of 850 children were enrolled, with 771 included in the final analysis. Of the 771 children included in the analysis, 11% were classified with severe dehydration, 45% with some dehydration, and 44% with no dehydration. Both the DHAKA Dehydration Score and DHAKA Dehydration Tree had significant AUCs of 0.79 (95% CI = 0.74, 0.84) and 0.76 (95% CI = 0.71, 0.80), respectively, for the diagnosis of severe dehydration. Additionally, the DHAKA Dehydration Score and DHAKA Dehydration Tree had significant positive likelihood ratios of 2.0 (95% CI = 1.8, 2.3) and 2.5 (95% CI = 2.1, 2.8), respectively, and significant negative likelihood ratios of 0.23 (95% CI = 0.13, 0.40) and 0.28 (95% CI = 0.18, 0.44), respectively, for the diagnosis of severe dehydration. Both models demonstrated 90% agreement between independent raters and good
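
    The accuracy metrics quoted above (AUC and positive/negative likelihood ratios) can be reproduced from a set of predicted scores and a dichotomized outcome; the sketch below uses simulated data and an arbitrary cut-off, not the DHAKA cohort.

    ```python
    # Diagnostic-accuracy metrics on simulated data: AUC, LR+ and LR-.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    severe = rng.integers(0, 2, size=200)              # 1 = severe dehydration (simulated)
    score = 2.0 * severe + rng.normal(size=200)        # hypothetical clinical score

    print("AUC:", round(roc_auc_score(severe, score), 2))

    positive = score > 1.0                             # arbitrary cut-off for a "positive" score
    sensitivity = np.mean(positive[severe == 1])
    specificity = np.mean(~positive[severe == 0])
    print("LR+:", round(sensitivity / (1 - specificity), 2),
          "LR-:", round((1 - sensitivity) / specificity, 2))
    ```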

  3. An Empirical Study of Wrappers for Feature Subset Selection based on a Parallel Genetic Algorithm: The Multi-Wrapper Model

    KAUST Repository

    Soufan, Othman

    2012-09-01

    Feature selection is the first task of any learning approach applied in the major fields of biomedicine, bioinformatics, robotics, natural language processing and social networking. In the feature subset selection problem, a search methodology with a proper criterion seeks to find the best subset of features describing the data (relevance) and achieving better performance (optimality). Wrapper approaches are feature selection methods which are wrapped around a classification algorithm and use a performance measure to select the best subset of features. We analyze the proper design of the objective function for the wrapper approach and highlight an objective based on several classification algorithms. We compare the wrapper approaches to different feature selection methods based on distance and information-based criteria. Significant improvement in performance, computational time, and selection of minimally sized feature subsets is achieved by combining different objectives for the wrapper model. In addition, considering various classification methods in the feature selection process could lead to a global solution of desirable characteristics.
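
    The wrapper idea itself (scoring candidate feature subsets by the cross-validated performance of a classifier) is easy to sketch; the example below uses a simple greedy forward search with scikit-learn rather than the parallel genetic algorithm used in the thesis.

    ```python
    # Greedy-forward wrapper feature selection: subsets are scored by cross-validated
    # classifier accuracy. The cited work searches subsets with a parallel genetic
    # algorithm instead; only the wrapper objective is illustrated here.
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)
    clf = LogisticRegression(max_iter=5000)

    selected, remaining, best_score = [], list(range(X.shape[1])), 0.0
    for _ in range(5):  # add up to five features
        gains = {f: cross_val_score(clf, X[:, selected + [f]], y, cv=5).mean() for f in remaining}
        f_best, score = max(gains.items(), key=lambda kv: kv[1])
        if score <= best_score:
            break
        selected.append(f_best)
        remaining.remove(f_best)
        best_score = score

    print("selected feature indices:", selected, "cross-validated accuracy:", round(best_score, 3))
    ```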

  4. Improving the desolvation penalty in empirical protein pKa modeling

    DEFF Research Database (Denmark)

    Olsson, Mats Henrik Mikael

    2012-01-01

    Unlike atomistic and continuum models, empirical pKa prediction methods need to include desolvation contributions explicitly. This study describes a new empirical desolvation method based on the Born solvation model. The new desolvation model was evaluated by high-level Poisson-Boltzmann
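
    In the Born picture, the desolvation penalty for burying a charge follows in closed form from the charge, an effective Born radius, and the two dielectric constants. The sketch below uses the textbook Born expression with illustrative parameters; it is not the parameterization developed in the cited work.

    ```python
    # Born-model desolvation penalty for moving a charge from water into a
    # low-dielectric protein interior. q in elementary charges, radius in angstroms.
    COULOMB_KCAL = 332.06  # kcal*Angstrom/(mol*e^2), standard electrostatic conversion factor

    def born_desolvation(q, radius, eps_protein=8.0, eps_water=78.5):
        # Delta-G(eps) = -(332.06 * q^2 / (2R)) * (1 - 1/eps); the penalty is the difference
        # between the protein-interior and water values of this solvation energy.
        return (COULOMB_KCAL * q**2 / (2.0 * radius)) * (1.0 / eps_protein - 1.0 / eps_water)

    # Example: a unit charge with a 2 Angstrom Born radius buried in the protein interior.
    print(round(born_desolvation(q=1.0, radius=2.0), 1), "kcal/mol")
    ```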

  5. Mobile Systems Development: An Empirical Study

    DEFF Research Database (Denmark)

    Hosbond, J. H.

    As part of an ongoing study on mobile systems development (MSD), this paper presents preliminary findings of research-in-progress. The debate on mobility in research has so far been dominated by mobile HCI, technological innovations, and socio-technical issues related to new and emerging mobile...... work patterns. This paper is about the development of mobile systems. Based on an on-going empirical study I present four case studies of companies, each with different products or services to offer and diverging ways of establishing and sustaining a successful business in the mobile industry. From...... the case studies I propose a five-layered framework for understanding the structure and segmentation of the industry. This leads to an analysis of the different modes of operation within the mobile industry, exemplified by the four case studies. The contribution of this paper is therefore two-fold: (1) I...

  6. Parametrisation and empirical model for bedload movement in the multibar coastal zone on the base of field radiotracer study

    International Nuclear Information System (INIS)

    Owczarczyk, A.; Wierzchnicki, R.; Pruszak, Z.

    1999-01-01

    The near-shore zone is the most interesting sea region in coastal engineering. In this region, the most significant changes in coastal morphodynamics take place, due to intensive sediment transport generated by waves and currents. The processes occurring in this zone are of great importance for coast protection and hydrotechnic activities as well as recreation. They are extremely complicated due to their stochastic character in the time and space domain. The most valuable information concerning the dynamics of bedload transport and its local character is provided by field surveys. Such investigations are carried out under natural conditions and take into account the characteristic properties of the region. The subject of this work was the study of bedload movement under multibar conditions

  7. Does size matter? : An empirical study modifying Fama & French's three factor model to detect size-effect based on turnover in the Swedish markets

    OpenAIRE

    Boros, Daniel; Eriksson, Claes

    2014-01-01

    This thesis investigates whether the estimation of the cost of equity (or the expected return) in the Swedish market should incorporate an adjustment for a company’s size. This is what is commonly known as the size effect, first presented by Banz (1980); it has later become part of models for estimating the cost of equity, such as Fama & French’s three-factor model (1992). The Fama & French model was developed based on empirical research. Since the model was developed, the research on the...
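
    The three-factor specification referred to here is a time-series regression of a portfolio's excess return on the market, size (SMB) and value (HML) factors. The sketch below runs that regression on simulated factor returns; in practice the factors would come from a source such as the Kenneth French data library.

    ```python
    # Three-factor time-series regression on simulated monthly data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 240  # 20 years of monthly observations
    data = pd.DataFrame({
        "mkt_rf": rng.normal(0.005, 0.04, n),  # market excess return
        "smb": rng.normal(0.002, 0.03, n),     # size factor: small minus big
        "hml": rng.normal(0.003, 0.03, n),     # value factor: high minus low
    })
    data["excess_ret"] = (0.001 + 1.1 * data.mkt_rf + 0.6 * data.smb
                          + 0.2 * data.hml + rng.normal(0, 0.02, n))

    ff3 = smf.ols("excess_ret ~ mkt_rf + smb + hml", data).fit()
    print(ff3.params.round(3))  # the smb loading is what captures the size effect discussed above
    ```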

  8. An Empirical Study of Audit Expectation Gap in Hungary

    OpenAIRE

    Judit Füredi-Fülöp

    2015-01-01

    The audit expectation gap has preoccupied the finance and accounting profession for a long time. Considerable research has been conducted into this issue and attempts have been made to provide an accurate definition of the audit expectation gap, model this concept and assess the possibilities of its narrowing. Also, a number of studies investigate whether there is an audit expectation gap in several researched regions. The objectives of empirical studies on the structure and nature of the aud...

  9. Visual Semiotics & Uncertainty Visualization: An Empirical Study.

    Science.gov (United States)

    MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M

    2012-12-01

    This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results.

  10. Empirical model for estimating the surface roughness of machined ...

    African Journals Online (AJOL)

    Empirical model for estimating the surface roughness of machined ... as well as surface finish is one of the most critical quality measures in mechanical products. ... various cutting speeds have been developed using regression analysis software.

  11. Empirical model for estimating the surface roughness of machined ...

    African Journals Online (AJOL)

    Michael Horsfall

    one of the most critical quality measures in mechanical products. In the ... Keywords: cutting speed, centre lathe, empirical model, surface roughness, Mean absolute percentage deviation ... The factors considered were work piece properties.

  12. Advanced empirical estimate of information value for credit scoring models

    Directory of Open Access Journals (Sweden)

    Martin Řezáč

    2011-01-01

    Full Text Available Credit scoring is a term for a wide spectrum of predictive models and their underlying techniques that aid financial institutions in granting credit. These methods decide who will get credit, how much credit they should get, and what further strategies will enhance the profitability of the borrowers to the lenders. Many statistical tools are available for measuring the quality, in the sense of predictive power, of credit scoring models. Because it is impossible to use a scoring model effectively without knowing how good it is, quality indexes like the Gini coefficient, the Kolmogorov-Smirnov statistic and the Information value are used to assess the quality of a given credit scoring model. The paper deals primarily with the Information value, sometimes called divergence. Commonly it is computed by discretising the data into bins using deciles, in which case one constraint is required to be met: the number of cases has to be nonzero for all bins. If this constraint is not fulfilled, there are some practical procedures for preserving finite results. As an alternative to the empirical estimates, one can use kernel smoothing theory, which allows one to estimate unknown densities and consequently, using some numerical method of integration, to estimate the Information value. The main contribution of this paper is a proposal and description of the empirical estimate with supervised interval selection. This advanced estimate is based on the requirement to have at least k observations of scores of both good and bad clients in each considered interval, where k is a positive integer. A simulation study shows that this estimate outperforms both the empirical estimate using deciles and the kernel estimate. Furthermore, it shows a strong dependence on the choice of the parameter k. If we choose too small a value, we overestimate the Information value, and vice versa. An adjusted square root of the number of bad clients seems to be a reasonable compromise.
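
    The standard decile-based empirical estimate described in this record is straightforward to compute; the sketch below does so on simulated good/bad scores. Enforcing at least k goods and k bads per interval, the paper's proposed refinement, is not implemented here.

    ```python
    # Decile-based empirical Information Value of a score on simulated clients.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(4)
    n = 5000
    bad = rng.binomial(1, 0.1, n)                    # 1 = bad client
    score = rng.normal(loc=-0.8 * bad, scale=1.0)    # lower scores for bad clients (simulated)

    bins = pd.qcut(score, 10, duplicates="drop")     # decile binning
    tab = pd.crosstab(bins, bad)                     # counts of good (0) and bad (1) per bin
    # Note: if any bin had zero goods or zero bads, the log below would blow up --
    # exactly the nonzero-bin constraint discussed in the abstract.
    dist_good = tab[0] / tab[0].sum()
    dist_bad = tab[1] / tab[1].sum()
    iv = ((dist_good - dist_bad) * np.log(dist_good / dist_bad)).sum()
    print("Information value:", round(iv, 3))
    ```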

  13. Reference Evapotranspiration Variation Analysis and Its Approaches Evaluation of 13 Empirical Models in Sub-Humid and Humid Regions: A Case Study of the Huai River Basin, Eastern China

    Directory of Open Access Journals (Sweden)

    Meng Li

    2018-04-01

    Full Text Available Accurate and reliable estimations of reference evapotranspiration (ET0) are imperative in irrigation scheduling and water resource planning. This study aims to analyze the spatiotemporal trends of the monthly ET0 calculated by the Penman–Monteith FAO-56 (PMF-56) model in the Huai River Basin (HRB), eastern China. However, the use of the PMF-56 model is limited by the insufficiency of climatic input parameters at various sites, and the alternative is to employ simple empirical models. In this study, the performances of 13 empirical models were evaluated against the PMF-56 model by using three common statistical approaches: relative root-mean-square error (RRMSE), mean absolute error (MAE), and the Nash–Sutcliffe coefficient (NS). Additionally, a linear regression model was adopted to calibrate and validate the performances of the empirical models during the 1961–2000 and 2001–2014 time periods, respectively. The results showed that the ETPMF increased initially and then decreased on a monthly timescale. On a daily timescale, the Valiantzas3 (VA3) was the best alternative model for estimating the ET0, while the Penman (PEN), WMO, Trabert (TRA), and Jensen-Haise (JH) models showed poor results with large errors. Before calibration, the determination coefficients of the temperature-based, radiation-based, and combined models showed the opposite changing trends compared to the mass transfer-based models. After calibration, the performance of each empirical model in each month improved greatly except for the PEN model. If comprehensive climatic datasets were available, the VA3 would be the recommended model because it had a simple computation procedure and was also very well correlated linearly to the PMF-56 model. Given the data availability, the temperature-based, radiation-based, Valiantzas1 (VA1) and Valiantzas2 (VA2) models were recommended during April–October in the HRB and other similar regions, and also, the mass transfer-based models were
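
    As an example of the temperature-based class of empirical models evaluated here, the Hargreaves equation estimates ET0 from daily temperatures and extraterrestrial radiation. The coefficients below are the standard published ones, not values recalibrated for the Huai River Basin, and the inputs are illustrative.

    ```python
    # Hargreaves temperature-based ET0 estimate [mm/day].
    def hargreaves_et0(t_mean, t_max, t_min, ra):
        # ra: extraterrestrial radiation expressed as equivalent evaporation [mm/day]
        return 0.0023 * ra * (t_mean + 17.8) * (t_max - t_min) ** 0.5

    # Example for a warm summer day with Ra of about 16 mm/day equivalent.
    print(round(hargreaves_et0(t_mean=27.0, t_max=33.0, t_min=21.0, ra=16.0), 2), "mm/day")
    ```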

  14. Identifiability of Baranyi model and comparison with empirical ...

    African Journals Online (AJOL)

    In addition, performance of the Baranyi model was compared with those of the empirical modified Gompertz and logistic models and Huang models. Higher values of R2, modeling efficiency and lower absolute values of mean bias error, root mean square error, mean percentage error and chi-square were obtained with ...

  15. Predictive ability of logistic regression, auto-logistic regression and neural network models in empirical land-use change modeling: a case study

    NARCIS (Netherlands)

    Lin, Y.P.; Chu, H.J.; Wu, C.F.; Verburg, P.H.

    2011-01-01

    The objective of this study is to compare the abilities of logistic, auto-logistic and artificial neural network (ANN) models for quantifying the relationships between land uses and their drivers. In addition, the application of the results obtained by the three techniques is tested in a dynamic

  16. An empirical study on entrepreneurs' personal characteristics

    Directory of Open Access Journals (Sweden)

    Ahmad Ahmadkhani

    2012-04-01

    Full Text Available The personality of an entrepreneur is one of the most important factors in achieving success by creating jobs and opportunities. In this paper, we present an empirical study on the personal characteristics of students who are expected to act as entrepreneurs and create jobs in seven fields: accounting, computer science, mechanical engineering, civil engineering, metallurgical engineering, electrical engineering and drawing. Our study measures the level of entrepreneurship along the aspects of accepting reasonable risk, locus of control, the need for success, mental health, pragmatism, tolerance of ambiguity, dreaming and the sense of challenge. We uniformly distribute 133 questionnaires among undergraduate students in all seven groups and analyze the results using Student's t-test. Our investigation indicates that all students accept a reasonable amount of risk, maintain a sufficient locus of control and are eager for success. In addition, our tests indicate that students believe they maintain a sufficient level of mental health, have a strong sense of pragmatism and can handle ambiguity and challenges.

  17. Forecasting Inflation through Econometrics Models: An Empirical ...

    African Journals Online (AJOL)

    This article aims at modeling and forecasting inflation in Pakistan. For this purpose a number of econometric approaches are implemented and their results are compared. In ARIMA models, adding additional lags for p and/or q necessarily reduced the sum of squares of the estimated residuals. When a model is estimated ...
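
    The point made in this record about extra ARIMA lags always lowering the residual sum of squares is why order selection normally relies on a penalized criterion such as AIC. The sketch below fits an ARIMA model to a simulated inflation-like series with statsmodels and reports the AIC and a short forecast; the data and the (1,0,1) order are illustrative.

    ```python
    # ARIMA fit and forecast on a simulated inflation-like series.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(6)
    n = 200
    inflation = np.zeros(n)
    for t in range(1, n):  # simple AR(1) stand-in for a monthly inflation series
        inflation[t] = 0.4 + 0.7 * inflation[t - 1] + rng.normal(scale=0.3)

    fit = ARIMA(inflation, order=(1, 0, 1)).fit()
    print("AIC:", round(fit.aic, 1))                    # penalizes extra lags, unlike the residual SSE
    print("next 6 periods:", fit.forecast(steps=6).round(2))
    ```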

  18. Successful intelligence and giftedness: an empirical study

    Directory of Open Access Journals (Sweden)

    Mercedes Ferrando

    Full Text Available The aim of our research is to look into the diversity within gifted and talented students. This is important to better understand their complexity and thus offer more appropriate educational programs. There are rather few empirical works which attempt to identify high-ability profiles (giftedness and talent) that actually exist beyond the theoretical level. The present work intends to single out the different patterns or profiles resulting from the combination of the successful intelligence abilities (analytical, synthetic and practical), as defined by Sternberg. A total of 431 students from the Region of Murcia participated in this study. These students performed the Aurora Battery tasks (Chart, Grigorenko, & Sternberg, 2008), designed to measure analytical, practical and creative intelligence. Analytically gifted (n=27), practically gifted (n=33) and creatively gifted (n=34) students were identified, taking as criterion scores equal to or higher than an IQ of 120 on each intelligence. Different Q-factor analyses were carried out for the three groups of students, in such a way that students were grouped according to their similarities. A total of 10 profiles showing how successful intelligence abilities are combined were obtained, which has made it possible to support the theory put forward by Sternberg (2000): the analytical, practical and creative talent profiles, as well as the resulting combinations (the analytical-practical, analytical-creative and practical-creative profiles), along with the consummate balance talent (high performance in the three types of intelligence).

  19. Business models of micro businesses: Empirical evidence from creative industries

    Directory of Open Access Journals (Sweden)

    Pfeifer Sanja

    2017-01-01

    Full Text Available A business model describes how a business identifies and creates value for customers and how it organizes itself to capture some of this value in a profitable manner. Previous studies of business models in creative industries have only recently identified the unresolved issues in this field of research. The main objective of this article is to analyse the structure and diversity of business models and to deduce how these components interact or change in the context of micro and small businesses in creative services such as advertising, architecture and design. The article uses a qualitative approach. Case studies and semi-structured, in-depth interviews with six owners/managers of micro businesses in Croatia provide rich data. Structural coding in the data analysis was performed manually. The qualitative analysis has indicative relevance for the assessment and comparison of business models; however, it provides insights into which components of business models seem to be consolidated and which seem to contribute to the diversity of business models in creative industries. The article contributes to the advancement of empirical evidence and conceptual constructs that might lead to more advanced methodological approaches and the proposition of core typologies or classifications of business models in creative industries. In addition, a more detailed mapping of the different choices available in managing value creation, value capture or value networking might be a valuable help for owners/managers who want to change or cross-fertilize their business models.

  20. Political economy models and agricultural policy formation : empirical applicability and relevance for the CAP

    OpenAIRE

    Zee, van der, F.A.

    1997-01-01

    This study explores the relevance and applicability of political economy models for the explanation of agricultural policies. Part I (chapters 4-7) takes a general perspective and evaluates the empirical applicability of voting models and interest group models to agricultural policy formation in industrialised market economies. Part II (chapters 8-11) focuses on the empirical applicability of political economy models to agricultural policy formation and agricultural policy developmen...

  1. On the empirical relevance of the transient in opinion models

    Energy Technology Data Exchange (ETDEWEB)

    Banisch, Sven, E-mail: sven.banisch@universecity.d [Mathematical Physics, Physics Department, Bielefeld University, 33501 Bielefeld (Germany); Institute for Complexity Science (ICC), 1249-078 Lisbon (Portugal); Araujo, Tanya, E-mail: tanya@iseg.utl.p [Research Unit on Complexity in Economics (UECE), ISEG, TULisbon, 1249-078 Lisbon (Portugal); Institute for Complexity Science (ICC), 1249-078 Lisbon (Portugal)

    2010-07-12

    While the number and variety of models to explain opinion exchange dynamics is huge, attempts to justify the model results using empirical data are relatively rare. As linking to real data is essential for establishing model credibility, this Letter develops an empirical confirmation experiment by which an opinion model is related to real election data. The model is based on a representation of opinions as a vector of k bits. Individuals interact according to the principle that similarity leads to interaction and interaction leads to still more similarity. In the comparison to real data we concentrate on the transient opinion profiles that form during the dynamic process. An artificial election procedure is introduced which allows to relate transient opinion configurations to the electoral performance of candidates for which data are available. The election procedure based on the well-established principle of proximity voting is repeatedly performed during the transient period and remarkable statistical agreement with the empirical data is observed.

  2. On the empirical relevance of the transient in opinion models

    International Nuclear Information System (INIS)

    Banisch, Sven; Araujo, Tanya

    2010-01-01

    While the number and variety of models to explain opinion exchange dynamics is huge, attempts to justify the model results using empirical data are relatively rare. As linking to real data is essential for establishing model credibility, this Letter develops an empirical confirmation experiment by which an opinion model is related to real election data. The model is based on a representation of opinions as a vector of k bits. Individuals interact according to the principle that similarity leads to interaction and interaction leads to still more similarity. In the comparison to real data we concentrate on the transient opinion profiles that form during the dynamic process. An artificial election procedure is introduced which allows to relate transient opinion configurations to the electoral performance of candidates for which data are available. The election procedure based on the well-established principle of proximity voting is repeatedly performed during the transient period and remarkable statistical agreement with the empirical data is observed.
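
    The model summarized in these two records (opinions as k-bit vectors, with similarity making interaction more likely and each interaction increasing similarity) is simple to simulate. The sketch below uses illustrative parameter values and a basic update rule consistent with that description, not the authors' exact implementation or their artificial election procedure.

    ```python
    # Minimal simulation of the k-bit opinion dynamics described above.
    import numpy as np

    rng = np.random.default_rng(5)
    n_agents, k_bits, steps = 200, 8, 20000
    opinions = rng.integers(0, 2, size=(n_agents, k_bits))

    for _ in range(steps):
        i, j = rng.choice(n_agents, size=2, replace=False)
        similarity = np.mean(opinions[i] == opinions[j])
        if similarity < 1.0 and rng.random() < similarity:   # similarity leads to interaction...
            diff = np.flatnonzero(opinions[i] != opinions[j])
            b = rng.choice(diff)
            opinions[i, b] = opinions[j, b]                   # ...and interaction leads to more similarity

    # Transient opinion profile: how many agents currently hold each distinct opinion string.
    profiles, counts = np.unique(opinions, axis=0, return_counts=True)
    print(len(profiles), "distinct opinions; largest camp:", counts.max())
    ```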

  3. NOx PREDICTION FOR FBC BOILERS USING EMPIRICAL MODELS

    Directory of Open Access Journals (Sweden)

    Jiří Štefanica

    2014-02-01

    Full Text Available Reliable prediction of NOx emissions can provide useful information for boiler design and fuel selection. Recently used kinetic prediction models for FBC boilers are overly complex and require large computing capacity. Even so, there are many uncertainties in the case of FBC boilers. An empirical modeling approach for NOx prediction has been used exclusively for PCC boilers. No reference is available for modifying this method for FBC conditions. This paper presents possible advantages of empirical modeling based prediction of NOx emissions for FBC boilers, together with a discussion of its limitations. Empirical models are reviewed, and are applied to operation data from FBC boilers used for combusting Czech lignite coal or coal-biomass mixtures. Modifications to the model are proposed in accordance with theoretical knowledge and prediction accuracy.

  4. Bankruptcy risk model and empirical tests

    Science.gov (United States)

    Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M.; Urošević, Branko; Stanley, H. Eugene

    2010-01-01

    We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor—the debt-to-asset ratio R—in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes’s theorem and relate the conditional probability that a bankrupt firm has a ratio R with the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in assets change during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees—although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers. PMID:20937903

  5. Empirical studies of regulatory restructuring and incentives

    Science.gov (United States)

    Knittel, Christopher Roland

    This dissertation examines the actions of firms when faced with regulatory restructuring. Chapter I examines the equilibrium pricing behavior of local exchange telephone companies under a variety of market structures. In particular, the pricing behavior of three services are analyzed: residential local service, business local service, and intraLATA toll service. Beginning in 1984, a variety of market structure changes have taken place in the local telecommunications industry. I analyze differences in the method of price-setting regulation and the restrictions on entry. Specifically, the relative pricing behavior under rate of return and price cap regulation is analyzed, as well as the impact of entry in the local exchange and intraLATA toll service markets. In doing so, I estimate an empirical model that accounts for the stickiness of rates in regulated industries that is based on firm and regulator decision processes in the presence of adjustment costs. I find that, faced with competitive pressures that reduce rates in one service, incumbent firm rates increase in other services, thereby reducing the benefits from competition. In addition, the findings suggest that price cap regulation leads to higher rates relative to rate-of-return regulation. Chapter 2 analyzes the pricing and investment behavior of electricity firms. Electricity and natural gas markets have traditionally been serviced by one of two market structures. In some markets, electricity and natural gas are sold by a dual-product regulated monopolist, while in other markets, electricity and natural gas are sold by separate single-product regulated monopolies. This paper analyzes the relative pricing and investment decisions of electricity firms operating in the two market structures. The unique relationship between these two products imply that the relative incentives of single and dual-product firms are likely to differ. Namely electricity and natural gas are substitutes in consumption while natural

  6. Empirical soot formation and oxidation model

    Directory of Open Access Journals (Sweden)

    Boussouara Karima

    2009-01-01

    Full Text Available Internal combustion engines can be modelled following different approaches, depending on the type of problem to be simulated. A diesel combustion model has been developed and implemented in a full-cycle engine simulation; the model accounts for transient fuel spray evolution, fuel-air mixing, ignition, combustion, and soot pollutant formation. Models of turbulent diffusion-flame combustion apply to the diffusion flames encountered in industry, typically in diesel engines. Particulate emission represents one of the most deleterious pollutants generated during diesel combustion. Stringent standards on particulate emission, along with specific emphasis on the size of emitted particulates, have resulted in increased interest in a fundamental understanding of the mechanisms of soot particulate formation and oxidation in internal combustion engines. A phenomenological numerical model which can predict the particle size distribution of the emitted soot will be very useful in explaining the observed results and will also be of use in developing better particulate control techniques. The diesel engine chosen for simulation is a version of the Caterpillar 3406. We employ a standard finite-volume computational fluid dynamics code, KIVA3V-RELEASE2.

  7. Are Models Easier to Understand than Code? An Empirical Study on Comprehension of Entity-Relationship (ER) Models vs. Structured Query Language (SQL) Code

    Science.gov (United States)

    Sanchez, Pablo; Zorrilla, Marta; Duque, Rafael; Nieto-Reyes, Alicia

    2011-01-01

    Models in Software Engineering are considered as abstract representations of software systems. Models highlight relevant details for a certain purpose, whereas irrelevant ones are hidden. Models are supposed to make system comprehension easier by reducing complexity. Therefore, models should play a key role in education, since they would ease the…

  8. Ranking Multivariate GARCH Models by Problem Dimension: An Empirical Evaluation

    NARCIS (Netherlands)

    M. Caporin (Massimiliano); M.J. McAleer (Michael)

    2011-01-01

    textabstractIn the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. Recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. In this paper, we provide an empirical comparison of a set of models,

  9. Empirical Comparison of Criterion Referenced Measurement Models

    Science.gov (United States)

    1976-10-01

    rument consisting of a large number of items. The models would then be used to estimate the mastery score using a smaller and more realistic number of items. This approach is empirical and more directly oriented to practical applications where testing time and the

  10. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Onar, Arzu; Bayliss, Jon; Ludwig, Larry

    2009-01-01

    In this experiment, an empirical model to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage was developed. This empirical model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirm the rare polycrystalline structure on one of the three whiskers studied. FIB cross-section of the card guides verified that the tin finish was bright tin.

  11. Measurement and correlation study of silymarin solubility in supercritical carbon dioxide with and without a cosolvent using semi-empirical models and back-propagation artificial neural networks

    Directory of Open Access Journals (Sweden)

    Gang Yang

    2017-09-01

    Full Text Available The solubility data of compounds in supercritical fluids and the correlation between the experimental solubility data and predicted solubility data are crucial to the development of supercritical technologies. In the present work, the solubility data of silymarin (SM) in both pure supercritical carbon dioxide (SCCO2) and SCCO2 with added cosolvent were measured at temperatures ranging from 308 to 338 K and pressures from 8 to 22 MPa. The experimental data were fit with three semi-empirical density-based models (the Chrastil, Bartle, and Mendez-Santiago and Teja models) and a back-propagation artificial neural network (BPANN) model. Interaction parameters for the models were obtained and the percentage of average absolute relative deviation (AARD%) in each calculation was determined. The correlation results were in good agreement with the experimental data. A comparison among the four models revealed that the experimental solubility data were better fit by the BPANN model, with AARDs ranging from 1.14% to 2.15% for silymarin in pure SCCO2 and with added cosolvent. The results provide fundamental data for designing the extraction of SM or the preparation of its particles using SCCO2 techniques.
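
    The Chrastil model mentioned here is linear in its parameters after a log transform, ln S = k ln ρ + a/T + b, so it can be fitted by ordinary least squares. The sketch below does this on invented solubility data; the Bartle, Mendez-Santiago and Teja, and BPANN models from the paper are not reproduced.

    ```python
    # Fit the Chrastil model ln(S) = k*ln(rho) + a/T + b on invented data.
    import numpy as np

    T = np.array([308.0, 308.0, 318.0, 318.0, 328.0, 328.0])    # temperature [K]
    rho = np.array([700.0, 850.0, 650.0, 820.0, 600.0, 800.0])  # CO2 density [kg/m^3]
    S = np.array([0.8, 2.1, 0.9, 2.6, 1.0, 3.2])                # solubility (illustrative units)

    A = np.column_stack([np.log(rho), 1.0 / T, np.ones_like(T)])
    (k, a, b), *_ = np.linalg.lstsq(A, np.log(S), rcond=None)

    S_pred = np.exp(A @ np.array([k, a, b]))
    aard = 100 * np.mean(np.abs(S_pred - S) / S)  # AARD% as used in the abstract
    print(f"k={k:.2f}, a={a:.1f}, b={b:.2f}, AARD={aard:.1f}%")
    ```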

  12. Empirical modeling of information communication technology usage ...

    African Journals Online (AJOL)

    The study will play a vital role in filling up the research gap that exist in technology acceptance behaviour among business education faculties across tertiary institutions in Nigeria and the rest of Africa. Future research on the subject matter may attempt to investigate the moderating roles of voluntariness and compulsory ...

  13. Power spectrum model of visual masking: simulations and empirical data.

    Science.gov (United States)

    Serrano-Pedraza, Ignacio; Sierra-Vázquez, Vicente; Derrington, Andrew M

    2013-06-01

    In the study of the spatial characteristics of the visual channels, the power spectrum model of visual masking is one of the most widely used. When the task is to detect a signal masked by visual noise, this classical model assumes that the signal and the noise are previously processed by a bank of linear channels and that the power of the signal at threshold is proportional to the power of the noise passing through the visual channel that mediates detection. The model also assumes that this visual channel will have the highest ratio of signal power to noise power at its output. According to this, there are masking conditions where the highest signal-to-noise ratio (SNR) occurs in a channel centered in a spatial frequency different from the spatial frequency of the signal (off-frequency looking). Under these conditions the channel mediating detection could vary with the type of noise used in the masking experiment and this could affect the estimation of the shape and the bandwidth of the visual channels. It is generally believed that notched noise, white noise and double bandpass noise prevent off-frequency looking, and high-pass, low-pass and bandpass noises can promote it independently of the channel's shape. In this study, by means of a procedure that finds the channel that maximizes the SNR at its output, we performed numerical simulations using the power spectrum model to study the characteristics of masking caused by six types of one-dimensional noise (white, high-pass, low-pass, bandpass, notched, and double bandpass) for two types of channel's shape (symmetric and asymmetric). Our simulations confirm that (1) high-pass, low-pass, and bandpass noises do not prevent the off-frequency looking, (2) white noise satisfactorily prevents the off-frequency looking independently of the shape and bandwidth of the visual channel, and interestingly we proved for the first time that (3) notched and double bandpass noises prevent off-frequency looking only when the noise

  14. Combining Empirical and Stochastic Models for Extreme Floods Estimation

    Science.gov (United States)

    Zemzami, M.; Benaabidate, L.

    2013-12-01

    Hydrological models can be defined as physical, mathematical or empirical. The latter class uses mathematical equations independent of the physical processes involved in the hydrological system. The linear regression and Gradex (Gradient of Extreme values) are classic examples of empirical models. However, conventional empirical models are still used as a tool for hydrological analysis by probabilistic approaches. In many regions in the world, watersheds are not gauged. This is true even in developed countries where the gauging network has continued to decline as a result of the lack of human and financial resources. Indeed, the obvious lack of data in these watersheds makes it impossible to apply some basic empirical models for daily forecast. So we had to find a combination of rainfall-runoff models in which it would be possible to create our own data and use them to estimate the flow. The estimated design floods would be a good choice to illustrate the difficulties facing the hydrologist for the construction of a standard empirical model in basins where hydrological information is rare. The construction of the climate-hydrological model, which is based on frequency analysis, was established to estimate the design flood in the Anseghmir catchments, Morocco. The choice of using this complex model returns to its ability to be applied in watersheds where hydrological information is not sufficient. It was found that this method is a powerful tool for estimating the design flood of the watershed and also other hydrological elements (runoff, volumes of water...).The hydrographic characteristics and climatic parameters were used to estimate the runoff, water volumes and design flood for different return periods.

  15. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

    Vocational Teachers and Professionalism - A Model Based on Empirical Analyses Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkorg, Kolb and Wenger). Models can sometimes be criticized… emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and develop a model concerning teachers… at vocational colleges, based on empirical data in a specific context, a vocational teacher-training course in Denmark. By offering a basis and concepts for the analysis of practice, such a model is meant to support the development of vocational teachers’ professionalism at courses and in organizational contexts…

  16. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of the Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, have been presented in the first part of the paper. In this part, they are applied for testing modelling hypothesis in the framework of the thermal analysis of an actual building. Sensitivity analysis tools have been first used to identify the parts of the model that can be really tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for model behaviour improvement has been finally obtained by optimisation techniques. This example of application shows how model parameters space analysis is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  17. Empiric Study about the Mix Fiscal Policy – Economic Development

    Directory of Open Access Journals (Sweden)

    Alexandru Sergiu Ocnean

    2006-11-01

    Full Text Available Economic development is one of the primary objectives of any government. Fiscal policy represents one of the most effective tools that government authorities can use to influence the economy. With this in mind, this paper focuses on the connection between economic development and fiscal policy and proposes an empirical study based on a sample of 21 European countries. Using a simple pooled data model, we tried to distinguish the relations between the evolution of GDP per capita, as a proxy for economic development, and the evolution of three fiscal policy variables, namely the tax burden, the public expenditure to GDP ratio and the budget deficit to GDP ratio.

  19. Empirical Models for the Estimation of Global Solar Radiation in ...

    African Journals Online (AJOL)

    Empirical Models for the Estimation of Global Solar Radiation in Yola, Nigeria. ... and average daily wind speed (WS) for an interval of three years (2010–2012), measured using various instruments for Yola, with the recorded data collected from the Center for Atmospheric Research (CAR), Anyigba, are presented and analyzed.

  20. Empirical Model for Predicting Rate of Biogas Production | Adamu ...

    African Journals Online (AJOL)

    The rate of biogas production using cow manure as substrate was monitored in two laboratory-scale batch reactors (13 liter and 108 liter capacities). Two empirical models, based on the Gompertz and the modified logistic equations, were fitted to the experimental data by non-linear regression analysis using the Solver tool ...
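
    A minimal sketch of this curve-fitting step, using non-linear least squares in Python rather than the Solver tool; the modified Gompertz and modified logistic forms below are the standard ones, and the cumulative biogas values are hypothetical.

```python
# Illustrative sketch (not the authors' worksheet): fitting modified Gompertz
# and modified logistic curves to cumulative biogas data by non-linear least
# squares. Data values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 2, 4, 6, 8, 10, 14, 18, 22, 26, 30], dtype=float)          # days
P = np.array([0, 0.4, 1.5, 3.8, 7.0, 10.5, 16.0, 19.5, 21.2, 21.8, 22.0])   # L biogas

def gompertz(t, Pmax, Rmax, lam):
    return Pmax * np.exp(-np.exp(Rmax * np.e / Pmax * (lam - t) + 1.0))

def logistic(t, Pmax, Rmax, lam):
    return Pmax / (1.0 + np.exp(4.0 * Rmax * (lam - t) / Pmax + 2.0))

for name, f in [("Gompertz", gompertz), ("Logistic", logistic)]:
    p, _ = curve_fit(f, t, P, p0=(22.0, 1.5, 3.0), maxfev=10000)
    rss = np.sum((P - f(t, *p)) ** 2)
    print(name, "Pmax=%.2f Rmax=%.2f lambda=%.2f RSS=%.3f" % (*p, rss))
```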

  1. A semi-empirical two phase model for rocks

    International Nuclear Information System (INIS)

    Fogel, M.B.

    1993-01-01

    This article presents data from an experiment simulating a spherically symmetric tamped nuclear explosion. A semi-empirical two-phase model of the measured response in tuff is presented. A comparison is made of the computed peak stress and velocity versus scaled range and that measured on several recent tuff events

  2. Empirical Modeling of the Plasmasphere Dynamics Using Neural Networks

    Science.gov (United States)

    Zhelavskaya, I. S.; Shprits, Y.; Spasojevic, M.

    2017-12-01

    We present a new empirical model for reconstructing the global dynamics of the cold plasma density distribution based only on solar wind data and geomagnetic indices. Utilizing the density database obtained using the NURD (Neural-network-based Upper hybrid Resonance Determination) algorithm for the period of October 1, 2012 - July 1, 2016, in conjunction with solar wind data and geomagnetic indices, we develop a neural network model that is capable of globally reconstructing the dynamics of the cold plasma density distribution for 2 ≤ L ≤ 6 and all local times. We validate and test the model by measuring its performance on independent datasets withheld from the training set and by comparing the model predicted global evolution with global images of He+ distribution in the Earth's plasmasphere from the IMAGE Extreme UltraViolet (EUV) instrument. We identify the parameters that best quantify the plasmasphere dynamics by training and comparing multiple neural networks with different combinations of input parameters (geomagnetic indices, solar wind data, and different durations of their time history). We demonstrate results of both local and global plasma density reconstruction. This study illustrates how global dynamics can be reconstructed from local in-situ observations by using machine learning techniques.
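
    The sketch below illustrates the general idea only (not the NURD pipeline or the published model): a feed-forward neural network regresses log plasma density on the time history of a geomagnetic index, a solar wind parameter, L-shell and local time. All inputs and targets are synthetic, and the architecture is an assumption.

```python
# Minimal sketch of the idea: regress log density on index/solar-wind time
# history plus position (L, MLT). Data are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
kp_hist = rng.uniform(0, 9, size=(n, 24))      # last 24 h of a Kp-like index
sw_v    = rng.uniform(300, 700, size=(n, 1))   # solar wind speed proxy
L       = rng.uniform(2, 6, size=(n, 1))
mlt     = rng.uniform(0, 24, size=(n, 1))
X = np.hstack([kp_hist, sw_v, L, np.sin(2*np.pi*mlt/24), np.cos(2*np.pi*mlt/24)])
# Synthetic target loosely mimicking density decreasing with L and activity
y = 3.5 - 0.4*L[:, 0] - 0.05*kp_hist.mean(axis=1) + 0.1*rng.standard_normal(n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
model.fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```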

  3. Modelling the Factors that Affect Individuals' Utilisation of Online Learning Systems: An Empirical Study Combining the Task Technology Fit Model with the Theory of Planned Behaviour

    Science.gov (United States)

    Yu, Tai-Kuei; Yu, Tai-Yi

    2010-01-01

    Understanding learners' behaviour, perceptions and influence in terms of learner performance is crucial to predict the use of electronic learning systems. By integrating the task-technology fit (TTF) model and the theory of planned behaviour (TPB), this paper investigates the online learning utilisation of Taiwanese students. This paper provides a…

  4. Generalized Empirical Likelihood-Based Focused Information Criterion and Model Averaging

    Directory of Open Access Journals (Sweden)

    Naoya Sueishi

    2013-07-01

    Full Text Available This paper develops model selection and averaging methods for moment restriction models. We first propose a focused information criterion based on the generalized empirical likelihood estimator. We address the issue of selecting an optimal model, rather than a correct model, for estimating a specific parameter of interest. Then, this study investigates a generalized empirical likelihood-based model averaging estimator that minimizes the asymptotic mean squared error. A simulation study suggests that our averaging estimator can be a useful alternative to existing post-selection estimators.

  5. An empirical model for friction in cold forging

    DEFF Research Database (Denmark)

    Bay, Niels; Eriksen, Morten; Tan, Xincai

    2002-01-01

    With a system of simulative tribology tests for cold forging the friction stress for aluminum, steel and stainless steel provided with typical lubricants for cold forging has been determined for varying normal pressure, surface expansion, sliding length and tool/work piece interface temperature… of normal pressure and tool/work piece interface temperature. The model is verified by process testing measuring friction at varying reductions in cold forward rod extrusion. KEY WORDS: empirical friction model, cold forging, simulative friction tests.

  6. An empirical study of an agglomeration network

    International Nuclear Information System (INIS)

    Zhang, Yichao; Zhang, Zhaochun; Guan, Jihong

    2007-01-01

    Recently, researchers have reported many models mimicking real network evolution growth, among which some are based on network aggregation growth. However, until now, relatively few experiments have been reported. Accordingly, in this paper, photomicrographs of real materials (the agglomeration in the filtrate of slurry formed by a GaP-nanoparticle conglomerate dispersed in water) are analyzed within the framework of complex network theory. By data mapping from the photomicrographs we generate undirected networks; as the definition of degree we adopt the number of a pixel's nearest neighbors, while adjacent pixels define a connection or an edge. We study the topological structure of these networks, including degree distribution, clustering coefficient and average path length. In addition, we discuss the self-similarity and synchronizability of the networks. We find that the synchronizability of high-concentration agglomeration is better than that of low-concentration agglomeration; we also find that agglomeration networks possess good self-similar features.

  7. An Empirical Temperature Variance Source Model in Heated Jets

    Science.gov (United States)

    Khavaran, Abbas; Bridges, James

    2012-01-01

    An acoustic analogy approach is implemented that models the sources of jet noise in heated jets. The equivalent sources of turbulent mixing noise are recognized as the differences between the fluctuating and Favre-averaged Reynolds stresses and enthalpy fluxes. While in a conventional acoustic analogy only Reynolds stress components are scrutinized for their noise generation properties, it is now accepted that a comprehensive source model should include the additional entropy source term. Following Goldstein's generalized acoustic analogy, the set of Euler equations is divided into two sets of equations that govern a non-radiating base flow plus its residual components. When the base flow is considered as a locally parallel mean flow, the residual equations may be rearranged to form an inhomogeneous third-order wave equation. A general solution is written subsequently using a Green's function method, while all non-linear terms are treated as the equivalent sources of aerodynamic sound and are modeled accordingly. In a previous study, a specialized Reynolds-averaged Navier-Stokes (RANS) solver was implemented to compute the variance of thermal fluctuations that determine the enthalpy flux source strength. The main objective here is to present an empirical model capable of providing a reasonable estimate of the stagnation temperature variance in a jet. Such a model is parameterized as a function of the mean stagnation temperature gradient in the jet, and is evaluated using commonly available RANS solvers. The ensuing thermal source distribution is compared with measurements as well as computational results from a dedicated RANS solver that employs an enthalpy variance and dissipation rate model. Turbulent mixing noise predictions are presented for a wide range of jet temperature ratios from 1.0 to 3.20.

  8. Political economy models and agricultural policy formation : empirical applicability and relevance for the CAP

    NARCIS (Netherlands)

    Zee, van der F.A.

    1997-01-01

    This study explores the relevance and applicability of political economy models for the explanation of agricultural policies. Part I (chapters 4-7) takes a general perspective and evaluates the empirical applicability of voting models and interest group models to agricultural policy

  9. an empirical study of poverty in calabar and its environs.

    African Journals Online (AJOL)

    DJFLEX

    2009-06-17

    … one of the poorest nations in the world (CBN, 2001). Specifically, these … rural development in poor regions, inadequate access to education ...

  10. Lessons from empirical studies in product and service variety management.

    OpenAIRE

    Lyons, Andrew C.L.

    2013-01-01

    For many years, a trend for businesses has been to increase market segmentation and extend product and service-variety offerings in order to provide more choice for customers and gain a competitive advantage. However, relatively few variety-related empirical studies have been undertaken. In this research, two empirical studies are presented that address the impact of product and service variety on business and business function performance. In the first (service-vari...

  11. A REVIEW of WEBERIAN STUDIES ON THE OTTOMAN EMPIRE

    OpenAIRE

    MAZMAN, İbrahim

    2018-01-01

    This study examines the secondary literature on Max Weber’s (1864-1920) writings on Islam and the Ottoman Empire. It demarcates approaches prevalent in the secondary literature. Three basic themes are apparent: - Section a) concentrates on authors who applied Weber’s concepts of patrimonialism and bureaucracy to non-Ottoman countries, such as Maslovski (on the Soviet bureaucracy) and Eisenberg (on China). - Section b) focuses on authors who studied the Ottoman Empire utilizing non-Weberian, above all ...

  12. An empirical model for the melt viscosity of polymer blends

    International Nuclear Information System (INIS)

    Dobrescu, V.

    1981-01-01

    On the basis of experimental data for blends of polyethylene with different polymers, an empirical equation is proposed to describe the dependence of the melt viscosity of blends on the component viscosities and composition. The model ensures the continuity of viscosity vs. composition curves throughout the whole composition range, allows extremum values higher or lower than the viscosities of the components, and permits the calculation of flow curves of blends from the flow curves of the components and their volume fractions. (orig.)

  13. Empirical model for mineralisation of manure nitrogen in soil

    DEFF Research Database (Denmark)

    Sørensen, Peter; Thomsen, Ingrid Kaag; Schröder, Jaap

    2017-01-01

    A simple empirical model was developed for estimation of net mineralisation of pig and cattle slurry nitrogen (N) in arable soils under cool and moist climate conditions during the initial 5 years after spring application. The model is based on a Danish 3-year field experiment with measurements… of N uptake in spring barley and ryegrass catch crops, supplemented with data from the literature on the temporal release of organic residues in soil. The model estimates a faster mineralisation rate for organic N in pig slurry compared with cattle slurry, and the description includes an initial N...

  14. Empirical intrinsic geometry for nonlinear modeling and time series filtering.

    Science.gov (United States)

    Talmon, Ronen; Coifman, Ronald R

    2013-07-30

    In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.

  15. Testing the gravity p-median model empirically

    Directory of Open Access Journals (Sweden)

    Kenneth Carling

    2015-12-01

    Full Text Available Regarding the location of a facility, the presumption in the widely used p-median model is that the customer opts for the shortest route to the nearest facility. However, this assumption is problematic on free markets, since the customer is presumed to gravitate to a facility by the distance to it and its attractiveness. The recently introduced gravity p-median model offers an extension to the p-median model that accounts for this. The model is therefore potentially interesting, although it has not yet been implemented and tested empirically. In this paper, we have implemented the model in an empirical problem of locating vehicle inspections, locksmiths, and retail stores of vehicle spare-parts for the purpose of investigating its superiority to the p-median model. We found, however, the gravity p-median model to be of limited use for the problem of locating facilities, as it either gives solutions similar to the p-median model, or it gives unstable solutions due to a non-concave objective function.
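
    The toy comparison below illustrates the assumed difference between the two objectives: the p-median assigns each customer to the nearest open facility, whereas a Huff-type gravity p-median splits demand over open facilities in proportion to attractiveness times a distance-decay term. The instance, weights and decay parameter are invented, and the exact formulation used in the paper may differ.

```python
# Toy comparison of p-median vs. a Huff-type gravity p-median objective.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n_cust, n_cand, p, beta = 30, 8, 3, 0.5
cust = rng.uniform(0, 10, (n_cust, 2))
cand = rng.uniform(0, 10, (n_cand, 2))
w = rng.uniform(1, 5, n_cust)                 # customer demand weights
attract = rng.uniform(0.5, 2.0, n_cand)       # facility attractiveness
d = np.linalg.norm(cust[:, None, :] - cand[None, :, :], axis=2)

def pmedian_cost(S):
    S = list(S)
    return float(np.sum(w * d[:, S].min(axis=1)))        # nearest-facility distance

def gravity_cost(S):
    S = list(S)
    u = attract[S] * np.exp(-beta * d[:, S])              # utility of each open facility
    P = u / u.sum(axis=1, keepdims=True)                  # Huff allocation probabilities
    return float(np.sum(w * np.sum(P * d[:, S], axis=1))) # expected travelled distance

best_pm = min(combinations(range(n_cand), p), key=pmedian_cost)
best_gr = min(combinations(range(n_cand), p), key=gravity_cost)
print("p-median sites:", best_pm, "gravity p-median sites:", best_gr)
```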

  16. Data envelopment analysis in service quality evaluation: an empirical study

    Science.gov (United States)

    Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid

    2015-09-01

    Service quality is often conceptualized as the comparison between service expectations and actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the proposed method. A large number of studies have used DEA as a benchmarking tool to measure service quality, but these models do not propose a coherent performance evaluation construct and consequently fail to deliver guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.

  17. Empirical model of subdaily variations in the Earth rotation from GPS and its stability

    Science.gov (United States)

    Panafidina, N.; Kurdubov, S.; Rothacher, M.

    2012-12-01

    The model recommended by the IERS for these variations at diurnal and semidiurnal periods has been computed from an ocean tide model and comprises 71 terms in polar motion and Universal Time. In the present study we compute an empirical model of variations in the Earth rotation on tidal frequencies from homogeneously re-processed GPS-observations over 1994-2007 available as free daily normal equations. We discuss the reliability of the obtained amplitudes of the ERP variations and compare results from GPS and VLBI data to identify technique-specific problems and instabilities of the empirical tidal models.

  18. Plant water potential improves prediction of empirical stomatal models.

    Directory of Open Access Journals (Sweden)

    William R L Anderegg

    Full Text Available Climate change is expected to lead to increases in drought frequency and severity, with deleterious effects on many ecosystems. Stomatal responses to changing environmental conditions form the backbone of all ecosystem models, but are based on empirical relationships and are not well-tested during drought conditions. Here, we use a dataset of 34 woody plant species spanning global forest biomes to examine the effect of leaf water potential on stomatal conductance and test the predictive accuracy of three major stomatal models and a recently proposed model. We find that current leaf-level empirical models have consistent biases of over-prediction of stomatal conductance during dry conditions, particularly at low soil water potentials. Furthermore, the recently proposed stomatal conductance model yields increases in predictive capability compared to current models, and with particular improvement during drought conditions. Our results reveal that including stomatal sensitivity to declining water potential and consequent impairment of plant water transport will improve predictions during drought conditions and show that many biomes contain a diversity of plant stomatal strategies that range from risky to conservative stomatal regulation during water stress. Such improvements in stomatal simulation are greatly needed to help unravel and predict the response of ecosystems to future climate extremes.
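
    A minimal sketch of the general idea follows, assuming a Ball-Berry-type empirical conductance model down-regulated by a sigmoidal function of leaf water potential; the ψ term is an illustrative assumption, not the specific model proposed or tested in the paper.

```python
# Minimal sketch, not the authors' model: a Ball-Berry-type conductance
# gs = g0 + g1 * A * RH / Ca, down-regulated by an assumed sigmoid in leaf
# water potential (psi) to illustrate "stomatal sensitivity to declining
# water potential". All parameter values are placeholders.
import numpy as np

def gs_ball_berry(A, RH, Ca, g0=0.01, g1=9.0):
    """Classic empirical stomatal conductance model (mol m-2 s-1)."""
    return g0 + g1 * A * RH / Ca

def psi_modifier(psi, psi50=-2.0, slope=3.0):
    """Assumed sigmoid: ~1 at psi=0, 0.5 at psi50, approaching 0 under severe stress."""
    return 1.0 / (1.0 + np.exp(slope * (psi50 - psi)))

A, RH, Ca = 12.0, 0.6, 400.0          # umol m-2 s-1, fraction, ppm
for psi in [-0.5, -1.5, -2.5, -3.5]:  # MPa
    gs = gs_ball_berry(A, RH, Ca) * psi_modifier(psi)
    print(f"psi = {psi:4.1f} MPa  ->  gs = {gs:.3f}")
```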

  19. Empirical studies in the economics of education

    NARCIS (Netherlands)

    Ruijs, N.M.

    2015-01-01

    This thesis consists of four studies in the economics of education. All chapters use applied microeconometric techniques to answer questions on education. Chapter two studies determinants of school choice in Amsterdam. In contrast to a popular argument on school choice, quality indicators are not

  20. Sourcing of internal auditing : An empirical study

    NARCIS (Netherlands)

    Speklé, R.F.; Elten, van H.J.; Kruis, A.

    2007-01-01

    This paper studies the factors associated with organizations’ internal audit sourcing decisions, building from a previous study by Widener and Selto (henceforth W&S) [Widener, S.K., Selto, F.H., 1999. Management control systems and boundaries of the firm: why do firms outsource internal audit

  1. Organizational Learning, Strategic Flexibility and Business Model Innovation: An Empirical Research Based on Logistics Enterprises

    Science.gov (United States)

    Bao, Yaodong; Cheng, Lin; Zhang, Jian

    Using data from 237 Jiangsu logistics firms, this paper empirically studies the relationship among organizational learning capability, business model innovation, and strategic flexibility. The results are as follows: organizational learning capability has a positive impact on business model innovation performance; strategic flexibility plays a mediating role in the relationship between organizational learning capability and business model innovation; and the interaction among strategic flexibility, explorative learning and exploitative learning plays a significant role in both radical and incremental business model innovation.

  2. Behavioral corporate governance : four empirical studies

    NARCIS (Netherlands)

    van der Laan, G.

    2009-01-01

    This thesis consists of studies of corporate governance from a behavioral perspective. The chapters are about trust between chief executive officers (CEOs) and board chairpersons, asymmetric effects of corporate social responsibility on corporate financial performance, compliance with corporate

  3. Deadline Dodgers: An Empirical Study, Of Sorts.

    Science.gov (United States)

    Fletcher, J. B.

    1987-01-01

    Details (facetiously) a study that examines why English students do not turn their work in on time. Concludes that numerous distractions, such as "Dear Abby," are responsible for sidetracking students doing research. (NKA)

  4. Empirical Studies on Sovereign Fixed Income Markets

    NARCIS (Netherlands)

    J.G. Duyvesteyn (Johan)

    2015-01-01

    This dissertation presents evidence from five studies showing that sovereign fixed income markets are not always price efficient. The emerging local currency debt market has grown to a large size of more than 1.5 trillion US Dollars at the end of 2012. The factors

  5. Empirical studies in labor and education economics

    NARCIS (Netherlands)

    Ketel, N.

    2016-01-01

    The chapters of this thesis focus on policy-relevant research questions in economics of education and labor economics. All chapters make use of randomized experiments in order to answer these questions. The second chapter studies the returns to medical school in a regulated labor market, by

  6. The Environmental Assessment Technique: An Empirical Study.

    Science.gov (United States)

    Overall, Jesse U., IV

    The purpose of the study was to investigate the effectiveness of Alexander Astin's Environmental Assessment Technique (EAT) in describing the environmental press at a large public university, California State University at Los Angeles. Results indicate that EAT is a very economical method for broadly describing aspects of a university's…

  7. Household energy demand. Empirical studies concerning Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Dargay, J; Lundin, A

    1978-06-01

    This paper investigates the effects of energy policy on households in Sweden and provides the material necessary for evaluation of current and proposed energy-conservation measures. Emphasis is placed on the impact of energy taxation or price changes on household demand for electricity, heating oil, and gasoline, and the consequences of such measures for income distribution. The results of the Swedish studies of household demand for heating oil and gasoline indicate that price changes can have a considerable long-run impact on fuel utilization. In the short run, price responsiveness is notably reduced, but it is nevertheless of consequence for energy demand.

  8. Developing technology pushed breakthroughs: an empirical study

    Directory of Open Access Journals (Sweden)

    Jari Sarja

    2017-12-01

    Full Text Available Developing a technology push product that brings real novelty to the market is difficult, risky and costly. This case study analyzes success factors defined by the literature. True industrial cases, representing Finnish ICT firms in their early phase after a successful market entry, were researched for the success factor analysis. The set of previously introduced success factors was variably supported, and three new factors arose. Because technology-pushed development processes are risky, with high failure rates, the validated success factors are valuable knowledge for the management of development-intensive firms.

  9. Empirical Study on the Creative Accounting Phenomenon

    Directory of Open Access Journals (Sweden)

    Cernusca Lucian

    2016-06-01

    Full Text Available The present study aims to analyze the accounting professionals’ point of view, as opposed to that of students and master students, regarding the existence and manifestation forms of the creative accounting phenomenon. To accomplish this objective, a survey was used as the research method and a questionnaire as the research instrument. Several hypotheses are tested in order to clarify the aspects the research sets out to analyze; they are accepted or rejected on the basis of the chi-square (Karl Pearson) statistical test and the rank ordering method. Looking at the questionnaire as a whole, it can be noticed that over 50% of the accounting students questioned are not tempted to use creative accounting practices and techniques to optimize taxation without breaking the actual legal regulations. At the opposite side, more than half of the questioned accounting professionals would use these practices, without breaking the legal regulations, in order to optimize taxation. Creative accounting has a negative connotation if the accurate image of the financial position and performance is not targeted, because that image is the essential factor in elaborating and grounding accounting policies. However, the positive side of creative accounting is not excluded, provided that one appeals to the “fair” professional judgment of accounting professionals and to the good faith of managers.

  10. Guidelines for using empirical studies in software engineering education

    Directory of Open Access Journals (Sweden)

    Fabian Fagerholm

    2017-09-01

    Full Text Available Software engineering education is under constant pressure to provide students with industry-relevant knowledge and skills. Educators must address issues beyond exercises and theories that can be directly rehearsed in small settings. Industry training has similar requirements of relevance as companies seek to keep their workforce up to date with technological advances. Real-life software development often deals with large, software-intensive systems and is influenced by the complex effects of teamwork and distributed software development, which are hard to demonstrate in an educational environment. A way to experience such effects and to increase the relevance of software engineering education is to apply empirical studies in teaching. In this paper, we show how different types of empirical studies can be used for educational purposes in software engineering. We give examples illustrating how to utilize empirical studies, discuss challenges, and derive an initial guideline that supports teachers to include empirical studies in software engineering courses. Furthermore, we give examples that show how empirical studies contribute to high-quality learning outcomes, to student motivation, and to the awareness of the advantages of applying software engineering principles. Having awareness, experience, and understanding of the actions required, students are more likely to apply such principles under real-life constraints in their working life.

  11. Empirical model development and validation with dynamic learning in the recurrent multilayer perception

    International Nuclear Information System (INIS)

    Parlos, A.G.; Chong, K.T.; Atiya, A.F.

    1994-01-01

    A nonlinear multivariable empirical model is developed for a U-tube steam generator using the recurrent multilayer perceptron network as the underlying model structure. The recurrent multilayer perceptron is a dynamic neural network, very effective in the input-output modeling of complex process systems. A dynamic gradient descent learning algorithm is used to train the recurrent multilayer perceptron, resulting in an order of magnitude improvement in convergence speed over static learning algorithms. In developing the U-tube steam generator empirical model, the effects of actuator, process, and sensor noise on the training and testing sets are investigated. Learning and prediction both appear very effective, despite the presence of training and testing set noise, respectively. The recurrent multilayer perceptron appears to learn the deterministic part of a stochastic training set, and it predicts approximately a moving average response. Extensive model validation studies indicate that the empirical model can substantially generalize (extrapolate), though online learning becomes necessary for tracking transients significantly different than the ones included in the training set and slowly varying U-tube steam generator dynamics. In view of the satisfactory modeling accuracy and the associated short development time, neural-network-based empirical models in some cases appear to provide a serious alternative to first-principles models. Caution, however, must be exercised because extensive on-line validation of these models is still warranted.
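
    A conceptual sketch of a recurrent multilayer perceptron used as a one-step-ahead empirical model is shown below; it is not the paper's implementation, and the dimensions, weights and input sequence are placeholders. Training by a dynamic gradient method (e.g. backpropagation through time) is only indicated in a comment.

```python
# Conceptual sketch of a recurrent multilayer perceptron: the hidden state
# feeds back so that past inputs influence the current prediction.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 8, 2          # e.g. actuator inputs -> plant outputs

W_in  = 0.1 * rng.standard_normal((n_hid, n_in))
W_rec = 0.1 * rng.standard_normal((n_hid, n_hid))
W_out = 0.1 * rng.standard_normal((n_out, n_hid))

def simulate(U):
    """Run the recurrent MLP over an input sequence U of shape (T, n_in)."""
    h = np.zeros(n_hid)
    Y = []
    for u in U:
        h = np.tanh(W_in @ u + W_rec @ h)   # recurrent hidden layer
        Y.append(W_out @ h)                 # linear output layer
    return np.array(Y)

U = rng.standard_normal((50, n_in))         # hypothetical input sequence
Y_hat = simulate(U)
print(Y_hat.shape)                          # (50, 2) predicted outputs
# Training would adjust W_in, W_rec, W_out with a dynamic gradient method
# (e.g. backpropagation through time); it is omitted here for brevity.
```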

  12. The emotional involvement in the workplace: An empirical study

    Directory of Open Access Journals (Sweden)

    Ana María Lucia-Casademunt

    2012-06-01

    Full Text Available Purpose: Numerous studies have verified that the generation of positive attitudes in employees, such as job satisfaction or job involvement, has a positive influence on the productivity levels of companies. The current investigation focuses on identifying the profile of employees who are emotionally involved with their work activity, through the use of a set of individual, job-related and attitudinal factors. Design/methodology: A review of the literature on the main factors that affect job involvement, particularly its emotional dimension, has been completed. For its measurement at the empirical level, various items related to the psychological well-being of employees included in the IV European Working Conditions Survey-2010 are used; these items are also identified in the Job Involvement Questionnaire (Lodahl & Kejner, 1965). An empirical and multidimensional study is then carried out by applying a logistic regression model to the sample of 11,149 employees obtained from the European survey cited previously. Findings: The logistic regression model identifies the factors that are directly related to emotional involvement at the workplace. Ultimately, a definitive model is proposed that defines the profile of the European employee who is emotionally involved at the workplace: a rather aged person who has been working at his/her present place of employment for several years in a medium-sized company where there probably exists a good working relationship between workers and their superiors (social support). These employees are “white-collar” workers and have career advancement opportunities in the organizational hierarchy. They perform varied, flexible and complex tasks, which leads to satisfaction in terms of pay and working conditions. Research limitations/implications: Emotional involvement has been measured through self-awareness and, therefore, the corresponding bias in the key variable must be assumed. In addition, the causal

  13. Social amplification of risk: An empirical study

    International Nuclear Information System (INIS)

    Burns, W.; Slovic, P.; Kasperson, R.; Kasperson, J.; Renn, O.; Emani, S.

    1990-09-01

    The social amplification of risk is a theoretical framework that addresses an important deficiency of formal risk assessment methods and procedures. Typically assessments of risk from technological mishaps have been based upon the expected number of people who could be killed or injured or the amount of property that might be damaged. The diverse and consequential impacts that followed in the aftermath of the Three Mile Island accident make it clear that risk assessments that exclude the role of public perceptions of risk will greatly underestimate the potential costs of certain types of hazards. The accident at Three Mile Island produced no direct fatalities and few, if any, expected deaths due to cancer, yet few other accidents in history have had such costly societal impacts. The experience of amplified impacts argues for the development of a broadened theoretical and methodological perspective capable of integrating technical assessment of risk with public perceptions. This report presents the results to date in an ongoing research effort to better understand the complex processes by which adverse events produce impacts. In particular this research attempts to construct a framework that can account for those events that have produced, or are capable of producing, greater societal impacts than would be forecast by traditional risk assessment methods. This study demonstrates that the social amplification of risk involves interactions between sophisticated technological hazards, public and private institutions, and subtle individual and public perceptions and behaviors. These factors, and the variables underlying the intricate processes of social amplification that occur in modern society, are not fully defined and clarified in this report. 19 refs., 9 figs., 10 tabs

  14. Empirical STORM-E Model. [I. Theoretical and Observational Basis

    Science.gov (United States)

    Mertens, Christopher J.; Xu, Xiaojing; Bilitza, Dieter; Mlynczak, Martin G.; Russell, James M., III

    2013-01-01

    Auroral nighttime infrared emission observed by the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument onboard the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite is used to develop an empirical model of geomagnetic storm enhancements to E-region peak electron densities. The empirical model is called STORM-E and will be incorporated into the 2012 release of the International Reference Ionosphere (IRI). The proxy for characterizing the E-region response to geomagnetic forcing is NO+(v) volume emission rates (VER) derived from the TIMED/SABER 4.3 μm channel limb radiance measurements. The storm-time response of the NO+(v) 4.3 μm VER is sensitive to auroral particle precipitation. A statistical database of storm-time to climatological quiet-time ratios of SABER-observed NO+(v) 4.3 μm VER are fit to widely available geomagnetic indices using the theoretical framework of linear impulse-response theory. The STORM-E model provides a dynamic storm-time correction factor to adjust a known quiescent E-region electron density peak concentration for geomagnetic enhancements due to auroral particle precipitation. Part II of this series describes the explicit development of the empirical storm-time correction factor for E-region peak electron densities, and shows comparisons of E-region electron densities between STORM-E predictions and incoherent scatter radar measurements. In this paper, Part I of the series, the efficacy of using SABER-derived NO+(v) VER as a proxy for the E-region response to solar-geomagnetic disturbances is presented. Furthermore, a detailed description of the algorithms and methodologies used to derive NO+(v) VER from SABER 4.3 μm limb emission measurements is given. Finally, an assessment of key uncertainties in retrieving NO+(v) VER is presented.
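
    The sketch below illustrates the linear impulse-response idea only (not the operational STORM-E algorithm): a storm-time enhancement ratio is modelled as the convolution of a geomagnetic index with an exponential impulse response whose amplitude and decay time are fitted by least squares. The index series and parameter values are synthetic.

```python
# Conceptual sketch of a linear impulse-response fit: ratio(t) = 1 + (ap * h)(t),
# with h an exponential impulse response. All data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

dt = 3.0                                    # hours between index values
t  = np.arange(0, 240, dt)
rng = np.random.default_rng(2)
ap = np.clip(rng.gamma(2.0, 10.0, t.size) - 10.0, 0.0, None)   # synthetic ap-like index

def response(ap, amp, tau):
    h = amp * np.exp(-np.arange(0, 72, dt) / tau)              # 72 h exponential memory
    return 1.0 + np.convolve(ap, h, mode="full")[: ap.size] * dt

obs = response(ap, 5e-4, 12.0) + 0.02 * rng.standard_normal(t.size)  # synthetic "observed" ratio
(amp, tau), _ = curve_fit(lambda a, amp, tau: response(a, amp, tau), ap, obs, p0=(1e-4, 6.0))
print("fitted amplitude %.2e, decay time %.1f h" % (amp, tau))
```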

  15. The empirical study of norms of justice - an overview

    OpenAIRE

    Jacquemain, Marc

    2003-01-01

    The paper discusses what the empirical study of justice feelings is, drawing the line between this and normative study, while nevertheless arguing that there are important links between the two stances. It gives an overview of the main theories within the normative study of justice feelings

  16. Empathy at the confluence of neuroscience and empirical literary studies

    NARCIS (Netherlands)

    Burke, M.; Mangen, Anne; Kuzmicova, Anezka; Schilhab, Theresa

    2016-01-01

    The objective of this article is to review extant empirical studies of empathy in narrative reading in light of (a) contemporary literary theory, and (b) neuroscientific studies of empathy, and to discuss how a closer interplay between neuroscience and literary studies may enhance our understanding

  17. Theoretical Semi-Empirical AM1 studies of Schiff Bases

    International Nuclear Information System (INIS)

    Arora, K.; Burman, K.

    2005-01-01

    The present communication reports theoretical semi-empirical studies of Schiff bases of 2-amino pyridine, along with their comparison with their parent compounds. The theoretical studies reveal that it is the azomethine group in the Schiff bases under study that acts as the site for coordination to metals, as reported by many coordination chemists. (author)

  18. An Empirical Study about China: Gender Equity in Science Education.

    Science.gov (United States)

    Wang, Jianjun; Staver, John R.

    A data base representing a random sample of more than 10,000 grade 9 students in an SISS (Second IEA Science Study) Extended Study (SES), a key project supported by the China State Commission of Education in the late 1980s, was employed in this study to investigate gender equity in student science achievement in China. This empirical data analysis…

  19. An empirical study on the utility of BRDF model parameters and topographic parameters for mapping vegetation in a semi-arid region with MISR imagery

    Science.gov (United States)

    Multi-angle remote sensing has been proved useful for mapping vegetation community types in desert regions. Based on Multi-angle Imaging Spectro-Radiometer (MISR) multi-angular images, this study compares roles played by Bidirectional Reflectance Distribution Function (BRDF) model parameters with th...

  20. Semiphysiological versus Empirical Modelling of the Population Pharmacokinetics of Free and Total Cefazolin during Pregnancy

    Directory of Open Access Journals (Sweden)

    J. G. Coen van Hasselt

    2014-01-01

    Full Text Available This work describes a first population pharmacokinetic (PK) model for free and total cefazolin during pregnancy, which can be used for dose regimen optimization. Secondly, analysis of PK studies in pregnant patients is challenging due to study design limitations. We therefore developed a semiphysiological modeling approach, which leveraged gestation-induced changes in creatinine clearance (CrCL) into a population PK model. This model was then compared to the conventional empirical covariate model. First, a base two-compartmental PK model with linear protein binding was developed. The empirical covariate model for gestational changes consisted of a linear relationship between CL and gestational age. The semiphysiological model was based on the base population PK model and a separately developed mixed-effect model for gestation-induced change in CrCL. Estimates for baseline clearance (CL) were 0.119 L/min (RSE 58%) and 0.142 L/min (RSE 44%) for the empirical and semiphysiological models, respectively. Both models described the available PK data comparably well. However, as the semiphysiological model was based on prior knowledge of gestation-induced changes in renal function, this model may have improved predictive performance. This work demonstrates how a hybrid semiphysiological population PK approach may be of relevance in order to derive more informative inferences.
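
    A minimal sketch of the structural part of such a model is given below (assumed parameter values, not the population analysis itself): a two-compartment model for an IV bolus dose in which clearance is scaled by a gestation-dependent creatinine clearance, in the spirit of the semiphysiological approach.

```python
# Structural two-compartment sketch with clearance scaled by an assumed
# gestation-dependent creatinine clearance. Illustrative values only.
import numpy as np
from scipy.integrate import solve_ivp

def crcl(gest_week, base=100.0):
    """Assumed gestation-induced rise in creatinine clearance (mL/min)."""
    return base * (1.0 + 0.004 * gest_week)

def two_cmt(t, y, CL, Q, V1, V2):
    A1, A2 = y                                   # amounts in central / peripheral compartments
    dA1 = -(CL / V1) * A1 - (Q / V1) * A1 + (Q / V2) * A2
    dA2 =  (Q / V1) * A1 - (Q / V2) * A2
    return [dA1, dA2]

gest_week = 30.0
CL = 0.119 * 60.0 * crcl(gest_week) / 100.0      # L/h, baseline from abstract, assumed CrCL scaling
Q, V1, V2, dose = 5.0, 8.0, 10.0, 2000.0         # L/h, L, L, mg IV bolus (placeholders)

sol = solve_ivp(two_cmt, (0.0, 12.0), [dose, 0.0], args=(CL, Q, V1, V2), dense_output=True)
t = np.linspace(0.0, 12.0, 7)
print(np.round(sol.sol(t)[0] / V1, 2))           # total plasma concentration (mg/L)
```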

  1. Empirical Bayes Credibility Models for Economic Catastrophic Losses by Regions

    Directory of Open Access Journals (Sweden)

    Jindrová Pavla

    2017-01-01

    Full Text Available Catastrophic events affect various regions of the world with increasing frequency and intensity. The number of catastrophic events and the amount of economic losses vary across world regions, and part of these losses is covered by insurance. Catastrophic events in recent years have been associated with increases in premiums for some lines of business. The article focuses on estimating the amount of net premiums that would be needed to cover the total or insured catastrophic losses in different world regions, using Bühlmann and Bühlmann-Straub empirical credibility models based on data from Sigma Swiss Re 2010-2016. The empirical credibility models have been developed to estimate insurance premiums for short term insurance contracts using two ingredients: past data from the risk itself and collateral data from other sources considered to be relevant. In this article we apply these models to real data on the number of catastrophic events and on the total economic and insured catastrophe losses in seven regions of the world over the period 2009-2015. The estimated credibility premiums by world region indicate how much money will be needed in the monitored regions to cover total and insured catastrophic losses in the next year.
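
    For reference, the sketch below computes the standard Bühlmann empirical credibility premium, Z times the regional mean plus (1 - Z) times the collective mean with Z = n / (n + s²/a), on made-up regional loss figures; it is not the authors' Sigma Swiss Re computation.

```python
# Bühlmann empirical credibility premium on hypothetical regional losses.
import numpy as np

# rows = world regions, columns = annual catastrophe losses over 7 years (bn USD, invented)
X = np.array([
    [12.0, 30.0, 18.0, 25.0, 40.0, 22.0, 28.0],
    [ 5.0,  7.0,  4.0,  9.0,  6.0,  8.0,  5.0],
    [60.0, 45.0, 90.0, 70.0, 55.0, 80.0, 65.0],
])
r, n = X.shape
xbar_i = X.mean(axis=1)                   # region means
xbar   = xbar_i.mean()                    # collective mean
s2     = X.var(axis=1, ddof=1).mean()     # estimate of E[s^2(theta)] (within-region variance)
a      = max(xbar_i.var(ddof=1) - s2 / n, 0.0)   # estimate of Var[m(theta)] (between-region variance)
Z      = n / (n + s2 / a) if a > 0 else 0.0      # credibility factor
premium = Z * xbar_i + (1 - Z) * xbar
print("credibility factor Z = %.3f" % Z, "premiums:", np.round(premium, 1))
```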

  2. Non-Linear Relationship between Economic Growth and CO₂ Emissions in China: An Empirical Study Based on Panel Smooth Transition Regression Models.

    Science.gov (United States)

    Wang, Zheng-Xin; Hao, Peng; Yao, Pei-Yi

    2017-12-13

    The non-linear relationship between provincial economic growth and carbon emissions is investigated by using panel smooth transition regression (PSTR) models. The research indicates that, when Gross Domestic Product per capita (GDPpc), energy structure (Es), and urbanisation level (Ul) are taken separately as transition variables, all three models reject the null hypothesis of a linear relationship, i.e., a non-linear relationship exists. The results show that the three models all contain only one transition function but different numbers of location parameters. The model taking GDPpc as the transition variable has two location parameters, while the other two models, separately considering Es and Ul as the transition variables, both contain one location parameter. The three models applied in the study all describe the non-linear relationship between economic growth and CO₂ emissions in China well. It can also be seen that the conversion rate of the influence of Ul on per capita CO₂ emissions is significantly higher than that of GDPpc and Es on per capita CO₂ emissions.
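
    For readers unfamiliar with the PSTR specification, a compact statement of the two-regime model with one logistic transition function (the form implied by the abstract, with q the transition variable GDPpc, Es or Ul and m location parameters) is:

```latex
% Two-regime PSTR model with one transition function and m location parameters
\begin{align*}
  y_{it} &= \mu_i + \beta_0' x_{it} + \beta_1' x_{it}\, g(q_{it};\gamma,c) + \varepsilon_{it},\\
  g(q_{it};\gamma,c) &= \Bigl(1+\exp\Bigl(-\gamma \prod_{j=1}^{m}(q_{it}-c_j)\Bigr)\Bigr)^{-1},
  \qquad \gamma>0,\; c_1\le\dots\le c_m .
\end{align*}
```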

  3. Empirical spatial econometric modelling of small scale neighbourhood

    Science.gov (United States)

    Gerkman, Linda

    2012-07-01

    The aim of the paper is to model small scale neighbourhood in a house price model by implementing the newest methodology in spatial econometrics. A common problem when modelling house prices is that in practice it is seldom possible to obtain all the desired variables; in particular, variables capturing the small scale neighbourhood conditions are hard to find. If important explanatory variables are missing from the model, and the omitted variables are spatially autocorrelated and correlated with the explanatory variables included in the model, it can be shown that a spatial Durbin model is motivated. In the empirical application on new house price data from Helsinki in Finland, we find the motivation for a spatial Durbin model, estimate the model and interpret the estimates of the summary measures of impacts. The analysis shows that the model structure makes it possible to model and detect small scale neighbourhood effects when we know that they exist but lack proper variables to measure them.
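
    The spatial Durbin specification motivated in the abstract can be written compactly as follows, with W the spatial weight matrix encoding the small scale neighbourhood:

```latex
% Spatial Durbin model: spatially lagged dependent variable and regressors
\begin{equation*}
  y = \rho W y + X\beta + W X\theta + \varepsilon,
  \qquad \varepsilon \sim N(0,\sigma^2 I).
\end{equation*}
```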

  4. Regime switching model for financial data: Empirical risk analysis

    Science.gov (United States)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, the HMM is used to classify data into crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to remove the delay between regime switches and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns, for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable, power-law and GARCH models. The empirical results show that the regime switching model increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of the power-law model while remaining practical to implement for VaR measurement.
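
    A schematic two-stage sketch of this kind of approach is shown below (assumed details, not the authors' calibration): a two-state Gaussian HMM labels each day as steady or crisis, and a generalized Pareto tail fitted to the losses of each regime gives a regime-conditional VaR. It requires hmmlearn, and the return series is simulated.

```python
# Two-stage regime-switching VaR sketch: HMM regime classification, then
# peaks-over-threshold EVT per regime. Returns are simulated.
import numpy as np
from scipy.stats import genpareto
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(3)
ret = np.concatenate([rng.normal(0.0005, 0.01, 1500),    # calm period
                      rng.normal(-0.001, 0.03, 500)])    # volatile period

hmm = GaussianHMM(n_components=2, covariance_type="diag", random_state=0)
states = hmm.fit(ret.reshape(-1, 1)).predict(ret.reshape(-1, 1))

def var_pot(losses, q=0.99, u_quantile=0.90):
    """Peaks-over-threshold VaR with a generalized Pareto tail."""
    u = np.quantile(losses, u_quantile)
    exc = losses[losses > u] - u
    xi, _, sigma = genpareto.fit(exc, floc=0.0)
    p_u = exc.size / losses.size
    return u + sigma / xi * (((1 - q) / p_u) ** (-xi) - 1.0)

for s in np.unique(states):
    losses = -ret[states == s]                   # losses are negative returns
    print("regime", s, "99% VaR =", round(var_pot(losses), 4))
```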

  5. Empirical Reduced-Order Modeling for Boundary Feedback Flow Control

    Directory of Open Access Journals (Sweden)

    Seddik M. Djouadi

    2008-01-01

    Full Text Available This paper deals with the practical and theoretical implications of model reduction for aerodynamic flow-based control problems. Various aspects of model reduction are discussed that apply to partial differential equation (PDE)-based models in general. Specifically, the proper orthogonal decomposition (POD) of a high-dimensional system as well as frequency domain identification methods are discussed for initial model construction. Projections on the POD basis give a nonlinear Galerkin model. Then, a model reduction method based on empirical balanced truncation is developed and applied to the Galerkin model. The rationale for doing so is that linear subspace approximations to exact submanifolds associated with nonlinear controllability and observability require only standard matrix manipulations utilizing simulation/experimental data. The proposed method uses a chirp signal as input to produce the output in the eigensystem realization algorithm (ERA). This method estimates the system's Markov parameters that accurately reproduce the output. Balanced truncation is used to show that model reduction is still effective on ERA-produced approximated systems. The method is applied to a prototype convective flow on obstacle geometry. An H∞ feedback flow controller is designed based on the reduced model to achieve tracking and then applied to the full-order model with excellent performance.
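
    A bare-bones sketch of the eigensystem realization algorithm step mentioned above: Markov parameters are stacked into Hankel matrices, an SVD is truncated, and a reduced (A, B, C) realization is recovered. A scalar toy system stands in for the chirp-identified flow data; this is illustrative only.

```python
# Minimal ERA: Markov parameters -> Hankel matrices -> SVD -> reduced (A, B, C).
import numpy as np

# Markov parameters h[k] = C A^k B of a hypothetical stable SISO system
a_true, b_true, c_true = 0.9, 1.0, 1.0
h = np.array([c_true * a_true**k * b_true for k in range(40)])

m = 15                                           # Hankel block size
H0 = np.array([[h[i + j]     for j in range(m)] for i in range(m)])
H1 = np.array([[h[i + j + 1] for j in range(m)] for i in range(m)])

U, s, Vt = np.linalg.svd(H0)
r = 1                                            # retained model order
Ur, Vr = U[:, :r], Vt[:r, :].T
S_sq, S_isq = np.diag(s[:r] ** 0.5), np.diag(s[:r] ** -0.5)

A = S_isq @ Ur.T @ H1 @ Vr @ S_isq               # reduced state matrix
B = (S_sq @ Vr.T)[:, :1]                         # reduced input matrix
C = (Ur @ S_sq)[:1, :]                           # reduced output matrix
print("identified pole:", np.round(np.linalg.eigvals(A), 4))   # ~0.9
print("CB (first Markov parameter):", round((C @ B).item(), 4))  # ~1.0
```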

  6. Consistent constitutive modeling of metallic target penetration using empirical, analytical, and numerical penetration models

    Directory of Open Access Journals (Sweden)

    John (Jack) P. Riegel III

    2016-04-01

    Full Text Available Historically, there has been little correlation between the material properties used in (1) empirical formulae, (2) analytical formulations, and (3) numerical models. The various regressions and models may each provide excellent agreement for the depth of penetration into semi-infinite targets. But the input parameters for the empirically based procedures may have little in common with either the analytical model or the numerical model. This paper builds on previous work by Riegel and Anderson (2014) to show how the Effective Flow Stress (EFS) strength model, based on empirical data, can be used as the average flow stress in the analytical Walker–Anderson Penetration model (WAPEN) (Anderson and Walker, 1991) and how the same value may be utilized as an effective von Mises yield strength in numerical hydrocode simulations to predict the depth of penetration for eroding projectiles at impact velocities in the mechanical response regime of the materials. The method has the benefit of allowing the three techniques (empirical, analytical, and numerical) to work in tandem. The empirical method can be used for many shot line calculations, but more advanced analytical or numerical models can be employed when necessary to address specific geometries such as edge effects or layering that are not treated by the simpler methods. Developing complete constitutive relationships for a material can be costly. If the only concern is depth of penetration, such a level of detail may not be required. The effective flow stress can be determined from a small set of depth of penetration experiments in many cases, especially for long penetrators such as the L/D = 10 ones considered here, making it a very practical approach. In the process of performing this effort, the authors considered numerical simulations by other researchers based on the same set of experimental data that the authors used for their empirical and analytical assessment. The goals were to establish a

  7. Environmental ethics and wilderness management: an empirical study

    Science.gov (United States)

    William A. Valliere; Robert E. Manning

    1995-01-01

    The underlying hypothesis of this study is that environmental ethics influence public attitudes toward wilderness management. To study this hypothesis, environmental ethics were defined, categorized, and measured empirically. Additionally, attitudes toward selected wilderness management issues were measured. Associations were found between beliefs in selected...

  8. Visual Design Principles: An Empirical Study of Design Lore

    Science.gov (United States)

    Kimball, Miles A.

    2013-01-01

    Many books, designers, and design educators talk about visual design principles such as balance, contrast, and alignment, but with little consistency. This study uses empirical methods to explore the lore surrounding design principles. The study took the form of two stages: a quantitative literature review to determine what design principles are…

  9. A Causal Model of Linkages between Environment and Organizational Structure, and Its Performance Implications in International Service Distribution: An Empirical Study of Restaurant and Hotel Industry

    OpenAIRE

    Kim, Seehyung

    2005-01-01

    This research develops and tests a model of the service unit ownership and control patterns used by international service companies. The main purpose of this study is to investigate trivariate causal relationships among environmental factors, organizational structure, and perceived performance in the internationalization process of service firms. A service firm operating in foreign soil has a choice of three general entry mode strategies offering different degrees of ownership and control of ...

  10. Selection Bias in Educational Transition Models: Theory and Empirical Evidence

    DEFF Research Database (Denmark)

    Holm, Anders; Jæger, Mads

    variables. This paper, first, explains theoretically how selection on unobserved variables leads to waning coefficients and, second, illustrates empirically how selection leads to biased estimates of the effect of family background on educational transitions. Our empirical analysis using data from...

  11. Empirical study of travel mode forecasting improvement for the combined revealed preference/stated preference data–based discrete choice model

    Directory of Open Access Journals (Sweden)

    Yanfu Qiao

    2016-01-01

    Full Text Available The combined revealed preference/stated preference data–based discrete choice model captures actual choice-making constraints and reduces prediction errors, but the difference in random error variance between alternatives drawn from the two data sources limits its universality. In this article, we studied the traffic corridor between Chengdu and Longquan using the revealed preference/stated preference joint model and, separately, a single stated preference data model to predict the choice probability of each mode. We found that the revealed preference/stated preference joint model is universal only when there is a significant difference between the random error terms of the two data sets. The single stated preference data amplify travelers' stated preferences and cause prediction error. We propose a general approach that uses revealed preference data to modify the parameter estimates obtained from the single stated preference data, yielding a composite utility and reducing the prediction error. The results suggest that predictions based on the composite utility are more reasonable than those based on the single stated preference data alone, especially when forecasting the mode share of bus. The future metro line will be the main travel mode in this corridor, and 45% of passenger flow will transfer to the metro.

  12. Empirical models of wind conditions on Upper Klamath Lake, Oregon

    Science.gov (United States)

    Buccola, Norman L.; Wood, Tamara M.

    2010-01-01

    Upper Klamath Lake is a large (230 square kilometers), shallow (mean depth 2.8 meters at full pool) lake in southern Oregon. Lake circulation patterns are driven largely by wind, and the resulting currents affect the water quality and ecology of the lake. To support hydrodynamic modeling of the lake and statistical investigations of the relation between wind and lake water-quality measurements, the U.S. Geological Survey has monitored wind conditions along the lakeshore and at floating raft sites in the middle of the lake since 2005. In order to make the existing wind archive more useful, this report summarizes the development of empirical wind models that serve two purposes: (1) to fill short (on the order of hours or days) wind data gaps at raft sites in the middle of the lake, and (2) to reconstruct, on a daily basis, over periods of months to years, historical wind conditions at U.S. Geological Survey sites prior to 2005. Empirical wind models based on Artificial Neural Network (ANN) and Multivariate Adaptive Regression Splines (MARS) algorithms were compared. ANNs were better suited to simulating the 10-minute wind data that are the dependent variables of the gap-filling models, but the simpler MARS algorithm may be adequate to accurately simulate the daily wind data that are the dependent variables of the historical wind models. To further test the accuracy of the gap-filling models, the resulting simulated winds were used to force the hydrodynamic model of the lake, and the resulting simulated currents were compared to measurements from an acoustic Doppler current profiler. The error statistics indicated that the simulation of currents was degraded as compared to when the model was forced with observed winds, but probably is adequate for short gaps in the data of a few days or less. Transport seems to be less affected by the use of the simulated winds in place of observed winds. The simulated tracer concentration was similar between model results when…
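
    The gap-filling idea can be illustrated with a small regression sketch. The following is a hypothetical example (not the USGS models themselves), assuming 10-minute wind speeds at two shore stations are used to estimate missing values at a mid-lake raft site; scikit-learn's MLPRegressor stands in for the ANN.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Hypothetical 10-minute records: predictors are wind speeds at two shore stations,
# the target is the (partially missing) wind speed at a mid-lake raft.
n = 2000
shore = rng.gamma(shape=2.0, scale=2.5, size=(n, 2))
raft = 0.6 * shore[:, 0] + 0.3 * shore[:, 1] + rng.normal(0.0, 0.5, n)

missing = np.zeros(n, dtype=bool)
missing[800:950] = True                      # a synthetic multi-hour gap

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
model.fit(shore[~missing], raft[~missing])   # train on periods when the raft record exists

raft_filled = raft.copy()
raft_filled[missing] = model.predict(shore[missing])   # fill the gap from shore stations
print("filled", missing.sum(), "ten-minute intervals")
```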

  13. Empirical atom model of Vegard's law

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Lei, E-mail: zhleile2002@163.com [Materials Department, College of Electromechanical Engineering, China University of Petroleum, Qingdao 266555 (China); School of Electromechanical Automobile Engineering, Yantai University, Yantai 264005 (China); Li, Shichun [Materials Department, College of Electromechanical Engineering, China University of Petroleum, Qingdao 266555 (China)

    2014-02-01

    Vegard's law seldom holds true for most binary continuous solid solutions. When two components form a solid solution, the atomic radii of the component elements will change to satisfy the continuity requirement of electron density at the interface between component atom A and atom B, so that the atom with the larger electron density will expand and the atom with the smaller one will contract. If the expansion and contraction of the atomic radii of A and B are equal in magnitude, Vegard's law will hold true. However, the expansion and contraction of the two component atoms are not equal in most situations. The magnitude of the variation will depend on the cohesive energies of the corresponding element crystals. An empirical atom model of Vegard's law has been proposed to account for the signs of deviations according to the electron density at the Wigner–Seitz cell obtained from the Thomas–Fermi–Dirac–Cheng model.

  14. Semi-empirical neural network models of controlled dynamical systems

    Directory of Open Access Journals (Sweden)

    Mihail V. Egorchev

    2017-12-01

    Full Text Available A simulation approach is discussed for maneuverable aircraft motion, treated as a nonlinear controlled dynamical system subject to multiple and diverse uncertainties, including imperfect knowledge of the simulated plant and of its environment. The suggested approach is based on merging theoretical knowledge of the plant with training tools from the artificial neural network field. The efficiency of this approach is demonstrated using the example of motion modeling and the identification of the aerodynamic characteristics of a maneuverable aircraft. A semi-empirical recurrent neural network based model learning algorithm is proposed for the multi-step-ahead prediction problem. This algorithm sequentially states and solves numerical optimization subproblems of increasing complexity, using each solution as the initial guess for the subsequent subproblem. We also consider a procedure for acquiring a representative training set that utilizes multisine control signals.
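
    As an illustration of the multisine excitation mentioned above, the short sketch below (a generic construction, not the authors' procedure; the amplitude and frequency grid are assumed for the example) builds a control signal as a sum of sinusoids on a fixed frequency grid with randomized phases, which gives a persistent, band-limited training input.

```python
import numpy as np

def multisine(duration, dt, freqs, amplitude=1.0, seed=0):
    """Sum of sinusoids on a fixed frequency grid with random phases."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, dt)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=len(freqs))
    u = sum(np.sin(2.0 * np.pi * f * t + p) for f, p in zip(freqs, phases))
    # Normalize so the peak magnitude equals the requested amplitude.
    return t, amplitude * u / np.max(np.abs(u))

# Hypothetical excitation: 0.1-2.0 Hz band, 60 s record sampled at 100 Hz.
t, u = multisine(duration=60.0, dt=0.01, freqs=np.arange(0.1, 2.01, 0.1))
print(u.shape, u.min(), u.max())
```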

  15. Empirical flow parameters : a tool for hydraulic model validity

    Science.gov (United States)

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

    The objectives of this project were (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1 and empirical distributions of the various flow parameters, providing a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, providing a secondary way to compare such values with a conventional hydraulic modeling approach; and (3) to present ancillary values such as the Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.

  16. Investigating Low-Carbon City: Empirical Study of Shanghai

    Directory of Open Access Journals (Sweden)

    Xuan Yang

    2018-04-01

    Full Text Available A low-carbon economy is an inevitable choice for achieving economically and ecologically sustainable development, so it is important to analyze a city's low-carbon economic development level scientifically and reasonably. To achieve this goal, we propose an urban low-carbon economic development level evaluation model based on the matter-element extension method. First, we select indicators from the existing indicator system based on past research and experience. Then, a weighted matter-element model is established to evaluate a city's low-carbon level: the critical values of each index are determined through the classical domain and the section domain, and the correlation degrees of each single index and of a comprehensive index are calculated. Finally, we analyze the low-carbon economy development status and future development trends according to these results. Shanghai is selected as the empirical case. The results show that Shanghai is a city with a low-carbon level and that there is a trend of further improvement in Shanghai's low-carbon economy, although its low-carbon construction and low-carbon technology investment remain relatively low. In summary, this method provides another angle for evaluating a city's low-carbon economy.

  17. Semi-Empirical Models for Buoyancy-Driven Ventilation

    DEFF Research Database (Denmark)

    Terpager Andersen, Karl

    2015-01-01

    A literature study is presented on the theories and models dealing with buoyancy-driven ventilation in rooms. The models are categorised into four types according to how the physical process is conceived: column model, fan model, neutral plane model and pressure model. These models are analysed … and compared with a reference model. Discrepancies and differences are shown, and the deviations are discussed. It is concluded that a reliable buoyancy model based solely on the fundamental flow equations is desirable…

  18. Complex decision-making: initial results of an empirical study

    Directory of Open Access Journals (Sweden)

    Pier Luigi Baldi

    2011-09-01

    Full Text Available A brief survey of key literature on emotions and decision-making introduces an empirical study of a group of university students exploring the effects of decision-making complexity on error risk. The results clearly show that decision-making under stress in the experimental group produces significantly more errors than in the stress-free control group.

  20. Climate change, income and happiness: An empirical study for Barcelona

    NARCIS (Netherlands)

    Sekulova, F.; van den Bergh, J.C.J.M.

    2013-01-01

    The present article builds upon the results of an empirical study exploring key factors which determine life satisfaction in Barcelona. Based on a sample of 840 individuals we first look at the way changes in income, notably income reductions, associated with the current economic situation in Spain,

  1. Continued Use of a Chinese Online Portal: An Empirical Study

    Science.gov (United States)

    Shih, Hung-Pin

    2008-01-01

    The evolution of the internet has made online portals a popular means of surfing the internet. In internet commerce, understanding the post-adoption behaviour of users of online portals can help enterprises to attract new users and retain existing customers. For predicting continued use intentions, this empirical study focused on applying and…

  2. Labour flexibility in China's companies: An Empirical Study

    NARCIS (Netherlands)

    Y. Chen (Yongping)

    2001-01-01

    Labour flexibility in China's Companies: An Empirical Study explores labour flexibility at the workplace in ten manufacturing companies in China. It addresses how HRM contributes and facilitates management in coping with increasing market competition. Flexible labour practices are…

  3. Decoupling among CSR policies, programs, and impacts : An empirical study

    NARCIS (Netherlands)

    Graafland, Johan; Smid, Hugo

    2016-01-01

    There are relatively few empirical studies on the impacts of corporate social responsibility (CSR) policies and programs. This article addresses the research gap by analyzing the incidence of, and the conditions that affect, decoupling (defined as divergence) among CSR policies, implementation of

  4. Organisational Learning and Performance--An Empirical Study

    Science.gov (United States)

    Jyothibabu, C.; Pradhan, Bibhuti Bhusan; Farooq, Ayesha

    2011-01-01

    This paper explores the important question "how the learning entities--individual, group or organisation--are affecting organisational performance". The answer is important for promoting learning and improving performance. This empirical study in the leading power utility in India found that there is a positive relation between…

  5. Distribution of longshore sediment transport along the Indian coast based on empirical model

    Digital Repository Service at National Institute of Oceanography (India)

    Chandramohan, P.; Nayak, B.U.

    An empirical sediment transport model has been developed based on the longshore energy flux equation. The study indicates that the annual gross sediment transport rate is high (1.5 × 10^6 to 2.0 × 10^6 cubic meters) along the coasts...

  6. Threshold model of cascades in empirical temporal networks

    Science.gov (United States)

    Karimi, Fariba; Holme, Petter

    2013-08-01

    Threshold models try to explain the consequences of social influence like the spread of fads and opinions. Along with models of epidemics, they constitute a major theoretical framework of social spreading processes. In threshold models on static networks, an individual changes her state if a certain fraction of her neighbors has done the same. When there are strong correlations in the temporal aspects of contact patterns, it is useful to represent the system as a temporal network. In such a system, not only contacts but also the time of the contacts are represented explicitly. In many cases, bursty temporal patterns slow down disease spreading. However, as we will see, this is not a universal truth for threshold models. In this work we propose an extension of Watts’s classic threshold model to temporal networks. We do this by assuming that an agent is influenced by contacts which lie a certain time into the past. I.e., the individuals are affected by contacts within a time window. In addition to thresholds in the fraction of contacts, we also investigate the number of contacts within the time window as a basis for influence. To elucidate the model’s behavior, we run the model on real and randomized empirical contact datasets.
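
    A minimal simulation of the idea can be written directly from a contact list. The sketch below assumes a toy set of (time, source, target) events and a fractional threshold applied to contacts within a sliding time window; all parameter values are illustrative, not taken from the paper.

```python
import random
from collections import deque, defaultdict

random.seed(1)

N, T = 50, 200                  # agents and time steps (toy values)
THRESHOLD, WINDOW = 0.3, 20     # adopt if >= 30% of contacts in the last 20 steps are adopters

# Synthetic temporal contact list: (time, i, j) events, five per time step.
contacts = [(t, random.randrange(N), random.randrange(N)) for t in range(T) for _ in range(5)]

adopted = {0, 1}                                  # seed adopters
recent = defaultdict(deque)                       # per-agent contacts inside the window

for t, i, j in sorted(contacts):
    for a, b in ((i, j), (j, i)):
        recent[a].append((t, b))
        while recent[a] and recent[a][0][0] < t - WINDOW:
            recent[a].popleft()                   # drop contacts older than the window
        if a not in adopted and recent[a]:
            frac = sum(1 for _, c in recent[a] if c in adopted) / len(recent[a])
            if frac >= THRESHOLD:
                adopted.add(a)

print(f"adopters after {T} steps: {len(adopted)}/{N}")
```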

  7. Empirical membrane lifetime model for heavy duty fuel cell systems

    Science.gov (United States)

    Macauley, Natalia; Watson, Mark; Lauritzen, Michael; Knights, Shanna; Wang, G. Gary; Kjeang, Erik

    2016-12-01

    Heavy duty fuel cells used in transportation system applications such as transit buses expose the fuel cell membranes to conditions that can lead to lifetime-limiting membrane failure via combined chemical and mechanical degradation. Highly durable membranes and reliable predictive models are therefore needed in order to achieve the ultimate heavy duty fuel cell lifetime target of 25,000 h. In the present work, an empirical membrane lifetime model was developed based on laboratory data from a suite of accelerated membrane durability tests. The model considers the effects of cell voltage, temperature, oxygen concentration, humidity cycling, humidity level, and platinum in the membrane using inverse power law and exponential relationships within the framework of a general log-linear Weibull life-stress statistical distribution. The obtained model is capable of extrapolating the membrane lifetime from accelerated test conditions to use level conditions during field operation. Based on typical conditions for the Whistler, British Columbia fuel cell transit bus fleet, the model predicts a stack lifetime of 17,500 h and a membrane leak initiation time of 9200 h. Validation performed with the aid of a field operated stack confirmed the initial goal of the model to predict membrane lifetime within 20% of the actual operating time.
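
    The general structure of such a life-stress model can be sketched as follows; the functional form (log-linear in the stresses, with an inverse-power term for voltage and an Arrhenius-type term for temperature) and every coefficient below are hypothetical placeholders, not the fitted values from the study.

```python
import math

def characteristic_life_hours(voltage, temperature_K, rh_cycles_per_h, o2_fraction):
    """Log-linear life-stress relationship (illustrative coefficients only)."""
    b0 = -6.0        # intercept
    b_v = -6.0       # inverse power law in cell voltage: b_v * ln(V)
    b_t = 5000.0     # Arrhenius-type term: b_t / T
    b_rh = -0.8      # penalty for humidity cycling frequency
    b_o2 = -2.0      # penalty for oxygen concentration
    log_life = (b0 + b_v * math.log(voltage) + b_t / temperature_K
                + b_rh * rh_cycles_per_h + b_o2 * o2_fraction)
    return math.exp(log_life)

# Extrapolate from a harsh accelerated condition to a milder use-level condition.
accelerated = characteristic_life_hours(0.90, 358.0, 1.0, 0.21)
use_level = characteristic_life_hours(0.75, 338.0, 0.2, 0.10)
print(f"accelerated: {accelerated:.0f} h, use level: {use_level:.0f} h")
```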

  8. Supply chain strategy: empirical case study in Europe and Asia:

    OpenAIRE

    Sillanpää, Ilkka; Sillanpää, Sebastian

    2014-01-01

    The purpose of this case study research is to present a literature review of supply chain strategy approaches, develop supply chain strategy framework and to validate a framework in empirical case study. Literature review and case study research are the research methods for this research. This study presents the supply chain strategy framework which merges together business environment, corporate strategy, supply chain demand and supply chain strategy. Research argues that all the different c...

  9. Empirical modeling of oxygen uptake of flow

    African Journals Online (AJOL)

    eobe

    Keywords: stepped chute, skimming flow, aeration.

  10. Empirical classification of resources in a business model concept

    Directory of Open Access Journals (Sweden)

    Marko Seppänen

    2009-04-01

    Full Text Available The concept of the business model has been designed for aiding exploitation of the business potential of an innovation. This exploitation inevitably involves new activities in the organisational context and generates a need to select and arrange the resources of the firm in these new activities. A business model encompasses those resources that a firm has access to and aids a firm's effort to create a superior 'innovation capability'. Selecting and arranging resources to utilise innovations requires resource allocation decisions on multiple fronts and poses significant challenges for the management of innovations. Although current business model conceptualisations elucidate resources, explicit consideration of the composition and structure of resource compositions has remained ambiguous. As a result, current business model conceptualisations fail in their core purpose of assisting the decision-making that must consider resource allocation when exploiting business opportunities. This paper contributes to the existing discussion regarding the representation of resources as components in the business model concept. The categorised list of resources in business models is validated empirically, using two samples of managers in different positions in several industries. The results indicate that most of the theoretically derived resource items have their equivalents in the business language and concepts used by managers. Thus, the categorisation of the resource components enables further development of the business model concept as well as improves daily communication between managers and their subordinates. Future research could be targeted on linking these components of a business model with each other in order to obtain a model to assess the performance of different business model configurations. Furthermore, different applications for the developed resource configuration may be envisioned.

  11. Measuring mobile patient safety information system success: an empirical study.

    Science.gov (United States)

    Jen, Wen-Yuan; Chao, Chia-Cheng

    2008-10-01

    The Health Risk Reminders and Surveillance (HRRS) system was designed to deliver critical abnormal test results of severely ill patients from Laboratory, Radiology, and Pathology departments to physicians within 5 min using cell phone text messages. This paper explores the success of the HRRS system. This study employed an augmented version of the DeLone and McLean IS success model. Seven variables (system quality, information quality, system use, user satisfaction, mobile healthcare anxiety, impact on the individual and impact on the organization) were used to evaluate the success of the HRRS system. The interrelationships between the seven variables were hypothesized and the hypotheses were empirically tested. The results indicate that the information quality of the HRRS system is positively associated with both system use and user satisfaction. In addition, system use is positively associated with user satisfaction, which is also positively associated with mobile healthcare anxiety. Moreover, results indicate that impact on the individual is positively associated with both user satisfaction and mobile healthcare anxiety. Finally, impact on the organization is positively associated with impact on the individual. The results of the study provide an expanded understanding of the factors that contribute to mobile patient safety information system (IS) success. Implications of the relationship between system use and physician mobile healthcare anxiety are discussed.

  12. Modelling of proton exchange membrane fuel cell performance based on semi-empirical equations

    Energy Technology Data Exchange (ETDEWEB)

    Al-Baghdadi, Maher A.R. Sadiq [Babylon Univ., Dept. of Mechanical Engineering, Babylon (Iraq)

    2005-08-01

    Using semi-empirical equations for modeling a proton exchange membrane fuel cell is proposed for providing a tool for the design and analysis of total fuel cell systems. The focus of this study is to derive an empirical model including process variations to estimate the performance of a fuel cell without extensive calculations. The model takes into account not only the current density but also the process variations, such as the gas pressure, temperature, humidity, and utilization, to cover the operating processes, which are important factors in determining the real performance of a fuel cell. The modelling results compare well with known experimental results. The comparison shows good agreement between the modelling results and the experimental data. The model can be used to investigate the influence of process variables for design optimization of fuel cells, stacks, and complete fuel cell power systems. (Author)
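
    A common semi-empirical form for a PEM fuel cell polarization curve subtracts activation, ohmic, and mass-transport losses from an open-circuit value. The sketch below uses that generic textbook form with made-up coefficients (not the parameters fitted in this record) simply to show how such a model is evaluated across operating current densities.

```python
import numpy as np

def cell_voltage(i, E0=1.0, b=0.04, R=0.2, m=3e-5, n=5.0):
    """Semi-empirical polarization curve, i in A/cm^2 (illustrative coefficients).

    E0 : open-circuit voltage [V]
    b  : activation (Tafel-like) coefficient [V]
    R  : area-specific ohmic resistance [ohm cm^2]
    m,n: empirical mass-transport loss parameters
    """
    i = np.asarray(i, dtype=float)
    activation = b * np.log(i / 1e-4)        # relative to an assumed exchange current density
    ohmic = R * i
    concentration = m * np.exp(n * i)
    return E0 - activation - ohmic - concentration

i = np.linspace(0.01, 1.2, 25)
for current, v in zip(i[::6], cell_voltage(i)[::6]):
    print(f"i = {current:4.2f} A/cm^2 -> V = {v:5.3f} V")
```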

  13. Production functions for climate policy modeling. An empirical analysis

    International Nuclear Information System (INIS)

    Van der Werf, Edwin

    2008-01-01

    Quantitative models for climate policy modeling differ in the production structure used and in the sizes of the elasticities of substitution. The empirical foundation for both is generally lacking. This paper estimates the parameters of 2-level CES production functions with capital, labour and energy as inputs, and is the first to systematically compare all nesting structures. Using industry-level data from 12 OECD countries, we find that the nesting structure where capital and labour are combined first, fits the data best, but for most countries and industries we cannot reject that all three inputs can be put into one single nest. These two nesting structures are used by most climate models. However, while several climate policy models use a Cobb-Douglas function for (part of the) production function, we reject elasticities equal to one, in favour of considerably smaller values. Finally we find evidence for factor-specific technological change. With lower elasticities and with factor-specific technological change, some climate policy models may find a bigger effect of endogenous technological change on mitigating the costs of climate policy. (author)
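
    For readers unfamiliar with the nesting at issue, the sketch below evaluates a generic two-level CES production function in which capital and labour are combined in an inner nest before being combined with energy; the share and substitution parameters are arbitrary illustrative values, not the estimates reported in the paper.

```python
def ces(x1, x2, share, sigma):
    """Two-input CES aggregate with elasticity of substitution sigma (sigma != 1)."""
    rho = (sigma - 1.0) / sigma
    return (share * x1**rho + (1.0 - share) * x2**rho) ** (1.0 / rho)

def output_kl_e(K, L, E, share_kl=0.6, sigma_kl=0.8, share_outer=0.7, sigma_outer=0.5):
    """(KL)E nesting: capital and labour first, then the composite with energy."""
    kl = ces(K, L, share_kl, sigma_kl)
    return ces(kl, E, share_outer, sigma_outer)

# Illustrative comparison of two energy scenarios for fixed capital and labour.
print(output_kl_e(K=100.0, L=80.0, E=50.0))
print(output_kl_e(K=100.0, L=80.0, E=25.0))   # halving energy lowers output by less than half
```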

  14. Reverse logistics: an empirical study for operational framework

    International Nuclear Information System (INIS)

    Yusuf, I.

    2013-01-01

    This paper presents a framework of reverse logistics optimizing stakeholder gain, social gain, economic gain and environmental gain. It identifies the roadblocks that prevail in the recycling industry and describes various types of returns and wastes. The framework of reverse logistics is developed on the basis of what actually happens to the items shown in Tables 1-4 disposed of from the industries shown in Table 6. The rejected items require environmental disposal, passing through the different phases described in the flow of the operational framework. An operational framework of reverse logistics is developed by studying fifty organizations. In addition, three best practices of reverse logistics are proposed by consolidating experiential information and rich hands-on industrial experience in the supply chain and reverse logistics area. The research proposes the Social, Stakeholder, Economic and Environmental (SSEE) sustained gain model optimizing the benefits of stakeholders and highlights the variety of waste and its operational methodology in Pakistani industry. The proposed framework does not include hospital waste, radioactive waste, hazardous materials waste, municipal waste, agricultural waste and cold chain waste such as meat, milk, etc. The operational framework reflects the existing way of doing things that takes waste materials from the point of origin to the point of recycling. A better understanding of this framework may help researchers and front-line managers to develop better, more accurate models for effective and sustainable utilization of waste materials, benefiting organizations and society by simultaneously enhancing cost effectiveness and improving environmental awareness. The paper provides an operational framework of reverse logistics and the 2S2E sustained gain model. Specific applications are examined through empirical research. (author)

  15. EMPIRE-II statistical model code for nuclear reaction calculations

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M [International Atomic Energy Agency, Vienna (Austria)

    2001-12-15

    EMPIRE II is a nuclear reaction code, comprising various nuclear models, and designed for calculations in a broad range of energies and incident particles. A projectile can be any nucleon or Heavy Ion. The energy range starts just above the resonance region, in the case of a neutron projectile, and extends up to a few hundred MeV for Heavy Ion induced reactions. The code accounts for the major nuclear reaction mechanisms, such as the optical model (SCATB), Multistep Direct (ORION + TRISTAN), NVWY Multistep Compound, and the full-featured Hauser-Feshbach model. Heavy Ion fusion cross sections can be calculated within the simplified coupled channels approach (CCFUS). A comprehensive library of input parameters covers nuclear masses, optical model parameters, ground state deformations, discrete levels and decay schemes, level densities, fission barriers (BARFIT), moments of inertia (MOMFIT), and γ-ray strength functions. Effects of the dynamic deformation of a fast rotating nucleus can be taken into account in the calculations. The results can be converted into the ENDF-VI format using the accompanying code EMPEND. The package contains the full EXFOR library of experimental data. Relevant EXFOR entries are automatically retrieved during the calculations. Plots comparing experimental results with the calculated ones can be produced using the X4TOC4 and PLOTC4 codes linked to the rest of the system through bash-shell (UNIX) scripts. A graphic user interface written in Tcl/Tk is provided. (author)

  16. Energy taxation and the double dividend effect in Taiwan's energy conservation policy-an empirical study using a computable general equilibrium model

    International Nuclear Information System (INIS)

    Bor, Yunchang Jeffrey; Huang Yophy

    2010-01-01

    Faced with pressure from greenhouse gas reductions and energy price hikes, the Taiwan government is in the process of developing an energy tax regime to reflect environmental external costs and effectively curb energy consumption, as well as mitigate CO2 emissions through an adequate pricing system. This study utilizes a CGE model to simulate and analyze the economic impacts of the draft Energy Tax Bill and its complementary fiscal measures. Under the assumption of tax revenue neutrality, the use of energy tax revenue generated for the purpose of reducing income tax is the best choice with double dividend effects since it will effectively stimulate domestic consumption and investment, and, consequently, mitigate the negative impacts of the distortionary tax regime. The double dividend effect is less significant, however, when the supplementary measures being used are for government expenditure. Nevertheless, all supplementary measures have effectively reduced energy consumption, which means they have delivered at least the first dividend, in the sense of CO2 emissions control. It has been verified in this study that having adequate public-finance policy measures is the key to realizing the double dividend effect.

  17. A Longitudinal Study for the Empirical Validation of an Etiopathogenetic Model of Internet Addiction in Adolescence Based on Early Emotion Regulation

    Directory of Open Access Journals (Sweden)

    Silvia Cimino

    2018-01-01

    Full Text Available Several etiopathogenetic models have been conceptualized for the onset of Internet Addiction (IA). However, no study had evaluated the possible predictive effect of early emotion regulation strategies on the development of IA in adolescence. In a sample of N=142 adolescents with Internet Addiction, this twelve-year longitudinal study aimed at verifying whether and how emotion regulation strategies (self-focused versus other-focused) at two years of age were predictive of school-age children's internalizing/externalizing symptoms, which in turn fostered Internet Addiction (compulsive use of the Web versus distressed use) in adolescence. Our results confirmed our hypotheses, demonstrating that early emotion regulation has an impact on the emotional-behavioral functioning in middle childhood (8 years of age), which in turn has an influence on the onset of IA in adolescence. Moreover, our results showed a strong, direct statistical link between the characteristics of emotion regulation strategies in infancy and IA in adolescence. These results indicate that a common root of unbalanced emotion regulation could lead to two different manifestations of Internet Addiction in youths and could be useful in the assessment and treatment of adolescents with IA.

  18. Empirical molecular-dynamics study of diffusion in liquid semiconductors

    Science.gov (United States)

    Yu, W.; Wang, Z. Q.; Stroud, D.

    1996-11-01

    We report the results of an extensive molecular-dynamics study of diffusion in liquid Si and Ge (l-Si and l-Ge) and of impurities in l-Ge, using empirical Stillinger-Weber (SW) potentials with several choices of parameters. We use a numerical algorithm in which the three-body part of the SW potential is decomposed into products of two-body potentials, thereby permitting the study of large systems. One choice of SW parameters agrees very well with the observed l-Ge structure factors. The diffusion coefficients D(T) at melting are found to be approximately 6.4 × 10^-5 cm^2/s for l-Si, in good agreement with previous calculations, and about 4.2 × 10^-5 and 4.6 × 10^-5 cm^2/s for two models of l-Ge. In all cases, D(T) can be fitted to an activated temperature dependence, with activation energies E_d of about 0.42 eV for l-Si, and 0.32 or 0.26 eV for two models of l-Ge, as calculated from either the Einstein relation or from a Green-Kubo-type integration of the velocity autocorrelation function. D(T) for Si impurities in l-Ge is found to be very similar to the self-diffusion coefficient of l-Ge. We briefly discuss possible reasons why the SW potentials give D(T)'s substantially lower than ab initio predictions.

  19. A New Empirical Model for Radar Scattering from Bare Soil Surfaces

    Directory of Open Access Journals (Sweden)

    Nicolas Baghdadi

    2016-11-01

    Full Text Available The objective of this paper is to propose a new semi-empirical radar backscattering model for bare soil surfaces based on the Dubois model. A wide dataset of backscattering coefficients extracted from synthetic aperture radar (SAR) images and in situ soil surface parameter measurements (moisture content and roughness) is used. The retrieval of soil parameters from SAR images remains challenging because the available backscattering models have limited performance. Existing models, physical, semi-empirical, or empirical, do not allow for a reliable estimate of soil surface geophysical parameters for all surface conditions. The proposed model, developed in HH, HV, and VV polarizations, uses a formulation of radar signals based on physical principles that are validated in numerous studies. Never before has a backscattering model been built and validated on such an important dataset as the one proposed in this study. It contains a wide range of incidence angles (18°–57°) and radar wavelengths (L, C, X), well distributed geographically, for regions with different climate conditions (humid, semi-arid, and arid sites), and involving many SAR sensors. The results show that the new model performs very well for different radar wavelengths (L, C, X), incidence angles, and polarizations (RMSE of about 2 dB). This model is easy to invert and could provide a way to improve the retrieval of soil parameters.

  20. Merging expert and empirical data for rare event frequency estimation: Pool homogenisation for empirical Bayes models

    International Nuclear Information System (INIS)

    Quigley, John; Hardman, Gavin; Bedford, Tim; Walls, Lesley

    2011-01-01

    Empirical Bayes provides one approach to estimating the frequency of rare events as a weighted average of the frequencies of an event and a pool of events. The pool will draw upon, for example, events with similar precursors. The more homogeneous the pool, the more accurate the Empirical Bayes estimator will be. We propose and evaluate a new method using homogenisation factors under the assumption that events are generated from a Homogeneous Poisson Process. The homogenisation factors are scaling constants, which can be elicited through structured expert judgement and used to align the frequencies of different events, hence homogenising the pool. The estimation error relative to the homogeneity of the pool is examined theoretically, indicating that reduced error is associated with greater pool homogeneity. The effects of misspecified expert assessments of the homogenisation factors are examined theoretically and through simulation experiments. Our results show that the proposed Empirical Bayes method using homogenisation factors is robust under different degrees of misspecification.
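
    One simple reading of this idea can be sketched as a gamma-Poisson empirical Bayes estimate in which each pool member's exposure is rescaled by an elicited homogenisation factor before the pooled rate is formed; the data, the factors, and the method-of-moments prior below are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

# Hypothetical pool: observed event counts, exposure times (years), and elicited
# homogenisation factors that scale each member onto the target event's frequency scale.
counts = np.array([3, 1, 5, 0, 2])
exposures = np.array([40.0, 25.0, 60.0, 15.0, 30.0])
h_factors = np.array([1.0, 0.5, 2.0, 0.8, 1.2])   # expert-elicited scaling constants

# Homogenised rates: align every member with the target event before pooling.
rates = counts / (exposures * h_factors)

# Method-of-moments gamma prior fitted to the homogenised pool.
m, v = rates.mean(), rates.var(ddof=1)
beta = m / v            # gamma rate parameter
alpha = m * beta        # gamma shape parameter

# Empirical Bayes posterior mean for the target event (index 0):
# a weighted average of its own rate and the pooled prior mean.
i = 0
posterior_rate = (alpha + counts[i]) / (beta + exposures[i])
print(f"raw rate: {counts[i]/exposures[i]:.4f}, EB estimate: {posterior_rate:.4f} per year")
```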

  1. Localization in random bipartite graphs: Numerical and empirical study

    Science.gov (United States)

    Slanina, František

    2017-05-01

    We investigate adjacency matrices of bipartite graphs with a power-law degree distribution. Motivation for this study is twofold: first, vibrational states in granular matter and jammed sphere packings; second, graphs encoding social interaction, especially electronic commerce. We establish the position of the mobility edge and show that it strongly depends on the power in the degree distribution and on the ratio of the sizes of the two parts of the bipartite graph. At the jamming threshold, where the two parts have the same size, localization vanishes. We found that the multifractal spectrum is nontrivial in the delocalized phase, but still near the mobility edge. We also study an empirical bipartite graph, namely, the Amazon reviewer-item network. We found that in this specific graph the mobility edge disappears, and we draw a conclusion from this fact regarding earlier empirical studies of the Amazon network.

  2. Mission Operations Planning with Preferences: An Empirical Study

    Science.gov (United States)

    Bresina, John L.; Khatib, Lina; McGann, Conor

    2006-01-01

    This paper presents an empirical study of some non-exhaustive approaches to optimizing preferences within the context of constraint-based, mixed-initiative planning for mission operations. This work is motivated by the experience of deploying and operating the MAPGEN (Mixed-initiative Activity Plan GENerator) system for the Mars Exploration Rover Mission. Responsiveness to the user is one of the important requirements for MAPGEN; hence, the additional computation time needed to optimize preferences must be kept within reasonable bounds. This was the primary motivation for studying non-exhaustive optimization approaches. The specific goals of the empirical study are to assess the impact on solution quality of two greedy heuristics used in MAPGEN and to assess the improvement gained by applying a linear programming optimization technique to the final solution.

  3. Hybrid empirical--theoretical approach to modeling uranium adsorption

    International Nuclear Information System (INIS)

    Hull, Larry C.; Grossman, Christopher; Fjeld, Robert A.; Coates, John T.; Elzerman, Alan W.

    2004-01-01

    An estimated 330 metric tons of U are buried in the radioactive waste Subsurface Disposal Area (SDA) at the Idaho National Engineering and Environmental Laboratory (INEEL). An assessment of U transport parameters is being performed to decrease the uncertainty in risk and dose predictions derived from computer simulations of U fate and transport to the underlying Snake River Plain Aquifer. Uranium adsorption isotherms were measured for 14 sediment samples collected from sedimentary interbeds underlying the SDA. The adsorption data were fit with a Freundlich isotherm. The Freundlich n parameter is statistically identical for all 14 sediment samples and the Freundlich K_f parameter is correlated to sediment surface area (r^2 = 0.80). These findings suggest an efficient approach to material characterization and implementation of a spatially variable reactive transport model that requires only the measurement of sediment surface area. To expand the potential applicability of the measured isotherms, a model is derived from the empirical observations by incorporating concepts from surface complexation theory to account for the effects of solution chemistry. The resulting model is then used to predict the range of adsorption conditions to be expected in the vadose zone at the SDA based on the range in measured pore water chemistry. Adsorption in the deep vadose zone is predicted to be stronger than in near-surface sediments because the total dissolved carbonate decreases with depth.
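
    The Freundlich fit mentioned here is typically obtained by linear regression in log-log space; the sketch below does that for made-up batch adsorption data (the equilibrium concentrations C and sorbed amounts q are illustrative, not the INEEL measurements).

```python
import numpy as np

# Hypothetical batch-isotherm data: equilibrium solution concentration C (ug/L)
# and sorbed concentration q (ug/kg).
C = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 500.0])
q = np.array([12.0, 45.0, 78.0, 260.0, 430.0, 1500.0])

# Freundlich isotherm q = K_f * C**n  =>  log10(q) = log10(K_f) + n * log10(C)
n, logKf = np.polyfit(np.log10(C), np.log10(q), 1)
Kf = 10.0 ** logKf
print(f"K_f = {Kf:.1f}, n = {n:.2f}")

# Predicted sorbed concentration at an arbitrary solution concentration.
C_new = 25.0
print(f"q({C_new}) = {Kf * C_new**n:.1f}")
```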

  4. Empirical Modeling of ICMEs Using ACE/SWICS Ionic Distributions

    Science.gov (United States)

    Rivera, Y.; Landi, E.; Lepri, S. T.; Gilbert, J. A.

    2017-12-01

    Coronal Mass Ejections (CMEs) are some of the largest, most energetic events in the solar system, releasing an immense amount of plasma and magnetic field into the Heliosphere. The Earth-bound plasma plays a large role in space weather, causing geomagnetic storms that can damage space- and ground-based instrumentation. As a CME is released, the plasma experiences heating, expansion and acceleration; however, the physical mechanism supplying the heating as it lifts out of the corona still remains uncertain. From previous work we know the ionic composition of solar ejecta undergoes a gradual transition to a state where ionization and recombination processes become ineffective, rendering the ionic composition static along its trajectory. This property makes them a good indicator of thermal conditions in the corona, where the CME plasma likely receives most of its heating. We model this so-called 'freeze-in' process in Earth-directed CMEs using an ionization code to empirically determine the electron temperature, density and bulk velocity. 'Frozen-in' ions from an ensemble of independently modeled plasmas within the CME are added together to fit the full range of observational ionic abundances collected by ACE/SWICS during ICME events. The models derived using this method are used to estimate the CME energy budget to determine a heating rate used to compare with a variety of heating mechanisms that can sustain the required heating with a compatible timescale.

  5. Development and empirical exploration of an extended model of intragroup conflict

    OpenAIRE

    Hjertø, Kjell B.; Kuvaas, Bård

    2009-01-01

    This is the post-print of the article published in the International Journal of Conflict Management. Purpose - The purpose of this study was to develop and empirically explore a model of four intragroup conflict types (the 4IC model), consisting of an emotional person, a cognitive task, an emotional task, and a cognitive person conflict. The first two conflict types are similar to existing conceptualizations, whereas the latter two represent new dimensions of group conflict. Design/m...

  6. Decoupling Analysis of China’s Product Sector Output and Its Embodied Carbon Emissions—An Empirical Study Based on Non-Competitive I-O and Tapio Decoupling Model

    Directory of Open Access Journals (Sweden)

    Jianbo Hu

    2017-05-01

    Full Text Available This paper uses the non-competitive I-O model and the Tapio decoupling model to comprehensively analyze the decoupling relationship between the output of the product sector in China and its embodied carbon emissions under trade openness. For this purpose, Chinese input-output data for 2002, 2005, 2007, 2010, and 2012 are used. This approach helps identify the direct mechanism behind China's increased carbon emissions from a micro perspective and provides a new perspective for subsequent studies of the low-carbon economy. The obtained empirical results are as follows: (1) From an overall perspective, the decoupling elasticity between the output of the product sector and its embodied carbon emissions decreased. Output and embodied carbon emissions showed a growth link from 2002 to 2005 and a weak decoupling relationship for the rest of the study period. (2) Among the 28 industries in the product sector, the increased growth rate of output in more and more product sectors was no longer accompanied by large CO2 emissions. The number of industries with strong decoupling relationships between output and embodied carbon emissions increased. (3) From the perspective of the three industries, the output and embodied carbon emissions in the second and third industries exhibited a growth link only from 2002 to 2005; the three industries presented weak or strong decoupling for the rest of the study period. Based on the empirical analysis, the paper suggests reducing the carbon emissions of China's product sector mainly through the construction of ecologically sound low-carbon agriculture, a low-carbon circular industrial system, and an intensive and efficient service industry.
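
    The Tapio decoupling indicator itself is just the ratio of the percentage change in emissions to the percentage change in output. The sketch below computes it for hypothetical figures and applies the commonly cited 0.8/1.2 elasticity thresholds; the numbers and the simplified classification are illustrative assumptions, not values from the paper.

```python
def tapio_elasticity(co2_0, co2_1, out_0, out_1):
    """Decoupling elasticity: % change in emissions over % change in output."""
    d_co2 = (co2_1 - co2_0) / co2_0
    d_out = (out_1 - out_0) / out_0
    return d_co2 / d_out, d_co2, d_out

def classify(e, d_co2, d_out):
    """Simplified Tapio classes for a growing economy (d_out > 0)."""
    if d_out <= 0:
        return "recessive case (not classified here)"
    if d_co2 < 0:
        return "strong decoupling"
    if e < 0.8:
        return "weak decoupling"
    if e <= 1.2:
        return "expansive coupling"
    return "expansive negative decoupling"

# Hypothetical sector: output grows 40%, embodied emissions grow 10%.
e, d_co2, d_out = tapio_elasticity(co2_0=100.0, co2_1=110.0, out_0=500.0, out_1=700.0)
print(f"elasticity = {e:.2f} -> {classify(e, d_co2, d_out)}")
```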

  7. Can the consumption-free nonexpected utility model solve the risk premium puzzle? An empirical study of the Japanese stock market

    OpenAIRE

    Kang, Myong-Il

    2010-01-01

    This paper investigates whether the consumption-free two-beta intertemporal capital asset-pricing model developed by Campbell and Vuolteenaho (2004) is able to solve the risk premium puzzle in the Japanese stock market over the period 1984-2002. Using the cash flow and discount rate betas as risk factors, the model is able to explain about half of the market returns by selection of suitable vector autoregression variables. On this basis, the model proposed solves the risk premium puzzle in Ja...

  8. Structural properties of silicon clusters: An empirical potential study

    International Nuclear Information System (INIS)

    Gong, X.G.; Zheng, Q.Q.; He Yizhen

    1993-09-01

    By using our newly proposed empirical interatomic potential for silicon, the structure and some dynamical properties of silicon clusters Si_n (10 ≤ n ≤ 24) have been studied. It is found that the obtained results are close to those from ab-initio methods. From the present results, we can gain new insight into the understanding of the experimental data on Si_n clusters. (author). 20 refs, 6 figs

  9. Empirical study on how social media promotes product innovation

    OpenAIRE

    Idota, Hiroki; Bunno, Teruyuki; Tsuji, Masatsugu

    2014-01-01

    Social media such as SNS, Twitter, and blogs have been spreading all over the world, and a large number of firms recognize social media as new communication tools for obtaining information on consumer needs and markets, developing new goods and services, and promoting marketing. Despite its increasing use in practice, academic research on whether or how social media contributes to promoting product innovation is still insufficient. This study thus attempts to analyze empirically how so...

  10. Construction of an Early Risk Warning Model of Organizational Resilience: An Empirical Study Based on Samples of R&D Teams

    Directory of Open Access Journals (Sweden)

    Si-hua Chen

    2016-01-01

    Full Text Available Facing fierce competition, it is critical for organizations to keep their advantages, whether actively or passively. Organizational resilience is the ability of an organization to anticipate, prepare for, respond to, and adapt to incremental change and sudden disruptions in order to survive and prosper. It is of particular importance for enterprises to apprehend the intensity of their organizational resilience and thereby judge their ability to withstand pressure. By conducting an exploratory factor analysis and a confirmatory factor analysis, this paper clarifies a five-factor model for the organizational resilience of R&D teams. On this basis, it applies a fuzzy integrated evaluation method to build an early risk warning model for the organizational resilience of R&D teams. The application of the model to a company shows that the model can adequately evaluate the intensity of organizational resilience of R&D teams. The results are also expected to contribute to applied early risk warning theory.

  11. Understanding Functional Reuse of ERP Requirements in the Telecommunication Sector: an Empirical Study

    NARCIS (Netherlands)

    Daneva, Maia

    2014-01-01

    This paper is an empirical study on the application of Function Points (FP) and a FP-based reuse measurement model in Enterprise Resource Planning (ERP) projects in three organizations in the telecommunication sector. The findings of the study are used to compare the requirements reuse for one

  12. Empirical modelling to predict the refractive index of human blood

    Science.gov (United States)

    Yahya, M.; Saghir, M. Z.

    2016-02-01

    Optical techniques used for the measurement of the optical properties of blood are of great interest in clinical diagnostics. Blood analysis is a routine procedure used in medical diagnostics to confirm a patient’s condition. Measuring the optical properties of blood is difficult due to the non-homogenous nature of the blood itself. In addition, there is a lot of variation in the refractive indices reported in the literature. These are the reasons that motivated the researchers to develop a mathematical model that can be used to predict the refractive index of human blood as a function of concentration, temperature and wavelength. The experimental measurements were conducted on mimicking phantom hemoglobin samples using the Abbemat Refractometer. The results analysis revealed a linear relationship between the refractive index and concentration as well as temperature, and a non-linear relationship between refractive index and wavelength. These results are in agreement with those found in the literature. In addition, a new formula was developed based on empirical modelling which suggests that temperature and wavelength coefficients be added to the Barer formula. The verification of this correlation confirmed its ability to determine refractive index and/or blood hematocrit values with appropriate clinical accuracy.
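
    An empirical model of this kind is often a multilinear fit of refractive index against concentration, temperature, and a wavelength term. The sketch below performs such a least-squares fit on synthetic data; the functional form and every coefficient are assumptions for illustration, not the correlation developed in the record.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic measurements: hemoglobin concentration (g/dL), temperature (deg C),
# wavelength (nm), and measured refractive index.
n_obs = 200
conc = rng.uniform(5.0, 20.0, n_obs)
temp = rng.uniform(20.0, 40.0, n_obs)
wave = rng.uniform(450.0, 700.0, n_obs)
n_meas = (1.333 + 0.0019 * conc - 1.0e-4 * (temp - 25.0)
          + 3.0e3 / wave**2 + rng.normal(0.0, 2e-4, n_obs))

# Design matrix: intercept, concentration, temperature offset, 1/wavelength^2 term.
X = np.column_stack([np.ones(n_obs), conc, temp - 25.0, 1.0 / wave**2])
coef, *_ = np.linalg.lstsq(X, n_meas, rcond=None)
print("fitted coefficients:", np.round(coef, 6))

# Predict the refractive index at a new operating point.
new_point = np.array([1.0, 12.0, 37.0 - 25.0, 1.0 / 589.0**2])
print("predicted n:", float(new_point @ coef))
```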

  14. Empirical Modeling on Hot Air Drying of Fresh and Pre-treated Pineapples

    Directory of Open Access Journals (Sweden)

    Tanongkankit Yardfon

    2016-01-01

    Full Text Available This research aimed to study the drying kinetics and determine empirical models of fresh pineapple and of pineapple pre-treated with sucrose solution at different concentrations during drying. Samples 3 mm thick were immersed in 30, 40 and 50 °Brix sucrose solutions before hot air drying at temperatures of 60, 70 and 80°C. The empirical models to predict the drying kinetics were investigated. The results showed that the moisture content decreased with increasing drying temperature and time. An increase in sucrose concentration led to a longer drying time. According to the statistical criteria of the highest coefficient of determination (R2), the lowest chi-square (χ2) and the lowest root mean square error (RMSE), the Logarithmic model was the best model for describing the drying behavior of samples soaked in 30, 40 and 50 °Brix sucrose solution.
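
    The Logarithmic thin-layer model referred to here has the standard form MR = a·exp(-k·t) + c. The sketch below fits it to made-up moisture ratio data with scipy's curve_fit; the drying times and moisture ratios are illustrative, not the measured pineapple data.

```python
import numpy as np
from scipy.optimize import curve_fit

def logarithmic_model(t, a, k, c):
    """Thin-layer Logarithmic drying model: moisture ratio as a function of time (min)."""
    return a * np.exp(-k * t) + c

# Hypothetical drying curve: time (min) and dimensionless moisture ratio MR.
t = np.array([0, 30, 60, 90, 120, 180, 240, 300], dtype=float)
mr = np.array([1.00, 0.72, 0.53, 0.40, 0.31, 0.19, 0.13, 0.10])

params, _ = curve_fit(logarithmic_model, t, mr, p0=(1.0, 0.01, 0.0))
a, k, c = params
residuals = mr - logarithmic_model(t, *params)
rmse = np.sqrt(np.mean(residuals**2))
print(f"a = {a:.3f}, k = {k:.4f} 1/min, c = {c:.3f}, RMSE = {rmse:.4f}")
```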

  15. Flexible Modeling of Epidemics with an Empirical Bayes Framework

    Science.gov (United States)

    Brooks, Logan C.; Farrow, David C.; Hyun, Sangwon; Tibshirani, Ryan J.; Rosenfeld, Roni

    2015-01-01

    Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic’s behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the “Predict the Influenza Season Challenge”, with the task of predicting key epidemiological measures for the 2013–2014 U.S. influenza season with the help of digital surveillance data. We developed a framework for in-season forecasts of epidemics using a semiparametric Empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctors visits for influenza-like illness, and the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of the seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to some other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013–2014 U.S. influenza season, and compare the framework’s cross-validated prediction error on historical data to

  16. An anthology of theories and models of design philosophy, approaches and empirical explorations

    CERN Document Server

    Blessing, Lucienne

    2014-01-01

    While investigations into both theories and models have remained a major strand of engineering design research, the current literature sorely lacks a reference book that provides a comprehensive and up-to-date anthology of theories and models, and their philosophical and empirical underpinnings; An Anthology of Theories and Models of Design fills this gap. The text collects the expert views of an international authorship, covering: significant theories in engineering design, including CK theory, domain theory, and the theory of technical systems; current models of design, from a function behavior structure model to an integrated model; important empirical research findings from studies into design; and philosophical underpinnings of design itself. For educators and researchers in engineering design, An Anthology of Theories and Models of Design gives access to in-depth coverage of theoretical and empirical developments in this area; for pr...

  17. Using ERP and WfM Systems for Implementing Business Processes: An Empirical Study

    Science.gov (United States)

    Aversano, Lerina; Tortorella, Maria

    The software systems mainly considered by enterprises for business process automation belong to the following two categories: Workflow Management Systems (WfMS) and Enterprise Resource Planning (ERP) systems. The wider diffusion of ERP systems tends to favour this solution, but most ERP systems have several limitations for automating business processes. This paper reports an empirical study aimed at comparing the ability of ERP systems and WfMSs to implement business processes. Two different case studies have been considered in the empirical study. It evaluates and analyses the correctness and completeness of the process models implemented by using ERP and WfM systems.

  18. Software Reuse Success Strategy Model: An Empirical Study of Factors Involved in the Success of Software Reuse in Information System Development

    Science.gov (United States)

    Tran, Kiet T.

    2012-01-01

    This study examined the relationship between information technology (IT) governance and software reuse success. Software reuse has been mostly an IT problem but rarely a business one. Studies in software reuse are abundant; however, to date, none has a deep appreciation of IT governance. This study demonstrated that IT governance had a positive…

  19. An Empirical Study on Stochastic Mortality Modelling under the Age-Period-Cohort Framework: The Case of Greece with Applications to Insurance Pricing

    Directory of Open Access Journals (Sweden)

    Apostolos Bozikas

    2018-04-01

    Full Text Available During the last decades, life expectancy has risen significantly in the most developed countries all over the world. Greece is a case in point; consequently, higher governmental financial responsibilities arise, and serious concerns are raised owing to population ageing. To address this issue, an efficient forecasting method is required. Therefore, the most important stochastic models were comparatively applied to Greek data for the first time. An analysis of their fitting behaviour by gender was conducted and the corresponding forecasting results were evaluated. In particular, we incorporated the Greek population data into seven stochastic mortality models under a common age-period-cohort framework. The fitting performance of each model was thoroughly evaluated based on information criteria values as well as the likelihood ratio test, and their robustness to period changes was investigated. In addition, parameter risk in forecasts was assessed by employing bootstrapping techniques. For completeness, projection results for both genders were also illustrated in pricing insurance-related products.

  20. Empirical Models of Social Learning in a Large, Evolving Network.

    Directory of Open Access Journals (Sweden)

    Ayşe Başar Bener

    Full Text Available This paper advances theories of social learning through an empirical examination of how social networks change over time. Social networks are important for learning because they constrain individuals' access to information about the behaviors and cognitions of other people. Using data on a large social network of mobile device users over a one-month time period, we test three hypotheses: (1) attraction homophily causes individuals to form ties on the basis of attribute similarity, (2) aversion homophily causes individuals to delete existing ties on the basis of attribute dissimilarity, and (3) social influence causes individuals to adopt the attributes of others they share direct ties with. Statistical models offer varied degrees of support for all three hypotheses and show that these mechanisms are more complex than assumed in prior work. Although homophily is normally thought of as a process of attraction, people also avoid relationships with others who are different. These mechanisms have distinct effects on network structure. While social influence does help explain behavior, people tend to follow global trends more than they follow their friends.

  1. U-tube steam generator empirical model development and validation using neural networks

    International Nuclear Information System (INIS)

    Parlos, A.G.; Chong, K.T.; Atiya, A.

    1992-01-01

    Empirical modeling techniques that use model structures motivated from neural networks research have proven effective in identifying complex process dynamics. A recurrent multilayer perceptron (RMLP) network was developed as a nonlinear state-space model structure along with a static learning algorithm for estimating the parameters associated with it. The methods developed were demonstrated by identifying two submodels of a U-tube steam generator (UTSG), each valid around an operating power level. A significant drawback of this approach is the long off-line training times required for the development of even a simplified model of a UTSG. Subsequently, a dynamic gradient descent-based learning algorithm was developed as an accelerated alternative to train an RMLP network for use in empirical modeling of power plants. The two main advantages of this learning algorithm are its ability to consider past error gradient information for future use and the two forward passes associated with its implementation. The enhanced learning capabilities provided by the dynamic gradient descent-based learning algorithm were demonstrated via the case study of a simple steam boiler power plant. In this paper, the dynamic gradient descent-based learning algorithm is used for the development and validation of a complete UTSG empirical model
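    As a rough illustration of the model structure only (not of the dynamic gradient descent training discussed here), the sketch below simulates a small recurrent multilayer perceptron as a nonlinear state-space model; the layer sizes, input names, and weights are untrained placeholders.

```python
import numpy as np

def rmlp_forward(inputs, Wx, Wh, Wo, bh, bo):
    """Forward simulation of a simple recurrent multilayer perceptron (RMLP).

    The hidden layer acts as a learned state vector that is fed back at the
    next time step, giving the network a nonlinear state-space structure.
    inputs : (T, n_in) sequence of plant inputs (e.g. feedwater flow, power).
    Returns the (T, n_out) sequence of predicted outputs (e.g. SG level).
    """
    T = inputs.shape[0]
    h = np.zeros(Wh.shape[0])
    outputs = np.zeros((T, Wo.shape[0]))
    for t in range(T):
        h = np.tanh(Wx @ inputs[t] + Wh @ h + bh)   # recurrent hidden state
        outputs[t] = Wo @ h + bo                    # linear output layer
    return outputs

# Hypothetical dimensions and random (untrained) weights, for illustration only.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out, T = 3, 8, 1, 50
Wx = rng.normal(0, 0.3, (n_hidden, n_in))
Wh = rng.normal(0, 0.3, (n_hidden, n_hidden))
Wo = rng.normal(0, 0.3, (n_out, n_hidden))
bh, bo = np.zeros(n_hidden), np.zeros(n_out)

u = rng.normal(0, 1, (T, n_in))                     # hypothetical input sequence
y_hat = rmlp_forward(u, Wx, Wh, Wo, bh, bo)
print("predicted output shape:", y_hat.shape)
```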

  2. Empirically modelled Pc3 activity based on solar wind parameters

    Directory of Open Access Journals (Sweden)

    B. Heilig

    2010-09-01

    Full Text Available It is known that under certain solar wind (SW)/interplanetary magnetic field (IMF) conditions (e.g. high SW speed, low cone angle), the occurrence of ground-level Pc3–4 pulsations is more likely. In this paper we demonstrate that in the event of anomalously low SW particle density, Pc3 activity is extremely low regardless of otherwise favourable SW speed and cone angle. We re-investigate the SW control of Pc3 pulsation activity through a statistical analysis and two empirical models with emphasis on the influence of SW density on Pc3 activity. We utilise SW and IMF measurements from the OMNI project and ground-based magnetometer measurements from the MM100 array to relate SW and IMF measurements to the occurrence of Pc3 activity. Multiple linear regression and artificial neural network models are used in iterative processes in order to identify sets of SW-based input parameters, which optimally reproduce a set of Pc3 activity data. The inclusion of SW density in the parameter set significantly improves the models. Not only the density itself, but other density related parameters, such as the dynamic pressure of the SW, or the standoff distance of the magnetopause work equally well in the model. The disappearance of Pc3s during low-density events can have at least four reasons according to the existing upstream wave theory: 1. Pausing the ion-cyclotron resonance that generates the upstream ultra low frequency waves in the absence of protons, 2. Weakening of the bow shock that implies less efficient reflection, 3. The SW becomes sub-Alfvénic and hence it is not able to sweep back the waves propagating upstream with the Alfvén-speed, and 4. The increase of the standoff distance of the magnetopause (and of the bow shock). Although the models cannot account for the lack of Pc3s during intervals when the SW density is extremely low, the resulting sets of optimal model inputs support the generation of mid latitude Pc3 activity predominantly through
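    A minimal sketch of the multiple-linear-regression part of such a model is shown below; the solar wind inputs and the Pc3 activity index are synthetic placeholders, and the linear form with an intercept is an assumption for illustration only.

```python
import numpy as np

def fit_pc3_regression(speed, density, cone_angle, pc3_index):
    """Least-squares fit of a Pc3 activity index on solar wind parameters.

    All inputs are 1-D arrays of hourly values (hypothetical placeholders for
    OMNI solar wind data and a ground-based Pc3 activity index).
    """
    X = np.column_stack([np.ones_like(speed), speed, density, cone_angle])
    coef, *_ = np.linalg.lstsq(X, pc3_index, rcond=None)
    return coef

def predict_pc3(coef, speed, density, cone_angle):
    X = np.column_stack([np.ones_like(speed), speed, density, cone_angle])
    return X @ coef

# Hypothetical example data.
rng = np.random.default_rng(1)
speed = rng.uniform(300, 700, 1000)        # km/s
density = rng.uniform(0.5, 20, 1000)       # cm^-3
cone = rng.uniform(0, 90, 1000)            # degrees
pc3 = 0.01 * speed + 0.05 * density - 0.02 * cone + rng.normal(0, 0.5, 1000)

coef = fit_pc3_regression(speed, density, cone, pc3)
print("fitted coefficients:", np.round(coef, 3))
```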

  3. Empirical study of 99Tcm-HYNIC-A(D) A(D) APRPG in rabbit model of inflammation and VX2 tumor xenografted

    International Nuclear Information System (INIS)

    Liu Ciyi; Song Shaoli; Xie Wenhui; Cai Xiaojia; Zhang Lihua; Huang Gang

    2011-01-01

    Objective: To investigate the uptake of 99Tcm-hydrazinonicotinamide-D-alanine-D-alanine-alanine-proline-arginine-proline-glycine (HYNIC-A(D)A(D)APRPG) in rabbit models of inflammation and VX2 tumor xenografted, so as to evaluate its use as a new tracer for tumor angiogenesis. Methods: Ten rabbit models of xenoplanted VX2 tumor and inflammation were randomly divided into two groups which were injected with different tracers, 99Tcm-HYNIC-A(D)A(D)APRPG and 99Tcm-RGD, followed by serial Gamma images at various time points. The first group underwent 18F-FDG PET ahead of 99Tcm-HYNIC-A(D)A(D)APRPG SPECT. Analysis of variance and t-test were performed with SPSS 10.0. Results: the 99Tcm-HYNIC-A(D)A(D)APRPG scan showed negative uptake at the inflammation focus but positive uptake at the tumor. Pathological examination confirmed high 99Tcm-HYNIC-A(D)A(D)APRPG accumulation in tumor cells, with the highest tumor/inflammation ratio (3.25±0.171) at 2 h post-injection, which was significantly higher than that of 99Tcm-RGD (2.37±0.076) (F = 15.63, P 99Tcm-HYNIC-A(D)A(D)APRPG, 99Tcm-RGD, 18F-FDG were significantly different at 0.5, 1, 2, 3, 6 h (F=13.83∼26.41; t =23.84, 12.75; all P 99Tcm-HYNIC-A(D)A(D)APRPG can be used as a potential tracer for tumor angiogenesis. (authors)

  4. The effects of performance measurement and compensation on motivation: An empirical study

    NARCIS (Netherlands)

    van Herpen, M.; van Praag, C.M.; Cools, K.

    2003-01-01

    The design and implementation of a performance measurement and compensation system can strongly affect the motivation of employees. Building on economic and psychological theory this study develops a conceptual model that is used to empirically test this effect. Our survey results demonstrate a

  5. An Empirical Study of Relationships between Student Self-Concept and Science Achievement in Hong Kong

    Science.gov (United States)

    Wang, Jianjun; Oliver, Steve; Garcia, Augustine

    2004-01-01

    Positive self-concept and good understanding of science are important indicators of scientific literacy endorsed by professional organizations. The existing research literature suggests that these two indicators are reciprocally related and mutually reinforcing. Generalization of the reciprocal model demands empirical studies in different…

  6. Determinants of user acceptance of internet banking: An empirical study

    Directory of Open Access Journals (Sweden)

    Hessam Zandhessami

    2014-07-01

    Full Text Available The boom of Internet usage and the significant funding dynamism in electronic banking have attracted the attention of researchers towards Internet banking. In the past, the traditional focus of Internet banking research has been on technological development, but it is now switching to user-focused research. This paper presents an empirical investigation to determine the determinants of user acceptance of internet banking. The proposed study uses the Decision-Making Trial and Evaluation Laboratory (DEMATEL) technique to measure the relationships between different factors in a case study of an Iranian firm. The results indicate that trust is the most important factor for the development of internet banking.
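    For readers unfamiliar with DEMATEL, the sketch below shows the standard computation (normalized direct-influence matrix, total-relation matrix T = D(I − D)⁻¹, and prominence/relation indices). The 4×4 influence matrix and the factor names in the comment are invented for illustration and are not taken from the study.

```python
import numpy as np

def dematel(direct):
    """Standard DEMATEL computation on a direct-influence matrix.

    direct : square matrix A where A[i, j] is the rated influence of
             factor i on factor j (e.g. 0-4 expert scores).
    Returns the total-relation matrix T plus the prominence (R + C)
    and relation (R - C) indices for each factor.
    """
    A = np.asarray(direct, dtype=float)
    # Normalize so that every row/column sum is at most 1.
    s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
    D = A / s
    # Total-relation matrix: T = D (I - D)^-1 (sum of the series D + D^2 + ...).
    T = D @ np.linalg.inv(np.eye(len(D)) - D)
    R = T.sum(axis=1)   # total influence given by each factor
    C = T.sum(axis=0)   # total influence received by each factor
    return T, R + C, R - C

# Hypothetical expert ratings for four factors (e.g. trust, ease of use, security, cost).
A = [[0, 3, 2, 1],
     [1, 0, 3, 2],
     [2, 1, 0, 3],
     [1, 2, 1, 0]]

T, prominence, relation = dematel(A)
print("prominence (R+C):", np.round(prominence, 2))
print("relation   (R-C):", np.round(relation, 2))
```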

  7. Empirical Modeling of Lithium-ion Batteries Based on Electrochemical Impedance Spectroscopy Tests

    International Nuclear Information System (INIS)

    Samadani, Ehsan; Farhad, Siamak; Scott, William; Mastali, Mehrdad; Gimenez, Leonardo E.; Fowler, Michael; Fraser, Roydon A.

    2015-01-01

    Highlights: • Two commercial Lithium-ion batteries are studied through HPPC and EIS tests. • An equivalent circuit model is developed for a range of operating conditions. • This model improves the current battery empirical models for vehicle applications. • This model is proved to be efficient in terms of predicting HPPC test resistances. - ABSTRACT: An empirical model for commercial lithium-ion batteries is developed based on electrochemical impedance spectroscopy (EIS) tests. An equivalent circuit is established according to EIS test observations at various battery states of charge and temperatures. A Laplace transfer time based model is developed based on the circuit which can predict the battery operating output potential difference in battery electric and plug-in hybrid vehicles at various operating conditions. This model demonstrates up to 6% improvement compared to simple resistance and Thevenin models and is suitable for modeling and on-board controller purposes. Results also show that this model can be used to predict the battery internal resistance obtained from hybrid pulse power characterization (HPPC) tests to within 20 percent, making it suitable for low to medium fidelity powertrain design purposes. In total, this simple battery model can be employed as a real-time model in electrified vehicle battery management systems.
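    A minimal sketch of a first-order Thevenin (single RC branch) equivalent-circuit model, the kind of simple model the abstract compares against, is given below; the parameter values, the constant open-circuit voltage, and the pulse profile are illustrative assumptions rather than the paper's fitted values.

```python
import numpy as np

def simulate_thevenin(current, dt, ocv, r0, r1, c1, v_rc0=0.0):
    """Terminal voltage of a first-order Thevenin equivalent circuit.

    current : array of applied current in A (positive = discharge)
    dt      : time step in s
    ocv     : open-circuit voltage in V (assumed constant here for brevity)
    r0      : ohmic resistance; r1, c1 : polarization resistance/capacitance
    """
    v_rc = v_rc0
    tau = r1 * c1
    voltage = np.empty_like(current, dtype=float)
    for k, i_k in enumerate(current):
        # The RC branch relaxes toward i*r1 with time constant tau.
        v_rc = v_rc * np.exp(-dt / tau) + r1 * (1 - np.exp(-dt / tau)) * i_k
        voltage[k] = ocv - r0 * i_k - v_rc
    return voltage

# Illustrative 10 s discharge pulse followed by rest (HPPC-like profile).
dt = 0.1
current = np.concatenate([np.full(100, 20.0), np.zeros(200)])   # 20 A pulse, then rest
v = simulate_thevenin(current, dt, ocv=3.7, r0=0.002, r1=0.003, c1=2000.0)
print(f"voltage sag at end of pulse: {3.7 - v[99]:.3f} V")
```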

  8. Empirical studies of design software: Implications for software engineering environments

    Science.gov (United States)

    Krasner, Herb

    1988-01-01

    The empirical studies team of MCC's Design Process Group conducted three studies in 1986-87 in order to gather data on professionals designing software systems in a range of situations. The first study (the Lift Experiment) used thinking aloud protocols in a controlled laboratory setting to study the cognitive processes of individual designers. The second study (the Object Server Project) involved the observation, videotaping, and data collection of a design team of a medium-sized development project over several months in order to study team dynamics. The third study (the Field Study) involved interviews with the personnel from 19 large development projects in the MCC shareholders in order to study how the process of design is affected by organizational and project behavior. The focus of this report will be on key observations of the design process (at several levels) and their implications for the design of environments.

  9. Empirical Analysis of Farm Credit Risk under the Structure Model

    Science.gov (United States)

    Yan, Yan

    2009-01-01

    The study measures farm credit risk by using farm records collected by Farm Business Farm Management (FBFM) during the period 1995-2004. The study addresses the following questions: (1) whether a farm's financial position is fully described by the structure model, (2) what the determinants of farm capital structure are under the structure model, (3)…

  10. Knowledge discovery from patients' behavior via clustering-classification algorithms based on weighted eRFM and CLV model: An empirical study in public health care services.

    Science.gov (United States)

    Zare Hosseini, Zeinab; Mohammadzadeh, Mahdi

    2016-01-01

    The rapid growth of information technology (IT) motivates and creates competitive advantages in the health care industry. Nowadays, many hospitals try to build successful customer relationship management (CRM) to recognize target and potential patients, increase patient loyalty and satisfaction, and finally maximize their profitability. Many hospitals have large data warehouses containing customer demographic and transaction information. Data mining techniques can be used to analyze these data and discover hidden knowledge about customers. This research develops an extended RFM model, namely RFML (added parameter: Length), based on health care services for a public sector hospital in Iran, with the idea that there is a contrast between patient and customer loyalty, to estimate customer lifetime value (CLV) for each patient. We used Two-step and K-means algorithms as clustering methods and a Decision tree (CHAID) as the classification technique to segment the patients and find out target, potential and loyal customers in order to implement and strengthen CRM. Two approaches are used for classification: first, the result of clustering is considered as the Decision attribute in the classification process, and second, the result of segmentation based on the CLV value of patients (estimated by RFML) is considered as the Decision attribute. Finally, the results of the CHAID algorithm show the significant hidden rules and identify existing patterns of hospital consumers.
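    The clustering step can be sketched as follows; the RFML feature construction, the weights used as a CLV proxy, and the patient records are hypothetical stand-ins for the hospital data described in the abstract.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical patient records: recency (days), frequency (visits),
# monetary (total billing), length (days between first and last visit).
rng = np.random.default_rng(42)
n = 500
rfml = np.column_stack([
    rng.integers(1, 365, n),      # R
    rng.integers(1, 30, n),       # F
    rng.gamma(2.0, 500.0, n),     # M
    rng.integers(1, 1000, n),     # L
])

# Weighted RFML score used as a simple CLV proxy (the weights are assumptions).
weights = np.array([-0.2, 0.3, 0.3, 0.2])      # recency counts against value
scaled = StandardScaler().fit_transform(rfml)
clv_proxy = scaled @ weights

# Segment patients with K-means; cluster labels can then feed a decision tree.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scaled)
for c in range(4):
    mask = kmeans.labels_ == c
    print(f"cluster {c}: {mask.sum():3d} patients, mean CLV proxy {clv_proxy[mask].mean():+.2f}")
```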

  11. Knowledge discovery from patients’ behavior via clustering-classification algorithms based on weighted eRFM and CLV model: An empirical study in public health care services

    Science.gov (United States)

    Zare Hosseini, Zeinab; Mohammadzadeh, Mahdi

    2016-01-01

    The rapid growth of information technology (IT) motivates and creates competitive advantages in the health care industry. Nowadays, many hospitals try to build successful customer relationship management (CRM) to recognize target and potential patients, increase patient loyalty and satisfaction, and finally maximize their profitability. Many hospitals have large data warehouses containing customer demographic and transaction information. Data mining techniques can be used to analyze these data and discover hidden knowledge about customers. This research develops an extended RFM model, namely RFML (added parameter: Length), based on health care services for a public sector hospital in Iran, with the idea that there is a contrast between patient and customer loyalty, to estimate customer lifetime value (CLV) for each patient. We used Two-step and K-means algorithms as clustering methods and a Decision tree (CHAID) as the classification technique to segment the patients and find out target, potential and loyal customers in order to implement and strengthen CRM. Two approaches are used for classification: first, the result of clustering is considered as the Decision attribute in the classification process, and second, the result of segmentation based on the CLV value of patients (estimated by RFML) is considered as the Decision attribute. Finally, the results of the CHAID algorithm show the significant hidden rules and identify existing patterns of hospital consumers. PMID:27610177

  12. An empirically based model for knowledge management in health care organizations.

    Science.gov (United States)

    Sibbald, Shannon L; Wathen, C Nadine; Kothari, Anita

    2016-01-01

    Knowledge management (KM) encompasses strategies, processes, and practices that allow an organization to capture, share, store, access, and use knowledge. Ideal KM combines different sources of knowledge to support innovation and improve performance. Despite the importance of KM in health care organizations (HCOs), there has been very little empirical research to describe KM in this context. This study explores KM in HCOs, focusing on the status of current intraorganizational KM. The intention is to provide insight for future studies and model development for effective KM implementation in HCOs. A qualitative methods approach was used to create an empirically based model of KM in HCOs. Methods included (a) qualitative interviews (n = 24) with senior leadership to identify types of knowledge important in these roles plus current information-seeking behaviors/needs and (b) in-depth case study with leaders in new executive positions (n = 2). The data were collected from 10 HCOs. Our empirically based model for KM was assessed for face and content validity. The findings highlight the paucity of formal KM in our sample HCOs. Organizational culture, leadership, and resources are instrumental in supporting KM processes. An executive's knowledge needs are extensive, but knowledge assets are often limited or difficult to acquire as much of the available information is not in a usable format. We propose an empirically based model for KM to highlight the importance of context (internal and external), and knowledge seeking, synthesis, sharing, and organization. Participants who reviewed the model supported its basic components and processes, and potential for incorporating KM into organizational processes. Our results articulate ways to improve KM, increase organizational learning, and support evidence-informed decision-making. This research has implications for how to better integrate evidence and knowledge into organizations while considering context and the role of

  13. Application of GIS to Empirical Windthrow Risk Model in Mountain Forested Landscapes

    Directory of Open Access Journals (Sweden)

    Lukas Krejci

    2018-02-01

    Full Text Available Norway spruce dominates mountain forests in Europe. Natural variations in the mountainous coniferous forests are strongly influenced by all the main components of forest and landscape dynamics: species diversity, the structure of forest stands, nutrient cycling, carbon storage, and other ecosystem services. This paper deals with an empirical windthrow risk model based on the integration of logistic regression into GIS to assess forest vulnerability to wind disturbance in the mountain spruce forests of Šumava National Park (Czech Republic). It is an area where forest management has been the focus of international discussions by conservationists, forest managers, and stakeholders. The authors developed the empirical windthrow risk model, which involves designing an optimized data structure containing the dependent and independent variables entering the logistic regression. The results from the model, visualized in the form of map outputs, outline the probability of risk to forest stands from wind in the examined territory of the national park. Such an application of the empirical windthrow risk model could be used as a decision support tool for the mountain spruce forests in the study area. Future development of these models could be useful for other protected European mountain forests dominated by Norway spruce.
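    A minimal sketch of fitting such a logistic-regression windthrow model on per-stand attributes extracted from GIS layers is shown below; the predictor names, coefficients, and data are hypothetical and only illustrate the general workflow.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical stand attributes extracted from GIS layers.
rng = np.random.default_rng(7)
n = 2000
elevation = rng.uniform(600, 1400, n)          # m a.s.l.
stand_height = rng.uniform(10, 40, n)          # m
exposure = rng.uniform(0, 1, n)                # topographic exposure index
soil_wet = rng.integers(0, 2, n)               # waterlogged soil flag

# Hypothetical outcome: 1 = stand damaged by wind in the study period.
logit = -6 + 0.15 * stand_height + 2.0 * exposure + 1.0 * soil_wet
damaged = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([elevation, stand_height, exposure, soil_wet])
model = LogisticRegression(max_iter=1000).fit(X, damaged)

# Probability surface for the stands -> can be written back to a GIS raster.
p = model.predict_proba(X)[:, 1]
print("mean predicted windthrow probability:", round(p.mean(), 3))
```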

  14. Physical Limitations of Empirical Field Models: Force Balance and Plasma Pressure

    International Nuclear Information System (INIS)

    Sorin Zaharia; Cheng, C.Z.

    2002-01-01

    In this paper, we study whether the magnetic field of the T96 empirical model can be in force balance with an isotropic plasma pressure distribution. Using the field of T96, we obtain values for the pressure P by solving a Poisson-type equation ∇²P = ∇ · (J × B) in the equatorial plane, and 1-D profiles on the Sun-Earth axis by integrating ∇P = J × B. We work in a flux coordinate system in which the magnetic field is expressed in terms of Euler potentials. Our results lead to the conclusion that the T96 model field cannot be in equilibrium with an isotropic pressure. We also analyze in detail the computation of Birkeland currents using the Vasyliunas relation and the T96 field, which yields unphysical results, again indicating the lack of force balance in the empirical model. The underlying reason for the force imbalance is likely the fact that the derivatives of the least-square fitted model B are not accurate predictions of the actual magnetospheric field derivatives. Finally, we discuss a possible solution to the problem of lack of force balance in empirical field models
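    The relations above follow from the standard magnetostatic force-balance argument: for an isotropic pressure, J × B = ∇P; taking the divergence of both sides gives the Poisson-type equation solved in the equatorial plane, and integrating the balance along the Sun-Earth axis gives the 1-D pressure profile.

```latex
\begin{align}
  \mathbf{J} \times \mathbf{B} &= \nabla P ,\\
  \nabla^{2} P &= \nabla \cdot \left( \mathbf{J} \times \mathbf{B} \right) ,\\
  P(x) &= P(x_{0}) + \int_{x_{0}}^{x} \left( \mathbf{J} \times \mathbf{B} \right) \cdot \mathrm{d}\boldsymbol{\ell} .
\end{align}
```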

  15. Development of econometric models for cost and time over-runs: an empirical study of major road construction projects in pakistan

    International Nuclear Information System (INIS)

    Khan, A.; Chaudhary, M.A.

    2016-01-01

    The construction industry is flourishing worldwide and contributes about 10% to the GDP of the world, i.e. to the tune of 4.6 trillion US dollars. It employs almost 7% of the total employed persons and consumes around 40% of the total energy. The Pakistani construction sector has displayed impressive growth in recent years. An efficient road network is a key part of the construction business and plays a significant role in the economic uplift of a country. Overruns in costs and delays in the completion of projects are very common phenomena, and it has been observed that projects involving the construction of roads also face problems of delays and cost overruns, especially in developing countries. The causes of cost overruns and delays in road projects being undertaken by the premier road construction organization of Pakistan, the National Highway Authority (NHA), have been considered in this study. This has been done specifically in the context of the impact of the cause(s) determined from the project reports of a total of one hundred and thirty-one (131) projects. The ten causative factors which we recognize as Design, Planning and Scheduling Related problems, Financial Constraint Related reasons, Social Problem Related reasons, Technical Reasons, Administrative Reasons, Scope Increase, Specification Changes, Cost Escalation Related reasons, Non-Availability of Equipment or Material and Force Majeure play a commanding role in the determination of the cost and time overruns. It has also been observed that among these identified causes, the factors of Administrative Reasons, Design, Planning and Scheduling Related problems, Technical Reasons and Force Majeure are the most significant reasons for cost and time overruns, whereas the Cost Escalation Related reasons have the least impact on cost increases and delays. The NHA possesses a financial worth of around Rs. 36 billion and, with an annual turnover amounting to Rs. 22 billion, is responsible for performing road construction projects in the entire

  16. Lessons from empirical studies in product and service variety management.

    Directory of Open Access Journals (Sweden)

    Andrew C.L. Lyons

    2013-07-01

    Full Text Available For many years, a trend for businesses has been to increase market segmentation and extend product and service-variety offerings in order to provide more choice for customers and gain a competitive advantage. However, relatively few variety-related empirical studies have been undertaken. In this research, two empirical studies are presented that address the impact of product and service variety on business and business function performance. In the first (service-variety) study, the focus concerns the relationship between service provision offered by UK-based, third-party logistics (3PL) providers and the operational and financial performance of those providers. Here, the results of a large survey identify the most important services offered by 3PLs and the most important aspects of 3PL operational performance. Also, the research suggests that the range of service variety offered by 3PLs does not directly influence the 3PLs’ financial performance. The second (product-variety) study presents the findings from an analysis of data from 163 manufacturing plants where the impact of product variety on the performance of five business functions is examined. An increase in product variety was found to influence business functions differently depending on the combination of customisation and variety offered to customers

  17. Process health management using success tree and empirical model

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Kim, Suyoung [BNF Technology, Daejeon (Korea, Republic of); Sung, Wounkyoung [Korea South-East Power Co. Ltd., Seoul (Korea, Republic of)

    2012-03-15

    Interest in predictive or condition-based maintenance is heightening in power industries. The ultimate goal of condition-based maintenance is to prioritize and optimize the maintenance resources by taking a reasonable decision-making process depending on the plant's conditions. Such a decision-making process should be able to not only observe the deviation from a normal state but also determine the severity or impact of the deviation on different levels such as a component, a system, or a plant. In order to achieve this purpose, a Plant Health Index (PHI) monitoring system was developed, which is operational in more than 10 units of large steam turbine cycles in Korea as well as desalination plants in Saudi Arabia as a proto-type demonstration. The PHI monitoring system has the capability to detect whether the deviation between a measured and an estimated parameter, which is the result of kernel regression using the accumulated operation data and the current plant boundary conditions (referred to as an empirical model), is statistically meaningful. This deviation is converted into an index considering the margin to set points which are associated with safety. This index is referred to as a PHI, and the PHIs can be monitored for an individual parameter as well as at the component, system, or plant level. In order to organize the PHIs at the component, system, or plant level, a success tree was developed. At the top of the success tree, the PHI represents the health status of the plant, while the PHI nodes in the middle of the success tree represent the health status of a component or a system. The concept and definition of the PHI, the key methodologies, the architecture of the developed system, and a practical case of using the PHI monitoring system are described in this article.
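    A minimal sketch of a kernel-regression estimate and a deviation-based index of this kind is given below; it assumes a Nadaraya-Watson estimator, a hypothetical safety margin, and synthetic plant data, and does not reproduce the internals of the PHI system.

```python
import numpy as np

def kernel_estimate(X_train, y_train, x_query, bandwidth=1.0):
    """Nadaraya-Watson kernel regression: expected sensor value at x_query.

    X_train : (n, d) historical plant boundary conditions (e.g. load, ambient T)
    y_train : (n,) historical values of the monitored parameter
    x_query : (d,) current boundary conditions
    """
    d2 = np.sum((X_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return np.sum(w * y_train) / np.sum(w)

def health_index(measured, estimated, setpoint_margin):
    """Map the measured-vs-estimated deviation to a 0-1 index.

    1.0 means no deviation; 0.0 means the deviation has consumed the whole
    margin to the safety-related set point (the margin value is hypothetical).
    """
    deviation = abs(measured - estimated)
    return float(np.clip(1.0 - deviation / setpoint_margin, 0.0, 1.0))

# Hypothetical example: condenser pressure vs. unit load and cooling-water temperature.
rng = np.random.default_rng(3)
X = np.column_stack([rng.uniform(0.5, 1.0, 500), rng.uniform(10, 30, 500)])
y = 3.0 + 4.0 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 0.05, 500)

scale = X.std(axis=0)                      # put both features on a comparable scale
x_now = np.array([0.9, 22.0])
expected = kernel_estimate(X / scale, y, x_now / scale, bandwidth=0.5)
print("PHI:", health_index(measured=expected + 0.3, estimated=expected, setpoint_margin=1.0))
```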

  18. Process health management using success tree and empirical model

    International Nuclear Information System (INIS)

    Heo, Gyunyoung; Kim, Suyoung; Sung, Wounkyoung

    2012-01-01

    Interest in predictive or condition-based maintenance is heightening in power industries. The ultimate goal of condition-based maintenance is to prioritize and optimize the maintenance resources by taking a reasonable decision-making process depending on the plant's conditions. Such a decision-making process should be able to not only observe the deviation from a normal state but also determine the severity or impact of the deviation on different levels such as a component, a system, or a plant. In order to achieve this purpose, a Plant Health Index (PHI) monitoring system was developed, which is operational in more than 10 units of large steam turbine cycles in Korea as well as desalination plants in Saudi Arabia as a proto-type demonstration. The PHI monitoring system has the capability to detect whether the deviation between a measured and an estimated parameter, which is the result of kernel regression using the accumulated operation data and the current plant boundary conditions (referred to as an empirical model), is statistically meaningful. This deviation is converted into an index considering the margin to set points which are associated with safety. This index is referred to as a PHI, and the PHIs can be monitored for an individual parameter as well as at the component, system, or plant level. In order to organize the PHIs at the component, system, or plant level, a success tree was developed. At the top of the success tree, the PHI represents the health status of the plant, while the PHI nodes in the middle of the success tree represent the health status of a component or a system. The concept and definition of the PHI, the key methodologies, the architecture of the developed system, and a practical case of using the PHI monitoring system are described in this article.

  19. Modeling gallic acid production rate by empirical and statistical analysis

    Directory of Open Access Journals (Sweden)

    Bratati Kar

    2000-01-01

    Full Text Available For predicting the rate of the enzymatic reaction, empirical correlations based on the experimental results obtained under various operating conditions have been developed. The models represent both the activation and deactivation conditions of enzymatic hydrolysis, and the results have been analyzed by analysis of variance (ANOVA). The tannase activity was found to be maximum at an incubation time of 5 min, reaction temperature 40ºC, pH 4.0, initial enzyme concentration 0.12 v/v, initial substrate concentration 0.42 mg/ml, and ionic strength 0.2 M; under these optimal conditions, the maximum rate of gallic acid production was 33.49 µmoles/ml/min.

  20. An empirical study of the information premium on electricity markets

    International Nuclear Information System (INIS)

    Benth, Fred Espen; Biegler-König, Richard; Kiesel, Rüdiger

    2013-01-01

    Due to the non-storability of electricity and the resulting lack of arbitrage-based arguments to price electricity forward contracts, a significant time-varying risk premium is exhibited. Using EEX data during the introduction of emission certificates and the German “Atom Moratorium” we show that a significant part of the risk premium in electricity forwards is due to different information sets in spot and forward markets. In order to show the existence of the resulting information premium and to analyse its size we design an empirical method based on techniques relating to enlargement of filtrations and the structure of Hilbert spaces. - Highlights: ► Electricity is non-storable and the classical spot–forward-relationship is invalid. ► Future information will cause an information premium for forward contracts. ► We model this premium mathematically using enlargement of filtrations. ► We develop a statistical method testing for the information premium empirically. ► We apply the test to the 2nd phase of the EUETS and the German “Atom Moratorium”

  1. Bridging process-based and empirical approaches to modeling tree growth

    Science.gov (United States)

    Harry T. Valentine; Annikki Makela

    2005-01-01

    The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...

  2. A generalized preferential attachment model for business firms growth rates. I. Empirical evidence

    Science.gov (United States)

    Pammolli, F.; Fu, D.; Buldyrev, S. V.; Riccaboni, M.; Matia, K.; Yamasaki, K.; Stanley, H. E.

    2007-05-01

    We introduce a model of proportional growth to explain the distribution P(g) of business firm growth rates. The model predicts that P(g) is Laplace in the central part and exhibits asymptotic power-law behavior in the tails with an exponent ζ = 3. Because of data limitations, previous studies in this field have focused exclusively on the Laplace shape of the body of the distribution. We test the model at different levels of aggregation in the economy, from products, to firms, to countries, and we find that the predictions are in good agreement with empirical evidence on both growth distributions and size-variance relationships.

  3. Bias-dependent hybrid PKI empirical-neural model of microwave FETs

    Science.gov (United States)

    Marinković, Zlatica; Pronić-Rančić, Olivera; Marković, Vera

    2011-10-01

    Empirical models of microwave transistors based on an equivalent circuit are valid for only one bias point. Bias-dependent analysis requires repeated extractions of the model parameters for each bias point. In order to make the model bias-dependent, a new hybrid empirical-neural model of microwave field-effect transistors is proposed in this article. The model is a combination of an equivalent circuit model including noise developed for one bias point and two prior knowledge input artificial neural networks (PKI ANNs) aimed at introducing bias dependency of the scattering (S) and noise parameters, respectively. The prior knowledge of the proposed ANNs involves the values of the S- and noise parameters obtained by the empirical model. The proposed hybrid model is valid in the whole range of bias conditions. Moreover, the proposed model provides better accuracy than the empirical model, which is illustrated by an appropriate modelling example of a pseudomorphic high-electron mobility transistor device.

  4. Space evolution model and empirical analysis of an urban public transport network

    Science.gov (United States)

    Sui, Yi; Shao, Feng-jing; Sun, Ren-cheng; Li, Shu-jing

    2012-07-01

    This study explores the space evolution of an urban public transport network, using empirical evidence and a simulation model validated on those data. Public transport patterns primarily depend on the spatial distribution of traffic, the demands of passengers, and the expected utility of investors. Evolution is an iterative process of satisfying the needs of passengers and investors based on a given spatial distribution of traffic. The temporal change of the urban public transport network is evaluated using both topological and spatial measures. The simulation model is validated using empirical data from nine big cities in China. Statistical analyses of topological and spatial attributes suggest that an evolution network with traffic demands characterized by power-law values distributed in a pattern of concentric circles tallies well with these nine cities.

  5. The Role of Ethnographic Studies in Empirical Software Engineering

    DEFF Research Database (Denmark)

    Sharp, Helen; Dittrich, Yvonne; Souza, Cleidson R. B. de

    2016-01-01

    Ethnography is a qualitative research method used to study people and cultures. It is largely adopted in disciplines outside software engineering, including different areas of computer science. Ethnography can provide an in-depth understanding of the socio-technological realities surrounding everyday software development practice, i.e., it can help to uncover not only what practitioners do, but also why they do it. Despite its potential, ethnography has not been widely adopted by empirical software engineering researchers, and receives little attention in the related literature. The main goal of this paper is to present ethnography as a useful and usable approach to empirical software engineering research. Throughout the paper, relevant examples of ethnographic studies of software practice are used to illustrate the points being made.

  6. Matrix effect studies with empirical formulations in maize saplings

    International Nuclear Information System (INIS)

    Bansal, Meenakshi; Deep, Kanan; Mittal, Raj

    2012-01-01

    In X-ray fluorescence, the matrix effects derived earlier from fundamental relations of the intensities of analyte/matrix elements with basic atomic and experimental setup parameters, and tested on synthetic known samples, were found to be empirically related to analyte/matrix elemental amounts. The present study involves the application of these relations to the potassium and calcium macronutrients of maize saplings treated with different fertilizers. The novelty of the work lies in determining an element in the presence of its secondary excitation rather than avoiding the secondary fluorescence. Therefore, the possible utility of this process is in studying the absorption for intermediate samples in a batch of samples of the same category with close-Z interfering constituents (just like Ca and K). Once the absorption and enhancement terms are fitted to elemental amounts and the fitted coefficients are determined, with the absorption terms from the fit and an enhancer element amount known from its selective excitation, the next iterative elemental amount can be directly evaluated from the relations. - Highlights: ► Empirical formulation for matrix corrections in terms of amounts of analyte and matrix elements. ► The study was applied to K and Ca nutrients of maize, rice and potato organic materials. ► The formulation provides matrix terms from amounts of analyte/matrix elements and vice versa.

  7. Dynamic gradient descent learning algorithms for enhanced empirical modeling of power plants

    International Nuclear Information System (INIS)

    Parlos, A.G.; Atiya, Amir; Chong, K.T.

    1991-01-01

    A newly developed dynamic gradient descent-based learning algorithm is used to train a recurrent multilayer perceptron network for use in empirical modeling of power plants. The two main advantages of the proposed learning algorithm are its ability to consider past error gradient information for future use and the two forward passes associated with its implementation, instead of one forward and one backward pass of the backpropagation algorithm. The latter advantage results in computational time saving because both passes can be performed simultaneously. The dynamic learning algorithm is used to train a hybrid feedforward/feedback neural network, a recurrent multilayer perceptron, which was previously found to exhibit good interpolation and extrapolation capabilities in modeling nonlinear dynamic systems. One of the drawbacks, however, of the previously reported work has been the long training times associated with accurate empirical models. The enhanced learning capabilities provided by the dynamic gradient descent-based learning algorithm are demonstrated by a case study of a steam power plant. The number of iterations required for accurate empirical modeling has been reduced from tens of thousands to hundreds, thus significantly expediting the learning process

  8. Empirical Validation of a Thermal Model of a Complex Roof Including Phase Change Materials

    Directory of Open Access Journals (Sweden)

    Stéphane Guichard

    2015-12-01

    Full Text Available This paper deals with the empirical validation of a building thermal model of a complex roof including a phase change material (PCM). A mathematical model dedicated to PCMs based on the apparent heat capacity method was implemented in a multi-zone building simulation code, the aim being to increase the understanding of the thermal behavior of the whole building with PCM technologies. In order to empirically validate the model, the methodology is based on both numerical and experimental studies. A parametric sensitivity analysis was performed and a set of parameters of the thermal model was identified for optimization. The use of the generic optimization program called GenOpt® coupled to the building simulation code enabled the determination of an adequate set of parameters. We first present the empirical validation methodology and the main results of previous work. We then give an overview of GenOpt® and its coupling with the building simulation code. Finally, once the optimization results are obtained, comparisons of the thermal predictions with measurements are found to be acceptable and are presented.

  9. Does the Spectrum model accurately predict trends in adult mortality? Evaluation of model estimates using empirical data from a rural HIV community cohort study in north-western Tanzania

    Directory of Open Access Journals (Sweden)

    Denna Michael

    2014-01-01

    Full Text Available Introduction: Spectrum epidemiological models are used by UNAIDS to provide global, regional and national HIV estimates and projections, which are then used for evidence-based health planning for HIV services. However, there are no validations of the Spectrum model against empirical serological and mortality data from populations in sub-Saharan Africa. Methods: Serologic, demographic and verbal autopsy data have been regularly collected among over 30,000 residents in north-western Tanzania since 1994. Five-year age-specific mortality rates (ASMRs) per 1,000 person years and the probability of dying between 15 and 60 years of age (45Q15) were calculated and compared with the Spectrum model outputs. Mortality trends by HIV status are shown for periods before the introduction of antiretroviral therapy (1994–1999, 2000–2005) and the first 5 years afterwards (2005–2009). Results: Among 30–34 year olds of both sexes, observed ASMRs per 1,000 person years were 13.33 (95% CI: 10.75–16.52) in the period 1994–1999, 11.03 (95% CI: 8.84–13.77) in 2000–2004, and 6.22 (95% CI: 4.75–8.15) in 2005–2009. Among the same age group, the ASMRs estimated by the Spectrum model were 10.55, 11.13 and 8.15 for the periods 1994–1999, 2000–2004 and 2005–2009, respectively. The cohort data, for both sexes combined, showed that the 45Q15 declined from 39% (95% CI: 27–55%) in 1994 to 22% (95% CI: 17–29%) in 2009, whereas the Spectrum model predicted a decline from 43% in 1994 to 37% in 2009. Conclusion: From 1994 to 2009, the observed decrease in ASMRs was steeper in younger age groups than that predicted by the Spectrum model, perhaps because the Spectrum model under-estimated the ASMRs in 30–34 year olds in 1994–99. However, the Spectrum model predicted a greater decrease in 45Q15 mortality than observed in the cohort, although the reasons for this over-estimate are unclear.

  10. Development of an empirical model of turbine efficiency using the Taylor expansion and regression analysis

    International Nuclear Information System (INIS)

    Fang, Xiande; Xu, Yu

    2011-01-01

    The empirical model of turbine efficiency is necessary for the control- and/or diagnosis-oriented simulation and useful for the simulation and analysis of dynamic performances of the turbine equipment and systems, such as air cycle refrigeration systems, power plants, turbine engines, and turbochargers. Existing empirical models of turbine efficiency are insufficient because there is no suitable form available for air cycle refrigeration turbines. This work performs a critical review of empirical models (called mean value models in some literature) of turbine efficiency and develops an empirical model in the desired form for air cycle refrigeration, the dominant cooling approach in aircraft environmental control systems. The Taylor series and regression analysis are used to build the model, with the Taylor series being used to expand functions with the polytropic exponent and the regression analysis to finalize the model. The measured data of a turbocharger turbine and two air cycle refrigeration turbines are used for the regression analysis. The proposed model is compact and able to present the turbine efficiency map. Its predictions agree with the measured data very well, with the corrected coefficient of determination Rc² ≥ 0.96 and the mean absolute percentage deviation = 1.19% for the three turbines. -- Highlights: → Performed a critical review of empirical models of turbine efficiency. → Developed an empirical model in the desired form for air cycle refrigeration, using the Taylor expansion and regression analysis. → Verified the method for developing the empirical model. → Verified the model.
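    The abstract gives the model form only implicitly (a Taylor expansion in the polytropic exponent followed by regression). The sketch below merely illustrates fitting a generic two-variable polynomial efficiency map by least squares; the variables, polynomial form, and data are assumptions, not the paper's model.

```python
import numpy as np

def fit_efficiency_map(pressure_ratio, corrected_speed, efficiency):
    """Least-squares fit of a polynomial efficiency map eta(PR, N).

    The quadratic-in-two-variables form is an illustrative stand-in for the
    Taylor-expansion-based model described in the abstract.
    """
    pr, n = pressure_ratio, corrected_speed
    X = np.column_stack([np.ones_like(pr), pr, n, pr * n, pr ** 2, n ** 2])
    coef, *_ = np.linalg.lstsq(X, efficiency, rcond=None)
    fitted = X @ coef
    ss_res = np.sum((efficiency - fitted) ** 2)
    ss_tot = np.sum((efficiency - efficiency.mean()) ** 2)
    return coef, 1.0 - ss_res / ss_tot     # coefficients and R^2

# Hypothetical measured points from a turbine map.
rng = np.random.default_rng(5)
pr = rng.uniform(1.5, 4.0, 200)
n = rng.uniform(0.6, 1.1, 200)                      # corrected speed (normalized)
eta = 0.85 - 0.05 * (pr - 2.5) ** 2 - 0.2 * (n - 0.9) ** 2 + rng.normal(0, 0.005, 200)

coef, r2 = fit_efficiency_map(pr, n, eta)
print(f"R^2 of fitted efficiency map: {r2:.3f}")
```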

  11. Abnormal Time Experiences in Major Depression: An Empirical Qualitative Study.

    Science.gov (United States)

    Stanghellini, Giovanni; Ballerini, Massimo; Presenza, Simona; Mancini, Milena; Northoff, Georg; Cutting, John

    2017-01-01

    Phenomenological psychopathology, through theoretical and idiographic studies, conceptualizes major depressive disorder (MDD) as a disorder of time experience. Investigations of abnormal time experience (ATE) in MDD adopting methodologies required by the standards of the empirical sciences are still lacking. Our study aimed to provide a qualitative analysis, on an empirical ground and on a large scale, of narratives of temporal experiences of persons affected by MDD. We interviewed 550 consecutive patients affected by affective and schizophrenic disorders. Clinical files were analysed by means of consensual qualitative research. Out of 100 MDD patients, 96 reported at least 1 ATE. The principal categories of ATE are vital retardation - the experience of a stagnation of endogenous vital processes (37 patients), the experience of present and future dominated by the past (29 patients), and the experience of the slackening of the flow of time (25 patients). A comparison with ATE in schizophrenia patients showed that in MDD, unlike in schizophrenia, there is no disarticulation of time experience (disorder of temporal synthesis) but rather a disorder of conation or inhibition of becoming. The interview style was not meant to make a quantitative assessment ("false negatives" cannot be excluded). Our findings confirm the relevance of distinctive features of ATE in MDD, support the hypothesis of an intrinsically disordered temporal structure in depressive symptoms, and may have direct implications in clinical practice, especially in relation to differential diagnosis, setting the boundaries between "true" and milder forms of depression, and neurobiological research. © 2016 S. Karger AG, Basel.

  12. A behavioural approach to financial portfolio selection problem: an empirical study using heuristics

    OpenAIRE

    Grishina, Nina

    2014-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. The behaviourally based portfolio selection problem with the investor's loss aversion and risk aversion biases in portfolio choice under uncertainty is studied. The main results of this work are developed heuristic approaches for the prospect theory and cumulative prospect theory models proposed by Kahneman and Tversky in 1979 and 1992, as well as an empirical comparative analysis of these models ...

  13. Empirical Modeling of Oxygen Uptake of Flow Over Stepped Chutes ...

    African Journals Online (AJOL)

    The present investigation evaluates the influence of three different step chute geometries when skimming flow was allowed over them, with the aim of determining the aerated flow length, which is a significant factor when developing empirical equations for estimating the aeration efficiency of flow. Overall, forty experiments were ...

  14. On the Complete Instability of Empirically Implemented Dynamic Leontief Models

    NARCIS (Netherlands)

    Steenge, A.E.

    1990-01-01

    On theoretical grounds, real world implementations of forward-looking dynamic Leontief systems were expected to be stable. Empirical work, however, showed the opposite to be true: all investigated systems proved to be unstable. In fact, an extreme form of instability ('complete instability')

  15. Corporate Diversification and Firm Performance: an Empirical Study

    Directory of Open Access Journals (Sweden)

    Olu Ojo

    2009-05-01

    Full Text Available The importance of diversification and performance in the strategic management literature is widely accepted among academics and practitioners. However, the proxies for performance and diversification that have been employed in past strategy research have not been unanimously agreed upon. This study addresses the current state of confusion that exists with regard to the impact of corporate diversification on firm performance in selected Nigerian companies. The reason for increased interest in diversification has always been the possibility that diversification is related to corporate performance. However, while this topic is rich in studies, the empirical evidence emerging from various studies about the effect of diversification on performance has so far yielded mixed results that are inconclusive and contradictory. In addition, despite the existence of these studies, very little attention has been given to companies in developing countries, including Nigeria. This means that there is a major gap in the relevant literature on developing countries which has to be covered by research. This research attempts to fill this gap by studying the situation of Nigerian companies and providing more empirical evidence on the effects of corporate diversification on firm performance based on individual company-level data. A survey research design was adopted in this study, with the application of a simple random sampling technique in selecting our case study companies as well as our respondents. Primary data were collected through a questionnaire. Data were analysed through descriptive statistics, and correlation and the coefficient of determination were used to test our hypotheses. It was discovered that diversification impacted the performance of these companies positively, and we recommend that these companies should engage in geographical diversification in addition to the other forms of diversification they are currently involved in for maximum performance.

  16. Context, Experience, Expectation, and Action—Towards an Empirically Grounded, General Model for Analyzing Biographical Uncertainty

    Directory of Open Access Journals (Sweden)

    Herwig Reiter

    2010-01-01

    Full Text Available The article proposes a general, empirically grounded model for analyzing biographical uncertainty. The model is based on findings from a qualitative-explorative study of transforming meanings of unemployment among young people in post-Soviet Lithuania. In a first step, the particular features of the uncertainty puzzle in post-communist youth transitions are briefly discussed. A historical event like the collapse of state socialism in Europe, similar to the recent financial and economic crisis, is a generator of uncertainty par excellence: it undermines the foundations of societies and the taken-for-grantedness of related expectations. Against this background, the case of a young woman and how she responds to the novel threat of unemployment in the transition to the world of work is introduced. Her uncertainty management in the specific time perspective of certainty production is then conceptually rephrased by distinguishing three types or levels of biographical uncertainty: knowledge, outcome, and recognition uncertainty. Biographical uncertainty, it is argued, is empirically observable through the analysis of acting and projecting at the biographical level. The final part synthesizes the empirical findings and the conceptual discussion into a stratification model of biographical uncertainty as a general tool for the biographical analysis of uncertainty phenomena. URN: urn:nbn:de:0114-fqs100120

  17. An empirical study on empowering private bank workers using EFQM

    Directory of Open Access Journals (Sweden)

    Jafar Beikzad

    2012-01-01

    Full Text Available Empowering workers plays an essential role in increasing productivity in any organization. Service industries such as insurance companies or banks mostly rely on their own people to retain their customers and incomes. The recent increase in the number of private banks in Iran has intensified competition among existing banks. The banking industry strives to empower its employees as much as possible in an attempt to maintain market share by not losing its customers. In this paper, we present an empirical study to detect the most important factors in empowering bank employees. The study is implemented for a recently established private bank with 228 people, using a questionnaire of 32 questions, 15 of which focus on empowering employees. The results are analyzed using statistical tests and descriptive methods. The results indicate that leadership, academic qualification, appropriate policy and strategy, cooperation, and processes play an important role in empowering and enabling the bank's employees.

  18. Comparing Web Applications with Desktop Applications: An Empirical Study

    DEFF Research Database (Denmark)

    Pop, Paul

    2002-01-01

    In recent years, many desktop applications have been ported to the world wide web in order to reduce (multiplatform) development, distribution and maintenance costs. However, there is little data concerning the usability of web applications, and the impact of their usability on the total cost of developing and using such applications. In this paper we present a comparison of web and desktop applications from the usability point of view. The comparison is based on an empirical study that investigates the performance of a group of users on two calendaring applications: Yahoo!Calendar and Microsoft Calendar. The study shows that in the case of web applications the performance of the users is significantly reduced, mainly because of the restricted interaction mechanisms provided by current web browsers.

  19. Trade liberalization, social policies and health: an empirical case study.

    Science.gov (United States)

    McNamara, Courtney

    2015-10-12

    This study investigates the health impacts of a major liberalization episode in the textile and clothing (T&C) sector. This episode triggered substantial shifts in employment across a wide range of countries. It is the first study to empirically link trade liberalization to health via changes in employment and offers some of the first empirical insights on how trade liberalization interacts with social policies to influence health. Data from 32 T&C reliant countries were analysed in reference to the pre- and post-liberalization periods of 2000-2004 and 2005-2009. Fuzzy-set qualitative comparative analysis (fsQCA) was used to examine the association between countries' a) level of development b) labour market and welfare state protections c) T&C employment changes and d) changes in adult female and infant mortality rates. Process tracing was used to further investigate these associations through twelve in-depth country studies. Results from the fsQCA relate changes in employment after the phase-out to both changing adult female and infant mortality rates. Findings from the in-depth country studies suggest that the worsening of adult female mortality rates is related to workers' lack of social protection, both in the context of T&C employment growth and loss. Overall, it is found that social protection is often inaccessible to the type of workers who may be the most vulnerable to processes of liberalization and that many workers are particularly vulnerable due to the structure of social protection policies. Social policies are therefore found to both moderate pathways to health and influence the type of health-related pathways resulting from trade liberalizing policies.

  20. Modelling metal speciation in the Scheldt Estuary: Combining a flexible-resolution transport model with empirical functions

    Energy Technology Data Exchange (ETDEWEB)

    Elskens, Marc [Vrije Universiteit Brussel, Analytical, Pleinlaan 2, BE-1050 Brussels (Belgium); Gourgue, Olivier [Université catholique de Louvain, Institute of Mechanics, Materials and Civil Engineering (IMMC), 4 Avenue G. Lemaître, bte L4.05.02, BE-1348 Louvain-la-Neuve (Belgium); Université catholique de Louvain, Georges Lemaître Centre for Earth and Climate Research (TECLIM), Place Louis Pasteur 2, bte L4.03.08, BE-1348 Louvain-la-Neuve (Belgium); Baeyens, Willy [Vrije Universiteit Brussel, Analytical, Pleinlaan 2, BE-1050 Brussels (Belgium); Chou, Lei [Université Libre de Bruxelles, Biogéochimie et Modélisation du Système Terre (BGéoSys) —Océanographie Chimique et Géochimie des Eaux, Campus de la Plaine —CP 208, Boulevard du Triomphe, BE-1050 Brussels (Belgium); Deleersnijder, Eric [Université catholique de Louvain, Institute of Mechanics, Materials and Civil Engineering (IMMC), 4 Avenue G. Lemaître, bte L4.05.02, BE-1348 Louvain-la-Neuve (Belgium); Université catholique de Louvain, Earth and Life Institute (ELI), Georges Lemaître Centre for Earth and Climate Research (TECLIM), Place Louis Pasteur 2, bte L4.03.08, BE-1348 Louvain-la-Neuve (Belgium); Leermakers, Martine [Vrije Universiteit Brussel, Analytical, Pleinlaan 2, BE-1050 Brussels (Belgium); and others

    2014-04-01

    Predicting metal concentrations in surface waters is an important step in the understanding and ultimately the assessment of the ecological risk associated with metal contamination. In terms of risk, an essential piece of information is the accurate knowledge of the partitioning of the metals between the dissolved and particulate phases, as the former species are generally regarded as the most bioavailable and thus harmful form. As a first step towards the understanding and prediction of metal speciation in the Scheldt Estuary (Belgium, the Netherlands), we carried out a detailed analysis of a historical dataset covering the period 1982–2011. This study reports on the results for two selected metals: Cu and Cd. Data analysis revealed that both the total metal concentration and the metal partitioning coefficient (K_d) could be predicted using relatively simple empirical functions of environmental variables such as salinity and suspended particulate matter concentration (SPM). The validity of these functions has been assessed by their application to salinity and SPM fields simulated by the hydro-environmental model SLIM. The high-resolution total and dissolved metal concentrations reconstructed using this approach compared surprisingly well with an independent set of validation measurements. These first results from the combined mechanistic-empirical model approach suggest that it may be an interesting tool for risk assessment studies, e.g. to help identify conditions associated with elevated (dissolved) metal concentrations. - Highlights: • Empirical functions were designed for assessing metal speciation in estuarine water. • The empirical functions were implemented in the hydro-environmental model SLIM. • Validation was carried out in the Scheldt Estuary using historical data 1982–2011. • This combined mechanistic-empirical approach is useful for risk assessment.
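    As a rough illustration of the kind of mechanistic-empirical coupling described above, the sketch below computes a dissolved fraction from a hypothetical empirical partitioning function of salinity and SPM. The functional form and all coefficients are assumptions for illustration only, not the regressions fitted in the study.

    ```python
    import numpy as np

    def log10_kd(salinity_psu, spm_mg_per_l, a=5.0, b=-0.4, c=-0.02):
        """Hypothetical empirical partition coefficient log10(Kd in L/kg).

        Decreasing trends with SPM (particle-concentration effect) and salinity
        are plausible but the coefficients are illustrative, not fitted values.
        """
        return a + b * np.log10(spm_mg_per_l) + c * salinity_psu

    def dissolved_fraction(salinity_psu, spm_mg_per_l):
        """Fraction of total metal in the dissolved phase for given fields."""
        kd = 10.0 ** log10_kd(salinity_psu, spm_mg_per_l)   # L/kg
        spm_kg_per_l = spm_mg_per_l * 1e-6                   # mg/L -> kg/L
        return 1.0 / (1.0 + kd * spm_kg_per_l)

    # Example: apply to simulated salinity/SPM fields (placeholders standing in
    # for hydro-environmental model output along the estuary).
    salinity = np.linspace(0.5, 30.0, 10)    # psu, upstream to mouth
    spm = np.linspace(120.0, 20.0, 10)       # mg/L, turbidity maximum upstream
    print(np.round(dissolved_fraction(salinity, spm), 3))
    ```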

  1. Semi-empirical modelization of charge funneling in a NP diode

    International Nuclear Information System (INIS)

    Musseau, O.

    1991-01-01

    Heavy-ion interaction with a semiconductor generates a high density of electron-hole pairs along the trajectory, and in a space-charge zone the collected charge is considerably increased. The chronology of this charge funneling is described by a semi-empirical model. From initial conditions characterizing the incident ion and the structure under study, the transient current, the collected charge and the funneling length can be evaluated directly, with good agreement. The model can be extrapolated to more complex structures

  2. Strategy for a Rock Mechanics Site Descriptive Model. Development and testing of the empirical approach

    International Nuclear Information System (INIS)

    Roeshoff, Kennert; Lanaro, Flavio; Lanru Jing

    2002-05-01

    This report presents the results of one part of a wider project to establish a methodology for determining the rock mechanics properties of the rock mass for the so-called Aespoe Test Case. The project consists of three major parts: an empirical part dealing with the characterisation of the rock mass by applying empirical methods, a part determining the rock mechanics properties of the rock mass through numerical modelling, and a third part carrying out numerical modelling to determine the stress state at Aespoe. All parts of the project were performed on the basis of a limited amount of data about the geology and mechanical tests on samples selected from the Aespoe Database. This report only considers the empirical approach. The purpose of the project is the development of a descriptive rock mechanics model for SKB's rock mass investigations for a final repository site. The empirical characterisation of the rock mass provides correlations with some of the rock mechanics properties of the rock mass, such as the deformation modulus, the friction angle and cohesion for a certain stress interval, and the uniaxial compressive strength. For the characterisation of the rock mass, several empirical methods were analysed and reviewed. Among those methods, some were chosen because they are robust, applicable and widespread in modern rock mechanics. Major weight was given to the well-known Tunnel Quality Index (Q) and Rock Mass Rating (RMR), but the Rock Mass Index (RMi), the Geological Strength Index (GSI) and Ramamurthy's criterion were also applied for comparison with the two classical methods. The process considered comprised: i) sorting the geometrical/geological/rock mechanics data; ii) identifying homogeneous rock volumes; iii) determining the input parameters for the empirical ratings for rock mass characterisation; and iv) evaluating the mechanical properties by using empirical relations with the rock mass ratings. By comparing the methodologies involved by the
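    The step of converting an empirical rating into mechanical properties can be illustrated with two widely cited relations between RMR and the rock mass deformation modulus (Bieniawski 1978; Serafim and Pereira 1983). The report may use different or adjusted correlations, so this is only a sketch of the general approach.

    ```python
    def deformation_modulus_gpa(rmr: float) -> float:
        """Estimate rock mass deformation modulus (GPa) from the RMR rating.

        Uses two classical empirical correlations:
          - Bieniawski (1978):        Em = 2*RMR - 100   (commonly applied for RMR > 50)
          - Serafim & Pereira (1983): Em = 10**((RMR - 10) / 40)
        These are generic literature relations, not necessarily those adopted
        in the SKB methodology.
        """
        if not 0 <= rmr <= 100:
            raise ValueError("RMR must lie between 0 and 100")
        if rmr > 50:
            return 2.0 * rmr - 100.0
        return 10.0 ** ((rmr - 10.0) / 40.0)

    for rmr in (40, 55, 70, 85):
        print(f"RMR = {rmr:3d}  ->  Em ~ {deformation_modulus_gpa(rmr):6.1f} GPa")
    ```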

  3. Strategy for a Rock Mechanics Site Descriptive Model. Development and testing of the empirical approach

    Energy Technology Data Exchange (ETDEWEB)

    Roeshoff, Kennert; Lanaro, Flavio [Berg Bygg Konsult AB, Stockholm (Sweden); Lanru Jing [Royal Inst. of Techn., Stockholm (Sweden). Div. of Engineering Geology

    2002-05-01

    This report presents the results of one part of a wider project to establish a methodology for determining the rock mechanics properties of the rock mass for the so-called Aespoe Test Case. The project consists of three major parts: an empirical part dealing with the characterisation of the rock mass by applying empirical methods, a part determining the rock mechanics properties of the rock mass through numerical modelling, and a third part carrying out numerical modelling to determine the stress state at Aespoe. All parts of the project were performed on the basis of a limited amount of data about the geology and mechanical tests on samples selected from the Aespoe Database. This report only considers the empirical approach. The purpose of the project is the development of a descriptive rock mechanics model for SKB's rock mass investigations for a final repository site. The empirical characterisation of the rock mass provides correlations with some of the rock mechanics properties of the rock mass, such as the deformation modulus, the friction angle and cohesion for a certain stress interval, and the uniaxial compressive strength. For the characterisation of the rock mass, several empirical methods were analysed and reviewed. Among those methods, some were chosen because they are robust, applicable and widespread in modern rock mechanics. Major weight was given to the well-known Tunnel Quality Index (Q) and Rock Mass Rating (RMR), but the Rock Mass Index (RMi), the Geological Strength Index (GSI) and Ramamurthy's criterion were also applied for comparison with the two classical methods. The process considered comprised: i) sorting the geometrical/geological/rock mechanics data; ii) identifying homogeneous rock volumes; iii) determining the input parameters for the empirical ratings for rock mass characterisation; and iv) evaluating the mechanical properties by using empirical relations with the rock mass ratings. By comparing the methodologies involved

  4. Assessment of empirical antibiotic therapy optimisation in six hospitals: an observational cohort study.

    Science.gov (United States)

    Braykov, Nikolay P; Morgan, Daniel J; Schweizer, Marin L; Uslan, Daniel Z; Kelesidis, Theodoros; Weisenberg, Scott A; Johannsson, Birgir; Young, Heather; Cantey, Joseph; Srinivasan, Arjun; Perencevich, Eli; Septimus, Edward; Laxminarayan, Ramanan

    2014-12-01

    Modification of empirical antimicrobials when warranted by culture results or clinical signs is recommended to control antimicrobial overuse and resistance. We aimed to assess the frequency with which patients were started on empirical antimicrobials, characteristics of the empirical regimen and the clinical characteristics of patients at the time of starting antimicrobials, patterns of changes to empirical therapy at different timepoints, and modifiable factors associated with changes to the initial empirical regimen in the first 5 days of therapy. We did a chart review of adult inpatients receiving one or more antimicrobials in six US hospitals on 4 days during 2009 and 2010. Our primary outcome was the modification of antimicrobial regimen on or before the 5th day of empirical therapy, analysed as a three-category variable. Bivariate analyses were used to establish demographic and clinical variables associated with the outcome. Variables with p values below 0·1 were included in a multivariable generalised linear latent and mixed model with multinomial logit link to adjust for clustering within hospitals and accommodate a non-binary outcome variable. Across the six study sites, 4119 (60%) of 6812 inpatients received antimicrobials. Of 1200 randomly selected patients with active antimicrobials, 730 (61%) met inclusion criteria. At the start of therapy, 220 (30%) patients were afebrile and had normal white blood cell counts. Appropriate cultures were collected from 432 (59%) patients, and 250 (58%) were negative. By the 5th day of therapy, 12·5% of empirical antimicrobials were escalated, 21·5% were narrowed or discontinued, and 66·4% were unchanged. Narrowing or discontinuation was more likely when cultures were collected at the start of therapy (adjusted OR 1·68, 95% CI 1·05-2·70) and no infection was noted on an initial radiological study (1·76, 1·11-2·79). Escalation was associated with multiple infection sites (2·54, 1·34-4·83) and a positive

  5. Empirical angle-dependent Biot and MBA models for acoustic anisotropy in cancellous bone

    International Nuclear Information System (INIS)

    Lee, Kang ll; Hughes, E R; Humphrey, V F; Leighton, T G; Choi, Min Joo

    2007-01-01

    The Biot and the modified Biot-Attenborough (MBA) models have been found useful to understand ultrasonic wave propagation in cancellous bone. However, neither of the models, as previously applied to cancellous bone, allows for the angular dependence of acoustic properties with direction. The present study aims to account for the acoustic anisotropy in cancellous bone, by introducing empirical angle-dependent input parameters, as defined for a highly oriented structure, into the Biot and the MBA models. The anisotropy of the angle-dependent Biot model is attributed to the variation in the elastic moduli of the skeletal frame with respect to the trabecular alignment. The angle-dependent MBA model employs a simple empirical way of using the parametric fit for the fast and the slow wave speeds. The angle-dependent models were used to predict both the fast and slow wave velocities as a function of propagation angle with respect to the trabecular alignment of cancellous bone. The predictions were compared with those of the Schoenberg model for anisotropy in cancellous bone and in vitro experimental measurements from the literature. The angle-dependent models successfully predicted the angular dependence of phase velocity of the fast wave with direction. The root-mean-square errors of the measured versus predicted fast wave velocities were 79.2 m s^-1 (angle-dependent Biot model) and 36.1 m s^-1 (angle-dependent MBA model). They also predicted the fact that the slow wave is nearly independent of propagation angle for angles about 50°, but consistently underestimated the slow wave velocity with the root-mean-square errors of 187.2 m s^-1 (angle-dependent Biot model) and 240.8 m s^-1 (angle-dependent MBA model). The study indicates that the angle-dependent models reasonably replicate the acoustic anisotropy in cancellous bone.
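    The reported root-mean-square errors are a straightforward comparison of measured and model-predicted velocities over the set of propagation angles; a minimal sketch of that evaluation step is shown below, with placeholder arrays standing in for the literature measurements and the model output.

    ```python
    import numpy as np

    def rmse(measured, predicted):
        """Root-mean-square error between measured and predicted wave speeds (m/s)."""
        measured = np.asarray(measured, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        return float(np.sqrt(np.mean((measured - predicted) ** 2)))

    # Placeholder data: fast-wave speed vs. propagation angle (deg) relative to
    # the trabecular alignment -- not the actual values from the cited study.
    angles = np.array([0, 15, 30, 45, 60, 75, 90])
    v_measured = np.array([2350, 2300, 2180, 2010, 1850, 1740, 1700])
    v_model = np.array([2400, 2330, 2200, 2040, 1880, 1760, 1690])

    print(f"fast-wave RMSE = {rmse(v_measured, v_model):.1f} m/s")
    ```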

  6. An empirical Bayesian approach for model-based inference of cellular signaling networks

    Directory of Open Access Journals (Sweden)

    Klinke David J

    2009-11-01

    Full Text Available Abstract. Background: A common challenge in systems biology is to infer mechanistic descriptions of a biological process given limited observations of the biological system. Mathematical models are frequently used to represent a belief about the causal relationships among proteins within a signaling network. Bayesian methods provide an attractive framework for inferring the validity of those beliefs in the context of the available data. However, efficient sampling of high-dimensional parameter space and appropriate convergence criteria provide barriers to implementing an empirical Bayesian approach. The objective of this study was to apply an adaptive Markov chain Monte Carlo technique to a typical study of cellular signaling pathways. Results: As an illustrative example, a kinetic model for the early signaling events associated with the epidermal growth factor (EGF) signaling network was calibrated against dynamic measurements observed in primary rat hepatocytes. A convergence criterion, based upon the Gelman-Rubin potential scale reduction factor, was applied to the model predictions. The posterior distributions of the parameters exhibited complicated structure, including significant covariance between specific parameters and a broad range of variance among the parameters. The model predictions, in contrast, were narrowly distributed and were used to identify areas of agreement among a collection of experimental studies. Conclusion: In summary, an empirical Bayesian approach was developed for inferring the confidence that one can place in a particular model that describes signal transduction mechanisms and for inferring inconsistencies in experimental measurements.
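    The Gelman-Rubin potential scale reduction factor used as the convergence criterion can be computed from a set of parallel MCMC chains as sketched below; this is the standard textbook formula applied to synthetic chains, not code from the study.

    ```python
    import numpy as np

    def potential_scale_reduction(chains):
        """Gelman-Rubin R-hat for a scalar quantity.

        chains: array of shape (m, n) -- m parallel chains, n samples each
        (post burn-in). Values close to 1 indicate approximate convergence.
        """
        chains = np.asarray(chains, dtype=float)
        m, n = chains.shape
        chain_means = chains.mean(axis=1)
        chain_vars = chains.var(axis=1, ddof=1)
        b = n * chain_means.var(ddof=1)       # between-chain variance
        w = chain_vars.mean()                 # within-chain variance
        var_hat = (n - 1) / n * w + b / n     # pooled posterior variance estimate
        return float(np.sqrt(var_hat / w))

    # Synthetic example: three chains sampling the same target distribution.
    rng = np.random.default_rng(0)
    chains = rng.normal(loc=0.0, scale=1.0, size=(3, 2000))
    print(f"R-hat = {potential_scale_reduction(chains):.3f}")   # ~1.00 when well mixed
    ```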

  7. An empirical model for independent control of variable speed refrigeration system

    International Nuclear Information System (INIS)

    Li Hua; Jeong, Seok-Kwon; Yoon, Jung-In; You, Sam-Sang

    2008-01-01

    This paper deals with an empirical dynamic model for decoupling control of the variable speed refrigeration system (VSRS). To cope with the inherent complexity and nonlinearity of the system dynamics, the model parameters are first obtained from experimental data. In the study, the dynamic characteristics of indoor temperature and superheat are assumed to follow a first-order model with time delay. As the compressor frequency and the opening angle of the electronic expansion valve vary, the indoor temperature and the superheat interfere with each other in the VSRS. Thus, decoupling models are proposed to eliminate such interference. Finally, the experimental and simulation results indicate that the proposed model offers a more tractable means of describing the actual VSRS compared to other models currently available.
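    A minimal sketch of the model structure described above -- first-order-plus-time-delay responses for indoor temperature and superheat, with a static decoupler built from an assumed steady-state gain matrix -- is given below. All gains, time constants and delays are hypothetical, not the experimentally identified values.

    ```python
    import numpy as np

    def fopdt_step(t, gain, tau, delay):
        """Step response of a first-order-plus-time-delay model."""
        t = np.asarray(t, dtype=float)
        y = gain * (1.0 - np.exp(-(t - delay) / tau))
        return np.where(t >= delay, y, 0.0)

    # Hypothetical steady-state gain matrix of the 2x2 plant:
    # rows: indoor temperature, superheat; columns: compressor frequency (Hz),
    # expansion-valve opening angle (deg).
    K = np.array([[-0.30, 0.05],
                  [ 0.10, -0.60]])

    # Static decoupler: pre-multiplying the inputs by inv(K) makes the
    # steady-state input-output map approximately diagonal.
    D = np.linalg.inv(K)
    print("K @ D =\n", np.round(K @ D, 6))   # ~ identity at steady state

    t = np.linspace(0.0, 600.0, 7)           # seconds
    print(np.round(fopdt_step(t, gain=-0.30, tau=120.0, delay=30.0), 3))
    ```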

  8. Narration and Escalation. An Empirical Study of Conflict Narratives

    Directory of Open Access Journals (Sweden)

    Evelyn Gius

    2016-06-01

    Full Text Available This article describes the methodology and the outcomes of an empirical study of conflict narratives. The narratological analysis deployed narratological categories in the structuralist tradition based on Genette and was conducted with the help of the text annotation tool CATMA. The analysis aimed at covering as many narratological phenomena as possible by establishing 14 fields of narratological phenomena that were annotated in a corpus of 39 factual narratives about situations at the workplace with and without conflicts. The evaluation of approximately 28,000 annotations brought to light a series of interrelations between narratological phenomena and the presence or absence of conflicts in the narratives. Additionally, this approach led to the identification of some oversights of narrative theory by detecting hitherto unnoticed interrelations among narratological concepts.

  9. Information Assurance in Saudi Organizations - An Empirical Study

    Science.gov (United States)

    Nabi, Syed Irfan; Mirza, Abdulrahman A.; Alghathbar, Khaled

    This paper presents selected results of a survey conducted to provide much-needed insight into the status of information security in Saudi Arabian organizations. The purpose of this research is to describe the state of information assurance in the Kingdom and to better understand the prevalent ground realities. The survey covered technical aspects of information security, risk management and information assurance management. The results provide deep insights into the existing level of information assurance in various sectors that can be helpful in better understanding the intricate details of the prevalent information security in the Kingdom. Also, the results can be very useful for information assurance policy makers in the government as well as private sector organizations. There are few empirical studies on information assurance governance available in the literature, especially about the Middle East and Saudi Arabia; therefore, the results are invaluable for information security researchers in improving the understanding of information assurance in this region and the Kingdom.

  10. Environmental management in Slovenian industrial enterprises - Empirical study

    Directory of Open Access Journals (Sweden)

    Vesna Čančer

    2002-01-01

    Full Text Available Stimulated by the firm belief that environmental management helps enterprises to achieve business success, expressed by a majority of managers in the sample enterprises, we present the results of an empirical study in the Slovene processing industry. The purpose of our research is to identify, analyse and present the importance of the environment in business decision-making; the role of environmental management in strategic decision-making and its distribution across the business functions; environmental performance in business processes; the use of methods for environmentally oriented business decision-making; and the developmental tendencies of environmental management in Slovene enterprises of the processing industry. We define the key drivers of environmental management and their effect on the environmental behaviour of these enterprises. We present and interpret data indicating that environmental management is driven not only by compliance and regulation, but also by competition and enterprises’ own initiative.

  11. Empirical study on social groups in pedestrian evacuation dynamics

    Science.gov (United States)

    von Krüchten, Cornelia; Schadschneider, Andreas

    2017-06-01

    Pedestrian crowds often include social groups, i.e. pedestrians that walk together because of social relationships. They show characteristic configurations and influence the dynamics of the entire crowd. In order to investigate the impact of social groups on evacuations, we performed an empirical study with pupils. Several evacuation runs with groups of different sizes and different interactions were performed. New group parameters are introduced which allow the dynamics of the groups and the configuration of the group members to be described quantitatively. The analysis shows a possible decrease of evacuation times for large groups due to self-ordering effects. Social groups can be approximated as ellipses that orientate along their direction of motion. Furthermore, explicitly cooperative behaviour among group members leads to a stronger aggregation of group members and an intermittent way of evacuation.

  12. Modeling social networks in geographic space: approach and empirical application

    NARCIS (Netherlands)

    Arentze, T.A.; Berg, van den P.E.W.; Timmermans, H.J.P.

    2012-01-01

    Social activities are responsible for a large proportion of travel demands of individuals. Modeling of the social network of a studied population offers a basis to predict social travel in a more comprehensive way than currently is possible. In this paper we develop a method to generate a whole

  13. ACCOUNTING MANIPULATION: AN EMPIRICAL STUDY REGARDING MANAGERS’ BEHAVIOR

    Directory of Open Access Journals (Sweden)

    Balaciu Diana Elisabeta

    2014-07-01

    Full Text Available The study analyses the behaviour of Romanian managers when dealing with creative accounting. To this end, we carried out an empirical study in Arad county, with the main objective of identifying managers’ perceptions of the usefulness of accounting information and of the other factors considered when making decisions. A further aim was to interpret managers’ tendency towards manipulation or strategic management of results. The empirical research was conducted between December 2013 and January 2014 using a survey, with a questionnaire as the research instrument. The questionnaires were posted on a website, and some were administered directly in the field to ensure a response rate of at least 30%. The feasibility of the questionnaire and of the measurement scale was checked using Cronbach's alpha. The results obtained after statistically processing the respondents’ answers and testing the research hypotheses show an increased interest among the managers of the investigated Arad firms in improving the quality of financial accounting information and in presenting users with as favourable an image as possible of the company’s performance; an inclination towards manipulating the accounting figures is also noticeable. On the other hand, the results demonstrate that, although most of the managers in the sample consider ethics a priority in decision-making, this does not prevent more than half of them from changing an accounting policy in a way that would distort the true and fair view but favour the company’s image. Such research helps lay the groundwork for future studies testing the reaction of professional accountants in our country to the phenomenon of creative accounting.

  14. Empirical pseudo-potential studies on electronic structure

    Indian Academy of Sciences (India)

    Theoretical investigations of electronic structure of quantum dots is of current interest in nanophase materials. Empirical theories such as effective mass approximation, tight binding methods and empirical pseudo-potential method are capable of explaining the experimentally observed optical properties. We employ the ...

  15. Empirical models for the estimation of global solar radiation with sunshine hours on horizontal surface in various cities of Pakistan

    International Nuclear Information System (INIS)

    Gadiwala, M.S.; Usman, A.; Akhtar, M.; Jamil, K.

    2013-01-01

    In developing countries like Pakistan, measured global solar radiation and its components are not available for all locations, so models that estimate global solar radiation from climatological parameters of the locations are required. Long-term solar radiation data are available for only five locations in Pakistan (Karachi, Quetta, Lahore, Multan and Peshawar), which together almost encompass the country's different geographical features. For this reason, in this study the mean monthly global solar radiation has been estimated using the empirical models of Angstrom, FAO, Glover-McCulloch, and Sangeeta & Tiwari, chosen for their diversity of approach and use of climatic and geographical parameters. Empirical constants for these models have been estimated and the results obtained by these models have been tested statistically. The results show encouraging agreement between estimated and measured values. The outcome of these empirical models will assist researchers working on solar energy estimation for locations with similar conditions.
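    The Angstrom-type models referred to above regress the clearness index H/H0 on the relative sunshine duration n/N to obtain location-specific constants. The sketch below fits such constants by least squares to synthetic monthly data, so the numbers are placeholders rather than the Pakistani station values.

    ```python
    import numpy as np

    # Synthetic monthly data standing in for a station record:
    # s = n/N (relative sunshine duration), k = H/H0 (clearness index).
    s = np.array([0.55, 0.60, 0.62, 0.68, 0.74, 0.78, 0.70, 0.66, 0.71, 0.69, 0.63, 0.57])
    k = np.array([0.48, 0.51, 0.52, 0.56, 0.60, 0.62, 0.57, 0.55, 0.58, 0.57, 0.53, 0.49])

    # Angstrom-Prescott model: H/H0 = a + b * (n/N); fit a, b by least squares.
    b, a = np.polyfit(s, k, deg=1)
    print(f"a = {a:.3f}, b = {b:.3f}")

    # Estimate monthly global radiation from extraterrestrial radiation H0 (MJ/m^2/day).
    H0 = np.full(12, 30.0)                    # placeholder H0 values
    H_est = H0 * (a + b * s)
    print(np.round(H_est, 2))
    ```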

  16. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for testing expected return models are widely used in the international academic literature. However, these methods have not been applied in Brazil in a systematic way; empirical studies using Brazilian stock market data generally concentrate only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, three-factor and four-factor models using a predictive methodology with two steps - time-series and cross-sectional regressions - with standard errors obtained by the technique of Fama and MacBeth (1973). The results indicated the superiority of the four-factor model over the three-factor model, and of the three-factor model over the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects do not seem to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor in explaining expected returns. These findings raise some questions, mainly because of the originality of the methodology in the local market and because this subject is still incipient and controversial in the Brazilian academic environment.
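    A compact sketch of the two-step predictive methodology (time-series regressions to estimate betas, then period-by-period cross-sectional regressions whose averaged slopes and time-series standard errors follow Fama and MacBeth, 1973) is given below on synthetic data; it is meant only to show the mechanics, not to reproduce the paper's estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    T, N, K = 240, 25, 3                    # months, assets, factors

    # Synthetic factor returns and asset excess returns.
    F = rng.normal(0.005, 0.03, size=(T, K))
    true_beta = rng.normal(1.0, 0.4, size=(N, K))
    R = F @ true_beta.T + rng.normal(0.0, 0.02, size=(T, N))

    # Step 1: time-series regressions -> estimated betas for each asset.
    X = np.column_stack([np.ones(T), F])
    coefs, *_ = np.linalg.lstsq(X, R, rcond=None)   # shape (K+1, N)
    betas = coefs[1:].T                              # shape (N, K)

    # Step 2: cross-sectional regression of returns on betas, period by period.
    Xcs = np.column_stack([np.ones(N), betas])
    lambdas = np.vstack([np.linalg.lstsq(Xcs, R[t], rcond=None)[0] for t in range(T)])

    # Fama-MacBeth estimates: time-series means and standard errors of the slopes.
    lam_mean = lambdas.mean(axis=0)
    lam_se = lambdas.std(axis=0, ddof=1) / np.sqrt(T)
    for j, (m, se) in enumerate(zip(lam_mean, lam_se)):
        name = "intercept" if j == 0 else f"factor {j}"
        print(f"{name}: premium = {m: .4f}, t-stat = {m / se: .2f}")
    ```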

  17. Empirical Reconstruction and Numerical Modeling of the First Geoeffective Coronal Mass Ejection of Solar Cycle 24

    Science.gov (United States)

    Wood, B. E.; Wu, C.-C.; Howard, R. A.; Socker, D. G.; Rouillard, A. P.

    2011-03-01

    We analyze the kinematics and morphology of a coronal mass ejection (CME) from 2010 April 3, which was responsible for the first significant geomagnetic storm of solar cycle 24. The analysis utilizes coronagraphic and heliospheric images from the two STEREO spacecraft, and coronagraphic images from SOHO/LASCO. Using an empirical three-dimensional (3D) reconstruction technique, we demonstrate that the CME can be reproduced reasonably well at all times with a 3D flux rope shape, but the case for a flux rope being the correct interpretation is not as strong as some events studied with STEREO in the past, given that we are unable to infer a unique orientation for the flux rope. A model with an orientation angle of -80° from the ecliptic plane (i.e., nearly N-S) works best close to the Sun, but a model at 10° (i.e., nearly E-W) works better far from the Sun. Both interpretations require the cross section of the flux rope to be significantly elliptical rather than circular. In addition to our empirical modeling, we also present a fully 3D numerical MHD model of the CME. This physical model appears to effectively reproduce aspects of the shape and kinematics of the CME's leading edge. It is particularly encouraging that the model reproduces the amount of interplanetary deceleration observed for the CME during its journey from the Sun to 1 AU.

  18. EMPIRICAL RECONSTRUCTION AND NUMERICAL MODELING OF THE FIRST GEOEFFECTIVE CORONAL MASS EJECTION OF SOLAR CYCLE 24

    International Nuclear Information System (INIS)

    Wood, B. E.; Wu, C.-C.; Howard, R. A.; Socker, D. G.; Rouillard, A. P.

    2011-01-01

    We analyze the kinematics and morphology of a coronal mass ejection (CME) from 2010 April 3, which was responsible for the first significant geomagnetic storm of solar cycle 24. The analysis utilizes coronagraphic and heliospheric images from the two STEREO spacecraft, and coronagraphic images from SOHO/LASCO. Using an empirical three-dimensional (3D) reconstruction technique, we demonstrate that the CME can be reproduced reasonably well at all times with a 3D flux rope shape, but the case for a flux rope being the correct interpretation is not as strong as some events studied with STEREO in the past, given that we are unable to infer a unique orientation for the flux rope. A model with an orientation angle of -80 deg. from the ecliptic plane (i.e., nearly N-S) works best close to the Sun, but a model at 10 deg. (i.e., nearly E-W) works better far from the Sun. Both interpretations require the cross section of the flux rope to be significantly elliptical rather than circular. In addition to our empirical modeling, we also present a fully 3D numerical MHD model of the CME. This physical model appears to effectively reproduce aspects of the shape and kinematics of the CME's leading edge. It is particularly encouraging that the model reproduces the amount of interplanetary deceleration observed for the CME during its journey from the Sun to 1 AU.

  19. An Empirical Study Analyzing Job Productivity in Toxic Workplace Environments

    Directory of Open Access Journals (Sweden)

    Amna Anjum

    2018-05-01

    Full Text Available Purpose: This empirical study aims to determine the effects of a toxic workplace environment, which can negatively impact the job productivity of an employee. Methodology: Three hundred questionnaires were randomly distributed among the staff members of seven private universities in Pakistan with a final response rate of 89%. For analysis purposes, AMOS 22 was used to study the direct and indirect effects of the toxic workplace environment on job productivity. Confirmatory Factor Analysis (CFA) was conducted to ensure the convergent and discriminant validity of the factors, while the Hayes mediation approach was used to verify the mediating role of job burnout between the four dimensions of toxic workplace environment and job productivity. A toxic workplace with multiple dimensions, such as workplace ostracism, workplace incivility, workplace harassment, and workplace bullying, was used in this study. Findings: By using multiple statistical tools and techniques, it has been proven that ostracism, incivility, harassment, and bullying have direct negative significant effects on job productivity, while job burnout was shown to be a statistically significant mediator between the dimensions of a toxic workplace environment and job productivity. Finally, we concluded that organizations need to eradicate the factors of toxic workplace environments to ensure their prosperity and success. Practical Implications: This study encourages managers, leaders, and top management to adopt appropriate policies for enhancing employees’ productivity. Limitations: This study was conducted by using a cross-sectional research design. Future research aims to expand the study by using a longitudinal research design.

  20. An Empirical Study Analyzing Job Productivity in Toxic Workplace Environments.

    Science.gov (United States)

    Anjum, Amna; Ming, Xu; Siddiqi, Ahmed Faisal; Rasool, Samma Faiz

    2018-05-21

    Purpose: This empirical study aims to determine the effects of a toxic workplace environment, which can negatively impact the job productivity of an employee. Methodology: Three hundred questionnaires were randomly distributed among the staff members of seven private universities in Pakistan with a final response rate of 89%. For analysis purposes, AMOS 22 was used to study the direct and indirect effects of the toxic workplace environment on job productivity. Confirmatory Factor Analysis (CFA) was conducted to ensure the convergent and discriminant validity of the factors, while the Hayes mediation approach was used to verify the mediating role of job burnout between the four dimensions of toxic workplace environment and job productivity. A toxic workplace with multiple dimensions, such as workplace ostracism, workplace incivility, workplace harassment, and workplace bullying, was used in this study. Findings: By using multiple statistical tools and techniques, it has been proven that ostracism, incivility, harassment, and bullying have direct negative significant effects on job productivity, while job burnout was shown to be a statistically significant mediator between the dimensions of a toxic workplace environment and job productivity. Finally, we concluded that organizations need to eradicate the factors of toxic workplace environments to ensure their prosperity and success. Practical Implications: This study encourages managers, leaders, and top management to adopt appropriate policies for enhancing employees’ productivity. Limitations: This study was conducted by using a cross-sectional research design. Future research aims to expand the study by using a longitudinal research design.

  1. An empirical model to predict infield thin layer drying rate of cut switchgrass

    International Nuclear Information System (INIS)

    Khanchi, A.; Jones, C.L.; Sharma, B.; Huhnke, R.L.; Weckler, P.; Maness, N.O.

    2013-01-01

    A series of 62 thin layer drying experiments were conducted to evaluate the effect of solar radiation, vapor pressure deficit and wind speed on drying rate of switchgrass. An environmental chamber was fabricated that can simulate field drying conditions. An empirical drying model based on maturity stage of switchgrass was also developed during the study. It was observed that solar radiation was the most significant factor in improving the drying rate of switchgrass at seed shattering and seed shattered maturity stage. Therefore, drying switchgrass in wide swath to intercept the maximum amount of radiation at these stages of maturity is recommended. Moreover, it was observed that under low radiation intensity conditions, wind speed helps to improve the drying rate of switchgrass. Field operations such as raking or turning of the windrows are recommended to improve air circulation within a swath on cloudy days. Additionally, it was found that the effect of individual weather parameters on the drying rate of switchgrass was dependent on maturity stage. Vapor pressure deficit was strongly correlated with the drying rate during seed development stage whereas, vapor pressure deficit was weakly correlated during seed shattering and seed shattered stage. These findings suggest the importance of using separate drying rate models for each maturity stage of switchgrass. The empirical models developed in this study can predict the drying time of switchgrass based on the forecasted weather conditions so that the appropriate decisions can be made. -- Highlights: • An environmental chamber was developed in the present study to simulate field drying conditions. • An empirical model was developed that can estimate drying rate of switchgrass based on forecasted weather conditions. • Separate equations were developed based on maturity stage of switchgrass. • Designed environmental chamber can be used to evaluate the effect of other parameters that affect drying of crops
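    A common way to express a thin-layer drying model is a Lewis-type exponential moisture ratio with a drying constant driven by weather variables. The sketch below uses that generic form with hypothetical coefficients, since the maturity-stage-specific equations fitted in the study are not reproduced here.

    ```python
    import numpy as np

    def drying_constant(solar_w_m2, vpd_kpa, wind_m_s,
                        b0=0.02, b1=0.0004, b2=0.05, b3=0.01):
        """Hypothetical empirical drying constant k (1/h) as a linear function of
        solar radiation, vapor pressure deficit and wind speed."""
        return b0 + b1 * solar_w_m2 + b2 * vpd_kpa + b3 * wind_m_s

    def moisture_ratio(t_hours, k):
        """Lewis (exponential) thin-layer model: MR = exp(-k * t)."""
        return np.exp(-k * np.asarray(t_hours, dtype=float))

    def time_to_target(mr_target, k):
        """Hours needed to reach a target moisture ratio."""
        return -np.log(mr_target) / k

    # Example forecast: sunny, moderately dry, light wind.
    k = drying_constant(solar_w_m2=700.0, vpd_kpa=1.8, wind_m_s=2.0)
    print(f"k = {k:.3f} 1/h, time to MR = 0.2: {time_to_target(0.2, k):.1f} h")
    print(np.round(moisture_ratio([0, 2, 4, 6, 8], k), 3))
    ```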

  2. How "Does" the Comforting Process Work? An Empirical Test of an Appraisal-Based Model of Comforting

    Science.gov (United States)

    Jones, Susanne M.; Wirtz, John G.

    2006-01-01

    Burleson and Goldsmith's (1998) comforting model suggests an appraisal-based mechanism through which comforting messages can bring about a positive change in emotional states. This study is a first empirical test of three causal linkages implied by the appraisal-based comforting model. Participants (N=258) talked about an upsetting event with a…

  3. Towards an Empirical-Relational Model of Supply Chain Flexibility

    OpenAIRE

    Santanu Mandal

    2015-01-01

    Supply chains are prone to disruptions and associated risks. To develop capabilities for risk mitigation, supply chains need to be flexible. A flexible supply chain can respond better to environmental contingencies. Based on the theoretical tenets of resource-based view, relational view and dynamic capabilities theory, the current study develops a relational model of supply chain flexibility comprising trust, commitment, communication, co-operation, adaptation and interdependence. Subsequentl...

  4. PERFORMANCE EVALUATION OF EMPIRICAL MODELS FOR VENTED LEAN HYDROGEN EXPLOSIONS

    OpenAIRE

    Anubhav Sinha; Vendra C. Madhav Rao; Jennifer X. Wen

    2017-01-01

    Explosion venting is a method commonly used to prevent or minimize damage to an enclosure caused by an accidental explosion. An estimate of the maximum overpressure generated though explosion is an important parameter in the design of the vents. Various engineering models (Bauwens et al., 2012, Molkov and Bragin, 2015) and European (EN 14994 ) and USA standards (NFPA 68) are available to predict such overpressure. In this study, their performance is evaluated using a number of published exper...

  5. An empirical firn-densification model comprising ice-lences

    DEFF Research Database (Denmark)

    Reeh, Niels; Fisher, D.A.; Koerner, R.M.

    2005-01-01

    a suitable value of the surface snow density. In the present study, a simple densification model is developed that specifically accounts for the content of ice lenses in the snowpack. An annual layer is considered to be composed of an ice fraction and a firn fraction. It is assumed that all meltwater formed...... changes reflect a volume change of the ice sheet with no corresponding change of mass, i.e. a volume change that does not influence global sea level....

  6. Empirical study on the feasibility of measures for public self-protection capability enhancement

    International Nuclear Information System (INIS)

    Goersch, Henning G.; Werner, Ute

    2011-01-01

    The empirical study on the feasibility of measures for public self-protection capability enhancement covers the following issues with several sections: (1) Introduction: scope of the study; structure of the study. (2) Issue coherence: self-protection; reduction and prevention of damage by personal emergency preparedness, personal emergency preparedness in Germany. (3) Solution coherence: scientific approaches, development of practical problem solution approaches, proposal of a promotion system. (4) Empirical studies: Promotion system evaluation by experts; questioning of the public; Delphi-study on minimum standards in emergency preparedness; local networks in emergency preparedness. (5) Evaluation of models for personal emergency preparedness (M3P). (6) Integration of all research results into the approach of emergency preparedness: scope; recommendations, conclusions.

  7. Risky forward interest rates and swaptions: Quantum finance model and empirical results

    Science.gov (United States)

    Baaquie, Belal Ehsan; Yu, Miao; Bhanap, Jitendra

    2018-02-01

    Risk free forward interest rates (Diebold and Li, 2006 [1]; Jamshidian, 1991 [2]) - and their realization by US Treasury bonds as the leading exemplar - have been studied extensively. In Baaquie (2010), models of risk free bonds and their forward interest rates based on the quantum field theoretic formulation of the risk free forward interest rates have been discussed, including the empirical evidence supporting these models. The quantum finance formulation of risk free forward interest rates is extended to the case of risky forward interest rates. The examples of the Singapore and Malaysian forward interest rates are used as specific cases. The main feature of the quantum finance model is that the risky forward interest rates are modeled both a) as a stand-alone case as well as b) being driven by the US forward interest rates plus a spread - having its own term structure - above the US forward interest rates. Both the US forward interest rates and the term structure for the spread are modeled by a two dimensional Euclidean quantum field. As a precursor to the evaluation of put option of the Singapore coupon bond, the quantum finance model for swaptions is tested using empirical study of swaptions for the US Dollar - showing that the model is quite accurate. A prediction for the market price of the put option for the Singapore coupon bonds is obtained. The quantum finance model is generalized to study the Malaysian case and the Malaysian forward interest rates are shown to have anomalies absent for the US and Singapore case. The model's prediction for a Malaysian interest rate swap is obtained.

  8. ADAPTATION PROCESS TO CLIMATE CHANGE IN AGRICULTURE- AN EMPIRICAL STUDY

    Directory of Open Access Journals (Sweden)

    Ghulam Mustafa

    2017-10-01

    Full Text Available Climatic variations affect agriculture in a process with no known end. Adaptations help to reduce the adverse impacts of climate change; unfortunately, adaptation has never been considered as a process. The current study empirically identified the adaptation process and its different stages. Moreover, little is known about farm-level adaptation strategies and their determinants, so the study also examined farm-level adaptation strategies and their determinants. Three stages of adaptation were identified: perception, intention and adaptation. It was found that 71.4% of farmers perceived climate change, 58.5% intended to adapt, while 40.2% actually adapted. The study further found that farmers adapt by changing crop varieties (56.3%), changing planting dates (44.6%), planting trees (37.5%), increasing/conserving irrigation (39.3%) and diversifying crops (49.2%). The adaptation strategies used by farmers were autonomous and were mostly determined by their perception of climate change. It was also noted that adaptation strategies move in a circular process, and once adopted they remain in place for a long period of time. Some constraints slow the adaptation process, so we recommend that farmers be given price incentives to speed up this process.

  9. Empirical results of entrepreneurs’ network: case study of Slovakia

    Directory of Open Access Journals (Sweden)

    Ladislav Mura

    2017-05-01

    Full Text Available Changes in the economic development of different countries in recent decades have been influenced by processes that have modified the ways and forms of doing business. While in the past the emphasis was placed on company size, today a company's participation in various forms of network cooperation prevails. The following paper presents the results of empirical research carried out among small and medium-sized enterprises involved in entrepreneurial networks. The research was conducted during the period 2014-2015 within the framework of the scientific research projects VEGA 1/0381/13 and KEGA 001UCM-4/2016. The main aim of the paper is to propose a basic model of entrepreneurial network cooperation under the conditions of the Slovak Republic. The novelty of the paper lies in outlining the main steps stakeholders should take if they want to create a successful network. The partial aims of the paper are the evaluation of the quantitative and qualitative conditions for networking and the description of the steps that lead to the creation of network cooperation. We used selected quantitative methods (a localization coefficient in combination with the BCG matrix). Qualitative conditions were evaluated using the results of a questionnaire survey, which identified entrepreneurs’ suggestions for engagement in the network (using Pearson chi-square, Kruskal-Wallis and median tests).

  10. Inter-firm Networks, Organizational Learning and Knowledge Updating: An Empirical Study

    Science.gov (United States)

    Zhang, Su-rong; Wang, Wen-ping

    In the era of the knowledge-based economy, in which information technology develops rapidly, the rate of knowledge updating has become a critical factor for enterprises in gaining competitive advantage. We build a theoretical model of the interactions among inter-firm networks, organizational learning and knowledge updating, and demonstrate it with an empirical study. The results show that inter-firm networks and organizational learning are the sources of knowledge updating.

  11. An empirical study of business effect and industry effect in Galicia

    Directory of Open Access Journals (Sweden)

    Susana Iglesias

    2007-07-01

    Full Text Available This work is a contribution to the analysis of the influence that industry and business factors have on the variability of organizational performance. A linear hierarchical model with fixed effects is applied to a sample of Galician firms. The results show that the portion of such variability explained by the business factor is clearly greater than that explained by the industry factor. These results, in favour of the business effect, are similar to others obtained in previous empirical studies.

  12. An Automated Defect Prediction Framework using Genetic Algorithms: A Validation of Empirical Studies

    Directory of Open Access Journals (Sweden)

    Juan Murillo-Morera

    2016-05-01

    Full Text Available Today, it is common for software projects to collect measurement data through development processes. With these data, defect prediction software can try to estimate the defect proneness of a software module, with the objective of assisting and guiding software practitioners. With timely and accurate defect predictions, practitioners can focus their limited testing resources on higher-risk areas. This paper reports the results of three empirical studies that use an automated genetic defect prediction framework. This framework generates and compares different learning schemes (preprocessing + attribute selection + learning algorithms) and selects the best one using a genetic algorithm, with the objective of estimating the defect proneness of a software module. The first empirical study is a performance comparison of our framework with the most important framework in the literature. The second empirical study is a performance and runtime comparison between our framework and an exhaustive framework. The third empirical study is a sensitivity analysis. The last empirical study is our main contribution in this paper. Performance of the software defect prediction models (using AUC, Area Under the Curve) was validated using the NASA-MDP and PROMISE data sets. Seventeen data sets from NASA-MDP (13) and PROMISE (4) projects were analyzed running an NxM-fold cross-validation. A genetic algorithm was used to select the components of the learning schemes automatically, and to assess and report the results. Our results show similar performance between the frameworks. Our framework reported better runtime than the exhaustive framework. Finally, we report the best configuration according to the sensitivity analysis.
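    The core of such a framework -- building candidate learning schemes as preprocessing + attribute selection + learning algorithm and scoring them by cross-validated AUC -- can be sketched with scikit-learn as below. A real genetic search over the scheme space (as in the paper) would replace the small exhaustive loop, and the synthetic data stand in for the NASA-MDP/PROMISE sets.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in for a defect data set (metrics -> defective yes/no).
    X, y = make_classification(n_samples=600, n_features=20, n_informative=6,
                               weights=[0.8, 0.2], random_state=0)

    # A few candidate learning schemes: preprocessing + attribute selection + learner.
    schemes = {
        "scale+top10+logreg": Pipeline([
            ("scale", StandardScaler()),
            ("select", SelectKBest(f_classif, k=10)),
            ("clf", LogisticRegression(max_iter=1000)),
        ]),
        "scale+top10+nb": Pipeline([
            ("scale", StandardScaler()),
            ("select", SelectKBest(f_classif, k=10)),
            ("clf", GaussianNB()),
        ]),
    }

    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    for name, pipe in schemes.items():
        auc = cross_val_score(pipe, X, y, scoring="roc_auc", cv=cv)
        print(f"{name}: AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
    ```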

  13. An Empirical Model of Wage Dispersion with Sorting

    DEFF Research Database (Denmark)

    Bagger, Jesper; Lentz, Rasmus

    (submodular). The model is estimated on Danish matched employer-employee data. We find evidence of positive assortative matching. In the estimated equilibrium match distribution, the correlation between worker skill and firm productivity is 0.12. The assortative matching has a substantial impact on wage......This paper studies wage dispersion in an equilibrium on-the-job-search model with endogenous search intensity. Workers differ in their permanent skill level and firms differ with respect to productivity. Positive (negative) sorting results if the match production function is supermodular...... to mismatch by asking how much greater output would be if the estimated population of matches were perfectly positively assorted. In this case, output would increase by 7.7%....

  14. Psychological Vulnerability to Completed Suicide: A Review of Empirical Studies.

    Science.gov (United States)

    Conner, Kenneth R.; Duberstein, Paul R.; Conwell, Yeates; Seidlitz, Larry; Caine, Eric D.

    2001-01-01

    This article reviews empirical literature on psychological vulnerability to completed suicide. Five constructs have been consistently associated with completed suicide: impulsivity/aggression; depression; anxiety; hopelessness; and self-consciousness/social disengagement. Current knowledge of psychological vulnerability could inform social…

  15. EMPIRICAL STUDY REGARDING SUSTAINABILITY OF ROMANIAN PENSION SYSTEM

    Directory of Open Access Journals (Sweden)

    Oprean Delia

    2013-07-01

    Full Text Available This paper is part of a broad applied scientific research project based on popular empirical procedures (such as natural observation). The positivistic and constructive research methodology used was based on the consensual-inductive system (Locke), which is why we studied the different views of specialists on the sustainability of pensions in Romania, necessary to formulate the problem of generating relevant information. The research strategies used were comparative and longitudinal, as we analyzed the evolution over time of the qualitative indicator VUAN (unitary value of net assets) specific to the Pillar II and Pillar III pension funds of Romania, together with the number of participants in these funds, in order to determine their direct relationship with the need for sustainability in this area. The hypotheses regarding the causal relationship efficiency - participants - sustainability and the measures needed for pension reform were built in this paper inductively (by analyzing the sustainability issues of pensions over time), causally (by explaining the cause-and-effect phenomenon studied), deductively, logically and subjectively (due to the existence and perpetuation of the premise of conflict between generations and of social inequality between employees and pensioners). The qualitative approach to the phenomenon studied, by collecting information (using a mediated data collection technique), has yielded relevant findings and practical solutions for all those involved in this concerted action on pensions, which affects us all.

  16. Evaluating Method Engineer Performance: an error classification and preliminary empirical study

    Directory of Open Access Journals (Sweden)

    Steven Kelly

    1998-11-01

    Full Text Available We describe an approach to empirically test the use of metaCASE environments to model methods. Both diagrams and matrices have been proposed as a means for presenting the methods. These different paradigms may have their own effects on how easily and well users can model methods. We extend Batra's classification of errors in data modelling to cover metamodelling, and use it to measure the performance of a group of metamodellers using either diagrams or matrices. The tentative results from this pilot study confirm the usefulness of the classification, and show some interesting differences between the paradigms.

  17. An Empirical Study on Using Visual Embellishments in Visualization.

    Science.gov (United States)

    Borgo, R; Abdul-Rahman, A; Mohamed, F; Grant, P W; Reppa, I; Floridi, L; Chen, Min

    2012-12-01

    In written and spoken communications, figures of speech (e.g., metaphors and synecdoche) are often used as an aid to help convey abstract or less tangible concepts. However, the benefits of using rhetorical illustrations or embellishments in visualization have so far been inconclusive. In this work, we report an empirical study to evaluate hypotheses that visual embellishments may aid memorization, visual search and concept comprehension. One major departure from related experiments in the literature is that we make use of a dual-task methodology in our experiment. This design offers an abstraction of typical situations where viewers do not have their full attention focused on visualization (e.g., in meetings and lectures). The secondary task introduces "divided attention", and makes the effects of visual embellishments more observable. In addition, it also serves as additional masking in memory-based trials. The results of this study show that visual embellishments can help participants better remember the information depicted in visualization. On the other hand, visual embellishments can have a negative impact on the speed of visual search. The results show a complex pattern as to the benefits of visual embellishments in helping participants grasp key concepts from visualization.

  18. Setting healthcare priorities in hospitals: a review of empirical studies.

    Science.gov (United States)

    Barasa, Edwine W; Molyneux, Sassy; English, Mike; Cleary, Susan

    2015-04-01

    Priority setting research has focused on the macro (national) and micro (bedside) level, leaving the meso (institutional, hospital) level relatively neglected. This is surprising given the key role that hospitals play in the delivery of healthcare services and the large proportion of health systems resources that they absorb. To explore the factors that impact upon priority setting at the hospital level, we conducted a thematic review of empirical studies. A systematic search of PubMed, EBSCOHOST, Econlit databases and Google scholar was supplemented by a search of key websites and a manual search of relevant papers' reference lists. A total of 24 papers were identified from developed and developing countries. We applied a policy analysis framework to examine and synthesize the findings of the selected papers. Findings suggest that priority setting practice in hospitals was influenced by (1) contextual factors such as decision space, resource availability, financing arrangements, availability and use of information, organizational culture and leadership, (2) priority setting processes that depend on the type of priority setting activity, (3) content factors such as priority setting criteria and (4) actors, their interests and power relations. We observe that there is need for studies to examine these issues and the interplay between them in greater depth and propose a conceptual framework that might be useful in examining priority setting practices in hospitals. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.

  19. Safety certification of airborne software: An empirical study

    International Nuclear Information System (INIS)

    Dodd, Ian; Habli, Ibrahim

    2012-01-01

    Many safety-critical aircraft functions are software-enabled. Airborne software must be audited and approved by the aerospace certification authorities prior to deployment. The auditing process is time-consuming, and its outcome is unpredictable, due to the criticality and complex nature of airborne software. To ensure that the engineering of airborne software is systematically regulated and is auditable, certification authorities mandate compliance with safety standards that detail industrial best practice. This paper reviews existing practices in software safety certification. It also explores how software safety audits are performed in the civil aerospace domain. The paper then proposes a statistical method for supporting software safety audits by collecting and analysing data about the software throughout its lifecycle. This method is then empirically evaluated through an industrial case study based on data collected from 9 aerospace projects covering 58 software releases. The results of this case study show that our proposed method can help the certification authorities and the software and safety engineers to gain confidence in the certification readiness of airborne software and predict the likely outcome of the audits. The results also highlight some confidentiality issues concerning the management and retention of sensitive data generated from safety-critical projects.

  20. Permeability-driven selection in a semi-empirical protocell model

    DEFF Research Database (Denmark)

    Piedrafita, Gabriel; Monnard, Pierre-Alain; Mavelli, Fabio

    2017-01-01

    to prebiotic systems evolution more intricate, but were surely essential for sustaining far-from-equilibrium chemical dynamics, given their functional relevance in all modern cells. Here we explore a protocellular scenario in which some of those additional constraints/mechanisms are addressed, demonstrating...... their 'system-level' implications. In particular, an experimental study on the permeability of prebiotic vesicle membranes composed of binary lipid mixtures allows us to construct a semi-empirical model where protocells are able to reproduce and undergo an evolutionary process based on their coupling...

  1. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  2. Temporal structure of neuronal population oscillations with empirical mode decomposition

    International Nuclear Information System (INIS)

    Li Xiaoli

    2006-01-01

    Frequency analysis of neuronal oscillations is very important for understanding neural information processing and the mechanisms of disorder in the brain. This Letter presents a new method for analyzing neuronal population oscillations with empirical mode decomposition (EMD). Following EMD of a neuronal oscillation, a series of intrinsic mode functions (IMFs) is obtained; the Hilbert transform of the IMFs can then be used to extract the instantaneous time-frequency structure of the oscillation. The method is applied to neuronal oscillations recorded in the hippocampus of epileptic rats in vivo; the results show that the oscillations behave differently in different frequency bands during the pre-ictal, seizure-onset and ictal periods of the epileptic EEG. This new method is very helpful for revealing the temporal structure of neural oscillations.
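
    A minimal sketch of the pipeline the abstract describes: decompose a signal into intrinsic mode functions (IMFs) and take the Hilbert transform of each to obtain instantaneous frequencies. It assumes the third-party PyEMD package and a synthetic two-tone signal in place of the hippocampal recordings; it illustrates the technique rather than reproducing the author's analysis.

    ```python
    # EMD + Hilbert sketch on a synthetic signal (assumes the PyEMD package,
    # distributed as "EMD-signal"); not the author's code or data.
    import numpy as np
    from PyEMD import EMD
    from scipy.signal import hilbert

    fs = 1000.0                                  # sampling rate in Hz (assumed)
    t = np.arange(0, 2.0, 1.0 / fs)
    signal = np.sin(2 * np.pi * 8 * t) + 0.5 * np.sin(2 * np.pi * 35 * t)

    imfs = EMD().emd(signal)                     # intrinsic mode functions

    for k, imf in enumerate(imfs):
        analytic = hilbert(imf)                  # analytic signal of each IMF
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency (Hz)
        print(f"IMF {k}: mean instantaneous frequency = {inst_freq.mean():.1f} Hz")
    ```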

  3. Reconstructing plateau icefields: Evaluating empirical and modelled approaches

    Science.gov (United States)

    Pearce, Danni; Rea, Brice; Barr, Iestyn

    2013-04-01

    Glacial landforms are widely utilised to reconstruct former glacier geometries with a common aim to estimate the Equilibrium Line Altitudes (ELAs) and from these, infer palaeoclimatic conditions. Such inferences may be studied on a regional scale and used to correlate climatic gradients across large distances (e.g., Europe). In Britain, the traditional approach uses geomorphological mapping with hand contouring to derive the palaeo-ice surface. Recently, ice surface modelling enables an equilibrium profile reconstruction tuned using the geomorphology. Both methods permit derivation of palaeo-climate but no study has compared the two methods for the same ice-mass. This is important because either approach may result in differences in glacier limits, ELAs and palaeo-climate. This research uses both methods to reconstruct a plateau icefield and quantifies the results from a cartographic and geometrical aspect. Detailed geomorphological mapping of the Tweedsmuir Hills in the Southern Uplands, Scotland (c. 320 km2) was conducted to examine the extent of Younger Dryas (YD; 12.9-11.7 cal. ka BP) glaciation. Landform evidence indicates a plateau icefield configuration of two separate ice-masses during the YD covering an area c. 45 km2 and 25 km2. The interpreted age is supported by new radiocarbon dating of basal stratigraphies and Terrestrial Cosmogenic Nuclide Analysis (TCNA) of in situ boulders. Both techniques produce similar configurations; however, the model results in a coarser resolution requiring further processing if a cartographic map is required. When landforms are absent or fragmentary (e.g., trimlines and lateral moraines), like in many accumulation zones on plateau icefields, the geomorphological approach increasingly relies on extrapolation between lines of evidence and on the individual's perception of how the ice-mass ought to look. In some locations this results in an underestimation of the ice surface compared to the modelled surface most likely due to

  4. A Socio-Cultural Model Based on Empirical Data of Cultural and Social Relationship

    DEFF Research Database (Denmark)

    Lipi, Afia Akhter; Nakano, Yukiko; Rehm, Matthias

    2010-01-01

    The goal of this paper is to integrate culture and social relationship as a computational term in an embodied conversational agent system by employing empirical and theoretical approach. We propose a parameter-based model that predicts nonverbal expressions appropriate for specific cultures...... in different social relationship. So, first, we introduce the theories of social and cultural characteristics. Then, we did corpus analysis of human interaction of two cultures in two different social situations and extracted empirical data and finally, by integrating socio-cultural characteristics...... with empirical data, we establish a parameterized network model that generates culture specific non-verbal expressions in different social relationships....

  5. An Empirical Study of Atmospheric Correction Procedures for Regional Infrasound Amplitudes with Ground Truth.

    Science.gov (United States)

    Howard, J. E.

    2014-12-01

    This study focusses on improving methods of accounting for atmospheric effects on infrasound amplitudes observed on arrays at regional distances in the southwestern United States. Recordings at ranges of 150 to nearly 300 km from a repeating ground truth source of small HE explosions are used. The explosions range in actual weight from approximately 2000-4000 lbs. and are detonated year-round which provides signals for a wide range of atmospheric conditions. Three methods of correcting the observed amplitudes for atmospheric effects are investigated with the data set. The first corrects amplitudes for upper stratospheric wind as developed by Mutschlecner and Whitaker (1999) and uses the average wind speed between 45-55 km altitudes in the direction of propagation to derive an empirical correction formula. This approach was developed using large chemical and nuclear explosions and is tested with the smaller explosions for which shorter wavelengths cause the energy to be scattered by the smaller scale structure of the atmosphere. The second approach is a semi-empirical method using ray tracing to determine wind speed at ray turning heights where the wind estimates replace the wind values in the existing formula. Finally, parabolic equation (PE) modeling is used to predict the amplitudes at the arrays at 1 Hz. The PE amplitudes are compared to the observed amplitudes with a narrow band filter centered at 1 Hz. An analysis is performed of the conditions under which the empirical and semi-empirical methods fail and full wave methods must be used.

  6. Towards New Empirical Versions of Financial and Accounting Models Corrected for Measurement Errors

    OpenAIRE

    Francois-Éric Racicot; Raymond Théoret; Alain Coen

    2006-01-01

    In this paper, we propose a new empirical version of the Fama and French Model based on the Hausman (1978) specification test and aimed at discarding measurement errors in the variables. The proposed empirical framework is general enough to be used for correcting other financial and accounting models of measurement errors. Removing measurement errors is important at many levels, such as information disclosure, corporate governance and the protection of investors.

  7. Using Graph and Vertex Entropy to Compare Empirical Graphs with Theoretical Graph Models

    Directory of Open Access Journals (Sweden)

    Tomasz Kajdanowicz

    2016-09-01

    Full Text Available Over the years, several theoretical graph generation models have been proposed. Among the most prominent are: the Erdős–Rényi random graph model, Watts–Strogatz small world model, Albert–Barabási preferential attachment model, Price citation model, and many more. Often, researchers working with real-world data are interested in understanding the generative phenomena underlying their empirical graphs. They want to know which of the theoretical graph generation models would most probably generate a particular empirical graph. In other words, they expect some similarity assessment between the empirical graph and graphs artificially created from theoretical graph generation models. Usually, in order to assess the similarity of two graphs, centrality measure distributions are compared. For a theoretical graph model this means comparing the empirical graph to a single realization of a theoretical graph model, where the realization is generated from the given model using an arbitrary set of parameters. The similarity between centrality measure distributions can be measured using standard statistical tests, e.g., the Kolmogorov–Smirnov test of distances between cumulative distributions. However, this approach is both error-prone and leads to incorrect conclusions, as we show in our experiments. Therefore, we propose a new method for graph comparison and type classification by comparing the entropies of centrality measure distributions (degree centrality, betweenness centrality, closeness centrality). We demonstrate that our approach can help assign the empirical graph to the most similar theoretical model using a simple unsupervised learning method.
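
    A small sketch of the entropy-based comparison the abstract proposes, on assumed toy graphs: compute each centrality distribution, bin it, and compare Shannon entropies between an "empirical" graph and one realization of a theoretical model. The graphs, sizes and bin count are placeholders, not the authors' data.

    ```python
    # Entropy of centrality-measure distributions for two graphs (toy example).
    import networkx as nx
    import numpy as np
    from scipy.stats import entropy

    def centrality_entropy(g, centrality_fn, bins=20):
        values = np.array(list(centrality_fn(g).values()))
        hist, _ = np.histogram(values, bins=bins)
        p = hist / hist.sum()
        return entropy(p[p > 0])                 # Shannon entropy of the binned distribution

    empirical = nx.barabasi_albert_graph(500, 3, seed=1)   # stand-in for an empirical graph
    candidate = nx.erdos_renyi_graph(500, 0.012, seed=2)   # one realization of a theoretical model

    for name, fn in [("degree", nx.degree_centrality),
                     ("betweenness", nx.betweenness_centrality),
                     ("closeness", nx.closeness_centrality)]:
        print(name,
              round(centrality_entropy(empirical, fn), 3),
              round(centrality_entropy(candidate, fn), 3))
    ```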

  8. A new model of Social Support in Bereavement (SSB): An empirical investigation with a Chinese sample.

    Science.gov (United States)

    Li, Jie; Chen, Sheying

    2016-01-01

    Bereavement can be an extremely stressful experience while the protective effect of social support is expected to facilitate the adjustment after loss. The ingredients or elements of social support as illustrated by a new model of Social Support in Bereavement (SSB), however, require empirical evidence. Who might be the most effective providers of social support in bereavement has also been understudied, particularly within specific cultural contexts. The present study uses both qualitative and quantitative analyses to explore these two important issues among bereaved Chinese families and individuals. The results show that three major types of social support described by the SSB model were frequently acknowledged by the participants in this study. Aside from relevant books, family and friends were the primary sources of social support who in turn received support from their workplaces. Helping professionals turned out to be the least significant source of social support in the Chinese cultural context. Differences by gender, age, and bereavement time were also found. The findings render empirical evidence to the conceptual model of Social Support in Bereavement and also offer culturally relevant guidance for providing effective support to the bereaved.

  9. Theoretical-empirical model of the steam-water cycle of the power unit

    Directory of Open Access Journals (Sweden)

    Grzegorz Szapajko

    2010-06-01

    Full Text Available The diagnostics of the energy conversion systems’ operation is realised as a result of collecting, processing, evaluating and analysing the measurement signals. The result of the analysis is the determination of the process state. This requires the use of thermal process models. Construction of an analytical model with auxiliary empirical functions built in brings satisfying results. The paper presents a theoretical-empirical model of the steam-water cycle. The worked-out mathematical simulation model contains partial models of the turbine, the regenerative heat exchangers and the condenser. Statistical verification of the model is presented.

  10. Supply strategy configuration in fragmented production systems: An empirical study

    Directory of Open Access Journals (Sweden)

    Claudia Chackelson

    2013-07-01

    Full Text Available Purpose: Companies survive in saturated markets by trying to be more productive and more efficient. In this context, it becomes critical for companies to manage the entire supply network to optimize overall performance. Hence, the supply strategy plays an important role because it influences the way in which the production and logistics network has to be configured and managed. This paper explores the benefits obtained by configuring different supply strategies adapted to customer needs. Design/methodology/approach: For this purpose, a case study from a Tier 2 point of view of the supply chain has been conducted. Findings and Originality/value: The case study demonstrates that a higher service level, lower holding costs and increased turnover can be obtained by implementing an adequate supply strategy. Originality/value: There is a scarcity of research specifically focused on applying supply chain principles within network configuration processes. Moreover, there are few empirical studies of global Tier 2 suppliers with multiple decoupling points in their supply chain networks.

  11. Systematic risk and liquidity : an empirical study comparing Norwegian equity certificates before and after the regulation in 2009

    OpenAIRE

    Hatlevik, Håkon; Einvik, Christian

    2014-01-01

    In 2009, the Norwegian savings banks industry was subject to a regulation change, which resulted in a modification of the instrument issued by these banks. Thus, in this empirical study we compare the systematic risk and liquidity of equity certificates issued by Norwegian savings banks before and after the regulation change. We go about estimating systematic risk and liquidity using regression analysis. In order to estimate systematic risk we use the empirical model of the CAPM often referre...

  12. Cycle length maximization in PWRs using empirical core models

    International Nuclear Information System (INIS)

    Okafor, K.C.; Aldemir, T.

    1987-01-01

    The problem of maximizing cycle length in nuclear reactors through optimal fuel and poison management has been addressed by many investigators. An often-used neutronic modeling technique is to find correlations between the state and control variables to describe the response of the core to changes in the control variables. In this study, a set of linear correlations, generated by two-dimensional diffusion-depletion calculations, is used to find the enrichment distribution that maximizes cycle length for the initial core of a pressurized water reactor (PWR). These correlations (a) incorporate the effect of composition changes in all the control zones on a given fuel assembly and (b) are valid for a given range of control variables. The advantage of using such correlations is that the cycle length maximization problem can be reduced to a linear programming problem
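
    To illustrate the reduction to linear programming mentioned above, here is a hedged sketch in which cycle length is a linear function of three zone enrichments subject to linear constraints. Every coefficient, bound and constraint below is hypothetical; the paper's diffusion-depletion correlations are not reproduced.

    ```python
    # Illustrative LP sketch: maximize a linear cycle-length surrogate over zone
    # enrichments under linear correlation constraints (all numbers hypothetical).
    import numpy as np
    from scipy.optimize import linprog

    a = np.array([1.8, 2.1, 2.4])          # sensitivity of cycle length to each zone enrichment
    c = -a                                  # linprog minimizes, so negate the objective

    A_ub = np.array([[0.9, 1.1, 1.3],       # hypothetical power-peaking correlation
                     [1.0, 1.0, 1.0]])      # total enrichment budget
    b_ub = np.array([11.0, 10.5])

    bounds = [(2.0, 4.5)] * 3               # allowed enrichment range per zone (wt%)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print("optimal zone enrichments:", res.x)
    print("relative cycle-length gain:", -res.fun)
    ```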

  13. Empirical modeling of single-wake advection and expansion using full-scale pulsed lidar-based measurements

    DEFF Research Database (Denmark)

    Machefaux, Ewan; Larsen, Gunner Chr.; Troldborg, Niels

    2015-01-01

    In the present paper, single-wake dynamics have been studied both experimentally and numerically. The use of pulsed lidar measurements allows for validation of basic dynamic wake meandering modeling assumptions. Wake center tracking is used to estimate the wake advection velocity experimentally...... fairly well in the far wake but lacks accuracy in the outer region of the near wake. An empirical relationship, relating maximum wake induction and wake advection velocity, is derived and linked to the characteristics of a spherical vortex structure. Furthermore, a new empirical model for single...

  14. User acceptance of mobile commerce: an empirical study in Macau

    Science.gov (United States)

    Lai, Ivan K. W.; Lai, Donny C. F.

    2014-06-01

    This study aims to examine the positive and negative factors that can significantly explain user acceptance of mobile commerce (m-commerce) in Macau. A technology acceptance model for m-commerce with five factors is constructed. The proposed model is tested using data collected from 219 respondents. Confirmatory factor analysis is performed to examine the reliability and validity of the model, and structural equation modelling is performed to assess the relationship between behavioural intention and each factor. The acceptance of m-commerce is influenced by factors including performance expectancy, social influence, facilitating conditions and privacy concern, while effort expectancy is insignificant in this case. The results of the study are useful for m-commerce service providers to adjust their strategies for promoting m-commerce services. This study contributes to the practice by providing a user technology acceptance model for m-commerce that can be used as a foundation for future research.

  15. An extended technology acceptance model for detecting influencing factors: An empirical investigation

    Directory of Open Access Journals (Sweden)

    Mohamd Hakkak

    2013-11-01

    Full Text Available The rapid diffusion of the Internet has radically changed the delivery channels applied by the financial services industry. The aim of this study is to identify the influencing factors that encourage customers to adopt online banking in Khorramabad. The research constructs are developed based on the technology acceptance model (TAM) and incorporate some extra important control variables. The model is empirically verified to study the factors influencing the online banking adoption behavior of 210 customers of Tejarat Banks in Khorramabad. The findings of the study suggest that the quality of the internet connection, the awareness of online banking and its benefits, the social influence and computer self-efficacy have significant impacts on the perceived usefulness (PU) and perceived ease of use (PEOU) of online banking acceptance. Trust and resistance to change also have a significant impact on the attitude towards the likelihood of adopting online banking.

  16. Empirically derived neighbourhood rules for urban land-use modelling

    DEFF Research Database (Denmark)

    Hansen, Henning Sten

    2012-01-01

    Land-use modelling and spatial scenarios have gained attention as a means to meet the challenge of reducing uncertainty in spatial planning and decision making. Many of the recent modelling efforts incorporate cellular automata to accomplish spatially explicit land-use-change modelling. Spatial...

  17. Poisson-generalized gamma empirical Bayes model for disease ...

    African Journals Online (AJOL)

    In spatial disease mapping, Bayesian estimation techniques are becoming popular for smoothing relative risk estimates. The most common Bayesian conjugate model for disease mapping is the Poisson-Gamma Model (PG). To explore further the smoothing of relative risk ...

  18. Empiric model for mean generation time adjustment factor for classic point kinetics equations

    Energy Technology Data Exchange (ETDEWEB)

    Goes, David A.B.V. de; Martinez, Aquilino S.; Goncalves, Alessandro da C., E-mail: david.goes@poli.ufrj.br, E-mail: aquilino@lmp.ufrj.br, E-mail: alessandro@con.ufrj.br [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Departamento de Engenharia Nuclear

    2017-11-01

    Point reactor kinetics equations are the easiest way to observe the time behavior of neutron production in a nuclear reactor. These equations are derived from the neutron transport equation using an approximation called Fick's law, leading to a set of first-order differential equations. The main objective of this study is to revisit the classic point kinetics equations in order to bring their results closer to the case in which the time variation of the neutron currents is taken into account. The computational modeling used for the calculations is based on the finite difference method. The results obtained with this model are compared with the reference model, and an empirical adjustment factor is then determined that adapts the point reactor kinetics equations to the real scenario. (author)

  19. Empiric model for mean generation time adjustment factor for classic point kinetics equations

    International Nuclear Information System (INIS)

    Goes, David A.B.V. de; Martinez, Aquilino S.; Goncalves, Alessandro da C.

    2017-01-01

    Point reactor kinetics equations are the easiest way to observe the time behavior of neutron production in a nuclear reactor. These equations are derived from the neutron transport equation using an approximation called Fick's law, leading to a set of first-order differential equations. The main objective of this study is to revisit the classic point kinetics equations in order to bring their results closer to the case in which the time variation of the neutron currents is taken into account. The computational modeling used for the calculations is based on the finite difference method. The results obtained with this model are compared with the reference model, and an empirical adjustment factor is then determined that adapts the point reactor kinetics equations to the real scenario. (author)
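
    As a point of reference for the classic model discussed in the two records above, the following sketch integrates the one-delayed-group point kinetics equations with a simple forward-difference scheme. The reactivity step and kinetic parameters are illustrative assumptions, not values from the paper.

    ```python
    # Forward-difference integration of one-group point kinetics (illustrative values).
    beta, lam, Lambda = 0.0065, 0.08, 1.0e-4   # delayed fraction, decay constant (1/s), mean generation time (s)
    rho = 0.001                                 # constant reactivity step (assumed)
    dt, t_end = 1.0e-5, 0.5                     # time step and simulation length (s)

    n = 1.0                                     # initial relative neutron density
    C = beta * n / (lam * Lambda)               # precursor concentration at equilibrium
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dt * dn                            # explicit Euler update of the coupled ODEs
        C += dt * dC

    print(f"relative neutron density after {t_end} s: {n:.3f}")
    ```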

  20. Growing the intention to adopt educational innovations: An empirical study

    Directory of Open Access Journals (Sweden)

    David M. Bourrie

    2016-03-01

    Full Text Available In order for the Open Access (OA) to learning concept to have a wider impact in formal education, it is important that faculty members intend to adopt new educational innovations. However, little is known about which variables influence the intention of faculty members. Therefore, the purposes of this study are to empirically determine: (1) which characteristics of an educational innovation significantly influence the intention to adopt it, (2) which variables influence the readiness of faculty members to adopt educational innovations, and (3) how the characteristics of the innovations moderate the relationship between faculty readiness and intention to adopt the innovations. Participants in this study include 335 faculty members in ABET-certified computer science and electrical engineering programs in the United States. The results show that ease of use is positively related to the intention of faculty members to adopt an educational innovation. We conclude that Open-CourseWare developers need to ensure that ease of use is emphasized in the CourseWare, and that they need to propagate it initially in institutions where faculty members have a positive attitude to the CourseWare and care about student learning. In addition, a new method of identifying, building, and funding "open access grant" universities that develop easy-to-use educational innovations, make them available on an open access platform, and spread them widely by embedding agents in community colleges, schools, and other educational institutions is essential. Such an initiative may lead to wider adoption of MOOCs and other open access materials.

  1. An empirical investigation of the efficiency effects of integrated care models in Switzerland

    Directory of Open Access Journals (Sweden)

    Oliver Reich

    2012-01-01

    Full Text Available Introduction: This study investigates the efficiency gains of integrated care models in Switzerland, since these models are regarded as cost containment options in national social health insurance. These plans generate much lower average health care expenditure than the basic insurance plan. The question is, however, to what extent these total savings are due to the effects of selection and efficiency. Methods: The empirical analysis is based on data from 399,274 Swiss residents that constantly had compulsory health insurance with the Helsana Group, the largest health insurer in Switzerland, covering the years 2006 to 2009. In order to evaluate the efficiency of the different integrated care models, we apply an econometric approach with a mixed-effects model. Results: Our estimations indicate that the efficiency effects of integrated care models on health care expenditure are significant. However, the different insurance plans vary, revealing the following efficiency gains per model: contracted capitated model 21.2%, contracted non-capitated model 15.5% and telemedicine model 3.7%. The remaining 8.5%, 5.6% and 22.5% respectively of the variation in total health care expenditure can be attributed to the effects of selection. Conclusions: Integrated care models have the potential to improve care for patients with chronic diseases and concurrently have a positive impact on health care expenditure. We suggest policy makers improve the incentives for patients with chronic diseases within the existing regulations providing further potential for cost-efficiency of medical care.

  2. An empirical investigation of the efficiency effects of integrated care models in Switzerland

    Directory of Open Access Journals (Sweden)

    Oliver Reich

    2012-01-01

    Full Text Available Introduction: This study investigates the efficiency gains of integrated care models in Switzerland, since these models are regarded as cost containment options in national social health insurance. These plans generate much lower average health care expenditure than the basic insurance plan. The question is, however, to what extent these total savings are due to the effects of selection and efficiency. Methods: The empirical analysis is based on data from 399,274 Swiss residents that constantly had compulsory health insurance with the Helsana Group, the largest health insurer in Switzerland, covering the years 2006 to 2009. In order to evaluate the efficiency of the different integrated care models, we apply an econometric approach with a mixed-effects model. Results: Our estimations indicate that the efficiency effects of integrated care models on health care expenditure are significant. However, the different insurance plans vary, revealing the following efficiency gains per model: contracted capitated model 21.2%, contracted non-capitated model 15.5% and telemedicine model 3.7%. The remaining 8.5%, 5.6% and 22.5% respectively of the variation in total health care expenditure can be attributed to the effects of selection. Conclusions: Integrated care models have the potential to improve care for patients with chronic diseases and concurrently have a positive impact on health care expenditure. We suggest policy makers improve the incentives for patients with chronic diseases within the existing regulations providing further potential for cost-efficiency of medical care.
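
    A hedged sketch of the kind of mixed-effects specification described in the two records above, using statsmodels: expenditure regressed on insurance-plan type plus controls, with a random intercept for a grouping factor. The file name, column names and grouping variable are assumptions for illustration only.

    ```python
    # Mixed-effects regression sketch (hypothetical data layout, not the Helsana data).
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("claims.csv")          # hypothetical file: one row per insured person-year

    model = smf.mixedlm(
        "log_expenditure ~ C(plan_type) + age + sex + chronic_condition",
        data=df,
        groups=df["region"],                # random intercept per region (assumed grouping)
    )
    result = model.fit()
    print(result.summary())                 # plan-type coefficients: raw expenditure differences,
                                            # from which efficiency and selection effects must be disentangled
    ```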

  3. Business Processes Modeling Recommender Systems: User Expectations and Empirical Evidence

    Directory of Open Access Journals (Sweden)

    Michael Fellmann

    2018-04-01

    Full Text Available Recommender systems are in widespread use in many areas, especially in electronic commerce solutions. In this contribution, we apply recommender functionalities to business process modeling and investigate their potential for supporting the modeling task. To do so, we implemented two prototypes, demonstrated them at a major fair and collected user feedback. After analysing the feedback, we compared the findings with the results of an initial experiment. Our results indicate that fairgoers expect increased modeling speed to be the key advantage and completeness of models to be the least likely advantage. This stands in contrast to the experiment, which revealed that modelers in fact increase the completeness of their models when adequate knowledge is presented, while time consumption is not necessarily reduced. We explain possible causes of this mismatch and finally hypothesize on two “sweet spots” of process modeling recommender systems.

  4. Linking customisation of ERP systems to support effort: an empirical study

    Science.gov (United States)

    Koch, Stefan; Mitteregger, Kurt

    2016-01-01

    The amount of customisation to an enterprise resource planning (ERP) system has always been a major concern in the context of the implementation. This article focuses on the phase of maintenance and presents an empirical study about the relationship between the amount of customising and the resulting support effort. We establish a structural equation modelling model that explains support effort using customisation effort, organisational characteristics and scope of implementation. The findings using data from an ERP provider show that there is a statistically significant effect: with an increasing amount of customisation, the quantity of telephone calls to support increases, as well as the duration of each call.

  5. Empirical modeling of nuclear power plants using neural networks

    International Nuclear Information System (INIS)

    Parlos, A.G.; Atiya, A.; Chong, K.T.

    1991-01-01

    A summary of a procedure for nonlinear identification of process dynamics encountered in nuclear power plant components is presented in this paper using artificial neural systems. A hybrid feedforward/feedback neural network, namely, a recurrent multilayer perceptron, is used as the nonlinear structure for system identification. In the overall identification process, the feedforward portion of the network architecture provides its well-known interpolation property, while through recurrency and cross-talk, the local information feedback enables representation of time-dependent system nonlinearities. The standard backpropagation learning algorithm is modified and is used to train the proposed hybrid network in a supervised manner. The performance of recurrent multilayer perceptron networks in identifying process dynamics is investigated via the case study of a U-tube steam generator. The nonlinear response of a representative steam generator is predicted using a neural network and is compared to the response obtained from a sophisticated physical model during both high- and low-power operation. The transient responses compare well, though further research is warranted for training and testing of recurrent neural networks during more severe operational transients and accident scenarios

  6. Screening enterprising personality in youth: an empirical model.

    Science.gov (United States)

    Suárez-Álvarez, Javier; Pedrosa, Ignacio; García-Cueto, Eduardo; Muñiz, José

    2014-02-20

    Entrepreneurial attitudes of individuals are determined by different variables, some of them related to the cognitive and personality characteristics of the person, and others focused on contextual aspects. The aim of this study is to review the essential dimensions of enterprising personality and develop a test that will permit their thorough assessment. Nine dimensions were identified: achievement motivation, risk taking, innovativeness, autonomy, internal locus of control, external locus of control, stress tolerance, self-efficacy and optimism. For the assessment of these dimensions, 161 items were developed which were applied to a sample of 416 students, 54% male and 46% female (M = 17.89 years old, SD = 3.26). After conducting several qualitative and quantitative analyses, the final test was composed of 127 items with acceptable psychometric properties. Alpha coefficients for the subscales ranged from .81 to .98. The validity evidence relative to the content was provided by experts (V = .71, 95% CI = .56 - .85). Construct validity was assessed using different factorial analyses, obtaining a dimensional structure in accordance with the proposed model of nine interdependent dimensions as well as a global factor that groups these nine dimensions (explained variance = 49.07%; χ2/df = 1.78; GFI= .97; SRMR = .07). Nine out of the 127 items showed Differential Item Functioning as a function of gender (p .035). The results obtained are discussed and future lines of research analyzed.

  7. Practical implications of empirically studying moral decision making

    NARCIS (Netherlands)

    Heinzelmann, N.; Ugazio, G.; Tobler, P.N.

    2012-01-01

    This paper considers the practical question of why people do not behave in the way they ought to behave. This question is a practical one, reaching both into the normative and descriptive domains of morality. That is, it concerns moral norms as well as empirical facts. We argue that two main

  8. Comparing empirical results of transaction avoidance rules studies

    NARCIS (Netherlands)

    van Dijck, G.

    2008-01-01

    Empirical legal research in the UK and in the Netherlands has provided data on the extent to which the transaction avoidance rules (avoidance powers, actio Pauliana) generate practical problems. This article’s goal is to explore the similarities and differences of the data. To achieve this, existing

  9. High-frequency volatility combine forecast evaluations: An empirical study for DAX

    Directory of Open Access Journals (Sweden)

    Wen Cheong Chin

    2017-01-01

    Full Text Available This study aims to examine the benefits of combining realized volatility, higher-power variation volatility and nearest-neighbour truncation volatility in forecasts for the DAX stock market. A structural-break, heavy-tailed heterogeneous autoregressive model under the heterogeneous market hypothesis specification is employed to capture the stylized facts of high-frequency empirical data. Using selected averaging forecast methods, the forecast weights are assigned based on the simple average, simple median, least squares and mean square error. The empirical results indicate that the combined forecasts generally show superiority under the four evaluation criteria regardless of which proxy is set as the actual volatility. In conclusion, the forecast performance is influenced by three factors, namely the type of volatility proxy, the forecast method (individual or averaging forecast) and the type of actual value used in the evaluation criteria.
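
    The following sketch illustrates the combining step on synthetic series: three individual volatility forecasts are merged by simple average, median and inverse-MSE weights and scored by RMSE against a chosen proxy. It is not the paper's HAR specification, and in practice the weights would be estimated on a training window rather than the evaluation sample.

    ```python
    # Forecast-combination sketch on synthetic volatility series.
    import numpy as np

    rng = np.random.default_rng(0)
    actual = np.abs(rng.normal(1.0, 0.2, 250))                 # stand-in volatility proxy
    forecasts = np.vstack([actual + rng.normal(0, s, 250)      # three individual model forecasts
                           for s in (0.10, 0.15, 0.25)])

    def rmse(f, a):
        return np.sqrt(np.mean((f - a) ** 2))

    mse = np.array([rmse(f, actual) ** 2 for f in forecasts])
    w = (1.0 / mse) / (1.0 / mse).sum()                        # inverse-MSE weights

    combos = {
        "simple average": forecasts.mean(axis=0),
        "median": np.median(forecasts, axis=0),
        "inverse-MSE": w @ forecasts,
    }
    for name, f in combos.items():
        print(f"{name:15s} RMSE = {rmse(f, actual):.4f}")
    ```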

  10. Empirical Modeling of Information Communication Technology Usage Behaviour among Business Education Teachers in Tertiary Colleges of a Developing Country

    Science.gov (United States)

    Isiyaku, Dauda Dansarki; Ayub, Ahmad Fauzi Mohd; Abdulkadir, Suhaida

    2015-01-01

    This study has empirically tested the fitness of a structural model in explaining the influence of two exogenous variables (perceived enjoyment and attitude towards ICTs) on two endogenous variables (behavioural intention and teachers' Information Communication Technology (ICT) usage behavior), based on the proposition of Technology Acceptance…

  11. Empirical methods for modeling landscape change, ecosystem services, and biodiversity

    Science.gov (United States)

    David Lewis; Ralph. Alig

    2009-01-01

    The purpose of this paper is to synthesize recent economics research aimed at integrating discrete-choice econometric models of land-use change with spatially-explicit landscape simulations and quantitative ecology. This research explicitly models changes in the spatial pattern of landscapes in two steps: 1) econometric estimation of parcel-scale transition...

  12. An Empirical Comparison of Default Swap Pricing Models

    NARCIS (Netherlands)

    P. Houweling (Patrick); A.C.F. Vorst (Ton)

    2002-01-01

    textabstractAbstract: In this paper we compare market prices of credit default swaps with model prices. We show that a simple reduced form model with a constant recovery rate outperforms the market practice of directly comparing bonds' credit spreads to default swap premiums. We find that the

  13. Hybrid modeling and empirical analysis of automobile supply chain network

    Science.gov (United States)

    Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying

    2017-05-01

    Based on the connection mechanism of nodes which automatically select upstream and downstream agents, a simulation model for dynamic evolutionary process of consumer-driven automobile supply chain is established by integrating ABM and discrete modeling in the GIS-based map. Firstly, the rationality is proved by analyzing the consistency of sales and changes in various agent parameters between the simulation model and a real automobile supply chain. Second, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate various characteristic parameters such as mean distance, mean clustering coefficients, and degree distributions. By doing so, it verifies that the model is a typical scale-free network and small-world network. Finally, the motion law of this model is analyzed from the perspective of complex self-adaptive systems. The chaotic state of the simulation system is verified, which suggests that this system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of complex networks of automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, the model construction and simulation of the system by means of combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as theory bases and experience for supply chain analysis of auto companies.
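
    A brief sketch of the network-characterisation step mentioned above: mean distance, mean clustering coefficient and the degree distribution of a graph, compared with a random graph of the same size, using networkx. The scale-free toy graph merely stands in for the simulated supply-chain network.

    ```python
    # Network characteristics of a toy graph versus a same-size random graph.
    import networkx as nx
    import numpy as np

    g = nx.barabasi_albert_graph(300, 2, seed=7)        # stand-in for the simulated supply-chain network
    rand = nx.gnm_random_graph(g.number_of_nodes(), g.number_of_edges(), seed=7)

    for name, graph in [("supply chain", g), ("random", rand)]:
        giant = graph.subgraph(max(nx.connected_components(graph), key=len))
        print(name,
              "mean distance:", round(nx.average_shortest_path_length(giant), 2),
              "mean clustering:", round(nx.average_clustering(graph), 3))

    degrees = np.array([d for _, d in g.degree()])
    values, counts = np.unique(degrees, return_counts=True)
    print("degree distribution (degree: count):", dict(zip(values.tolist(), counts.tolist())))
    ```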

  14. An Empirical Test of a Model of Resistance to Persuasion.

    Science.gov (United States)

    And Others; Burgoon, Michael

    1978-01-01

    Tests a model of resistance to persuasion based upon variables not considered by earlier congruity and inoculation models. Supports the prediction that the kind of critical response set induced and the target of the criticism are mediators of resistance to persuasion. (JMF)

  15. Travel Time Reliability for Urban Networks : Modelling and Empirics

    NARCIS (Netherlands)

    Zheng, F.; Liu, Xiaobo; van Zuylen, H.J.; Li, Jie; Lu, Chao

    2017-01-01

    The importance of travel time reliability in traffic management, control, and network design has received a lot of attention in the past decade. In this paper, a network travel time distribution model based on the Johnson curve system is proposed. The model is applied to field travel time data

  16. Psychological Determinants of University Students' Academic Performance: An Empirical Study

    Science.gov (United States)

    Gebka, Bartosz

    2014-01-01

    This study utilises an integrated conceptual model of academic performance which captures a series of psychological factors: cognitive style; self-theories such as self-esteem and self-efficacy; achievement goals such as mastery, performance, performance avoidance and work avoidance; study-processing strategies such as deep and surface learning;…

  17. A Semi-Empirical SNR Model for Soil Moisture Retrieval Using GNSS SNR Data

    Directory of Open Access Journals (Sweden)

    Mutian Han

    2018-02-01

    Full Text Available The Global Navigation Satellite System-Interferometry and Reflectometry (GNSS-IR) technique on soil moisture remote sensing was studied. A semi-empirical Signal-to-Noise Ratio (SNR) model was proposed as a curve-fitting model for SNR data routinely collected by a GNSS receiver. This model aims at reconstructing the direct and reflected signal from SNR data and at the same time extracting frequency and phase information that is affected by soil moisture as proposed by K. M. Larson et al. This is achieved empirically through approximating the direct and reflected signal by a second-order and fourth-order polynomial, respectively, based on the well-established SNR model. Compared with other models (K. M. Larson et al., T. Yang et al.), this model can improve the Quality of Fit (QoF) with little prior knowledge needed and can allow soil permittivity to be estimated from the reconstructed signals. In developing this model, we showed how noise affects the receiver SNR estimation and thus the model performance through simulations under the bare soil assumption. Results showed that the reconstructed signals with a grazing angle of 5°–15° were better for soil moisture retrieval. The QoF was improved by around 45%, which resulted in better estimation of the frequency and phase information. However, we found that the improvement on phase estimation could be neglected. Experimental data collected at Lamasquère, France, were also used to validate the proposed model. The results were compared with the simulation and previous works. It was found that the model could ensure good fitting quality even in the case of irregular SNR variation. Additionally, the soil moisture calculated from the reconstructed signals was about 15% closer in relation to the ground truth measurements. A deeper insight into the Larson model and the proposed model was given at this stage, which formed a possible explanation of this fact. Furthermore, frequency and phase information
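
    A rough sketch of the curve-fitting idea on simulated data: remove a low-order polynomial trend (standing in for the direct signal) and fit a sinusoid to the residual to recover the frequency and phase of the reflected component. The signal model, polynomial order and all numbers are simplifications, not the authors' exact formulation.

    ```python
    # Detrend-and-fit sketch for SNR versus sin(elevation), on simulated data.
    import numpy as np
    from scipy.optimize import curve_fit

    x = np.linspace(np.sin(np.radians(5)), np.sin(np.radians(15)), 400)
    true_freq, true_phase = 40.0, 1.2                       # arbitrary "ground truth"
    snr = 50 + 400 * x - 600 * x**2 + 3.0 * np.cos(2 * np.pi * true_freq * x + true_phase)
    snr += np.random.default_rng(1).normal(0, 0.5, x.size)

    direct = np.polyval(np.polyfit(x, snr, 2), x)           # second-order polynomial trend (direct signal)
    residual = snr - direct                                  # interference from the reflected signal

    def reflected(x, amp, freq, phase):
        return amp * np.cos(2 * np.pi * freq * x + phase)

    p0 = [residual.std() * np.sqrt(2), 40.0, 0.0]           # initial guess; the frequency guess matters
    (amp, freq, phase), _ = curve_fit(reflected, x, residual, p0=p0)
    print(f"estimated frequency={freq:.1f}, phase={phase:.2f} (true {true_freq}, {true_phase})")
    ```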

  18. Empirical phylogenies and species abundance distributions are consistent with pre-equilibrium dynamics of neutral community models with gene flow

    KAUST Repository

    Bonnet-Lebrun, Anne-Sophie

    2017-03-17

    Community characteristics reflect past ecological and evolutionary dynamics. Here, we investigate whether it is possible to obtain realistically shaped modelled communities - i.e., with phylogenetic trees and species abundance distributions shaped similarly to typical empirical bird and mammal communities - from neutral community models. To test the effect of gene flow, we contrasted two spatially explicit individual-based neutral models: one with protracted speciation, delayed by gene flow, and one with point mutation speciation, unaffected by gene flow. The former produced more realistic communities (shape of phylogenetic tree and species-abundance distribution), consistent with gene flow being a key process in macro-evolutionary dynamics. Earlier models struggled to capture the empirically observed branching tempo in phylogenetic trees, as measured by the gamma statistic. We show that the low gamma values typical of empirical trees can be obtained in models with protracted speciation, in pre-equilibrium communities developing from an initially abundant and widespread species. This was even more so in communities sampled incompletely, particularly if the unknown species are the youngest. Overall, our results demonstrate that the characteristics of empirical communities that we have studied can, to a large extent, be explained through a purely neutral model under pre-equilibrium conditions. This article is protected by copyright. All rights reserved.

  19. Empirical phylogenies and species abundance distributions are consistent with pre-equilibrium dynamics of neutral community models with gene flow

    KAUST Repository

    Bonnet-Lebrun, Anne-Sophie; Manica, Andrea; Eriksson, Anders; Rodrigues, Ana S.L.

    2017-01-01

    Community characteristics reflect past ecological and evolutionary dynamics. Here, we investigate whether it is possible to obtain realistically shaped modelled communities - i.e., with phylogenetic trees and species abundance distributions shaped similarly to typical empirical bird and mammal communities - from neutral community models. To test the effect of gene flow, we contrasted two spatially explicit individual-based neutral models: one with protracted speciation, delayed by gene flow, and one with point mutation speciation, unaffected by gene flow. The former produced more realistic communities (shape of phylogenetic tree and species-abundance distribution), consistent with gene flow being a key process in macro-evolutionary dynamics. Earlier models struggled to capture the empirically observed branching tempo in phylogenetic trees, as measured by the gamma statistic. We show that the low gamma values typical of empirical trees can be obtained in models with protracted speciation, in pre-equilibrium communities developing from an initially abundant and widespread species. This was even more so in communities sampled incompletely, particularly if the unknown species are the youngest. Overall, our results demonstrate that the characteristics of empirical communities that we have studied can, to a large extent, be explained through a purely neutral model under pre-equilibrium conditions. This article is protected by copyright. All rights reserved.

  20. EVOLUTION OF THEORIES AND EMPIRICAL MODELS OF A RELATIONSHIP BETWEEN ECONOMIC GROWTH, SCIENCE AND INNOVATIONS (PART I)

    Directory of Open Access Journals (Sweden)

    Kaneva M. A.

    2017-12-01

    Full Text Available This article is the first part of an analytical review of existing theoretical models of the relationship between economic growth / GRP and indicators of scientific development and innovation activity, as well as of empirical approaches to testing this relationship. The aim of the paper is to systematize existing approaches to modeling economic growth driven by science and innovation. The novelty of the review lies in the authors’ criterion of interconnectedness between theoretical and empirical studies, used to systematize a wide range of publications summarized in a final table. In this first part the authors discuss the evolution of theoretical approaches, while the second part examines the time gap between theories and their empirical verification, a gap caused by the state of development of quantitative instruments such as econometric models. The results of this study can be used by researchers and graduate students to become familiar with current approaches that trace the progress from theory to empirical verification of the «economic growth-innovations» relationship and to improve different types of models in spatial econometrics. To apply these models to management practice, the presented review could be supplemented with new criteria for classifying knowledge production functions and other theories about the effect of science on economic growth.

  1. An empirical model of global spread-f occurrence

    International Nuclear Information System (INIS)

    Singleton, D.G.

    1974-09-01

    A method of combining models of ionospheric F-layer peak electron density and irregularity incremental electron density into a model of the occurrence probability of the frequency spreading component of spread-F is presented. The predictions of the model are compared with spread-F occurrence data obtained under sunspot maximum conditions. Good agreement is obtained for latitudes less than 70° geomagnetic. At higher latitudes, the inclusion of a 'blackout factor' in the model allows it to accurately represent the data and, in so doing, resolves an apparent discrepancy in the occurrence statistics at high latitudes. The blackout factor is ascribed to the effect of polar blackout on the spread-F statistics and/or the lack of a definitive incremental electron density model for irregularities at polar latitudes. Ways of isolating these effects and assessing their relative importance in the blackout factor are discussed. The model, besides providing estimates of spread-F occurrence on a worldwide basis, which will be of value in the engineering of HF and VHF communications, also furnishes a means of further checking the irregularity incremental electron density model on which it is based. (author)

  2. EMPIRICAL WEIGHTED MODELLING ON INTER-COUNTY INEQUALITIES EVOLUTION AND TO TEST ECONOMICAL CONVERGENCE IN ROMANIA

    Directory of Open Access Journals (Sweden)

    Natalia MOROIANU-DUMITRESCU

    2015-06-01

    Full Text Available During the last decades, the regional convergence process in Europe has attracted considerable interest as a highly significant issue, especially after EU enlargement with the New Member States from Central and Eastern Europe. The most common empirical approaches use β- and σ-convergence, originally developed within a series of neo-classical models. To date, the EU integration process has been shown to be accompanied by an increase in regional inequalities. In order to determine whether a similar increase in inequalities exists between the administrative counties (NUTS3) included in the NUTS2 and NUTS1 regions of Romania, this paper provides an empirical modelling of economic convergence that allows the level and evolution of inter-regional inequalities to be evaluated over the period from 1995 to 2011. The paper presents the results of a large cross-sectional study of σ-convergence and the weighted coefficient of variation, using GDP and population data obtained from the National Institute of Statistics of Romania. Both the graphical representations, including non-linear regressions, and the associated tables summarizing the main statistical tests demonstrate the impact of pre-accession policy on the economic development of all Romanian NUTS levels. The clearly emphasised convergence in the middle sub-interval can be correlated with the drastic pre-accession economic, political and social changes, and with the opening of the Schengen borders to the Romanian labor force in 2002.
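
    A small sketch of the σ-convergence diagnostic named above: the population-weighted coefficient of variation of GDP per capita across counties, computed year by year; a declining series indicates σ-convergence. The county values and populations below are invented for illustration.

    ```python
    # Population-weighted coefficient of variation per year (invented data).
    import numpy as np

    def weighted_cv(gdp_pc, pop):
        w = pop / pop.sum()
        mean = np.sum(w * gdp_pc)
        std = np.sqrt(np.sum(w * (gdp_pc - mean) ** 2))
        return std / mean

    gdp_per_capita = {                          # hypothetical county-level values (thousand EUR)
        1995: np.array([2.1, 3.4, 1.8, 5.0, 2.6]),
        2003: np.array([3.0, 4.1, 2.7, 5.6, 3.3]),
        2011: np.array([4.2, 5.0, 3.9, 6.4, 4.5]),
    }
    population = np.array([0.7, 0.5, 0.9, 1.8, 0.6])   # millions, held fixed for simplicity

    for year, values in gdp_per_capita.items():
        print(year, round(weighted_cv(values, population), 3))
    ```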

  3. Empirical evaluation of a forecasting model for successful facilitation ...

    African Journals Online (AJOL)

    During 2000 the annual Facilitator Customer Satisfaction Survey was ... the forecasting model is successful concerning the CSI value and a high positive linear ... namely that of human behaviour to incorporate other influences than just the ...

  4. EMPIRICAL MODELS FOR DESCRIBING FIRE BEHAVIOR IN BRAZILIAN COMMERCIAL EUCALYPT PLANTATIONS

    Directory of Open Access Journals (Sweden)

    Benjamin Leonardo Alves White

    2016-12-01

    Full Text Available Modeling forest fire behavior is an important task that can assist in fire prevention and suppression operations. However, according to previous studies, the fire behavior models in common worldwide use do not correctly estimate fire behavior in Brazilian commercial hybrid eucalypt plantations. Therefore, this study aims to build new empirical models to predict the fire rate of spread, flame length and fuel consumption for such vegetation. To meet these objectives, 105 laboratory experimental burns were conducted, in which the main fuel characteristics and weather variables that influence fire behavior were controlled and/or measured. Dependent and independent variables were related through multiple regression analysis. The proposed rate-of-spread model is based on wind speed, fuel bed bulk density and 1-h dead fuel moisture content (r2 = 0.86); the flame length model is based on fuel bed depth, 1-h dead fuel moisture content and wind speed (r2 = 0.72); the proposed fuel consumption model has 1-h dead fuel moisture, fuel bed bulk density and 1-h dead dry fuel load as independent variables (r2 = 0.80). These models were used to develop new fire behavior software, the “Eucalyptus Fire Safety System”.
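
    A hedged sketch of the model-fitting step, assuming a table of the laboratory burns with made-up column names: an ordinary least squares regression of rate of spread on wind speed, fuel bed bulk density and 1-h dead fuel moisture. The paper's fitted coefficients are not reproduced here.

    ```python
    # Multiple regression sketch for rate of spread (hypothetical data layout).
    import pandas as pd
    import statsmodels.formula.api as smf

    burns = pd.read_csv("experimental_burns.csv")   # hypothetical table of the 105 laboratory burns

    ros_model = smf.ols(
        "rate_of_spread ~ wind_speed + bulk_density + dead_fuel_moisture_1h",
        data=burns,
    ).fit()

    print(ros_model.summary())                      # inspect R^2 and coefficient signs
    new_case = pd.DataFrame({"wind_speed": [2.0],
                             "bulk_density": [18.0],
                             "dead_fuel_moisture_1h": [9.0]})
    print("predicted rate of spread:", ros_model.predict(new_case).iloc[0])
    ```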

  5. An empirical model for estimating solar radiation in the Algerian Sahara

    Science.gov (United States)

    Benatiallah, Djelloul; Benatiallah, Ali; Bouchouicha, Kada; Hamouda, Messaoud; Nasri, Bahous

    2018-05-01

    The present work aims to assess the empirical model R.sun, which allows us to evaluate the solar radiation fluxes on a horizontal plane under clear-sky conditions for the city of Adrar (27°18 N and 0°11 W), Algeria, and to compare them with the values measured at the site. The expected results of this comparison are of importance for the investment study of solar systems (solar power plants for electricity production, CSP) and also for the design and performance analysis of any system using solar energy. The statistical indicators used to evaluate the accuracy of the model were the mean bias error (MBE), root mean square error (RMSE) and coefficient of determination. The results show that for global radiation, the daily correlation coefficient is 0.9984. The mean absolute percentage error is 9.44 %. The daily mean bias error is -7.94 %. The daily root mean square error is 12.31 %.
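
    For concreteness, a short sketch of the accuracy indicators quoted above (MBE, RMSE, MAPE and the correlation coefficient) in relative form; the measured and modelled values are placeholders, not the Adrar data.

    ```python
    # Relative error metrics for model-versus-measurement comparison (placeholder data).
    import numpy as np

    measured = np.array([610.0, 742.0, 815.0, 790.0, 655.0])    # W/m^2, placeholder
    modelled = np.array([586.0, 760.0, 842.0, 805.0, 612.0])    # model estimates, placeholder

    mbe = 100 * np.mean(modelled - measured) / np.mean(measured)          # mean bias error (%)
    rmse = 100 * np.sqrt(np.mean((modelled - measured) ** 2)) / np.mean(measured)
    mape = 100 * np.mean(np.abs(modelled - measured) / measured)          # mean absolute percentage error
    r = np.corrcoef(measured, modelled)[0, 1]                             # correlation coefficient

    print(f"MBE = {mbe:.2f}%  RMSE = {rmse:.2f}%  MAPE = {mape:.2f}%  r = {r:.4f}")
    ```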

  6. A semi-empirical molecular orbital model of silica, application to radiation compaction

    International Nuclear Information System (INIS)

    Tasker, P.W.

    1978-11-01

    Semi-empirical molecular-orbital theory is used to calculate the bonding in a cluster of two SiO4 tetrahedra, with the outer bonds saturated with pseudo-hydrogen atoms. The basic properties of the cluster, bond energies and band gap are calculated using a very simple parameterisation scheme. The resulting cluster is used to study the rebonding that occurs when an oxygen vacancy is created. It is suggested that a vacancy model is capable of producing the observed differences between quartz and vitreous silica, and the calculations show that the compaction effect observed in the glass is of a magnitude compatible with the relaxations around the vacancy. More detailed lattice models will be needed to examine this mechanism further. (author)

  7. Physician leadership styles and effectiveness: an empirical study.

    Science.gov (United States)

    Xirasagar, Sudha; Samuels, Michael E; Stoskopf, Carleen H

    2005-12-01

    The authors study the association between physician leadership styles and leadership effectiveness. Executive directors of community health centers were surveyed (269 respondents; response rate = 40.9 percent) for their perceptions of the medical director's leadership behaviors and effectiveness, using an adapted Multifactor Leadership Questionnaire (43 items on a 0-4 point Likert-type scale), with additional questions on demographics and the center's clinical goals and achievements. The authors hypothesize that transformational leadership would be more positively associated with executive directors' ratings of effectiveness, satisfaction with the leader, and subordinate extra effort, as well as the center's clinical goal achievement, than transactional or laissez-faire leadership. Separate ordinary least squares regressions were used to model each of the effectiveness measures, and general linear model regression was used to model clinical goal achievement. Results support the hypothesis and suggest that physician leadership development using the transformational leadership model may result in improved health care quality and cost control.

  8. Empirical assessment of a threshold model for sylvatic plague

    DEFF Research Database (Denmark)

    Davis, Stephen; Leirs, Herwig; Viljugrein, H.

    2007-01-01

    Plague surveillance programmes established in Kazakhstan, Central Asia, during the previous century, have generated large plague archives that have been used to parameterize an abundance threshold model for sylvatic plague in great gerbil (Rhombomys opimus) populations. Here, we assess the model...... examine six hypotheses that could explain the resulting false positive predictions, namely (i) including end-of-outbreak data erroneously lowers the estimated threshold, (ii) too few gerbils were tested, (iii) plague becomes locally extinct, (iv) the abundance of fleas was too low, (v) the climate...

  9. Empirical justification of the elementary model of money circulation

    Science.gov (United States)

    Schinckus, Christophe; Altukhov, Yurii A.; Pokrovskii, Vladimir N.

    2018-03-01

    This paper proposes an elementary model describing money circulation for a system composed of a production system, the government, a central bank, commercial banks and their customers. A set of equations for the system determines the main features of the interaction between production and money circulation. It is shown that the money system can evolve independently of the evolution of production. The model can be applied to any national economy, but we illustrate our claim in the context of the Russian monetary system.

  10. An empirical study of marketing communications effectiveness in Slovenian market

    OpenAIRE

    Jerman, Damjana; Završnik, Bruno

    2017-01-01

    This paper deals with the value, or more specifically the contribution, of marketing communications strategy to the effectiveness of marketing communications, and hypothesizes that marketing communications strategy correlates with the effectiveness of marketing communications. The paper consists of two parts: the theoretical framework for the role of marketing communications strategy in the effectiveness of marketing communications, and the empirical analysis, based on the primary data collected...

  11. An empirical study of Malaysian firms' capital structure

    OpenAIRE

    Zain, Sharifah Raihan Syed Mohd

    2003-01-01

    It is sometimes purported that one of the factors affecting a firm's value is its capital structure. The event of the 1997 Asian financial crisis was expected to affect the firms' gearing level as the firms' earnings deteriorated and the capital market collapsed. The main objective of this research is to examine empirically the determinants of the capital structure of Malaysian firms. The main additional aim is ...

  12. Service delivery innovation architecture: An empirical study of antecedents and outcomes

    Directory of Open Access Journals (Sweden)

    Rajeev Verma

    2014-06-01

    The research examines service delivery innovation architecture and its role in achieving sustainable competitive advantage for firms. The study develops and empirically examines an antecedent-based model of service delivery innovation. We collected data from 203 service sector professionals working in Mexican financial and information technology firms, and tested the proposed relationships. Further, the study investigates the moderating role of customer orientation on innovation-driven performance outcomes. Results show that customer orientation strengthens the service delivery–performance relationship. This paper aims to contribute to the strategic planning of service firms by guiding their resource allocation to ensure sustainable growth.

  13. Proposed Core Competencies and Empirical Validation Procedure in Competency Modeling: Confirmation and Classification.

    Science.gov (United States)

    Baczyńska, Anna K; Rowiński, Tomasz; Cybis, Natalia

    2016-01-01

    Competency models provide insight into key skills which are common to many positions in an organization. Moreover, there is a range of competencies that is used by many companies. Researchers have developed core competency terminology to underline their cross-organizational value. The article presents a theoretical model of core competencies consisting of two main higher-order competencies called performance and entrepreneurship. Each of them consists of three elements: the performance competency includes cooperation, organization of work and goal orientation, while entrepreneurship includes innovativeness, calculated risk-taking and pro-activeness. However, there is a lack of empirical validation of competency concepts in organizations, and this would seem crucial for obtaining reliable results from organizational research. We propose a two-step empirical validation procedure: (1) confirmatory factor analysis, and (2) classification of employees. The sample consisted of 636 respondents (M = 44.5; SD = 15.1). Participants were administered a questionnaire developed for the purpose of the study. The reliability, measured by Cronbach's alpha, ranged from 0.60 to 0.83 for the six scales. Next, we tested the model using a confirmatory factor analysis. The two separate, single models of performance and entrepreneurial orientations fit the data quite well, while a complex model based on the two single concepts needs further research. In the classification of employees based on the two higher-order competencies we obtained four main groups of employees. Their profiles relate to those found in the literature, including so-called niche finders and top performers. Some proposals for organizations are discussed.

  14. Theoretical and Empirical Review of Asset Pricing Models: A Structural Synthesis

    Directory of Open Access Journals (Sweden)

    Saban Celik

    2012-01-01

    The purpose of this paper is to give a comprehensive theoretical review devoted to asset pricing models by emphasizing static and dynamic versions in line with their empirical investigations. A considerable amount of the financial economics literature is devoted to the concept of asset pricing and its implications. The main task of an asset pricing model can be seen as evaluating the present value of payoffs or cash flows discounted for risk and time lags. The difficulty in the discounting process is that the relevant factors affecting the payoffs vary over time, whereas the theoretical framework is still useful for incorporating the changing factors into an asset pricing model. This paper fills the gap in the literature by giving a comprehensive review of the models and evaluating the historical stream of empirical investigations in the form of a structural empirical review.

  15. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    Science.gov (United States)

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  16. An auto-calibration procedure for empirical solar radiation models

    NARCIS (Netherlands)

    Bojanowski, J.S.; Donatelli, Marcello; Skidmore, A.K.; Vrieling, A.

    2013-01-01

    Solar radiation data are an important input for estimating evapotranspiration and modelling crop growth. Direct measurement of solar radiation is now carried out in most European countries, but the network of measuring stations is too sparse for reliable interpolation of measured values. Instead of

  17. HRD Interventions, Employee Competencies and Organizational Effectiveness: An Empirical Study

    Science.gov (United States)

    Potnuru, Rama Krishna Gupta; Sahoo, Chandan Kumar

    2016-01-01

    Purpose: The purpose of the study is to examine the impact of human resource development (HRD) interventions on organizational effectiveness by means of employee competencies which are built by some of the selected HRD interventions. Design/methodology/approach: An integrated research model has been developed by combining the principal factors…

  18. Block Empirical Likelihood for Longitudinal Single-Index Varying-Coefficient Model

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2013-01-01

    In this paper, we consider a single-index varying-coefficient model with application to longitudinal data. In order to accommodate the within-group correlation, we apply the block empirical likelihood procedure to the longitudinal single-index varying-coefficient model, and prove a nonparametric version of Wilks' theorem which can be used to construct the block empirical likelihood confidence region with asymptotically correct coverage probability for the parametric component. In comparison with normal approximations, the proposed method does not require a consistent estimator for the asymptotic covariance matrix, making it easier to conduct inference for the model's parametric component. Simulations demonstrate how the proposed method works.

  19. Cloud Computing (SaaS) Adoption as a Strategic Technology: Results of an Empirical Study

    Directory of Open Access Journals (Sweden)

    Pedro R. Palos-Sanchez

    2017-01-01

    The present study empirically analyzes the factors that determine the adoption of the cloud computing (SaaS) model in firms where this strategy is considered strategic for executing their activity. A research model has been developed to evaluate the factors that influence the intention of using cloud computing, combining the variables found in the technology acceptance model (TAM) with other external variables such as top management support, training, communication, organization size, and technological complexity. Data compiled from 150 companies in Andalusia (Spain) are used to test the formulated hypotheses. The results of this study reflect what critical factors should be considered and how they are interrelated. They also show the organizational demands that must be considered by those companies wishing to implement a real management model adapted to the digital economy, especially those related to cloud computing.

  20. Empirical modeling of drying kinetics and microwave assisted extraction of bioactive compounds from Adathoda vasica

    Directory of Open Access Journals (Sweden)

    Prithvi Simha

    2016-03-01

    To highlight the shortcomings in conventional methods of extraction, this study investigates the efficacy of Microwave Assisted Extraction (MAE) toward bioactive compound recovery from the pharmaceutically significant medicinal plants Adathoda vasica and Cymbopogon citratus. Initially, the microwave (MW) drying behavior of the plant leaves was investigated at different sample loadings, MW power and drying time. Kinetics was analyzed through empirical modeling of the drying data against 10 conventional thin-layer drying equations that were further improvised through the incorporation of Arrhenius, exponential and linear-type expressions. 81 semi-empirical Midilli equations were derived and subjected to non-linear regression to arrive at the characteristic drying equations. Recovery of bioactive compounds from the leaves was examined under various parameters through a comparative approach that studied MAE against Soxhlet extraction. MAE of A. vasica gave similar yields with a drastic reduction in extraction time (210 s, as against the average time of 10 h in the Soxhlet apparatus). Extract yield for MAE of C. citratus was higher than for the conventional process, with optimal parameters determined to be 20 g sample load, 1:20 sample/solvent ratio, extraction time of 150 s and 300 W output power. Scanning Electron Microscopy and Fourier Transform Infrared Spectroscopy were performed to depict changes in internal leaf morphology.
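
    For reference, the Midilli-type thin-layer drying form underlying the equations above is commonly written in the drying literature as follows (the exact parameterisation used in this study may differ):

      MR = \frac{M_t - M_e}{M_0 - M_e} = a \, \exp\!\left(-k \, t^{\,n}\right) + b \, t

    where MR is the moisture ratio, M_t, M_0 and M_e are the moisture contents at time t, initially and at equilibrium, and a, b, k and n are fitted coefficients; the Arrhenius, exponential and linear-type expressions mentioned above are presumably incorporated by making these coefficients functions of the drying conditions.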

  1. Use of empirical likelihood to calibrate auxiliary information in partly linear monotone regression models.

    Science.gov (United States)

    Chen, Baojiang; Qin, Jing

    2014-05-10

    In statistical analysis, a regression model is needed if one is interested in finding the relationship between a response variable and covariates. When the response depends on a covariate, it may do so through an unknown function of that covariate. If one has no knowledge of this functional form but expects it to be monotonically increasing or decreasing, then the isotonic regression model is preferable. Estimation of parameters for isotonic regression models is based on the pool-adjacent-violators algorithm (PAVA), where the monotonicity constraints are built in. With missing data, people often employ the augmented estimating method to improve estimation efficiency by incorporating auxiliary information through a working regression model. However, under the framework of the isotonic regression model, PAVA does not work because the monotonicity constraints are violated. In this paper, we develop an empirical likelihood-based method for the isotonic regression model to incorporate the auxiliary information. Because the monotonicity constraints still hold, PAVA can be used for parameter estimation. Simulation studies demonstrate that the proposed method can yield more efficient estimates, and in some situations the efficiency improvement is substantial. We apply this method to a dementia study. Copyright © 2013 John Wiley & Sons, Ltd.
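
    As an illustration of the pool-adjacent-violators algorithm referenced above, the following is a minimal sketch (not the authors' code; names and example data are illustrative) of PAVA for a non-decreasing weighted least-squares fit:

      import numpy as np

      def pava_increasing(y, w=None):
          """Pool-adjacent-violators: weighted least-squares fit under the
          constraint that the fitted values are non-decreasing."""
          y = np.asarray(y, dtype=float)
          w = np.ones_like(y) if w is None else np.asarray(w, dtype=float)
          blocks = []  # each block holds [weighted mean, total weight, number of points]
          for yi, wi in zip(y, w):
              blocks.append([yi, wi, 1])
              # Merge backwards while the monotonicity constraint is violated.
              while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
                  m2, w2, n2 = blocks.pop()
                  m1, w1, n1 = blocks.pop()
                  wt = w1 + w2
                  blocks.append([(w1 * m1 + w2 * m2) / wt, wt, n1 + n2])
          # Expand block means back to one fitted value per observation.
          return np.concatenate([[m] * n for m, _, n in blocks])

      # Example: a noisy but roughly increasing response.
      print(pava_increasing([1.0, 3.0, 2.0, 5.0, 4.5]))  # -> 1.0, 2.5, 2.5, 4.75, 4.75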

  2. Health Status and Health Dynamics in an Empirical Model of Expected Longevity*

    Science.gov (United States)

    Benítez-Silva, Hugo; Ni, Huan

    2010-01-01

    Expected longevity is an important factor influencing older individuals’ decisions such as consumption, savings, purchase of life insurance and annuities, claiming of Social Security benefits, and labor supply. It has also been shown to be a good predictor of actual longevity, which in turn is highly correlated with health status. A relatively new literature on health investments under uncertainty, which builds upon the seminal work by Grossman (1972), has directly linked longevity with characteristics, behaviors, and decisions by utility maximizing agents. Our empirical model can be understood within that theoretical framework as estimating a production function of longevity. Using longitudinal data from the Health and Retirement Study, we directly incorporate health dynamics in explaining the variation in expected longevities, and compare two alternative measures of health dynamics: the self-reported health change, and the computed health change based on self-reports of health status. In 38% of the reports in our sample, computed health changes are inconsistent with the direct report on health changes over time. And another 15% of the sample can suffer from information losses if computed changes are used to assess changes in actual health. These potentially serious problems raise doubts regarding the use and interpretation of the computed health changes and even the lagged measures of self-reported health as controls for health dynamics in a variety of empirical settings. Our empirical results, controlling for both subjective and objective measures of health status and unobserved heterogeneity in reporting, suggest that self-reported health changes are a preferred measure of health dynamics. PMID:18187217

  3. β-empirical Bayes inference and model diagnosis of microarray data

    Directory of Open Access Journals (Sweden)

    Hossain Mollah Mohammad

    2012-06-01

    Background: Microarray data enables the high-throughput survey of mRNA expression profiles at the genomic level; however, the data presents a challenging statistical problem because of the large number of transcripts with small sample sizes that are obtained. To reduce the dimensionality, various Bayesian or empirical Bayes hierarchical models have been developed. However, because of the complexity of the microarray data, no model can explain the data fully. It is generally difficult to scrutinize the irregular patterns of expression that are not expected by the usual statistical gene-by-gene models. Results: As an extension of empirical Bayes (EB) procedures, we have developed the β-empirical Bayes (β-EB) approach based on a β-likelihood measure which can be regarded as an 'evidence-based' weighted (quasi-) likelihood inference. The weight of a transcript t is described as a power function of its likelihood, fβ(yt|θ). Genes with low likelihoods have unexpected expression patterns and low weights. By assigning low weights to outliers, the inference becomes robust. The value of β, which controls the balance between robustness and efficiency, is selected by maximizing the predictive β0-likelihood by cross-validation. The proposed β-EB approach identified six significant (p < 10−5) contaminated transcripts as differentially expressed (DE) in normal/tumor tissues from the head and neck of cancer patients. These six genes were all confirmed to be related to cancer; they were not identified as DE genes by the classical EB approach. When applied to the eQTL analysis of Arabidopsis thaliana, the proposed β-EB approach identified some potential master regulators that were missed by the EB approach. Conclusions: The simulation data and real gene expression data showed that the proposed β-EB method was robust against outliers. The distribution of the weights was used to scrutinize the irregular patterns of expression and diagnose the model.

  4. Antecedents of Employee Loyalty in Educational Setting: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Sabrina O. Sihombing

    2018-01-01

    No studies have been conducted that link the three variables of work values, internal marketing, and job satisfaction in predicting employee loyalty. Therefore, this research aims to fill the gap by developing a model that includes work values, internal marketing, and job satisfaction in assessing employee loyalty in an educational context. This research applies judgmental sampling with a sample size of 200 lecturers from private universities in Tangerang. Structural equation modeling was applied in testing the research hypotheses. The results showed that one out of the three hypotheses was not supported, namely the relationship between internal marketing and job satisfaction.

  5. Libor and Swap Market Models for the Pricing of Interest Rate Derivatives : An Empirical Analysis

    NARCIS (Netherlands)

    de Jong, F.C.J.M.; Driessen, J.J.A.G.; Pelsser, A.

    2000-01-01

    In this paper we empirically analyze and compare the Libor and Swap Market Models, developed by Brace, Gatarek, and Musiela (1997) and Jamshidian (1997), using panel data on prices of US caplets and swaptions. A Libor Market Model can directly be calibrated to observed prices of caplets, whereas a

  6. An improved empirical model for diversity gain on Earth-space propagation paths

    Science.gov (United States)

    Hodge, D. B.

    1981-01-01

    An empirical model was generated to estimate diversity gain on Earth-space propagation paths as a function of Earth terminal separation distance, link frequency, elevation angle, and angle between the baseline and the path azimuth. The resulting model reproduces the entire experimental data set with an RMS error of 0.73 dB.

  7. Comparing Multidimensional and Continuum Models of Vocabulary Acquisition: An Empirical Examination of the Vocabulary Knowledge Scale

    Science.gov (United States)

    Stewart, Jeffrey; Batty, Aaron Olaf; Bovee, Nicholas

    2012-01-01

    Second language vocabulary acquisition has been modeled both as multidimensional in nature and as a continuum wherein the learner's knowledge of a word develops along a cline from recognition through production. In order to empirically examine and compare these models, the authors assess the degree to which the Vocabulary Knowledge Scale (VKS;…

  8. A semi-empirical model for predicting crown diameter of cedrela ...

    African Journals Online (AJOL)

    A semi-empirical model relating age and breast height has been developed to predict individual tree crown diameter for Cedrela odorata (L) plantation in the moist evergreen forest zones of Ghana. The model was based on field records of 269 trees, and could determine the crown cover dynamics, forecast time of canopy ...

  9. Stochastic Modeling of Empirical Storm Loss in Germany

    Science.gov (United States)

    Prahl, B. F.; Rybski, D.; Kropp, J. P.; Burghoff, O.; Held, H.

    2012-04-01

    Based on German insurance loss data for residential property we derive storm damage functions that relate daily loss to maximum gust wind speed. Over a wide range of loss, steep power law relationships are found with spatially varying exponents ranging between approximately 8 and 12. Global correlations between parameters and socio-demographic data are employed to reduce the number of local parameters to 3. We apply a Monte Carlo approach to calculate German loss estimates, including confidence bounds, at daily and annual resolution. Our model reproduces the annual progression of winter storm losses and enables the estimation of daily losses over a wide range of magnitudes.
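
    A minimal sketch of the power-law damage function and Monte Carlo aggregation described above; the threshold, loss scale, exponent and gust climatology below are assumed placeholders, not the values fitted to the German data:

      import numpy as np

      rng = np.random.default_rng(42)

      def daily_loss(gust, v_ref=25.0, scale=1.0e3, exponent=10.0):
          """Power-law storm damage function: daily loss scales as
          (gust / v_ref) ** exponent above a reference gust speed, else zero.
          All parameter values here are assumed placeholders."""
          gust = np.asarray(gust, dtype=float)
          return np.where(gust > v_ref, scale * (gust / v_ref) ** exponent, 0.0)

      def annual_loss_quantiles(n_years=1000, days=365):
          """Monte Carlo estimate of annual losses with 5/50/95% bounds,
          driven by a synthetic (assumed) gust climatology."""
          gusts = rng.weibull(2.0, size=(n_years, days)) * 12.0
          annual = daily_loss(gusts).sum(axis=1)
          return np.percentile(annual, [5, 50, 95])

      print(annual_loss_quantiles())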

  10. Toward an Empirically-based Parametric Explosion Spectral Model

    Science.gov (United States)

    Ford, S. R.; Walter, W. R.; Ruppert, S.; Matzel, E.; Hauk, T. F.; Gok, R.

    2010-12-01

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never occurred. We develop a parametric model of the nuclear explosion seismic source spectrum derived from regional phases (Pn, Pg, and Lg) that is compatible with earthquake-based geometrical spreading and attenuation. Earthquake spectra are fit with a generalized version of the Brune spectrum, which is a three-parameter model that describes the long-period level, corner frequency, and spectral slope at high frequencies. These parameters are then correlated with near-source geology and containment conditions. There is a correlation of high gas porosity (low strength) with increased spectral slope. However, there are trade-offs between the slope and corner frequency, which we try to independently constrain using Mueller-Murphy relations and coda-ratio techniques. The relationship between the parametric equation and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source, and aid in the prediction of observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing.
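
    For context, a generalized Brune-type source spectrum with the three parameters named above (long-period level Omega_0, corner frequency f_c, and high-frequency falloff rate n) is commonly written as follows; the study's exact parameterization may differ:

      S(f) = \frac{\Omega_0}{1 + \left(f / f_c\right)^{n}}

    with n = 2 recovering the classical Brune spectrum.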

  11. Benefits of Applying Hierarchical Models to the Empirical Green's Function Approach

    Science.gov (United States)

    Denolle, M.; Van Houtte, C.

    2017-12-01

    Stress drops calculated from source spectral studies currently show larger variability than what is implied by empirical ground motion models. One of the potential origins of the inflated variability is the simplified model-fitting techniques used in most source spectral studies. This study improves upon these existing methods, and shows that the fitting method may explain some of the discrepancy. In particular, Bayesian hierarchical modelling is shown to be a method that can reduce bias, better quantify uncertainties and allow additional effects to be resolved. The method is applied to the Mw7.1 Kumamoto, Japan earthquake, and other global, moderate-magnitude, strike-slip earthquakes between Mw5 and Mw7.5. It is shown that the variation of the corner frequency, fc, and the falloff rate, n, across the focal sphere can be reliably retrieved without overfitting the data. Additionally, it is shown that methods commonly used to calculate corner frequencies can give substantial biases. In particular, if fc were calculated for the Kumamoto earthquake using a model with a falloff rate fixed at 2 instead of the best fit 1.6, the obtained fc would be as large as twice its realistic value. The reliable retrieval of the falloff rate allows deeper examination of this parameter for a suite of global, strike-slip earthquakes, and its scaling with magnitude. The earthquake sequences considered in this study are from Japan, New Zealand, Haiti and California.

  12. Ensemble empirical mode decomposition and neuro-fuzzy conjunction model for middle and long-term runoff forecast

    Science.gov (United States)

    Tan, Q.

    2017-12-01

    Forecasting runoff over longer periods, such as months and years, is one of the important tasks for hydrologists and water resource managers seeking to maximize the potential of limited water. However, due to the nonlinear and nonstationary characteristics of natural runoff, it is hard to forecast middle and long-term runoff with satisfactory accuracy. It has been proven that forecast performance can be improved by using signal decomposition techniques to produce cleaner signals as model inputs. In this study, a new conjunction model (EEMD-neuro-fuzzy) with adaptive ability is proposed. Ensemble empirical mode decomposition (EEMD) is used to decompose the runoff time series into several components, which have different frequencies and are cleaner than the original time series. Then a neuro-fuzzy model is developed for each component. The final forecast results can be obtained by summing the outputs of all neuro-fuzzy models. Unlike a conventional forecast model, the decomposition and forecast models in this study are adjusted adaptively whenever new runoff information is added. The proposed models are applied to forecast the monthly runoff of Yichang station, located on the Yangtze River of China. The results show that the proposed adaptive forecast model outperforms the conventional forecast model; the Nash-Sutcliffe efficiency coefficient reaches 0.9392. Due to its ability to process nonstationary data, forecast accuracy, especially in the flood season, is improved significantly.
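
    A minimal structural sketch of the decompose-model-sum scheme described above. It assumes the PyEMD package (pip package EMD-signal) for EEMD and substitutes a generic scikit-learn regressor for the neuro-fuzzy component models; the lag choice and synthetic series are illustrative:

      import numpy as np
      from PyEMD import EEMD                            # assumed: pip install EMD-signal
      from sklearn.neural_network import MLPRegressor   # stand-in for a neuro-fuzzy model

      def lagged_matrix(x, n_lags=3):
          """Build (X, y) pairs where y[t] is predicted from the n_lags previous values."""
          X = np.column_stack([x[i:len(x) - n_lags + i] for i in range(n_lags)])
          return X, x[n_lags:]

      def forecast_next(runoff, n_lags=3):
          """Decompose the series with EEMD, fit one model per component,
          and sum the component forecasts for the next time step."""
          imfs = EEMD().eemd(np.asarray(runoff, dtype=float))
          total = 0.0
          for comp in imfs:
              X, y = lagged_matrix(comp, n_lags)
              model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
              model.fit(X, y)
              total += model.predict(comp[-n_lags:].reshape(1, -1))[0]
          return total

      # Example with a synthetic monthly runoff series (not Yichang data).
      t = np.arange(240)
      series = 50 + 20 * np.sin(2 * np.pi * t / 12) + np.random.default_rng(0).normal(0, 3, t.size)
      print(forecast_next(series))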

  13. On Integrating Student Empirical Software Engineering Studies with Research and Teaching Goals

    NARCIS (Netherlands)

    Galster, Matthias; Tofan, Dan; Avgeriou, Paris

    2012-01-01

    Background: Many empirical software engineering studies use students as subjects and are conducted as part of university courses. Aim: We aim at reporting our experiences with using guidelines for integrating empirical studies with our research and teaching goals. Method: We document our experience

  14. Empirical Scientific Research and Legal Studies Research--A Missing Link

    Science.gov (United States)

    Landry, Robert J., III

    2016-01-01

    This article begins with an overview of what is meant by empirical scientific research in the context of legal studies. With that backdrop, the argument is presented that without engaging in normative, theoretical, and doctrinal research in tandem with empirical scientific research, the role of legal studies scholarship in making meaningful…

  15. Soil Moisture Estimate under Forest using a Semi-empirical Model at P-Band

    Science.gov (United States)

    Truong-Loi, M.; Saatchi, S.; Jaruwatanadilok, S.

    2013-12-01

    In this paper we show the potential of a semi-empirical algorithm to retrieve soil moisture under forests using P-band polarimetric SAR data. In past decades, several remote sensing techniques have been developed to estimate surface soil moisture. In most studies associated with radar sensing of soil moisture, the proposed algorithms are focused on bare or sparsely vegetated surfaces where the effect of vegetation can be ignored. At long wavelengths such as L-band, empirical or physical models such as the Small Perturbation Model (SPM) provide reasonable estimates of surface soil moisture at depths of 0-5 cm. However, for densely covered vegetated surfaces such as forests, the problem becomes more challenging because the vegetation canopy is a complex scattering environment. For this reason there have been only a few studies in the literature focusing on retrieving soil moisture under a vegetation canopy. Moghaddam et al. developed an algorithm to estimate soil moisture under a boreal forest using L- and P-band SAR data. For their studied area, double-bounce between trunks and ground appeared to be the most important scattering mechanism. Therefore, they implemented parametric models of radar backscatter for double-bounce using simulations of a numerical forest scattering model. Hajnsek et al. showed the potential of estimating soil moisture under agricultural vegetation using L-band polarimetric SAR data and polarimetric-decomposition techniques to remove the vegetation layer. Here we use an approach based on a physical formulation of the dominant scattering mechanisms and three parameters that integrate the vegetation and soil effects at long wavelengths. The algorithm is a simplification of a 3-D coherent model of forest canopy based on the Distorted Born Approximation (DBA). The simplified model has three equations and three unknowns, preserving the three dominant scattering mechanisms of volume, double-bounce and surface for three polarized backscattering

  16. Integration of least angle regression with empirical Bayes for multi-locus genome-wide association studies

    Science.gov (United States)

    Multi-locus genome-wide association studies have become the state-of-the-art procedure to identify quantitative trait loci (QTL) associated with traits simultaneously. However, implementation of multi-locus models is still difficult. In this study, we integrated least angle regression with empirical B...

  17. Feature Evaluation for Building Facade Images - AN Empirical Study

    Science.gov (United States)

    Yang, M. Y.; Förstner, W.; Chai, D.

    2012-08-01

    The classification of building facade images is a challenging problem that receives a great deal of attention in the photogrammetry community. Image classification is critically dependent on the features. In this paper, we perform an empirical feature evaluation task for building facade images. The feature sets we choose are basic features, color features, histogram features, Peucker features, texture features, and SIFT features. We present an approach for region-wise labeling using an efficient randomized decision forest classifier and local features. We conduct our experiments on building facade image classification with the eTRIMS dataset, where our focus is on the object classes building, car, door, pavement, road, sky, vegetation, and window.
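
    A minimal sketch of the region-wise labeling step described above, using scikit-learn's randomized decision forest on precomputed per-region feature vectors; the feature extraction itself (color, histogram, texture, SIFT descriptors) is assumed to happen elsewhere, and the data below are placeholders:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      CLASSES = ["building", "car", "door", "pavement", "road", "sky", "vegetation", "window"]

      # Assumed inputs: one local feature vector per image region plus a ground-truth label.
      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(500, 64))           # placeholder feature vectors
      y_train = rng.integers(0, len(CLASSES), 500)   # placeholder labels

      clf = RandomForestClassifier(n_estimators=100, random_state=0)
      clf.fit(X_train, y_train)

      # Label the regions of a new facade image.
      X_regions = rng.normal(size=(10, 64))
      for label_idx in clf.predict(X_regions):
          print(CLASSES[label_idx])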

  18. Organizational design and knowledge performance: An empirical study

    Directory of Open Access Journals (Sweden)

    Enrique Claver-Cortés

    2008-07-01

    The paper analyzes how the traditional variables that define a firm's organizational structure (formalization, complexity, and centralization) influence knowledge performance (the degree to which a firm generates knowledge and uses it to reach a competitive advantage). Three hypotheses are tested using a sample of 164 large Spanish firms. The results show that organizational complexity and centralization exert a positive and a negative influence, respectively, on knowledge performance, which confirms the two hypotheses related to these variables. However, the analysis reveals no empirical evidence to confirm the hypothesis that formalization and knowledge performance are positively related.

  19. An empirical comparison of alternate regime-switching models for electricity spot prices

    Energy Technology Data Exchange (ETDEWEB)

    Janczura, Joanna [Hugo Steinhaus Center, Institute of Mathematics and Computer Science, Wroclaw University of Technology, 50-370 Wroclaw (Poland); Weron, Rafal [Institute of Organization and Management, Wroclaw University of Technology, 50-370 Wroclaw (Poland)

    2010-09-15

    One of the most profound features of electricity spot prices is the occurrence of price spikes. Markov regime-switching (MRS) models seem to be a natural candidate for modeling this spiky behavior. However, in the studies published so far, the goodness-of-fit of the proposed models has not been a major focus. While most of the models were elegant, their fit to empirical data has either not been examined thoroughly or the signs of a bad fit ignored. With this paper we want to fill the gap. We calibrate and test a range of MRS models in an attempt to find parsimonious specifications that not only address the main characteristics of electricity prices but are statistically sound as well. We find that the best structure is that of an independent spike 3-regime model with time-varying transition probabilities, heteroscedastic diffusion-type base regime dynamics and shifted spike regime distributions. Not only does it allow for a seasonal spike intensity throughout the year and consecutive spikes or price drops, which is consistent with market observations, but it also exhibits the 'inverse leverage effect' reported in the literature for spot electricity prices. (author)

  20. An empirical comparison of alternate regime-switching models for electricity spot prices

    International Nuclear Information System (INIS)

    Janczura, Joanna; Weron, Rafal

    2010-01-01

    One of the most profound features of electricity spot prices is the occurrence of price spikes. Markov regime-switching (MRS) models seem to be a natural candidate for modeling this spiky behavior. However, in the studies published so far, the goodness-of-fit of the proposed models has not been a major focus. While most of the models were elegant, their fit to empirical data has either not been examined thoroughly or the signs of a bad fit ignored. With this paper we want to fill the gap. We calibrate and test a range of MRS models in an attempt to find parsimonious specifications that not only address the main characteristics of electricity prices but are statistically sound as well. We find that the best structure is that of an independent spike 3-regime model with time-varying transition probabilities, heteroscedastic diffusion-type base regime dynamics and shifted spike regime distributions. Not only does it allow for a seasonal spike intensity throughout the year and consecutive spikes or price drops, which is consistent with market observations, but it also exhibits the 'inverse leverage effect' reported in the literature for spot electricity prices. (author)

  1. An empirical study of factors affecting inflation in Republic of Tajikistan

    OpenAIRE

    Qurbanalieva, Nigina

    2013-01-01

    This paper investigates the core factors affecting the price level in the Republic of Tajikistan by using autoregressive distributed lag (ARDL) and Johansen-Juselius cointegration models. The empirical analysis is based on a dataset of demand-pull and cost-push inflation indicators. We used monthly data for the period 2005 to 2012. The findings of this study reveal that in the long run the exchange rate, world wheat prices, world oil prices and labor supply Granger-cause the price level. Neverthele...

  2. An empirically-based model for the lift coefficients of twisted airfoils with leading-edge tubercles

    Science.gov (United States)

    Ni, Zao; Su, Tsung-chow; Dhanak, Manhar

    2018-04-01

    Experimental data for untwisted airfoils are utilized to propose a model for predicting the lift coefficients of twisted airfoils with leading-edge tubercles. The effectiveness of the empirical model is verified through comparison with results of a corresponding computational fluid-dynamic (CFD) study. The CFD study is carried out for both twisted and untwisted airfoils with tubercles, the latter shown to compare well with available experimental data. Lift coefficients of twisted airfoils predicted from the proposed empirically-based model match well with the corresponding coefficients determined using the verified CFD study. Flow details obtained from the latter provide better insight into the underlying mechanism and behavior at stall of twisted airfoils with leading edge tubercles.

  3. Empirical research on Kano's model and customer satisfaction.

    Science.gov (United States)

    Lin, Feng-Han; Tsai, Sang-Bing; Lee, Yu-Cheng; Hsiao, Cheng-Fu; Zhou, Jie; Wang, Jiangtao; Shang, Zhiwen

    2017-01-01

    Products are now developed based on what customers desire, and thus attractive quality creation has become crucial. In studies on customer satisfaction, methods for analyzing quality attributes and enhancing customer satisfaction have been proposed to facilitate product development. Although substantial studies have been performed to assess the impact of the attributes on customer satisfaction, little research has been conducted that quantitatively calculates the odds of customer satisfaction for the Kano classification, fitting a nonlinear relationship between attribute-level performance and customer satisfaction. In the present study, the odds of customer satisfaction were determined to identify the classification of quality attributes, and customer psychology was taken into account to suggest how decision-makers should prioritize the allocation of resources. A novel method for quantitatively assessing quality attributes was proposed to determine classification criteria and fit the nonlinear relationship between quality attributes and customer satisfaction. Subsequently, a case study was conducted on bicycle user satisfaction to verify the novel method. The concept of customer satisfaction odds was integrated with the value function from prospect theory to understand quality attributes. The results of this study can serve as a reference for product designers to create attractive quality attributes in their products and thus enhance customer satisfaction.

  4. Empirical research on Kano’s model and customer satisfaction

    Science.gov (United States)

    Lin, Feng-Han; Tsai, Sang-Bing; Lee, Yu-Cheng; Hsiao, Cheng-Fu; Zhou, Jie; Wang, Jiangtao; Shang, Zhiwen

    2017-01-01

    Products are now developed based on what customers desire, and thus attractive quality creation has become crucial. In studies on customer satisfaction, methods for analyzing quality attributes and enhancing customer satisfaction have been proposed to facilitate product development. Although substantial studies have been performed to assess the impact of the attributes on customer satisfaction, little research has been conducted that quantitatively calculates the odds of customer satisfaction for the Kano classification, fitting a nonlinear relationship between attribute-level performance and customer satisfaction. In the present study, the odds of customer satisfaction were determined to identify the classification of quality attributes, and customer psychology was taken into account to suggest how decision-makers should prioritize the allocation of resources. A novel method for quantitatively assessing quality attributes was proposed to determine classification criteria and fit the nonlinear relationship between quality attributes and customer satisfaction. Subsequently, a case study was conducted on bicycle user satisfaction to verify the novel method. The concept of customer satisfaction odds was integrated with the value function from prospect theory to understand quality attributes. The results of this study can serve as a reference for product designers to create attractive quality attributes in their products and thus enhance customer satisfaction. PMID:28873418

  5. Determinants of Business Success – Theoretical Model and Empirical Verification

    Directory of Open Access Journals (Sweden)

    Kozielski Robert

    2016-12-01

    Market knowledge, market orientation, learning competencies, and business performance were the key issues of the research project conducted in the 2006 study. The main findings identified significant relationships between the independent variables (market knowledge, market orientation, learning competencies) and the dependent variable (business success). A partial correlation analysis indicated that business success primarily relies on organisational learning competencies. Organisational learning competencies, to a large extent (almost 60%), may be explained by the level of corporate market knowledge and market orientation. The aim of the paper is to evaluate to what extent the relationships between the variables are still valid. The research was based on primary and secondary data sources. The main field research was carried out in the form of quantitative studies. The results of the 2014 study are consistent with the previous (2006) results.

  6. Understanding virtual world usage : A multipurpose model and empirical testing

    NARCIS (Netherlands)

    Verhagen, Tibert; Feldberg, Frans; Van Den Hooff, Bart; Meents, Selmar

    2009-01-01

    This study reports an attempt to enhance our understanding of the reasons behind virtual world usage. By providing a mixture of utilitarian and hedonic value, virtual worlds represent an emerging class of multipurpose information systems (MPIS). Previous research seems to fall short in explaining

  7. Financial power laws: Empirical evidence, models, and mechanisms

    International Nuclear Information System (INIS)

    Lux, Thomas; Alfarano, Simone

    2016-01-01

    Financial markets (share markets, foreign exchange markets and others) are all characterized by a number of universal power laws. The most prominent example is the ubiquitous finding of a robust, approximately cubic power law characterizing the distribution of large returns. A similarly robust feature is long-range dependence in volatility (i.e., hyperbolic decline of its autocorrelation function). The recent literature adds temporal scaling of trading volume and multi-scaling of higher moments of returns. Increasing awareness of these properties has recently spurred attempts at theoretical explanations of the emergence of these key characteristics from the market process. In principle, different types of dynamic processes could be responsible for these power laws. Examples to be found in the economics literature include multiplicative stochastic processes as well as dynamic processes with multiple equilibria. Though both types of dynamics are characterized by intermittent behavior which occasionally generates large bursts of activity, they can be based on fundamentally different perceptions of the trading process. The present paper reviews both the analytical background of the power laws emerging from the above data-generating mechanisms and pertinent models proposed in the economics literature.
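
    In symbols, the two stylized facts emphasized above are a fat-tailed distribution of large returns and a hyperbolically decaying volatility autocorrelation:

      P\left(|r_t| > x\right) \sim x^{-\alpha}, \quad \alpha \approx 3, \qquad
      \mathrm{corr}\left(|r_t|,\, |r_{t+\tau}|\right) \sim \tau^{-\gamma}, \quad 0 < \gamma < 1 .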

  8. A theoretical and empirical evaluation and extension of the Todaro migration model.

    Science.gov (United States)

    Salvatore, D

    1981-11-01

    "This paper postulates that it is theoretically and empirically preferable to base internal labor migration on the relative difference in rural-urban real income streams and rates of unemployment, taken as separate and independent variables, rather than on the difference in the expected real income streams as postulated by the very influential and often quoted Todaro model. The paper goes on to specify several important ways of extending the resulting migration model and improving its empirical performance." The analysis is based on Italian data. excerpt

  9. An empirical study of race times in recreational endurance runners.

    Science.gov (United States)

    Vickers, Andrew J; Vertosick, Emily A

    2016-01-01

    Studies of endurance running have typically involved elite athletes, small sample sizes and measures that require special expertise or equipment. We examined factors associated with race performance and explored methods for race time prediction using information routinely available to a recreational runner. An Internet survey was used to collect data from recreational endurance runners (N = 2303). The cohort was split 2:1 into a training set and validation set to create models to predict race time. Sex, age, BMI and race training were associated with mean race velocity for all race distances. The difference in velocity between males and females decreased with increasing distance. Tempo runs were more strongly associated with velocity for shorter distances, while typical weekly training mileage and interval training had similar associations with velocity for all race distances. The commonly used Riegel formula for race time prediction was well-calibrated for races up to a half-marathon, but dramatically underestimated marathon time, giving times at least 10 min too fast for half of runners. We built two models to predict marathon time. The mean squared error for Riegel was 381 compared to 228 (model based on one prior race) and 208 (model based on two prior races). Our findings can be used to inform race training and to provide more accurate race time predictions for better pacing.
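
    For reference, the Riegel formula mentioned above predicts a time T2 at distance D2 from a known time T1 at distance D1 as T2 = T1 * (D2 / D1) ** b, commonly with b = 1.06; a minimal sketch (the exponent is the conventional default, not a value fitted in this study):

      def riegel_time(t1_seconds, d1_km, d2_km, exponent=1.06):
          """Riegel race-time prediction: T2 = T1 * (D2 / D1) ** exponent.
          The 1.06 exponent is the commonly used default, not a fitted value
          from the study above."""
          return t1_seconds * (d2_km / d1_km) ** exponent

      # Example: predict a marathon (42.195 km) from a 1:45:00 half-marathon.
      half = 105 * 60
      marathon = riegel_time(half, 21.0975, 42.195)
      print(f"predicted marathon: {marathon / 3600:.2f} h")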

  10. An Empirical Competence-Capability Model of Supply Chain Innovation

    OpenAIRE

    Mandal, Santanu

    2016-01-01

    Supply chain innovation has become the new pre-requisite for the survival of firms in developing capabilities and strategies for sustaining their operations and performance in the market. This study investigates the influence of supply and demand competence on supply chain innovation and its influence on a firm’s operational and relational performance. While the former competence refers to production and supply management related activities, the latter refers to distribution and demand manage...

  11. Empirical wind retrieval model based on SAR spectrum measurements

    Science.gov (United States)

    Panfilova, Maria; Karaev, Vladimir; Balandina, Galina; Kanevsky, Mikhail; Portabella, Marcos; Stoffelen, Ad

    The present paper considers polarimetric SAR wind vector applications. Remote-sensing measurements of the near-surface wind over the ocean are of great importance for the understanding of atmosphere-ocean interaction. In recent years investigations for wind vector retrieval using Synthetic Aperture Radar (SAR) data have been performed. In contrast with scatterometers, a SAR has a finer spatial resolution that makes it a more suitable microwave instrument to explore wind conditions in the marginal ice zones, coastal regions and lakes. The wind speed retrieval procedure from scatterometer data matches the measured radar backscattering signal with the geophysical model function (GMF). The GMF determines the radar cross section dependence on the wind speed and direction with respect to the azimuthal angle of the radar beam. Scatterometers provide information on wind speed and direction simultaneously due to the fact that each wind vector cell (WVC) is observed at several azimuth angles. However, SAR is not designed to be used as a high resolution scatterometer. In this case, each WVC is observed at only one single azimuth angle. That is why for wind vector determination additional information such as wind streak orientation over the sea surface is required. It is shown that the wind vector can be obtained using polarimetric SAR without additional information. The main idea is to analyze the spectrum of a homogeneous SAR image area instead of the backscattering normalized radar cross section. Preliminary numerical simulations revealed that SAR image spectral maxima positions depend on the wind vector. Thus the following method for wind speed retrieval is proposed. In the first stage of the algorithm, the SAR spectrum maxima are determined. This procedure is carried out to estimate the wind speed and direction with ambiguities separated by 180 degrees due to the SAR spectrum symmetry. The second stage of the algorithm allows us to select the correct wind direction

  12. An empirical probability model of detecting species at low densities.

    Science.gov (United States)

    Delaney, David G; Leung, Brian

    2010-06-01

    False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.
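
    A minimal sketch of the detection-curve idea described above: a logistic regression mapping sampling intensity and target density to detection probability, from which the false-negative probability of a given survey design can be read off. The data and coefficients below are synthetic placeholders, not the field data of the study:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)

      # Illustrative survey outcomes: detected (1) or not (0) as a function of
      # sampling intensity (e.g. minutes searched) and target density (per m^2).
      n = 400
      intensity = rng.uniform(1, 60, n)
      density = rng.uniform(0.01, 2.0, n)
      logit = -4.0 + 0.06 * intensity + 2.5 * density   # assumed true relationship
      detected = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      X = np.column_stack([intensity, density])
      model = LogisticRegression().fit(X, detected)

      # Probability of detecting a target at low density with 30 minutes of effort.
      p = model.predict_proba([[30.0, 0.05]])[0, 1]
      print(f"detection probability: {p:.2f}, false-negative probability: {1 - p:.2f}")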

  13. An empirical model of diagnostic x-ray attenuation under narrow-beam geometry

    International Nuclear Information System (INIS)

    Mathieu, Kelsey B.; Kappadath, S. Cheenu; White, R. Allen; Atkinson, E. Neely; Cody, Dianna D.

    2011-01-01

    Purpose: The purpose of this study was to develop and validate a mathematical model to describe narrow-beam attenuation of kilovoltage x-ray beams for the intended applications of half-value layer (HVL) and quarter-value layer (QVL) estimations, patient organ shielding, and computer modeling. Methods: An empirical model, which uses the Lambert W function and represents a generalized Lambert-Beer law, was developed. To validate this model, transmission of diagnostic energy x-ray beams was measured over a wide range of attenuator thicknesses [0.49-33.03 mm Al on a computed tomography (CT) scanner, 0.09-1.93 mm Al on two mammography systems, and 0.1-0.45 mm Cu and 0.49-14.87 mm Al using general radiography]. Exposure measurements were acquired under narrow-beam geometry using standard methods, including the appropriate ionization chamber, for each radiographic system. Nonlinear regression was used to find the best-fit curve of the proposed Lambert W model to each measured transmission versus attenuator thickness data set. In addition to validating the Lambert W model, we also assessed the performance of two-point Lambert W interpolation compared to traditional methods for estimating the HVL and QVL [i.e., semilogarithmic (exponential) and linear interpolation]. Results: The Lambert W model was validated for modeling attenuation versus attenuator thickness with respect to the data collected in this study (R2 > 0.99). Furthermore, Lambert W interpolation was more accurate and less sensitive to the choice of interpolation points used to estimate the HVL and/or QVL than the traditional methods of semilogarithmic and linear interpolation. Conclusions: The proposed Lambert W model accurately describes attenuation of both monoenergetic radiation and (kilovoltage) polyenergetic beams (under narrow-beam geometry).

  14. An empirical model of diagnostic x-ray attenuation under narrow-beam geometry.

    Science.gov (United States)

    Mathieu, Kelsey B; Kappadath, S Cheenu; White, R Allen; Atkinson, E Neely; Cody, Dianna D

    2011-08-01

    The purpose of this study was to develop and validate a mathematical model to describe narrow-beam attenuation of kilovoltage x-ray beams for the intended applications of half-value layer (HVL) and quarter-value layer (QVL) estimations, patient organ shielding, and computer modeling. An empirical model, which uses the Lambert W function and represents a generalized Lambert-Beer law, was developed. To validate this model, transmission of diagnostic energy x-ray beams was measured over a wide range of attenuator thicknesses [0.49-33.03 mm Al on a computed tomography (CT) scanner, 0.09-1.93 mm Al on two mammography systems, and 0.1-0.45 mm Cu and 0.49-14.87 mm Al using general radiography]. Exposure measurements were acquired under narrow-beam geometry using standard methods, including the appropriate ionization chamber, for each radiographic system. Nonlinear regression was used to find the best-fit curve of the proposed Lambert W model to each measured transmission versus attenuator thickness data set. In addition to validating the Lambert W model, we also assessed the performance of two-point Lambert W interpolation compared to traditional methods for estimating the HVL and QVL [i.e., semi-logarithmic (exponential) and linear interpolation]. The Lambert W model was validated for modeling attenuation versus attenuator thickness with respect to the data collected in this study (R2 > 0.99). Furthermore, Lambert W interpolation was more accurate and less sensitive to the choice of interpolation points used to estimate the HVL and/or QVL than the traditional methods of semilogarithmic and linear interpolation. The proposed Lambert W model accurately describes attenuation of both monoenergetic radiation and (kilovoltage) polyenergetic beams (under narrow-beam geometry).
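
    A minimal sketch of the generic workflow described above: nonlinear regression of measured transmission versus attenuator thickness, followed by value-layer estimation from the fitted curve. A simple single-exponential is used as a placeholder model function; the paper's generalized Lambert-Beer (Lambert W) form is not reproduced here, and the measurement values are invented:

      import numpy as np
      from scipy.optimize import brentq, curve_fit

      # Placeholder attenuation model; substitute the Lambert W based form where available.
      def transmission(x_mm, mu):
          return np.exp(-mu * x_mm)

      # Assumed narrow-beam measurements: attenuator thickness (mm Al) vs. relative exposure.
      thickness = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])
      measured = np.array([1.00, 0.82, 0.68, 0.47, 0.23, 0.06])

      (mu_fit,), _ = curve_fit(transmission, thickness, measured, p0=[0.2])

      # HVL/QVL: thicknesses at which the fitted transmission drops to 1/2 and 1/4.
      hvl = brentq(lambda x: transmission(x, mu_fit) - 0.5, 0.0, thickness.max())
      qvl = brentq(lambda x: transmission(x, mu_fit) - 0.25, 0.0, thickness.max())
      print(f"HVL = {hvl:.2f} mm Al, QVL = {qvl:.2f} mm Al")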

  15. The Effect of Private Benefits of Control on Minority Shareholders: A Theoretical Model and Empirical Evidence from State Ownership

    Directory of Open Access Journals (Sweden)

    Kerry Liu

    2017-06-01

    Purpose: The purpose of this paper is to examine the effect of private benefits of control on minority shareholders. Design/methodology/approach: A theoretical model is established. The empirical analysis includes hand-collected data from a wide range of data sources. OLS and 2SLS regression analyses are applied with Huber-White standard errors. Findings: The theoretical model shows that, while private benefits are generally harmful to minority shareholders, the overall effect depends on the size of large shareholder ownership. The empirical evidence from government ownership is consistent with the theoretical analysis. Research limitations/implications: The empirical evidence is based on a small number of hand-collected data sets of government ownership. Further studies can be expanded to other types of ownership, such as family ownership and financial institutional ownership. Originality/value: This study is the first to theoretically analyse and empirically test the effect of private benefits. In general, this study significantly contributes to the understanding of the effect of large shareholders and corporate governance.

  16. Empirical models of the electron concentration of the ionosphere and their value for radio communications purposes

    International Nuclear Information System (INIS)

    Dudeney, J.R.; Kressman, R.I.

    1986-01-01

    Criteria for the development of ionosphere electron concentration vertical profile empirical models for radio communications purposes are discussed and used to evaluate and compare four contemporary schemes. Schemes must be optimized with respect to quality of profile match, availability and simplicity of the external data required for profile specification, and numerical complexity, depending on the application. It is found that the Dudeney (1978) scheme provides the best general performance, while the Booker (1977) technique is optimized for precision radio wave studies where an observed profile is available. The performance of the CCIR (Bradley and Dudeney, 1973) scheme is found to be inferior to the previous two, and that scheme should be superseded except where mathematical simplicity is prioritized. The International Reference Ionosphere profile is seen to have significant disadvantages with respect to all three criteria. 17 references

  17. A nurse manager succession planning model with associated empirical outcomes.

    Science.gov (United States)

    Titzer, Jennifer L; Shirey, Maria R; Hauck, Sheila

    2014-01-01

    Perceptions of leadership and management competency after a formal nurse manager succession planning program were evaluated. A lack of strategic workforce planning and development of a leadership pipeline contributes to a predicted nurse manager shortage. To meet the anticipated needs for future leadership, evidence-based action is critical. A quasi-experimental mixed-methods, 1-group pretest/posttest research design was used. Nurses working in an acute care hospital were recruited for the study and selected using an objective evaluative process. Participant perceptions regarding their leadership and management competencies significantly increased after the leadership program. Program evaluations confirmed that participants found the program beneficial. One year after program completion, 100% of the program participants have been retained at the organization and 73% had transitioned to leadership roles. Succession planning and leadership development serve as beneficial and strategic mechanisms for identifying and developing high-potential individuals for leadership positions, contributing toward the future nursing leadership pipeline.

  18. Modeling for Determinants of Human Trafficking: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Seo-Young Cho

    2015-02-01

    This study aims to identify robust push and pull factors of human trafficking. I test for the robustness of 70 push and 63 pull factors suggested in the literature. In doing so, I employ an extreme bounds analysis, running more than two million regressions with all possible combinations of variables for up to 153 countries during the period 1995–2010. My results show that crime prevalence robustly explains human trafficking both in destination and origin countries. Income level also has a robust impact, suggesting that the causes of human trafficking overlap with those of economic migration. Law enforcement matters more in origin countries than in destination countries. Interestingly, a very low level of gender equality may have constraining effects on human trafficking outflows, possibly because gender discrimination limits the female mobility that is necessary for the occurrence of human trafficking.
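
    A minimal sketch of the extreme bounds analysis loop described above: the factor of interest stays in every regression while combinations of the other candidate variables rotate in, and the factor is called robust if its coefficient keeps the same sign across all specifications (the simplest possible robustness rule). Data, variable names and the rule are illustrative:

      from itertools import combinations

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)

      # Illustrative cross-country data: outcome and candidate determinants.
      n = 150
      data = {name: rng.normal(size=n) for name in
              ["crime", "income", "law_enforcement", "gender_eq", "migration"]}
      y = 0.8 * data["crime"] + 0.5 * data["income"] + rng.normal(size=n)

      focus = "crime"                      # variable being tested for robustness
      others = [k for k in data if k != focus]

      coefs = []
      for k in range(1, 4):                # rotate in 1-3 control variables at a time
          for combo in combinations(others, k):
              X = sm.add_constant(np.column_stack([data[focus]] + [data[c] for c in combo]))
              res = sm.OLS(y, X).fit()
              coefs.append(res.params[1])  # coefficient on the focus variable

      coefs = np.array(coefs)
      robust = np.all(coefs > 0) or np.all(coefs < 0)
      print(f"{focus}: coefficient range [{coefs.min():.2f}, {coefs.max():.2f}], robust={robust}")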

  19. Comparison of physical and semi-empirical hydraulic models for flood inundation mapping

    Science.gov (United States)

    Tavakoly, A. A.; Afshari, S.; Omranian, E.; Feng, D.; Rajib, A.; Snow, A.; Cohen, S.; Merwade, V.; Fekete, B. M.; Sharif, H. O.; Beighley, E.

    2016-12-01

    Various hydraulic/GIS-based tools can be used for illustrating the spatial extent of flooding for first responders, policy makers and the general public. The objective of this study is to compare four flood inundation modeling tools: HEC-RAS-2D, Gridded Surface Subsurface Hydrologic Analysis (GSSHA), AutoRoute and Height Above the Nearest Drainage (HAND). There is a trade-off among accuracy, workability and computational demand in detailed, physics-based flood inundation models (e.g. HEC-RAS-2D and GSSHA) in contrast with semi-empirical, topography-based, computationally less expensive approaches (e.g. AutoRoute and HAND). The motivation for this study is to evaluate this trade-off and offer guidance for potential large-scale application in an operational prediction system. The models were assessed and contrasted via comparability analysis (e.g. overlapping statistics) using three case studies in the states of Alabama, Texas, and West Virginia. The sensitivity and accuracy of the physical and semi-empirical models in producing inundation extent were evaluated for the following attributes: geophysical characteristics (e.g. high topographic variability vs. flat natural terrain, urbanized vs. rural zones, effect of the surface roughness parameter value), influence of hydraulic structures such as dams and levees compared to unobstructed flow conditions, accuracy in large vs. small study domains, and effect of spatial resolution in topographic data (e.g. 10 m National Elevation Dataset vs. 0.3 m LiDAR). Preliminary results suggest that in a flat, urbanized area with a controlled/managed river channel, semi-empirical models tend to underestimate the inundation extent by around 40% compared to the physical models, regardless of topographic resolution. However, in places with topographic undulations, semi-empirical models attain a relatively higher level of accuracy than they do in flat non-urbanized terrain.
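
    As a rough illustration of the kind of comparability analysis mentioned above, the sketch below computes an intersection-over-union overlap ("fit") statistic between two binary inundation maps. The array values, and the idea of comparing a HAND-style map against a hydraulic-model map on a toy grid, are illustrative assumptions and are not taken from the study.

```python
# A minimal sketch (not the study's code): overlap statistic between two
# boolean inundation rasters, F = |A ∩ B| / |A ∪ B|.
import numpy as np

def inundation_fit(map_a: np.ndarray, map_b: np.ndarray) -> float:
    """Return the intersection-over-union overlap of two boolean rasters."""
    a, b = map_a.astype(bool), map_b.astype(bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return float("nan")  # neither map predicts any flooding
    return float(np.logical_and(a, b).sum() / union)

# toy 3x3 example: a semi-empirical (HAND-style) map vs. a hydraulic-model map
hand_map = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 0]])
hecras_map = np.array([[1, 1, 1], [1, 1, 0], [0, 0, 0]])
print(f"fit statistic: {inundation_fit(hand_map, hecras_map):.2f}")
```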

  20. Parameterization of water vapor using high-resolution GPS data and empirical models

    Science.gov (United States)

    Ningombam, Shantikumar S.; Jade, Sridevi; Shrungeshwara, T. S.

    2018-03-01

    The present work evaluates eleven existing empirical models for estimating Precipitable Water Vapor (PWV) over a high-altitude (4500 m amsl), cold-desert environment. These models have been tested extensively and used globally to estimate PWV for low-altitude sites (below 1000 m amsl). The moist parameters used in the models are: water vapor scale height (Hc), dew point temperature (Td) and water vapor pressure (Es0). These moist parameters are derived from surface air temperature and relative humidity measured at high temporal resolution from an automated weather station. The performance of these models is examined statistically against observed high-resolution GPS (GPSPWV) data over the region (2005-2012). The correlation coefficient (R) between the observed GPSPWV and model PWV is 0.98 for daily data and varies diurnally from 0.93 to 0.97. Parameterization of the moisture parameters was studied in depth (i.e., 2 h to monthly time scales) using GPSPWV, Td, and Es0. The slope of the linear relationship between GPSPWV and Td varies from 0.073°C-1 to 0.106°C-1 (R: 0.83 to 0.97), while that between GPSPWV and Es0 varies from 1.688 to 2.209 (R: 0.95 to 0.99) at daily, monthly and diurnal time scales. In addition, the moist parameters for the cold-desert, high-altitude environment are examined in depth at various time scales during 2005-2012.
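
    The linear parameterizations reported above (PWV against Td or Es0) can be reproduced in principle with an ordinary least-squares fit. The sketch below uses synthetic placeholder arrays rather than the station record, so the fitted slope and intercept are purely illustrative.

```python
# Minimal sketch, assuming synthetic data: fit PWV ≈ a*Td + b and report R.
import numpy as np

td_degC = np.array([-10.0, -6.0, -2.0, 0.0, 3.0, 7.0])   # dew point (assumed values)
pwv_mm = np.array([1.1, 1.5, 1.9, 2.1, 2.4, 2.8])         # GPS PWV (assumed values)

slope, intercept = np.polyfit(td_degC, pwv_mm, deg=1)
r = np.corrcoef(td_degC, pwv_mm)[0, 1]
print(f"PWV ≈ {slope:.3f}*Td + {intercept:.3f}   (R = {r:.2f})")
```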

  1. The Role of Light and Music in Gambling Behaviour: An Empirical Pilot Study

    Science.gov (United States)

    Spenwyn, Jenny; Barrett, Doug J. K.; Griffiths, Mark D.

    2010-01-01

    Empirical research examining the situational characteristics of gambling and their effect on gambling behaviour is limited but growing. This experimental pilot investigation reports the first ever empirical study into the combined effects of both music and light on gambling behaviour. While playing an online version of roulette, 56 participants…

  2. Models for governing relationships in healthcare organizations: Some empirical evidence.

    Science.gov (United States)

    Romiti, Anna; Del Vecchio, Mario; Grazzini, Maddalena

    2018-01-01

    Recently, most European countries have undergone integration processes through mergers and strategic alliances between healthcare organizations. The present paper examined three cases within the Italian National Health Service in order to determine how different organizations, within differing institutional contexts, govern a healthcare integration process. Furthermore, we explored the possibility that the governance modes, usually seen as alternatives (i.e., merger or alliance), could be considered as separate steps in the development of a more suitable integration process. Multiple case studies were used to compare different integration approaches. Specifically, three cases were considered, of which two were characterized by collaborative processes and the other by a merger. Semi-structured interviews were conducted with managers involved in the processes. Each case presents different governing modes, structures, and mechanisms for achieving integration. The role played by the institutional context also led to different results, with unique advantages and disadvantages. Three main conclusions are discussed: (a) alliances and mergers can be interpreted as different steps in a path leading to better integration; (b) alignment between the institutional/political time horizon and the time the organizations need to achieve integration leads to better integration; (c) trust plays an important role in the integration process, operating at different levels: that built at the institutional and organizational level and that built between people.

  3. Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise

    Science.gov (United States)

    Brown, Patrick T.; Li, Wenhong; Cordero, Eugene C.; Mauget, Steven A.

    2015-01-01

    The comparison of observed global mean surface air temperature (GMT) change to the mean change simulated by climate models has received much public and scientific attention. For a given global warming signal produced by a climate model ensemble, there exists an envelope of GMT values representing the range of possible unforced states of the climate system (the Envelope of Unforced Noise; EUN). Typically, the EUN is derived from climate models themselves, but climate models might not accurately simulate the correct characteristics of unforced GMT variability. Here, we simulate a new, empirical, EUN that is based on instrumental and reconstructed surface temperature records. We compare the forced GMT signal produced by climate models to observations while noting the range of GMT values provided by the empirical EUN. We find that the empirical EUN is wide enough so that the interdecadal variability in the rate of global warming over the 20th century does not necessarily require corresponding variability in the rate-of-increase of the forced signal. The empirical EUN also indicates that the reduced GMT warming over the past decade or so is still consistent with a middle emission scenario's forced signal, but is likely inconsistent with the steepest emission scenario's forced signal. PMID:25898351

  4. Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise.

    Science.gov (United States)

    Brown, Patrick T; Li, Wenhong; Cordero, Eugene C; Mauget, Steven A

    2015-04-21

    The comparison of observed global mean surface air temperature (GMT) change to the mean change simulated by climate models has received much public and scientific attention. For a given global warming signal produced by a climate model ensemble, there exists an envelope of GMT values representing the range of possible unforced states of the climate system (the Envelope of Unforced Noise; EUN). Typically, the EUN is derived from climate models themselves, but climate models might not accurately simulate the correct characteristics of unforced GMT variability. Here, we simulate a new, empirical, EUN that is based on instrumental and reconstructed surface temperature records. We compare the forced GMT signal produced by climate models to observations while noting the range of GMT values provided by the empirical EUN. We find that the empirical EUN is wide enough so that the interdecadal variability in the rate of global warming over the 20(th) century does not necessarily require corresponding variability in the rate-of-increase of the forced signal. The empirical EUN also indicates that the reduced GMT warming over the past decade or so is still consistent with a middle emission scenario's forced signal, but is likely inconsistent with the steepest emission scenario's forced signal.

  5. Knowledge-oriented strategies in the metal industry (empirical studies)

    Directory of Open Access Journals (Sweden)

    A. Krawczyk-Sołtys

    2016-07-01

    Full Text Available The aim of this article is to determine which knowledge-oriented strategies can give metal industry enterprises the best results in achieving and maintaining a competitive advantage. To determine which of the knowledge-oriented strategies discussed in the literature and implemented in various organizations may prove most effective in the metal industry, empirical research has begun. The chosen knowledge management strategy and its supporting strategies form the basis for choosing the methods and means of implementation. The choice of a specific knowledge management strategy may also result in the need for changes in an organization, particularly in its information system, internal communication, work organization and human resource management.

  6. FEATURE EVALUATION FOR BUILDING FACADE IMAGES – AN EMPIRICAL STUDY

    Directory of Open Access Journals (Sweden)

    M. Y. Yang

    2012-08-01

    Full Text Available The classification of building facade images is a challenging problem that receives a great deal of attention in the photogrammetry community. Image classification is critically dependent on the features. In this paper, we perform an empirical feature evaluation task for building facade images. The feature sets we choose are basic features, color features, histogram features, Peucker features, texture features, and SIFT features. We present an approach for region-wise labeling using an efficient randomized decision forest classifier and local features. We conduct our experiments on building facade image classification with the eTRIMS dataset, where our focus is on the object classes building, car, door, pavement, road, sky, vegetation, and window.

  7. Empirical Study on Arbitrage Opportunities in China Copper Futures Market

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A no-arbitrage bound is established with no-arbitrage theory, considering all kinds of trading costs, different deposit and loan interest rates, margins and taxes in futures markets. The empirical results find that there are many lower-bound arbitrage opportunities in the China copper futures market from August 8th, 2003 to August 16th, 2005. Concretely, no-arbitrage opportunities are dominant and lower-bound arbitrage is narrow in the normal market segment. Lower-bound arbitrage almost always exists with huge magnitude in the inverted market segment. There is basically no arbitrage in the normal market because spot volume is sufficient, so that upper- or lower-bound arbitrage can be realized. There is mostly lower-bound arbitrage in the inverted market because spot volume is lacking.

  8. AN EMPIRICAL STUDY OF MARKETING COMMUNICATIONS EFFECTIVENESS IN SLOVENIAN MARKET

    Directory of Open Access Journals (Sweden)

    Damjana Jerman

    2014-01-01

    Full Text Available This paper deals with the value, or more specifically the contribution, of marketing communications strategy to the effectiveness of marketing communications, and hypothesizes that marketing communications strategy correlates with the effectiveness of marketing communications. The paper consists of two parts: the theoretical framework for the role of marketing communications strategy in the effectiveness of marketing communications, and the empirical analysis, based on the primary data collected. The concept of marketing communication effectiveness assumes that there are variables that can have a positive influence on the effectiveness of marketing communications, which incorporates facets of the marketing communication strategy and bidirectional communications. The results suggest that Slovenian organisations which design and implement a marketing communication strategy also have more effective marketing communications. The development of a marketing communications strategy was correlated with increased effectiveness of marketing communications in their organisation. Managerial implications are discussed along with directions for further research.

  9. Empirical Results of Modeling EUR/RON Exchange Rate using ARCH, GARCH, EGARCH, TARCH and PARCH models

    Directory of Open Access Journals (Sweden)

    Andreea – Cristina PETRICĂ

    2017-03-01

    Full Text Available The aim of this study consists in examining the changes in the volatility of daily returns of the EUR/RON exchange rate using, on the one hand, symmetric GARCH models (ARCH and GARCH) and, on the other hand, asymmetric GARCH models (EGARCH, TARCH and PARCH), since the conditional variance is time-varying. The analysis takes into account daily quotations of the EUR/RON exchange rate over the period 04th January 1999 to 13th June 2016. Thus, we are modeling heteroscedasticity by applying different specifications of GARCH models, followed by looking for significant parameters and low information criteria (minimum Akaike Information Criterion). All models are estimated using the maximum likelihood method under the assumption of several distributions of the innovation terms, such as: Normal (Gaussian) distribution, Student's t distribution, Generalized Error Distribution (GED), Student's t with fixed df distribution, and GED with fixed parameter distribution. The predominant models turned out to be the EGARCH and PARCH models, and the empirical results point out that the best model for estimating daily returns of the EUR/RON exchange rate is EGARCH(2,1) with asymmetric order 2 under the assumption of Student's t distributed innovation terms. This can be explained by the fact that in the case of the EGARCH model, the restriction regarding the positivity of the conditional variance is automatically satisfied.
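
    For readers who want to reproduce this kind of specification, the sketch below fits an EGARCH model with two symmetric and two asymmetric lags and Student's t innovations using the Python `arch` package (assumed to be installed). The return series is a synthetic stand-in for the EUR/RON daily returns, so the estimates are illustrative only.

```python
# Minimal sketch: EGARCH(2,1) with asymmetric order 2 and Student's t errors.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(0)
returns = pd.Series(rng.standard_t(df=5, size=4000) * 0.3)  # placeholder for EUR/RON returns (%)

model = arch_model(returns, mean="Constant", vol="EGARCH", p=2, o=2, q=1, dist="t")
result = model.fit(disp="off")
print(result.summary())
print("AIC:", result.aic)
```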

  10. Empirical evidence of study design biases in randomized trials

    DEFF Research Database (Denmark)

    Page, Matthew J.; Higgins, Julian P. T.; Clayton, Gemma

    2016-01-01

    search September 2012), and searched Ovid MEDLINE and Ovid EMBASE for studies indexed from Jan 2012-May 2015. Data were extracted by one author and verified by another. We combined estimates of average bias (e.g. ratio of odds ratios (ROR) or difference in standardised mean differences (dSMD)) in meta-analyses using the random-effects model. Analyses were stratified by type of outcome ("mortality" versus "other objective" versus "subjective"). Direction of effect was standardised so that ROR SMD ... studies). For these characteristics, the average bias appeared to be larger in trials of subjective outcomes compared with other objective outcomes. Also, intervention effects for subjective outcomes appear to be exaggerated in trials with lack of/unclear blinding of participants (versus blinding) (dSMD...

  11. Sustaining the environment through recycling: an empirical study.

    Science.gov (United States)

    Ramayah, T; Lee, Jason Wai Chow; Lim, Shuwen

    2012-07-15

    This paper examines the determinants of recycling behaviour among 200 university students from the perspective of the theory of planned behaviour (TPB). Data were analysed using the Structural Equation Modelling technique. Findings indicate that environmental awareness was significantly related to attitude towards recycling, whilst attitude and social norms had a significant impact on recycling behaviour. However, convenience and cost of recycling were not significant reasons for recycling. The study has enhanced the understanding of the determinants of recycling behaviour and has implications for schools and governmental agencies in educating and encouraging positive recycling behaviour. It also confirms the appropriateness of the TPB in examining studies of this nature. Further suggestions for future research are offered. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. An Empirical Study Of User Acceptance Of Online Social Networks Marketing

    Directory of Open Access Journals (Sweden)

    Olumayowa Mulero

    2013-07-01

    Full Text Available The explosion of Internet usage has drawn the attention of researchers towards online Social Networks Marketing (SNM). Research has shown that a number of Internet users are distrustful and indecisive when it comes to the use of social networks marketing systems. Therefore, there is a need for researchers to identify some of the factors that determine users' acceptance of social networks marketing using the Technology Acceptance Model (TAM). This study extended the Technology Acceptance Model theoretical framework to predict consumer acceptance of social networks marketing within the Western Cape Province of South Africa. The research model was tested using data collected from 470 questionnaires and analysed using linear regression. The results showed that user intentions to use SNM are strongly and positively correlated with user acceptance of SNM systems. Empirical results confirmed that perceived credibility and perceived usefulness are the strongest determinants in predicting user intentions to use an SNM system.

  13. An Empirical Study on Washback Effects of the Internet-Based College English Test Band 4 in China

    Science.gov (United States)

    Wang, Chao; Yan, Jiaolan; Liu, Bao

    2014-01-01

    Based on Bailey's washback model, in respect of participants, process and products, the present empirical study was conducted to find the actual washback effects of the internet-based College English Test Band 4 (IB CET-4). The methods adopted are questionnaires, class observation, interview and the analysis of both the CET-4 teaching and testing…

  14. An Empirical Study on Market Timing Theory of Capital Structure

    Directory of Open Access Journals (Sweden)

    Ignatius Rony Setyawan

    2016-12-01

    Full Text Available The theory of capital structure has advanced remarkably. This development began as many firms had options to consider various external factors determining the composition of debt and equity. Not only did the asymmetric information or the conflict among bondholders and shareholders initiate the Pecking Order Theory and the Static Trade-off Theory, respectively, but the overvaluation or undervaluation of the stock price also had to be taken as a determinant factor for identifying the ideal debt-equity mix. The author maintains these factors, as they were pioneers of the Market Timing Theory (MTT) introduced by Baker and Wurgler (2002). The essence of this theory is that when stock prices are overvalued, firms will finance projects through debt; otherwise, when undervalued, firms will rely on equity financing. Using the methodology introduced by Baker and Wurgler (2002), the author selected only samples of IPOs of firms during 2008-2009 to limit the scope of this study. The main objective of this study is to test the hypotheses of the Market Timing Theory formulated by Dahlan (2004) and by Kusumawati and Danny (2006), which have been proven by the GLS model and the OLS model, as in Baker and Wurgler (2002), Susilawati (2008) and Saad (2010). This study concludes that the market-to-book ratio has a negative effect on market leverage. The implication is that when firms achieve a certain level of earnings growth, the stock price will be overvalued, so it would be the right time for firms to proceed with equity financing. Under the robustness test with GLS Random Effects, the hypotheses of MTT are supported.

  15. An Empirical Study on Market Timing Theory of Capital Structure

    Directory of Open Access Journals (Sweden)

    Ignatius Rony Setyawan

    2011-08-01

    Full Text Available The theory of capital structure has advanced remarkably. This development began as many firms had options to consider various external factors determining the composition of debt and equity. Not only did the asymmetric information or the conflict among bondholders and shareholders initiate the Pecking Order Theory and the Static Trade-off Theory, respectively, but the overvaluation or undervaluation of the stock price also had to be taken as a determinant factor for identifying the ideal debt-equity mix. The author maintains these factors, as they were pioneers of the Market Timing Theory (MTT) introduced by Baker and Wurgler (2002). The essence of this theory is that when stock prices are overvalued, firms will finance projects through debt; otherwise, when undervalued, firms will rely on equity financing. Using the methodology introduced by Baker and Wurgler (2002), the author selected only samples of IPOs of firms during 2008-2009 to limit the scope of this study. The main objective of this study is to test the hypotheses of the Market Timing Theory formulated by Dahlan (2004) and by Kusumawati and Danny (2006), which have been proven by the GLS model and the OLS model, as in Baker and Wurgler (2002), Susilawati (2008) and Saad (2010). This study concludes that the market-to-book ratio has a negative effect on market leverage. The implication is that when firms achieve a certain level of earnings growth, the stock price will be overvalued, so it would be the right time for firms to proceed with equity financing. Under the robustness test with GLS Random Effects, the hypotheses of MTT are supported.

  16. Development of efficient air-cooling strategies for lithium-ion battery module based on empirical heat source model

    International Nuclear Information System (INIS)

    Wang, Tao; Tseng, K.J.; Zhao, Jiyun

    2015-01-01

    Thermal modeling is the key issue in thermal management of lithium-ion battery systems, and cooling strategies need to be carefully investigated to guarantee that the temperature of batteries in operation stays within a narrow optimal range, as well as to provide cost-effective and energy-saving solutions for the cooling system. This article reviews and summarizes past cooling methods, especially forced air cooling, and introduces an empirical heat source model which can be widely applied in battery module/pack thermal modeling. In the development of the empirical heat source model, a three-dimensional computational fluid dynamics (CFD) method is employed, and thermal insulation experiments are conducted to provide the key parameters. A transient thermal model of a 5 × 5 battery module with forced air cooling is then developed based on the empirical heat source model. Thermal behaviors of the battery module under different air cooling conditions, discharge rates and ambient temperatures are characterized and summarized. Various cooling strategies are simulated and compared in order to obtain an optimal cooling method. Besides, battery fault conditions are predicted from transient simulation scenarios. The temperature distributions and variations during the discharge process are quantitatively described, and it is found that the upper limit of ambient temperature for forced air cooling is 35 °C, and when the ambient temperature is lower than 20 °C, forced air cooling is not necessary. - Highlights: • An empirical heat source model is developed for battery thermal modeling. • Different air-cooling strategies on module thermal characteristics are investigated. • Impact of different discharge rates on module thermal responses is investigated. • Impact of ambient temperatures on module thermal behaviors is investigated. • Locations of maximum temperatures under different operation conditions are studied.
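
    The paper's empirical heat source model is calibrated from its own thermal-insulation experiments and is not reproduced here; the sketch below only illustrates the general idea of driving a lumped (single-node) module thermal model with a simple Bernardi-type heat source. All parameter values are assumptions chosen for illustration.

```python
# Minimal sketch: lumped battery thermal model, m*cp*dT/dt = Q_gen - hA*(T - T_amb),
# with a generic Bernardi-type heat source (irreversible I^2*R plus reversible term).
import numpy as np

def heat_generation(current_A, resistance_ohm, temp_K, dUdT=-1e-4):
    """Heat generation in watts (assumed parameter values)."""
    return current_A ** 2 * resistance_ohm - current_A * temp_K * dUdT

def simulate(t_end_s=3600.0, dt=1.0, current_A=10.0, mass_kg=1.0,
             cp_J_per_kgK=1000.0, hA_W_per_K=0.5, T_amb_K=298.15):
    """Explicit-Euler integration of the single-node energy balance."""
    temps = [T_amb_K]
    for _ in np.arange(0.0, t_end_s, dt):
        T = temps[-1]
        q = heat_generation(current_A, resistance_ohm=0.02, temp_K=T)
        temps.append(T + dt * (q - hA_W_per_K * (T - T_amb_K)) / (mass_kg * cp_J_per_kgK))
    return np.array(temps)

print(f"peak cell temperature: {simulate().max() - 273.15:.1f} °C")
```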

  17. A semi-empirical model for the prediction of fouling in railway ballast using GPR

    Science.gov (United States)

    Bianchini Ciampoli, Luca; Tosti, Fabio; Benedetto, Andrea; Alani, Amir M.; Loizos, Andreas; D'Amico, Fabrizio; Calvi, Alessandro

    2016-04-01

    The first step in planning the renewal of a railway network consists in gathering information, as effectively as possible, about the state of the railway tracks. Nowadays, this activity is mostly carried out by digging trenches at regular intervals along the whole network to evaluate both the geometrical and geotechnical properties of the railway track bed. This involves issues mainly concerning the invasiveness of the operations, the impacts on rail traffic, the high costs, and the low significance of such a discrete data set. Ground-penetrating radar (GPR) can represent a useful technique for overcoming these issues, as it can be directly mounted onto a train crossing the railway and collect continuous information along the network. This study is aimed at defining an empirical model for the prediction of fouling in railway ballast using GPR. With this purpose, a thorough laboratory campaign was implemented within the facilities of Roma Tre University. In more detail, a 1.47 m long × 1.47 m wide × 0.48 m high plexiglass framework, representing the domain of investigation, was laid over a perfect electric conductor and filled up with several configurations of railway ballast and fouling material (clayey sand), thereby representing different levels of fouling. Then, the set of fouling configurations was surveyed with several GPR systems. In particular, a ground-coupled multi-channel radar (600 MHz and 1600 MHz center frequency antennas) and three air-launched radar systems (1000 MHz and 2000 MHz center frequency antennas) were employed for surveying the materials. By observing the results both in the time and frequency domains, interesting insights are highlighted, and an empirical model, relating in particular the shape of the frequency spectrum of the signal to the percentage of fouling characterizing the surveyed material, is finally proposed. Acknowledgement The Authors thank COST for funding the Action TU1208 "Civil
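
    The proposed empirical model relates the shape of the frequency spectrum to the fouling percentage. As a purely illustrative sketch of the signal-processing side, the code below computes the amplitude spectrum of a synthetic GPR trace and a simple spectral-shape descriptor (the spectral centroid); the pulse parameters are assumptions, not the laboratory data.

```python
# Minimal sketch: amplitude spectrum and spectral centroid of a synthetic GPR trace.
import numpy as np

fs = 10e9                                    # sampling frequency, Hz (assumed)
t = np.arange(0.0, 20e-9, 1.0 / fs)          # 20 ns time window
trace = np.exp(-((t - 5e-9) ** 2) / (0.5e-9) ** 2) * np.cos(2 * np.pi * 1.6e9 * t)

spectrum = np.abs(np.fft.rfft(trace))
freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
centroid_Hz = (freqs * spectrum).sum() / spectrum.sum()
print(f"spectral centroid: {centroid_Hz / 1e9:.2f} GHz")
```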

  18. A stochastic empirical model for heavy-metal balances in Agro-ecosystems

    NARCIS (Netherlands)

    Keller, A.N.; Steiger, von B.; Zee, van der S.E.A.T.M.; Schulin, R.

    2001-01-01

    Mass flux balancing provides essential information for preventive strategies against heavy-metal accumulation in agricultural soils that may result from atmospheric deposition and application of fertilizers and pesticides. In this paper we present the empirical stochastic balance model, PROTERRA-S,

  19. Modeling Lolium perenne L. roots in the presence of empirical black holes

    Science.gov (United States)

    Plant root models are designed for understanding structural or functional aspects of root systems. When a process is not thoroughly understood, a black box object is used. However, when a process exists but empirical data do not indicate its existence, you have a black hole. The object of this re...

  20. Climate Prediction for Brazil's Nordeste: Performance of Empirical and Numerical Modeling Methods.

    Science.gov (United States)

    Moura, Antonio Divino; Hastenrath, Stefan

    2004-07-01

    Comparisons of performance of climate forecast methods require consistency in the predictand and a long common reference period. For Brazil's Nordeste, empirical methods developed at the University of Wisconsin use preseason (October–January) rainfall and January indices of the fields of meridional wind component and sea surface temperature (SST) in the tropical Atlantic and the equatorial Pacific as input to stepwise multiple regression and neural networking. These are used to predict the March–June rainfall at a network of 27 stations. An experiment at the International Research Institute for Climate Prediction, Columbia University, with a numerical model (ECHAM4.5) used global SST information through February to predict the March–June rainfall at three grid points in the Nordeste. The predictands for the empirical and numerical model forecasts are correlated at +0.96, and the period common to the independent portion of record of the empirical prediction and the numerical modeling is 1968–99. Over this period, predicted versus observed rainfall are evaluated in terms of correlation, root-mean-square error, absolute error, and bias. Performance is high for both approaches. Numerical modeling produces a correlation of +0.68, moderate errors, and strong negative bias. For the empirical methods, errors and bias are small, and correlations of +0.73 and +0.82 are reached between predicted and observed rainfall.
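
    The two empirical approaches named above, stepwise multiple regression and neural networking, can be sketched generically as below. The predictor and predictand arrays are synthetic placeholders (not the Wisconsin preseason rainfall, wind, and SST indices), and scikit-learn is assumed to be available.

```python
# Minimal sketch: multiple regression vs. a small neural network on synthetic predictors.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 3))   # stand-ins for preseason rain, wind index, SST index
y = 1.5 * X[:, 0] - 0.8 * X[:, 2] + rng.normal(scale=0.3, size=32)  # rainfall anomaly

linreg = LinearRegression().fit(X, y)
mlp = MLPRegressor(hidden_layer_sizes=(4,), max_iter=5000, random_state=0).fit(X, y)
print("regression R^2:", round(linreg.score(X, y), 2))
print("neural net R^2:", round(mlp.score(X, y), 2))
```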

  1. Understanding users’ motivations to engage in virtual worlds: A multipurpose model and empirical testing

    NARCIS (Netherlands)

    Verhagen, T.; Feldberg, J.F.M.; van den Hooff, B.J.; Meents, S.; Merikivi, J.

    2012-01-01

    Despite the growth and commercial potential of virtual worlds, relatively little is known about what drives users' motivations to engage in virtual worlds. This paper proposes and empirically tests a conceptual model aimed at filling this research gap. Given the multipurpose nature of virtual worlds

  2. MERGANSER - An Empirical Model to Predict Fish and Loon Mercury in New England Lakes

    Science.gov (United States)

    MERGANSER (MERcury Geo-spatial AssessmeNtS for the New England Region) is an empirical least-squares multiple regression model using mercury (Hg) deposition and readily obtainable lake and watershed features to predict fish (fillet) and common loon (blood) Hg in New England lakes...

  3. An empirical test of stage models of e-government development: evidence from Dutch municipalities

    NARCIS (Netherlands)

    Rooks, G.; Matzat, U.; Sadowski, B.M.

    2017-01-01

    In this article we empirically test stage models of e-government development. We use Lee's classification to make a distinction between four stages of e-government: informational, requests, personal, and e-democracy. We draw on a comprehensive data set on the adoption and development of e-government

  4. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    Science.gov (United States)

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L−1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and ...

  5. Integrating social science into empirical models of coupled human and natural systems

    Science.gov (United States)

    Jeffrey D. Kline; Eric M. White; A Paige Fischer; Michelle M. Steen-Adams; Susan Charnley; Christine S. Olsen; Thomas A. Spies; John D. Bailey

    2017-01-01

    Coupled human and natural systems (CHANS) research highlights reciprocal interactions (or feedbacks) between biophysical and socioeconomic variables to explain system dynamics and resilience. Empirical models often are used to test hypotheses and apply theory that represent human behavior. Parameterizing reciprocal interactions presents two challenges for social...

  6. Testing the robustness of the anthropogenic climate change detection statements using different empirical models

    KAUST Repository

    Imbers, J.; Lopez, A.; Huntingford, C.; Allen, M. R.

    2013-01-01

    This paper aims to test the robustness of the detection and attribution of anthropogenic climate change using four different empirical models that were previously developed to explain the observed global mean temperature changes over the last few decades. These studies postulated that the main drivers of these changes included not only the usual natural forcings, such as solar and volcanic, and anthropogenic forcings, such as greenhouse gases and sulfates, but also other known Earth system oscillations such as El Niño Southern Oscillation (ENSO) or the Atlantic Multidecadal Oscillation (AMO). In this paper, we consider these signals, or forced responses, and test whether or not the anthropogenic signal can be robustly detected under different assumptions for the internal variability of the climate system. We assume that the internal variability of the global mean surface temperature can be described by simple stochastic models that explore a wide range of plausible temporal autocorrelations, ranging from short memory processes exemplified by an AR(1) model to long memory processes, represented by a fractional differenced model. In all instances, we conclude that human-induced changes to atmospheric gas composition are affecting global mean surface temperature changes. ©2013. American Geophysical Union. All Rights Reserved.

  7. Testing the robustness of the anthropogenic climate change detection statements using different empirical models

    KAUST Repository

    Imbers, J.

    2013-04-27

    This paper aims to test the robustness of the detection and attribution of anthropogenic climate change using four different empirical models that were previously developed to explain the observed global mean temperature changes over the last few decades. These studies postulated that the main drivers of these changes included not only the usual natural forcings, such as solar and volcanic, and anthropogenic forcings, such as greenhouse gases and sulfates, but also other known Earth system oscillations such as El Niño Southern Oscillation (ENSO) or the Atlantic Multidecadal Oscillation (AMO). In this paper, we consider these signals, or forced responses, and test whether or not the anthropogenic signal can be robustly detected under different assumptions for the internal variability of the climate system. We assume that the internal variability of the global mean surface temperature can be described by simple stochastic models that explore a wide range of plausible temporal autocorrelations, ranging from short memory processes exemplified by an AR(1) model to long memory processes, represented by a fractional differenced model. In all instances, we conclude that human-induced changes to atmospheric gas composition are affecting global mean surface temperature changes. ©2013. American Geophysical Union. All Rights Reserved.
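
    The two classes of noise model considered in the study above can be illustrated with a short simulation: an AR(1) process for short-memory internal variability and a fractionally differenced (ARFIMA(0,d,0)) process for long memory. The parameter values below are illustrative assumptions, not the fitted values from the paper.

```python
# Minimal sketch: short-memory AR(1) noise vs. long-memory fractionally differenced noise.
import numpy as np

rng = np.random.default_rng(1)

def ar1(n, phi=0.6, sigma=0.1):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(scale=sigma)
    return x

def frac_diff_noise(n, d=0.4, sigma=0.1):
    """ARFIMA(0,d,0) noise built from the MA(inf) weights of (1-B)^(-d)."""
    k = np.arange(1, n)
    weights = np.concatenate(([1.0], np.cumprod((k - 1 + d) / k)))
    eps = rng.normal(scale=sigma, size=n)
    return np.convolve(eps, weights)[:n]

short = ar1(1500)
long_mem = frac_diff_noise(1500)
print("lag-50 autocorrelation, AR(1): ", round(np.corrcoef(short[:-50], short[50:])[0, 1], 3))
print("lag-50 autocorrelation, ARFIMA:", round(np.corrcoef(long_mem[:-50], long_mem[50:])[0, 1], 3))
```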

  8. A comparative empirical analysis of statistical models for evaluating highway segment crash frequency

    Directory of Open Access Journals (Sweden)

    Bismark R.D.K. Agbelie

    2016-08-01

    Full Text Available The present study conducted an empirical highway segment crash frequency analysis on the basis of fixed-parameters negative binomial and random-parameters negative binomial models. Using 4 years of data from a total of 158 highway segments, with a total of 11,168 crashes, the results from both models were presented, discussed, and compared. About 58% of the selected variables produced normally distributed parameters across highway segments, while the remaining produced fixed parameters. The presence of a noise barrier along a highway segment would increase mean annual crash frequency by 0.492 for 88.21% of the highway segments, and would decrease crash frequency for the remaining 11.79% of the highway segments. Besides, the number of vertical curves per mile along a segment would increase mean annual crash frequency by 0.006 for 84.13% of the highway segments, and would decrease crash frequency for the remaining 15.87% of the highway segments. Thus, constraining the parameters to be fixed across all highway segments would lead to inaccurate conclusions. Although the estimated parameters from both models showed consistency in direction, the magnitudes were significantly different. Of the two models, the random-parameters negative binomial model was found to be statistically superior in evaluating highway segment crashes compared with the fixed-parameters negative binomial model. On average, the marginal effects from the fixed-parameters negative binomial model were observed to be significantly overestimated compared with those from the random-parameters negative binomial model.
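
    A fixed-parameters negative binomial count model of the kind used as the baseline above can be fitted with standard software; the random-parameters version requires simulation-based estimation and is not shown. The sketch below uses statsmodels on synthetic segment data, so the variable names and coefficients are illustrative assumptions.

```python
# Minimal sketch: fixed-parameters negative binomial model for segment crash counts.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_segments = 158
X = np.column_stack([
    rng.integers(0, 2, n_segments),    # noise barrier present (0/1)
    rng.poisson(3, n_segments),        # vertical curves per mile
    rng.normal(10, 2, n_segments),     # e.g. log traffic exposure (illustrative)
])
X = sm.add_constant(X)
mu = np.exp(X @ np.array([-2.0, 0.5, 0.05, 0.3]))
crashes = rng.poisson(mu)              # synthetic crash counts

nb_fit = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(nb_fit.summary())
```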

  9. Conceptual modeling in systems biology fosters empirical findings: the mRNA lifecycle.

    Directory of Open Access Journals (Sweden)

    Dov Dori

    Full Text Available One of the main obstacles to understanding complex biological systems is the extent and rapid evolution of information, way beyond the capacity of individuals to manage and comprehend. Current modeling approaches and tools lack adequate capacity to concurrently model the structure and behavior of biological systems. Here we propose Object-Process Methodology (OPM), a holistic conceptual modeling paradigm, as a means to model biological systems both diagrammatically and textually, formally and intuitively, at any desired number of levels of detail. OPM combines objects, e.g., proteins, and processes, e.g., transcription, in a way that is simple and easily comprehensible to researchers and scholars. As a case in point, we modeled the yeast mRNA lifecycle. The mRNA lifecycle involves mRNA synthesis in the nucleus, mRNA transport to the cytoplasm, and its subsequent translation and degradation therein. Recent studies have identified specific cytoplasmic foci, termed processing bodies, that contain large complexes of mRNAs and decay factors. Our OPM model of this cellular subsystem, presented here, led to the discovery of a new constituent of these complexes, the translation termination factor eRF3. Association of eRF3 with processing bodies is observed after a long-term starvation period. We suggest that OPM can eventually serve as a comprehensive evolvable model of the entire living cell system. The model would serve as a research and communication platform, highlighting unknown and uncertain aspects that can be addressed empirically and updated consequently while maintaining consistency.

  10. Empirical evidence of design-related bias in studies of diagnostic tests

    NARCIS (Netherlands)

    Lijmer, J. G.; Mol, B. W.; Heisterkamp, S.; Bonsel, G. J.; Prins, M. H.; van der Meulen, J. H.; Bossuyt, P. M.

    1999-01-01

    CONTEXT: The literature contains a large number of potential biases in the evaluation of diagnostic tests. Strict application of appropriate methodological criteria would invalidate the clinical application of most study results. OBJECTIVE: To empirically determine the quantitative effect of study

  11. An empirical study of multidimensional fidelity of COMPASS consultation.

    Science.gov (United States)

    Wong, Venus; Ruble, Lisa A; McGrew, John H; Yu, Yue

    2018-06-01

    Consultation is essential to the daily practice of school psychologists (National Association of School Psychologists, 2010). Successful consultation requires fidelity at both the consultant (implementation) and consultee (intervention) levels. We applied a multidimensional, multilevel conception of fidelity (Dunst, Trivette, & Raab, 2013) to a consultative intervention called the Collaborative Model for Promoting Competence and Success (COMPASS) for students with autism. The study provided 3 main findings. First, multidimensional, multilevel fidelity is a stable construct and increases over time with consultation support. Second, mediation analyses revealed that implementation-level fidelity components had distant, indirect effects on student Individualized Education Program (IEP) outcomes. Third, 3 fidelity components correlated with IEP outcomes: teacher coaching responsiveness at the implementation level, and teacher quality of delivery and student responsiveness at the intervention levels. Implications and future directions are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  12. A Time-dependent Heliospheric Model Driven by Empirical Boundary Conditions

    Science.gov (United States)

    Kim, T. K.; Arge, C. N.; Pogorelov, N. V.

    2017-12-01

    Consisting of charged particles originating from the Sun, the solar wind carries the Sun's energy and magnetic field outward through interplanetary space. The solar wind is the predominant source of space weather events, and modeling the solar wind propagation to Earth is a critical component of space weather research. Solar wind models are typically separated into coronal and heliospheric parts to account for the different physical processes and scales characterizing each region. Coronal models are often coupled with heliospheric models to propagate the solar wind out to Earth's orbit and beyond. The Wang-Sheeley-Arge (WSA) model is a semi-empirical coronal model consisting of a potential field source surface model and a current sheet model that takes synoptic magnetograms as input to estimate the magnetic field and solar wind speed at any distance above the coronal region. The current version of the WSA model takes the Air Force Data Assimilative Photospheric Flux Transport (ADAPT) model as input to provide improved time-varying solutions for the ambient solar wind structure. When heliospheric MHD models are coupled with the WSA model, density and temperature at the inner boundary are treated as free parameters that are tuned to optimal values. For example, the WSA-ENLIL model prescribes density and temperature assuming momentum flux and thermal pressure balance across the inner boundary of the ENLIL heliospheric MHD model. We consider an alternative approach of prescribing density and temperature using empirical correlations derived from Ulysses and OMNI data. We use our own modeling software (Multi-scale Fluid-kinetic Simulation Suite) to drive a heliospheric MHD model with ADAPT-WSA input. The modeling results using the two different approaches of density and temperature prescription suggest that the use of empirical correlations may be a more straightforward, consistent method.

  13. Practical Implications of Empirically Studying Moral Decision-Making

    Science.gov (United States)

    Heinzelmann, Nora; Ugazio, Giuseppe; Tobler, Philippe N.

    2012-01-01

    This paper considers the practical question of why people do not behave in the way they ought to behave. This question is a practical one, reaching both into the normative and descriptive domains of morality. That is, it concerns moral norms as well as empirical facts. We argue that two main problems usually keep us from acting and judging in a morally decent way: firstly, we make mistakes in moral reasoning. Secondly, even when we know how to act and judge, we still fail to meet the requirements due to personal weaknesses. This discussion naturally leads us to another question: can we narrow the gap between what people are morally required to do and what they actually do? We discuss findings from neuroscience, economics, and psychology, considering how we might bring our moral behavior better in line with moral theory. Potentially fruitful means include nudging, training, pharmacological enhancement, and brain stimulation. We conclude by raising the question of whether such methods could and should be implemented. PMID:22783157

  14. Practical implications of empirically studying moral decision-making.

    Science.gov (United States)

    Heinzelmann, Nora; Ugazio, Giuseppe; Tobler, Philippe N

    2012-01-01

    This paper considers the practical question of why people do not behave in the way they ought to behave. This question is a practical one, reaching both into the normative and descriptive domains of morality. That is, it concerns moral norms as well as empirical facts. We argue that two main problems usually keep us from acting and judging in a morally decent way: firstly, we make mistakes in moral reasoning. Secondly, even when we know how to act and judge, we still fail to meet the requirements due to personal weaknesses. This discussion naturally leads us to another question: can we narrow the gap between what people are morally required to do and what they actually do? We discuss findings from neuroscience, economics, and psychology, considering how we might bring our moral behavior better in line with moral theory. Potentially fruitful means include nudging, training, pharmacological enhancement, and brain stimulation. We conclude by raising the question of whether such methods could and should be implemented.

  15. Role of emotional intelligence in managerial effectiveness: An empirical study

    Directory of Open Access Journals (Sweden)

    Md. Sahidur Rahman

    2016-03-01

    Full Text Available Emotional intelligence is very critical to managerial effectiveness. The present study intends to explore the relationships between emotional intelligence and the three roles of managerial effectiveness: interpersonal, informational, and decisional. Emotional intelligence is measured by using the Emotional Quotient Index (Rahim et al., 2002) [Rahim, M., Psenicka, C., Polychroniou, P., Zhao, J., Yu, C., Chan, K., Susana, K., Alves, M., Lee, C., Rahman, M.S., Ferdausy, S., & Wyk, R. (2002). A model of emotional intelligence and conflict management strategies: a study in seven countries. International Journal of Organizational Analysis, 10(4), 302-326.], while managerial effectiveness is assessed by using Tsui's (1984) scale [Tsui, A.S. (1984). A role set analysis of managerial reputation. Organizational Behavior and Human Performance, 34, 64-96.]. Data were collected by distributing self-administered questionnaires among working MBA students using a convenience sampling technique. Respondents were asked to rate their emotional intelligence and managerial effectiveness scales. Finally, 127 usable responses were received and then analyzed using descriptive statistics, bivariate correlation, and regression analysis. The analysis shows that emotional intelligence was positively related to the interpersonal, informational, and decisional roles. The main implication is that emotional intelligence could enhance managerial effectiveness, guiding managers, academics, and professionals. The limitations are the sample size and the sampling technique, which might limit the generalizability of the findings. Future directions are also discussed.

  16. STANDARDIZATION OR ADAPTATION IN COSMETICS WEBSITES MARKETING? AN EMPIRICAL STUDY.

    Directory of Open Access Journals (Sweden)

    Anca Constantinescu-Dobra

    2011-06-01

    Full Text Available Websites marketing is becoming an important tool both for multinationals and SMEs in their effort to internationalize their business. This study focuses on the international opportunities that are present within the European markets. The paper aims at identifying the degree of websites marketing standardization vs. adaptation as a marketing tool for cosmetic products. Moreover, the study examines in a comparative manner the standardization strategy of multinationals and small and medium enterprises (SMEs), leaders in European markets, for different cosmetic categories. The evaluation of online advertising standardization is based on the modified Model for Testing Advertising Standardization developed by Whitelock and Chung. The websites' degree of localization is analyzed based upon 98 criteria resulting from an adapted methodology of ProfNet Institut fur Internet Marketing, Munster (Germany). The sample includes the 101 leaders in European markets. The research outcomes reflect a standardized websites marketing policy for SMEs and a localized one for multinationals. Also, for perfumes, dental care products and toiletries, European cosmetic leaders implement standardized websites marketing policies, and balanced policies for the other cosmetics categories. The hypothesis concerning a strong correlation between standardization and the handling dimension was supported.

  17. Empirical Correction to the Likelihood Ratio Statistic for Structural Equation Modeling with Many Variables.

    Science.gov (United States)

    Yuan, Ke-Hai; Tian, Yubin; Yanagihara, Hirokazu

    2015-06-01

    Survey data typically contain many variables. Structural equation modeling (SEM) is commonly used in analyzing such data. The most widely used statistic for evaluating the adequacy of an SEM model is T_ML, a slight modification to the likelihood ratio statistic. Under the normality assumption, T_ML approximately follows a chi-square distribution when the number of observations (N) is large and the number of items or variables (p) is small. However, in practice, p can be rather large while N is always limited due to not having enough participants. Even with a relatively large N, empirical results show that T_ML rejects the correct model too often when p is not too small. Various corrections to T_ML have been proposed, but they are mostly heuristic. Following the principle of the Bartlett correction, this paper proposes an empirical approach to correct T_ML so that the mean of the resulting statistic approximately equals the degrees of freedom of the nominal chi-square distribution. Results show that the empirically corrected statistics follow the nominal chi-square distribution much more closely than previously proposed corrections to T_ML, and they control type I errors reasonably well whenever N ≥ max(50, 2p). The formulations of the empirically corrected statistics are further used to predict type I errors of T_ML as reported in the literature, and they perform well.
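
    The principle described above, rescaling T_ML by an empirical factor so that its mean matches the nominal degrees of freedom, can be illustrated without any SEM software. In the sketch below the simulated T_ML replicates are synthetic stand-ins; in practice they would come from refitting the model to data generated under the fitted SEM.

```python
# Minimal sketch of a Bartlett-style empirical correction to a test statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
df = 90                                          # nominal degrees of freedom
t_ml_sim = 1.25 * rng.chisquare(df, size=2000)   # synthetic "inflated" T_ML replicates

c_hat = t_ml_sim.mean() / df                     # empirical correction factor
t_ml_observed = 130.0                            # hypothetical observed statistic
t_corrected = t_ml_observed / c_hat

print(f"correction factor: {c_hat:.3f}")
print(f"p-value before: {stats.chi2.sf(t_ml_observed, df):.3f}, "
      f"after: {stats.chi2.sf(t_corrected, df):.3f}")
```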

  18. A Longitudinal Empirical Investigation of the Pathways Model of Problem Gambling.

    Science.gov (United States)

    Allami, Youssef; Vitaro, Frank; Brendgen, Mara; Carbonneau, René; Lacourse, Éric; Tremblay, Richard E

    2017-12-01

    The pathways model of problem gambling suggests the existence of three developmental pathways to problem gambling, each differentiated by a set of predisposing biopsychosocial characteristics: behaviorally conditioned (BC), emotionally vulnerable (EV), and biologically vulnerable (BV) gamblers. This study examined the empirical validity of the Pathways Model among adolescents followed up to early adulthood. A prospective-longitudinal design was used, thus overcoming limitations of past studies that used concurrent or retrospective designs. Two samples were used: (1) a population sample of French-speaking adolescents (N = 1033) living in low socio-economic status (SES) neighborhoods from the Greater Region of Montreal (Quebec, Canada), and (2) a population sample of adolescents (N = 3017), representative of French-speaking students in Quebec. Only participants with at-risk or problem gambling by mid-adolescence or early adulthood were included in the main analysis (n = 180). Latent Profile Analyses were conducted to identify the optimal number of profiles, in accordance with participants' scores on a set of variables prescribed by the Pathways Model and measured during early adolescence: depression, anxiety, impulsivity, hyperactivity, antisocial/aggressive behavior, and drug problems. A four-profile model fit the data best. Three profiles differed from each other in ways consistent with the Pathways Model (i.e., BC, EV, and BV gamblers). A fourth profile emerged, resembling a combination of EV and BV gamblers. Four profiles of at-risk and problem gamblers were identified. Three of these profiles closely resemble those suggested by the Pathways Model.

  19. An empirical study for measuring the success index of banking industry

    Directory of Open Access Journals (Sweden)

    Mohsen Mardani

    2012-08-01

    Full Text Available Measuring organizational performance plays an important role in developing better strategic plans. In today's competitive environment, organizations strive for product quality, service delivery, reliability, and customer satisfaction. These properties are not measurable by traditional financial criteria alone, and a method that can also consider non-financial factors is needed. The present study proposes a hybrid balanced scorecard (BSC) and data envelopment analysis (DEA) method for an empirical study of the banking sector. The study proposes a model for assessing the performance of Tose`eTa`avon bank, an example of a governmental credit and financial services institution. The study determines the important factors associated with each of the four BSC components and uses the analytic hierarchy process to rank the measures. In each part of the BSC implementation, we use DEA to rank the different units of the bank, and efficient and inefficient units are determined.
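
    The DEA side of such a hybrid approach can be written as a small linear programme. The sketch below solves an input-oriented CCR efficiency score for each unit with SciPy; the input/output matrices are illustrative placeholders for BSC-derived measures, not the bank data used in the study.

```python
# Minimal sketch: input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

inputs = np.array([[20.0, 30.0, 40.0, 20.0],        # e.g. staff per unit (assumed)
                   [300.0, 200.0, 500.0, 150.0]])    # e.g. operating cost (assumed)
outputs = np.array([[100.0, 80.0, 120.0, 45.0]])     # e.g. weighted service output (assumed)

n_units = inputs.shape[1]
for j in range(n_units):
    # decision vector: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n_units + 1)
    c[0] = 1.0
    A_ub = np.vstack([
        np.hstack([-inputs[:, [j]], inputs]),                     # sum(lambda*x) <= theta*x_j
        np.hstack([np.zeros((outputs.shape[0], 1)), -outputs]),   # sum(lambda*y) >= y_j
    ])
    b_ub = np.concatenate([np.zeros(inputs.shape[0]), -outputs[:, j]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n_units + 1))
    print(f"unit {j + 1}: efficiency = {res.x[0]:.3f}")
```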

  20. Empirical Study towards the Drivers of Sustainable Economic Growth in EU-28 Countries

    Directory of Open Access Journals (Sweden)

    Daniel Ştefan Armeanu

    2017-12-01

    Full Text Available This study aims at empirically investigating the drivers of sustainable economic growth in EU-28 countries. By means of panel data regression models, in the form of fixed and random effects models, alongside the system generalized method of moments, we examine several drivers of the real gross domestic product (GDP) growth rate, as follows: higher education, business environment, infrastructure, technology, communications and media, population lifestyle, and demographic changes. As regards higher education, the empirical results show that expenditure per student in higher education and traditional 18–22-year-old students are positively linked with sustainable economic growth, whereas science and technology graduates negatively influence real GDP growth. In terms of the business environment, total expenditure on research and development and employment rates of recent graduates contribute to sustainable development, but the corruption perceptions index revealed a negative association with economic growth. The results also provide support for a negative influence of infrastructure, alongside technological measures, on economic growth. In addition, we found a negative connection between the old-age dependency ratio and sustainable economic growth.
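
    A fixed-effects (within) estimator of the kind reported above can be sketched by demeaning a country-year panel by entity; system GMM needs specialised tooling and is not shown. The panel below is synthetic and the single regressor is a placeholder for the education/R&D/infrastructure measures, so the coefficient is illustrative.

```python
# Minimal sketch: fixed-effects (within) estimation on a synthetic country-year panel.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
countries = [f"C{i}" for i in range(28)]
years = range(2005, 2016)
panel = pd.DataFrame([(c, t) for c in countries for t in years], columns=["country", "year"])
panel["rd_spend"] = rng.normal(2.0, 0.5, len(panel))
panel["gdp_growth"] = 0.8 * panel["rd_spend"] + rng.normal(0.0, 0.3, len(panel))

# within transformation: subtract each country's mean to sweep out fixed effects
demeaned = panel.groupby("country")[["gdp_growth", "rd_spend"]].transform(lambda s: s - s.mean())
beta = np.linalg.lstsq(demeaned[["rd_spend"]], demeaned["gdp_growth"], rcond=None)[0]
print(f"fixed-effects coefficient on R&D spending: {beta[0]:.3f}")
```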

  1. Corrosion-induced bond strength degradation in reinforced concrete-Analytical and empirical models

    International Nuclear Information System (INIS)

    Bhargava, Kapilesh; Ghosh, A.K.; Mori, Yasuhiro; Ramanujam, S.

    2007-01-01

    The present paper aims to investigate the relationship between bond strength and reinforcement corrosion in reinforced concrete (RC). Analytical and empirical models are proposed for the bond strength of corroded reinforcing bars. The analytical model proposed by Cairns and Abdullah [Cairns, J., Abdullah, R.B., 1996. Bond strength of black and epoxy-coated reinforcement-a theoretical approach. ACI Mater. J. 93 (4), 362-369] for splitting bond failure, and later modified by Coronelli [Coronelli, D. 2002. Corrosion cracking and bond strength modeling for corroded bars in reinforced concrete. ACI Struct. J. 99 (3), 267-276] to consider corroded bars, has been adopted. Estimation of the various parameters in this analytical model is proposed by the present authors. These parameters include the corrosion pressure due to the expansive action of corrosion products, modeling of the tensile behaviour of cracked concrete, and the adhesion and friction coefficient between the corroded bar and cracked concrete. Simple empirical models are also proposed to evaluate the reduction in bond strength as a function of reinforcement corrosion in RC specimens. These empirical models are proposed by considering a wide range of published experimental investigations related to bond degradation in RC specimens due to reinforcement corrosion. It has been found that the proposed analytical and empirical bond models are capable of providing estimates of the predicted bond strength of corroded reinforcement that are in reasonably good agreement with the experimentally observed values and with other reported published analytical and empirical predictions. An attempt has also been made to evaluate the flexural strength of RC beams with corroded reinforcement failing in bond. It has also been found that the analytical predictions for the flexural strength of RC beams based on the proposed bond degradation models are in agreement with those of the experimentally

  2. Evaluation of the existing triple point path models with new experimental data: proposal of an original empirical formulation

    Science.gov (United States)

    Boutillier, J.; Ehrhardt, L.; De Mezzo, S.; Deck, C.; Magnan, P.; Naz, P.; Willinger, R.

    2018-03-01

    With the increasing use of improvised explosive devices (IEDs), the need for better mitigation, either for building integrity or for personal security, increases in importance. Before focusing on the interaction of the shock wave with a target and the potential associated damage, knowledge must be acquired regarding the nature of the blast threat, i.e., the pressure-time history. This requirement motivates gaining further insight into the triple point (TP) path, in order to know precisely which regime the target will encounter (simple reflection or Mach reflection). Within this context, the purpose of this study is to evaluate three existing TP path empirical models, which in turn are used in other empirical models for the determination of the pressure profile. These three TP models are the empirical function of Kinney, the Unified Facilities Criteria (UFC) curves, and the model of the Natural Resources Defense Council (NRDC). As discrepancies are observed between these models, new experimental data were obtained to test their reliability and a new promising formulation is proposed for scaled heights of burst ranging from 24.6-172.9 cm/kg^{1/3}.
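
    The scaled heights of burst quoted above follow the usual Hopkinson-Cranz cube-root scaling; a minimal sketch of that conversion is shown below, with an arbitrary example charge rather than the study's actual test conditions.

        # Hopkinson-Cranz scaling used to express the height of burst (HOB) in the
        # scaled units quoted above (cm/kg^(1/3)).  Charge mass and HOB are
        # arbitrary example values, not the experimental conditions of the study.
        def scaled_height_of_burst(hob_cm, charge_kg_tnt):
            """Return the scaled HOB in cm/kg^(1/3)."""
            return hob_cm / charge_kg_tnt ** (1.0 / 3.0)

        print(scaled_height_of_burst(hob_cm=150.0, charge_kg_tnt=1.5))  # ~131 cm/kg^(1/3)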

  3. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instability.

    Science.gov (United States)

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

    Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient’s pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of patient data streams as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (“SBM”) was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology, or “QCP”), which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient’s physiologic state. This platform allows predictive analytic techniques to identify early changes in a patient’s condition that are indicative of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP in which the output of computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz), with the tamponade introduced 420 minutes into the simulation sequence. The ability of the SBM predictive analytics methodology to identify clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model estimates are seen to disagree while the
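
    A simplified stand-in for the kernel-based similarity idea described above is sketched below: a new multivariate observation is reconstructed as a similarity-weighted combination of reference vectors from a "healthy" memory set, and a large residual flags a possible state change. This is an illustrative sketch only, not the proprietary SBM algorithm or the QCP simulator.

        # Kernel-weighted reconstruction from a memory of reference observations;
        # a large residual between the observation and its reconstruction suggests
        # a departure from the modelled "normal" state.
        import numpy as np

        def reconstruct(x, memory, bandwidth=1.0):
            """Similarity-weighted estimate of observation x from reference rows."""
            d2 = np.sum((memory - x) ** 2, axis=1)       # squared distances
            w = np.exp(-d2 / (2.0 * bandwidth ** 2))     # Gaussian similarity kernel
            w = w / w.sum()
            return memory.T @ w                          # weighted reconstruction

        rng = np.random.default_rng(0)
        memory = rng.normal(size=(200, 4))   # 200 reference vectors, 4 signals
        x_normal = rng.normal(size=4)
        x_shifted = x_normal + 3.0           # simulated deterioration

        for x in (x_normal, x_shifted):
            residual = np.linalg.norm(x - reconstruct(x, memory))
            print(f"residual = {residual:.2f}")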

  4. An empirical investigation on the forecasting ability of mallows model averaging in a macro economic environment

    Science.gov (United States)

    Yin, Yip Chee; Hock-Eam, Lim

    2012-09-01

    This paper investigates the forecasting ability of Mallows Model Averaging (MMA) by conducting an empirical analysis of the GDP growth rates of five Asian countries: Malaysia, Thailand, the Philippines, Indonesia and China. The results reveal that MMA shows no noticeable difference in predictive ability compared to the general autoregressive fractionally integrated moving average (ARFIMA) model, and that its predictive ability is sensitive to the effect of financial crises. MMA could be an alternative forecasting method for samples without recent outliers such as financial crises.
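
    The sketch below illustrates the core of Mallows Model Averaging on simulated data: weights over nested AR(p) candidates are chosen on the unit simplex to minimise the Mallows criterion. It is a toy illustration under stated simplifications, not the paper's forecasting setup.

        # Mallows Model Averaging over nested AR(p) candidates on a simulated
        # series: minimise C(w) = ||y - F w||^2 + 2*sigma^2*(k'w) over the simplex.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        n_obs, max_p = 120, 4
        e = rng.normal(size=n_obs)
        y = np.zeros(n_obs)
        for t in range(1, n_obs):                     # simulate an AR(1) series
            y[t] = 0.6 * y[t - 1] + e[t]

        target = y[max_p:]                            # common estimation sample

        def ar_fitted(p):
            """In-sample OLS fitted values of an AR(p), aligned with `target`."""
            X = np.column_stack([y[max_p - j:n_obs - j] for j in range(1, p + 1)])
            beta, *_ = np.linalg.lstsq(X, target, rcond=None)
            return X @ beta

        fits = np.column_stack([ar_fitted(p) for p in range(1, max_p + 1)])
        k = np.arange(1, max_p + 1)                   # parameters per candidate
        sigma2 = np.mean((target - fits[:, -1]) ** 2) # variance from largest model

        def mallows(w):
            return np.sum((target - fits @ w) ** 2) + 2.0 * sigma2 * (k @ w)

        res = minimize(mallows, x0=np.full(max_p, 1.0 / max_p),
                       bounds=[(0.0, 1.0)] * max_p,
                       constraints=({"type": "eq",
                                     "fun": lambda w: w.sum() - 1.0},))
        print("MMA weights:", np.round(res.x, 3))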

  5. Autonomous e-coaching in the wild: Empirical validation of a model-based reasoning system

    OpenAIRE

    Kamphorst, B.A.; Klein, M.C.A.; van Wissen, A.

    2014-01-01

    Autonomous e-coaching systems have the potential to improve people's health behaviors on a large scale. The intelligent behavior change support system eMate exploits a model of the human agent to support individuals in adopting a healthy lifestyle. The system attempts to identify the causes of a person's non-adherence by reasoning over a computational model (COMBI) that is based on established psychological theories of behavior change. The present work presents an extensive, monthlong empiric...

  6. Correcting the bias of empirical frequency parameter estimators in codon models.

    Directory of Open Access Journals (Sweden)

    Sergei Kosakovsky Pond

    2010-07-01

    Full Text Available Markov models of codon substitution are powerful inferential tools for studying biological processes such as natural selection and preferences in amino acid substitution. The equilibrium character distributions of these models are almost always estimated using nucleotide frequencies observed in a sequence alignment, primarily as a matter of historical convention. In this note, we demonstrate that a popular class of such estimators is biased, and that this bias has an adverse effect on goodness of fit and on estimates of substitution rates. We propose a "corrected" empirical estimator that begins with observed nucleotide counts, but accounts for the nucleotide composition of stop codons. We show via simulation that the corrected estimates outperform the de facto standard estimates not just by providing better estimates of the frequencies themselves, but also by leading to improved estimation of other parameters in the evolutionary models. On a curated collection of sequence alignments, our estimators show a significant improvement in goodness of fit compared to the standard approach. Maximum likelihood estimation of the frequency parameters appears to be warranted in many cases, albeit at a greater computational cost. Our results demonstrate that there is little justification, either statistical or computational, for continued use of the conventional count-based estimators.
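
    The sketch below illustrates the underlying bookkeeping: position-specific nucleotide frequencies imply codon frequencies (an F3x4-style product), and stop codons must be handled when normalising over sense codons. The simple renormalisation shown is only a proxy for the issue; the paper's corrected estimator accounts for the nucleotide composition of stop codons rather than merely rescaling.

        # Codon frequencies implied by position-specific nucleotide frequencies,
        # with stop codons excluded and the remaining sense codons renormalised.
        from itertools import product

        STOP = {"TAA", "TAG", "TGA"}       # universal genetic code stop codons

        def codon_frequencies(pos_freqs):
            """pos_freqs: three dicts {A,C,G,T -> freq}, one per codon position."""
            freqs = {}
            for n1, n2, n3 in product("ACGT", repeat=3):
                codon = n1 + n2 + n3
                if codon in STOP:
                    continue
                freqs[codon] = pos_freqs[0][n1] * pos_freqs[1][n2] * pos_freqs[2][n3]
            total = sum(freqs.values())    # < 1 because stop codons were dropped
            return {c: f / total for c, f in freqs.items()}

        uniform = {b: 0.25 for b in "ACGT"}
        freqs = codon_frequencies([uniform, uniform, uniform])
        print(len(freqs), "sense codons, sum =", round(sum(freqs.values()), 6))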

  7. Impact of Disturbing Factors on Cooperation in Logistics Outsourcing Performance: The Empirical Model

    Directory of Open Access Journals (Sweden)

    Andreja Križman

    2010-05-01

    Full Text Available The purpose of this paper is to present the results of a study, conducted in the Slovene logistics market, of conflicts and opportunism as disturbing factors and of their impact on cooperation in logistics outsourcing performance. Relationship variables are proposed that directly or indirectly affect logistics performance, and the hypotheses are conceptualized on the basis of causal linkages among the constructs. On the basis of the extant literature and new argumentation derived from in-depth interviews with logistics experts, including providers and customers, the measurement and structural models are empirically analyzed. Existing measurement scales for the constructs are slightly modified for this analysis. Purification testing and measurement of validity and reliability are performed. Multivariate statistical methods are utilized and the hypotheses are tested. The results show that conflicts have a significantly negative impact on cooperation between customers and logistics service providers (LSPs), while opportunism does not play an important role in these relationships. The observed antecedents of logistics outsourcing performance in the model account for 58.4% of the variance of goal achievement and 36.5% of the variance of the exceeded goal. KEYWORDS: logistics outsourcing performance; logistics customer–provider relationships; conflicts and cooperation in logistics outsourcing; PLS path modelling

  8. Testing an empirically derived mental health training model featuring small groups, distributed practice and patient discussion.

    Science.gov (United States)

    Murrihy, Rachael C; Byrne, Mitchell K; Gonsalvez, Craig J

    2009-02-01

    Internationally, family doctors seeking to enhance their skills in evidence-based mental health treatment are attending brief training workshops, despite clear evidence in the literature that short-term, massed formats are unlikely to improve skills in this complex area. Reviews of the educational literature suggest that an optimal model of training would incorporate distributed practice techniques: repeated practice over a lengthy time period, small-group interactive learning, mentoring relationships, skills-based training and ongoing discussion of actual patients. This study investigates the potential role of group-based training incorporating multiple aspects of good pedagogy for training doctors in basic competencies in brief cognitive behaviour therapy (BCBT). Six groups of family doctors (n = 32) completed eight 2-hour sessions of BCBT group training over a 6-month period. A baseline control design was utilised with pre- and post-training measures of doctors' BCBT skills, knowledge and engagement in BCBT treatment. Family doctors' knowledge of, skills in and actual use of BCBT with patients improved significantly over the course of training compared with the control period. This research demonstrates preliminary support for the efficacy of an empirically derived group training model for family doctors. Brief CBT group-based training could prove to be an effective and viable model for future doctor training.

  9. Longitudinal hopping in intervehicle communication: Theory and simulations on modeled and empirical trajectory data

    Science.gov (United States)

    Thiemann, Christian; Treiber, Martin; Kesting, Arne

    2008-09-01

    Intervehicle communication enables vehicles to exchange messages within a limited broadcast range and thus to self-organize into dynamic, geographically embedded wireless ad hoc networks. We study the longitudinal hopping mode, in which messages are transported using equipped vehicles driving in the same direction as relays. Given a finite communication range, we investigate the conditions under which messages can percolate through the network, i.e., a linked chain of relay vehicles exists between the sender and receiver. We simulate message propagation in different traffic scenarios and for different fractions of equipped vehicles. Simulations are done with both modeled and empirical traffic data. These results are used to test the limits of applicability of an analytical model assuming a Poissonian distance distribution between the relays. We found good agreement for homogeneous traffic scenarios and sufficiently low percentages of equipped vehicles. For higher percentages, the observed connectivity was higher than that of the model, while in stop-and-go traffic situations it was lower. We explain these results in terms of correlations of the distances between the relay vehicles. Finally, we introduce variable transmission ranges and find that this additional stochastic component generally increases connectivity compared to a deterministic transmission with the same mean.
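
    A small Monte-Carlo sketch of the percolation question studied above is given below: equipped vehicles are placed as a Poisson process along the road, and a message is considered delivered if no gap between consecutive relays exceeds the broadcast range. The densities and range are illustrative values, not those of the empirical data sets.

        # Monte-Carlo estimate of the probability that a linked relay chain spans
        # the sender-receiver distance when equipped vehicles form a Poisson
        # process with density lambda_eq (relays per metre).
        import numpy as np

        rng = np.random.default_rng(42)

        def connectivity_prob(distance_m, lambda_eq, r, trials=10_000):
            """Fraction of trials in which every relay gap is at most r metres."""
            hits = 0
            for _ in range(trials):
                n = rng.poisson(lambda_eq * distance_m)
                relays = np.sort(rng.uniform(0.0, distance_m, size=n))
                points = np.concatenate(([0.0], relays, [distance_m]))  # sender ... receiver
                if np.all(np.diff(points) <= r):
                    hits += 1
            return hits / trials

        # e.g. 10% equipped vehicles at 20 vehicles/km -> 0.002 relays per metre
        print(connectivity_prob(distance_m=2000.0, lambda_eq=0.002, r=500.0))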

  10. Empirical Studies on the Use of Social Software in Global Software Development - a Systematic Mapping Study

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2013-01-01

    While empirical studies on the usage of SoSo are available in related fields, there exists no comprehensive overview of what has been investigated to date across them. Objective: The aim of this review is to map empirical studies on the usage of SoSo in Software Engineering projects and in distributed teams...... for collaborative work, fostering awareness, knowledge management and coordination among team members. Contrary to the evident high importance of the social aspects offered by SoSo, socialization is not the most important usage reported. Conclusions: This review reports how SoSo is used in GSD and how it is capable...... of supporting GSD teams. Four emerging themes in global software engineering were identified: the appropriation and development of usage structures; understanding how an ecology of communication channels and tools is used by teams; the role played by SoSo either as a subtext or as an explicit goal; and finally...

  11. A MACROPRUDENTIAL SUPERVISION MODEL. EMPIRICAL EVIDENCE FROM THE CENTRAL AND EASTERN EUROPEAN BANKING SYSTEM

    Directory of Open Access Journals (Sweden)

    Trenca Ioan

    2013-07-01

    Full Text Available One of the positive effects of the financial crises is the increasing concern of supervisors with the financial system’s stability. There is a need to strengthen the links between the different components of the financial system and the macroeconomic environment. Banking systems with adequate capitalization and liquidity levels may withstand economic and financial shocks more easily. The purpose of this empirical study is to identify the main determinants of banking system stability and soundness in the Central and Eastern European countries. We assess the impact of different macroeconomic variables on the quality of capital and on liquidity conditions, and we examine the behaviour of these financial stability indicators by analyzing a sample of 10 banking systems during 2000-2011. The availability of banking capital signals the banking system’s resiliency to shocks. The capital adequacy ratio is the main indicator used to assess banking fragility. One of the causes of the 2008-2009 financial crisis was the lack of liquidity in the banking system, which led to the collapse of several banking institutions and to macroeconomic imbalances. Given the importance of liquidity for the banking system, we propose several models in order to determine the macroeconomic variables that have a significant influence on the ratio of liquid reserves to total assets. We find evidence that GDP growth, inflation, domestic credit to the private sector, as well as the money and quasi-money aggregate indicator have a significant impact on banking stability. The empirical regressions confirm the high level of interdependence between the real sector and the financial-banking sector. They also demonstrate the necessity of effective macroprudential supervision at the country level, which enables the supervisory authorities to maintain adequate control over the macroprudential indicators and to take appropriate decisions at the right time.

  12. Traditional Arabic & Islamic medicine: validation and empirical assessment of a conceptual model in Qatar.

    Science.gov (United States)

    AlRawi, Sara N; Khidir, Amal; Elnashar, Maha S; Abdelrahim, Huda A; Killawi, Amal K; Hammoud, Maya M; Fetters, Michael D

    2017-03-14

    Evidence indicates traditional medicine is no longer used only for the healthcare of the poor; its prevalence is also increasing in countries where allopathic medicine is predominant in the healthcare system. While these healing practices have been utilized for thousands of years in the Arabian Gulf, only recently has a theoretical model been developed illustrating the linkages and components of such practices, articulated as Traditional Arabic & Islamic Medicine (TAIM). Despite previous theoretical work presenting the development of the TAIM model, empirical support has been lacking. The objective of this research is to provide empirical support for the TAIM model and illustrate its real-world applicability. Using an ethnographic approach, we recruited 84 individuals (43 women and 41 men) who were speakers of one of four common languages in Qatar: Arabic, English, Hindi, and Urdu. Through in-depth interviews, we sought confirming and disconfirming evidence of the model components, namely, health practices, beliefs and philosophy to treat, diagnose, and prevent illnesses and/or maintain well-being, as well as patterns of communication about their TAIM practices with their allopathic providers. Based on our analysis, we find empirical support for all elements of the TAIM model. Participants in this research, visitors to major healthcare centers, mentioned using all elements of the TAIM model: herbal medicines, spiritual therapies, dietary practices, mind-body methods, and manual techniques, applied singly or in combination. Participants had varying levels of comfort sharing information about TAIM practices with allopathic practitioners. These findings confirm an empirical basis for the elements of the TAIM model. Three elements, namely, spiritual healing, herbal medicine, and dietary practices, were most commonly found. Future research should examine the prevalence of TAIM element use, how it differs among various populations, and its impact on health.

  13. An empirical study on primary school teachers’ attitudes towards inclusive education in Jakarta, Indonesia

    NARCIS (Netherlands)

    Kurniawati, Farida; Minnaert, A.E.M.G.; Mangunsong, F.; Ahmed, W.

    2012-01-01

    Empirical research revealed that teachers’ attitudes play a crucial role in successful implementation of inclusive education. This study aimed to examine primary school teachers’ attitudes towards inclusive education in Jakarta, Indonesia. Respondents completed the attitude scale which comprised the

  14. An empirical model of the topside plasma density around 600 km based on ROCSAT-1 and Hinotori observations

    Science.gov (United States)

    Huang, He; Chen, Yiding; Liu, Libo; Le, Huijun; Wan, Weixing

    2015-05-01

    Improving the ability of ionospheric empirical models to reproduce plasma density variations in the topside ionosphere more precisely is an urgent task. Based on Republic of China Satellite 1 (ROCSAT-1) observations, we developed a new empirical model of the topside plasma density around 600 km under relatively quiet geomagnetic conditions. The model reproduces the ROCSAT-1 plasma density observations with a root-mean-square error of 0.125 in units of lg(Ni(cm-3)) and reasonably describes the temporal and spatial variations of plasma density at altitudes in the range from 550 to 660 km. The model results are also in good agreement with observations from Hinotori, the Coupled Ion-Neutral Dynamics Investigations/Communications/Navigation Outage Forecasting System satellites, and the incoherent scatter radar at Arecibo. Further, we combined ROCSAT-1 and Hinotori data to improve the ROCSAT-1 model and built a new model (the R&H model) after the consistency between the two data sets had been confirmed with the original ROCSAT-1 model. In particular, we studied the solar activity dependence of the topside plasma density at a fixed altitude with the R&H model and find that it differs slightly from the case in which the evolution of the orbit altitude is ignored. In addition, the R&H model shows the merging of the two crests of the equatorial ionization anomaly above the F2 peak, while the IRI_Nq topside option always produces two separate crests in this range of altitudes.
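
    As an illustration of how an empirical topside-density model of this kind is typically constructed, the sketch below fits log10(Ni) by least squares to a few plausible basis functions (diurnal harmonics, latitude, a solar-activity proxy) on synthetic data and reports the fit quality as an RMSE in lg(Ni), the same metric quoted above. The basis functions and data are assumptions for illustration, not the ROCSAT-1/Hinotori model.

        # Least-squares fit of log10(Ni) to simple basis functions on synthetic
        # data, with the fit quality reported as an RMSE in lg(Ni).
        import numpy as np

        rng = np.random.default_rng(7)
        n = 5000
        lt = rng.uniform(0, 24, n)          # local time [h]
        lat = rng.uniform(-60, 60, n)       # magnetic latitude [deg]
        f107 = rng.uniform(70, 200, n)      # solar-activity proxy

        # synthetic "observations" of log10(Ni) with noise
        log_ni = (5.2 + 0.4 * np.cos(2 * np.pi * (lt - 14) / 24)
                  - 0.002 * np.abs(lat) + 0.003 * f107 + rng.normal(0, 0.12, n))

        # design matrix: constant, diurnal harmonics, |lat|, solar proxy
        X = np.column_stack([np.ones(n),
                             np.cos(2 * np.pi * lt / 24),
                             np.sin(2 * np.pi * lt / 24),
                             np.abs(lat), f107])
        coef, *_ = np.linalg.lstsq(X, log_ni, rcond=None)
        rmse = np.sqrt(np.mean((log_ni - X @ coef) ** 2))
        print(f"RMSE = {rmse:.3f} in lg(Ni)")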

  15. Comparison of ITER performance predicted by semi-empirical and theory-based transport models

    International Nuclear Information System (INIS)

    Mukhovatov, V.; Shimomura, Y.; Polevoi, A.

    2003-01-01

    The values of Q=(fusion power)/(auxiliary heating power) predicted for ITER by three different methods, i.e., a transport model based on empirical confinement scaling, a dimensionless scaling technique, and theory-based transport models, are compared. The energy confinement time given by the ITERH-98(y,2) scaling for an inductive scenario with a plasma current of 15 MA and a plasma density 15% below the Greenwald value is 3.6 s, with one technical standard deviation of ±14%. These data translate into a Q interval of [7-13] at an auxiliary heating power P_aux = 40 MW and [7-28] at the minimum heating power satisfying a good-confinement ELMy H-mode. Predictions of dimensionless scalings and theory-based transport models such as Weiland, MMM and IFS/PPPL overlap with the empirical scaling predictions within the margins of uncertainty. (author)
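
    For orientation, the sketch below evaluates the ITERH-98(y,2) (IPB98(y,2)) confinement scaling in its commonly quoted form, together with the plain definition Q = P_fus/P_aux; the ITER-like input values (density, loss power, fusion power) are illustrative assumptions rather than the scenario parameters used in the paper.

        # IPB98(y,2) thermal confinement scaling (coefficients as commonly quoted)
        # and the definition Q = P_fus / P_aux.  Input values are illustrative.
        def tau_ipb98y2(Ip_MA, Bt_T, n19, P_MW, R_m, eps, kappa, M):
            """Thermal energy confinement time [s] from the IPB98(y,2) scaling."""
            return (0.0562 * Ip_MA**0.93 * Bt_T**0.15 * n19**0.41 * P_MW**-0.69
                    * R_m**1.97 * eps**0.58 * kappa**0.78 * M**0.19)

        tau = tau_ipb98y2(Ip_MA=15.0, Bt_T=5.3, n19=10.0, P_MW=87.0,
                          R_m=6.2, eps=2.0 / 6.2, kappa=1.7, M=2.5)
        print(f"tau_E ~ {tau:.2f} s (one standard deviation ~ +/-14%)")

        P_fus, P_aux = 400.0, 40.0   # MW; P_fus is an illustrative value
        print(f"Q = P_fus / P_aux = {P_fus / P_aux:.0f}")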

  16. Attachment-based family therapy for depressed and suicidal adolescents: theory, clinical model and empirical support.

    Science.gov (United States)

    Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne

    2015-01-01

    Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support.

  17. Cycling empirical antibiotic therapy in hospitals: meta-analysis and models.

    Directory of Open Access Journals (Sweden)

    Pia Abel zur Wiesch

    2014-06-01

    Full Text Available The rise of resistance together with the shortage of new broad-spectrum antibiotics underlines the urgency of optimizing the use of available drugs to minimize disease burden. Theoretical studies suggest that coordinating empirical usage of antibiotics in a hospital ward can contain the spread of resistance. However, theoretical and clinical studies have come to different conclusions regarding the usefulness of rotating first-line therapy (cycling. Here, we performed a quantitative pathogen-specific meta-analysis of clinical studies comparing cycling to standard practice. We searched PubMed and Google Scholar and identified 46 clinical studies addressing the effect of cycling on nosocomial infections, of which 11 met our selection criteria. We employed a method for multivariate meta-analysis using incidence rates as endpoints and found that cycling reduced the incidence rate per 1000 patient days of total infections by 4.95 [9.43-0.48] and of resistant infections by 7.2 [14.00-0.44]. This positive effect was observed in most pathogens despite a large variance between individual species. Our findings remain robust in uni- and multivariate metaregressions. We used theoretical models that reflect various infections and hospital settings to compare cycling to random assignment to different drugs (mixing. We make the realistic assumption that therapy is changed when first-line treatment is ineffective, which we call "adjustable cycling/mixing". In concordance with earlier theoretical studies, we find that in strict regimens, cycling is detrimental. However, in adjustable regimens single resistance is suppressed and cycling is successful in most settings. Both a meta-regression and our theoretical model indicate that "adjustable cycling" is especially useful to suppress the emergence of multiple resistance. While our model predicts that cycling periods of one month perform well, we expect that overly long cycling periods are detrimental. Our results suggest that
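
    A simplified, univariate version of the pooling step described above is sketched below: study-level incidence-rate differences (cycling minus control, per 1000 patient days) are combined by inverse-variance weighting. The study-level numbers are invented for illustration; the paper used a multivariate, pathogen-specific method.

        # Inverse-variance (fixed-effect) pooling of study-level incidence-rate
        # differences with a 95% confidence interval.  Effects and standard
        # errors below are made-up illustrative values.
        import numpy as np

        effects = np.array([-6.1, -3.2, -8.0, -4.5, -2.0])   # rate differences
        ses = np.array([2.5, 1.8, 3.1, 2.2, 1.6])            # standard errors

        w = 1.0 / ses**2
        pooled = np.sum(w * effects) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print(f"pooled difference = {pooled:.2f} [{lo:.2f}, {hi:.2f}] per 1000 patient days")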

  18. Choice of Foreign Market Entry Mode - Cognitions from Empirical and Theoretical Studies

    OpenAIRE

    Zhao, Xuemin; Decker, Reinhold

    2004-01-01

    This paper critically analyzes five basic theories of market entry mode decisions with respect to their strengths and weaknesses and the results of the corresponding empirical studies. Starting from contradictions both in the theories and in the empirical studies dealing with the entry mode choice problem, we motivate a significant need for further research in this important area of international marketing. Furthermore, we provide implications for managers in practice and outline emerging trends in market entr...

  19. Empirical Studies on English Vocabulary Learning Strategies in Mainland China over the Past Two Decades

    OpenAIRE

    Zhongxin Dai; Yao Zhou

    2015-01-01

    Wen and Wang (2004) reviewed the empirical studies over the past two decades (from 1984 to 2003) on learning strategies that Chinese EFL learners used. This article, following their methodological framework, reviews about 45 empirical studies on Chinese EFL learners’ English vocabulary learning strategies, conducted by Mainland Chinese scholars over the past two decades. The review shows that more than half of the Chinese scholars are interested in questionnaire investigation of EFL learners’...

  20. Empirical Studies on Legitimation Strategies: A Case for International Business Research Extension

    DEFF Research Database (Denmark)

    Turcan, Romeo V.; Marinova, Svetla Trifonova; Rana, Mohammad Bakhtiar

    2012-01-01

    The paper focuses on legitimation and legitimation strategies applied by companies. Following the process of systematic review, we analyze empirical studies exploring legitimation and legitimation strategies from different theoretical perspectives. Using the key findings, by reconnoitering and comparing the theoretical background, approaches, methodologies, and findings of these empirical studies, we outline potential directions for research in the legitimation strategies of firms engaged in international business operations....