WorldWideScience

Sample records for modelled deltar values

  1. Cartilage quality in rheumatoid arthritis: comparison of T2* mapping, native T1 mapping, dGEMRIC, ΔR1 and value of pre-contrast imaging

    Energy Technology Data Exchange (ETDEWEB)

    Buchbender, Christian; Scherer, Axel; Kroepil, Patric; Quentin, Michael; Reichelt, Dorothea C.; Lanzman, Rotem S.; Mathys, Christian; Blondin, Dirk; Wittsack, Hans-Joerg; Antoch, Gerald; Miese, Falk [University Duesseldorf, Department of Diagnostic and Interventional Radiology, Medical Faculty, Duesseldorf (Germany); Koerbl, Birthe [Heinrich-Heine-University, Department of Endocrinology, Diabetology and Rheumatology, Medical Faculty, Duesseldorf (Germany); Heinrich-Heine-University, Leibniz Centre for Diabetes Research, Institute of Biometrics and Epidemiology, German Diabetes Centre, Duesseldorf (Germany); Bittersohl, Bernd; Zilkens, Christoph [Heinrich-Heine-University, Department of Orthopaedics, Medical Faculty, Duesseldorf (Germany); Hofer, Matthias [Heinrich-Heine-University, Medical Education Group, Medical School, Duesseldorf (Germany); Schneider, Matthias; Ostendorf, Benedikt [Heinrich-Heine-University, Department of Endocrinology, Diabetology and Rheumatology, Medical Faculty, Duesseldorf (Germany)

    2012-06-15

    To prospectively evaluate four non-invasive markers of cartilage quality - T2* mapping, native T1 mapping, dGEMRIC and ΔR1 - in healthy volunteers and rheumatoid arthritis (RA) patients. Cartilage of the metacarpophalangeal (MCP) joint II was imaged in 28 consecutive subjects: 12 healthy volunteers [9 women, mean (SD) age 52.67 (9.75) years, range 30-66] and 16 RA patients with MCP II involvement [12 women, mean (SD) age 58.06 (12.88) years, range 35-76]. Sagittal T2* mapping was performed with a multi-echo gradient-echo sequence on a 3 T MRI scanner. For T1 mapping, the dual flip angle method was applied prior to gadolinium application (native T1 mapping) and 40 min after gadolinium application (delayed gadolinium-enhanced MRI of cartilage, dGEMRIC, T1Gd). The difference in the longitudinal relaxation rate induced by gadolinium (ΔR1) was calculated. The area under the receiver operating characteristic curve (AUC) was used to test for differentiation of RA patients from healthy volunteers. dGEMRIC (AUC 0.81) and ΔR1 (AUC 0.75) significantly differentiated RA patients from controls. T2* mapping (AUC 0.66) and native T1 mapping (AUC 0.66) were not significantly different in RA patients compared to controls. The data support the use of dGEMRIC for the assessment of MCP joint cartilage quality in RA. T2* and native T1 mapping are of low diagnostic value. Pre-contrast T1 mapping for the calculation of ΔR1 does not increase the diagnostic value of dGEMRIC. (orig.)
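
    For orientation (this relation is standard in dGEMRIC-type studies and is not quoted from the abstract itself): the gadolinium-induced change in the longitudinal relaxation rate is the difference of the reciprocal T1 values measured after and before contrast,

        $\Delta R1 = R1_{post} - R1_{pre} = 1/T1_{Gd} - 1/T1_{0}$

    so that, broadly, stronger gadolinium uptake in glycosaminoglycan-depleted cartilage lowers T1Gd and raises ΔR1.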

  2. Demonstrator Flood Control Room: an inventory of the requirements of the various Deltares units and a design based on them

    NARCIS (Netherlands)

    Boertjens, G.J.; Attema-van Waas, A.R.; Guikema, M.; Schilder, C.M.C.; Veen, M.J. van der

    2009-01-01

    Based on the research carried out, TNO draws the following conclusions: • The existing room that Deltares has in mind for realising the training facility is small. A first phase of the desired Flood Control Room can be realised in this room, with the proviso that not all of the identified…

  3. [Ferric iron absorption in ΔrpfFXoo, a gene deletion mutant of Xanthomonas oryzae pv. oryzae, assayed using atomic absorption spectrophotometry].

    Science.gov (United States)

    Sun, Lei; Wu, Mao-Sen; He, Chen-Yang

    2010-04-01

    Ferric iron absorption is one of the most important limiting factors of bacterial growth in Xanthomonas oryzae pv. oryzae (Xoo). It has previously been speculated that rpfFXoo might be involved in the ferric iron metabolism of the pathogen. In the present study, ΔrpfFXoo, a gene deletion mutant, was generated from the wild-type strain PXO99A of Xoo through homologous recombination, and Fe content was assayed using flame atomic absorption spectrophotometry in PXO99A and ΔrpfFXoo. The results indicated that the recovery was 99.7% and the relative standard deviation was 1.89 under optimized AAS operating conditions. Fe absorption in both PXO99A and ΔrpfFXoo increased over time. However, the ferric content of ΔrpfFXoo was significantly lower than that of PXO99A (P < 0.05). It is suggested that rpfFXoo is involved in iron metabolism in Xanthomonas oryzae pv. oryzae.

  4. Combined constraints on the SUSY parameter space from $\Delta r$ and Higgs boson search

    CERN Document Server

    Chankowski, P H

    1994-01-01

    Combining the constraints coming from the $M_W$ measurements and the unsuccessful search for the Higgs boson at LEP, we determine, in the framework of the MSSM, the allowed mass regions for the lighter scalar partner of the top quark. For a heavy top quark, particularly strong bounds are obtained for low values of $\tan\beta \equiv v_2/v_1$ and a light bottom squark.

  5. Brand Value - Proposed Model Danrise

    Directory of Open Access Journals (Sweden)

    Daniel Nascimento Pereira da Silva

    2011-12-01

    Full Text Available Brands have come to dominate the strategies of enterprises because they are able to generate feelings, sensations and emotions in their clients. These values, value for the enterprises and for the brands themselves, are not measurable. A strong brand configures itself as the highest representative of an enterprise, and the brand is regarded as an asset of the enterprise. The evolution of a brand, as an intangible and strategic asset, becomes ever more vital for enterprises as a way of maximizing results. This need, whether of the market or of the enterprises, justifies directing the research towards this vector, the value of the brand. The main objective of the research is to present a new model of brand evaluation. This model is supported by a tangible aspect and an intangible aspect; the intangible aspect evaluates the knowledge and capacity of managers and workers to build a brand with value through the correct ordering of the priorities of the dimensions of the proposed model. The model was tested on the brand 'Blue Rise'.

  6. Multifractal Value at Risk model

    Science.gov (United States)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR) model. The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR can provide more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.

  7. Achieving Value in Primary Care: The Primary Care Value Model.

    Science.gov (United States)

    Rollow, William; Cucchiara, Peter

    2016-03-01

    The patient-centered medical home (PCMH) model provides a compelling vision for primary care transformation, but studies of its impact have used insufficiently patient-centered metrics with inconsistent results. We propose a framework for defining patient-centered value and a new model for value-based primary care transformation: the primary care value model (PCVM). We advocate for use of patient-centered value when measuring the impact of primary care transformation, recognition, and performance-based payment; for financial support and research and development to better define primary care value-creating activities and their implementation; and for use of the model to support primary care organizations in transformation.

  8. What's the Value of VAM (Value-Added Modeling)?

    Science.gov (United States)

    Scherrer, Jimmy

    2012-01-01

    The use of value-added modeling (VAM) in school accountability is expanding, but deciding how to embrace VAM is difficult. Various experts say it's too unreliable, causes more harm than good, and has a big margin for error. Others assert VAM is imperfect but useful, and provides valuable feedback. A closer look at the models, and their use,…

  9. Modelling dune erosion, overwash and inundation of barrier islands

    NARCIS (Netherlands)

    Hoonhout, B.; Thiel de Vries, J.S.M.

    2012-01-01

    Physical model experiments are performed at Deltares to investigate the morphological response of barrier islands to extreme storm events. The experiments included dune erosion, overwash and inundation regimes. Extensive measurement techniques made detailed comparison with numerical models possible.

  10. Gap Model for Dual Customer Values

    Institute of Scientific and Technical Information of China (English)

    HOU Lun; TANG Xiaowo

    2008-01-01

    The customer value, the key problem in customer relationship management (CRM), was studied to construct a gap model for dual customer values. A basic description of customer values is given, and then the gaps between products and services in different periods for the customers and companies are analyzed based on the product or service life-cycle. The main factors that influence the perceived customer value were analyzed to define the "recognized value gap" and a gap model for the dual customer values was constructed to supply companies with a tool to analyze existing customer value gaps and improve customer relationship management.

  11. Value Modeling for Enterprise Resilience

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Dale L.; Lancaster, Mary J.

    2015-10-20

    The idea that resilience is a tangible, measurable, and desirable system attribute has grown rapidly over the last decade, beyond its origins in explaining ecological, physiological, psychological, and social systems. Operational enterprise resilience requires two types of measurement. First, the system must monitor various operational conditions in order to respond to disruptions. These measurements are part of one or more observation, orientation, decision, and action (OODA) loops. The OODA control processes that implement a resilience strategy use these measurements to provide robustness, rapid recovery and reconstitution. In order to assess the effectiveness of the resilience strategy, a different class of measurements is necessary. This second type consists of measurements of how well the OODA processes cover critical enterprise functions and the hazards to which the enterprise is exposed. They allow assessment of how well enterprise management processes anticipate, mitigate, and adapt to a changing environment and the degree to which the system is fault tolerant. This paper nominates a theoretical framework, in the form of definitions, a model, and a syntax, that accounts for this important distinction, and in so doing provides a mechanism for bridging resilience management process models and the many proposed cyber-defense metric enumerations.

  12. MODELING A VALUE CHAIN IN PUBLIC SECTOR

    Directory of Open Access Journals (Sweden)

    Daiva Rapcevičienė

    2014-08-01

    Full Text Available Purpose – Over the past three decades, comprehensive insights have been developed on how to design and manage the value chain. Many scholars discuss the differences between the private sector value chain, which creates profit for the business, and the public sector value chain, the approach in which the public sector creates value through the services it provides. However, there is a lack of a common understanding of what the public sector value chain is in general. This paper reviews the literature on how the private value chain was transformed into a public value chain, and reviews the definition and architecture of a value chain in the public sector, which gives a structural approach to the bigger picture of how the whole structure works. It reviews the view that the value chain for the public sector shows how the public sector organizes itself to ensure it is of value to the citizens. Design/methodology/approach – descriptive method, analysis of scientific literature. Findings – The public sector value chain is an adaptation of the private sector value chain. The difference between the two is that the customer is the focus in the public sector context, versus the profit focus in the private sector context. There are significant similarities between the two chain models. Each of the chain models is founded on a series of core components. For the public sector context, the core components are people, service and trust. Research limitations/implications – this paper is based on presenting the value chain for both private and public sectors and providing deeper knowledge of the public sector value chain model. Practical implications – comprehension of the general value chain model concept and the public sector value chain model helps to see multiple connections throughout the entire process, from beginning to end. The paper presents the theoretical framework for further study of the creation of a value chain model for waste management. Originality/Value – The paper reveals the systematic

  13. Predictability of extreme values in geophysical models

    NARCIS (Netherlands)

    Sterk, A.E.; Holland, M.P.; Rabassa, P.; Broer, H.W.; Vitolo, R.

    2012-01-01

    Extreme value theory in deterministic systems is concerned with unlikely large (or small) values of an observable evaluated along evolutions of the system. In this paper we study the finite-time predictability of extreme values, such as convection, energy, and wind speeds, in three geophysical models.

  14. Modeling Business Strategy: A Consumer Value Perspective

    OpenAIRE

    Svee, Eric-Oluf; Giannoulis, Constantinos; Zdravkovic, Jelena

    2011-01-01

    Business strategy lays out the plan of an enterprise to achieve its vision by providing value to its customers. Typically, business strategy focuses on economic value and its relevant exchanges with customers and does not directly address consumer values. However, consumer values drive customers’ choices and decisions to use a product or service, and therefore should have a direct impact on business strategy. This paper explores whether and h...

  15. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their specification of the conditional variance, conditional correlation, innovation distribution, and estimation approach. In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.

  16. A Binomial Integer-Valued ARCH Model.

    Science.gov (United States)

    Ristić, Miroslav M; Weiß, Christian H; Janjić, Ana D

    2016-11-01

    We present an integer-valued ARCH model which can be used for modeling time series of counts with under-, equi-, or overdispersion. The introduced model has a conditional binomial distribution, and it is shown to be strictly stationary and ergodic. The unknown parameters are estimated by three methods: conditional maximum likelihood, conditional least squares and maximum likelihood type penalty function estimation. The asymptotic distributions of the estimators are derived. A real application of the novel model to epidemic surveillance is briefly discussed. Finally, a generalization of the introduced model is considered by introducing an integer-valued GARCH model.
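
    As a rough illustrative sketch only (the linear recursion for the success probability below is an assumed parameterization, not necessarily the one used in the paper, and no estimation is shown), a binomial integer-valued ARCH(1)-type count series can be simulated as follows:

        import numpy as np

        def simulate_binomial_inarch(T, n, a0, a1, seed=0):
            """Simulate a binomial INARCH(1)-type count series.

            X_t | X_{t-1} ~ Binomial(n, p_t) with p_t = a0 + a1 * X_{t-1} / n.
            Illustrative parameterization; a0, a1 > 0 and a0 + a1 < 1 keep p_t in (0, 1).
            """
            rng = np.random.default_rng(seed)
            x = np.zeros(T, dtype=int)
            x[0] = rng.binomial(n, a0)
            for t in range(1, T):
                p_t = a0 + a1 * x[t - 1] / n
                x[t] = rng.binomial(n, p_t)
            return x

        counts = simulate_binomial_inarch(T=500, n=20, a0=0.2, a1=0.5)
        print(counts[:10], counts.mean())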

  17. Predictability of extreme values in geophysical models

    Directory of Open Access Journals (Sweden)

    A. E. Sterk

    2012-09-01

    Full Text Available Extreme value theory in deterministic systems is concerned with unlikely large (or small) values of an observable evaluated along evolutions of the system. In this paper we study the finite-time predictability of extreme values, such as convection, energy, and wind speeds, in three geophysical models. We study whether finite-time Lyapunov exponents are larger or smaller for initial conditions leading to extremes. General statements on whether extreme values are better or less predictable are not possible: the predictability of extreme values depends on the observable, the attractor of the system, and the prediction lead time.

  18. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

    This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on datasheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell/panel in Standard Test Conditions (STC) are shown, as well as the parameter extraction from the datasheet values. The temperature dependence of the cell dark saturation current is expressed with an alternative formula, which gives better correlation with the datasheet values of the power temperature dependence. Based on these equations, a PV panel model, which is able to predict the panel behavior in different temperature and irradiance conditions, is built and tested.
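
    A minimal sketch of the single-diode five-parameter model the abstract refers to (all parameter values below are invented for illustration, not extracted from a datasheet, and the paper's temperature-dependence formula is not included). The panel current at a given voltage is the root of the implicit single-diode equation:

        import numpy as np
        from scipy.optimize import brentq

        # Illustrative five parameters (not taken from any real datasheet)
        I_PH, I_0, R_S, R_SH, N_IDEAL, N_CELLS = 8.5, 1e-9, 0.3, 300.0, 1.3, 60
        K_B, Q_E = 1.380649e-23, 1.602176634e-19

        def panel_current(V, T_cell=298.15):
            """Current of the single-diode five-parameter model at voltage V.

            Solves I = I_ph - I_0*(exp((V + I*R_s)/a) - 1) - (V + I*R_s)/R_sh,
            where a = N_cells * n * k*T/q, by bracketed root finding.
            """
            a = N_CELLS * N_IDEAL * K_B * T_cell / Q_E
            def residual(I):
                return (I_PH - I_0 * (np.exp((V + I * R_S) / a) - 1.0)
                        - (V + I * R_S) / R_SH - I)
            # residual is strictly decreasing in I, so a wide bracket is safe here
            return brentq(residual, -2.0 * I_PH, I_PH + 2.0)

        volts = np.linspace(0.0, 44.0, 100)
        amps = np.array([panel_current(v) for v in volts])
        power = volts * amps
        print("Isc ~ %.2f A, MPP ~ %.0f W" % (amps[0], power.max()))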

  19. An analytical high value target acquisition model

    OpenAIRE

    Becker, Kevin J.

    1986-01-01

    An analytical High Value Target (HVT) acquisition model is developed for a generic anti-ship cruise missile system. The target set is represented as a single HVT within a field of escorts. The HVT's location is described by a bivariate normal probability distribution. The escorts are represented by a spatially homogeneous Poisson random field surrounding the HVT. Model output consists of the probability that at least one missile of...

  20. Value-Added Modeling in Physical Education

    Science.gov (United States)

    Hushman, Glenn; Hushman, Carolyn

    2015-01-01

    The educational reform movement in the United States has resulted in a variety of states moving toward a system of value-added modeling (VAM) to measure a teacher's contribution to student achievement. Recently, many states have begun using VAM scores as part of a larger system to evaluate teacher performance. In the past decade, only "core…

  1. A Model for Valuing Military Talents

    Institute of Scientific and Technical Information of China (English)

    LIU Hong-sheng

    2002-01-01

    The method of allocating military talents is a difficult problem; it differs from allocating other talents because of the characteristics of military talents. This paper presents a model for valuing military talents, which can assist military leaders in allocating military talents properly.

  2. Efficient Smoothing for Boundary Value Models

    Science.gov (United States)

    1989-12-29

  3. Value Concept and Economic Growth Model

    Directory of Open Access Journals (Sweden)

    Truong Hong Trinh

    2014-12-01

    Full Text Available This paper approaches the value added method for Gross Domestic Product (GDP) measurement, which explains the interrelationship between the expenditure approach and the income approach. An economic growth model is also proposed, with three key elements: capital accumulation, technological innovation, and institutional reform. Although capital accumulation and technological innovation are two integrated elements in driving economic growth, institutional reforms play a key role in creating incentives that affect the transitional and steady state growth rate in the real world economy. The paper provides theoretical insight on economic growth to understand incentives and driving forces in the economic growth model.

  4. From Business Value Model to Coordination Process Model

    Science.gov (United States)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  5. Proving the ecosystem value through hydrological modelling

    Science.gov (United States)

    Dorner, W.; Spachinger, K.; Porter, M.; Metzka, R.

    2008-11-01

    Ecosystems provide valuable functions. Natural floodplains and river structures also offer different types of ecosystem functions, such as habitat, recreational area and natural detention. From an economic standpoint, the loss (or rehabilitation) of these natural systems and the natural services they provide can be valued as a damage (or benefit). Consequently these natural goods and services must be economically valued in project assessments, e.g. cost-benefit analysis or cost comparison. Especially in smaller catchments and river systems there is significant evidence that natural flood detention reduces flood risk and contributes to flood protection. Several research projects have evaluated the mitigating effect of land use, river training and the loss of natural flood plains on the development, peak and volume of floods. The presented project analyses the hypothesis that ignoring natural detention and hydrological ecosystem services could result in economically inefficient solutions for flood protection and mitigation. In test areas, subcatchments of the Danube in Germany, a combination of hydrological and hydrodynamic models with economic evaluation techniques was applied. Different forms of land use, river structure and flood protection measures were assessed and compared from a hydrological and economic point of view. A hydrodynamic model was used to simulate flows to assess the extent of flood-affected areas and damages to buildings and infrastructure, as well as to investigate the impacts of levees and river structure on a local scale. These model results provided the basis for an economic assessment. Different economic valuation techniques, such as flood damage functions, the cost comparison method and the substitution approach, were used to compare the outcomes of different hydrological scenarios from an economic point of view and to value the ecosystem service. The results give significant evidence that natural detention must be evaluated as part of flood mitigation projects

  6. The Symmetric Solutions of Affiliated Value Model

    Institute of Scientific and Technical Information of China (English)

    Che Ka-jia; Li Zhi-chen

    2004-01-01

    In a symmetric affiliated value model, this paper analyses high-technology industrial firms' competitive strategy in research and development (R&D). We obtain the symmetric Bayesian Nash equilibrium functions with or without a government prize: $b_1(x)=v(x,x)F_{n-1}(x|x)-\int_0^x F_{n-1}(y|y)\,dv(y,y)$, $b_2(x)=\int_0^x [v(y,y)+v_0]\,dF_{n-1}(y|y)$, and $b_3(x)=\int_0^x v(y,y)\frac{f_{n-1}(y|y)}{1-F_{n-1}(y|y)}\,dy$. We find that the firm's investment level increases in the prize; only when the constant prize satisfies $v_0 \ge v(y,y)\frac{F_{n-1}(y|y)}{1-F_{n-1}(y|y)}$ does the firm invest more aggressively with a constant prize than with a variable prize.

  7. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Danielsson; C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation: Value of value at risk; Horizon problems and extreme events in financial risk management; Methods of evaluating value-at-risk estimates.

  8. Measuring swirl at a model scale of 1:1 for vertically submersible pumps

    Science.gov (United States)

    de Fockert, A.; Verhaart, F. I. H.; Czarnota, Z.; Rajkumar, S.

    2016-11-01

    Intakes of large pump stations are often designed with the aid of hydraulic modeling. The approach flow to pumps is tested for adverse hydraulic phenomena, such as pre-swirl, velocity variations and vortices. Most commonly, the limits for these phenomena are taken from the ANSI/HI 9.8-2012 standard - Rotodynamic Pumps for Pump Intake Design. The standard, however, does not explain how real pumps respond to swirl, uneven velocity distribution or vortices. The present joint study between Deltares and Xylem aims to bridge this gap. At the Deltares pump sump test facility, two identical pump compartments were built according to the ANSI/HI 9.8-2012 standard. In one of the compartments, a submersible, vertical column pump (Flygt PL7020) was installed, while a 1:1 scale model of that pump was installed in the other compartment. This arrangement allowed measurements of both pump performance (pump head and input power as a function of flow rate) and the model parameters (pre-rotation and vortex occurrence) for nearly identical approach flow conditions. By varying the geometry of the approach channels, the asymmetry of the flow was varied to produce various degrees of pre-swirl including values in excess of the commonly accepted limit of 5 degrees. This paper describes the measurement setup, the results of the measurements with the model pump and the measurement plan for the prototype pump.

  9. The Advancement Value Chain: An Exploratory Model

    Science.gov (United States)

    Leonard, Edward F., III

    2005-01-01

    Since the introduction of the value chain concept in 1985, several varying, yet virtually similar, value chains have been developed for the business enterprise. Shifting to higher education, can a value chain be found that links together the various activities of advancement so that an institution's leaders can actually look at the philanthropic…

  11. One-Step Dynamic Classifier Ensemble Model for Customer Value Segmentation with Missing Values

    OpenAIRE

    Jin Xiao; Bing Zhu; Geer Teng; Changzheng He; Dunhu Liu

    2014-01-01

    Scientific customer value segmentation (CVS) is the basis of efficient customer relationship management, and customer credit scoring, fraud detection, and churn prediction all belong to CVS. In real CVS, the customer data usually include many missing values, which may greatly affect the performance of the CVS model. This study proposes a one-step dynamic classifier ensemble model for missing values (ODCEM). On the one hand, ODCEM integrates the preprocessing of missing values and the classif...

  12. The role of non-epistemic values in engineering models.

    Science.gov (United States)

    Diekmann, Sven; Peterson, Martin

    2013-03-01

    We argue that non-epistemic values, including moral ones, play an important role in the construction and choice of models in science and engineering. Our main claim is that non-epistemic values are not only "secondary values" that become important just in case epistemic values leave some issues open. Our point is, on the contrary, that non-epistemic values are as important as epistemic ones when engineers seek to develop the best model of a process or problem. The upshot is that models are neither value-free, nor depend exclusively on epistemic values or use non-epistemic values as tie-breakers.

  13. The New Digital Media Value Network: Proposing an Interactive Model of Digital Media Value Activities

    Directory of Open Access Journals (Sweden)

    Sylvia Chan-Olmsted

    2016-07-01

    Full Text Available This study models the dynamic nature of today’s media markets using the framework of value-adding activities in the provision and consumption of media products. The proposed user-centric approach introduces the notion that the actions of external users, social media, and interfaces affect the internal value activities of media firms via a feedback loop, and therefore should themselves be considered value activities. The model also suggests a more comprehensive list of indicators for value assessment.

  14. Mean Value Modelling of Turbocharged SI Engines

    DEFF Research Database (Denmark)

    Müller, Martin; Hendricks, Elbert; Sorenson, Spencer C.

    1998-01-01

    This paper describes the development of a computer simulation to predict the performance of a turbocharged spark ignition engine during transient operation. New models have been developed for the turbocharger and the intercooling system. An adiabatic model for the intake manifold is presented....

  15. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    in their specification of the conditional variance, conditional correlation, innovation distribution, and estimation approach. All of the models belong to the dynamic conditional correlation class, which is particularly suitable because it allows consistent estimations of the risk neutral dynamics with a manageable … of correlation models, we propose a new model that allows for correlation spillovers without too many parameters. This model performs about 60% better than the existing correlation models we consider. Relaxing a Gaussian innovation for a Laplace innovation assumption improves the pricing in a more minor way…

  16. The Added Value of Business Models

    NARCIS (Netherlands)

    Vliet, Harry van

    2014-01-01

    An overview of innovations in a particular area, for example retail developments in the fashion sector (Van Vliet, 2014), and a subsequent discussion about the probability as to whether these innovations will realise a ‘breakthrough’, has to be supplemented with the question of what the added value

  17. Modeling value creation with enterprise architecture

    NARCIS (Netherlands)

    Singh, Prince Mayurank; Jonkers, H.; Iacob, Maria Eugenia; van Sinderen, Marten J.

    2014-01-01

    Firms may not succeed in business if strategies are not properly implemented in practice. Every firm needs to know, represent and master its value creation logic, not only to stay in business but also to keep growing. This paper focuses on an important topic in the field of strategic

  18. Appropriate model selection methods for nonstationary generalized extreme value models

    Science.gov (United States)

    Kim, Hanbeen; Kim, Sooyoung; Shin, Hongjoon; Heo, Jun-Haeng

    2017-04-01

    Several lines of evidence that hydrologic data series are nonstationary in nature have been found to date. This has resulted in many studies in the area of nonstationary frequency analysis. Nonstationary probability distribution models involve parameters that vary over time. Therefore, it is not straightforward to apply conventional goodness-of-fit tests to the selection of an appropriate nonstationary probability distribution model. Tests that are generally recommended for such a selection include the Akaike information criterion (AIC), the corrected Akaike information criterion (AICc), the Bayesian information criterion (BIC), and the likelihood ratio test (LRT). In this study, Monte Carlo simulation was performed to compare the performances of these four tests with regard to nonstationary as well as stationary generalized extreme value (GEV) distributions. Proper model selection ratios and sample sizes were taken into account to evaluate the performances of all four tests. The BIC demonstrated the best performance with regard to stationary GEV models. In the case of nonstationary GEV models, the AIC proved to be better than the other three methods when relatively small sample sizes were considered. With larger sample sizes, the AIC, BIC, and LRT presented the best performances for GEV models which have nonstationary location and/or scale parameters, respectively. Simulation results were then evaluated by applying all four tests to annual maximum rainfall data of selected sites, as observed by the Korea Meteorological Administration.
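
    A minimal sketch of the kind of comparison described, on synthetic annual maxima with a linear trend in the location parameter only (the study itself examines further nonstationary variants, proper selection ratios, and many sample sizes): fit stationary and nonstationary GEV models by maximum likelihood and compare AIC, BIC, and a likelihood ratio test.

        import numpy as np
        from scipy import stats
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        t = np.arange(50)                      # 50 "years" of annual maxima
        x = stats.genextreme.rvs(c=-0.1, loc=100 + 0.8 * t, scale=20, random_state=rng)

        def neg_loglik(params, nonstationary):
            if nonstationary:
                mu0, mu1, log_sigma, xi = params
                mu = mu0 + mu1 * t
            else:
                mu0, log_sigma, xi = params
                mu = mu0
            # scipy's shape convention: c = -xi
            ll = stats.genextreme.logpdf(x, c=-xi, loc=mu, scale=np.exp(log_sigma))
            return -np.sum(ll)

        def fit(nonstationary):
            if nonstationary:
                x0 = [x.mean(), 0.0, np.log(x.std()), 0.1]
            else:
                x0 = [x.mean(), np.log(x.std()), 0.1]
            res = minimize(neg_loglik, x0, args=(nonstationary,),
                           method="Nelder-Mead", options={"maxiter": 5000})
            k = len(x0)
            aic = 2 * k + 2 * res.fun                # AIC = 2k - 2 log L
            bic = k * np.log(len(x)) + 2 * res.fun   # BIC = k ln n - 2 log L
            return res.fun, k, aic, bic

        nll0, k0, aic0, bic0 = fit(nonstationary=False)
        nll1, k1, aic1, bic1 = fit(nonstationary=True)
        lrt = 2 * (nll0 - nll1)                      # ~ chi-squared with k1 - k0 dof
        p_value = stats.chi2.sf(lrt, df=k1 - k0)
        print(f"stationary:    AIC={aic0:.1f}  BIC={bic0:.1f}")
        print(f"nonstationary: AIC={aic1:.1f}  BIC={bic1:.1f}")
        print(f"LRT={lrt:.2f}, p={p_value:.4f}")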

  19. One-Step Dynamic Classifier Ensemble Model for Customer Value Segmentation with Missing Values

    Directory of Open Access Journals (Sweden)

    Jin Xiao

    2014-01-01

    Full Text Available Scientific customer value segmentation (CVS) is the basis of efficient customer relationship management, and customer credit scoring, fraud detection, and churn prediction all belong to CVS. In real CVS, the customer data usually include many missing values, which may greatly affect the performance of the CVS model. This study proposes a one-step dynamic classifier ensemble model for missing values (ODCEM). On the one hand, ODCEM integrates the preprocessing of missing values and the classification modeling into one step; on the other hand, it utilizes multiple classifier ensemble technology in constructing the classification models. The empirical results on the credit scoring dataset "German" from UCI and the real customer churn prediction dataset "China churn" show that ODCEM outperforms four commonly used "two-step" models and the ensemble based model LMF and can provide better decision support for market managers.

  20. Modeling churn using customer lifetime value

    OpenAIRE

    Glady, Nicolas; Baesens, Bart; Croux, Christophe

    2009-01-01

    The definition and modeling of customer loyalty have been central issues in customer relationship management for many years. Recent papers propose solutions to detect customers that are becoming less loyal, also called churners. The churner status is then defined as a function of the volume of commercial transactions. In the context of a Belgian retail financial service company, our first contribution is to redefine the notion of customer loyalty by considering it from a customer-centric vi...

  1. Modeling customer loyalty using customer lifetime value.

    OpenAIRE

    Glady, N.; Baesens, Bart; Croux, Christophe

    2006-01-01

    The definition and modeling of customer loyalty have been central issues in customer relationship management for many years. Recent papers propose solutions to detect customers that are becoming less loyal, also called churners. The churner status is then defined as a function of the volume of commercial transactions. In the context of a Belgian retail financial service company, our first contribution will be to redefine the notion of customer loyalty by considering it from a customer-cen...

  2. Modeling the Marginal Value of Rainforest Losses

    OpenAIRE

    Strand, Jon

    2015-01-01

    A rainforest can be modeled as a dynamic asset subject to various risks, including risk of fire. Any small part of the forest can be in one of two states: either untouched by forest fire, or already damaged by fire, in which case there is both a local forest loss and increased dryness over a broader area. In this paper, two Bellman equations are constructed, one for unharmed forest and a s...

  3. The HackensackUMC Value-Based Care Model: Building Essentials for Value-Based Purchasing.

    Science.gov (United States)

    Douglas, Claudia; Aroh, Dianne; Colella, Joan; Quadri, Mohammed

    2016-01-01

    The Affordable Care Act, 2010, and the subsequent shift from a quantity-focused to a value-centric reimbursement model led our organization to create the HackensackUMC Value-Based Care Model to improve our process capability and performance to meet and sustain the triple aims of value-based purchasing: higher quality, lower cost, and consumer perception. This article describes the basics of our model and illustrates how we used it to reduce the costs of our patient sitter program.

  4. Theoretical modeling of iodine value and saponification value of biodiesel fuels from their fatty acid composition

    Energy Technology Data Exchange (ETDEWEB)

    Gopinath, A.; Puhan, Sukumar; Nagarajan, G. [Internal Combustion Engineering Division, Department of Mechanical Engineering, Anna University, Chennai 600 025, Tamil Nadu (India)

    2009-07-15

    Biodiesel is an alternative fuel consisting of alkyl esters of fatty acids from vegetable oils or animal fats. The properties of biodiesel depend on the type of vegetable oil used for the transesterification process. The objective of the present work is to theoretically predict the iodine value and the saponification value of different biodiesels from their fatty acid methyl ester composition. The fatty acid ester compositions and the above values of different biodiesels were taken from the available published data. A multiple linear regression model was developed to predict the iodine value and saponification value of different biodiesels. The predicted results showed that the prediction errors were less than 3.4% compared to the available published data. The predicted values were also verified by substituting them into the available published model developed to predict the higher heating values of biodiesel fuels from their iodine value and saponification value. The resulting heating values of the biodiesels were then compared with the published heating values and reported. (author)
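
    For orientation only, a sketch using the widely cited closed-form relations SV = Σ 560·A_i/MW_i and IV = Σ 254·D_i·A_i/MW_i, where A_i is the mass fraction in percent, MW_i the molecular weight and D_i the number of double bonds of each ester (these relations are not the multiple linear regression model developed in the paper, and the composition below is invented):

        # Estimate saponification value (SV) and iodine value (IV) of a biodiesel
        # from an (illustrative) fatty acid methyl ester profile.
        FAME = {
            # name:             (A [%],  MW [g/mol],  double bonds)
            "methyl palmitate":  (11.0,  270.45,      0),
            "methyl stearate":   ( 4.0,  298.50,      0),
            "methyl oleate":     (23.0,  296.49,      1),
            "methyl linoleate":  (54.0,  294.47,      2),
            "methyl linolenate": ( 8.0,  292.46,      3),
        }

        sv = sum(560.0 * a / mw for a, mw, d in FAME.values())
        iv = sum(254.0 * d * a / mw for a, mw, d in FAME.values())
        print(f"SV ~ {sv:.1f} mg KOH/g, IV ~ {iv:.1f} g I2/100 g")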

  5. Hyperbolic value addition and general models of animal choice.

    Science.gov (United States)

    Mazur, J E

    2001-01-01

    Three mathematical models of choice--the contextual-choice model (R. Grace, 1994), delay-reduction theory (N. Squires & E. Fantino, 1971), and a new model called the hyperbolic value-added model--were compared in their ability to predict the results from a wide variety of experiments with animal subjects. When supplied with 2 or 3 free parameters, all 3 models made fairly accurate predictions for a large set of experiments that used concurrent-chain procedures. One advantage of the hyperbolic value-added model is that it is derived from a simpler model that makes accurate predictions for many experiments using discrete-trial adjusting-delay procedures. Some results favor the hyperbolic value-added model and delay-reduction theory over the contextual-choice model, but more data are needed from choice situations for which the models make distinctly different predictions.
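
    For context, the "simpler model" mentioned above is presumably Mazur's hyperbolic-decay equation, in which the value V of a reward of amount A delayed by D is

        $V = \dfrac{A}{1 + KD}$

    with K a free parameter governing how steeply value decays with delay; the value-added extension itself is not reproduced here.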

  6. The Role of Non-Epistemic Values in Engineering Models

    OpenAIRE

    Diekmann, Sven; Peterson, Martin

    2011-01-01

    We argue that non-epistemic values, including moral ones, play an important role in the construction and choice of models in science and engineering. Our main claim is that non-epistemic values are not only “secondary values” that become important just in case epistemic values leave some issues open. Our point is, on the contrary, that non-epistemic values are as important as epistemic ones when engineers seek to develop the best model of a process or problem. The upshot is that models are ne...

  7. Mean Value Modelling of an SI Engine with EGR

    DEFF Research Database (Denmark)

    Føns, Michael; Muller, Martin; Chevalier, Alain

    1999-01-01

    Mean Value Engine Models (MVEMs) are simplified, dynamic engine models which are physically based. Such models are useful for control studies, for engine control system analysis and for model based control systems. Very few published MVEMs have included the effects of Exhaust Gas Recirculation (E...

  8. A value model for evaluating homeland security decisions.

    Science.gov (United States)

    Keeney, Ralph L; von Winterfeldt, Detlof

    2011-09-01

    One of the most challenging tasks of homeland security policymakers is to allocate their limited resources to reduce terrorism risks cost effectively. To accomplish this task, it is useful to develop a comprehensive set of homeland security objectives, metrics to measure each objective, a utility function, and value tradeoffs relevant for making homeland security investments. Together, these elements form a homeland security value model. This article develops a homeland security value model based on literature reviews, a survey, and experience with building value models. The purposes of the article are to motivate the use of a value model for homeland security decision making and to illustrate its use to assess terrorism risks, assess the benefits of countermeasures, and develop a severity index for terrorism attacks. © 2011 Society for Risk Analysis.
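
    As a generic illustration of the structure such value models typically take (the article's actual objectives, weights and single-attribute functions are not reproduced here), a multiattribute value model scores an alternative x on m objectives as

        $V(x) = \sum_{i=1}^{m} w_i \, v_i(x_i)$, with $w_i \ge 0$ and $\sum_i w_i = 1$,

    where each $v_i$ is a single-attribute value function scaled to [0, 1] and the weights encode the value tradeoffs across objectives.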

  9. Self-Service Banking: Value Creation Models and Information Exchange

    Directory of Open Access Journals (Sweden)

    Ragnvald Sannes

    2001-01-01

    Full Text Available This paper argues that most banks have failed to exploit the potential of self-service banking because they base their service design on an incomplete business model for self-service. A framework for evaluation of self-service banking concepts is developed on the basis of Stabell and Fjeldstad's three value configurations. The value network and the value shop are consistent with self-service banking while the value chain is inappropriate. The impact of the value configurations on information exchange and self-service functionality is discussed, and a framework for design of such services proposed. Current self-service banking practices are compared to the framework, and it is concluded that current practice matches the concept of a value network and not the value shop. However, current practices are only a partial implementation of a value network-based self-service banking concept.

  10. Reiteration of Hankel singular value decomposition for modeling of complex-valued signal

    Science.gov (United States)

    Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej

    2016-06-01

    Modeling a signal that takes complex values is a common scientific problem, present in many applications, e.g. medical signals, computer graphics and vision. One possible solution is the Hankel Singular Value Decomposition (HSVD). In the first step the complex-valued signal is arranged in a special form called a Hankel matrix, which is then decomposed by Singular Value Decomposition. The obtained matrices can then be reformulated in order to get parameters describing the system. The basic method can be applied to fit the whole signal, but it fails in modeling each particular component of the signal. A modification of the basic HSVD method, which relies on reiteration and is applied to the main components, together with the application of prior knowledge, solves the presented problem.
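
    A minimal sketch of the Hankel-SVD idea under stated assumptions (the test signal, embedding size and rank below are invented; the full HSVD additionally extracts frequency and damping parameters from the decomposition and, in the modified method, reiterates over the main components, both of which are omitted here):

        import numpy as np

        def hankel_svd_model(signal, rank, n_rows=None):
            """Model a complex-valued signal with a rank-truncated Hankel SVD.

            1. Arrange the signal in a Hankel matrix H.
            2. Keep the `rank` largest singular components of H.
            3. Average anti-diagonals of the low-rank matrix to recover the modeled signal.
            """
            N = len(signal)
            L = n_rows or N // 2
            K = N - L + 1
            H = np.array([signal[i:i + K] for i in range(L)])   # Hankel matrix
            U, s, Vh = np.linalg.svd(H, full_matrices=False)
            H_r = (U[:, :rank] * s[:rank]) @ Vh[:rank]          # rank-r approximation
            model = np.zeros(N, dtype=complex)
            counts = np.zeros(N)
            for i in range(L):                                   # anti-diagonal averaging
                model[i:i + K] += H_r[i]
                counts[i:i + K] += 1
            return model / counts

        # Illustrative test: two damped complex exponentials plus noise
        t = np.arange(256)
        clean = np.exp((-0.01 + 0.30j) * t) + 0.5 * np.exp((-0.02 + 1.10j) * t)
        noisy = clean + 0.05 * (np.random.randn(256) + 1j * np.random.randn(256))
        modeled = hankel_svd_model(noisy, rank=2)
        print("relative residual vs clean:", np.linalg.norm(modeled - clean) / np.linalg.norm(clean))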

  11. Environment Modeling Using Runtime Values for JPF-Android

    Science.gov (United States)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries, which add functionality to the application, and drivers that fire the application execution. For testing and verification, the environment of an application is simplified and abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.

  12. Mean Value Modelling of a Turbocharged SI Engine

    DEFF Research Database (Denmark)

    Müller, Martin; Hendricks, Elbert; Sorenson, Spencer C.

    1998-01-01

    An important paradigm for the modelling of naturally aspirated (NA) spark ignition (SI) engines for control purposes is the Mean Value Engine Model (MVEM). Such models have a time resolution which is just sufficient to capture the main details of the dynamic performance of NA SI engines but not ...

  13. Trust Model for Social Network using Singular Value Decomposition

    OpenAIRE

    Davis Bundi Ntwiga; Patrick Weke; Michael Kiura Kirumbu

    2016-01-01

    For effective interactions to take place in a social network, trust is important. We model the trust of agents using the peer-to-peer reputation ratings in the network, which form a real-valued matrix. Singular value decomposition discounts the reputation ratings to estimate trust levels, as trust is the subjective probability of future expectations based on current reputation ratings. Reputation and trust are closely related, and singular value decomposition can estimate trust using the...
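
    A rough sketch of the idea (the toy rating matrix, the rank and the aggregation into per-agent scores are all invented for illustration; the paper's exact discounting scheme is not reproduced): keep the leading singular components of the peer-to-peer reputation matrix and read smoothed trust estimates from the low-rank reconstruction.

        import numpy as np

        # Toy peer-to-peer reputation ratings in [0, 1]; entry (i, j) is how
        # agent i rates agent j. Values are invented for illustration.
        R = np.array([
            [0.0, 0.9, 0.8, 0.2],
            [0.8, 0.0, 0.7, 0.3],
            [0.9, 0.8, 0.0, 0.1],
            [0.3, 0.2, 0.4, 0.0],
        ])

        U, s, Vt = np.linalg.svd(R, full_matrices=False)
        k = 1                                    # keep the dominant singular component
        T = (U[:, :k] * s[:k]) @ Vt[:k]          # low-rank reconstruction of the ratings
        trust_in_each_agent = T.mean(axis=0)     # column means as aggregate trust scores
        print(np.round(trust_in_each_agent, 2))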

  14. Stakeholder Theory and Value Creation Models in Brazilian Firms

    Directory of Open Access Journals (Sweden)

    Natalia Giugni Vidal

    2015-09-01

    Full Text Available Objective – The purpose of this study is to understand how top Brazilian firms think about and communicate value creation to their stakeholders. Design/methodology/approach – We use qualitative content analysis methodology to analyze the sustainability or annual integrated reports of the top 25 Brazilian firms by sales revenue. Findings – Based on our analysis, these firms were classified into three main types of stakeholder value creation models: narrow, broad, or transitioning from narrow to broad. We find that many of the firms in our sample are in a transition state between narrow and broad stakeholder value creation models. We also identify seven areas of concentration discussed by firms in creating value for stakeholders: better stakeholder relationships, better work environment, environmental preservation, increased customer base, local development, reputation, and stakeholder dialogue. Practical implications – This study shows a trend towards broader stakeholder value creation models in Brazilian firms. The findings of this study may inform practitioners interested in broadening their value creation models. Originality/value – This study adds to the discussion of stakeholder theory in the Brazilian context by understanding variations in value creation orientation in Brazil.

  15. Value Creation Challenges in Multichannel Retail Business Models

    Directory of Open Access Journals (Sweden)

    Mika Yrjölä

    2014-08-01

    Full Text Available Purpose: The purpose of the paper is to identify and analyze the challenges of value creation in multichannel retail business models. Design/methodology/approach: With the help of semi-structured interviews with top executives from different retailing environments, this study introduces a model of value creation challenges in the context of multichannel retailing. The challenges are analyzed in terms of three retail business model elements, i.e., format, activities, and governance. Findings: Adopting a multichannel retail business model requires critical rethinking of the basic building blocks of value creation. First of all, as customers effortlessly move between multiple channels, multichannel formats can lead to a mismatch between customer and firm value. Secondly, retailers face pressures to use their activities to form integrated total offerings to customers. Thirdly, multiple channels might lead to organizational silos with conflicting goals. A careful orchestration of value creation is needed to determine the roles and incentives of the channel parties involved. Research limitations/implications: In contrast to previous business model literature, this study did not adopt a network-centric view. By embracing the boundary-spanning nature of the business model, other challenges and elements might have been discovered (e.g., challenges in managing relationships with suppliers). Practical implications: As a practical contribution, this paper has analyzed the challenges retailers face in adopting multichannel business models. Customer tendencies for showrooming behavior highlight the need for generating efficient lock-in strategies. Customized, personal offers and information are ways to increase customer value, differentiate from competition, and achieve lock-in. Originality/value: As a theoretical contribution, this paper empirically investigates value creation challenges in a specific context, lowering the level of abstraction in the mostly

  16. Integral-Value Models for Outcomes over Continuous Time

    DEFF Research Database (Denmark)

    Harvey, Charles M.; Østerdal, Lars Peter

    Models of preferences between outcomes over continuous time are important for individual, corporate, and social decision making, e.g., medical treatment, infrastructure development, and environmental regulation. This paper presents a foundation for such models. It shows that conditions … on preferences between real- or vector-valued outcomes over continuous time are satisfied if and only if the preferences are represented by a value function having an integral form

  17. A mixing-length model for the prediction of convex curvature effects on turbulent boundary layers. [for turbine blade convective heat transfer prediction

    Science.gov (United States)

    Adams, E. W.; Johnston, J. P.

    1983-01-01

    A mixing-length model is developed for the prediction of turbulent boundary layers with convex streamwise curvature. For large layer thickness ratio, delta/R greater than 0.05, the model scales mixing length on the wall radius of curvature, R. For small delta/R, ordinary flat wall modeling is used for the mixing-length profile with curvature corrections, following the recommendations of Eide and Johnston (1976). Effects of streamwise change of curvature are considered; a strong lag from equilibrium is required when R increases downstream. Fifteen separate data sets were compared, including both hydrodynamic and heat transfer results. Six of these computations are presented and compared to experiment.

  18. Prediction of survival with alternative modeling techniques using pseudo values.

    Directory of Open Access Journals (Sweden)

    Tjeerd van der Ploeg

    Full Text Available BACKGROUND: The use of alternative modeling techniques for predicting patient survival is complicated by the fact that some alternative techniques cannot readily deal with censoring, which is essential for analyzing survival data. In the current study, we aimed to demonstrate that pseudo values enable statistically appropriate analyses of survival outcomes when used in seven alternative modeling techniques. METHODS: In this case study, we analyzed survival of 1282 Dutch patients with newly diagnosed Head and Neck Squamous Cell Carcinoma (HNSCC) with conventional Kaplan-Meier and Cox regression analysis. We subsequently calculated pseudo values to reflect the individual survival patterns. We used these pseudo values to compare recursive partitioning (RPART), neural nets (NNET), logistic regression (LR), general linear models (GLM) and three variants of support vector machines (SVM) with respect to dichotomous 60-month survival, and continuous pseudo values at 60 months or estimated survival time. We used the area under the ROC curve (AUC) and the root of the mean squared error (RMSE) to compare the performance of these models using bootstrap validation. RESULTS: Of a total of 1282 patients, 986 patients died during a median follow-up of 66 months (60-month survival: 52% [95% CI: 50%-55%]). The LR model had the highest optimism corrected AUC (0.791) to predict 60-month survival, followed by the SVM model with a linear kernel (AUC 0.787). The GLM model had the smallest optimism corrected RMSE when continuous pseudo values were considered for 60-month survival or the estimated survival time, followed by SVM models with a linear kernel. The estimated importance of predictors varied substantially by the specific aspect of survival studied and modeling technique used. CONCLUSIONS: The use of pseudo values makes it readily possible to apply alternative modeling techniques to survival problems, to compare their performance and to search further for promising
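
    A compact illustration of the pseudo-value construction itself, on synthetic data (the jackknife formula is standard; everything else below, including the simulated survival times, is invented and is not the study's analysis): the pseudo-value for subject i at time t is n*S(t) - (n-1)*S_(-i)(t), where S is the Kaplan-Meier estimate computed with and without subject i, and the resulting values can be fed to any regression or machine-learning method.

        import numpy as np

        def km_at(durations, events, t):
            """Kaplan-Meier survival estimate S(t) for right-censored data."""
            order = np.lexsort((1 - events, durations))   # events before censorings at ties
            d, e = durations[order], events[order]
            surv, at_risk = 1.0, len(d)
            for i in range(len(d)):
                if d[i] > t:
                    break
                if e[i] == 1:
                    surv *= 1.0 - 1.0 / at_risk
                at_risk -= 1
            return surv

        def pseudo_values(durations, events, t):
            """Jackknife pseudo-values of S(t): n*S - (n-1)*S_without_subject_i."""
            n = len(durations)
            s_full = km_at(durations, events, t)
            pv = np.empty(n)
            for i in range(n):
                mask = np.arange(n) != i
                pv[i] = n * s_full - (n - 1) * km_at(durations[mask], events[mask], t)
            return pv

        rng = np.random.default_rng(0)
        true_times = rng.exponential(60, size=200)          # synthetic survival times
        censor_times = rng.exponential(80, size=200)
        events = (true_times <= censor_times).astype(int)   # 1 = event observed
        observed = np.minimum(true_times, censor_times)
        pv60 = pseudo_values(observed, events, t=60)
        print("mean pseudo-value at t=60:", round(float(pv60.mean()), 3))
        # pv60 can now serve as a continuous outcome for RPART, NNET, GLM, SVM, etc.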

  19. Possibilistic Fuzzy Net Present Value Model and Application

    Directory of Open Access Journals (Sweden)

    S. S. Appadoo

    2014-01-01

    Full Text Available The cash flow values and the interest rate in the net present value (NPV) model are usually specified by either crisp numbers or random variables. In this paper, we first discuss some of the recent developments in possibility theory and find closed form expressions for the fuzzy possibilistic net present value (FNPV). Then, following Carlsson and Fullér (2001), we discuss some of the possibilistic moments related to the FNPV model, along with an illustrative numerical example. We also give a unified approach to find higher order moments of the FNPV by using the moment generating function introduced by Paseka et al. (2011).
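
    For reference, the crisp counterpart of the FNPV is the ordinary net present value,

        $NPV = \sum_{t=0}^{T} \frac{CF_t}{(1+r)^t}$,

    and the possibilistic version replaces the cash flows $CF_t$ (and possibly the rate $r$) by fuzzy numbers, evaluating the result with possibilistic moments in the sense of Carlsson and Fullér (2001); the paper's closed-form FNPV expressions are not reproduced here.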

  20. Models of consumer value cocreation in health care.

    Science.gov (United States)

    Nambisan, Priya; Nambisan, Satish

    2009-01-01

    In recent years, consumer participation in health care has gained critical importance as health care organizations (HCOs) seek varied avenues to enhance the quality and the value of their offerings. Many large HCOs have established online health communities where health care consumers (patients) can interact with one another to share knowledge and offer emotional support in disease management and care. Importantly, the focus of consumer participation in health care has moved beyond such personal health care management as the potential for consumers to participate in innovation and value creation in varied areas of the health care industry becomes increasingly evident. Realizing such potential, however, will require HCOs to develop a better understanding of the varied types of consumer value cocreation that are enabled by new information and communication technologies such as online health communities and Web 2.0 (social media) technologies. This article seeks to contribute toward such an understanding by offering a concise and coherent theoretical framework to analyze consumer value cocreation in health care. We identify four alternate models of consumer value cocreation-the partnership model, the open-source model, the support-group model, and the diffusion model-and discuss their implications for HCOs. We develop our theoretical framework by drawing on theories and concepts in knowledge creation, innovation management, and online communities. A set of propositions are developed by combining theoretical insights from these areas with real-world examples of consumer value cocreation in health care. The theoretical framework offered here informs on the potential impact of the different models of consumer value cocreation on important organizational variables such as innovation cost and time, service quality, and consumer perceptions of HCO. An understanding of the four models of consumer value cocreation can help HCOs adopt appropriate strategies and practices to

  1. A Bayesian model of context-sensitive value attribution.

    Science.gov (United States)

    Rigoli, Francesco; Friston, Karl J; Martinelli, Cristina; Selaković, Mirjana; Shergill, Sukhwinder S; Dolan, Raymond J

    2016-06-22

    Substantial evidence indicates that incentive value depends on an anticipation of rewards within a given context. However, the computations underlying this context sensitivity remain unknown. To address this question, we introduce a normative (Bayesian) account of how rewards map to incentive values. This assumes that the brain inverts a model of how rewards are generated. Key features of our account include (i) an influence of prior beliefs about the context in which rewards are delivered (weighted by their reliability in a Bayes-optimal fashion), (ii) the notion that incentive values correspond to precision-weighted prediction errors, and (iii) contextual information unfolding at different hierarchical levels. This formulation implies that incentive value is intrinsically context-dependent. We provide empirical support for this model by showing that incentive value is influenced by context variability and by hierarchically nested contexts. The perspective we introduce generates new empirical predictions that might help explain psychopathologies, such as addiction.

  2. Pricing for Catastrophe Bonds Based on Expected-value Model

    Directory of Open Access Journals (Sweden)

    Junfei Chen

    2013-02-01

    Full Text Available As catastrophes cannot be avoided and result in huge economic losses, the compensation of catastrophe losses has become an important research topic. Catastrophe bonds can effectively disperse catastrophe risks, which are currently undertaken mainly by the government and insurance companies, and can pool capital more effectively in the broad capital market, making them an ideal catastrophe securities product. This study adopts Expectancy Theory to supplement and improve the pricing of catastrophe bonds based on Value Theory. A model of expected utility is established to determine the conditions on the expected revenue R of catastrophe bonds. The pricing model of the value function is used to obtain the psychological value U(R - R̄) of catastrophe bonds. Finally, the psychological value is refined by the value according to expected utility, which allows catastrophe bonds to be evaluated more accurately at a reasonable price. This research can provide decision-making support for the pricing of catastrophe bonds.

  3. [Healthcare value chain: a model for the Brazilian healthcare system].

    Science.gov (United States)

    Pedroso, Marcelo Caldeira; Malik, Ana Maria

    2012-10-01

    This article presents a model of the healthcare value chain which consists of a schematic representation of the Brazilian healthcare system. The proposed model is adapted for the Brazilian reality and has the scope and flexibility for use in academic activities and analysis of the healthcare sector in Brazil. It places emphasis on three components: the main activities of the value chain, grouped in vertical and horizontal links; the mission of each link and the main value chain flows. The proposed model consists of six vertical and three horizontal links, amounting to nine. These are: knowledge development; supply of products and technologies; healthcare services; financial intermediation; healthcare financing; healthcare consumption; regulation; distribution of healthcare products; and complementary and support services. Four flows can be used to analyze the value chain: knowledge and innovation; products and services; financial; and information.

  4. Integer Valued Autoregressive Models for Tipping Bucket Rainfall Measurements

    DEFF Research Database (Denmark)

    Thyregod, Peter; Carstensen, Niels Jacob; Madsen, Henrik

    1999-01-01

    A new method for modelling the dynamics of rain sampled by a tipping bucket rain gauge is proposed. The considered models belong to the class of integer valued autoregressive processes. The models take the autocorrelation and discrete nature of the data into account. A first order, a second order...... and a threshold model are presented together with methods to estimate the parameters of each model. The models are demonstrated to provide a good description of data from actual rain events requiring only two to four parameters....
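
    As a concrete illustration of the model class named here, the sketch below simulates a first-order integer-valued autoregressive (INAR(1)) series with binomial thinning; parameter values are hypothetical and none of the record's estimation methods is reproduced.

```python
import numpy as np

def simulate_inar1(alpha, lam, n, seed=0):
    """Simulate a first-order integer-valued autoregressive (INAR(1)) series:
    X_t = alpha o X_{t-1} + eps_t, where 'o' is binomial thinning and
    eps_t ~ Poisson(lam).  Counts stay non-negative integers by construction."""
    rng = np.random.default_rng(seed)
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1.0 - alpha))          # start near the stationary mean
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)    # binomial thinning of previous count
        x[t] = survivors + rng.poisson(lam)          # plus new arrivals
    return x

# Hypothetical tip counts per time interval from a tipping-bucket gauge.
series = simulate_inar1(alpha=0.6, lam=1.2, n=500)
print("mean:", series.mean(), " lag-1 autocorrelation:",
      np.corrcoef(series[:-1], series[1:])[0, 1].round(2))
```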

  5. Creating Value Through the Freemium Business Model: A Consumer Perspective

    NARCIS (Netherlands)

    J. Rietveld (Joost)

    2016-01-01

    textabstractThis paper develops a consumer-centric framework for creating value through the freemium business model. Goods that are commercialized through the freemium business model offer basic functionality for free and monetize users for extended use or complementary features. Compared to premium

  6. International Business Models Developed Through Brokerage Knowledge and Value Creation

    DEFF Research Database (Denmark)

    Petersen, Nicolaj Hannesbo; Rasmussen, Erik Stavnsager

    This paper highlights theoretically and empirically international business model decisions in networks with knowledge sharing and value creation. The paper expands the conceptual international business model framework for technology-oriented companies to include the focal firm’s network role...

  7. The Unfolding of Value Sources During Online Business Model Transformation

    Directory of Open Access Journals (Sweden)

    Nadja Hoßbach

    2016-12-01

    Full Text Available Purpose: In the magazine publishing industry, viable online business models are still rare to absent. To prepare for the ‘digital future’ and safeguard their long-term survival, many publishers are currently in the process of transforming their online business model. Against this backdrop, this study aims to develop a deeper understanding of (1) how the different building blocks of an online business model are transformed over time and (2) how sources of value creation unfold during this transformation process. Methodology: To answer our research question, we conducted a longitudinal case study with a leading German business magazine publisher (called BIZ). Data was triangulated from multiple sources including interviews, internal documents, and direct observations. Findings: Based on our case study, we find that BIZ used the transformation process to differentiate its online business model from its traditional print business model along several dimensions, and that BIZ’s online business model changed from an efficiency- to a complementarity- to a novelty-based model during this process. Research implications: Our findings suggest that different business model transformation phases relate to different value sources, questioning the appropriateness of value source-based approaches for classifying business models. Practical implications: The results of our case study highlight the need for online-offline business model differentiation and point to the important distinction between service and product differentiation. Originality: Our study contributes to the business model literature by applying a dynamic and holistic perspective on the link between online business model changes and unfolding value sources.

  8. Genomic breeding value estimation using nonparametric additive regression models

    Directory of Open Access Journals (Sweden)

    Solberg Trygve

    2009-01-01

    Full Text Available Abstract Genomic selection refers to the use of genomewide dense markers for breeding value estimation and subsequently for selection. The main challenge of genomic breeding value estimation is the estimation of many effects from a limited number of observations. Bayesian methods have been proposed to successfully cope with these challenges. As an alternative class of models, non- and semiparametric models were recently introduced. The present study investigated the ability of nonparametric additive regression models to predict genomic breeding values. The genotypes were modelled for each marker or pair of flanking markers (i.e. the predictors) separately. The nonparametric functions for the predictors were estimated simultaneously using additive model theory, applying a binomial kernel. The optimal degree of smoothing was determined by bootstrapping. A mutation-drift-balance simulation was carried out. The breeding values of the last generation (genotyped) were predicted using data from the next-to-last generation (genotyped and phenotyped). The results show moderate to high accuracies of the predicted breeding values. A determination of predictor-specific degrees of smoothing increased the accuracy.

  9. The Yellow Brick Road: a values based curriculum model.

    Science.gov (United States)

    McLean, Christopher

    2012-05-01

    Within the United Kingdom, the Nursing and Midwifery Council (NMC) requires that nurses and midwives are of 'good character' at the point of registration. This paper sets out how good character has been conceptualised within one U.K. higher education institution and presents a model of "values based enquiry" which aims to develop the 'character' of students. The paper presents three qualities ("the heart", "the nerve" and "the brain") which represent 'good character' and which are believed to underpin values based Nursing or Midwifery practice. The development of these qualities is argued to be reliant upon helping students to develop intrinsic professional values of care and compassion. The role of these character qualities in nursing practice and education is outlined, as are the ways in which they have led to the development of a model for values based enquiry. This model represents a vision of the nature of professional education which may be shared by staff and students, whilst offering a model for learning and teaching based upon recognised educational principles. An argument is advanced that the adoption of a values based enquiry model may develop and nurture the habits of mind which are necessary for the development of 'good character'. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Model Checking Real-Time Value-Passing Systems

    Institute of Scientific and Technical Information of China (English)

    Jing Chen; Zio-Ning Cao

    2004-01-01

    In this paper, to model check real-time value-passing systems, a formal language called Timed Symbolic Transition Graph and a logic system named Timed Predicate μ-Calculus are proposed. An algorithm is presented which is local in that it generates and investigates the reachable state space in a top-down fashion and maintains the partition for time evaluations as coarse as possible while instantiating data variables on the fly. It can deal not only with data variables with a finite value domain, but also with the so-called data-independent variables with an infinite value domain. To the authors' knowledge, this is the first algorithm for model checking timed systems containing value-passing features.

  11. An interval-valued reliability model with bounded failure rates

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, Victor

    2012-01-01

    The approach to deriving interval-valued reliability measures described in this paper is distinctive from other imprecise reliability models in that it overcomes the issue of having to impose an upper bound on time to failure. It rests on the presupposition that a constant interval-valued failure...... function if only partial failure information is available. An example is provided. © 2012 Copyright Taylor and Francis Group, LLC....

  12. Continuous Spatial Process Models for Spatial Extreme Values

    KAUST Repository

    Sang, Huiyan

    2010-01-28

    We propose a hierarchical modeling approach for explaining a collection of point-referenced extreme values. In particular, annual maxima over space and time are assumed to follow generalized extreme value (GEV) distributions, with parameters μ, σ, and ξ specified in the latent stage to reflect underlying spatio-temporal structure. The novelty here is that we relax the conditional independence assumption in the first stage of the hierarchical model, an assumption which has been adopted in previous work. This assumption implies that realizations of the surface of spatial maxima will be everywhere discontinuous. For many phenomena including, e.g., temperature and precipitation, this behavior is inappropriate. Instead, we offer a spatial process model for extreme values that provides mean square continuous realizations, where the behavior of the surface is driven by the spatial dependence which is unexplained under the latent spatio-temporal specification for the GEV parameters. In this sense, the first stage smoothing is viewed as fine scale or short range smoothing while the larger scale smoothing will be captured in the second stage of the modeling. In addition, as would be desired, we are able to implement spatial interpolation for extreme values based on this model. A simulation study and a study on actual annual maximum rainfall for a region in South Africa are used to illustrate the performance of the model. © 2009 International Biometric Society.
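
    The hierarchical spatial smoothing described in this record is beyond a short sketch, but its first-stage building block, fitting a GEV distribution to block maxima at a single site, can be shown with scipy on synthetic data; note that scipy's shape parameter c equals minus the xi used in the extreme-value literature.

```python
from scipy import stats

# Hypothetical annual maximum rainfall (mm) at one site; real data would replace this.
annual_max = stats.genextreme.rvs(c=-0.1, loc=60, scale=15, size=80, random_state=42)

# Fit a GEV distribution to the block maxima.
c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_max)
xi_hat = -c_hat   # convert scipy's shape convention to the usual (mu, sigma, xi)

# 50-year return level: the quantile exceeded on average once every 50 years.
r50 = stats.genextreme.ppf(1 - 1 / 50, c_hat, loc=loc_hat, scale=scale_hat)
print(f"mu={loc_hat:.1f}, sigma={scale_hat:.1f}, xi={xi_hat:.2f}, 50-yr level={r50:.1f} mm")
```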

  13. Eigen values in epidemic and other bio-inspired models

    Science.gov (United States)

    Supriatna, A. K.; Anggriani, N.; Carnia, E.; Raihan, A.

    2017-08-01

    Eigenvalues, and the largest eigenvalue in particular, have special roles in many applications. In this paper we discuss their role in determining the epidemic threshold, from which we can determine whether an epidemic will eventually die out or blow up. Some examples and their consequences for controlling the epidemic are also discussed. Besides the application to epidemic models, the paper also discusses other examples of application in bio-inspired models, such as backcross breeding for two age classes of local and exotic goats. Here we give some elaborative examples on the use of a previous backcross breeding model. Some future directions on the exploration of the relationship between these eigenvalues and different epidemic models and other bio-inspired models are also presented.
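
    A small, self-contained example of the role the record assigns to the largest eigenvalue: the basic reproduction number R0 computed as the spectral radius of a next-generation matrix. The two-group transmission parameters below are hypothetical.

```python
import numpy as np

def basic_reproduction_number(F, V):
    """R0 as the spectral radius of the next-generation matrix F V^{-1},
    where F holds new-infection rates and V holds transition/removal rates
    between the infected compartments, both evaluated at the disease-free state."""
    ngm = F @ np.linalg.inv(V)
    return max(abs(np.linalg.eigvals(ngm)))

# Hypothetical two-group SIR-type example: beta[i, j] = transmission from group j to i,
# gamma = recovery rate (taken equal in both groups).
beta = np.array([[0.30, 0.10],
                 [0.05, 0.25]])
gamma = 0.2
F = beta                      # new infections
V = np.diag([gamma, gamma])   # removals
R0 = basic_reproduction_number(F, V)
print(f"R0 = {R0:.2f} ->", "epidemic grows" if R0 > 1 else "epidemic dies out")
```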

  14. [Spirographic reference values. Mathematical models and practical use (author's transl)].

    Science.gov (United States)

    Drouet, D; Kauffmann, F; Brille, D; Lellouch, J

    1980-01-01

    Various models predicting VC and FEV1 from age and height have been compared by both theoretical and practical approaches on several subgroups of a working population examined in 1960 and 1972. The models in which spirographic values are proportional to the cube of the height give a significantly worse fit of the data. All the other models give similar predicted values in practical terms, but cutoff points depend on the distributions of VC and FEV1 given age and height. Results show that these distributions are closer to a normal than to a lognormal distribution. The use of reference values and classical cutoffs is then discussed. Rather than using a single cutoff point, a more quantitative way is proposed to describe a subject's functional status, for example by situating him or her within the percentiles of the reference population. In screening, cutoff points cannot be chosen without first specifying the decision considered and the population concerned.

  15. Automatic modeling of the linguistic values for database fuzzy querying

    Directory of Open Access Journals (Sweden)

    Diana STEFANESCU

    2007-12-01

    Full Text Available In order to evaluate vague queries, each linguistic term is considered according to its fuzzy model. Usually, the linguistic terms are defined as fuzzy sets, during a classical knowledge acquisition off-line process. But they can also be automatically extracted from the actual content of the database, by an online process. In at least two situations, automatically modeling the linguistic values would be very useful: first, to simplify the knowledge engineer’s task by extracting the definitions from the database content; and second, where mandatory, to dynamically define the linguistic values in complex criteria queries evaluation. Procedures to automatically extract the fuzzy model of the linguistic values from the existing data are presented in this paper.

  16. Value increasing business model for e-hospital.

    Science.gov (United States)

    Null, Robert; Wei, June

    2009-01-01

    This paper developed a business value increasing model for electronic hospital (e-hospital) based on electronic value chain analysis. From this model, 58 hospital electronic business (e-business) solutions were developed. Additionally, this paper investigated the adoption patterns of these 58 e-business solutions within six leading US hospitals. The findings show that only 36 of 58 or 62% of the e-business solutions are fully or partially implemented within the six hospitals. Ultimately, the research results will be beneficial to managers and executives for accelerating e-business adoption in e-hospitals.

  17. Modeling the value of strategic actions in the superior colliculus

    Directory of Open Access Journals (Sweden)

    Dhushan Thevarajah

    2010-02-01

    Full Text Available In learning models of strategic game play, an agent constructs a valuation (action value) over possible future choices as a function of past actions and rewards. Choices are then stochastic functions of these action values. Our goal is to uncover a neural signal that correlates with the action value posited by behavioral learning models. We measured activity from neurons in the superior colliculus (SC), a midbrain region involved in planning saccadic eye movements, in monkeys while they performed two saccade tasks. In the strategic task, monkeys competed against a computer in a saccade version of the mixed-strategy game “matching-pennies”. In the instructed task, stochastic saccades were elicited through explicit instruction rather than free choices. In both tasks, neuronal activity and behavior were shaped by past actions and rewards with more recent events exerting a larger influence. Further, SC activity predicted upcoming choices during the strategic task and upcoming reaction times during the instructed task. Finally, we found that neuronal activity in both tasks correlated with an established learning model, the Experience Weighted Attraction model of action valuation (Ho, Camerer, and Chong, 2007). Collectively, our results provide evidence that action values hypothesized by learning models are represented in the motor planning regions of the brain in a manner that could be used to select strategic actions.

  18. Mean Value Engine Modelling of an SI Engine with EGR

    DEFF Research Database (Denmark)

    Føns, Michael; Müller, Martin; Chevalier, Alain

    1999-01-01

    Mean Value Engine Models (MVEMs) are simplified, dynamic engine models that are physically based. Such models are useful for control studies, for engine control system analysis and for model based engine control systems. Very few published MVEMs have included the effects of Exhaust Gas...... Recirculation (EGR). The purpose of this paper is to present a modified MVEM which includes EGR in a physical way. It has been tested using newly developed, very fast manifold pressure, manifold temperature, port and EGR mass flow sensors. Reasonable agreement has been obtained on an experimental engine...

  19. Can Participatory Action Research Create Value for Business Model Innovation?

    DEFF Research Database (Denmark)

    Sparre, Mogens; Rasmussen, Ole Horn; Fast, Alf Michael

    Abstract: Participatory Action Research (PAR) has a longer academic history compared with the idea of business models (BMs). This paper indicates how industries gain by using the combined methodology. The research question "Can participatory action research create value for Business Model...... Innovation (BMI)?” – has been investigated from five different perspectives based upon The Business Model Cube and The Where to Look Model. Using both established and newly developed tools the paper presents how. Theory and data from two cases are presented and it is demonstrated how industry increase...... their monetary and/or non-monetary value creation doing BMI based upon PAR. The process is essential and using the methodology of PAR creates meaning. Behind the process, the RAR methodology and its link to BM and BMI may contribute to theory construction and creation of a common language in academia around...

  20. Energy risk management and value at risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Mehdi Sadeghi; Saeed Shavvalpour [Imam Sadiq University, Tehran (Iran). Economics Dept.

    2006-12-15

    The value of energy trades can change over time with market conditions and underlying price variables. The rise of competition and deregulation in energy markets has led to relatively free energy markets that are characterized by high price shifts. Within oil markets the volatile oil price environment after OPEC agreements in the 1970s requires a risk quantification. 'Value-at-risk' has become an essential tool for this end when quantifying market risk. There are various methods for calculating value-at-risk. The methods introduced in this paper are Historical Simulation with ARMA Forecasting (HSAF) and the Variance-Covariance approach based on GARCH modeling. The results show that among the various approaches the HSAF methodology presents more efficient results, so that if the level of confidence is 99%, the value-at-risk calculated through the HSAF methodology is greater than the actual price changes in almost 97.6 percent of the forecasting period. (author)
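
    For orientation, the sketch below computes a 99% one-day VaR by historical simulation and by the variance-covariance (parametric normal) method on synthetic returns; the ARMA forecasting and GARCH conditional-volatility layers used in the record are deliberately omitted, so this is only the skeleton of those approaches.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
# Hypothetical daily log returns of an energy price series (the record works with oil prices).
returns = rng.standard_t(df=4, size=2000) * 0.02

confidence = 0.99

# Historical-simulation VaR: the empirical quantile of the loss distribution.
var_hist = -np.quantile(returns, 1 - confidence)

# Variance-covariance (parametric normal) VaR; a GARCH model would replace
# the constant volatility estimate below with a conditional one.
var_param = -(returns.mean() + norm.ppf(1 - confidence) * returns.std(ddof=1))

print(f"99% 1-day VaR  historical: {var_hist:.3%}  parametric: {var_param:.3%}")
```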

  1. Energy risk management and value at risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sadeghi, Mehdi [Economics department, Imam Sadiq University, P.B. 14655-159, Tehran (Iran, Islamic Republic of)]. E-mail: sadeghi@isu.ac.ir; Shavvalpour, Saeed [Economics department, Imam Sadiq University, P.B. 14655-159, Tehran (Iran, Islamic Republic of)]. E-mail: shavalpoor@isu.ac.ir

    2006-12-15

    The value of energy trades can change over time with market conditions and underlying price variables. The rise of competition and deregulation in energy markets has led to relatively free energy markets that are characterized by high price shifts. Within oil markets the volatile oil price environment after OPEC agreements in the 1970s requires a risk quantification. 'Value-at-risk' has become an essential tool for this end when quantifying market risk. There are various methods for calculating value-at-risk. The methods introduced in this paper are Historical Simulation with ARMA Forecasting (HSAF) and the Variance-Covariance approach based on GARCH modeling. The results show that among the various approaches the HSAF methodology presents more efficient results, so that if the level of confidence is 99%, the value-at-risk calculated through the HSAF methodology is greater than the actual price changes in almost 97.6 percent of the forecasting period.

  2. A model for measuring value for money in professional sports

    Directory of Open Access Journals (Sweden)

    Vlad ROŞCA

    2013-07-01

    Full Text Available Few, if any, sports teams measure the entertainment value they provide to fans in exchange for the money the latter spend on admission fees. The scientific literature overlooks the issue as well. The aim of this paper is to present a model that can be used for calculating value for money in the context of spectator sports. The research question asks how value for money can be conceptualized and measured for sports marketing purposes. Using financial and sporting variables, the method calculates how much money, on average, a fan had to spend to receive quality entertainment – defined as won matches – from his favorite team during the last season of the Romanian first division football championship. The results only partially confirm the research hypothesis, showing that not just price and sporting performance may influence the value delivered to fans, but other factors as well.

  3. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks.

  4. Values and uncertainties in the predictions of global climate models.

    Science.gov (United States)

    Winsberg, Eric

    2012-06-01

    Over the last several years, there has been an explosion of interest and attention devoted to the problem of Uncertainty Quantification (UQ) in climate science-that is, to giving quantitative estimates of the degree of uncertainty associated with the predictions of global and regional climate models. The technical challenges associated with this project are formidable, and so the statistical community has understandably devoted itself primarily to overcoming them. But even as these technical challenges are being met, a number of persistent conceptual difficulties remain. So why is UQ so important in climate science? UQ, I would like to argue, is first and foremost a tool for communicating knowledge from experts to policy makers in a way that is meant to be free from the influence of social and ethical values. But the standard ways of using probabilities to separate ethical and social values from scientific practice cannot be applied in a great deal of climate modeling, because the roles of values in creating the models cannot be discerned after the fact-the models are too complex and the result of too much distributed epistemic labor. I argue, therefore, that typical approaches for handling ethical/social values in science do not work well here.

  5. IT Business Value Model for Information Intensive Organizations

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Gastaud Maçada

    2012-01-01

    Full Text Available Many studies have highlighted the capacity Information Technology (IT) has for generating value for organizations. Investments in IT made by organizations have increased each year. Therefore, the purpose of the present study is to analyze the IT Business Value for Information Intensive Organizations (IIO), e.g. banks, insurance companies and securities brokers. The research method consisted of a survey that used and combined the models from Weill and Broadbent (1998) and Gregor, Martin, Fernandez, Stern and Vitale (2006). Data was gathered using an adapted instrument containing 5 dimensions (Strategic, Informational, Transactional, Transformational and Infra-structure) with 27 items. The instrument was refined by employing statistical techniques such as Exploratory and Confirmatory Factorial Analysis through Structural Equations (first and second order measurement models). The final model is composed of four factors related to IT Business Value: Strategic, Informational, Transactional and Transformational, arranged in 15 items. The dimension Infra-structure was excluded during the model refinement process because it was discovered during interviews that managers were unable to perceive it as a distinct dimension of IT Business Value.

  6. Value-Added Models: What the Experts Say

    Science.gov (United States)

    Amrein-Beardsley, Audrey; Pivovarova, Margarita; Geiger, Tray J.

    2016-01-01

    Being an expert involves explaining how things are supposed to work, and, perhaps more important, why things might not work as supposed. In this study, researchers surveyed scholars with expertise in value-added models (VAMs) to solicit their opinions about the uses and potential of VAMs for teacher-level accountability purposes (for example, in…

  7. Cultivating a disease management partnership: a value-chain model.

    Science.gov (United States)

    Murray, Carolyn F; Monroe, Wendy; Stalder, Sharon A

    2003-01-01

    Disease management (DM) is one of the health care industry's more innovative value-chain models, whereby multiple relationships are created to bring complex and time-sensitive services to market. The very nature of comprehensive, seamless DM provided through an outsourced arrangement necessitates a level of cooperation, trust, and synergy that may be lacking from more traditional vendor-customer relationships. This discussion highlights the experience of one health plan and its vendor partner and their approach to the development and delivery of an outsourced heart failure (HF) DM program. The program design and rollout are discussed within principles adapted from the theoretical framework of a value-chain model. Within the value-chain model, added value is created by the convergence and synergistic integration of the partners' discrete strengths. Although each partner brings unique attributes to the relationship, those attributes are significantly enhanced by the value-chain model, thus allowing each party to bring the added value of the relationship to their respective customers. This partnership increases innovation, leverages critical capabilities, and improves market responsiveness. Implementing a comprehensive, outsourced DM program is no small task. DM programs incorporate a broad array of services affecting nearly every department in a health plan's organization. When true seamless integration between multiple organizations with multiple stakeholders is the objective, implementation and ongoing operations can become even more complex. To effectively address the complexities presented by an HF DM program, the parties in this case moved beyond a typical purchaser-vendor relationship to one that is more closely akin to a strategic partnership. This discussion highlights the development of this partnership from the perspective of both organizations, as revealed through contracting and implementation activities. It is intended to provide insight into the program

  8. Empirical likelihood-based evaluations of Value at Risk models

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Value at Risk (VaR) is a basic and very useful tool in measuring market risks. Numerous VaR models have been proposed in the literature. Therefore, it is of great interest to evaluate the efficiency of these models, and to select the most appropriate one. In this paper, we propose to use the empirical likelihood approach to evaluate these models. Simulation results and real life examples show that the empirical likelihood method is more powerful and more robust than some of the asymptotic methods available in the literature.

  9. Shanghai Stock Prices as Determined by the Present Value Model

    OpenAIRE

    Gregory C. Chow

    2003-01-01

    Derived from the present-value model of stock prices, our model implies that the log stock price is a linear function of expected log dividends and the expected rate of growth of dividends where expectations are formed adaptively. The model explains very well the prices of 47 stocks traded on the Shanghai Stock Exchange observed at the beginning of 1996, 1997, and 1998. The estimated parameters are remarkably similar to those reported for stocks traded on the Hong Kong Stock Exchange and the ...

  10. The heuristic value of redundancy models of aging.

    Science.gov (United States)

    Boonekamp, Jelle J; Briga, Michael; Verhulst, Simon

    2015-11-01

    Molecular studies of aging aim to unravel the cause(s) of aging bottom-up, but linking these mechanisms to organismal level processes remains a challenge. We propose that complementary top-down data-directed modelling of organismal level empirical findings may contribute to developing these links. To this end, we explore the heuristic value of redundancy models of aging to develop a deeper insight into the mechanisms causing variation in senescence and lifespan. We start by showing (i) how different redundancy model parameters affect projected aging and mortality, and (ii) how variation in redundancy model parameters relates to variation in parameters of the Gompertz equation. Lifestyle changes or medical interventions during life can modify mortality rate, and we investigate (iii) how interventions that change specific redundancy parameters within the model affect subsequent mortality and actuarial senescence. Lastly, as an example of data-directed modelling and the insights that can be gained from this, (iv) we fit a redundancy model to mortality patterns observed by Mair et al. (2003; Science 301: 1731-1733) in Drosophila that were subjected to dietary restriction and temperature manipulations. Mair et al. found that dietary restriction instantaneously reduced mortality rate without affecting aging, while temperature manipulations had more transient effects on mortality rate and did affect aging. We show that after adjusting model parameters the redundancy model describes both effects well, and a comparison of the parameter values yields a deeper insight in the mechanisms causing these contrasting effects. We see replacement of the redundancy model parameters by more detailed sub-models of these parameters as a next step in linking demographic patterns to underlying molecular mechanisms.
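
    A minimal sketch of the kind of redundancy model referred to here: a system of critical blocks in series, each containing redundant non-aging elements, whose failure produces an accelerating, Gompertz-like mortality curve. Block counts, redundancy levels and failure rates below are hypothetical, and no fit to the Drosophila data is attempted.

```python
import numpy as np

def redundancy_survival(t, m_blocks, n_elements, k):
    """Survival of a system of m_blocks critical blocks in series, each block holding
    n_elements redundant elements that fail independently at constant rate k."""
    block_surv = 1.0 - (1.0 - np.exp(-k * t)) ** n_elements
    return block_surv ** m_blocks

t = np.linspace(1, 100, 200)
S = redundancy_survival(t, m_blocks=500, n_elements=5, k=0.02)

# Hazard (mortality) rate h(t) = -d ln S / dt, estimated numerically.
h = -np.gradient(np.log(S), t)

# The hazard accelerates with age; fitting a line to the log-hazard over an age
# window gives an approximate Gompertz slope ("aging rate") for that window.
slope = np.polyfit(t[10:100], np.log(h[10:100]), 1)[0]
print(f"approximate Gompertz slope of the log-hazard: {slope:.3f} per time unit")
```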

  11. A holistic model for Islamic accountants and its value added

    OpenAIRE

    El-Halaby, Sherif; Hussainey, Khaled

    2015-01-01

    Purpose – The core objective of this study is to introduce a holistic model for Islamic accountants by exploring the perspectives of Muslim scholars, Islamic sharia and AAOIFI ethical standards. The study also contributes to the existing literature by exploring the main added value of the Muslim accountant towards stakeholders through investigating the main roles of Islamic accountants. Design/methodology/approach – The paper critically reviews historical debates about Islamic accounting and t...

  12. The Deficit Model and the Forgotten Moral Values

    Directory of Open Access Journals (Sweden)

    Marko Ahteensuu

    2011-03-01

    Full Text Available This paper was presented at the first meeting of the NSU study group “Conceptions of ethical and social values in post-secular society: Towards a new ethical imagination in a cosmopolitan world society”, held on January 28-30, 2011 at Copenhagen Business School. The deficit model explains the general public’s negative attitudes towards science and/or certain scientific applications with the public’s scientific ignorance. The deficit model is commonly criticized for oversimplifying the connection between scientific knowledge and attitudes. Other relevant factors – such as ideology, social identity, trust, culture, and worldviews – should be taken into consideration to a greater extent. We argue that explanations based on the proposed factors sometimes implicitly reintroduce the deficit model type of thinking. The strength of the factors is that they broaden the explanations to concern moral issues. We analyse two central argument types of GMO discussion, and show the central role of moral values in them. Thus, as long as arguments are seen to affect the attitudes of the general public, the role of moral values should be made explicit in the explanations concerning their attitudes.

  13. AskIT Service Desk Support Value Model

    Energy Technology Data Exchange (ETDEWEB)

    Ashcraft, Phillip Lynn [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cummings, Susan M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fogle, Blythe G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Valdez, Christopher D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-07

    The value model discussed herein provides an accurate and simple calculation of the funding required to adequately staff the AskIT Service Desk (SD).  The model is incremental – only technical labor cost is considered.  All other costs, such as management, equipment, buildings, HVAC, and training are considered common elements of providing any labor related IT Service. Depending on the amount of productivity loss and the number of hours the defect was unresolved, the value of resolving work from the SD is unquestionably an economic winner; the average cost of $16 per SD resolution can commonly translate to cost avoidance exceeding well over $100. Attempting to extract too much from the SD will likely create a significant downside. The analysis used to develop the value model indicates that the utilization of the SD is very high (approximately 90%).  As a benchmark, consider a comment from a manager at Vitalyst (a commercial IT service desk) that their utilization target is approximately 60%.  While high SD utilization is impressive, over the long term it is likely to cause unwanted consequences to staff such as higher turnover, illness, or burnout.  A better solution is to staff the SD so that analysts have time to improve skills through training, develop knowledge, improve processes, collaborate with peers, and improve customer relationship skills.

  14. Conceptual Model of Quantities, Units, Dimensions, and Values

    Science.gov (United States)

    Rouquette, Nicolas F.; DeKoenig, Hans-Peter; Burkhart, Roger; Espinoza, Huascar

    2011-01-01

    JPL collaborated with experts from industry and other organizations to develop a conceptual model of quantities, units, dimensions, and values based on the current work of the ISO 80000 committee revising the International System of Units & Quantities based on the International Vocabulary of Metrology (VIM). By providing support for ISO 80000 in SysML via the International Vocabulary of Metrology (VIM), this conceptual model provides, for the first time, a standard-based approach for addressing issues of unit coherence and dimensional analysis within the practice of systems engineering with SysML-based tools. This conceptual model provides support for two kinds of analyses specified in the International Vocabulary of Metrology (VIM): coherence of units as well as of systems of units, and dimension analysis of systems of quantities. To provide a solid and stable foundation, the model for defining quantities, units, dimensions, and values in SysML is explicitly based on the concepts defined in VIM. At the same time, the model library is designed in such a way that extensions to the ISQ (International System of Quantities) and SI Units (Système International d'Unités) can be represented, as well as any alternative systems of quantities and units. The model library can be used to support SysML user models in various ways. A simple approach is to define and document libraries of reusable systems of units and quantities for reuse across multiple projects, and to link units and quantity kinds from these libraries to Unit and QuantityKind stereotypes defined in SysML user models.

  15. An infinitesimal model for quantitative trait genomic value prediction.

    Directory of Open Access Journals (Sweden)

    Zhiqiu Hu

    Full Text Available We developed a marker based infinitesimal model for quantitative trait analysis. In contrast to the classical infinitesimal model, we now have new information about the segregation of every individual locus of the entire genome. Under this new model, we propose that the genetic effect of an individual locus is a function of the genome location (a continuous quantity). The overall genetic value of an individual is the weighted integral of the genetic effect function along the genome. Numerical integration is performed to find the integral, which requires partitioning the entire genome into a finite number of bins. Each bin may contain many markers. The integral is approximated by the weighted sum of all the bin effects. We now turn the problem of marker analysis into bin analysis so that the model dimension has decreased from a virtual infinity to a finite number of bins. This new approach can efficiently handle a virtually unlimited number of markers without marker selection. The marker based infinitesimal model requires high linkage disequilibrium of all markers within a bin. For populations with low or no linkage disequilibrium, we develop an adaptive infinitesimal model. Both the original and the adaptive models are tested using simulated data as well as beef cattle data. The simulated data analysis shows that there is always an optimal number of bins at which the predictability of the bin model is much greater than the original marker analysis. The result of the beef cattle data analysis indicates that the bin model can increase the predictability from 10% (multiple marker analysis) to 33% (multiple bin analysis). The marker based infinitesimal model paves a way towards the solution of genetic mapping and genomic selection using whole genome sequence data.
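
    The core mechanical step of the record, collapsing many markers into a moderate number of bins and predicting genomic values from bin covariates, can be sketched as follows. Ridge regression and simple within-bin genotype sums stand in for the record's own estimation machinery, and the genotypes and phenotypes are simulated.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_ind, n_markers, n_bins = 600, 3000, 100

# Hypothetical marker genotypes coded 0/1/2 and a polygenic phenotype.
geno = rng.binomial(2, 0.3, size=(n_ind, n_markers)).astype(float)
true_eff = rng.normal(0, 0.05, n_markers)
pheno = geno @ true_eff + rng.normal(0, 1.0, n_ind)

# Collapse markers into consecutive bins; each bin is summarised here by the
# sum of its marker genotypes (which implicitly assumes high within-bin LD).
bins = np.array_split(np.arange(n_markers), n_bins)
bin_geno = np.column_stack([geno[:, idx].sum(axis=1) for idx in bins])

# Compare predictability of per-marker and per-bin covariates via cross-validation.
for X, label in [(geno, "per-marker"), (bin_geno, f"{n_bins} bins")]:
    r2 = cross_val_score(Ridge(alpha=10.0), X, pheno, cv=5, scoring="r2").mean()
    print(f"{label:>12}: mean CV R^2 = {r2:.2f}")
```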

  16. Value stream mapping in a computational simulation model

    Directory of Open Access Journals (Sweden)

    Ricardo Becker Mendes de Oliveira

    2014-08-01

    Full Text Available The decision-making process has been extensively studied by researchers and executives. This paper aims to use the methodology of Value Stream Mapping (VSM) in an integrated manner with a computer simulation model, in order to expand managers' decision-making vision. The object of study is based on a production system that involves a process of automatic packaging of products, where it became necessary to implement changes in order to accommodate new products, so that the detection of bottlenecks and the visualization of impacts generated by future modifications are necessary. The simulation aims to support managers' decisions, considering that the system involves several variables whose behaviors define the complexity of the process. The main results were a significant reduction in project costs, obtained by anticipating system behavior, together with the use of Value Stream Mapping to identify activities that do or do not add value to the process. The validation of the simulation model occurs with the current map of the system and with the inclusion of Kaizen events, so that waste in future maps is found in a practical and reliable way, which can support decision-making.

  17. Modeling Stakeholder/Value Dependency through Mean Failure Cost

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia; Abercrombie, Robert K [ORNL; Sheldon, Frederick T [ORNL; Mili, Ali [New Jersey Insitute of Technology

    2010-01-01

    In an earlier series of works, Boehm et al. discuss the nature of information system dependability and highlight the variability of system dependability according to stakeholders. In a recent paper, the dependency patterns of this model are analyzed. In our recent works, we presented a stakeholder dependent quantitative security model, where we quantify security for a given stakeholder by the mean of the loss incurred by the stakeholder as a result of security threats. We show how this mean can be derived from the security threat configuration (represented as a vector of probabilities that reflect the likelihood of occurrence of the various security threats). We refer to our security metric as MFC, for Mean Failure Cost. In this paper, we analyze Boehm's model from the standpoint of the proposed metric, and show whether, to what extent, and how our metric addresses the issues raised by Boehm's Stakeholder/Value definition of system dependability.
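
    The arithmetic core of the Mean Failure Cost idea is a stakes-times-probability product. In the sketch below the requirement failure probabilities are supplied directly rather than derived from threat, dependency and impact matrices as in the full model, and all dollar figures are hypothetical.

```python
import numpy as np

# Stakes matrix ST: rows = stakeholders, columns = system requirements;
# each entry is the loss (say, $ per hour of operation) that the stakeholder
# incurs if that requirement fails.  Values are purely hypothetical.
ST = np.array([[900.0, 200.0,  50.0],    # operator
               [300.0, 800.0,  10.0],    # customer
               [100.0,  50.0, 400.0]])   # regulator

# PR: probability that each requirement fails per unit time.  In the full MFC
# model this vector is derived from threat probabilities; here it is given.
PR = np.array([0.002, 0.010, 0.005])

MFC = ST @ PR    # mean failure cost per stakeholder
for name, cost in zip(["operator", "customer", "regulator"], MFC):
    print(f"{name:>9}: mean failure cost = ${cost:.2f}/h")
```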

  18. Value of the distant future: Model-independent results

    Science.gov (United States)

    Katz, Yuri A.

    2017-01-01

    This paper shows that the model-independent account of correlations in an interest rate process or a log-consumption growth process leads to declining long-term tails of discount curves. Under the assumption of an exponentially decaying memory in fluctuations of risk-free real interest rates, I derive the analytical expression for an apt value of the long run discount factor and provide a detailed comparison of the obtained result with the outcome of the benchmark risk-free interest rate models. Utilizing the standard consumption-based model with an isoelastic power utility of the representative economic agent, I derive the non-Markovian generalization of the Ramsey discounting formula. The obtained analytical results, which allow simple calibration, may augment rigorous cost-benefit and regulatory impact analysis of long-term environmental and infrastructure projects.
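
    The declining long-term discount rates discussed here can be reproduced numerically: simulating a mean-reverting short rate with exponentially decaying autocorrelation and averaging the stochastic discount factor across paths yields a certainty-equivalent rate that falls with horizon. Parameter values in the sketch are hypothetical and the result is a generic illustration, not the paper's closed-form expression.

```python
import numpy as np

rng = np.random.default_rng(4)
r0, r_bar, kappa, sigma = 0.04, 0.03, 0.1, 0.02   # hypothetical short-rate parameters
dt, horizon, n_paths = 1.0, 200, 20000

# Mean-reverting (Ornstein-Uhlenbeck) short rate, whose autocorrelation decays
# exponentially with lag, as assumed in the record.
rates = np.full(n_paths, r0)
integral = np.zeros(n_paths)
cert_equiv_rate = []
for t in range(1, horizon + 1):
    rates += kappa * (r_bar - rates) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    integral += rates * dt
    D_t = np.exp(-integral).mean()        # certainty-equivalent discount factor E[exp(-int r)]
    cert_equiv_rate.append(-np.log(D_t) / t)

print("effective discount rate at 10y, 50y, 200y:",
      [f"{cert_equiv_rate[y - 1]:.3%}" for y in (10, 50, 200)])
```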

  19. Selecting the optimal method to calculate daily global reference potential evaporation from CFSR reanalysis data for application in a hydrological model study

    Directory of Open Access Journals (Sweden)

    F. C. Sperna Weiland

    2012-03-01

    Full Text Available Potential evaporation (PET) is one of the main inputs of hydrological models. Yet, there is limited consensus on which PET equation is most applicable in hydrological climate impact assessments. In this study six different methods to derive global scale reference PET daily time series from Climate Forecast System Reanalysis (CFSR) data are compared: Penman-Monteith, Priestley-Taylor and original and re-calibrated versions of the Hargreaves and Blaney-Criddle method. The calculated PET time series are (1) evaluated against global monthly Penman-Monteith PET time series calculated from CRU data and (2) tested on their usability for modeling of global discharge cycles.

    A major finding is that for part of the investigated basins the selection of a PET method may have only a minor influence on the resulting river flow. Within the hydrological model used in this study the bias related to the PET method tends to decrease while going from PET, AET and runoff to discharge calculations. However, the performance of individual PET methods appears to be spatially variable, which stresses the necessity to select the most accurate and spatially stable PET method. The lowest root mean squared differences and the least significant deviations (95% significance level) between monthly CFSR derived PET time series and CRU derived PET were obtained for a cell-specific re-calibrated Blaney-Criddle equation. However, results show that this re-calibrated form is likely to be unstable under changing climate conditions and less reliable for the calculation of daily time series. Although often recommended, the Penman-Monteith equation applied to the CFSR data did not outperform the other methods in an evaluation against PET derived with the Penman-Monteith equation from CRU data. In arid regions (e.g. Sahara, central Australia, US deserts), the equation resulted in relatively low PET values and, consequently, led to relatively high discharge values for dry basins (e
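
    Of the six methods compared in this record, the (uncalibrated) Hargreaves equation is compact enough to show directly. Extraterrestrial radiation Ra is passed in as an input here, whereas in practice it would be computed from latitude and day of year, and the per-cell re-calibration discussed in the record is not included; the sample values are hypothetical.

```python
import numpy as np

def hargreaves_pet(tmax, tmin, ra):
    """Daily reference PET (mm/day) from the Hargreaves equation:
    PET = 0.0023 * (Ra / lambda) * (Tmean + 17.8) * sqrt(Tmax - Tmin),
    with Ra the extraterrestrial radiation in MJ m-2 day-1 and
    lambda = 2.45 MJ kg-1 the latent heat of vaporisation."""
    tmean = (tmax + tmin) / 2.0
    return 0.0023 * (ra / 2.45) * (tmean + 17.8) * np.sqrt(np.maximum(tmax - tmin, 0.0))

# Hypothetical reanalysis-style daily fields for one grid cell.
tmax = np.array([31.0, 29.5, 33.2])      # deg C
tmin = np.array([18.0, 17.2, 20.1])      # deg C
ra   = np.array([38.0, 37.5, 39.0])      # MJ m-2 day-1, normally from latitude and day of year
print("Hargreaves PET (mm/day):", hargreaves_pet(tmax, tmin, ra).round(2))
```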

  20. Markov Chains Used to Determine the Model of Stock Value and Compared with Other New Models of Stock Value (P/E Model and Ohlson Model)

    Directory of Open Access Journals (Sweden)

    Abbasali Pouraghajan

    2012-10-01

    Full Text Available The aim of this study is to compare three models for the valuation of stocks on the Tehran Stock Exchange: the P/E model, the Ohlson (residual income) model, and a Markov chain model. In their study, the researchers first calculated share valuations under the first two models and then calculated the value under the Markov chain model in order to make a comparison. The results show that in almost all cases there is no significant difference in the explanatory power of these models in determining share value, so investors in the Tehran exchange market can use any of the three models to assess shares. In most cases, however, the residual income model, which shows a lower standard error of regression, can be considered a somewhat better model for determining company value; the main reason may be the high overall explanatory power of its two explanatory variables, profit and the book value of shareholders' equity, combined through the overall accounting relation, in comparison with the two other models.
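
    The Markov-chain part of the comparison can be sketched as follows: estimate a transition matrix between discrete price states, take its stationary distribution, and read off a long-run expected value. States, prices and transition probabilities below are hypothetical, and the P/E and Ohlson calculations are not reproduced.

```python
import numpy as np

# Hypothetical transition matrix between three price states (down, flat, up),
# estimated in practice from historical state-to-state counts.
P = np.array([[0.50, 0.30, 0.20],
              [0.25, 0.50, 0.25],
              [0.20, 0.30, 0.50]])
state_values = np.array([90.0, 100.0, 115.0])   # representative price per state

# The stationary distribution pi solves pi P = pi: take the left eigenvector of P
# for eigenvalue 1 and normalise it.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

print("stationary distribution:", pi.round(3))
print("long-run expected stock value:", round(float(pi @ state_values), 2))
```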

  1. Classification of missing values in spatial data using spin models

    CERN Document Server

    Žukovič, Milan; 10.1103/PhysRevE.80.011116

    2013-01-01

    A problem of current interest is the estimation of spatially distributed processes at locations where measurements are missing. Linear interpolation methods rely on the Gaussian assumption, which is often unrealistic in practice, or normalizing transformations, which are successful only for mild deviations from the Gaussian behavior. We propose to address the problem of missing values estimation on two-dimensional grids by means of spatial classification methods based on spin (Ising, Potts, clock) models. The "spin" variables provide an interval discretization of the process values, and the spatial correlations are captured in terms of interactions between the spins. The spins at the unmeasured locations are classified by means of the "energy matching" principle: the correlation energy of the entire grid (including prediction sites) is estimated from the sample-based correlations. We investigate the performance of the spin classifiers in terms of computational speed, misclassification rate, class histogram an...

  2. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country-specific needs. Currently, these difficulties are handled in most major ERP systems by customising and localising the native code of the ERP systems for each specific country and industry. We propose an alternative that uses logical modeling of VAT legislation. The potential benefit is to eventually transform......

  3. Teaching Incision and Drainage: Perceived Educational Value of Abscess Models.

    Science.gov (United States)

    Adams, Cynthia M; Nigrovic, Lise E; Hayes, Gavin; Weinstock, Peter H; Nagler, Joshua

    2017-07-17

    Incision and drainage (I&D) of skin abscesses is an important procedural skill for pediatric emergency medicine providers. Practical skills training using simulation provides an opportunity to learn and gain confidence with this invasive procedure. Our objective was to assess the perceived educational value of 2 versions of an abscess model as part of an educational workshop for teaching I&D. A combined didactic and practical skills workshop was developed for use at 2 national conferences. The didactic content was created through an iterative process. To facilitate hands-on training, 2 versions of an abscess model were created: 1 constructed from a negative mold and the other using a 3-dimensional printer. Participants were surveyed regarding prior experience with I&D, procedural confidence, and perceptions of the educational utility of the models. Seventy physicians and 75 nurse practitioners participated in the study. Procedural confidence improved after training using each version of the model, with the greatest improvements noted among novice learners. Ninety-four percent of physicians, and 99% of nurse practitioners rated the respective models as either "educational" or "very educational," and 97% and 100%, respectively, would recommend the abscess models to others. A combined didactic and practical skills educational workshop using novel abscess models was effective at improving learners' confidence. Our novel models provide an effective strategy for teaching procedural skills such as I&D and demonstrate a novel use of 3-dimensional printers in medical education. Further study is needed to determine if these educational gains translate into improvement in clinical performance or patient outcomes.

  4. Model Checking with Edge-Valued Decision Diagrams

    Science.gov (United States)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library. We provide efficient algorithms for manipulating EVMDDs and review the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools. Compared to the CUDD package, our tool is several orders of magnitude faster.

  5. Face image modeling by multilinear subspace analysis with missing values.

    Science.gov (United States)

    Geng, Xin; Smith-Miles, Kate; Zhou, Zhi-Hua; Wang, Liang

    2011-06-01

    Multilinear subspace analysis (MSA) is a promising methodology for pattern-recognition problems due to its ability to decompose the data formed from the interaction of multiple factors. The MSA requires a large training set, which is well organized in a single tensor, which consists of data samples with all possible combinations of the contributory factors. However, such a "complete" training set is difficult (or impossible) to obtain in many real applications. The missing-value problem is therefore crucial to the practicality of the MSA but has hardly been investigated up to the present. To solve the problem, this paper proposes an algorithm named M(2)SA, which is advantageous in real applications due to the following: 1) it inherits the ability of the MSA to decompose the interlaced semantic factors; 2) it does not depend on any assumptions on the data distribution; and 3) it can deal with a high percentage of missing values. M(2)SA is evaluated by face image modeling on two typical multifactorial applications, i.e., face recognition and facial age estimation. Experimental results show the effectiveness of M(2)SA even when the majority of the values in the training tensor are missing.

  6. Modeling Value Chain Analysis of Distance Education using UML

    Science.gov (United States)

    Acharya, Anal; Mukherjee, Soumen

    2010-10-01

    Distance education continues to grow as a methodology for the delivery of course content in higher education in India as well as abroad. To manage this growing demand and to provide flexibility, strategic planning about the use of ICT tools is required. Value chain analysis is a framework for breaking down the sequence of business functions into a set of activities through which utility can be added to a service. Thus it can help to determine the competitive advantage that is enjoyed by an institute. In this paper, the first section defines a framework for value chain analysis and highlights its advantages. The second section gives a brief overview of related work in this field. The third section gives a brief discussion on distance education. The fourth section very briefly introduces UML. The fifth section models the value chain of distance education using UML. Finally we discuss the limitations and the problems posed in this domain.

  7. Value-at-Risk-Estimation in the Mexican Stock Exchange Using Conditional Heteroscedasticity Models and Theory of Extreme Values

    OpenAIRE

    Alejandro Iván Aguirre Salado; Humberto Vaquera Huerta; Martha Elva Ramírez Guzmán; José René Valdez Lazalde; Carlos Arturo Aguirre Salado

    2013-01-01

    This work proposes an approach for estimating the value at risk (VaR) of the Mexican stock exchange index (IPC) by combining autoregressive moving average (ARMA) models; three models from the ARCH family, one symmetric (GARCH) and two asymmetric (GJR-GARCH and EGARCH); and extreme value theory (EVT). The ARMA models were initially used to obtain uncorrelated residuals, which were later used for the analysis of extreme values. The GARCH, EGARCH and GJR-GARCH models, by...
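    The two-stage GARCH-EVT idea summarized above can be illustrated with a short sketch. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the conditional mean and volatility forecasts have already been obtained from an ARMA-GARCH fit, fits a Generalized Pareto Distribution to the loss tail of the standardized residuals with scipy, and combines the pieces into a one-step-ahead VaR estimate. Variable names are placeholders.

        import numpy as np
        from scipy.stats import genpareto

        def evt_var(std_resid, mu_next, sigma_next, alpha=0.01, tail_frac=0.10):
            """One-step-ahead VaR from standardized residuals via a GPD tail fit (POT)."""
            losses = -np.asarray(std_resid)               # work with the loss tail
            u = np.quantile(losses, 1.0 - tail_frac)      # tail threshold
            exceed = losses[losses > u] - u               # exceedances over the threshold
            xi, _, beta = genpareto.fit(exceed, floc=0.0) # GPD shape and scale (assumes xi != 0)
            n, n_u = len(losses), len(exceed)
            # high quantile of the standardized loss distribution (peaks-over-threshold formula)
            z_q = u + (beta / xi) * ((n / n_u * alpha) ** (-xi) - 1.0)
            return -(mu_next - sigma_next * z_q)          # VaR expressed as a positive loss

        # toy usage with simulated heavy-tailed residuals
        rng = np.random.default_rng(0)
        z = rng.standard_t(df=5, size=2000) / np.sqrt(5 / 3)   # unit-variance t residuals
        print(evt_var(z, mu_next=0.0005, sigma_next=0.012, alpha=0.01))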

  8. On the added value of WUDAPT for Urban Climate Modelling

    Science.gov (United States)

    Brousse, Oscar; Martilli, Alberto; Mills, Gerald; Bechtel, Benjamin; Hammerberg, Kris; Demuzere, Matthias; Wouters, Hendrik; Van Lipzig, Nicole; Ren, Chao; Feddema, Johannes J.; Masson, Valéry; Ching, Jason

    2017-04-01

    Over half of the planet's population now lives in cities, a share expected to grow to 65% by 2050 (United Nations, 2014), with most of the growth in new emerging cities of the global South. Cities' impact on climate is known to be a key driver of environmental change (IPCC, 2014) and has been studied for decades (Howard, 1875). Still, very little is known about the structure of cities around the world, which prevents urban climate simulations from being carried out and hence guidance from being provided for mitigation. To bridge this urban knowledge gap for urban climate modelling, the World Urban Database and Access Portal Tool - WUDAPT - project (Ching et al., 2015; Mills et al., 2015) developed an innovative technique to map cities globally, rapidly and freely. The framework established by Bechtel and Daneke (2012) derives Local Climate Zone (Stewart and Oke, 2012) city maps from LANDSAT 8 OLI-TIRS imagery (Bechtel et al., 2015) through supervised classification with a Random Forest algorithm (Breiman, 2001). The first attempt to implement Local Climate Zones (LCZ) from the WUDAPT product within a major climate model was carried out by Brousse et al. (2016) over Madrid, Spain. This study proved the applicability of LCZs as an enhanced urban parameterization within the WRF model (Chen et al. 2011) employing the urban canopy model BEP-BEM (Martilli, 2002; Salamanca et al., 2010), using the averaged values of the morphological and physical parameter ranges proposed by Stewart and Oke (2012). Other studies have since used Local Climate Zones for urban climate modelling (Alexander et al., 2016; Wouters et al. 2016; Hammerberg et al., 2017; Brousse et al., 2017) and demonstrated the added value of the WUDAPT dataset. As urban data accessibility is one of the major challenges for simulations in emerging countries, this presentation will show results of simulations using LCZs and the capacity of the WUDAPT framework to be

  9. Model-Checking with Edge-Valued Decision Diagrams

    Science.gov (United States)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library along with state-of-the-art algorithms for building the transition relation and the state space of discrete state systems. We provide efficient algorithms for manipulating EVMDDs and give upper bounds of the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools: EVMDDs for encoding arithmetic expressions, identity-reduced MDDs for representing the transition relation, and the saturation algorithm for reachability analysis. We compare our new symbolic model checking EVMDD library with the widely used CUDD package and show that, in many cases, our tool is several orders of magnitude faster than CUDD.
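    As a rough illustration of the edge-valued idea (not the library or the algorithms described in the paper), the sketch below builds an additively edge-valued decision diagram for a small arithmetic function over finite-domain variables: each edge carries an offset, children are normalized so that their minimum offset is zero, and identical subgraphs are shared through a unique table. The function, domains and names are illustrative assumptions.

        DOMAINS = {"x": 3, "y": 4}          # finite variable domains (illustrative)
        ORDER = ["x", "y"]                  # fixed variable order

        def f(assign):                      # example arithmetic function to encode
            return 2 * assign["x"] + 3 * assign["y"]

        _unique = {}                        # unique table: edge tuple -> shared node id

        def _node(edges):
            return _unique.setdefault(edges, len(_unique))

        def build(level=0, partial=()):
            """Return (offset, node_id) encoding f restricted by the partial assignment."""
            if level == len(ORDER):
                return f(dict(partial)), "T"                # single shared terminal node
            var = ORDER[level]
            children = [build(level + 1, partial + ((var, v),))
                        for v in range(DOMAINS[var])]
            offset = min(rho for rho, _ in children)        # normalize: smallest edge value is 0
            edges = tuple((rho - offset, node) for rho, node in children)
            return offset, _node(edges)

        rho, root = build()
        print("root offset:", rho, "root node id:", root, "distinct nodes:", len(_unique))

    For the linear function above the two children of the root share a single node, so the whole diagram needs only two non-terminal nodes, which is the kind of compactness that motivates edge-valued representations.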

  10. ASSETS MANAGEMENT - A CONCEPTUAL MODEL DECOMPOSING VALUE FOR THE CUSTOMER AND A QUANTITATIVE MODEL

    Directory of Open Access Journals (Sweden)

    Susana Nicola

    2015-03-01

    In this paper we describe the application of a modeling framework, the so-called Conceptual Model Decomposing Value for the Customer (CMDVC), in a Footwear Industry case study, to ascertain the usefulness of this approach. The value networks were used to identify the participants, both tangible and intangible deliverables/endogenous and exogenous assets, and the analysis of their interactions as the indication for an adequate value proposition. The quantitative model of benefits and sacrifices, using the Fuzzy AHP method, enables the discussion of how the CMDVC can be applied and used in the enterprise environment and provided new relevant relations between perceived benefits (PBs.

  11. Modeling Source Water Threshold Exceedances with Extreme Value Theory

    Science.gov (United States)

    Rajagopalan, B.; Samson, C.; Summers, R. S.

    2016-12-01

    Variability in surface water quality, influenced by seasonal and long-term climate changes, can impact drinking water quality and treatment. In particular, temperature and precipitation can impact surface water quality directly or through their influence on streamflow and dilution capacity. Furthermore, they also impact land surface factors, such as soil moisture and vegetation, which can in turn affect surface water quality, in particular levels of organic matter in surface waters, which are of concern. All of these effects will be exacerbated by anthropogenic climate change. While some source water quality parameters, particularly Total Organic Carbon (TOC) and bromide concentrations, are not directly regulated for drinking water, these parameters are precursors to the formation of disinfection byproducts (DBPs), which are regulated in drinking water distribution systems. These DBPs form when a disinfectant, added to the water to protect public health against microbial pathogens, most commonly chlorine, reacts with dissolved organic matter (DOM), measured as TOC or dissolved organic carbon (DOC), and inorganic precursor materials, such as bromide. Therefore, understanding and modeling the extremes of TOC and bromide concentrations is of critical interest for drinking water utilities. In this study we develop nonstationary extreme value analysis models for threshold exceedances of source water quality parameters, specifically TOC and bromide concentrations. The threshold exceedances are modeled with a Generalized Pareto Distribution (GPD) whose parameters vary as a function of climate and land surface variables, thus capturing the temporal nonstationarity. We apply these models to threshold exceedances of source water TOC and bromide concentrations at two locations with different climates and find very good performance.
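    A minimal sketch of the kind of covariate-dependent GPD fit described above (not the authors' code): exceedances over a fixed threshold are modeled with a Generalized Pareto Distribution whose scale parameter depends log-linearly on a single climate covariate, and the parameters are found by minimizing the negative log-likelihood. The single-covariate, log-link form and all names are assumptions made for illustration.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import genpareto

        def fit_nonstationary_gpd(exceedances, covariate):
            """Fit a GPD with log-linear scale: sigma_i = exp(b0 + b1 * covariate_i)."""
            y = np.asarray(exceedances, dtype=float)   # exceedances over the threshold (> 0)
            x = np.asarray(covariate, dtype=float)

            def nll(params):
                b0, b1, xi = params
                if abs(xi) < 1e-6:                     # avoid the xi -> 0 limit in this sketch
                    return np.inf
                sigma = np.exp(b0 + b1 * x)
                t = 1.0 + xi * y / sigma
                if np.any(t <= 0):                     # outside the GPD support
                    return np.inf
                return np.sum(np.log(sigma) + (1.0 / xi + 1.0) * np.log(t))

            start = np.array([np.log(y.mean()), 0.0, 0.1])
            res = minimize(nll, start, method="Nelder-Mead")
            return res.x                               # (b0, b1, xi)

        # toy usage: the scale of the exceedances grows with the covariate
        rng = np.random.default_rng(1)
        cov = rng.uniform(0, 1, 500)
        scale = np.exp(-1.0 + 1.5 * cov)
        samples = genpareto.rvs(c=0.2, scale=scale, random_state=1)
        print(fit_nonstationary_gpd(samples, cov))     # roughly recovers (-1.0, 1.5, 0.2)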

  12. RETHINKING VALUE: A VALUE-CENTRIC MODEL OF PRODUCT, SERVICE AND BUSINESS DEVELOPMENT

    DEFF Research Database (Denmark)

    Randmaa, Merili; Mougaard, Krestine; Howard, Thomas J.

    2011-01-01

    Globalization and information technologies have made the economic landscape more transparent and customers smarter, more demanding and networked. Companies can see these changes as a threat to their business or as an opportunity to differentiate in the market and be a Prime Mover, by re-thinking customer value within the value system. This article shows how the term “value” is understood in different contexts and fields of economy, to see if these definitions can be merged, in order to understand the concept of value in a broader way. The authors argue through literature review and example cases...

  13. RETHINKING VALUE: A VALUE-CENTRIC MODEL OF PRODUCT, SERVICE AND BUSINESS DEVELOPMENT

    DEFF Research Database (Denmark)

    Randmaa, Merili; Mougaard, Krestine; Howard, Thomas J.

    2011-01-01

    Globalization and information technologies have made the economical landscape more transparent and customers smarter, more demanding and networked. Companies can see these changes as a threat to their business or as an opportunity to differentiate in the market and be a Prime Mover, by re...... that seeing value from multi-disciplinary viewpoint opens up some unused opportunities for the companies to overcome barriers within a value system, design integrated products and services, work more effectively, co-create value with customers, make use of word-of-mouth promotion and achieve long...

  14. A Review of the Wood Pellet Value Chain, Modern Value/Supply Chain Management Approaches, and Value/Supply Chain Models

    Directory of Open Access Journals (Sweden)

    Natalie M. Hughes

    2014-01-01

    We reviewed 153 peer-reviewed sources to provide identification of modern supply chain management techniques and exploration of supply chain modeling, to offer decision support to managers. Ultimately, the review is intended to assist member-companies of supply chains, mainly producers, improve their current management approaches, by directing them to studies that may be suitable for direct application to their supply chains and value chains for improved efficiency and profitability. We found that information on supply chain management and modeling techniques in general is available. However, few Canadian-based published studies exist regarding a demand-driven modeling approach to value/supply chain management for wood pellet production. Only three papers were found specifically on wood pellet value chain analysis. We propose that more studies should be carried out on the value chain of wood pellet manufacturing, as well as demand-driven management and modeling approaches with improved demand forecasting methods.

  15. Replacement Value - Representation of Fair Value in Accounting. Techniques and Modeling Suitable for the Income Based Approach

    OpenAIRE

    MANEA MARINELA – DANIELA

    2011-01-01

    The term fair value is spread within the sphere of international standards without reference to any detailed guidance on how to apply. However, specialized tangible assets, which are rarely sold, the rule IAS 16 "Intangible assets " makes it possible to estimate fair value using an income approach or a replacement cost or depreciation. The following material is intended to identify potential modeling of fair value as an income-based approach, appealing to techniques used by professional evalu...

  16. PENGEMBANGAN MODEL INTERNALISASI NILAI KARAKTER DALAM PEMBELAJARAN SEJARAH MELALUI MODEL VALUE CLARIFICATION TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Nunuk Suryani

    2013-07-01

    This research produced a product: a model for internalizing character values in history learning through the Value Clarification Technique (VCT), as a revitalization of the role of social studies in the formation of national character. In general, the research consisted of three stages: (1) a pre-survey that identified the current condition of character-value learning in history instruction; (2) development of a model based on the pre-survey findings, using the Dick and Carey model; and (3) validation of the model. Model development was carried out with limited trials and extensive testing. The findings of this study lead to the conclusion that the VCT model is effective for internalizing character values in history learning and for increasing the role of history learning in the formation of student character. It can be concluded that VCT models are effective for improving the quality of the processes and products of learning character values in junior secondary school (SMP) social studies, especially in Surakarta. Keywords: internalization, character values, VCT model, history learning, social studies learning. This research aims to produce a product, a model for internalizing character values in social studies learning through the Value Clarification Technique model, as a revitalization of the role of social studies learning in the formation of national character. Broadly, the research stages include (1) a pre-survey to identify the current condition of character-value learning in ongoing junior secondary social studies (history) instruction, (2) model development based on the pre-survey results, using the Dick and Carey model, and (3) model validation. Model development was carried out through limited and extensive trials. The findings of this research lead to the conclusion that the VCT model is effective for internalizing character values in history learning. The VCT model is effective for increasing the role of history learning in

  17. Counseling Psychology Model Training Values Statement Addressing Diversity

    Science.gov (United States)

    Counseling Psychologist, 2009

    2009-01-01

    Respect for diversity and for values different from one's own is a central value of counseling psychology training programs. The valuing of diversity is also consistent with the profession of psychology as mandated by the American Psychological Association's (APA's) Ethical Principles and Code of Conduct and as discussed in the Guidelines and…

  18. A General Model of Organizational Values in Educational Administration

    Science.gov (United States)

    Mueller, Robin Alison

    2014-01-01

    Values theorists in educational administration agree that understanding organizational values is integral to organizational effectiveness. However, research in this area tends to be superficial, and a review of pertinent literature reveals no clear definition of organizational values or consequent implications for practical application. One of the…

  19. Counseling Psychology Model Training Values Statement Addressing Diversity

    Science.gov (United States)

    Counseling Psychologist, 2009

    2009-01-01

    Respect for diversity and for values different from one's own is a central value of counseling psychology training programs. The valuing of diversity is also consistent with the profession of psychology as mandated by the American Psychological Association's (APA's) Ethical Principles and Code of Conduct and as discussed in the Guidelines and…

  20. A General Model of Organizational Values in Educational Administration

    Science.gov (United States)

    Mueller, Robin Alison

    2014-01-01

    Values theorists in educational administration agree that understanding organizational values is integral to organizational effectiveness. However, research in this area tends to be superficial, and a review of pertinent literature reveals no clear definition of organizational values or consequent implications for practical application. One of the…

  1. The dynamics of value segments : modeling framework and empirical illustration

    NARCIS (Netherlands)

    Brangule-Vlagsma, K; Pieters, RGM; Wedel, M

    2002-01-01

    Value systems are central to understanding consumer behavior and they are an important basis for market segmentation. This study addresses changes in individual value systems across time. First, we conceptualize main ways in which value systems may change over time. Next, we extend Kamakura and Mazz

  2. Understanding Business Strategies of Networked Value Constellations Using Goal- and Value Modeling

    NARCIS (Netherlands)

    Gordijn, Jaap; Petit, Michael; Wieringa, Roelf J.

    2006-01-01

    In goal-oriented requirements engineering (GORE), one usually proceeds from a goal analysis to a requirements specification, usually of IT systems. In contrast, we consider the use of GORE for the design of IT-enabled value constellations, which are collections of enterprises that jointly satisfy a

  3. Understanding business strategies of networked value constellations using goal- and value modeling

    NARCIS (Netherlands)

    Gordijn, Jaap; Petit, Michael; Wieringa, Roel

    2006-01-01

    In goal-oriented requirements engineering (GORE), one usually proceeds from a goal analysis to a requirements specification, usually of IT systems. In contrast, we consider the use of GORE for the design of IT-enabled value constellations, which are collections of enterprises that jointly satisfy a

  4. A synergy of values. Catholic healthcare leaders must implement their organization's mission and model its values.

    Science.gov (United States)

    Clifton, R M; McEnroe, J J

    1994-06-01

    Catholic organizations need to select, develop, and retain healthcare leaders who dedicate themselves to carrying on the Church's healing ministry and the work begun by those who have preceded them. Persons entrusted to carry on Jesus' healing mission perform their duties out of a sense of commitment to the ministry and a love for the persons with whom they work and whom they serve. They recognize a synergy between their own values and the values of the healthcare organizations they lead. Dedication to leadership in Catholic healthcare can be viewed from three perspectives: the Bible and selected documents of the Catholic Church; the transfer of responsibility for Catholic healthcare from religious congregations to evolving forms of sponsorship; and the implications for the selection, development, and retention of healthcare leaders, both lay and religious. Servant-leadership is an integral part of the religious tradition that underlies Catholic healthcare. As cooperation increases between healthcare providers, third-party payers, employers, and other healthcare agents, Catholic healthcare organizations are challenged to reassert a mission and values that will enable healthcare in the United States to be delivered both compassionately and competently.

  5. Conceptual model for decomposing the value for the customer

    OpenAIRE

    Nicola, Susana; Ferreira, Eduarda Pinto; Ferreira, João José Pinto

    2012-01-01

    Value has been defined in different theoretical contexts as need, desire, interest, standard /criteria, beliefs, attitudes, and preferences. The creation of value is key to any business, and any business activity is about exchanging some tangible and/or intangible good or service and having its value accepted and rewarded by customers or clients, either inside the enterprise or collaborative network or outside. “Perhaps surprising then is that firms often do not know how to define...

  6. Mathematical Modeling Activities as a Useful Tool for Values Education

    Science.gov (United States)

    Doruk, Bekir Kursat

    2012-01-01

    Values education is crucial since it is one of the factors to reach success in education in broader sense and in mathematics education in particular sense. It is also important for educating next generations of societies. However, previous research showed that expected importance for values education was not given in Mathematics courses. In a few…

  7. Models and strategies for value clarification in population education.

    Science.gov (United States)

    Villanueva, C L

    1976-01-01

    Teachers, principals, supervisors, superintendents, policymakers, and researchers all agree that value clarification is the most effective method for teaching value-laden population issues. It has been recognized that cultural attitudes and beliefs that are negative toward family planning block acceptance of the population program. Education can become an effective vehicle in raising the level of awareness and acceptance of population practices. Teachers and students must first critically examine a set of alternative population values, for then they can weigh the pros and cons in order to arrive at value decisions regarding fertility behavior. Discussion and recommendations for practice and policy are presented for 9 problem areas: defining value clarification, affective versus cognitive process, teacher participation, open-ended versus closed-ended approach, use of resources and devices, target group, teacher personality, selecting appropriate issues, and dealing with controversial subjects.

  8. Improving species distribution models: the value of data on abundance

    National Research Council Canada - National Science Library

    Howard, Christine; Stephens, Philip A; Pearce‐Higgins, James W; Gregory, Richard D; Willis, Stephen G; McPherson, Jana

    2014-01-01

    Species distribution models (SDMs) are important tools for forecasting the potential impacts of future environmental changes but debate remains over the most robust modelling approaches for making projections...

  9. Value Delivery Architecture Modeling – A New Approach for Business Modeling

    Directory of Open Access Journals (Sweden)

    Joachim Metzger

    2015-08-01

    Complexity and uncertainty have evolved as important challenges for entrepreneurship in many industries. Value Delivery Architecture Modeling (VDAM) is a proposal for a new approach to business modeling to conquer these challenges. In addition to the creation of transparency and clarity, our approach supports the operationalization of business model ideas. VDAM is based on the combination of a new business modeling language called VDML, ontology building, and the implementation of a level of cross-company abstraction. The application of our new approach in the area of electric mobility in Germany, an industry sector with high levels of uncertainty and a lack of common understanding, shows several promising results: VDAM enables the development of an unambiguous and unbiased view of value creation. Additionally, it allows for several applications leading to a more informed decision towards the implementation of new business models.

  10. Linear regression model selection using p-values when the model dimension grows

    CERN Document Server

    Pokarowski, Piotr; Teisseyre, Paweł

    2012-01-01

    We consider a new criterion-based approach to model selection in linear regression. Properties of selection criteria based on p-values of a likelihood ratio statistic are studied for families of linear regression models. We prove that such procedures are consistent, i.e., the minimal true model is chosen with probability tending to 1 even when the number of models under consideration slowly increases with the sample size. The simulation study indicates that the introduced methods perform promisingly when compared with the Akaike and Bayesian Information Criteria.
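    The flavor of a p-value-based selection criterion can be sketched as follows. This is a simplified forward-selection illustration under Gaussian errors, not the specific procedure studied in the paper: each candidate variable is admitted only if the likelihood-ratio test comparing the models with and without it has a p-value below a threshold.

        import numpy as np
        from scipy.stats import chi2

        def rss(X, y):
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            r = y - X @ beta
            return float(r @ r)

        def forward_select(X, y, alpha=0.05):
            """Greedy forward selection using likelihood-ratio p-values (Gaussian errors)."""
            n, p = X.shape
            selected, remaining = [], list(range(p))
            current = np.ones((n, 1))                      # intercept-only model
            while remaining:
                best = None
                for j in remaining:
                    cand = np.hstack([current, X[:, [j]]])
                    lr = n * np.log(rss(current, y) / rss(cand, y))   # LR statistic
                    pval = chi2.sf(lr, df=1)
                    if best is None or pval < best[1]:
                        best = (j, pval)
                j, pval = best
                if pval >= alpha:                          # no significant improvement left
                    break
                selected.append(j)
                current = np.hstack([current, X[:, [j]]])
                remaining.remove(j)
            return selected

        rng = np.random.default_rng(2)
        X = rng.normal(size=(200, 10))
        y = 2 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=200)
        print(forward_select(X, y))                        # expected to pick columns 0 and 3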

  11. Testing Benjamin Graham’s net current asset value model

    Directory of Open Access Journals (Sweden)

    Chongsoo An

    2015-02-01

    The objective of this paper is to empirically test one of Graham's investment methods, based on the net current asset value (NCAV). The NCAV approach is truly unique and conservative, and is commonly known as the net-net method. The ratio of the net current asset value to market value (NCAV/MV) was employed in this study to compare a stock's performance to the performance of the S&P 500 as the market index. We used all stocks in Portfolio123 whose raw data were supplied by Compustat, Standard & Poors, Capital IQ, and Reuters for the period of January 2, 1999 to August 31, 2012. The overall results show that firms with high net current asset values outperform the market. These results are strongest in the up market. It can be argued that firms with a high NCAV/MV ratio are likely to move toward their fundamental value and generate high excess returns because their stock prices are undervalued. The implications of the study are: (a) a positive NCAV/MV ratio may be a good indicator of an underpriced security; (b) investing in the growth period and avoiding the downturn period leads investors to earn much higher returns from firms with a high NCAV/MV ratio; and (c) the NCAV/MV strategy requires a longer holding period of the portfolio in order to generate excess returns.
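    The net-net screen summarized above reduces to a simple ratio; the sketch below is a minimal illustration with made-up balance-sheet figures, not the paper's data pipeline.

        def ncav_to_mv(current_assets, total_liabilities, market_value):
            """Net current asset value (Graham's net-net rule) relative to market value."""
            ncav = current_assets - total_liabilities
            return ncav / market_value

        # toy usage: a hypothetical firm trading below its net current asset value
        ratio = ncav_to_mv(current_assets=500e6, total_liabilities=200e6, market_value=250e6)
        print(f"NCAV/MV = {ratio:.2f}")   # > 1 suggests the stock may be undervalued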

  12. Mean Value SI Engine Model for Control Studies

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Sorenson, Spencer C

    1990-01-01

    This paper presents a mathematically simple nonlinear three-state (three differential equation) dynamic model of an SI engine which has the same steady state accuracy as a typical dynamometer measurement of the engine over its entire speed/load operating range (± 2.0%). The model's accuracy for l.... The model can easily be run on a Personal Computer (PC) using an ordinary differential equation (ODE) integrating routine or package. This makes the model useful for control system design and evaluation.

  13. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    -specific needs. Currently, these difficulties are handled in most major ERP systems by customising and localising the native code of the ERP systems for each specific country and industry. We propose an alternative that uses logical modeling of VAT legislation. The potential benefit is to eventually transform...... such a model automatically into programs that essentially will replace customisation and localisation by configuration by changing parameters in the model. In particular, we: (1) identify a number of requirements for such modeling, including requirements for the underlying logic; (2) model salient parts...

  14. The value analysis team: a shared mental model.

    Science.gov (United States)

    Lang, Kathryn; Eaton, Bradley

    2009-01-01

    Value analysis teams, which standardize procurement and use of products across hospitals and health systems, have experienced great success in saving money for health care organizations. One example is the work of the Medical/Surgical Value Analysis Team at a large New York metropolitan multihospital system, which has saved the system $1.2 million. This article examines what managers need to consider before forming such teams and how to guide the work. It will also look at the qualities and qualifications of the people who must be involved to make the process effective.

  15. Evaluation Model of the Ecology Benefit Value of Woodland in China

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The main influencing factors affecting the ecology benefit value of woodland are analyzed, mainly including the water conservation value, environment cleaning value, water and soil conservation value, and climate regulation value. An evaluation model of the ecology benefit value of woodland is put forward which can deal with uncertain information. The method for determining index weights is discussed, as well as the processing method for uncertain information during the evaluation of the ecology benefit value of woodland. Finally, the feasibility and convenience of the evaluation model of the woodland ecology benefit value are illustrated with examples.

  16. Teaching Time Value of Money Using an Excel Retirement Model

    Science.gov (United States)

    Arellano, Fernando; Mulig, Liz; Rhame, Susan

    2012-01-01

    The time value of money (TVM) is required knowledge for all business students. It is traditionally taught in finance and accounting classes for use in various applications in the business curriculum. These concepts are also very useful in real life situations such as calculating the amount to save for retirement. This paper details a retirement…
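    The retirement calculation referred to above rests on the standard future-value-of-an-annuity formula; the short function below is a generic illustration of that formula, not the spreadsheet model described in the paper.

        def retirement_balance(annual_saving, annual_rate, years):
            """Future value of an ordinary annuity: FV = PMT * ((1 + r)^n - 1) / r."""
            if annual_rate == 0:
                return annual_saving * years
            return annual_saving * ((1 + annual_rate) ** years - 1) / annual_rate

        # toy usage: saving 10,000 per year for 30 years at a 6% annual return
        print(f"{retirement_balance(10_000, 0.06, 30):,.0f}")   # roughly 790,582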

  17. Estimating net present value variability for deterministic models

    NARCIS (Netherlands)

    van Groenendaal, W.J.H.

    1995-01-01

    For decision makers the variability in the net present value (NPV) of an investment project is an indication of the project's risk. So-called risk analysis is one way to estimate this variability. However, risk analysis requires knowledge about the stochastic character of the inputs. For large, long

  18. Value, search, persistence and model updating in anterior cingulate cortex

    NARCIS (Netherlands)

    Kolling, N.; Wittmann, M.K.; Behrens, T.E.J.; Boorman, E.D.; Mars, R.B.; Rushworth, M.F.S.

    2016-01-01

    Dorsal anterior cingulate cortex (dACC) carries a wealth of value-related information necessary for regulating behavioral flexibility and persistence. It signals error and reward events informing decisions about switching or staying with current behavior. During decision-making, it encodes the avera

  19. Value-driven risk analysis of coordination models

    NARCIS (Netherlands)

    Ionita, Dan; Gordijn, Jaap; Yesuf, Ahmed Seid; Wieringa, Roel

    2016-01-01

    Coordination processes are business processes that involve independent profit-and-loss responsible business actors who collectively provide something of value to a customer. Coordination processes are meant to be profitable for the business actors that execute them. However, because business actors

  20. On the Reciprocity of State Vectors in Boundary Value Models

    Science.gov (United States)

    1989-09-22


  1. Modeling data revisions : Measurement error and dynamics of "true" values

    NARCIS (Netherlands)

    Jacobs, Jan P. A. M.; van Norden, Simon

    2011-01-01

    Policy makers must base their decisions on preliminary and partially revised data of varying reliability. Realistic modeling of data revisions is required to guide decision makers in their assessment of current and future conditions. This paper provides a new framework with which to model data revis

  2. Modeling data revisions : Measurement error and dynamics of "true" values

    NARCIS (Netherlands)

    Jacobs, Jan P. A. M.; van Norden, Simon

    2011-01-01

    Policy makers must base their decisions on preliminary and partially revised data of varying reliability. Realistic modeling of data revisions is required to guide decision makers in their assessment of current and future conditions. This paper provides a new framework with which to model data revis

  3. Integral-Value Models for Outcomes over Continuous Time

    DEFF Research Database (Denmark)

    Harvey, Charles M.; Østerdal, Lars Peter

    Models of preferences between outcomes over continuous time are important for individual, corporate, and social decision making, e.g., medical treatment, infrastructure development, and environmental regulation. This paper presents a foundation for such models. It shows that conditions on prefere...

  4. Business modelling revisited: The configuration of control and value

    NARCIS (Netherlands)

    Ballon, P.J.P.

    2007-01-01

    Purpose - This paper aims to provide a theoretically grounded framework for designing and analysing business models for (mobile) information communication technology (ICT) services and systems. Design/methodology/approach - The paper reviews the most topical literature on business modelling, as well

  5. The value of structural information in the VAR model

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2003-01-01

    Economic policy decisions are often informed by empirical economic analysis. While the decision-maker is usually only interested in good estimates of outcomes, the analyst is interested in estimating the model. Accurate inference on the structural features of a model, such as cointegrati

  6. Deciding the economic value added: an alternative model

    Directory of Open Access Journals (Sweden)

    Cecilia Gallegos Muñoz

    2011-06-01

    The creation of economic value is something that affects all companies, which is why it is crucial to establish it; for this purpose there are a variety of procedures. Prominent among these is the economic value added (EVA) method, which compares the yield of an investment to its financial cost. This methodology is very useful, but its definition does not distinguish between the different business effects behind the EVA generated, so in this article we propose a new way to determine EVA that allows this differentiation, distinguishing the EVA created by the various results achievable by the company. Finally, an explanatory case is presented to demonstrate whether the new proposal is applicable.

  7. Accounting for choice of measurement scale in extreme value modeling

    OpenAIRE

    Wadsworth, J. L.; Tawn, J. A.; Jonathan, P.

    2010-01-01

    We investigate the effect that the choice of measurement scale has upon inference and extrapolation in extreme value analysis. Separate analyses of variables from a single process on scales which are linked by a nonlinear transformation may lead to discrepant conclusions concerning the tail behavior of the process. We propose the use of a Box--Cox power transformation incorporated as part of the inference procedure to account parametrically for the uncertainty surrounding the scale of extrapo...
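    For reference, the Box-Cox power transformation the authors propose to embed in the inference is the usual one-parameter family (shown here in its standard form; the paper's parametrization may differ in detail):

        y(\lambda) =
        \begin{cases}
        \dfrac{x^{\lambda} - 1}{\lambda}, & \lambda \neq 0,\\[4pt]
        \log x, & \lambda = 0,
        \end{cases}
        \qquad x > 0 ,

    so that the scale parameter \lambda is estimated jointly with the extreme value parameters rather than chosen in advance.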

  8. The value of incomplete mouse models of Alzheimer's disease.

    Science.gov (United States)

    Radde, Rebecca; Duma, Cecilia; Goedert, Michel; Jucker, Mathias

    2008-03-01

    To study Alzheimer's disease (AD), a variety of mouse models has been generated through the overexpression of the amyloid precursor protein and/or the presenilins harboring one or several mutations found in familial AD. With aging, these mice develop several lesions similar to those of AD, including diffuse and neuritic amyloid deposits, cerebral amyloid angiopathy, dystrophic neurites and synapses, and amyloid-associated neuroinflammation. Other characteristics of AD, such as neurofibrillary tangles and nerve cell loss, are not satisfactorily reproduced in these models. Mouse models that recapitulate only specific aspects of AD pathogenesis are of great advantage when deciphering the complexity of the disease and can contribute substantially to diagnostic and therapeutic innovations. Incomplete mouse models have been key to the development of Abeta42-targeted therapies, as well as to the current understanding of the interrelationship between cerebral beta-amyloidosis and tau neurofibrillary lesions, and are currently being used to develop novel diagnostic agents for in vivo imaging.

  9. Large dimension forecasting models and random singular value spectra

    CERN Document Server

    Bouchaud, J P; Miceli, M A; Potters, M; Bouchaud, Jean-Philippe; Laloux, Laurent; Potters, Marc

    2005-01-01

    We present a general method to detect and extract from a finite time sample statistically meaningful correlations between input and output variables of large dimensionality. Our central result is derived from the theory of free random matrices, and gives an explicit expression for the interval where singular values are expected in the absence of any true correlations between the variables under study. Our result can be seen as the natural generalization of the Marcenko-Pastur distribution for the case of rectangular correlation matrices. We illustrate the interest of our method on a set of macroeconomic time series.
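    The null benchmark the authors derive analytically can also be probed numerically. The sketch below is an empirical illustration only, not their closed-form result: it computes the singular values of the cross-correlation matrix between two independent standardized data sets, i.e., the spectrum expected in the absence of any true correlations; sample sizes are arbitrary.

        import numpy as np

        def null_singular_values(T=1000, N=50, M=30, seed=0):
            """Singular values of the cross-correlation of two independent data sets."""
            rng = np.random.default_rng(seed)
            X = rng.normal(size=(T, N))
            Y = rng.normal(size=(T, M))
            X = (X - X.mean(0)) / X.std(0)          # standardize each input variable
            Y = (Y - Y.mean(0)) / Y.std(0)          # standardize each output variable
            G = X.T @ Y / T                         # rectangular cross-correlation matrix
            return np.linalg.svd(G, compute_uv=False)

        s = null_singular_values()
        print("largest null singular value:", s.max())   # benchmark for 'real' correlations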

  10. The relationship of values to adjustment in illness: a model for nursing practice.

    Science.gov (United States)

    Harvey, R M

    1992-04-01

    This paper proposes a model of the relationship between values, in particular health value, and adjustment to illness. The importance of values as well as the need for value change are described in the literature related to adjustment to physical disability and chronic illness. An empirical model, however, that explains the relationship of values to adjustment or adaptation has not been found by this researcher. Balance theory and its application to the abstract and perceived cognitions of health value and health perception are described here to explain the relationship of values like health value to outcomes associated with adjustment or adaptation to illness. The proposed model is based on the balance theories of Heider, Festinger and Feather. Hypotheses based on the model were tested and supported in a study of 100 adults with visible and invisible chronic illness. Nursing interventions based on the model are described and suggestions for further research discussed.

  11. Value Analysis: A Model of Personal and Professional Ethics in Marriage and Family Counseling.

    Science.gov (United States)

    Thomas, Volker

    1994-01-01

    Presents Ethics Model of Marriage and Family Counseling and its underlying assumptions. Analyzes six basic counseling values in relation to microsystems of counselor and client, mesosystem of counseling process, and societal value context as the macrosystem. Utilizes discussion of suicide to apply these values to the model. Includes 17 references.…

  12. Has the Deliberative Model of Democracy an epistemic value?

    Directory of Open Access Journals (Sweden)

    Roberto García

    2013-07-01

    Deliberative democracy is a normative ideal of democracy. This model is a proposal for the regeneration of the legitimacy of our institutions, but also a mechanism for decision making. It is based on two different dimensions: a procedural dimension, where the model demands the inclusion of, and an equal capacity to influence the final decision by, all those affected (Cohen, 1989; Bohman, 1996; Habermas, 1992 ...), and a substantive dimension, where political decisions are made through a collective procedure of argumentation and public discussion. If these conditions are met, the decisions will be more rational and better. This paper has two aims. First, I present the key elements of this epistemic conception of political legitimacy. Second, I show the challenges it faces: on the one hand, the counterfactual nature of many of its postulates and, on the other, the model's evident consensualist bias.

  13. MODEL PENENTUAN HARGA SAHAM: PENGUJIAN CAPITAL ASSET PRICING MODEL MELALUI PENGUJIAN ECONOMIC VALUE ADDED

    Directory of Open Access Journals (Sweden)

    Suripto Suripto

    2017-03-01

    Full Text Available This research tested the influence of characteristics of the firms and of EVA (Eco-nomic Value Added to stock of returns. This Research sample was company Self-100 ValueCreator of year 2001 until 2006. Result of research indicated that company size measure,profitability, capital structure (characteristics of the firms and EVA by stimulant had aneffect on significant to stock of returns, but by partial only characteristics company. Condi-tion of company fundamentals had an effect on significance to stock of returns. This indica-tion that investor still considered factors of fundamentals was having investment. EVA didnot have an effect on significant to stock of returns. This finding indicated that Model deter-mination of stock of returns (CAPM Irrelevant determined the level of EVA and also indicatedthat CAPM (Capital Assets Pricing Model was not relevant in determining stock of returns inIndonesian Stock Exchange.

  14. Agency, Values, and Well-Being: A Human Development Model

    Science.gov (United States)

    Welzel, Christian; Inglehart, Ronald

    2010-01-01

    This paper argues that feelings of agency are linked to human well-being through a sequence of adaptive mechanisms that promote human development, once existential conditions become permissive. In the first part, we elaborate on the evolutionary logic of this model and outline why an evolutionary perspective is helpful to understand changes in…

  15. Community Reintegration: The Value of Educational-Action-Training Models.

    Science.gov (United States)

    Versluys, Hilda P.

    1984-01-01

    The article describes the principles that guide program development, the use of therapeutic and educational activities, and the components of transitional program models for disabled individuals re-entering the community following discharge from rehabilitation facilities. The role of the occupational therapist in successful reintegration is…

  16. Implicit Value Updating Explains Transitive Inference Performance: The Betasort Model.

    Directory of Open Access Journals (Sweden)

    Greg Jensen

    Transitive inference (the ability to infer that B > D given that B > C and C > D) is a widespread characteristic of serial learning, observed in dozens of species. Despite these robust behavioral effects, reinforcement learning models reliant on reward prediction error or associative strength routinely fail to perform these inferences. We propose an algorithm called betasort, inspired by cognitive processes, which performs transitive inference at low computational cost. This is accomplished by (1) representing stimulus positions along a unit span using beta distributions, (2) treating positive and negative feedback asymmetrically, and (3) updating the position of every stimulus during every trial, whether that stimulus was visible or not. Performance was compared for rhesus macaques, humans, and the betasort algorithm, as well as Q-learning, an established reward-prediction error (RPE) model. Of these, only Q-learning failed to respond above chance during critical test trials. Betasort's success (when compared to RPE models) and its computational efficiency (when compared to full Markov decision process implementations) suggest that the study of reinforcement learning in organisms will be best served by a feature-driven approach to comparing formal models.

  17. The heuristic value of redundancy models of aging

    NARCIS (Netherlands)

    Boonekamp, Jelle J.; Briga, Michael; Verhulst, Simon

    2015-01-01

    Molecular studies of aging aim to unravel the cause(s) of aging bottom-up, but linking these mechanisms to organismal level processes remains a challenge. We propose that complementary top-down data-directed modelling of organismal level empirical findings may contribute to developing these links. T

  18. Gettysburg: An Analysis of the Training Value of Commercial Models.

    Science.gov (United States)

    1992-03-01

    for training such as limited training time or a lack of computer support. The models were evaluated solely for their strengths and weaknesses as...feet, was the taller, steeper, and more heavily wooded of the two hills. Little Round Top, at 670 feet, was about one-half mile northeast of Big Round

  19. Prediction of survival with alternative modeling techniques using pseudo values

    NARCIS (Netherlands)

    T. van der Ploeg (Tjeerd); F.R. Datema (Frank); R.J. Baatenburg de Jong (Robert Jan); E.W. Steyerberg (Ewout)

    2014-01-01

    Background: The use of alternative modeling techniques for predicting patient survival is complicated by the fact that some alternative techniques cannot readily deal with censoring, which is essential for analyzing survival data. In the current study, we aimed to demonstrate that pseudo
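    Pseudo-values in this setting are usually defined as jackknife quantities (the standard definition is shown below for orientation; the paper should be consulted for the exact estimator used):

        \hat{\theta}_i \;=\; n\,\hat{\theta} \;-\; (n-1)\,\hat{\theta}^{(-i)}, \qquad i = 1, \dots, n,

    where \hat{\theta} is the estimator of interest computed on the full sample (for survival data, typically the Kaplan-Meier survival probability at a fixed time) and \hat{\theta}^{(-i)} is the same estimator computed with subject i left out; the resulting pseudo-values can then be used as an approximately uncensored outcome in standard regression or machine-learning models.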

  20. The heuristic value of redundancy models of aging

    NARCIS (Netherlands)

    Boonekamp, Jelle J.; Briga, Michael; Verhulst, Simon

    2015-01-01

    Molecular studies of aging aim to unravel the cause(s) of aging bottom-up, but linking these mechanisms to organismal level processes remains a challenge. We propose that complementary top-down data-directed modelling of organismal level empirical findings may contribute to developing these links.

  1. Effects of model schematisation, geometry and parameter values on urban flood modelling.

    Science.gov (United States)

    Vojinovic, Z; Seyoum, S D; Mwalwaka, J M; Price, R K

    2011-01-01

    One-dimensional (1D) hydrodynamic models have been used as a standard industry practice for urban flood modelling work for many years. More recently, however, model formulations have included a 1D representation of the main channels and a 2D representation of the floodplains. Since the physical process of describing exchanges of flows with the floodplains can be represented in different ways, the predictive capability of different modelling approaches can also vary. The present paper explores effects of some of the issues that concern urban flood modelling work. Impacts from applying different model schematisation, geometry and parameter values were investigated. The study has mainly focussed on exploring how different Digital Terrain Model (DTM) resolution, presence of different features on DTM such as roads and building structures and different friction coefficients affect the simulation results. Practical implications of these issues are analysed and illustrated in a case study from St Maarten, N.A. The results from this study aim to provide users of numerical models with information that can be used in the analyses of flooding processes in urban areas.

  2. Adding Value to Ecological Risk Assessment with Population Modeling

    DEFF Research Database (Denmark)

    Forbes, Valery E.; Calow, Peter; Grimm, Volker

    2011-01-01

    Current measures used to estimate the risks of toxic chemicals are not relevant to the goals of the environmental protection process, and thus ecological risk assessment (ERA) is not used as extensively as it should be as a basis for cost-effective management of environmental resources. Appropriate...... population models can provide a powerful basis for expressing ecological risks that better inform the environmental management process and thus that are more likely to be used by managers. Here we provide at least five reasons why population modeling should play an important role in bridging the gap between...... what we measure and what we want to protect. We then describe six actions needed for its implementation into management-relevant ERA....

  3. An Algorithm for Solution of an Interval Valued EOQ Model

    Directory of Open Access Journals (Sweden)

    Susovan CHAKRABORTTY

    2013-01-01

    This paper deals with the problem of determining the economic order quantity (EOQ) in the interval sense. A purchasing inventory model with shortages and lead time is considered, whose carrying cost, shortage cost, setup cost, demand quantity and lead time are treated as interval numbers instead of real numbers. First, a brief survey of the existing works on comparing and ranking any two interval numbers on the real line is presented. A common algorithm for the optimum production quantity (economic lot size) per cycle of a single product (so as to minimize the total average cost) is developed, which works well for the interval number optimization under consideration. A numerical example is presented for a better understanding of the solution procedure. Finally, a sensitivity analysis of the optimal solution with respect to the parameters of the model is carried out.
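    To make the interval idea concrete, the toy sketch below evaluates the classical EOQ formula at the endpoints of interval-valued demand, setup cost and holding cost. This is only an endpoint illustration of interval arithmetic, not the ranking-based algorithm developed in the paper, and the numbers are invented.

        from math import sqrt

        def eoq(demand, setup_cost, holding_cost):
            """Classical economic order quantity: Q* = sqrt(2 * D * K / h)."""
            return sqrt(2 * demand * setup_cost / holding_cost)

        def interval_eoq(demand, setup_cost, holding_cost):
            """Enclose Q* when D, K, h are intervals given as (low, high) pairs."""
            lo = eoq(demand[0], setup_cost[0], holding_cost[1])   # smallest numerator, largest h
            hi = eoq(demand[1], setup_cost[1], holding_cost[0])   # largest numerator, smallest h
            return lo, hi

        # toy usage with interval-valued parameters
        print(interval_eoq(demand=(900, 1100), setup_cost=(45, 55), holding_cost=(1.8, 2.2)))

    Because Q* is monotone increasing in demand and setup cost and decreasing in the holding cost, evaluating the formula at these endpoints gives the exact interval for this simple case.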

  4. Process-based modelling of the nutritive value of forages: a review

    Science.gov (United States)

    Modelling sward nutritional value (NV) is of particular importance to understand the interactions between grasslands, livestock production, environment and climate-related impacts. Variables describing nutritive value vary significantly between ruminant production systems, but two types are commonly...

  5. Model Averaging and Dimension Selection for the Singular Value Decomposition

    Science.gov (United States)

    2006-01-10

    the analysis of relational data (Harshman et al., 1982), biplots (Gabriel 1971, Gower and Hand 1996) and in reduced-rank interaction models for...

  6. Unlocking Value Creation Using an Agritourism Business Model

    Directory of Open Access Journals (Sweden)

    Laura Broccardo

    2017-09-01

    Agritourism has achieved greater importance in the last decade, but despite this relevance, its definition is not aligned everywhere, depending on the contingency variables of the context in which the agritourism is located. This paper aims at analyzing the key success factors of Italian agritourism business models by studying their structural, social and economic features, integrated with a sustainability approach. The empirical analysis is based on a sample of agritourism businesses located in an Italian region. The empirical results show relevant and useful elements to support the sustainable development of agritourism business models in Italy, linking theory, policy and practice. Indeed, these results, together with others related to the economic dimension of the farms, their specialization, and the characteristics of the farmers, make it possible to argue that there are common elements which offer potential for agritourism. In addition, it was possible to identify two different models of agritourism. Agritourism can open new horizons in rural sustainable development, with possible beneficial effects on the environment, society, agricultural heritage and economic growth. In particular, regional policy developers should take these elements into consideration in order to direct efforts correctly. The research also shows some interesting theoretical implications, as it contributes to enriching the literature on this particular kind of business model. At the same time, it helps family owners to increase their overall understanding of their agritourism, in order to finalize adequate planning and communication.

  7. Valuing water resources in Switzerland using a hedonic price model

    Science.gov (United States)

    van Dijk, Diana; Siber, Rosi; Brouwer, Roy; Logar, Ivana; Sanadgol, Dorsa

    2016-05-01

    In this paper, linear and spatial hedonic price models are applied to the housing market in Switzerland, covering all 26 cantons in the country over the period 2005-2010. Besides structural house, neighborhood and socioeconomic characteristics, we include a wide variety of new environmental characteristics related to water to examine their role in explaining variation in sales prices. These include water abundance, different types of water bodies, the recreational function of water, and water disamenity. Significant spatial autocorrelation is found in the estimated models, as well as nonlinear effects for distances to the nearest lake and large river. Significant effects are furthermore found for water abundance and the distance to large rivers, but not to small rivers. Although in both linear and spatial models water related variables explain less than 1% of the price variation, the distance to the nearest bathing site has a larger marginal contribution than many neighborhood-related distance variables. The housing market shows to differentiate between different water related resources in terms of relative contribution to house prices, which could help the housing development industry make more geographically targeted planning activities.
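    The linear hedonic specification underlying such an analysis can be sketched in a few lines. The snippet below is a generic log-price regression with an illustrative water-related distance variable and simulated data, not the authors' estimated model.

        import numpy as np

        def hedonic_fit(log_price, features):
            """OLS fit of log sales price on house and environmental characteristics."""
            X = np.column_stack([np.ones(len(log_price)), features])
            beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
            return beta                       # [intercept, coefficients...]

        # toy usage: living area (m2), rooms, distance to the nearest lake (km) -- illustrative only
        rng = np.random.default_rng(3)
        area = rng.uniform(50, 200, 500)
        rooms = rng.integers(2, 7, 500)
        dist_lake = rng.uniform(0, 20, 500)
        logp = 11 + 0.004 * area + 0.05 * rooms - 0.01 * dist_lake + rng.normal(0, 0.1, 500)
        beta = hedonic_fit(logp, np.column_stack([area, rooms, dist_lake]))
        print(beta)   # the last coefficient approximates the marginal effect of lake distance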

  8. The Analysis of Several Models of Investment Value of Logistics Project Evaluation

    Directory of Open Access Journals (Sweden)

    Ke Qiu Cheng Zhou

    2013-01-01

    Full Text Available The study of the logistics project evaluation model features reviews the traditional value evaluation model. On the basis of this, using the fuzzy theory, we establish several logistics project evaluation models under fuzzy environment. The analysis of the respective characteristics and the comparison of the calculated results of the three models show that these models are important methods of investment value of logistics evaluation.

  9. Measuring the value of nonwage employee benefits: building a model of the relation between benefit satisfaction and value.

    Science.gov (United States)

    Weathington, Bart L; Jones, Allan P

    2006-11-01

    Researchers have commonly assumed benefits that employees view as more valuable have a greater influence on their attitudes and behaviors. Researchers have used 2 common methods to measure benefit value: attaching a monetary value to benefits and using self-reports of benefit importance. The present authors propose that the 2 approaches are conceptually distinct and have different implications. They use a social exchange perspective to justify this distinction and integrate both approaches and benefit satisfaction into a more comprehensive model of benefit perception. Results suggest that both measures have practical applications depending on the nature of the exchange relationship between the organization and employees. However, this relationship depends on the specific benefit and on employee satisfaction with that benefit. Some benefits lend themselves to a monetary estimate, whereas others lend themselves more to a nonmonetary valuation.

  10. Do dynamic regional models add value to the global model projections of Indian monsoon?

    Science.gov (United States)

    Singh, Swati; Ghosh, Subimal; Sahana, A. S.; Vittal, H.; Karmakar, Subhankar

    2017-02-01

    Dynamic Regional Climate Models (RCMs) work at fine resolution for a limited region, and hence they are presumed to simulate regional climate better than General Circulation Models (GCMs). Simulations by RCMs are used for impact assessment, often without any evaluation. There is a growing debate on the added value contributed by regional models to the projections of GCMs, specifically for regions like the United States and Europe. Evaluation of RCMs for Indian Summer Monsoon Rainfall (ISMR) has been overlooked in the literature, though there are a few disjoint studies on Indian monsoon extremes and biases. Here we present a comprehensive study on the evaluation of RCMs for the ISMR with all its important characteristics, such as northward and eastward propagation, onset, seasonal rainfall patterns, intra-seasonal oscillations, spatial variability and patterns of extremes. We evaluate nine regional simulations from the Coordinated Regional Climate Downscaling Experiment and compare them with their host Coupled Model Intercomparison Project-5 GCM projections. We do not find any consistent improvement in the RCM simulations with respect to their host GCMs for any of the characteristics of the Indian monsoon except the spatial variation. We also find that the simulations of the ISMR characteristics by a good number of RCMs are worse than those of their host GCMs. No consistent added value is observed in the RCM simulations of changes in ISMR characteristics over recent periods compared to the past, though there are a few exceptions. These results highlight the need for proper evaluation before utilizing regional models for impact assessment and subsequent policy making for sustainable climate change adaptation.

  11. A Delay Model of Multiple-Valued Logic Circuits Consisting of Min, Max, and Literal Operations

    Science.gov (United States)

    Takagi, Noboru

    Delay models for binary logic circuits have been proposed and their mathematical properties clarified. Kleene's ternary logic is one of the simplest delay models for expressing the transient behavior of binary logic circuits. Goto first applied Kleene's ternary logic to hazard detection in binary logic circuits in 1948. Besides Kleene's ternary logic, there are many delay models of binary logic circuits, such as Lewis's 5-valued logic. On the other hand, multiple-valued logic circuits now play an important role in realizing digital circuits, because, for example, they can reduce the size of a chip dramatically. Although multiple-valued logic circuits are becoming more important, there are few discussions of delay models for multiple-valued logic circuits. In this paper, we therefore introduce a delay model of multiple-valued logic circuits constructed from Min, Max, and Literal operations. We then show some of the mathematical properties of our delay model.
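    For orientation, the three operations named in the title are commonly defined as below on an r-valued logic (a common textbook definition, shown only as background; the paper's delay semantics extends these operations to transient values).

        R = 4                                    # number of logic values: 0 .. R-1

        def MIN(x, y):                           # multiple-valued AND
            return min(x, y)

        def MAX(x, y):                           # multiple-valued OR
            return max(x, y)

        def LITERAL(x, a, b):                    # window literal: R-1 inside [a, b], else 0
            return R - 1 if a <= x <= b else 0

        # toy usage on the 4-valued logic {0, 1, 2, 3}
        print(MAX(MIN(2, 3), LITERAL(1, 1, 2)))  # -> max(2, 3) = 3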

  12. Probing for the Multiplicative Term in Modern Expectancy-Value Theory: A Latent Interaction Modeling Study

    Science.gov (United States)

    Trautwein, Ulrich; Marsh, Herbert W.; Nagengast, Benjamin; Ludtke, Oliver; Nagy, Gabriel; Jonkmann, Kathrin

    2012-01-01

    In modern expectancy-value theory (EVT) in educational psychology, expectancy and value beliefs additively predict performance, persistence, and task choice. In contrast to earlier formulations of EVT, the multiplicative term Expectancy x Value in regression-type models typically plays no major role in educational psychology. The present study…
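    The multiplicative term at issue enters a regression-type model as an interaction between the two predictors (schematic form only, with generic coefficient names):

        \text{Outcome} \;=\; \beta_0 + \beta_1\,\text{Expectancy} + \beta_2\,\text{Value}
                        \;+\; \beta_3\,(\text{Expectancy} \times \text{Value}) + \varepsilon ,

    so the question the study addresses is whether the interaction coefficient \beta_3 contributes beyond the additive terms when the interaction is modeled latently.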

  13. Application of the Counseling Psychology Model Training Values Statement Addressing Diversity to the Admission Process

    Science.gov (United States)

    Loewy, Michael I.; Juntunen, Cindy L.; Duan, Changming

    2009-01-01

    This article addresses the responsibility of counseling psychology programs to communicate and implement the professional training values regarding diversity as articulated in the "Counseling Psychology Model Training Values Statement Addressing Diversity" (henceforth the "Values Statement") clearly and directly in the advertising and admission…

  14. The Need for a Counseling Psychology Model Training Values Statement Addressing Diversity

    Science.gov (United States)

    Mintz, Laurie B.; Jackson, Aaron P.; Neville, Helen A.; Illfelder-Kaye, Joyce; Winterowd, Carrie L.; Loewy, Michael I.

    2009-01-01

    The authors articulate the need for a "Counseling Psychology Model Training Values Statement Addressing Diversity" (henceforth "Values Statement"). They discuss the historic unwillingness of the field to address values in a sophisticated or complex way and highlight the increasingly common training scenario in which trainees state that certain…

  17. Interval-Valued Model Level Fuzzy Aggregation-Based Background Subtraction.

    Science.gov (United States)

    Chiranjeevi, Pojala; Sengupta, Somnath

    2016-07-29

    In a recent work, the effectiveness of neighborhood supported model level fuzzy aggregation was shown under dynamic background conditions. The multi-feature fuzzy aggregation used in that approach uses real fuzzy similarity values, and is robust for low and medium-scale dynamic background conditions such as swaying vegetation, sprinkling water, etc. The technique, however, exhibited some limitations under heavily dynamic background conditions, as features have high uncertainty under such noisy conditions and these uncertainties were not captured by real fuzzy similarity values. Our proposed algorithm is particularly focused on improving detection under heavy dynamic background conditions by modeling uncertainties in the data with interval-valued fuzzy sets. In this paper, real-valued fuzzy aggregation is extended to interval-valued fuzzy aggregation by considering uncertainties over real similarity values. We build up a procedure to calculate the uncertainty, which varies for each feature, at each pixel, and at each time instant. We adaptively determine membership values at each pixel by a Gaussian of the uncertainty value instead of the fixed membership values used in recent fuzzy approaches, thereby giving importance to a feature based on its uncertainty. An interval-valued Choquet integral is evaluated using the interval similarity values and the membership values in order to calculate the interval-valued fuzzy similarity between the model and the current frame. Adequate qualitative and quantitative studies are carried out to illustrate the effectiveness of the proposed method in mitigating heavily dynamic background situations compared to state-of-the-art methods.
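
    A minimal sketch of the aggregation idea (not the authors' implementation): per-feature similarities to the background model are widened into intervals by an uncertainty term, and a discrete Choquet integral with respect to a fuzzy measure is applied to the lower and upper endpoints separately. The three features, their similarities, the uncertainty widths, and the fuzzy measure below are all hypothetical.

        import numpy as np

        def choquet(values, mu):
            """Discrete Choquet integral of `values` w.r.t. a set function `mu`
            given as a dict mapping frozensets of feature indices to measure values."""
            order = np.argsort(values)                      # ascending
            result, prev = 0.0, 0.0
            for k, idx in enumerate(order):
                coalition = frozenset(int(i) for i in order[k:])
                result += (values[idx] - prev) * mu[coalition]
                prev = values[idx]
            return result

        # Hypothetical fuzzy measure over three features (e.g. colour, texture, gradient)
        mu = {frozenset(): 0.0,
              frozenset({0}): 0.3, frozenset({1}): 0.4, frozenset({2}): 0.35,
              frozenset({0, 1}): 0.7, frozenset({0, 2}): 0.6, frozenset({1, 2}): 0.75,
              frozenset({0, 1, 2}): 1.0}

        # Real similarity of each feature to the background model and its uncertainty width
        sim = np.array([0.80, 0.55, 0.65])
        unc = np.array([0.05, 0.20, 0.10])
        lower, upper = np.clip(sim - unc, 0, 1), np.clip(sim + unc, 0, 1)

        # Interval-valued aggregation: apply the Choquet integral to both endpoints
        agg_low, agg_high = choquet(lower, mu), choquet(upper, mu)
        print(f"aggregated interval similarity: [{agg_low:.3f}, {agg_high:.3f}]")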

  18. Value-Added Models of Assessment: Implications for Motivation and Accountability

    Science.gov (United States)

    Anderman, Eric M.; Anderman, Lynley H.; Yough, Michael S.; Gimbert, Belinda G.

    2010-01-01

    In this article, we examine the relations of value-added models of measuring academic achievement to student motivation. Using an achievement goal orientation theory perspective, we argue that value-added models, which focus on the progress of individual students over time, are more closely aligned with research on student motivation than are more…

  19. Integrating Work and Basic Values into the Spherical Model of Interests

    Science.gov (United States)

    Sodano, Sandro M.

    2011-01-01

    Two prominent models of values, one in work and the other in life, were examined as they each related to the dimensions underlying the Spherical Model of Interests (Tracey & Rounds, 1996) as measured by the Personal Globe Inventory (PGI; Tracey, 2002). The technique of external property vector fitting was utilized to plot the value constructs onto…

  20. Modeling and prediction of monetary and non-monetary business values

    NARCIS (Netherlands)

    Välja, Margus; Österlind, Magnus; Iacob, Maria-Eugenia; Sinderen, van Marten; Johnson, Pontus; Gasevic, D; Hatala, M.; Motahari Nezhad, H.R.; Reichert, M.U.

    2013-01-01

    In existing business model frameworks little attention is paid to a thorough understanding of the perceived customer value of a business’ offering as compared to competing offers. In this paper, we propose to use utility theory in combination with e3value models to address this issue. An actor's joi

  1. Dual Value Creation and Business Model Design: An Ethnographic Study of an Internationalizing NGO

    DEFF Research Database (Denmark)

    Turcan, Romeo V.

    This ethnographic research explores the process of business model design in the context of an NGO internationalizing to an emerging market. It contributes to the business model literature by investigating how this NGO - targeting multiple key stakeholders - was experimenting with (1) the value proposition; (2) creating the value/promise; and (3) delivering the value. Theoretically the paper is grounded in the dynamic capability view of the firm, providing venues for future research and implications for policy and practice.

  2. Constructing set-valued fundamental diagrams from jamiton solutions in second order traffic models

    KAUST Repository

    Seibold, Benjamin

    2013-09-01

    Fundamental diagrams of vehicular traffic flow are generally multivalued in the congested flow regime. We show that such set-valued fundamental diagrams can be constructed systematically from simple second order macroscopic traffic models, such as the classical Payne-Whitham model or the inhomogeneous Aw-Rascle-Zhang model. These second order models possess nonlinear traveling wave solutions, called jamitons, and the multi-valued parts in the fundamental diagram correspond precisely to jamiton-dominated solutions. This study shows that transitions from function-valued to set-valued parts in a fundamental diagram arise naturally in well-known second order models. As a particular consequence, these models intrinsically reproduce traffic phases. © American Institute of Mathematical Sciences.

  3. Study of behavior and determination of customer lifetime value(CLV) using Markov chain model

    Science.gov (United States)

    Permana, Dony; Indratno, Sapto Wahyu; Pasaribu, Udjianna S.

    2014-03-01

    Customer Lifetime Value (CLV) is used in interactive marketing to help a company arrange finances for new customer acquisition and customer retention. CLV can also be used to segment customers for financial planning. A fairly new stochastic approach to CLV uses a Markov chain, in which the customer retention probability and the new customer acquisition probability play an important role. This model was originally introduced by Pfeifer and Carraway in 2000 [1]. They introduced several CLV models, one of which involves only customers and former customers. In this paper we expand the model by adding the assumption of a transition from former customer back to customer. In the proposed model, the CLV value is higher than the CLV value obtained by the Pfeifer and Carraway model, but our model still requires a longer convergence time.
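
    As a rough illustration of the Markov-chain approach described above (a generic two-state sketch, not the authors' exact formulation), the following computes CLV for a customer/former-customer chain with an assumed retention probability, an assumed win-back probability for the former-to-customer transition discussed in the abstract, a per-period margin, and a discount rate; all numbers are hypothetical.

        import numpy as np

        # Hypothetical parameters (not from the paper)
        p_retain = 0.6      # P(customer stays a customer)
        p_winback = 0.2     # P(former customer returns), the added transition
        margin = 100.0      # net contribution per period while a customer
        discount = 0.10     # per-period discount rate

        # Transition matrix over the states [customer, former customer]
        P = np.array([[p_retain, 1 - p_retain],
                      [p_winback, 1 - p_winback]])
        R = np.array([margin, 0.0])   # reward earned in each state

        # CLV as the expected discounted reward stream: V = sum_t (P/(1+d))^t R,
        # which has the closed form V = (I - P/(1+d))^{-1} R
        V = np.linalg.solve(np.eye(2) - P / (1 + discount), R)
        print(f"CLV starting as a customer:        {V[0]:.2f}")
        print(f"CLV starting as a former customer: {V[1]:.2f}")

        # Setting p_winback = 0 recovers a Pfeifer-and-Carraway-style two-state model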

  4. Mean-value second-order uncertainty analysis method: application to water quality modelling

    Science.gov (United States)

    Mailhot, Alain; Villeneuve, Jean-Pierre

    Uncertainty analysis in hydrology and water quality modelling is an important issue. Various methods have been proposed to estimate uncertainties on model results based on given uncertainties on model parameters. Among these methods, the mean-value first-order second-moment (MFOSM) method and the advanced mean-value first-order second-moment (AFOSM) method are the most common ones. This paper presents a method based on a second-order approximation of a model output function. The application of this method requires the estimation of first- and second-order derivatives at a mean-value point in the parameter space. Application to a Streeter-Phelps prototype model is presented. Uncertainties on two and six parameters are considered. Exceedance probabilities (EP) of dissolved oxygen concentrations are obtained and compared with EP computed using Monte Carlo, AFOSM and MFOSM methods. These results show that the mean-value second-order method leads to better estimates of EP.
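
    A minimal numerical sketch of the mean-value second-order idea (a toy output function and assumed parameter statistics, not the Streeter-Phelps model of the paper): the output mean is corrected with the trace of the Hessian times the parameter covariance, evaluated at the parameter mean, while the variance here is kept at first order.

        import numpy as np

        def model(theta):
            """Toy output function standing in for a water-quality model."""
            k1, k2 = theta
            return 8.0 * np.exp(-k1) + 0.5 * k2 ** 2

        mu = np.array([0.3, 1.0])                   # assumed parameter means
        cov = np.diag([0.05 ** 2, 0.2 ** 2])        # assumed parameter covariance

        # First- and second-order derivatives by central finite differences
        h = 1e-4
        n = len(mu)
        grad = np.zeros(n)
        hess = np.zeros((n, n))
        for i in range(n):
            ei = np.zeros(n); ei[i] = h
            grad[i] = (model(mu + ei) - model(mu - ei)) / (2 * h)
            for j in range(n):
                ej = np.zeros(n); ej[j] = h
                hess[i, j] = (model(mu + ei + ej) - model(mu + ei - ej)
                              - model(mu - ei + ej) + model(mu - ei - ej)) / (4 * h * h)

        # Second-order approximation of the output mean; first-order variance term
        mean_2nd = model(mu) + 0.5 * np.trace(hess @ cov)
        var_1st = grad @ cov @ grad
        print(f"output mean (2nd order): {mean_2nd:.3f}")
        print(f"output std  (1st order): {np.sqrt(var_1st):.3f}")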

  5. EVA based model to estimate the Chinese listed company's intrinsic value

    Institute of Scientific and Technical Information of China (English)

    RAN Mao-sheng; ZHONG Tao; HU Guo-peng

    2005-01-01

    This article is based on the traditional intrinsic value assessment model. We employed assumptions about the differences in companies' future growth rates, taking into account the expected Economic Value Added (EVA) discount and the capital investment, to establish a high-growth model, a two-stage EVA discount model and a three-stage EVA discount model for intrinsic value assessment. These models eliminate the large fluctuation of free cash flow in calculating capital expenditure by setting aside the cash flow of the company's investment in the year and considering only the capital cost. This method needs only to assess the EVA flow in different years when probing the intrinsic value of a company, and thus gives more consistent conclusions than conventional methods.

  6. The economic value of CAD systems in structural design and construction: A modelling approach

    NARCIS (Netherlands)

    Chandansingh, R.A.

    1995-01-01

    A modelling approach is provided for the analysis of cost-effects of CAD systems. It aims to support strategic management of CAD systems in structural design and construction. The approach is based on the production digraph model of production processes, and the value-added model of information comm

  7. An Analysis of the Educational Value of Low-Fidelity Anatomy Models as External Representations

    Science.gov (United States)

    Chan, Lap Ki; Cheng, Maurice M. W.

    2011-01-01

    Although high-fidelity digital models of human anatomy based on actual cross-sectional images of the human body have been developed, reports on the use of physical models in anatomy teaching continue to appear. This article aims to examine the common features shared by these physical models and analyze their educational value based on the…

  8. Determination of Complex-Valued Parametric Model Coefficients Using Artificial Neural Network Technique

    Directory of Open Access Journals (Sweden)

    A. M. Aibinu

    2010-01-01

    A new approach for determining the coefficients of a complex-valued autoregressive (CAR) and complex-valued autoregressive moving average (CARMA) model using a complex-valued neural network (CVNN) technique is discussed in this paper. The CAR and complex-valued moving average (CMA) coefficients which constitute a CARMA model are computed simultaneously from the adaptive weights and coefficients of the linear activation functions in a two-layered CVNN. The performance of the proposed technique has been evaluated using simulated complex-valued data (CVD) with three different types of activation functions. The results show that the proposed method can accurately determine the model coefficients provided that the network is properly trained. Furthermore, application of the developed CVNN-based technique to MRI k-space reconstruction results in images with improved resolution.
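
    For orientation only, the sketch below estimates the coefficients of a simulated complex-valued AR(2) process with ordinary least squares; it is a conventional baseline for the same estimation problem, not the CVNN technique of the paper, and the coefficients and sample size are invented.

        import numpy as np

        rng = np.random.default_rng(5)
        a_true = np.array([0.5 + 0.3j, -0.2 + 0.1j])       # assumed CAR(2) coefficients
        n = 1000
        x = np.zeros(n, dtype=complex)
        noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
        for t in range(2, n):
            x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + noise[t]

        # Least-squares estimate of the complex AR coefficients (baseline, not the CVNN)
        X = np.column_stack([x[1:-1], x[:-2]])              # regressors x[t-1], x[t-2]
        y = x[2:]
        a_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(np.round(a_hat, 3))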

  9. Identifying E-Business Model:A Value Chain-Based Analysis

    Institute of Scientific and Technical Information of China (English)

    ZENG Qingfeng; HUANG Lihua

    2004-01-01

    E-business will change the ways that all companies do business, and most traditional businesses will evolve from their current business model to a combination of place and space via an e-business model. Choosing the proper e-business model therefore becomes an important strategic concern for a company to succeed. The main objective of this paper is to investigate an analysis framework for identifying e-business models based on the e-business process, moving from the value chain to the value net perspective. This paper provides a theoretical framework for identifying e-business models, and results in 11 e-business models. The strategic intent of every e-business model is discussed at the end of this paper. An enterprise e-business model design and implementation can be specified by the combination of one or more of the 11 e-business models.

  10. A NOVEL MULTI-VALUED BAM MODEL WITH IMPROVED ERROR-CORRECTING CAPABILITY

    Institute of Scientific and Technical Information of China (English)

    Zhang Daoqiang; Chen Songcan

    2003-01-01

    A Hyperbolic Tangent multi-valued Bi-directional Associative Memory (HTBAM) model is proposed in this letter. Two general energy functions are defined to prove the stability of one class of multi-valued Bi-directional Associative Memories (BAMs), with HTBAM being the special case. Simulation results show that HTBAM has a competitive storage capacity and much more error-correcting capability than other multi-valued BAMs.

  11. Structural and functional model of the value of a healthy lifestyle of students.

    Directory of Open Access Journals (Sweden)

    Mameri Farid.

    2011-08-01

    The main elements of the structural-functional model of the formation of values of a healthy lifestyle in students are analyzed. Some elements of educational technology and the teaching requirements for the organization of educational work on the formation of values of a healthy lifestyle in students are revealed. It is noted that the formation of values of a healthy lifestyle in students improves when its assessment criteria are enriched.

  12. Business Model Innovation to Create and Capture Resource Value in Future Circular Material Chains

    Directory of Open Access Journals (Sweden)

    Göran Roos

    2014-03-01

    This article briefly discusses the origins and development of the business model concept, resulting in a high-level definition. Against this backdrop, frameworks from the literature around green business models are presented, with examples of green business models and the business model innovation process. The article then discusses the origins and meaning of the different "green" concepts relevant for the circular value chain, concluding with a high-level definition. The article finally outlines the process by which a business model for a circular value chain can be developed, taking into account the social dilemmas that exist in these types of situations. The article concludes with the specific questions that need to be answered in order to create an appropriate business model for a circular value chain.

  13. Comparison of Optimal Portfolios Selected by Multicriterial Model Using Absolute and Relative Criteria Values

    Directory of Open Access Journals (Sweden)

    Branka Marasović

    2009-03-01

    In this paper we select an optimal portfolio on the Croatian capital market by using multicriterial programming. In accordance with modern portfolio theory, maximisation of returns at minimal risk should be the investment goal of any successful investor. However, contrary to the expectations of modern portfolio theory, tests carried out on a number of financial markets reveal the existence of other indicators important in portfolio selection. Considering the importance of variables other than return and risk, selection of the optimal portfolio becomes a multicriterial problem which should be solved by using appropriate techniques. In order to select an optimal portfolio, absolute values of criteria, like return, risk, price-to-earnings ratio (P/E), price-to-book ratio (P/B) and price-to-sales ratio (P/S), are included in our multicriterial model. However, a problem might occur because the mean values of some criteria differ significantly across sectors, and financial managers emphasize that comparison of the same criteria across different sectors could lead to wrong conclusions. In the second part of the paper, relative values of the previously stated criteria (in relation to the sector mean) are included in the model for selecting the optimal portfolio. Furthermore, the paper shows that if relative values of criteria are included in the multicriterial model for selecting the optimal portfolio, the return in the subsequent period is considerably higher than if absolute values of the same criteria were used.

  14. COMPARISON OF OPTIMAL PORTFOLIOS SELECTED BY MULTICRITERIAL MODEL USING ABSOLUTE AND RELATIVE CRITERIA VALUES

    Directory of Open Access Journals (Sweden)

    Branka Marasović

    2009-01-01

    In this paper we select an optimal portfolio on the Croatian capital market by using multicriterial programming. In accordance with modern portfolio theory, maximisation of returns at minimal risk should be the investment goal of any successful investor. However, contrary to the expectations of modern portfolio theory, tests carried out on a number of financial markets reveal the existence of other indicators important in portfolio selection. Considering the importance of variables other than return and risk, selection of the optimal portfolio becomes a multicriterial problem which should be solved by using appropriate techniques. In order to select an optimal portfolio, absolute values of criteria, like return, risk, price-to-earnings ratio (P/E), price-to-book ratio (P/B) and price-to-sales ratio (P/S), are included in our multicriterial model. However, a problem might occur because the mean values of some criteria differ significantly across sectors, and financial managers emphasize that comparison of the same criteria across different sectors could lead to wrong conclusions. In the second part of the paper, relative values of the previously stated criteria (in relation to the sector mean) are included in the model for selecting the optimal portfolio. Furthermore, the paper shows that if relative values of criteria are included in the multicriterial model for selecting the optimal portfolio, the return in the subsequent period is considerably higher than if absolute values of the same criteria were used.

  15. The Development of a Qualitative Dynamic Attribute Value Model for Healthcare Institutes

    Directory of Open Access Journals (Sweden)

    Wan- I Lee

    2010-12-01

    Background: Understanding customers has become an urgent topic for increasing competitiveness. The purpose of the study was to develop a qualitative dynamic attribute value model which provides insight into customers' value for healthcare institute managers, starting with an initial open-ended questionnaire survey to select participants purposefully. Methods: A total of 427 questionnaires were administered in two hospitals in Taiwan (one district hospital with 635 beds and one academic hospital with 2495 beds) and 419 questionnaires were received in nine weeks. Qualitative in-depth interviews were then applied to explore customers' perspectives of value for building a model of partial differential equations. Results: This study concludes nine categories of value, including cost, equipment, physician background, physician care, environment, timing arrangement, relationship, brand image and additional value, to construct an objective network for customer value and a qualitative dynamic attribute value model, where the network shows the value process of loyalty development via its effect on customer satisfaction, customer relationship, customer loyalty and healthcare service. Conclusion: One set predicts the customer relationship based on commitment, including service quality, communication and empathy. At the same time, customer loyalty is based on trust and involves buzz marketing, brand and image. Customer value of the current instance is useful for traversing original customer attributes and identifying customers in different service shares.

  16. The development of a qualitative dynamic attribute value model for healthcare institutes.

    Science.gov (United States)

    Lee, Wan-I

    2010-01-01

    Understanding customers has become an urgent topic for increasing competitiveness. The purpose of the study was to develop a qualitative dynamic attribute value model which provides insight into customers' value for healthcare institute managers, starting with an initial open-ended questionnaire survey to select participants purposefully. A total of 427 questionnaires were administered in two hospitals in Taiwan (one district hospital with 635 beds and one academic hospital with 2495 beds) and 419 questionnaires were received in nine weeks. Qualitative in-depth interviews were then applied to explore customers' perspectives of value for building a model of partial differential equations. This study concludes nine categories of value, including cost, equipment, physician background, physician care, environment, timing arrangement, relationship, brand image and additional value, to construct an objective network for customer value and a qualitative dynamic attribute value model where the network shows the value process of loyalty development via its effect on customer satisfaction, customer relationship, customer loyalty and healthcare service. One set predicts the customer relationship based on commitment, including service quality, communication and empathy. At the same time, customer loyalty is based on trust and involves buzz marketing, brand and image. Customer value of the current instance is useful for traversing original customer attributes and identifying customers in different service shares.

  17. Rethinking Teacher Evaluation: A Conversation about Statistical Inferences and Value-Added Models

    Science.gov (United States)

    Callister Everson, Kimberlee; Feinauer, Erika; Sudweeks, Richard R.

    2013-01-01

    In this article, the authors provide a methodological critique of the current standard of value-added modeling forwarded in educational policy contexts as a means of measuring teacher effectiveness. Conventional value-added estimates of teacher quality are attempts to determine to what degree a teacher would theoretically contribute, on average,…

  18. How to Gain Value from a Project Management Model: Case Study

    DEFF Research Database (Denmark)

    Riis, Eva; Eskerod, Pernille

    2010-01-01

    Recent research has shown that PM models can be a powerful creator of value for companies. This paper extends this research, aiming to gain a deeper understanding of the preconditions that must exist for harvesting the value of a common frame of reference for project management. It reports f...

  19. 40 CFR 600.207-93 - Calculation of fuel economy values for a model type.

    Science.gov (United States)

    2010-07-01

    ... base level. (7) For alcohol dual fuel automobiles and natural gas dual fuel automobiles the procedures... combined fuel economy values from the tests performed using gasoline or diesel test fuel. (ii) Calculate... economy values for the model type. (5) For alcohol dual fuel automobiles and natural gas dual fuel...

  20. Balanced Realization and Model Order Reduction for Nonlinear Systems based on Singular Value Analysis

    NARCIS (Netherlands)

    Fujimoto, Kenji; Scherpen, Jacquelien M. A.

    2010-01-01

    This paper discusses balanced realization and model order reduction for both continuous-time and discrete-time general nonlinear systems based on singular value analysis of the corresponding Hankel operators. Singular value analysis clarifies the gain structure of a given nonlinear operator. Here it

  2. Model of Values-Based Management Process in Schools: A Mixed Design Study

    Science.gov (United States)

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate the school administrators' values-based management behaviours according to the teachers' perceptions and opinions and, accordingly, to build a model of values-based management process in schools. The study was conducted using explanatory design which is inclusive of both quantitative and qualitative methods.…

  3. Seismic quiescence and b-value decrease before large events in forest-fire model

    CERN Document Server

    Mitsudo, Tetsuya; Kato, Naoyuki

    2015-01-01

    Forest-fire models may be interpreted as a simple model for earthquake occurrence by translating trees and fire into stressed segments of a fault and their rupture, respectively. Here we adopt a two-dimensional forest-fire model in continuous time, and focus on temporal changes of seismicity and the b-value. We find a b-value decrease and seismic quiescence prior to large earthquakes by stacking many sequences leading up to large earthquakes. As the magnitude-frequency relation in this model is directly related to the cluster-size distribution, the decrease of the b-value can be explained in terms of the change in the cluster-size distribution. A decrease of the b-value means that small clusters of stressed sites aggregate into a larger cluster. Seismic quiescence may be attributed to the decrease of stressed sites that do not belong to percolated clusters.
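
    For readers unfamiliar with the b-value, it is the slope of the Gutenberg-Richter magnitude-frequency relation log10 N(>=M) = a - bM and is commonly estimated with Aki's maximum-likelihood formula; the sketch below applies that formula to a synthetic catalogue, not to the forest-fire-model sequences of the paper.

        import numpy as np

        rng = np.random.default_rng(4)
        m_c = 2.0                                            # completeness magnitude
        b_true = 1.0
        # Synthetic catalogue: exponential magnitudes above m_c correspond to b = 1
        mags = m_c + rng.exponential(scale=1 / (b_true * np.log(10)), size=2000)

        # Aki (1965) maximum-likelihood estimator of the b-value
        b_hat = np.log10(np.e) / (mags.mean() - m_c)
        print(f"estimated b-value: {b_hat:.2f}")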

  4. A conceptual model of channel choice : measuring online and offline shopping value perceptions

    NARCIS (Netherlands)

    Broekhuizen, Thijs L.J.; Jager, Wander

    2004-01-01

    This study tries to understand how consumers evaluate channels for their purchasing. Specifically, it develops a conceptual model that addresses consumer value perceptions of using the Internet versus the traditional (physical) channel. Previous research showed that perceptions of price, product

  5. Teachers’ individual action theories about competence-based education: the value of the cognitive apprenticeship model

    OpenAIRE

    2009-01-01

    Seezink, A., Poell, R. F., & Kirschner, P. A. (2009). Teachers' individual action theories about competence-based education: The value of the cognitive apprenticeship model. Journal of Vocational Education & Training, 61, 203-215.

  6. Building a Values-Informed Mental Model for New Orleans Climate Risk Management.

    Science.gov (United States)

    Bessette, Douglas L; Mayer, Lauren A; Cwik, Bryan; Vezér, Martin; Keller, Klaus; Lempert, Robert J; Tuana, Nancy

    2017-01-13

    Individuals use values to frame their beliefs and simplify their understanding when confronted with complex and uncertain situations. The high complexity and deep uncertainty involved in climate risk management (CRM) lead to individuals' values likely being coupled to and contributing to their understanding of specific climate risk factors and management strategies. Most mental model approaches, however, which are commonly used to inform our understanding of people's beliefs, ignore values. In response, we developed a "Values-informed Mental Model" research approach, or ViMM, to elicit individuals' values alongside their beliefs and determine which values people use to understand and assess specific climate risk factors and CRM strategies. Our results show that participants consistently used one of three values to frame their understanding of risk factors and CRM strategies in New Orleans: (1) fostering a healthy economy, wealth, and job creation, (2) protecting and promoting healthy ecosystems and biodiversity, and (3) preserving New Orleans' unique culture, traditions, and historically significant neighborhoods. While the first value frame is common in analyses of CRM strategies, the latter two are often ignored, despite their mirroring commonly accepted pillars of sustainability. Other values like distributive justice and fairness were prioritized differently depending on the risk factor or strategy being discussed. These results suggest that the ViMM method could be a critical first step in CRM decision-support processes and may encourage adoption of CRM strategies more in line with stakeholders' values.

  7. Coordination buyer-supplier in supply chain models from net present value perspective

    OpenAIRE

    Hamontree, Chaowalit

    2014-01-01

    This thesis examines four parts of production and inventory models for buyer-supplier in the supply chain under deterministic conditions. The main objective is to find optimal lot-sizing decisions and inventory policies which derive from the classical inventory and Net Present Value (NPV) framework. Firstly, we study the production and inventory models from the classical framework to identify how to value the holding cost for buyer and supplier in the average profit or cost functions. Secondl...

  8. Using many pilot points and singular value decomposition in groundwater model calibration

    DEFF Research Database (Denmark)

    Christensen, Steen; Doherty, John

    2008-01-01

    over the model area. Singular value decomposition (SVD) of the normal matrix is used to reduce the large number of pilot point parameters to a smaller number of so-called super parameters that can be estimated by nonlinear regression from the available observations. A number of eigenvectors...... corresponding to significant eigenvalues (resulting from the decomposition) are used to transform the model from having many pilot point parameters to having a few super parameters. A synthetic case model is used to analyze and demonstrate the application of the presented method of model parameterization...
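
    A minimal sketch of the parameter-reduction step (with a random synthetic Jacobian rather than the groundwater model of the paper): the SVD of the sensitivity matrix, equivalently an eigen-decomposition of the normal matrix, supplies a small basis of super parameters in which the regression is carried out.

        import numpy as np

        rng = np.random.default_rng(1)
        n_obs, n_par = 60, 300                 # few observations, many pilot-point parameters
        J = rng.normal(size=(n_obs, n_par))    # synthetic Jacobian d(obs)/d(parameter)

        # SVD of the Jacobian (equivalently, eigen-decomposition of the normal matrix J'J)
        U, s, Vt = np.linalg.svd(J, full_matrices=False)

        # Keep only directions with significant singular values
        k = int(np.sum(s > 0.05 * s[0]))
        V_k = Vt[:k].T                         # (n_par x k) basis of super parameters

        # Super parameters beta relate to native parameters p via p = p0 + V_k @ beta,
        # so the regression estimates k values instead of n_par.
        print(f"native parameters: {n_par}, super parameters estimated: {k}")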

  9. A reformulation of the Cost Plus Net Value Change (C+NVC) model of wildfire economics

    Science.gov (United States)

    Geoffrey H. Donovan; Douglas B. Rideout

    2003-01-01

    The Cost plus Net Value Change (C+NVC) model provides the theoretical foundation for wildland fire economics and provides the basis for the National Fire Management Analysis System (NFMAS). The C+NVC model is based on the earlier least Cost plus Loss model (LC+L) expressed by Sparhawk (1925). Mathematical and graphical analysis of the LC+L model illustrates two errors...

  10. SPSS macros to compare any two fitted values from a regression model.

    Science.gov (United States)

    Weaver, Bruce; Dubois, Sacha

    2012-12-01

    In regression models with first-order terms only, the coefficient for a given variable is typically interpreted as the change in the fitted value of Y for a one-unit increase in that variable, with all other variables held constant. Therefore, each regression coefficient represents the difference between two fitted values of Y. But the coefficients represent only a fraction of the possible fitted value comparisons that might be of interest to researchers. For many fitted value comparisons that are not captured by any of the regression coefficients, common statistical software packages do not provide the standard errors needed to compute confidence intervals or carry out statistical tests, particularly in more complex models that include interactions, polynomial terms, or regression splines. We describe two SPSS macros that implement a matrix algebra method for comparing any two fitted values from a regression model. The !OLScomp and !MLEcomp macros are for use with models fitted via ordinary least squares and maximum likelihood estimation, respectively. The output from the macros includes the standard error of the difference between the two fitted values, a 95% confidence interval for the difference, and a corresponding statistical test with its p-value.
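
    The matrix-algebra idea behind the macros can be sketched in a few lines, here in Python with statsmodels and simulated data rather than SPSS: the difference between two fitted values is a contrast c'b, its standard error is sqrt(c' Cov(b) c), and the confidence interval and t-test follow directly. All variable names and data below are illustrative.

        import numpy as np
        import statsmodels.api as sm
        from scipy import stats

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 10, 200)
        y = 1.0 + 0.8 * x - 0.05 * x ** 2 + rng.normal(0, 1, 200)   # simulated data

        X = sm.add_constant(np.column_stack([x, x ** 2]))            # model with a squared term
        fit = sm.OLS(y, X).fit()

        # Two covariate profiles to compare, e.g. x = 2 versus x = 7
        x1 = np.array([1.0, 2.0, 2.0 ** 2])
        x2 = np.array([1.0, 7.0, 7.0 ** 2])
        c = x2 - x1                                                  # contrast vector

        diff = c @ fit.params
        se = np.sqrt(c @ fit.cov_params() @ c)
        t = diff / se
        df = fit.df_resid
        ci = diff + np.array([-1, 1]) * stats.t.ppf(0.975, df) * se
        p = 2 * stats.t.sf(abs(t), df)
        print(f"difference = {diff:.3f}, SE = {se:.3f}, "
              f"95% CI = [{ci[0]:.3f}, {ci[1]:.3f}], p = {p:.4f}")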

  11. Towards an Integrated Value Adding Management Model for FM and CREM

    DEFF Research Database (Denmark)

    Jensen, Per Anker; van der Voordt, Theo

    2016-01-01

    frameworks are too complex and lack common terminology and clear operationalisations of intervention-impact relationships. Approach (Theory/Methodology): A generalised Value Adding Management process model is developed based on a common cause-effect model identified in existing conceptual frameworks...... Purpose: To present an integrated process model of adding value by Facilities Management (FM) and Corporate Real Estate Management (CREM) that is a generalisation of existing conceptual frameworks and aims to be a basis for management of added value in practice. Background: The growing research...... on the added value of FM and CREM over the last decade has resulted in the development of several conceptual frameworks and the collection of much empirical data in practice. However, the practical application of current knowledge has shown to be limited and difficult. The reasons seem to be that the different

  12. A two process model of burnout and work engagement: distinct implications of demands and values.

    Science.gov (United States)

    Leiter, M P

    2008-01-01

    A model of job burnout proposes two distinct processes. The first process concerns balance of demands to resources. A poor balance leads to chronic exhaustion, an integral aspect of the burnout syndrome. The second process concerns the congruence of individual and organizational values. The model proposes that value conflicts have implications for all three aspects of burnout. It also proposes that the impact of value conflicts has only minor implications for the exhaustion aspect of burnout; they are more relevant for the cynicism and inefficacy aspects of the syndrome. The model considers distinct processes at work that concern employees' perception of organizational justice and their trust in leadership. With a sample of 725 nurses, the analysis tested one component of the theory: the extent to which value congruence enhances the prediction of burnout beyond the prediction provided by demands and resources. Future directions are discussed.

  13. Representation of Solar Capacity Value in the ReEDS Capacity Expansion Model

    Energy Technology Data Exchange (ETDEWEB)

    Sigrin, B.; Sullivan, P.; Ibanez, E.; Margolis, R.

    2014-03-01

    An important issue for electricity system operators is the estimation of renewables' capacity contributions to reliably meeting system demand, or their capacity value. While the capacity value of thermal generation can be estimated easily, assessment of wind and solar requires a more nuanced approach due to resource variability. Reliability-based methods, particularly assessment of the Effective Load-Carrying Capacity (ELCC), are considered to be the most robust and widely accepted techniques for addressing this resource variability. This report compares estimates of solar PV capacity value from the Regional Energy Deployment System (ReEDS) capacity expansion model against two sources. The first comparison is against values published by utilities or other entities for known electrical systems at existing solar penetration levels. The second comparison is against a time-series ELCC simulation tool for high renewable penetration scenarios in the Western Interconnection. Results from the ReEDS model agree well in both comparisons, despite being resolved at a super-hourly temporal resolution. Two results are relevant for other capacity-based models that use a super-hourly resolution to model solar capacity value. First, solar capacity value should not be parameterized as a static value, but must decay with increasing penetration. This is because, for an afternoon-peaking system, as solar penetration increases the system's peak net load shifts to later in the day, when solar output is lower. Second, long-term planning models should determine system adequacy requirements in each time period in order to approximate LOLP calculations. Within the ReEDS model we resolve these issues by using a capacity value estimate that varies by time-slice. Within each time period, the net load and the shadow price on ReEDS's planning reserve constraint signal the relative importance of additional firm capacity.

  14. Expectancy-Value models of health behaviour: the role of salience and anticipated affect

    NARCIS (Netherlands)

    van der Pligt, J.; de Vries, N.K.

    1998-01-01

    Expectancy-value models of health behaviour are based upon the assumption that this behaviour is determined by a subjective cost-benefit analysis. Generally, these models emphasize cognitive appraisal processes focusing on the likelihood and evaluation of the consequences of health-related

  15. Oxygen and hydrogen isotope ratios in tree rings: how well do models predict observed values?

    CSIR Research Space (South Africa)

    Waterhouse, JS

    2002-07-30

    the trunk, it is possible to model the observed annual values of oxygen isotope ratios of alpha-cellulose to a significant level (r = 0.77, P < 0.01). When the same model is applied to hydrogen isotope ratios, results are found, and predictions can be made...

  16. Value-Added Models for Teacher Preparation Programs: Validity and Reliability Threats, and a Manageable Alternative

    Science.gov (United States)

    Brady, Michael P.; Heiser, Lawrence A.; McCormick, Jazarae K.; Forgan, James

    2016-01-01

    High-stakes standardized student assessments are increasingly used in value-added evaluation models to connect teacher performance to P-12 student learning. These assessments are also being used to evaluate teacher preparation programs, despite validity and reliability threats. A more rational model linking student performance to candidates who…

  17. Expectancy-Value models of health behaviour: the role of salience and anticipated affect

    NARCIS (Netherlands)

    van der Pligt, J.; de Vries, N.K.

    1998-01-01

    Expectancy-value models of health behaviour are based upon the assumption that this behaviour is determined by a subjective cost-benefit analysis. Generally, these models emphasize cognitive appraisal processes focusing on the likelihood and evaluation of the consequences of health-related behaviour

  18. Operationalizing the Concept of Value--An Action Research-Based Model

    Science.gov (United States)

    Naslund, Dag; Olsson, Annika; Karlsson, Sture

    2006-01-01

    Purpose: While the importance of measuring customer satisfaction levels is well established, less research exists on how organizations operationalize such knowledge. The purpose of this paper is to describe an action research (AR) case study resulting in a workshop model to operationalize the concept of value. The model facilitates organizational…

  19. Predictor Relationships between Values Held by Married Individuals, Resilience and Conflict Resolution Styles: A Model Suggestion

    Science.gov (United States)

    Tosun, Fatma; Dilmac, Bulent

    2015-01-01

    The aim of the present research is to reveal the predictor relationships between the values held by married individuals, resilience and conflict resolution styles. The research adopts a relational screening model that is a sub-type of the general screening model. The sample of the research consists of 375 married individuals, of which 173 are…

  20. Analysing stratified medicine business models and value systems: innovation-regulation interactions.

    Science.gov (United States)

    Mittra, James; Tait, Joyce

    2012-09-15

    Stratified medicine offers both opportunities and challenges to the conventional business models that drive pharmaceutical R&D. Given the increasingly unsustainable blockbuster model of drug development, due in part to maturing product pipelines, alongside increasing demands from regulators, healthcare providers and patients for higher standards of safety, efficacy and cost-effectiveness of new therapies, stratified medicine promises a range of benefits to pharmaceutical and diagnostic firms as well as healthcare providers and patients. However, the transition from 'blockbusters' to what might now be termed 'niche-busters' will require the adoption of new, innovative business models, the identification of different and perhaps novel types of value along the R&D pathway, and a smarter approach to regulation to facilitate innovation in this area. In this paper we apply the Innogen Centre's interdisciplinary ALSIS methodology, which we have developed for the analysis of life science innovation systems in contexts where the value creation process is lengthy, expensive and highly uncertain, to this emerging field of stratified medicine. In doing so, we consider the complex collaboration, timing, coordination and regulatory interactions that shape business models, value chains and value systems relevant to stratified medicine. More specifically, we explore in some depth two convergence models for co-development of a therapy and diagnostic before market authorisation, highlighting the regulatory requirements and policy initiatives within the broader value system environment that have a key role in determining the probable success and sustainability of these models.

  1. Exploring the Relationship Between Business Model Innovation, Corporate Sustainability, and Organisational Values within the Fashion Industry

    DEFF Research Database (Denmark)

    Pedersen, Esben Rahbek Gjerdrum; Gwozdz, Wencke; Hvass, Kerli Kant

    2016-01-01

    The objective of this paper is to examine the relationship between business model innovation, corporate sustainability, and the underlying organisational values. Moreover, the paper examines how the three dimensions correlate with corporate financial performance. It is concluded that companies...... with innovative business models are more likely to address corporate sustainability and that business model innovation and corporate sustainability alike are typically found in organisations rooted in values of flexibility and discretion. Business model innovation and corporate sustainability thus seem to have...... their origin in the fundamental principles guiding the organisation. In addition, the study also finds a positive relationship between the core organisational values and financial performance. The analysis of the paper is based on survey responses from 492 managers within the Swedish fashion industry....

  2. GIS SUPPORTED HEDONIC MODEL FOR ASSESSING PROPERTY VALUE IN WEST OAKLAND, CALIFORNIA

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A hedonic linear regression model is constructed in this paper to estimate property value. In our model, the property value (sales price) is a function of several selected variables such as the property characteristics, social neighborhoods, the level of neighborhood environmental contamination, the level of neighborhood crime, and locational accessibility to jobs or services. Definitions and calculation of these variables are approached by using Geographic Information System tools. To improve estimation, a gravity model is employed to measure both the level of neighborhood toxic sites and the level of crime, and a time-based method is used to measure locational accessibility rather than a simple straight-line distance measurement. This study finds that the relationship between house value and the nearby highway is nonlinear. The methodology could help policy makers assess the external effects of a property. Our model could also potentially be used to identify current and historic trends of development caused by neighborhood or environmental change in the study area.
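
    As a hedged sketch of the general approach (all variables, coefficients and data below are synthetic inventions, not the West Oakland dataset or the paper's specification), a gravity-style exposure index can be computed from distances to nearby toxic sites and entered into an OLS hedonic price regression:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 500
        sqft = rng.uniform(800, 3000, n)
        dist_to_sites = rng.uniform(0.2, 5.0, (n, 10))      # miles to 10 hypothetical toxic sites

        # Gravity-style exposure: nearby sites weigh more (inverse-distance-squared decay)
        toxic_exposure = (1.0 / dist_to_sites ** 2).sum(axis=1)

        # Synthetic sale prices for illustration only
        price = 50_000 + 120 * sqft - 8_000 * toxic_exposure + rng.normal(0, 20_000, n)

        X = sm.add_constant(np.column_stack([sqft, toxic_exposure]))
        fit = sm.OLS(price, X).fit()
        print(fit.params)          # implied marginal effects of size and exposure on price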

  3. Climate change impact on groundwater levels: ensemble modelling of extreme values

    Directory of Open Access Journals (Sweden)

    J. Kidmose

    2012-06-01

    This paper presents a first attempt to estimate future groundwater levels by applying extreme value statistics to predictions from a hydrological model. Climate for the future period, 2081–2100, is represented by projections from nine combinations of three global climate models and six regional climate models, downscaled with two different methods. An integrated surface water/groundwater model is forced with precipitation, temperature, and evapotranspiration from the 18 model and downscaling combinations. Extreme value analyses are performed on the hydraulic head changes from a control period (1991–2010) to the future period for the 18 combinations. Hydraulic heads for return periods of 21, 50 and 100 yr (T21–100) are estimated. Three uncertainty sources are evaluated: climate models, downscaling and extreme value statistics. Of these sources, downscaling dominates for the higher return periods of 50 and 100 yr, whereas uncertainties from climate models and downscaling are similar for lower return periods. Uncertainty from the extreme value statistics contributes only up to around 10% of the uncertainty from the three sources.
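
    A minimal sketch of the extreme-value step only (synthetic annual maxima; the ensemble of climate models, downscaling methods and the hydrological model are not reproduced): fit a GEV distribution and read off the 21-, 50- and 100-year return levels as the quantiles with annual exceedance probability 1/T.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(3)
        annual_max_head = rng.gumbel(loc=1.0, scale=0.3, size=20)   # synthetic annual maxima (m)

        # Fit a GEV distribution (the Gumbel distribution is the shape = 0 special case)
        shape, loc, scale = genextreme.fit(annual_max_head)

        for T in (21, 50, 100):
            # Return level = quantile with annual exceedance probability 1/T
            level = genextreme.ppf(1 - 1.0 / T, shape, loc=loc, scale=scale)
            print(f"T = {T:3d} yr: head change of about {level:.2f} m")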

  4. The Boolean—Valued Model of the Conglomerate Axiom System ACG

    Institute of Scientific and Technical Information of China (English)

    李娜

    1993-01-01

    The paper [1] constructs the conglomerate axiom system ACG in order to research the foundations of Category Theory. This paper constructs the model Ω(B) (where B is a complete Boolean algebra) on the basis of the models △(B) (see [2]) and ∧(B) (see [5]), and proves: (1) Ω(B) is a Boolean-valued model of the conglomerate axiom system ACG; (2) the maximum and minimum principles are true in Ω(B).

  5. On the initial value problem for a class of discrete velocity models.

    Science.gov (United States)

    Bellandi, Davide

    2017-02-01

    In this paper we investigate the initial value problem for a class of hyperbolic systems arising in the mathematical modeling of a class of complex phenomena, with emphasis on vehicular traffic flow. Existence and uniqueness for large times of solutions, a basic requisite both for model building and for numerical implementation, are obtained under weak hypotheses on the terms modeling the interaction among agents. The results are then compared with the existing literature on the subject.

  6. Modeling Dynamics of Leaf Color Based on RGB Value in Rice

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yong-hui; TANG Liang; LIU Xiao-jun; LIU Lei-lei; CAO Wei-xing; ZHU Yan

    2014-01-01

    This paper aimed to develop a model for simulating the leaf color changes in rice (Oryza sativa L.) based on RGB (red, green, and blue) values. Based on rice experiment data with different cultivars and nitrogen (N) rates, the time-course RGB values of each leaf on the main stem were collected during the growth period in rice, and a model for simulating the dynamics of leaf color in rice was then developed using quantitative modeling technology. The results showed that the RGB values of leaf color gradually decreased from the initial values (light green) to the steady values (green) during the first stage, remained at the steady values (green) during the second stage, then gradually increased to the final values (from green to yellow) during the third stage. Decreasing linear functions, constant functions and increasing linear functions were used to simulate the changes in RGB values of leaf color at the first, second and third stages with growing degree days (GDD), respectively; two cultivar parameters, MatRGB (leaf color matrix) and AR (a vector composed of the ratio of the cumulative GDD of each stage during the color change process of leaf n to that during leaf n drawn under adequate N status), were introduced to quantify the genetic characters in RGB values of leaf color and in the durations of different stages during leaf color change, respectively; FN (N impact factor) was used to quantify the effects of N levels on RGB values of leaf color and on the durations of different stages during leaf color change; linear functions were applied to simulate the changes in leaf color along the leaf midvein direction during the leaf development process. Validation of the models with the independent experiment dataset showed that the root mean square errors (RMSE) between the observed and simulated RGB values ranged from 8 to 13, the relative RMSE (RRMSE) ranged from 8 to 10%, the mean absolute differences (da) ranged from 3.85 to 6.90, and the ratio of da to the mean observation values (dap
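
    The three-stage dynamics described above can be illustrated with a simple piecewise-linear function of cumulative growing degree days; the breakpoints and RGB levels below are invented for illustration and are not the fitted cultivar parameters (MatRGB, AR) or N factors of the paper.

        import numpy as np

        def green_channel(gdd, g_init=120, g_steady=80, g_final=160,
                          t1=200, t2=900, t3=1200):
            """Piecewise-linear green (G) value vs. cumulative GDD: decrease, plateau, increase."""
            gdd = np.asarray(gdd, dtype=float)
            g = np.empty_like(gdd)
            stage1 = gdd <= t1
            stage2 = (gdd > t1) & (gdd <= t2)
            stage3 = gdd > t2
            g[stage1] = g_init + (g_steady - g_init) * gdd[stage1] / t1        # light green -> green
            g[stage2] = g_steady                                               # steady green
            g[stage3] = g_steady + (g_final - g_steady) * np.minimum(
                (gdd[stage3] - t2) / (t3 - t2), 1.0)                           # green -> yellow
            return g

        print(green_channel([0, 100, 500, 1000, 1300]))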

  7. A model of relations of value of property sites on the case of Ljubljana

    Directory of Open Access Journals (Sweden)

    Franc J. Zakrajšek

    2004-01-01

    The proposed model is a computer-aided system for evaluating the value of spatial sites of properties, both for present conditions and for simulations of future conditions following hypothetical directions of development and strategic or concrete spatial decisions. The model is built on classical methods of mass appraisal of property, which are supported by a geographical information model. It introduces significant novelties in the field: the use of methods of analytical hierarchical processing for determining technical coefficients of site advantages. The model was developed in the research project Development and implementation of a regional simulation model for the Ljubljana urban region, financed by the City Municipality of Ljubljana.

  8. A financial market model with endogenous fundamental values through imitative behavior.

    Science.gov (United States)

    Naimzada, Ahmad; Pireddu, Marina

    2015-07-01

    In this paper, we propose a financial market model with heterogeneous speculators, i.e., optimistic and pessimistic fundamentalists that, respectively, overestimate and underestimate the true fundamental value due to ambiguity in the stock market, which prevents them from relying on the true fundamental value in their speculations. Indeed, we assume that agents use in its place fundamental values determined by an imitative process. Namely, in forming their beliefs, speculators consider the relative profits realized by optimists and pessimists and update their fundamental values proportionally to those relative profits. Moreover, differently from the majority of the literature on the topic, the stock price is determined by a nonlinear mechanism that prevents divergence issues. For our model, we study, via analytical and numerical tools, the stability of the unique steady state, its bifurcations, as well as the emergence of complex behaviors. We also investigate multistability phenomena, characterized by the presence of coexisting attractors.

  10. Regional Modelling for Optimal Allocation of Agricultural Crops Considering Environmental Impacts, Housing Value and Leisure Preferences.

    OpenAIRE

    Haruvy, Nava; Shalhevet, Sarit

    2006-01-01

    Regional planning should consider the impact of agricultural crops on housing value and leisure, as well as on the local environment. We designed an optimization model for allocating agricultural crops based on farmers' profits as well as the impact on these three factors. Each crop creates a different landscape, as well as a different effect on shading and noise reduction. These in turn influence the value of nearby housing and the regional leisure opportunities. Each crop also has a positive...

  11. On decidability and model checking for a first order modal logic for value-passing processes

    Institute of Scientific and Technical Information of China (English)

    薛锐; 林惠民

    2003-01-01

    A semantic interpretation of a first order extension of Hennessy-Milner logic for value-passing processes, named HML(FO), is presented. The semantics is based on symbolic transition graphs with assignment. It is shown that the satisfiability of the two-variable sub-logic HML(FO2) of HML(FO) is decidable, and its complexity is discussed. Finally, a decision procedure for model checking value-passing processes with respect to HML(FO2) is obtained.

  12. Using the Outcome-Driven Innovation Approach to Develop a Customer Value Model for Lighting

    OpenAIRE

    Dalton, Nonie

    2012-01-01

    LED technology is on the cusp of disrupting the entire lighting industry if the industry can develop new lighting products that customers will want to use and purchase. Understanding what customers value in their lighting products is an open question for the industry. To answer this question and to provide a deeper understanding of customers' needs, I have applied the outcome-driven innovation approach. Through this approach the customer value model was developed, which identifies customers...

  13. Analytic approximations to nonlinear boundary value problems modeling beam-type nano-electromechanical systems

    Energy Technology Data Exchange (ETDEWEB)

    Zou, Li [Dalian Univ. of Technology, Dalian City (China). State Key Lab. of Structural Analysis for Industrial Equipment; Liang, Songxin; Li, Yawei [Dalian Univ. of Technology, Dalian City (China). School of Mathematical Sciences; Jeffrey, David J. [Univ. of Western Ontario, London (Canada). Dept. of Applied Mathematics

    2017-06-01

    Nonlinear boundary value problems arise frequently in physical and mechanical sciences. An effective analytic approach with two parameters is first proposed for solving nonlinear boundary value problems. It is demonstrated that solutions given by the two-parameter method are more accurate than solutions given by the Adomian decomposition method (ADM). It is further demonstrated that solutions given by the ADM can also be recovered from the solutions given by the two-parameter method. The effectiveness of this method is demonstrated by solving some nonlinear boundary value problems modeling beam-type nano-electromechanical systems.

  14. Analytic Approximations to Nonlinear Boundary Value Problems Modeling Beam-Type Nano-Electromechanical Systems

    Science.gov (United States)

    Zou, Li; Liang, Songxin; Li, Yawei; Jeffrey, David J.

    2017-03-01

    Nonlinear boundary value problems arise frequently in physical and mechanical sciences. An effective analytic approach with two parameters is first proposed for solving nonlinear boundary value problems. It is demonstrated that solutions given by the two-parameter method are more accurate than solutions given by the Adomian decomposition method (ADM). It is further demonstrated that solutions given by the ADM can also be recovered from the solutions given by the two-parameter method. The effectiveness of this method is demonstrated by solving some nonlinear boundary value problems modeling beam-type nano-electromechanical systems.

  15. Structural Models that Manage IT Portfolio Affecting Business Value of Enterprise Architecture

    Science.gov (United States)

    Kamogawa, Takaaki

    This paper examines the structural relationships between Information Technology (IT) governance and Enterprise Architecture (EA), with the objective of enhancing business value in the enterprise society. Structural models consisting of four related hypotheses reveal the relationship between IT governance and EA in the improvement of business values. We statistically examined the hypotheses by analyzing validated questionnaire items from respondents within firms listed on the Japanese stock exchange who were qualified to answer them. We concluded that firms which have organizational ability controlled by IT governance are more likely to deliver business value based on IT portfolio management.

  16. Value Chain Model Development of Tuna and Tuna Alike In Indonesia

    Directory of Open Access Journals (Sweden)

    Ateng Supriatna

    2014-11-01

    Full Text Available Sustainability of tuna and tuna alike fisheries depends on the value chain system formed by stakeholders, ranging from fishing, processing/diversification, and distribution to marketing. The objective of this research was to create a value chain system model of tuna and tuna alike fisheries, to predict the interaction pattern of stakeholders, and to formulate a precise strategy to minimize the synergy resistance in the development of the value chain system. Structural Equation Modelling (SEM) was applied to analyze the value chain model. The stakeholders/players of tuna and tuna alike fisheries basically have positive interactions (CE > 0). The negative interaction occurred between retailers and consumers. The interaction of retailers with consumers is significant (p < 0.05). The interaction pattern of fishermen, retailers, collectors, and exporters is affected significantly by pricing and by the level of the role played by the stakeholders. The strategies to minimize the synergy resistance in the development of the value chain system are, respectively, involving the group of fishermen in product pricing (CE = 1.176 and p = 0.000), involving the seller groups in product pricing (CE-PE = 1.08, CE-PB = 0.766, CE-EKS = 2.028, and p = 0.000), and guaranteeing flexible interaction between retailers and consumers (CE = 0.179 and p = 0.01). Keywords: tuna and tuna alike, interaction, tuna and tuna alike stakeholders, and value chain

  17. Evaluating the Value of High Spatial Resolution in National Capacity Expansion Models using ReEDS

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat; Cole, Wesley

    2016-11-14

    Power sector capacity expansion models (CEMs) have a broad range of spatial resolutions. This paper uses the Regional Energy Deployment System (ReEDS) model, a long-term national scale electric sector CEM, to evaluate the value of high spatial resolution for CEMs. ReEDS models the United States with 134 load balancing areas (BAs) and captures the variability in existing generation parameters, future technology costs, performance, and resource availability using very high spatial resolution data, especially for wind and solar modeled at 356 resource regions. In this paper we perform planning studies at three different spatial resolutions--native resolution (134 BAs), state-level, and NERC region level--and evaluate how results change under different levels of spatial aggregation in terms of renewable capacity deployment and location, associated transmission builds, and system costs. The results are used to ascertain the value of high geographically resolved models in terms of their impact on relative competitiveness among renewable energy resources.

  18. A comparison of total precipitation values estimated from measurements and a 1D cloud model

    Directory of Open Access Journals (Sweden)

    Z. Aslan

    Full Text Available The purpose of this study is to establish a relation between observed total precipitation values and estimations from a one-dimensional diagnostic cloud model. Total precipitation values estimated from maximum liquid water content, maximum vertical velocity, cloud top height, and temperature excess are also used to provide an equation for the total precipitation prediction. Data for this study were collected in Istanbul during the autumns of 1987 and 1988. The statistical models are developed with a multiple regression technique and then comparatively verified with independent data for 1990. The multiple regression coefficients are in the range of 75% to 80% in the statistical models. Results of the test showed that total precipitation values estimated from the above techniques are in good agreement, with correlation coefficients between 40% and 46% based on test data for 1990.

  19. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Abstract Background Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not too simple to lose the information from the alternative hypothesis. The first step is to transform distributions of different test statistics (e.g., t, chi-square or F) to distributions of corresponding p-values. We then use a step function to approximate each of the p-value distributions by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
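
    As a rough illustration of the first step described above, the sketch below (not from the paper) simulates the p-value distribution of a one-sided z-test under an assumed alternative and reports the mean and variance that the step-function approximation would be matched to; the effect size, sample size and the choice of a z-test are illustrative assumptions, and the step function itself is not reconstructed here.

      # Minimal sketch: p-value distribution under the alternative (assumed one-sided z-test).
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(4)
      effect, n, reps = 0.5, 50, 100_000            # assumed standardised effect and sample size

      z = rng.normal(loc=effect * np.sqrt(n), scale=1.0, size=reps)  # test statistic under H1
      p_values = norm.sf(z)                                          # one-sided p-values
      print(f"mean = {p_values.mean():.4g}, variance = {p_values.var():.4g}")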

  20. Business Value of Information Technology Service Quality Based on Probabilistic Business-Driven Model

    Directory of Open Access Journals (Sweden)

    Jaka Sembiring

    2015-08-01

    Full Text Available The business value of information technology (IT) services is often difficult to assess, especially from the point of view of a non-IT manager. This condition could severely impact organizational IT strategic decisions. Various approaches have been proposed to quantify the business value, but some are trapped in technical complexity while others misguide managers into directly and subjectively judging some technical entities outside their domain of expertise. This paper describes a method on how to properly capture both perspectives based on a probabilistic business-driven model. The proposed model presents a procedure to calculate the business value of IT services. The model also covers IT security services and their business value as an important aspect of IT services that is not covered in previously published research. The impact of changes in the quality of IT services on business value will also be discussed. A simulation and a case illustration are provided to show the possible application of the proposed model for a simple business process in an enterprise.

  1. The Impact of Different Absolute Solar Irradiance Values on Current Climate Model Simulations

    Science.gov (United States)

    Rind, David H.; Lean, Judith L.; Jonas, Jeffrey

    2014-01-01

    Simulations of the preindustrial and doubled CO2 climates are made with the GISS Global Climate Middle Atmosphere Model 3 using two different estimates of the absolute solar irradiance value: a higher value measured by solar radiometers in the 1990s and a lower value measured recently by the Solar Radiation and Climate Experiment. Each of the model simulations is adjusted to achieve global energy balance; without this adjustment the difference in irradiance produces a global temperature change of 0.48°C, comparable to the cooling estimated for the Maunder Minimum. The results indicate that by altering cloud cover the model properly compensates for the different absolute solar irradiance values on a global level when simulating both preindustrial and doubled CO2 climates. On a regional level, the preindustrial climate simulations and the patterns of change with doubled CO2 concentrations are again remarkably similar, but there are some differences. Using a higher absolute solar irradiance value and the requisite cloud cover affects the model's depictions of high-latitude surface air temperature, sea level pressure, and stratospheric ozone, as well as tropical precipitation. In the climate change experiments it leads to an underestimation of North Atlantic warming, reduced precipitation in the tropical western Pacific, and smaller total ozone growth at high northern latitudes. Although significant, these differences are typically modest compared with the magnitude of the regional changes expected for doubled greenhouse gas concentrations. Nevertheless, the model simulations demonstrate that achieving the highest possible fidelity when simulating regional climate change requires that climate models use as input the most accurate (lower) solar irradiance value.

  2. Science Models as Value-Added Services for Scholarly Information Systems

    CERN Document Server

    Mutschke, Peter; Schaer, Philipp; Sure, York

    2011-01-01

    The paper introduces scholarly Information Retrieval (IR) as a further dimension that should be considered in the science modeling debate. The IR use case is seen as a validation model of the adequacy of science models in representing and predicting structure and dynamics in science. Particular conceptualizations of scholarly activity and structures in science are used as value-added search services to improve retrieval quality: a co-word model depicting the cognitive structure of a field (used for query expansion), the Bradford law of information concentration, and a model of co-authorship networks (both used for re-ranking search results). An evaluation of the retrieval quality when science-model-driven services are used showed that the proposed models actually provide beneficial effects on retrieval quality. From an IR perspective, the models studied are therefore verified as expressive conceptualizations of central phenomena in science. Thus, it could be shown that the IR perspective can significantly...

  3. Modeling extreme PM10 concentration in Malaysia using generalized extreme value distribution

    Science.gov (United States)

    Hasan, Husna; Mansor, Nadiah; Salleh, Nur Hanim Mohd

    2015-05-01

    Extreme PM10 concentration from the Air Pollutant Index (API) at thirteen monitoring stations in Malaysia is modeled using the Generalized Extreme Value (GEV) distribution. The data are blocked into monthly selection periods. The Mann-Kendall (MK) test suggests a non-stationary model, so two models are considered for the stations with a trend. The likelihood ratio test is used to determine the best fitted model, and the result shows that only two stations favor the non-stationary model (Model 2) while the other eleven stations favor the stationary model (Model 1). The return level of PM10 concentration, i.e., the value expected to be exceeded once within a selected period, is obtained.
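
    A minimal sketch of this kind of analysis (not the paper's data or code) is given below: a stationary GEV is fitted to synthetic monthly maxima with SciPy and a return level is computed; the series, the 12-block return period and the use of scipy.stats.genextreme are illustrative assumptions.

      # Minimal sketch: fit a stationary GEV to block maxima and compute a return level.
      import numpy as np
      from scipy.stats import genextreme

      monthly_max = np.array([112., 98., 143., 130., 155., 101., 120., 160., 138., 125., 149., 133.])

      # SciPy's shape parameter c corresponds to -xi in the usual GEV parameterisation.
      c, loc, scale = genextreme.fit(monthly_max)

      m = 12                                               # return period in blocks (months)
      return_level = genextreme.ppf(1.0 - 1.0 / m, c, loc=loc, scale=scale)
      print(f"GEV fit: c={c:.3f}, loc={loc:.1f}, scale={scale:.1f}")
      print(f"{m}-month return level: {return_level:.1f}")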

  4. Simulating the Value of Concentrating Solar Power with Thermal Energy Storage in a Production Cost Model

    Energy Technology Data Exchange (ETDEWEB)

    Denholm, P.; Hummon, M.

    2012-11-01

    Concentrating solar power (CSP) deployed with thermal energy storage (TES) provides a dispatchable source of renewable energy. The value of CSP with TES, as with other potential generation resources, needs to be established using traditional utility planning tools. Production cost models, which simulate the operation of the grid, are often used to estimate the operational value of different generation mixes. CSP with TES has historically had limited analysis in commercial production simulations. This document describes the implementation of CSP with TES in a commercial production cost model. It also describes the simulation of grid operations with CSP in a test system consisting of two balancing areas located primarily in Colorado.

  5. Proximate analysis based multiple regression models for higher heating value estimation of low rank coals

    Energy Technology Data Exchange (ETDEWEB)

    Akkaya, Ali Volkan [Department of Mechanical Engineering, Yildiz Technical University, 34349 Besiktas, Istanbul (Turkey)

    2009-02-15

    In this paper, multiple nonlinear regression models for the estimation of the higher heating value of coals are developed using proximate analysis data obtained generally from low rank coal samples on an as-received basis. In this modeling study, three main model structures, which depend on the number of proximate analysis parameters used as independent variables (moisture, ash, volatile matter and fixed carbon), are first categorized. Secondly, sub-model structures with different arrangements of the independent variables are considered. Each sub-model structure is analyzed with a number of model equations in order to find the best fitting model using the multiple nonlinear regression method. Based on the results of the nonlinear regression analysis, the best model for each sub-structure is determined. Among them, the models giving the highest correlation for the three main structures are selected. Although all three selected models predict HHV rather accurately, the model involving four independent variables provides the most accurate estimation of HHV. Additionally, when the chosen model with four independent variables and a literature model are tested with extra proximate analysis data, it is seen that the model developed in this study can give a more accurate prediction of the HHV of coals. It can be concluded that the developed model is an effective tool for HHV estimation of low rank coals. (author)

  6. The Value of Multivariate Model Sophistication: An Application to pricing Dow Jones Industrial Average options

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars; Violante, Francesco

    in their specification of the conditional variance, conditional correlation, and innovation distribution. All models belong to the dynamic conditional correlation class, which is particularly suited because it allows the risk neutral dynamics to be estimated consistently with a manageable computational effort in relatively...... innovation for a Laplace innovation assumption improves the pricing in a smaller way. Apart from investigating directly the value of model sophistication in terms of dollar losses, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance....

  7. Four competing interactions for models with an uncountable set of spin values on a Cayley tree

    Science.gov (United States)

    Rozikov, U. A.; Haydarov, F. H.

    2017-06-01

    We consider models with four competing interactions (external field, nearest neighbor, second neighbor, and three neighbors) and an uncountable set [0, 1] of spin values on the Cayley tree of order two. We reduce the problem of describing the splitting Gibbs measures of the model to the problem of analyzing solutions of a nonlinear integral equation and study some particular cases for Ising and Potts models. We also show that periodic Gibbs measures for the given models either are translation invariant or have the period two. We present examples where periodic Gibbs measures with the period two are not unique.

  8. Interactive breast mass segmentation using a convex active contour model with optimal threshold values.

    Science.gov (United States)

    Acho, Sussan Nkwenti; Rae, William Ian Duncombe

    2016-10-01

    A convex active contour model requires a predefined threshold value to determine the global solution for the best contour to use when doing mass segmentation. Fixed thresholds or manual tuning of threshold values for optimum mass boundary delineation are impracticable. A method is proposed to determine an optimized mass-specific threshold value for the convex active contour, derived from the probability matrix of the mass with the particle swarm optimization method. We compared our results with the Chan-Vese segmentation and a published global segmentation model on masses detected on direct digital mammograms. The regional term of the convex active contour model maximizes the posterior partitioning probability for binary segmentation. If the probability matrix is binary thresholded using particle swarm optimization to obtain a value T1, the optimal threshold value for the global minimizer of the convex active contour is defined as the mean intensity of all pixels whose probabilities are greater than T1. The mean Jaccard similarity indices were 0.89±0.07 for the proposed/Chan-Vese method and 0.88±0.06 for the proposed/published segmentation model. The mean Euclidean distance between Fourier descriptors of the segmented areas was 0.05±0.03 for the proposed/Chan-Vese method and 0.06±0.04 for the proposed/published segmentation model. This efficient method avoids problems of initial level set contour placement and contour re-initialization. Moreover, optimum segmentation results are realized for all masses, improving on the fixed threshold value of 0.5 proposed elsewhere. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
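
    The thresholding rule stated above can be sketched as follows (a minimal illustration, not the authors' code): given an image region and its mass-probability matrix, and a value T1 that in the paper comes from particle swarm optimization, the contour threshold is the mean intensity over pixels whose probability exceeds T1; the arrays and the value of T1 below are synthetic assumptions.

      # Minimal sketch of the optimal-threshold rule described in the abstract.
      import numpy as np

      def optimal_threshold(image: np.ndarray, prob: np.ndarray, t1: float) -> float:
          """Mean intensity of all pixels whose mass probability exceeds t1."""
          mask = prob > t1
          if not mask.any():
              raise ValueError("t1 leaves no pixels above threshold")
          return float(image[mask].mean())

      rng = np.random.default_rng(0)
      image = rng.random((64, 64))     # stand-in for a mammogram region of interest
      prob = rng.random((64, 64))      # stand-in for the mass probability matrix
      t1 = 0.7                         # in the paper, obtained by particle swarm optimization
      print(optimal_threshold(image, prob, t1))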

  9. An ARIMA model for forecasting Wi-Fi data network traffic values

    Directory of Open Access Journals (Sweden)

    Cesar Augusto Hernández Suarez

    2010-07-01

    Full Text Available This present scientific and technological research was aimed at showing that time series represent an excellent tool for data traffic modelling within Wi-Fi networks. Box-Jenkins methodology (described herein) was used for this purpose. Wi-Fi traffic modelling through correlated models, such as time series, allowed a great part of the data's behavioural dynamics to be adjusted into a single equation and future traffic values to be estimated based on this. All this is advantageous when it comes to planning integrated coverage, reserving resources and performing more efficient and timely control at different levels of the Wi-Fi data network functional hierarchy. A sixth-order ARIMA traffic model was obtained as a research outcome, which predicted traffic with relatively small mean square error values for an 18-day term.
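
    A minimal sketch of the general workflow (not the paper's sixth-order model or data) is shown below, fitting an ARIMA model to a synthetic traffic series with statsmodels and forecasting ahead; the (2, 1, 1) order, the series and the 24-step horizon are illustrative assumptions.

      # Minimal sketch: fit an ARIMA model to a traffic series and forecast ahead.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(1)
      traffic = np.cumsum(rng.normal(size=200)) + 50.0   # stand-in for a Wi-Fi traffic series

      fit = ARIMA(traffic, order=(2, 1, 1)).fit()        # illustrative (p, d, q), not the paper's
      forecast = fit.forecast(steps=24)                  # next 24 periods
      print(forecast[:5])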

  10. Predictive error dependencies when using pilot points and singular value decomposition in groundwater model calibration

    DEFF Research Database (Denmark)

    Christensen, Steen; Doherty, John

    2008-01-01

    over the model area. Singular value decomposition (SVD) of the (possibly weighted) sensitivity matrix of the pilot point based model produces eigenvectors of which we pick a small number corresponding to significant eigenvalues. Super parameters are defined as factors through which parameter...... conditions near an inflow boundary where data is lacking and which exhibit apparent significant nonlinear behavior. It is shown that inclusion of Tikhonov regularization can stabilize and speed up the parameter estimation process. A method of linearized model analysis of predictive uncertainty...... nonlinear functions. Recommendations concerning the use of pilot points and singular value decomposition in real-world groundwater model calibration are finally given. (c) 2008 Elsevier Ltd. All rights reserved....
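
    The super-parameter idea can be sketched as follows (synthetic numbers, not the authors' model): take the SVD of a weighted sensitivity matrix, keep the leading right singular vectors, and express the full pilot-point parameter field through a few super parameters; the dimensions, weights and truncation level below are illustrative assumptions.

      # Minimal sketch: SVD-based super parameters for a pilot-point parameterisation.
      import numpy as np

      rng = np.random.default_rng(2)
      n_obs, n_par = 50, 200
      J = rng.normal(size=(n_obs, n_par))        # stand-in for the sensitivity (Jacobian) matrix
      w = np.ones(n_obs)                         # observation weights

      U, s, Vt = np.linalg.svd(np.diag(w) @ J, full_matrices=False)
      k = 8                                      # number of significant singular values kept
      V_k = Vt[:k].T                             # (n_par, k) basis spanned by the super parameters

      super_params = rng.normal(size=k)          # would be estimated in the inversion; synthetic here
      params = np.zeros(n_par) + V_k @ super_params   # full pilot-point parameter vector
      print(params.shape, np.round(s[:k], 2))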

  11. Models for measuring and predicting shareholder value: A study of third party software service providers

    Indian Academy of Sciences (India)

    N Viswanadham; Poornima Luthra

    2005-04-01

    In this study, we use the strategic profit model (SPM) and the economic value added (EVA) to measure shareholder value. SPM measures the return on net worth (RONW), which is defined as the return on assets (ROA) multiplied by the financial leverage. EVA is defined as the firm's net operating profit after taxes (NOPAT) minus the capital charge. Both RONW and EVA provide an indication of how much shareholder value a firm creates for its shareholders, year on year. With the increasing focus on creation of shareholder value and core competencies, many companies are outsourcing their information technology (IT) related activities to third party software companies. Indian software companies have become leaders in providing these services. Companies from several other countries are also competing for the top slot. We use the SPM and EVA models to analyse the four listed players of the software industry using publicly available published data. We compare the financial data obtained from the models, and use peer average data to provide customized recommendations for each company to improve their shareholder value. Assuming that the companies follow these recommendations, we also predict future RONW and EVA for the companies for the financial year 2005. Finally, we make several recommendations to software providers for effectively competing in the global arena.
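
    A minimal worked example of the two measures defined above is given below; all figures are synthetic and the 10% cost of capital is an assumption, so the numbers carry no information about the companies studied in the paper.

      # Minimal sketch: RONW = ROA x financial leverage, EVA = NOPAT - capital charge.
      net_income = 120.0          # all figures in millions, purely illustrative
      total_assets = 1000.0
      net_worth = 400.0
      nopat = 150.0
      invested_capital = 900.0
      wacc = 0.10                 # assumed weighted average cost of capital

      roa = net_income / total_assets
      financial_leverage = total_assets / net_worth
      ronw = roa * financial_leverage            # strategic profit model: return on net worth

      capital_charge = wacc * invested_capital
      eva = nopat - capital_charge               # economic value added
      print(f"RONW = {ronw:.1%}, EVA = {eva:.1f}")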

  12. Bayesian estimation in IRT models with missing values in background variables

    Directory of Open Access Journals (Sweden)

    Christian Aßmann

    2015-12-01

    Full Text Available Large scale assessment studies typically aim at investigating the relationship between persons' competencies and explanatory variables. Individual competencies are often estimated by explicitly including explanatory background variables into corresponding Item Response Theory models. Since missing values in background variables inevitably occur, strategies to handle the uncertainty related to missing values in parameter estimation are required. We propose to adapt a Bayesian estimation strategy based on Markov Chain Monte Carlo techniques. Sampling from the posterior distribution of parameters is thereby enriched by sampling from the full conditional distribution of the missing values. We consider non-parametric as well as parametric approximations for the full conditional distributions of missing values, thus allowing for a flexible incorporation of metric as well as categorical background variables. We evaluate the validity of our approach with respect to statistical accuracy by a simulation study controlling the missing values generating mechanism. We show that the proposed Bayesian strategy allows for effective comparison of nested model specifications via gauging highest posterior density intervals of all involved model parameters. An illustration of the suggested approach uses data from the National Educational Panel Study on mathematical competencies of fifth grade students.

  13. Presenting the SCL model: adding value to business strategy through UCD principles

    OpenAIRE

    Moreno, M. A.; Lilley, D; Lofthouse, V.

    2014-01-01

    This paper presents the Sustainable Consumption Leveraging (SCL) Model and its toolkit, which was developed to help businesses examine their potential for enabling sustainable consumption whilst identifying areas of opportunity to improve their business model and value proposition. The paper begins by establishing the contribution of business towards sustainable consumption and sets out user-centred design (UCD) principles as a valuable approach to leverage sustainable consumption. The relati...

  14. Assessing the Performance of Value-at-Risk Models in Chinese Stock Market

    OpenAIRE

    Lin, Lin

    2008-01-01

    In this paper, parametric, nonparametric, and semi-parametric models are applied to a hypothetical portfolio - the Shanghai Stock Exchange Composite Index - to estimate Value-at-Risk in the Chinese market. In order to assess the performance of the different approaches, statistical features such as the kurtosis, skewness and autocorrelation of daily returns have been studied. In addition, this article analyzes the advantages and disadvantages of each model and implements back-tests to check the validation of t...

  15. An Evaluation of Value at Risk Models in Chinese Stock Market

    OpenAIRE

    Xiao, Ying

    2013-01-01

    The aim of this article is to examine the predictive performance of VaR model in Chinese stock market and try to find the rational choice of models for China. In order to achieve this goal, Historical simulation approach, Bootstrapped HS, Hull White method, parametric approach with volatility adjustment, Generalized extreme value theory and Peaks-over-threshold approach are applied to the Shanghai Stock Exchange Composite Index (SSECI) and Shenzhen Stock Exchange Composite Index (SZSECI) to e...

  16. New Values of Cross-Talk Parameters for Twisted Pair Model

    Directory of Open Access Journals (Sweden)

    Milos Kozak

    2010-01-01

    Full Text Available Near-end Crosstalk (NEXT) and Far-end Crosstalk (FEXT) of unshielded twisted pair (UTP) cable are the main factors limiting the information capacity in data transmission. Crosstalk depends mostly on the frequency. Frequency dependent transfer functions and crosstalk attenuation may be obtained by measurement, but for the analytical description of the transmission channel's parameters it is useful to define functions modelling the crosstalk. The paper describes the measuring facility, presents the measured waveforms and the values of the model parameters.

  17. Reference values and physiological characterization of a specific isolated pig kidney perfusion model

    OpenAIRE

    Meissler Michael; Fischer Axel; Fehrenberg Claudia; Grosse-Siestrup Christian; Unger Volker; Groneberg David A

    2007-01-01

    Abstract Background Models of isolated and perfused kidneys are used to study the effects of drugs, hazardous or toxic substances on renal functions. Since physiological and morphological parameters of small laboratory animal kidneys are difficult to compare to human renal parameters, porcine kidney perfusion models have been developed to simulate closer conditions to the human situation, but exact values of renal parameters for different collection and perfusion conditions have not been repo...

  18. Indeterminate values of target variable in development of credit scoring models

    Directory of Open Access Journals (Sweden)

    Martin Řezáč

    2013-01-01

    Full Text Available At the beginning of every modelling procedure, the first question to ask is what we are trying to predict with the model. In credit scoring the most frequent case is modelling the probability of default; however, other situations, such as fraud, revolving of the credit or success of collections, could be predicted as well. Nevertheless, the first step is always to define the target variable. The target variable is generally an 'output' of the model. It contains the information on the available data that we want to predict in future data. In credit scoring it is commonly called the good/bad definition. In this paper we study the effect of using an indeterminate value of the target variable in the development of credit scoring models. We explain the basic principles of logistic regression modelling and the selection of the target variable. Next, the focus is given to the introduction of some of the widely used statistics for model assessment. The main part of the paper is devoted to the development and assessment of 27 credit scoring models on real credit data, which are built up and assessed according to various definitions of the target variable. We show that there is a valid reason for some target definitions to include the indeterminate value in the modelling process, as it provided us with convincing results.

  19. The effects of numerical-model complexity and observation type on estimated porosity values

    Science.gov (United States)

    Starn, J. Jeffrey; Bagtzoglou, Amvrossios C.; Green, Christopher T.

    2015-09-01

    The relative merits of model complexity and types of observations employed in model calibration are compared. An existing groundwater flow model coupled with an advective transport simulation of the Salt Lake Valley, Utah (USA), is adapted for advective transport, and effective porosity is adjusted until simulated tritium concentrations match concentrations in samples from wells. Two calibration approaches are used: a "complex" highly parameterized porosity field and a "simple" parsimonious model of porosity distribution. The use of an atmospheric tracer (tritium in this case) and apparent ages (from tritium/helium) in model calibration also are discussed. Of the models tested, the complex model (with tritium concentrations and tritium/helium apparent ages) performs best. Although tritium breakthrough curves simulated by complex and simple models are very generally similar, and there is value in the simple model, the complex model is supported by a more realistic porosity distribution and a greater number of estimable parameters. Culling the best quality data did not lead to better calibration, possibly because of processes and aquifer characteristics that are not simulated. Despite many factors that contribute to shortcomings of both the models and the data, useful information is obtained from all the models evaluated. Although any particular prediction of tritium breakthrough may have large errors, overall, the models mimic observed trends.

  20. Use of the macroeconomic models in the analysis of the balance value

    Directory of Open Access Journals (Sweden)

    Manole Tatiana

    2013-02-01

    Full Text Available This paper investigates the problem of using macroeconomic models to analyze the balance value. Actually, the analyses are performed under the IS-LM model. Since the balance value depends on the balance of the goods and services market and the money market, the authors have studied the possibilities of balance on these two markets in terms of the conditions in the Republic of Moldova. The ratio of monetary mass to GDP is investigated under the law of equality of the amount of money in circulation. The case of the Republic of Moldova indicates a great increase of the monetary mass relative to GDP growth, a fact that produces an inflationary effect. The interest rate is a key indicator in analyzing the balance value.

  1. USE OF THE MACROECONOMIC MODELS IN THE ANALYSIS OF THE BALANCE VALUE

    Directory of Open Access Journals (Sweden)

    Tatiana MANOLE

    2013-01-01

    Full Text Available This paper investigates the problem of using macroeconomic models to analyze the balance value. Actually, the analyses are performed under the IS-LM model. Since the balance value depends on the balance of the goods and services market and the money market, the authors have studied the possibilities of balance on these two markets in terms of the conditions in the Republic of Moldova. The ratio of monetary mass to GDP is investigated under the law of equality of the amount of money in circulation. The case of the Republic of Moldova indicates a great increase of the monetary mass relative to GDP growth, a fact that produces an inflationary effect. The interest rate is a key indicator in analyzing the balance value.

  2. Characterization, thermochemical conversion studies, and heating value modeling of municipal solid waste.

    Science.gov (United States)

    Shi, Honghong; Mahinpey, Nader; Aqsha, Aqsha; Silbermann, Rico

    2016-02-01

    A study was carried out to examine the characteristics of municipal solid waste (MSW) from the City of Red Deer, Alberta, Canada. Experiments were performed to determine the moisture content, proximate and ultimate compositions, and heating value of fourteen wastes in different categories. Their thermal weight loss behaviors under pyrolysis/torrefaction conditions were also investigated in a thermogravimetric analyzer (TGA). An empirical model was developed for the higher heating value (HHV) estimation of MSW. A total of 193 experimental data were collected from this study and from the literature, of which 161 were used for model derivation and 32 additional data were used for model validation. The model was developed using multiple regression analysis and a stepwise regression method: HHV (MJ/kg)=0.350C+1.01H-0.0826O, which is expressed in terms of weight percentages on a dry basis of carbon (C), hydrogen (H) and oxygen (O). The validation results suggest that this model was effective in producing accurate outputs that were close to the experimental values. In addition, it had the lowest error level in comparison with seven other models from the literature.
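
    The reported regression can be applied directly, as in the minimal sketch below; the sample composition is an illustrative assumption, not one of the measured wastes.

      # Minimal sketch: HHV (MJ/kg) = 0.350 C + 1.01 H - 0.0826 O, with C, H, O as dry-basis wt%.
      def hhv_mj_per_kg(c_pct: float, h_pct: float, o_pct: float) -> float:
          return 0.350 * c_pct + 1.01 * h_pct - 0.0826 * o_pct

      print(hhv_mj_per_kg(c_pct=47.0, h_pct=6.0, o_pct=38.0))   # about 19.4 MJ/kg for this example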

  3. Value-at-Risk forecasts by a spatiotemporal model in Chinese stock market

    Science.gov (United States)

    Gong, Pu; Weng, Yingliang

    2016-01-01

    This paper generalizes a recently proposed spatial autoregressive model and introduces a spatiotemporal model for forecasting stock returns. We support the view that stock returns are affected not only by the absolute values of factors such as firm size, book-to-market ratio and momentum but also by the relative values of factors like trading volume ranking and market capitalization ranking in each period. This article studies a new method for constructing stocks' reference groups; the method is called quartile method. Applying the method empirically to the Shanghai Stock Exchange 50 Index, we compare the daily volatility forecasting performance and the out-of-sample forecasting performance of Value-at-Risk (VaR) estimated by different models. The empirical results show that the spatiotemporal model performs surprisingly well in terms of capturing spatial dependences among individual stocks, and it produces more accurate VaR forecasts than the other three models introduced in the previous literature. Moreover, the findings indicate that both allowing for serial correlation in the disturbances and using time-varying spatial weight matrices can greatly improve the predictive accuracy of a spatial autoregressive model.

  4. The role of initial values in nonstationary fractional time series models

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    We consider the nonstationary fractional model $\Delta^{d}X_{t}=\varepsilon_{t}$ with $\varepsilon_{t}$ i.i.d. $(0,\sigma^{2})$ and $d>1/2$. We derive an analytical expression for the main term of the asymptotic bias of the maximum likelihood estimator of $d$ conditional on initial values, and we...

  5. Smart City Research : Contextual Conditions, Governance Models, and Public Value Assessment

    NARCIS (Netherlands)

    Meijer, Albert J.|info:eu-repo/dai/nl/172436729; Gil-Garcia, J. Ramon; Bolívar, Manuel Pedro Rodríguez

    2016-01-01

    There are three issues that are crucial to advancing our academic understanding of smart cities: (1) contextual conditions, (2) governance models, and (3) the assessment of public value. A brief review of recent literature and the analysis of the included papers provide support for the assumption

  6. Predictive Models of Alcohol Use Based on Attitudes and Individual Values

    Science.gov (United States)

    Del Castillo Rodríguez, José A. García; López-Sánchez, Carmen; Soler, M. Carmen Quiles; Del Castillo-López, Álvaro García; Pertusa, Mónica Gázquez; Campos, Juan Carlos Marzo; Inglés, Cándido J.

    2013-01-01

    Two predictive models are developed in this article: the first is designed to predict people's attitudes to alcoholic drinks, while the second sets out to predict the use of alcohol in relation to selected individual values. University students (N = 1,500) were recruited through stratified sampling based on sex and academic discipline. The…

  7. The Relationship between Civic Behavior and Civic Values: A Conceptual Model

    Science.gov (United States)

    Bryant, Alyssa N.; Gayles, Joy Gaston; Davis, Heather A.

    2012-01-01

    This study examined the relationships among college students' civic values and behaviors, college culture, and college involvement, accounting for their pre-college inclinations toward civic responsibility. Using a longitudinal, national dataset comprised of 3,680 college students, the study employed structural equation modeling to identify a…

  8. Value-Added Model (VAM) Research for Educational Policy: Framing the Issue

    Science.gov (United States)

    Amrein-Beardsley, Audrey; Collins, Clarin; Polasky, Sarah A.; Sloat, Edward F.

    2013-01-01

    In this manuscript, the guest editors of the EPAA Special Issue on "Value-Added Model (VAM) Research for Educational Policy" (1) introduce the background and policy context surrounding the increased use of VAMs for teacher evaluation and accountability purposes across the United States; (2) summarize the five research papers and one…

  9. A conceptual model of channel choice : measuring online and offline shopping value perceptions

    NARCIS (Netherlands)

    Broekhuizen, Thijs L.J.; Jager, Wander

    2004-01-01

    This study tries to understand how consumers evaluate channels for their purchasing. Specifically, it develops a conceptual model that addresses consumer value perceptions of using the Internet versus the traditional (physical) channel. Previous research showed that perceptions of price, product qua

  10. Assessing the "Rothstein Falsification Test": Does It Really Show Teacher Value-Added Models Are Biased?

    Science.gov (United States)

    Goldhaber, Dan; Chaplin, Duncan Dunbar

    2015-01-01

    In an influential paper, Jesse Rothstein (2010) shows that standard value-added models (VAMs) suggest implausible and large future teacher effects on past student achievement. This is the basis of a falsification test that "appears" to indicate bias in typical VAM estimates of teacher contributions to student learning on standardized…

  11. A conceptual model of channel choice : measuring online and offline shopping value perceptions

    NARCIS (Netherlands)

    Broekhuizen, Thijs L.J.; Jager, Wander

    2004-01-01

    This study tries to understand how consumers evaluate channels for their purchasing. Specifically, it develops a conceptual model that addresses consumer value perceptions of using the Internet versus the traditional (physical) channel. Previous research showed that perceptions of price, product qua

  12. Improved Weighted Shapley Value Model for the Fourth Party Logistics Supply Chain Coalition

    Directory of Open Access Journals (Sweden)

    Na Xu

    2013-01-01

    Full Text Available How to allocate a reasonable and practical profit to each individual member of a fourth party logistics supply chain coalition is still a question for further study. Considering the characteristics of the fourth party logistics supply chain coalition, this paper combines the Shapley Value with Distribution according to Contribution, two methods in application, and then adjusts the profit allocated to each member based on the actual coalition situation, in a model named the improved weighted Shapley Value model. In this paper, we first analyze the fourth party logistics supply chain coalition profit allocation models, including the classical Shapley value method. Then, we analyze the weight of each individual enterprise in the coalition by the analytic hierarchy process. For each enterprise, the weight is determined by the investment risks, information divulging risks, and failure risks. Finally, the numerical study shows that the improved weighted Shapley value model is a relatively rational and practical profit allocation method. Thus, the proposed combined model is a useful profit allocation mechanism for the fourth party logistics supply chain coalition in which the contributions and risks are fully considered.
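
    A minimal sketch of the classical Shapley value, followed by a simple illustrative weight adjustment, is given below; the three-member coalition, its characteristic function and the weights are made-up numbers, and the weighting step is only a stand-in for the paper's AHP-based improved weighted scheme.

      # Minimal sketch: classical Shapley values plus an illustrative weight adjustment.
      from itertools import permutations

      players = ["fourth_party", "carrier", "warehouse"]
      v = {frozenset(): 0, frozenset({"fourth_party"}): 40, frozenset({"carrier"}): 30,
           frozenset({"warehouse"}): 20, frozenset({"fourth_party", "carrier"}): 90,
           frozenset({"fourth_party", "warehouse"}): 75, frozenset({"carrier", "warehouse"}): 60,
           frozenset(players): 130}

      shapley = {p: 0.0 for p in players}
      orders = list(permutations(players))
      for order in orders:
          coalition = frozenset()
          for p in order:
              shapley[p] += (v[coalition | {p}] - v[coalition]) / len(orders)
              coalition = coalition | {p}

      # Illustrative risk-based weights (e.g., from an AHP step); rescale so shares sum to v(N).
      weights = {"fourth_party": 0.40, "carrier": 0.35, "warehouse": 0.25}
      weighted = {p: shapley[p] * weights[p] for p in players}
      scale = v[frozenset(players)] / sum(weighted.values())
      print({p: round(weighted[p] * scale, 1) for p in players})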

  13. Value Added Models and the Implementation of the National Standards of K-12 Physical Education

    Science.gov (United States)

    Seymour, Clancy M.; Garrison, Mark J.

    2017-01-01

    The implementation of value-added models of teacher evaluation continues to expand in public education, but the effects of using student test scores to evaluate K-12 physical educators necessitate further discussion. Using the five National Standards for K-12 Physical Education from the Society of Health and Physical Educators America (SHAPE),…

  14. The PKRC's Value as a Professional Development Model Validated

    Science.gov (United States)

    Larson, Dale

    2013-01-01

    After a brief review of the 4-H professional development standards, a new model for determining the value of continuing professional development is introduced and applied to the 4-H standards. The validity of the 4-H standards is affirmed. 4-H Extension professionals are encouraged to celebrate the strength of their standards and to engage the…

  15. A time series model: First-order integer-valued autoregressive (INAR(1))

    Science.gov (United States)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. A time series model, the first-order Integer-valued AutoRegressive (INAR(1)) model, is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on one period of the process before. The parameter of the model can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows the specification of AR(1). Forecasting in INAR(1) uses the median or Bayesian forecasting methodology. The median forecasting methodology obtains the least integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian forecasting methodology forecasts h steps ahead by generating the parameter of the model and the parameter of the innovation term using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), and then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is a value taken from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 until April 2016.
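
    A minimal sketch of the model and of CLS estimation is shown below (synthetic parameters, Poisson innovations assumed); the conditional least squares step is just an OLS regression of X_t on X_{t-1}, and the numbers have nothing to do with the pneumonia data.

      # Minimal sketch: simulate INAR(1) via binomial thinning and estimate (alpha, lambda) by CLS.
      import numpy as np

      rng = np.random.default_rng(3)
      alpha, lam, n = 0.6, 2.0, 500                         # illustrative parameters

      x = np.empty(n, dtype=int)
      x[0] = rng.poisson(lam / (1 - alpha))                 # start near the stationary mean
      for t in range(1, n):
          survivors = rng.binomial(x[t - 1], alpha)         # binomial thinning: alpha o X_{t-1}
          x[t] = survivors + rng.poisson(lam)               # plus the innovation term

      # Conditional least squares: X_t = alpha * X_{t-1} + lambda + error
      X = np.column_stack([x[:-1], np.ones(n - 1)])
      alpha_hat, lam_hat = np.linalg.lstsq(X, x[1:], rcond=None)[0]
      print(f"alpha_hat = {alpha_hat:.3f}, lambda_hat = {lam_hat:.3f}")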

  16. Predictive Models of Alcohol Use Based on Attitudes and Individual Values

    Science.gov (United States)

    Del Castillo Rodríguez, José A. García; López-Sánchez, Carmen; Soler, M. Carmen Quiles; Del Castillo-López, Álvaro García; Pertusa, Mónica Gázquez; Campos, Juan Carlos Marzo; Inglés, Cándido J.

    2013-01-01

    Two predictive models are developed in this article: the first is designed to predict people's attitudes to alcoholic drinks, while the second sets out to predict the use of alcohol in relation to selected individual values. University students (N = 1,500) were recruited through stratified sampling based on sex and academic discipline. The…

  17. Valuing structure, model uncertainty and model averaging in vector autoregressive processes

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2004-01-01

    textabstractEconomic policy decisions are often informed by empirical analysis based on accurate econometric modeling. However, a decision-maker is usually only interested in good estimates of outcomes, while an analyst must also be interested in estimating the model. Accurate inference on structura

  18. An analysis of the educational value of low-fidelity anatomy models as external representations.

    Science.gov (United States)

    Chan, Lap Ki; Cheng, Maurice M W

    2011-01-01

    Although high-fidelity digital models of human anatomy based on actual cross-sectional images of the human body have been developed, reports on the use of physical models in anatomy teaching continue to appear. This article aims to examine the common features shared by these physical models and analyze their educational value based on the literature on cognition, learning, and external representations. A literature search on these physical models in three popular anatomy journals published over a 10-year period from 2001 to 2010 found that all of them have low fidelity: they oftentimes do not closely resemble the regions of the human body they are representing. They include only a small number of the structures that exist in these regions of the human body and do not accurately represent the shape and surface details of these structures. However, these models strongly correspond to the human body in the spatial relationship of the represented structures, which is crucial to achieving their educational purpose of teaching three-dimensional comprehension and anatomical reasoning. The educational value of these models includes acting as memory aids, reducing cognitive overload, facilitating problem solving, and arousing students' enthusiasm and participation. Because these models often lack a close resemblance to the human body, their use in anatomy teaching should always be accompanied by adequate explanations to the students to establish the correspondence between the models and the parts of the human body they are representing. Copyright © 2011 American Association of Anatomists.

  19. Measuring and Managing Value Co-Creation Process: Overview of Existing Theoretical Models

    Directory of Open Access Journals (Sweden)

    Monika Skaržauskaitė

    2013-08-01

    Full Text Available Purpose — the article aims to provide a holistic view of the concept of value co-creation and of existing models for measuring and managing it, by conducting a theoretical analysis of scientific literature sources targeting the integration of various approaches. The most important and relevant results of the literature study are presented with a focus on the changed roles of organizations and consumers. This article aims at contributing theoretically to the research stream of measuring co-creation of value in order to gain knowledge for the improvement of organizational performance and enabling new and innovative means of value creation. Design/methodology/approach. The nature of this research is exploratory – a theoretical analysis and synthesis of scientific literature sources targeting the integration of various approaches was performed. This approach was chosen due to the absence of established theory on models of co-creation, their possible uses in organizations and a systematic overview of tools measuring/suggesting how to measure co-creation. Findings. While the principles of managing and measuring co-creation with regard to consumer motivation and involvement are widely researched, little attempt has been made to identify critical factors and create models dealing with organizational capabilities and the managerial implications of value co-creation. Systematic analysis of the literature revealed a gap not only in empirical research concerning the organization's role in the co-creation process, but also at the theoretical and conceptual levels. Research limitations/implications. The limitations of this work as a literature review lie in its nature – the complete reliance on previously published research papers and the availability of these studies. For a deeper understanding of co-creation management and for developing models that can be used in real-life organizations, broader theoretical, as well as empirical, research is necessary. Practical implications. Analysis of the

  20. Value-added strategy models to provide quality services in senior health business.

    Science.gov (United States)

    Yang, Ya-Ting; Lin, Neng-Pai; Su, Shyi; Chen, Ya-Mei; Chang, Yao-Mao; Handa, Yujiro; Khan, Hafsah Arshed Ali; Elsa Hsu, Yi-Hsin

    2017-06-20

    The rapid population aging is now a global issue. The increase in the elderly population will impact the health care industry and health enterprises; various senior needs will promote the growth of the senior health industry. Most senior health studies are focused on the demand side and scarcely on supply. Our study selected quality enterprises focused on aging health and analyzed the different strategies they use to provide excellent quality services. We selected 33 quality senior health enterprises in Taiwan and investigated their excellent quality service strategies by face-to-face semi-structured in-depth interviews with the CEO and managers of each enterprise in 2013. A total of 33 senior health enterprises in Taiwan. Overall, 65 CEOs and managers of 33 enterprises were interviewed individually. None. Core values and vision, organization structure, quality services provided, strategies for quality services. This study's results indicated four types of value-added strategy models adopted by senior enterprises to offer quality services: (i) residential care and co-residence model, (ii) home care and living in place model, (iii) community e-business experience model and (iv) virtual and physical portable device model. The common part in these four strategy models is that the services provided are elderly centered. These models offer virtual and physical integrations, and also offer total solutions for the elderly and their caregivers. Through investigation of successful strategy models for providing quality services to seniors, we identified opportunities to develop innovative service models and successful characteristics; policy implications were also summarized. The observations from this study will serve as a primary evidence base for enterprises developing their senior market and also for promoting the value co-creation possibility through dialogue between customers and those that deliver service.

  1. Comparison of stock valuation models with their intrinsic value in Tehran Stock Exchange

    Directory of Open Access Journals (Sweden)

    Ali Amiri

    2016-06-01

    Full Text Available Stock evaluation is one of the most important and most complex operational processes in the stock exchange. In financial markets, the pricing of tradable assets plays a basic role in resource allocation. After the initial stock valuation of listed companies in the Tehran Stock Exchange, some differences were observed between market prices and the values set by the Stock Exchange. The aim of this study was to determine the model applied in the formation of stock prices in the stock market and to find an appropriate market value model among value-based valuation models. To test the models of stock valuation, ordinary least squares regression was used. Also, E-Views software was used for further data analysis. The sample included all the companies listed in the Tehran Stock Exchange from 2008 till 2013. Based on stratified random sampling, each industry was selected as a category and, using the Cochran formula, a sample size of 40 was determined from each category. The data analysis indicated that the price-to-book (P/B) ratio had the highest adjustment factor and was selected as the best stock valuation model.

  2. Valuing snorkeling visits to the Florida Keys with stated and revealed preference models.

    Science.gov (United States)

    Park, Timothy; Bowker, J M; Leeworthy, Vernon R

    2002-07-01

    Coastal coral reefs, especially in the Florida Keys, are declining at a disturbing rate. Marine ecologists and reef scientists have emphasized the importance of establishing nonmarket values of coral reefs to assess the cost effectiveness of coral reef management and remediation programs. The purpose of this paper is to develop a travel cost-contingent valuation model of demand for trips to the Florida Keys focusing on willingness to pay (WTP) to preserve the current water quality and health of the coral reefs. The stated and revealed preference models allow the marginal valuation of recreationists to adjust depending on current and planned trip commitments in valuing nonmarginal policy changes in recreational opportunities. The integrated model incorporates key factors for establishing baseline amenity values for tourist dive sites, including perceptions of reef quality and dive conditions, the role of substitute sites, and the quality and availability of tourist facilities and recreation opportunities. The travel cost and WTP model differ in identifying critical variables and provide insight into the adjustment of trip decisions across alternative destination sites and the valuation of trips. In contrast to the travel cost model, a measure of the availability of substitute sites and total recreation activities does not have a significant impact on WTP valuations reported by snorkelers. Snorkelers engage in a relatively focused set of activities, suggesting that these recreationists may not shift expenditures to other sites or other recreation activities in the Florida Keys when confronted with increased access costs for the snorkeling experience.

  3. Study on the Cooperative E-commerce Model between Enterprises based on the Value Chain

    Institute of Scientific and Technical Information of China (English)

    XU Jun [1,2]; LIU Xiaoxing [1]

    2015-01-01

    Real e-commerce between enterprises is based on the internal departments of an enterprise and the cooperative interaction between the enterprise and its partners. In this paper, on the basis of value chain theory, 11 cooperative e-commerce models between enterprises are classified according to the activities of the cooperation between enterprises, and then every cooperative e-commerce model between enterprises is discussed. In practice, cooperative e-commerce between enterprises can be a combination of one or more of these e-commerce models.

  4. The performance of composite forecast models of value-at-risk in the energy market

    Energy Technology Data Exchange (ETDEWEB)

    Chiu, Yen-Chen [Department of Finance, National Taichung Institute of Technology (China); Chuang, I-Yuan; Lai, Jing-Yi [Department of Finance, National Chung Cheng University (China)

    2010-03-15

    This paper examines a comparative evaluation of the predictive performance of various Value-at-Risk (VaR) models in the energy market. This study extends the conventional research in the literature by proposing composite forecast models applied to Brent and WTI crude oil prices. The forecasting techniques considered here include the EWMA, stable density, Kernel density, Hull and White, and GARCH-GPD approaches, plus composite forecasts from linearly combining two or more of the competing models above. Findings show Hull and White to be the most powerful approach for capturing downside risk in the energy market. Reasonable results are also available from carefully combining VaR forecasts. (author)

  5. A note on Black-Scholes pricing model for theoretical values of stock options

    Science.gov (United States)

    Edeki, S. O.; Ugbebor, O. O.; Owoloko, E. A.

    2016-02-01

    In this paper, we consider some conditions that transform the classical Black-Scholes Model for stock options valuation from its partial differential equation (PDE) form to an equivalent ordinary differential equation (ODE) form. In addition, we propose a relatively new semi-analytical method for the solution of the transformed Black-Scholes model. The obtained solutions via this method can be used to find the theoretical values of the stock options in relation to their fair prices. In considering the reliability and efficiency of the models, we test some cases and the results are in good agreement with the exact solution.
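
    For reference, the classical closed-form Black-Scholes price of a European call is sketched below; the paper's PDE-to-ODE transformation and semi-analytical solution method are not reproduced here, and the input values are illustrative.

      # Minimal sketch: closed-form Black-Scholes value of a European call option.
      from math import exp, log, sqrt
      from statistics import NormalDist

      def bs_call(s, k, r, sigma, t):
          d1 = (log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
          d2 = d1 - sigma * sqrt(t)
          n = NormalDist().cdf
          return s * n(d1) - k * exp(-r * t) * n(d2)

      print(bs_call(s=100.0, k=95.0, r=0.05, sigma=0.2, t=0.5))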

  6. Mathematical Models For Calculating The Value Of Dynamic Viscosity Of A Liquid

    Directory of Open Access Journals (Sweden)

    Ślęzak M.

    2015-06-01

    Full Text Available The objective of this article is to review models for calculating the value of liquid dynamic viscosity. Issues of viscosity and rheological properties of liquid ferrous solutions are important from the perspective of modelling, along with the control of actual production processes related to the manufacturing of metals, including iron and steel. Conducted analysis within literature indicates that there are many theoretical considerations concerning the effect of viscosity of liquid metals solutions. The vast majority of models constitute a group of theoretical or semi-empirical equations, where thermodynamic parameters of solutions, or some parameters determined by experimental methods, are used for calculations of the dynamic viscosity coefficient.
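
    Many semi-empirical equations of this kind reduce to an Arrhenius/Andrade-type temperature dependence, eta(T) = A * exp(E_a / (R * T)). The sketch below evaluates that generic form with made-up coefficients purely for illustration; it is not one of the specific models surveyed in the article.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_viscosity(T, A, E_a):
    """Arrhenius/Andrade-type dynamic viscosity: eta = A * exp(E_a / (R * T))."""
    return A * math.exp(E_a / (R * T))

# Hypothetical coefficients for a liquid metal solution (illustrative only)
A, E_a = 3.7e-4, 41_000.0            # Pa*s and J/mol
for T in (1700.0, 1800.0, 1900.0):   # temperatures in kelvin
    print(f"T = {T:.0f} K -> eta = {arrhenius_viscosity(T, A, E_a) * 1e3:.2f} mPa*s")
```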

  7. Exploring the use of fuzzy logic models to describe the relation between SBP and RR values.

    Science.gov (United States)

    Gouveia, Sónia; Brás, Susana

    2012-01-01

    In this work, fuzzy logic based models are used to describe the relation between systolic blood pressure (SBP) and tachogram (RR) values as a function of the SBP level. The applicability of these methods is tested using real data in Lying (L) and Standing (S) conditions and generated surrogate data. The results indicate that fuzzy models exhibit a similar performance in both conditions, and their performance is significantly higher with real data than with surrogate data. These results point out the potential of a fuzzy logic approach to model properly the relation between SBP and RR values. As a future work, it remains to assess the clinical impact of these findings and inherent repercussion on the estimation of time domain baroreflex sensitivity indices.

  8. Simulation modeling to derive the value-of-information for risky animal disease-import decisions.

    Science.gov (United States)

    Disney, W Terry; Peters, Mark A

    2003-11-12

    Simulation modeling can be used in aiding decision-makers in deciding when to invest in additional research and when the risky animal disease-import decision should go forward. Simulation modeling to evaluate value-of-information (VOI) techniques provides a robust, objective and transparent framework for assisting decision-makers in making risky animal and animal product decisions. In this analysis, the hypothetical risk from poultry disease in chicken-meat imports was modeled. Economic criteria were used to quantify alternative confidence-increasing decisions regarding potential import testing and additional research requirements. In our hypothetical example, additional information about poultry disease in the exporting country (either by requiring additional export-flock surveillance that results in no sign of disease, or by conducting additional research into lack of disease transmittal through chicken-meat ingestion) captured >75% of the value-of-information attainable regarding the chicken-meat-import decision.

  9. The value of a statistical life: a meta-analysis with a mixed effects regression model.

    Science.gov (United States)

    Bellavance, François; Dionne, Georges; Lebeau, Martin

    2009-03-01

    The value of a statistical life (VSL) is a very controversial topic, but one which is essential to the optimization of governmental decisions. We see a great variability in the values obtained from different studies. The source of this variability needs to be understood, in order to offer public decision-makers better guidance in choosing a value and to set clearer guidelines for future research on the topic. This article presents a meta-analysis based on 39 observations obtained from 37 studies (from nine different countries) which all use a hedonic wage method to calculate the VSL. Our meta-analysis is innovative in that it is the first to use the mixed effects regression model [Raudenbush, S.W., 1994. Random effects models. In: Cooper, H., Hedges, L.V. (Eds.), The Handbook of Research Synthesis. Russel Sage Foundation, New York] to analyze studies on the value of a statistical life. We conclude that the variability found in the values studied stems in large part from differences in methodologies.
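
    The cited meta-analysis uses a mixed effects regression; as a much simpler stand-in, the sketch below pools hypothetical VSL estimates with a DerSimonian-Laird random-effects calculation, just to show how between-study heterogeneity enters the weighting. All numbers are invented.

```python
import numpy as np

def dersimonian_laird(estimates, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) for a set of study results."""
    y, v = np.asarray(estimates, float), np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q statistic
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    return np.sum(w_re * y) / np.sum(w_re), tau2

# Hypothetical VSL estimates (millions of dollars) and their sampling variances
vsl, var = [4.1, 6.8, 9.5, 3.2, 7.7], [0.8, 1.5, 2.0, 0.6, 1.2]
pooled, tau2 = dersimonian_laird(vsl, var)
print(f"Pooled VSL: {pooled:.2f} M$, between-study variance tau^2 = {tau2:.2f}")
```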

  10. An improved multi-value cellular automata model for heterogeneous bicycle traffic flow

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Sheng [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China); Qu, Xiaobo [Griffith School of Engineering, Griffith University, Gold Coast, 4222 Australia (Australia); Xu, Cheng [Department of Transportation Management Engineering, Zhejiang Police College, Hangzhou, 310053 China (China); College of Transportation, Jilin University, Changchun, 130022 China (China); Ma, Dongfang, E-mail: mdf2004@zju.edu.cn [Ocean College, Zhejiang University, Hangzhou, 310058 China (China); Wang, Dianhai [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China)

    2015-10-16

    This letter develops an improved multi-value cellular automata model for heterogeneous bicycle traffic flow taking the higher maximum speed of electric bicycles into consideration. The update rules of both regular and electric bicycles are improved, with maximum speeds of two and three cells per second respectively. Numerical simulation results for deterministic and stochastic cases are obtained. The fundamental diagrams and multiple states effects under different model parameters are analyzed and discussed. Field observations were made to calibrate the slowdown probabilities. The results imply that the improved extended Burgers cellular automata (IEBCA) model is more consistent with the field observations than previous models and greatly enhances the realism of the bicycle traffic model. - Highlights: • We proposed an improved multi-value CA model with higher maximum speed. • Update rules are introduced for heterogeneous bicycle traffic with maximum speed 2 and 3 cells/s. • Simulation results of the proposed model are consistent with field bicycle data. • Slowdown probabilities of both regular and electric bicycles are calibrated.
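
    The sketch below is not the paper's extended Burgers cellular automaton; it is a simplified Nagel-Schreckenberg-style single-lane CA with two vehicle classes (maximum speeds of 2 and 3 cells per second), included only to illustrate how heterogeneous maximum speeds enter the update rules. All parameters are assumptions.

```python
import numpy as np

def step(pos, vel, vmax, L, p_slow, rng):
    """One parallel update of a simplified single-lane CA with class-specific maximum speeds."""
    order = np.argsort(pos)
    pos, vel, vmax = pos[order], vel[order], vmax[order]
    gaps = (np.roll(pos, -1) - pos - 1) % L          # empty cells to the leader ahead (ring road)
    vel = np.minimum(vel + 1, vmax)                  # accelerate up to the class-specific maximum
    vel = np.minimum(vel, gaps)                      # never move into an occupied cell
    vel = np.where(rng.random(len(vel)) < p_slow, np.maximum(vel - 1, 0), vel)  # random slowdown
    return (pos + vel) % L, vel, vmax

rng = np.random.default_rng(1)
L, n = 200, 60                                       # circular road of 200 cells, 60 bicycles
pos = rng.choice(L, size=n, replace=False)
vmax = np.where(rng.random(n) < 0.5, 2, 3)           # regular (2 cells/s) vs electric (3 cells/s)
vel = np.zeros(n, dtype=int)
for _ in range(500):
    pos, vel, vmax = step(pos, vel, vmax, L, p_slow=0.2, rng=rng)
print(f"Mean speed after relaxation: {vel.mean():.2f} cells/s at density {n / L:.2f}")
```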

  11. Modeling Polio Data Using the First Order Non-Negative Integer-Valued Autoregressive, INAR(1), Model

    Science.gov (United States)

    Vazifedan, Turaj; Shitan, Mahendran

    Time series data may consist of counts, such as the number of road accidents, the number of patients in a certain hospital, or the number of customers waiting for service at a certain time. When the values of the observations are large, it is usual to use a Gaussian Autoregressive Moving Average (ARMA) process to model the time series. However, if the observed counts are small, it is not appropriate to use an ARMA process to model the observed phenomenon. In such cases we need to model the time series data by using a Non-Negative Integer-valued Autoregressive (INAR) process. The modeling of count data is based on the binomial thinning operator. In this paper we illustrate the modeling of count data using the monthly number of poliomyelitis cases in the United States between January 1970 and December 1983. We applied the AR(1) model, the Poisson regression model and the INAR(1) model, and the suitability of these models was assessed by using the Index of Agreement (I.A.). We found that the INAR(1) model is more appropriate in the sense that it had a better I.A., and it is natural since the data are counts.
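
    A minimal sketch of the binomial-thinning mechanism behind the INAR(1) model is given below, using simulated counts rather than the polio series; the thinning parameter alpha and the Poisson innovation mean are arbitrary illustrative values.

```python
import numpy as np

def simulate_inar1(alpha, lam, n, seed=0):
    """Simulate X_t = alpha o X_{t-1} + e_t, with binomial thinning 'o' and Poisson(lam) innovations."""
    rng = np.random.default_rng(seed)
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1.0 - alpha))          # start near the stationary mean lam / (1 - alpha)
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)    # binomial thinning of the previous count
        x[t] = survivors + rng.poisson(lam)          # add new Poisson arrivals
    return x

counts = simulate_inar1(alpha=0.4, lam=1.2, n=168)   # e.g. 14 years of monthly counts
print(counts[:12], "mean:", counts.mean().round(2))
```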

  12. Dynamic Average-Value Modeling of Doubly-Fed Induction Generator Wind Energy Conversion Systems

    Science.gov (United States)

    Shahab, Azin

    In a Doubly-fed Induction Generator (DFIG) wind energy conversion system, the rotor of a wound rotor induction generator is connected to the grid via a partial scale ac/ac power electronic converter which controls the rotor frequency and speed. In this research, detailed models of the DFIG wind energy conversion system with Sinusoidal Pulse-Width Modulation (SPWM) scheme and Optimal Pulse-Width Modulation (OPWM) scheme for the power electronic converter are developed in detail in PSCAD/EMTDC. As the computer simulation using the detailed models tends to be computationally extensive, time consuming and even sometimes not practical in terms of speed, two modified approaches (switching-function modeling and average-value modeling) are proposed to reduce the simulation execution time. The results demonstrate that the two proposed approaches reduce the simulation execution time while the simulation results remain close to those obtained using the detailed model simulation.

  13. Non-Linear GARCH (NGARCH) Model for Estimating the Value at Risk (VaR) of the IHSG

    Directory of Open Access Journals (Sweden)

    I KOMANG TRY BAYU MAHENDRA

    2015-06-01

    Full Text Available In investment, risk measurement is important. One measure of risk is Value at Risk (VaR). There are many methods that can be used to estimate risk within the VaR framework, one of them being the Non-Linear GARCH (NGARCH) model. In this research, VaR is determined using the NGARCH model. The NGARCH model allows for asymmetric behaviour in the volatility in response to "good news" (positive returns) and "bad news" (negative returns). Based on the VaR calculations, the higher the confidence level and the longer the investment period, the greater the risk. The VaR determined using the NGARCH model was lower than that obtained with the GARCH model.
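
    The sketch below filters a simulated return series with an Engle-Ng NGARCH(1,1) recursion and reads a one-day VaR off the forecast volatility under a normal-quantile assumption; the parameter values are placeholders, not estimates for the IHSG.

```python
import numpy as np

def ngarch_var(returns, omega, alpha, beta, theta, level=0.99):
    """One-step-ahead VaR from an NGARCH(1,1) filter (Engle-Ng asymmetric recursion)."""
    eps = returns - returns.mean()
    h = eps.var()                                            # initialise the conditional variance
    for e in eps:
        h = omega + alpha * (e - theta * np.sqrt(h)) ** 2 + beta * h
    z = {0.95: 1.645, 0.99: 2.326}[level]                    # normal quantile (illustrative choice)
    return z * np.sqrt(h)

rng = np.random.default_rng(2)
index_returns = rng.normal(0.0, 0.01, size=750)              # placeholder for IHSG daily returns
print(f"99% one-day VaR: {ngarch_var(index_returns, 1e-6, 0.08, 0.85, 0.6):.4f}")
```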

  14. Values of Land and Renewable Resources in a Three-Sector Economic Growth Model

    Directory of Open Access Journals (Sweden)

    Zhang Wei-Bin

    2015-04-01

    Full Text Available This paper studies dynamic interdependence of capital, land and resource values in a three sector growth model with endogenous wealth and renewable resources. The model is based on the neoclassical growth theory, Ricardian theory and growth theory with renewable resources. The household’s decision is modeled with an alternative approach proposed by Zhang two decades ago. The economic system consists of the households, industrial, agricultural, and resource sectors. The model describes a dynamic interdependence between wealth accumulation, resource change, and division of labor under perfect competition. We simulate the model to demonstrate the existence of a unique stable equilibrium point and plot the motion of the dynamic system. The study conducts comparative dynamic analysis with regard to changes in the propensity to consume resources, the propensity to consume housing, the propensity to consume agricultural goods, the propensity to consume industrial goods, the propensity to save, the population, and the output elasticity of capital of the resource sector.

  15. Comparison of spatial extreme value models for snow depth extremes in Austria

    Science.gov (United States)

    Schellander, Harald; Hell, Tobias

    2017-04-01

    In Alpine regions like Austria a spatial representation of extreme snow depth is of crucial importance for numerous purposes such as the design of construction projects. Extreme value theory provides the well-established foundation for modeling extremes. Two different approaches for the spatial modeling of snow depth extremes have been extensively investigated lately: Smooth Spatial Modeling (Blanchet and Lehning, 2010) and different classes of max-stable processes (Blanchet and Davison, 2011; Nicolet et al., 2015), both outperforming classical interpolation techniques. While max-stable models are generally considered an improvement over smooth modeling, the methods have not been compared in the context of extreme snow depth. In the present study a great variety of different GEV models is fitted to seasonal snow depth maxima measured at more than 200 Austrian weather stations. Return levels of smooth spatial models and several max-stable representations (Schlather, Brown-Resnick, Geometric Gaussian, Extremal-t) and covariance models (Powered Exponential, Brown, Whittle-Matern), also allowing for anisotropic extremal dependence, are compared by a modified Anderson-Darling score and a normalized RMSE. Preliminary results show that for snow depth extremes in Austria smooth spatial modeling and a version with extremal coefficients as covariates deliver slightly better scores than (an)isotropic max-stable models.
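
    As a point of reference, a single-station GEV fit and return-level calculation with SciPy is sketched below; it deliberately ignores the spatial smoothing and max-stable structure that the study actually compares, and the simulated maxima are placeholders.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
# Placeholder for seasonal snow-depth maxima (cm) at one station; c < 0 is a heavy (Frechet) tail in SciPy's sign convention
annual_maxima = genextreme.rvs(c=-0.1, loc=120.0, scale=35.0, size=60, random_state=rng)

c, loc, scale = genextreme.fit(annual_maxima)        # maximum-likelihood GEV fit
for T in (10, 50, 100):                              # return periods in years
    level = genextreme.isf(1.0 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.1f} cm")
```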

  16. Optimal models with maximizing probability of first achieving target value in the preceding stages

    Institute of Scientific and Technical Information of China (English)

    林元烈; 伍从斌; 康波大

    2003-01-01

    Decision makers often face the need of a performance guarantee with some sufficiently high probability. Such problems can be modelled using a discrete time Markov decision process (MDP) with a probability criterion for first achieving a target value. The objective is to find a policy that maximizes the probability of the total discounted reward exceeding a target value in the preceding stages. We show that our formulation cannot be described by former models with standard criteria. We provide the properties of the objective functions, optimal value functions and optimal policies. An algorithm for computing the optimal policies for the finite horizon case is given. In this stochastic stopping model, we prove that there exists an optimal deterministic and stationary policy and that the optimality equation has a unique solution. Using perturbation analysis, we approximate general models and prove the existence of an ε-optimal policy for finite state space. We give an example for the reliability of satellite systems using the above theory. Finally, we extend these results to more general cases.

  17. Modeling the Isentropic Head Value of Centrifugal Gas Compressor using Genetic Programming

    Directory of Open Access Journals (Sweden)

    Safiyullah Ferozkhan

    2016-01-01

    Full Text Available Gas compressor performance is vital in the oil and gas industry because of the criticality of equipment that requires continuous operation. Plant operators often face difficulties in predicting the appropriate time for maintenance and usually rely on time-based predictive maintenance intervals as recommended by the original equipment manufacturer (OEM). The objective of this work is to develop a computational model to find the isentropic head value using genetic programming. The isentropic head value is calculated from the OEM performance chart. Inlet mass flow rate and speed of the compressor are taken as the input values. The results obtained from the GP computational models show good agreement with experimental and target data, with an average prediction error of 1.318%. The genetic programming computational model will assist machinery engineers in quantifying the performance deterioration of gas compressors, and the results from this study will then be utilized to estimate future maintenance requirements based on historical data. In general, this genetic programming modelling provides a powerful solution for gas compressor operators to realize a predictive maintenance approach in their operations.

  18. Restaurant Service Consumers’ Value Perceptions: A Study with Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    George Bedinelli Rossi

    2012-12-01

    Full Text Available This research aims to create a model that could explain consumers' value perception of restaurants attended on Sundays in the city of São Paulo. The research was carried out in two phases: the first was an exploratory, focus-group-type study with two groups of eight individuals each, whose objective was to discover the main variables that impact the value perception of consumers. From this, a balanced Likert-type scale was generated, with seven levels of agreement. The scale was submitted to five experts for theoretical validation and was applied to a non-probabilistic judgment sample of 360 consumers. Then, in a second phase, the scale was validated by the Confirmatory Factor Analysis method, and five causal models were built and analysed by the method of Structural Equation Modeling. The final model with the best fit was composed of PRICE as an endogenous variable and ENVIRONMENT, SERVICE, FOOD, and HYGIENE as exogenous variables. These conclusions allow the prediction of the decision process in relation to restaurant selection in two phases: (1) when a group of restaurants is chosen, and (2) the moment when the PRICE variable takes over the role of defining the value offered by each restaurant, which will motivate the selection.

  19. On a Boolean-valued Model of the Strict Implication System (Continuous)

    Institute of Scientific and Technical Information of China (English)

    LI Na; LIU Hua-ke

    2004-01-01

    The reference [4] proved the consistency of S1 and S2 among Lewis' five strict implication systems in modal logic by using the method of the Boolean-valued model. But, with this method, the consistency of S3, S4 and S5 in Lewis' five strict implication systems is not decided. This paper makes use of the properties: (1) the equivalence of the modal systems S3 and P3, and S4 and P4; (2) the modal systems P3 and P4 both contain the modal axiom T (□p → p); (3) the modal axiom T corresponds to the reflexive property in V^B. Hence, the paper proves: (a) |A_{S3}| = 1; (b) |A_{S4}| = 1; (c) |A_{S5}| = 1 in the model (where B is a complete Boolean algebra and R is the reflexive property in V^B). Therefore, the paper finally proves that the Boolean-valued model V^B of the ZFC axiom system in set theory is also a Boolean-valued model of Lewis' strict implication systems S3, S4 and S5.

  20. Testing the value and life-style model (VALS) of psychographic market segmentation

    Directory of Open Access Journals (Sweden)

    G. G. Rousseau

    1990-06-01

    Full Text Available This article describes the development of a psychographic inventory suitable for testing the VALS model of market segmentation within the South African context. Hypotheses relating to value and life-style traits, suggested by prior research, are tested, utilising a sample of white and black respondents in the Johannesburg/Soweto metropolitan area. Results imply that the instrument developed has moderate reliability and can be administered bilingually. The VALS model tested by the instrument suggests that most respondents hold need-driven and outer-directed values. Implications for value and lifestyle segmentation within the South African context are discussed. Summary: This article describes the development of a psychographic inventory suitable for testing the VALS model of market segmentation within the South African context. Hypotheses regarding value and lifestyle traits, arising from previous research, are tested on a sample of white and black respondents in the Johannesburg/Soweto metropolitan area. The results show that the instrument exhibits moderate reliability and is applicable on a bilingual basis. The VALS model as tested by the instrument shows that most respondents exhibit need-driven and outer-directed values. Implications for value and lifestyle segmentation within the South African context are also discussed.

  1. Modeling shortest path selection of the ant Linepithema humile using psychophysical theory and realistic parameter values.

    Science.gov (United States)

    von Thienen, Wolfhard; Metzler, Dirk; Witte, Volker

    2015-05-07

    The emergence of self-organizing behavior in ants has been modeled in various theoretical approaches in the past decades. One model explains experimental observations in which Argentine ants (Linepithema humile) selected the shorter of two alternative paths from their nest to a food source (shortest path experiments). This model serves as an important example for the emergence of collective behavior and self-organization in biological systems. In addition, it inspired the development of computer algorithms for optimization problems called ant colony optimization (ACO). In the model, a choice function describing how ants react to different pheromone concentrations is fundamental. However, the parameters of the choice function were not deduced experimentally but freely adapted so that the model fitted the observations of the shortest path experiments. Thus, important knowledge was lacking about crucial model assumptions. A recent study on the Argentine ant provided this information by measuring the response of the ants to varying pheromone concentrations. In said study, the above mentioned choice function was fitted to the experimental data and its parameters were deduced. In addition, a psychometric function was fitted to the data and its parameters deduced. Based on these findings, it is possible to test the shortest path model by applying realistic parameter values. Here we present the results of such tests using Monte Carlo simulations of shortest path experiments with Argentine ants. We compare the choice function and the psychometric function, both with parameter values deduced from the above-mentioned experiments. Our results show that by applying the psychometric function, the shortest path experiments can be explained satisfactorily by the model. The study represents the first example of how psychophysical theory can be used to understand and model collective foraging behavior of ants based on trail pheromones. These findings may be important for other
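
    A toy Monte Carlo of the two-branch setup is sketched below, using a classic Deneubourg-type choice function with made-up parameters rather than the experimentally deduced choice or psychometric functions from the cited studies; faster reinforcement of the short branch is approximated by scaling the pheromone deposit with the inverse branch length.

```python
import numpy as np

def choice_prob(c_short, c_long, k=20.0, n=2.0):
    """Deneubourg-type choice function: probability of taking the short branch."""
    a, b = (k + c_short) ** n, (k + c_long) ** n
    return a / (a + b)

def run_colony(n_ants=1000, short_len=1, long_len=2, deposit=1.0, seed=4):
    """Sequentially send ants over a two-branch bridge and track pheromone build-up."""
    rng = np.random.default_rng(seed)
    c_short = c_long = 0.0                    # pheromone concentrations on the two branches
    took_short = 0
    for _ in range(n_ants):
        if rng.random() < choice_prob(c_short, c_long):
            took_short += 1
            c_short += deposit / short_len    # shorter branch is reinforced faster per unit time
        else:
            c_long += deposit / long_len
    return took_short / n_ants

print(f"Fraction of ants on the short branch: {run_colony():.2f}")
```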

  2. The IT Advantage Assessment Model: Applying an Expanded Value Chain Model to Academia

    Science.gov (United States)

    Turner, Walter L.; Stylianou, Antonis C.

    2004-01-01

    Academia faces an uncertain future as the 21st century unfolds. New demands, discerning students, increased competition from non-traditional competitors are just a few of the forces demanding a response. The use of information technology (IT) in academia has not kept pace with its use in industry. What has been lacking is a model for the strategic…

  4. Value assessment of a global hydrological forecasting system

    Science.gov (United States)

    Candogan Yossef, N.; Winsemius, H.; van Beek, L. P. H.; van Beek, E.; Bierkens, M. F. P.

    2012-04-01

    The inter-annual variability in streamflow presents risks and opportunities in the management of water resources systems. Reliable hydrological forecasts, effective communication and proper response allow several sectors to make more informed management decisions. In many developing regions of the world, there are no efficient hydrological forecasting systems. A global forecasting system which indicates increased probabilities of streamflow excesses or shortages over long lead-times can be of great value for these regions. The FEWS-World system was developed for this purpose. It is based on the Delft-FEWS (flood early warning system) developed by Deltares and incorporates the global hydrological model PCR-GLOBWB. This study investigates the skill and value of FEWS-World. Skill is defined as the ability of the system to forecast discharge extremes; and value as its usefulness for possible users and ultimately for affected populations. Skill is assessed in historical simulation mode as well as retroactive forecasting mode. For the assessment in historical simulation mode a meteorological forcing based on observations from the Climatic Research Unit of the University of East Anglia and the ERA-40 reanalysis of the European Centre for Medium-Range Weather Forecasts (ECMWF) was used. For the assessment in retroactive forecasting mode the model was forced with ensemble forecasts from the seasonal forecast archives of ECMWF. The eventual goal is to transfer FEWS-World to operational forecasting mode, where the system will use operational seasonal forecasts from ECMWF. The results will be disseminated on the internet, and hopefully provide information that is valuable for users in data- and model-poor regions of the world. The results of the preliminary assessment show that although forecasting skill decreases with increasing lead time, the value of forecasts does not necessarily decrease. The forecast requirements and response options of several water-related sectors was

  5. Holistic oil field value management: using system dynamics for 'intermediate level' and 'value-based' modelling in the oil industry

    Energy Technology Data Exchange (ETDEWEB)

    Corben, D.; Stevenson, R.; Wolstenholme, E.F. [Cognitus Ltd., Harrogate (United Kingdom)

    1999-07-01

    System dynamics has been seen primarily as a strategic tool, most effectively used at the highest level of strategy to identify robust policy interventions under a wide range of scenarios. However, an alternative, complementary and powerful role is emerging. This is at an 'intermediate level' in organisations to coordinate and integrate policies across the value chain. It is at this level where business value, as defined by the discounted value of future free cash flow, is both created and destroyed. This paper introduces the need for 'intermediate-level' and 'value-based' modelling and emphasises the natural role of system dynamics in supporting a methodology to fulfil the need. It describes the development of an approach and its application in the oil industry to coordinate the response of people and tools within operational, financial and commercial functions across the value chain to address a variety of problems and issues. (author)

  6. A model of determining a fair market value for teaching residents: who profits?

    Science.gov (United States)

    Cullen, Edward J; Lawless, Stephen T; Hertzog, James H; Penfil, Scott; Bradford, Kathleen K; Nadkarni, Vinay M; Corddry, David H; Costarino, Andrew T

    2003-07-01

    Centers for Medicare & Medicaid Services (CMS) Health Resources and Services Administration Children's Hospitals Graduate Medical Education (GME) Payment Program now supports freestanding children's teaching hospitals. To analyze the fair market value impact of GME payment on resident teaching efforts in our pediatric intensive care unit (PICU). Cost-accounting model, developed from a 1-year retrospective, descriptive, single-institution, longitudinal study, applied to physician teachers, residents, and CMS. Sixteen-bed PICU in a freestanding, university-affiliated children's teaching hospital. Pediatric critical care physicians, second-year residents. Cost of physician opportunity time; CMS investment return; the teaching physicians' investment return; residents' investment return; service balance between CMS and teaching service investment margins; economic balance points; fair market value. GME payments to our hospital increased 4.8-fold from $577,886 to $2,772,606 during a 1-year period. Critical care physicians' teaching opportunity cost rose from $250,097 to $262,215 to provide 1523 educational hours (6853 relative value units). Residents' net financial value for service provided to the PICU rose from $245,964 to $317,299. There is an uneven return on investment in resident education for CMS, critical care physicians, and residents. Economic balance points are achievable for the present educational efforts of the CMS, critical care physicians, and residents if the present direct medical education payment increases from 29.38% to 36%. The current CMS Health Resources and Services Administration Children's Hospitals GME Payment Program produces uneven investment returns for CMS, critical care physicians, and residents. We propose a cost-accounting model, based on perceived production capability measured in relative value units and available GME funds, that would allow a clinical service to balance and obtain a fair

  7. Using pharmacoeconomic modelling to determine value-based pricing for new pharmaceuticals in malaysia.

    Science.gov (United States)

    Dranitsaris, George; Truter, Ilse; Lubbe, Martie S; Sriramanakoppa, Nitin N; Mendonca, Vivian M; Mahagaonkar, Sangameshwar B

    2011-10-01

    Decision analysis (DA) is commonly used to perform economic evaluations of new pharmaceuticals. Using multiples of Malaysia's per capita 2010 gross domestic product (GDP) as the threshold for economic value as suggested by the World Health Organization (WHO), DA was used to estimate a price per dose for bevacizumab, a drug that provides a 1.4-month survival benefit in patients with metastatic colorectal cancer (mCRC). A decision model was developed to simulate progression-free and overall survival in mCRC patients receiving chemotherapy with and without bevacizumab. Costs for chemotherapy and management of side effects were obtained from public and private hospitals in Malaysia. Utility estimates, measured as quality-adjusted life years (QALYs), were determined by interviewing 24 oncology nurses using the time trade-off technique. The price per dose was then estimated using a target threshold of US$44,400 per QALY gained, which is 3 times the Malaysian per capita GDP. A cost-effective price for bevacizumab could not be determined because the survival benefit provided was insufficient. According to the WHO criteria, if the drug was able to improve survival from 1.4 to 3 or 6 months, the price per dose would be $567 and $1,258, respectively. The use of decision modelling for estimating drug pricing is a powerful technique to ensure value for money. Such information is of value to drug manufacturers and formulary committees because it facilitates negotiations for value-based pricing in a given jurisdiction.
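
    The back-of-the-envelope logic behind such value-based pricing can be sketched as follows; every figure below is a hypothetical placeholder rather than an input or result of the study.

```python
def value_based_price_per_dose(qaly_gain, wtp_per_qaly, extra_nondrug_cost, n_doses):
    """Maximum drug price per dose such that cost per QALY stays at the willingness-to-pay threshold."""
    max_total_drug_cost = wtp_per_qaly * qaly_gain - extra_nondrug_cost
    return max(0.0, max_total_drug_cost) / n_doses

# Hypothetical inputs: 0.15 QALYs gained, WTP threshold of 3x per-capita GDP,
# $2,000 of extra administration/toxicity cost, 12 doses per treatment course
price = value_based_price_per_dose(qaly_gain=0.15, wtp_per_qaly=44_400.0,
                                   extra_nondrug_cost=2_000.0, n_doses=12)
print(f"Maximum value-based price per dose: ${price:,.0f}")
```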

  8. Estimating health state utility values from discrete choice experiments--a QALY space model approach.

    Science.gov (United States)

    Gu, Yuanyuan; Norman, Richard; Viney, Rosalie

    2014-09-01

    Using discrete choice experiments (DCEs) to estimate health state utility values has become an important alternative to the conventional methods of Time Trade-Off and Standard Gamble. Studies using DCEs have typically used the conditional logit to estimate the underlying utility function. The conditional logit is known for several limitations. In this paper, we propose two types of models based on the mixed logit: one using preference space and the other using quality-adjusted life year (QALY) space, a concept adapted from the willingness-to-pay literature. These methods are applied to a dataset collected using the EQ-5D. The results showcase the advantages of using QALY space and demonstrate that the preferred QALY space model provides lower estimates of the utility values than the conditional logit, with the divergence increasing with worsening health states. Copyright © 2014 John Wiley & Sons, Ltd.

  9. Small-sample likelihood inference in extreme-value regression models

    CERN Document Server

    Ferrari, Silvia L P

    2012-01-01

    We deal with a general class of extreme-value regression models introduced by Barreto-Souza and Vasconcellos (2011). Our goal is to derive an adjusted likelihood ratio statistic that is approximately distributed as χ² with a high degree of accuracy. Although the adjusted statistic requires more computational effort than its unadjusted counterpart, it is shown that the adjustment term has a simple compact form that can be easily implemented in standard statistical software. Further, we compare the finite sample performance of the three classical tests (likelihood ratio, Wald, and score), the gradient test that has been recently proposed by Terrell (2002), and the adjusted likelihood ratio test obtained in this paper. Our simulations favor the latter. Applications of our results are presented. Key words: Extreme-value regression; Gradient test; Gumbel distribution; Likelihood ratio test; Nonlinear models; Score test; Small-sample adjustments; Wald test.
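
    A minimal maximum-likelihood fit of a Gumbel regression (location linear in one covariate, constant scale) is sketched below on simulated data; it only illustrates the model class and does not implement the adjusted likelihood ratio statistic derived in the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
x = rng.uniform(0, 1, size=200)
y = (1.0 + 2.0 * x) + 0.5 * rng.gumbel(size=200)     # Gumbel errors, true beta = (1, 2), sigma = 0.5

def neg_loglik(params):
    """Negative log-likelihood of a Gumbel (maximum) regression y ~ Gumbel(b0 + b1*x, sigma)."""
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)                        # keep the scale parameter positive
    z = (y - (b0 + b1 * x)) / sigma
    return np.sum(np.log(sigma) + z + np.exp(-z))

fit = minimize(neg_loglik, x0=np.array([0.0, 0.0, 0.0]), method="BFGS")
b0, b1, sigma = fit.x[0], fit.x[1], np.exp(fit.x[2])
print(f"beta0 = {b0:.2f}, beta1 = {b1:.2f}, sigma = {sigma:.2f}")
```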

  10. ECONOMETRIC MODEL OF FIRM’S VALUE IN LIQUID MARKET: CASE OF INDONESIA

    Directory of Open Access Journals (Sweden)

    Putu Agus Ardiana

    2012-11-01

    Full Text Available The research aims to investigate variables affecting Tobin's Q, which represents the value of public companies listed on the LQ45 Index of the Indonesia Stock Exchange, by developing a BLUE (Best Linear Unbiased Estimators) econometric model for cross-sectional data of 2007, 2008, and 2009 as well as panel data. The models vary across different data, but there are important findings to note. Public companies listed on the LQ45 Index experienced an overliquidity problem during the period of observation, leading to a decline in firm value. In addition, those public companies have low financial risk, so they have room to increase their debts, especially long-term debts.

  11. Study of Value Assessment Model of Forest Biodiversity Based on the Habitat Area in China

    Directory of Open Access Journals (Sweden)

    Ying Zhang

    2014-03-01

    Full Text Available Forest biodiversity is an important part of biodiversity. Studying forest biodiversity assessment is of essential significance for promoting the conservation of biodiversity and enhancing biodiversity management in China. This study collected forest biodiversity assessment data for China from 2001 to 2010, including habitat area and the output value of forestry, and used optimal control methods from cybernetics to establish a value assessment model of forest biodiversity based on habitat area, as well as to calculate the optimal price for forest biodiversity assessment. The results showed that the optimal price for forest biodiversity habitat assessment is 9,970 RMB Yuan/ha, and a dynamic model for forest biodiversity assessment is obtained. Finally, the study suggested that future work on forest biodiversity assessment in China, and in particular the valuation of forest biodiversity, should consider using shadow prices and should take social, economic and other factors into account

  12. Generation model of positional values as cell operation during the development of multicellular organisms.

    Science.gov (United States)

    Ogawa, Ken-ichiro; Miyake, Yoshihiro

    2011-03-01

    Many conventional models have used the positional information hypothesis to explain each elementary process of morphogenesis during the development of multicellular organisms. Their models assume that the steady concentration patterns of morphogens formed in an extracellular environment have an important property of positional information, so-called "robustness". However, recent experiments reported that a steady morphogen pattern, the concentration gradient of the Bicoid protein, during early Drosophila embryonic development is not robust for embryo-to-embryo variability. These reports encourage a reconsideration of a long-standing problem in systematic cell differentiation: what is the entity of positional information for cells? And, what is the origin of the robust boundary of gene expression? To address these problems at a cellular level, in this article we pay attention to the re-generative phenomena that show another important property of positional information, "size invariance". In view of regenerative phenomena, we propose a new mathematical model to describe the generation mechanism of a spatial pattern of positional values. In this model, the positional values are defined as the values into which differentiable cells transform a spatial pattern providing positional information. The model is mathematically described as an associative algebra composed of various terms, each of which is the multiplication of some fundamental operators under the assumption that the operators are derived from the remarkable properties of cell differentiation on an amputation surface in regenerative phenomena. We apply this model to the concentration pattern of the Bicoid protein during the anterior-posterior axis formation in Drosophila, and consider the conditions needed to establish the robust boundary of the expression of the hunchback gene.

  13. Netherlands Hydrological Modeling Instrument

    Science.gov (United States)

    Hoogewoud, J. C.; de Lange, W. J.; Veldhuizen, A.; Prinsen, G.

    2012-04-01

    Netherlands Hydrological Modeling Instrument: a decision support system for water basin management. J.C. Hoogewoud, W.J. de Lange, A. Veldhuizen, G. Prinsen. The Netherlands Hydrological Modeling Instrument (NHI) is the center point of a framework of models used to coherently model the hydrological system and the multitude of functions it supports. The Dutch hydrological institutes Deltares, Alterra, the Netherlands Environmental Assessment Agency, RWS Waterdienst, STOWA and Vewin are cooperating in enhancing the NHI for adequate decision support. The instrument is used by three different ministries involved in national water policy matters, for instance the WFD, drought management, manure policy and climate change issues. The basis of the modeling instrument is a state-of-the-art on-line coupling of the groundwater system (MODFLOW), the unsaturated zone (metaSWAP) and the surface water system (MOZART-DM). It brings together hydro(geo)logical processes from the column to the basin scale, ranging from 250x250 m plots to the river Rhine, and includes salt water flow. The NHI is validated with an eight-year run (1998-2006) covering dry and wet periods. For this run, different parts of the hydrology have been compared with measurements, for instance water demands in dry periods (e.g. for irrigation), discharges at outlets, groundwater levels and evaporation. A validation alone is not enough to get support from stakeholders; involvement of stakeholders in the modeling process is also needed. Therefore, to gain sufficient support and trust in the instrument at different (policy) levels, several actions have been taken: 1. a transparent evaluation of modeling results has been set up; 2. an extensive program is running to cooperate with regional water boards and suppliers of drinking water in improving the NHI; 3. (hydrological) data are shared via a newly set up Modeling Database for local and national models; 4. the NHI is enhanced with "local" information. The NHI is and has been used for many

  14. Weak-strong uniqueness for measure-valued solutions of some compressible fluid models

    Science.gov (United States)

    Gwiazda, Piotr; Świerczewska-Gwiazda, Agnieszka; Wiedemann, Emil

    2015-10-01

    We prove weak-strong uniqueness in the class of admissible measure-valued solutions for the isentropic Euler equations in any space dimension and for the Savage-Hutter model of granular flows in one and two space dimensions. For the latter system, we also show the complete dissipation of momentum in finite time, thus rigorously justifying an assumption that has been made in the engineering and numerical literature.

  15. Weak-strong uniqueness for measure-valued solutions of some compressible fluid models

    CERN Document Server

    Gwiazda, Piotr; Wiedemann, Emil

    2015-01-01

    We prove weak-strong uniqueness in the class of admissible measure-valued solutions for the isentropic Euler equations in any space dimension and for the Savage-Hutter model of granular flows in one and two space dimensions. For the latter system, we also show the complete dissipation of momentum in finite time, thus rigorously justifying an assumption that has been made in the engineering and numerical literature.

  16. Recommended Parameter Values for INEEL Subsurface Disposal Area Source Release Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Riley, Robert G.; Lopresti, Charles A.

    2004-06-23

    The purpose of this report is to summarize 1) associated information and values for key release model parameters (i.e., best estimate, minimum and maximum), obtained where possible from published experimental data, 2) a structure for the selection of sensitivity test cases that can be used to identify test cases, and 3) recommended test cases for selected contaminants of potential concern, to assess remedy effectiveness against a no-treatment base case.

  17. Impact of calibration fitting models on the clinical value of chromogranin A

    OpenAIRE

    Ferraro, Simona; Marano, Giuseppe; Ciardi, Laura; Vendramin, Chiara; Bongo, Angelo S.; Bellomo, Giorgio; Boracchi, Patrizia; Biganzoli, Elia M.

    2009-01-01

    Background: The clinical relevance of chromogranin A (CgA) concentrations depends on the analytical performance of the assay. The goal of the present study was to define the clinical involvements in CgA calibration models by evaluating the confidence intervals (CIs) for values from patients who were undergoing monitoring for disease. Methods: Thirty calibration curves for the CgA assay [immunoradiometric assay (IRMA), (CIS-BIO)] were built using linear regression (LR), and four-parameter log...

  18. Assessing the value of increased model resolution in forecasting fire danger

    Science.gov (United States)

    Jeanne Hoadley; Miriam Rorig; Ken Westrick; Larry Bradshaw; Sue Ferguson; Scott Goodrick; Paul Werth

    2003-01-01

    The fire season of 2000 was used as a case study to assess the value of increasing mesoscale model resolution for fire weather and fire danger forecasting. With a domain centered on Western Montana and Northern Idaho, MM5 simulations were run at 36, 12, and 4-km resolutions for a 30 day period at the height of the fire season. Verification analyses for meteorological...

  19. Case Study: Sensitivity Analysis of the Barataria Basin Barrier Shoreline Wetland Value Assessment Model

    Science.gov (United States)

    2014-07-01

    Barrier Shoreline Wetland Value Assessment Model, by S. Kyle McKay and J. Craig Fischenich. OVERVIEW: Sensitivity analysis is a technique for ... relevance of questions posed during an Independent External Peer Review (IEPR). BARATARIA BASIN BARRIER SHORELINE (BBBS) STUDY: On average ... scale restoration projects to reduce marsh loss and maintain these wetlands as healthy functioning ecosystems. The Barataria Basin Barrier Shoreline

  20. The Predictive Value of Subjective Labour Supply Data: A Dynamic Panel Data Model with Measurement Error

    OpenAIRE

    Euwals, Rob

    2002-01-01

    This paper tests the predictive value of subjective labour supply data for adjustments in working hours over time. The idea is that if subjective labour supply data help to predict next year's working hours, such data must contain at least some information on individual labour supply preferences. This informational content can be crucial to identify models of labour supply. Furthermore, it can be crucial to investigate the need for, or, alternatively, the support for laws and collective agree...

  1. Deriving Genomic Breeding Values for Residual Feed Intake from Covariance Functions of Random Regression Models

    OpenAIRE

    Strathe, Anders B; Mark, Thomas; Nielsen, Bjarne; Do, Duy Ngoc; KADARMIDEEN, Haja N.; Jensen, Just

    2014-01-01

    Random regression models were used to estimate covariance functions between cumulated feed intake (CFI) and body weight (BW) in 8424 Danish Duroc pigs. Random regressions on second order Legendre polynomials of age were used to describe genetic and permanent environmental curves in BW and CFI. Based on covariance functions, residual feed intake (RFI) was defined and derived as the conditional genetic variance in feed intake given mid-test breeding value for BW and rate of gain. The heritabili...

  2. Quantum correction to tiny vacuum expectation value in two Higgs doublet model for Dirac neutrino mass

    CERN Document Server

    Morozumi, Takuya; Tamai, Kotaro

    2011-01-01

    We study the Dirac neutrino mass model of Davidson and Logan. In the model, the smallness of the neutrino mass originates from the small vacuum expectation value of the second Higgs of the two Higgs doublets. We study the one-loop effective potential of the Higgs sector and examine how stable the small vacuum expectation value is under radiative corrections. By deriving formulae for the radiative correction, we numerically study how large the one-loop correction is and show how it depends on the quadratic mass terms and quartic couplings of the Higgs potential. The correction changes depending on the various scenarios for the extra Higgs mass spectrum.

  3. New Link in Bioinformatics Services Value Chain: Position, Organization and Business Model

    Directory of Open Access Journals (Sweden)

    Mladen Čudanov

    2012-11-01

    Full Text Available This paper presents a development in the bioinformatics services industry value chain, based on the cloud computing paradigm. As genome sequencing costs per megabase drop exponentially, the industry needs to adapt. The paper has two parts: a theoretical analysis and the practical example of the Seven Bridges Genomics company. We focus on explaining the organizational, business and financial aspects of the new business model in bioinformatics services, rather than the technical side of the problem. In light of that, we present a twofold business model fit for core bioinformatics research and Information and Communication Technology (ICT) support in the new environment, with a higher level of capital utilization and better resistance to business risks.

  4. The Analysis of B2B Business Model from the Viewpoint of Value Chain

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Electronic business affects the business processes within and between corporations differently and to varying degrees. In view of the new features of the internet, critical questions to be answered include: what are the emerging business models and, related to this, which strategic marketing approaches are being applied or are emerging. In this paper, the B2B business models are classified, from the value viewpoint, into four types: e-procurement, e-marketplace, e-trade exchange, and collaborative trading community...

  5. On the usefulness of singular value decomposition-ARMA models in Doppler ultrasound.

    Science.gov (United States)

    Forsberg, F

    1991-01-01

    The singular value decomposition (SVD) autoregressive moving average (ARMA) procedure is applied to computer-generated synthetic Doppler signals as well as in-vivo Doppler data recorded in the carotid artery. Two essential algorithmic parameters (the initially proposed model order and the number of overdetermined equations used) prove difficult to choose, and the resulting spectra are very dependent on these two parameters. For the simulated data, model orders of (3, 3) provide good results. However, when applying the SVD-ARMA algorithm to in-vivo Doppler signals, no single set of model orders was capable of producing consistent spectral estimates throughout the cardiac cycle. Altering the model orders also necessitates the selection of new algorithmic parameters. Hence, the SVD-ARMA approach cannot be considered suitable as a spectral estimation technique for real-time Doppler ultrasound systems.
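
    The role of the SVD step can be illustrated with a small sketch that inspects the singular values of a Hankel matrix built from the signal, where a sharp drop suggests an effective order; this is a generic illustration on a synthetic signal, not the specific SVD-ARMA algorithm evaluated in the paper.

```python
import numpy as np

def hankel_singular_values(signal, rows=20):
    """Singular values of a Hankel matrix built from the signal; a sharp drop hints at the effective order."""
    n = len(signal) - rows + 1
    H = np.column_stack([signal[i:i + rows] for i in range(n)])
    return np.linalg.svd(H, compute_uv=False)

# Synthetic "Doppler-like" signal: two damped sinusoids plus noise
rng = np.random.default_rng(6)
t = np.arange(400)
sig = (np.exp(-0.002 * t) * np.sin(0.30 * t)
       + 0.6 * np.exp(-0.001 * t) * np.sin(0.90 * t)
       + 0.05 * rng.standard_normal(t.size))
sv = hankel_singular_values(sig)
print("Leading singular values:", np.round(sv[:8], 2))   # roughly 4 dominant values expected (2 per sinusoid)
```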

  6. Towards a balanced value business model for personalized medicine: an outlook.

    Science.gov (United States)

    Koelsch, Christof; Przewrocka, Joanna; Keeling, Peter

    2013-01-01

    Novel targeted drugs, mainly in oncology, have commanded substantial price premiums in the recent past. Consequently, the attention of pharmaceutical companies has shifted away from the traditional low-price and high-volume blockbuster business model to drugs that command high, and sometimes extremely high, prices in limited markets defined by targeted patient populations. This model may have already passed its zenith, as the impact of more and more high-priced drugs coming to market substantially increases their combined burden on payors and public health finances. This article introduces a new 'balanced value' business model for personalized medicine, leveraging the emerging opportunities to reduce drug development cost and time for targeted therapies. This model allows pharmaceutical companies to charge prices for targeted therapy below the likely future thresholds for payors' willingness to pay, at the same time preserving attractive margins for the drug developers.

  7. [The application value of water flea Daphnia pulex for hypoxia model].

    Science.gov (United States)

    Li, Jiajia; Sheng, Bo; Yang, Lei; Zuo, Yunxia; Lin, Jin; Li, Guohua

    2011-08-01

    Hypoxia-inducible factor (HIF) is an important transcription factor under hypoxic conditions in many organisms, and plays a key role in the induction of hypoxia tolerance. It is necessary to establish a hypoxia model for HIF and to perform further hypoxia tolerance research. To investigate the value of Daphnia as a model organism in hypoxia preconditioning, we developed a preconditioning protocol with a model organism, Daphnia pulex. We found that two episodes of exposure to hypoxic solution resulted in enhanced hypoxia tolerance that is dependent on HIF. Comparative genomic analysis was also performed to highlight the homology of HIF-related genes among Daphnia, fruitfly and human. We found that Daphnia is suitable for the study of human hypoxic injury as a model organism.

  8. Network Traffic Based on GARCH-M Model and Extreme Value Theory

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    A GARCH-M (generalized autoregressive conditional heteroskedasticity in the mean) model is used to analyse the volatility clustering phenomenon in mobile communication network traffic. Normal, t and generalized Pareto distribution assumptions are adopted respectively to simulate the random component in the model. The quantiles of the network traffic series indicate that the common GARCH-M model can only partially deal with the "fat tail" problem. However, the "fat tail" characteristic of the random component directly affects the accuracy of the calculation; even the t distribution rests on an assumption about all of the data. On the other hand, extreme value theory, which concentrates only on the tail distribution, can provide more accurate results for high quantiles. The best result is obtained with the generalized Pareto distribution assumption for the random component in the GARCH-M model.
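
    A minimal peaks-over-threshold sketch along these lines is given below: fit a generalized Pareto distribution to the exceedances over a high threshold and invert it for high quantiles; the simulated heavy-tailed series and the threshold choice are placeholder assumptions, not the paper's traffic data.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)
traffic = rng.standard_t(df=4, size=5000) ** 2        # heavy-tailed placeholder for a traffic-load series

u = np.quantile(traffic, 0.95)                        # threshold: empirical 95% quantile
exceed = traffic[traffic > u] - u
xi, _, beta = genpareto.fit(exceed, floc=0.0)         # fit GPD to the exceedances (location fixed at 0)

def pot_quantile(p, u, xi, beta, n, n_u):
    """High quantile from the peaks-over-threshold representation of the tail (xi != 0)."""
    return u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

for p in (0.99, 0.999):
    q_evt = pot_quantile(p, u, xi, beta, len(traffic), len(exceed))
    print(f"{p:.3f} quantile: empirical {np.quantile(traffic, p):.2f}, GPD/POT {q_evt:.2f}")
```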

  9. How are organisational climate models and patient satisfaction related? A competing value framework approach.

    Science.gov (United States)

    Ancarani, Alessandro; Di Mauro, Carmela; Giammanco, Maria Daniela

    2009-12-01

    Patient satisfaction has become an important indicator of process quality inside hospitals. Even so, the improvement of patient satisfaction cannot simply follow from the implementation of new incentives schemes and organisational arrangements; it also depends on hospitals' cultures and climates. This paper studies the impact of alternative models of organisational climate in hospital wards on patient satisfaction. Data gathered from seven public hospitals in Italy are used to explore this relationship. The theoretical approach adopted is the Competing Value Framework which classifies organisations according to their inward or outward focus and according to the importance assigned to control vs. flexibility. Results show that both a model stressing openness, change and innovation and a model emphasising cohesion and workers' morale are positively related to patient satisfaction, while a model based on managerial control is negatively associated with patient satisfaction.

  10. Evaluating the Value of High Spatial Resolution in National Capacity Expansion Models using ReEDS

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat; Cole, Wesley

    2016-07-18

    This poster is based on the paper of the same name, presented at the IEEE Power & Energy Society General Meeting, July 18, 2016. Power sector capacity expansion models (CEMs) have a broad range of spatial resolutions. This paper uses the Regional Energy Deployment System (ReEDS) model, a long-term national scale electric sector CEM, to evaluate the value of high spatial resolution for CEMs. ReEDS models the United States with 134 load balancing areas (BAs) and captures the variability in existing generation parameters, future technology costs, performance, and resource availability using very high spatial resolution data, especially for wind and solar modeled at 356 resource regions. In this paper we perform planning studies at three different spatial resolutions - native resolution (134 BAs), state-level, and NERC region level - and evaluate how results change under different levels of spatial aggregation in terms of renewable capacity deployment and location, associated transmission builds, and system costs. The results are used to ascertain the value of high geographically resolved models in terms of their impact on relative competitiveness among renewable energy resources.

  11. An expectancy-value model of emotion regulation: implications for motivation, emotional experience, and decision making.

    Science.gov (United States)

    Tamir, Maya; Bigman, Yochanan E; Rhodes, Emily; Salerno, James; Schreier, Jenna

    2015-02-01

    According to expectancy-value models of self-regulation, people are motivated to act in ways they expect to be useful to them. For instance, people are motivated to run when they believe running is useful, even when they have nothing to run away from. Similarly, we propose an expectancy-value model of emotion regulation, according to which people are motivated to emote in ways they expect to be useful to them, regardless of immediate contextual demands. For instance, people may be motivated to get angry when they believe anger is useful, even when there is nothing to be angry about. In 5 studies, we demonstrate that leading people to expect an emotion to be useful increased their motivation to experience that emotion (Studies 1-5), led them to up-regulate the experience of that emotion (Studies 3-4), and led to emotion-consistent behavior (Study 4). Our hypotheses were supported when we manipulated the expected value of anxiety (Study 1) and anger (Studies 2-5), both consciously (Studies 1-4) and unconsciously (Study 5). We discuss the theoretical and pragmatic implications of the proposed model.

  12. Application of the Beck model to stock markets: Value-at-Risk and portfolio risk assessment

    Science.gov (United States)

    Kozaki, M.; Sato, A.-H.

    2008-02-01

    We apply the Beck model, developed for turbulent systems that exhibit scaling properties, to stock markets. Our study reveals that the Beck model elucidates the properties of stock market returns and is applicable to practical uses such as Value-at-Risk estimation and portfolio analysis. We perform an empirical analysis with daily/intraday data of the S&P500 index return and find that the volatility fluctuation of real markets is consistent with the assumptions of the Beck model: the volatility fluctuates on a much larger time scale than the return itself, and the inverse of the variance, or "inverse temperature" β, obeys a Γ-distribution. As predicted by the Beck model, the distribution of returns is well fitted by the q-Gaussian distribution of Tsallis statistics. The evaluation method of Value-at-Risk (VaR), one of the most significant indicators in risk management, is studied for the q-Gaussian distribution. Our proposed method enables VaR evaluation in consideration of tail risk, which is underestimated by the variance-covariance method. A framework of portfolio risk assessment under the existence of tail risk is considered. We propose a multi-asset model with a single volatility fluctuation shared by all assets, named the single β model, and empirically examine the agreement between the model and an imaginary portfolio with Dow Jones indices. It turns out that the single β model gives a good approximation to portfolios composed of assets with non-Gaussian and correlated returns.
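
    The superstatistical mechanism of the Beck model, returns that are conditionally Gaussian with a Γ-distributed inverse variance β, can be simulated directly, as sketched below with placeholder parameters; the empirical VaR of the resulting heavy-tailed sample is compared with a Gaussian benchmark of equal variance.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
# Beck-type superstatistics: draw an "inverse temperature" beta ~ Gamma, then a Gaussian return given beta
beta = rng.gamma(shape=3.0, scale=1.0 / 3.0, size=n)    # E[beta] = 1; fluctuation strength set by the shape
returns = rng.normal(0.0, 1.0 / np.sqrt(beta)) * 0.01   # conditional std = 1/sqrt(beta), scaled to ~1% daily moves

gauss = rng.normal(0.0, returns.std(), size=n)          # Gaussian benchmark with the same overall variance
for level in (0.99, 0.999):
    var_q = -np.quantile(returns, 1.0 - level)          # empirical VaR of the superstatistical sample
    var_g = -np.quantile(gauss, 1.0 - level)            # VaR under the Gaussian benchmark
    print(f"VaR at {level:.3f}: superstatistical {var_q:.4f} vs Gaussian {var_g:.4f}")
```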

  13. From 'solution shop' model to 'focused factory' in hospital surgery: increasing care value and predictability.

    Science.gov (United States)

    Cook, David; Thompson, Jeffrey E; Habermann, Elizabeth B; Visscher, Sue L; Dearani, Joseph A; Roger, Veronique L; Borah, Bijan J

    2014-05-01

    The full-service US hospital has been described organizationally as a "solution shop," in which medical problems are assumed to be unstructured and to require expert physicians to determine each course of care. If universally applied, this model contributes to unwarranted variation in care, which leads to lower quality and higher costs. We purposely disrupted the adult cardiac surgical practice that we led at Mayo Clinic, in Rochester, Minnesota, by creating a "focused factory" model (characterized by a uniform approach to delivering a limited set of high-quality products) within the practice's solution shop. Key elements of implementing the new model were mapping the care process, segmenting the patient population, using information technology to communicate clearly defined expectations, and empowering nonphysician providers at the bedside. Using a set of criteria, we determined that the focused-factory model was appropriate for 67 percent of cardiac surgical patients. We found that implementation of the model reduced resource use, length-of-stay, and cost. Variation was markedly reduced, and outcomes were improved. Assigning patients to different care models increases care value and the predictability of care process, outcomes, and costs while preserving (in a lesser clinical footprint) the strengths of the solution shop. We conclude that creating a focused-factory model within a solution shop, by applying industrial engineering principles and health information technology tools and changing the model of work, is very effective in both improving quality and reducing costs.

  14. Review of family relational stress and pediatric asthma: the value of biopsychosocial systemic models.

    Science.gov (United States)

    Wood, Beatrice L; Miller, Bruce D; Lehman, Heather K

    2015-06-01

    Asthma is the most common chronic disease in children. Despite dramatic advances in pharmacological treatments, asthma remains a leading public health problem, especially in socially disadvantaged minority populations. Some experts believe that this health gap is due to the failure to address the impact of stress on the disease. Asthma is a complex disease that is influenced by multilevel factors, but the nature of these factors and their interrelations are not well understood. This paper aims to integrate social, psychological, and biological literatures on relations between family/parental stress and pediatric asthma, and to illustrate the utility of multilevel systemic models for guiding treatment and stimulating future research. We used electronic database searches and conducted an integrated analysis of selected epidemiological, longitudinal, and empirical studies. Evidence is substantial for the effects of family/parental stress on asthma mediated by both disease management and psychobiological stress pathways. However, integrative models containing specific pathways are scarce. We present two multilevel models, with supporting data, as potential prototypes for other such models. We conclude that these multilevel systems models may be of substantial heuristic value in organizing investigations of, and clinical approaches to, the complex social-biological aspects of family stress in pediatric asthma. However, additional systemic models are needed, and the models presented herein could serve as prototypes for model development.

  15. Assessing the antecedents of customer loyalty on healthcare insurance products: Service quality; perceived value embedded model

    Directory of Open Access Journals (Sweden)

    Fadi Abdelmuniem Abdelfattah

    2015-11-01

    Full Text Available Purpose: This research aims to investigate the influence of service quality attributes on customers' loyalty to health insurance products. In addition, this research also tested the mediating role of perceived value between service quality and customers' loyalty to health insurance products. Design/methodology/approach: Based on the literature review, this research developed a conceptual model of customer loyalty embedded with service quality and perceived value. The study surveyed 342 healthcare insurance customers. Apart from assessing the reliability and validity of the constructs through confirmatory factor analysis, this research also used the structural equation modelling (SEM) approach to test the proposed hypotheses. Findings: The results from the inferential statistics revealed that the healthcare insurance customers are highly influenced by service quality, followed by perceived value, in reaching their loyalty towards a particular health insurance service provider. Research limitations/implications: The sample for this study is based on health insurance customers only, and it is suggested that future studies enlarge the scope to include other types of customers of different insurance products. Practical implications: In order to encourage customers to be more loyal towards their service providers, this research will add value for managers in understanding the items of service quality and considering the perceived value of the target customers in order to optimize their loyalty. As a whole, the outcome of this research will assist managers in better understanding the customers' loyalty antecedents from the perspective of healthcare insurance products. Originality/value: This paper has tried to provide a comprehensive understanding of customers' loyalty under the perspective of service quality and perceived value in the Malaysian health care insurance industry. Since there was a lack of such research in

  16. Introduction to the role of model of value-based medicine in the development of private hospitals

    Directory of Open Access Journals (Sweden)

    Shou-ping CHEN

    2014-01-01

    Full Text Available Value-based medicine is the trend of modern medicine. This paper elaborates the properties of value-based medicine, analyzes the role of the value-based medicine model in modern medicine, and puts forward ideas and related measures based on the value-based medicine model to promote the development of private hospitals, providing a reference for their development.

  17. Creating Innovators through setting up organizational Vision, Mission, and Core Values : a Strategic Model in Higher Education

    OpenAIRE

    Aithal, Sreeramana

    2016-01-01

    Vision, mission, objectives and core values play a major role in setting up sustainable organizations. Vision and mission statements describe the organization’s goals. Core values and core principles represent the organization’s culture. In this paper, we discuss a model of how a higher education institution can prosper to reach its goal of ‘creating innovators’ through its vision, mission, objectives and core values. A model for the core values required for a prospective ...

  18. The impact of missing data in a generalized integer-valued autoregression model for count data.

    Science.gov (United States)

    Alosh, Mohamed

    2009-11-01

    The impact of the missing data mechanism on estimates of model parameters for continuous data has been extensively investigated in the literature. In comparison, minimal research has been carried out for the impact of missing count data. The focus of this article is to investigate the impact of missing data on a transition model, termed the generalized autoregressive model of order 1 for longitudinal count data. The model has several features, including modeling dependence and accounting for overdispersion in the data, that make it appealing for the clinical trial setting. Furthermore, the model can be viewed as a natural extension of the commonly used log-linear model. Following introduction of the model and discussion of its estimation we investigate the impact of different missing data mechanisms on estimates of the model parameters through a simulation experiment. The findings of the simulation experiment show that, as in the case of normally distributed data, estimates under the missing completely at random (MCAR) and missing at random (MAR) mechanisms are close to their analogue for the full dataset and that the missing not at random (MNAR) mechanism has the greatest bias. Furthermore, estimates based on imputing the last observed value carried forward (LOCF) for missing data under the MAR assumption are similar to those of the MAR. This latter finding might be attributed to the Markov property underlying the model and to the high level of dependence among successive observations used in the simulation experiment. Finally, we consider an application of the generalized autoregressive model to a longitudinal epilepsy dataset analyzed in the literature.
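
    To make the simulation idea concrete, here is a small hedged sketch (not the paper's generalized autoregressive model): a first-order count series generated by binomial thinning, with values removed completely at random and then imputed by last observation carried forward. Everything below (series length, thinning parameter, missingness rate) is an illustrative assumption.

      # Illustrative sketch (not the paper's exact model): an INAR(1)-style count series
      # generated by binomial thinning, with MCAR missingness and LOCF imputation.
      import numpy as np

      rng = np.random.default_rng(1)
      n, alpha, lam = 200, 0.6, 2.0     # series length, thinning parameter, innovation mean

      counts = np.empty(n, dtype=int)
      counts[0] = rng.poisson(lam / (1 - alpha))
      for t in range(1, n):
          survivors = rng.binomial(counts[t - 1], alpha)   # binomial thinning of previous count
          counts[t] = survivors + rng.poisson(lam)          # plus new Poisson innovations

      # Missing completely at random: drop ~20% of observations
      observed = counts.astype(float)
      observed[rng.random(n) < 0.2] = np.nan

      # Last observation carried forward
      locf = observed.copy()
      for t in range(1, n):
          if np.isnan(locf[t]):
              locf[t] = locf[t - 1]

      print("full-data mean:", counts.mean())
      print("complete-case mean:", np.nanmean(observed))
      print("LOCF mean:", np.nanmean(locf))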

  19. Spatial modelling of periglacial phenomena in Deception Island (Maritime Antarctic): logistic regression and informative value method.

    Science.gov (United States)

    Melo, Raquel; Vieira, Gonçalo; Caselli, Alberto; Ramos, Miguel

    2010-05-01

    Field surveying during the austral summer of 2007/08 and the analysis of a QuickBird satellite image resulted in the production of a detailed geomorphological map of the Irizar and Crater Lake area in Deception Island (South Shetlands, Maritime Antarctic - 1:10 000) and allowed its analysis and the spatial modelling of the geomorphological phenomena. The present study focuses on the analysis of the spatial distribution and characteristics of hummocky terrains, lag surfaces and nivation hollows, complemented by GIS spatial modelling intended to identify relevant controlling geographical factors. Models of the susceptibility of occurrence of these phenomena were created using two statistical methods: logistic regression, as a multivariate method, and the informative value, as a bivariate method. Success and prediction rate curves were used for model validation. The Area Under the Curve (AUC) was used to quantify the level of performance and prediction of the models and to allow the comparison between the two methods. Regarding the logistic regression method, the AUC showed a success rate of 71% for the lag surfaces, 81% for the hummocky terrains and 78% for the nivation hollows. The prediction rates were 72%, 68% and 71%, respectively. Concerning the informative value method, the success rates were 69% for the lag surfaces, 84% for the hummocky terrains and 78% for the nivation hollows, with corresponding prediction rates of 71%, 66% and 69%. The results were of very good quality and demonstrate the potential of the models to predict the influence of the independent variables on the occurrence of the geomorphological phenomena, as well as the reliability of the data. Key-words: present-day geomorphological dynamics, detailed geomorphological mapping, GIS, spatial modelling, Deception Island, Antarctic.
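
    A compact sketch of the logistic-regression half of this workflow, using synthetic terrain attributes rather than the Deception Island data, is given below; the AUC on the training and held-out subsets stands in for the success and prediction rates discussed in the record.

      # Hedged sketch of the workflow (synthetic data, not the Deception Island dataset):
      # fit a logistic-regression susceptibility model and quantify performance with AUC.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(42)
      n = 1000
      slope = rng.uniform(0, 35, n)          # illustrative controlling factors
      elevation = rng.uniform(0, 550, n)
      curvature = rng.normal(0, 1, n)
      X = np.column_stack([slope, elevation, curvature])

      # Synthetic presence/absence of the periglacial phenomenon
      logit = -2.0 + 0.08 * slope - 0.004 * elevation + 0.5 * curvature
      y = rng.random(n) < 1 / (1 + np.exp(-logit))

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

      # Success rate (training) and prediction rate (held-out) analogues via AUC
      print("success-rate AUC:   ", roc_auc_score(y_tr, model.predict_proba(X_tr)[:, 1]).round(2))
      print("prediction-rate AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]).round(2))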

  20. Using Discrete Event Simulation to Model the Economic Value of Shorter Procedure Times on EP Lab Efficiency in the VALUE PVI Study.

    Science.gov (United States)

    Kowalski, Marcin; DeVille, J Brian; Svinarich, J Thomas; Dan, Dan; Wickliffe, Andrew; Kantipudi, Charan; Foell, Jason D; Filardo, Giovanni; Holbrook, Reece; Baker, James; Baydoun, Hassan; Jenkins, Mark; Chang-Sing, Peter

    2016-05-01

    The VALUE PVI study demonstrated that atrial fibrillation (AF) ablation procedures and electrophysiology laboratory (EP lab) occupancy times were reduced for the cryoballoon compared with focal radiofrequency (RF) ablation. However, the economic impact associated with the cryoballoon procedure for hospitals has not been determined. Assess the economic value associated with shorter AF ablation procedure times based on VALUE PVI data. A model was formulated from data from the VALUE PVI study. This model used a discrete event simulation to translate procedural efficiencies into metrics utilized by hospital administrators. A 1000-day period was simulated to determine the accrued impact of procedure time on an institution's EP lab when considering staff and hospital resources. The simulation demonstrated that procedures performed with the cryoballoon catheter resulted in several efficiencies, including: (1) a reduction of 36.2% in days with overtime (422 days RF vs 60 days cryoballoon); (2) 92.7% less cumulative overtime hours (370 hours RF vs 27 hours cryoballoon); and (3) an increase of 46.7% in days with time for an additional EP lab usage (186 days RF vs 653 days cryoballoon). Importantly, the added EP lab utilization could not support the time required for an additional AF ablation procedure. The discrete event simulation of the VALUE PVI data demonstrates the potential positive economic value of AF ablation procedures using the cryoballoon. These benefits include more days where overtime is avoided, fewer cumulative overtime hours, and more days with time left for additional usage of EP lab resources.
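
    The following is a simplified Monte Carlo sketch of the day-level simulation idea (it is not the published VALUE PVI model): lab days are simulated from assumed procedure-time distributions, and overtime days, cumulative overtime hours, and days with spare lab time are counted. Shift length, case load, and occupancy times are all illustrative assumptions.

      # Simplified Monte Carlo sketch (illustrative distributions, not the VALUE PVI model):
      # simulate 1000 lab days and count overtime days and days with spare capacity.
      import numpy as np

      rng = np.random.default_rng(7)
      DAYS, CASES_PER_DAY, SHIFT_MIN = 1000, 3, 480   # assumed 8-hour lab day, 3 AF ablations/day

      def simulate(mean_occupancy_min, sd_min):
          """Return (overtime days, total overtime hours, days with >= 60 min spare time)."""
          daily = rng.normal(mean_occupancy_min, sd_min, size=(DAYS, CASES_PER_DAY)).sum(axis=1)
          overtime = np.clip(daily - SHIFT_MIN, 0, None)
          spare = (SHIFT_MIN - daily) >= 60
          return (overtime > 0).sum(), overtime.sum() / 60, spare.sum()

      for label, mean_min in [("RF", 175), ("cryoballoon", 140)]:   # illustrative occupancy times
          ot_days, ot_hours, spare_days = simulate(mean_min, 30)
          print(f"{label:12s} overtime days={ot_days:4d}  overtime hours={ot_hours:6.0f}  spare days={spare_days:4d}")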

  1. Model-based clustering for assessing the prognostic value of imaging biomarkers and mixed type tests.

    Science.gov (United States)

    Wang, Zheyu; Sebestyen, Krisztian; Monsell, Sarah E

    2017-09-01

    A model-based clustering method is proposed to address two research aims in Alzheimer's disease (AD): to evaluate the accuracy of imaging biomarkers in AD prognosis, and to integrate biomarker information and standard clinical test results into the diagnoses. One challenge in such biomarker studies is that it is often desired or necessary to conduct the evaluation without relying on clinical diagnoses or some other standard references. This is because (1) biomarkers may provide prognostic information long before any standard reference can be acquired; (2) these references are often based on or provide unfair advantage to standard tests. Therefore, they can mask the prognostic value of a useful biomarker, especially when the biomarker is much more accurate than the standard tests. In addition, the biomarkers and existing tests may be of mixed type and vastly different distributions. A model-based clustering method based on finite mixture modeling framework is introduced. The model allows for the inclusion of mixed typed manifest variables with possible differential covariates to evaluate the prognostic value of biomarkers in addition to standard tests without relying on potentially inaccurate reference diagnoses. Maximum likelihood parameter estimation is carried out via the EM algorithm. Accuracy measures and the ROC curves of the biomarkers are derived subsequently. Finally, the method is illustrated with a real example in AD.
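
    As a rough stand-in for the finite-mixture approach (the paper handles mixed-type manifest variables; the sketch below uses a plain Gaussian mixture on synthetic continuous data), the snippet clusters subjects without reference diagnoses and then derives an AUC for the biomarker-driven posterior probabilities.

      # Minimal sketch of model-based clustering without reference diagnoses
      # (Gaussian mixture stand-in for the finite mixture model; data are synthetic).
      import numpy as np
      from sklearn.mixture import GaussianMixture
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(3)
      n_per_group = 250
      # Two latent classes (e.g., prognosis groups) measured on a biomarker and a clinical test
      healthy = np.column_stack([rng.normal(0.0, 1.0, n_per_group), rng.normal(25, 3, n_per_group)])
      impaired = np.column_stack([rng.normal(1.5, 1.0, n_per_group), rng.normal(21, 3, n_per_group)])
      X = np.vstack([healthy, impaired])
      true_class = np.r_[np.zeros(n_per_group), np.ones(n_per_group)]   # used only to check afterwards

      gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
      posterior = gmm.predict_proba(X)

      # Identify which mixture component corresponds to the impaired class (higher biomarker mean)
      impaired_comp = int(np.argmax(gmm.means_[:, 0]))
      print("AUC of posterior probability vs. latent class:",
            round(roc_auc_score(true_class, posterior[:, impaired_comp]), 3))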

  2. Regression analysis in modeling of air surface temperature and factors affecting its value in Peninsular Malaysia

    Science.gov (United States)

    Rajab, Jasim Mohammed; Jafri, Mohd. Zubir Mat; Lim, Hwee San; Abdullah, Khiruddin

    2012-10-01

    This study encompasses air surface temperature (AST) modeling in the lower atmosphere. A dataset of four atmospheric pollutant gases (CO, O3, CH4, and H2O), retrieved from the National Aeronautics and Space Administration Atmospheric Infrared Sounder (AIRS) from 2003 to 2008, was employed to develop a model to predict AST values over the Malaysian peninsula using the multiple regression method. For the entire period, the pollutants were highly correlated (R=0.821) with predicted AST. Comparisons among five stations in 2009 showed close agreement between the predicted AST and the observed AST from AIRS, especially in the southwest monsoon (SWM) season, within 1.3 K, and for in situ data, within 1 to 2 K. The validation of the predicted AST against AST from AIRS showed a high correlation coefficient (R=0.845 to 0.918), indicating the model's efficiency and accuracy. Statistical analysis in terms of β showed that H2O (0.565 to 1.746) tended to contribute significantly to high AST values during the northeast monsoon season. Generally, these results clearly indicate the advantage of using the satellite AIRS data and a correlation analysis study to investigate the impact of atmospheric greenhouse gases on AST over the Malaysian peninsula. A model was developed that is capable of retrieving the Malaysian peninsular AST in all weather conditions, with total uncertainties ranging between 1 and 2 K.
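
    A hedged sketch of the underlying multiple-regression setup is shown below with synthetic predictors standing in for the AIRS retrievals; the reported correlation and β coefficients are not reproduced, only the mechanics of the fit.

      # Hedged sketch of the regression setup (synthetic values, not AIRS retrievals):
      # predict air surface temperature from trace-gas predictors and inspect the fit.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n = 500
      co, o3, ch4, h2o = (rng.normal(size=n) for _ in range(4))   # standardized predictor anomalies
      ast = 300 + 0.4 * co + 0.2 * o3 + 0.3 * ch4 + 1.2 * h2o + rng.normal(0, 0.8, n)  # Kelvin

      X = sm.add_constant(np.column_stack([co, o3, ch4, h2o]))
      fit = sm.OLS(ast, X).fit()

      print("R =", round(np.sqrt(fit.rsquared), 3))
      print("coefficients (const, CO, O3, CH4, H2O):", np.round(fit.params, 3))
      print("RMSE (K):", round(np.sqrt(fit.mse_resid), 2))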

  3. Time-driven activity-based costing: A dynamic value assessment model in pediatric appendicitis.

    Science.gov (United States)

    Yu, Yangyang R; Abbas, Paulette I; Smith, Carolyn M; Carberry, Kathleen E; Ren, Hui; Patel, Binita; Nuchtern, Jed G; Lopez, Monica E

    2017-06-01

    Healthcare reform policies are emphasizing value-based healthcare delivery. We hypothesize that time-driven activity-based costing (TDABC) can be used to appraise healthcare interventions in pediatric appendicitis. Triage-based standing delegation orders, surgical advanced practice providers, and a same-day discharge protocol were implemented to target deficiencies identified in our initial TDABC model. Post-intervention process maps for a hospital episode were created using electronic time stamp data for simple appendicitis cases during February to March 2016. Total personnel and consumable costs were determined using TDABC methodology. The post-intervention TDABC model featured 6 phases of care, 33 processes, and 19 personnel types. Our interventions reduced duration and costs in the emergency department (-41min, -$23) and pre-operative floor (-57min, -$18). While post-anesthesia care unit duration and costs increased (+224min, +$41), the same-day discharge protocol eliminated post-operative floor costs (-$306). Our model incorporating all three interventions reduced total direct costs by 11% ($2753.39 to $2447.68) and duration of hospitalization by 51% (1984min to 966min). Time-driven activity-based costing can dynamically model changes in our healthcare delivery as a result of process improvement interventions. It is an effective tool to continuously assess the impact of these interventions on the value of appendicitis care. Level of evidence: II. Type of study: Economic Analysis. Copyright © 2017 Elsevier Inc. All rights reserved.
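
    The cost arithmetic behind TDABC can be illustrated with a toy roll-up: each phase of care contributes time multiplied by a capacity cost rate, plus consumables. The figures below are invented for illustration and are not the paper's cost inputs.

      # Minimal TDABC sketch with made-up times and capacity cost rates (per minute),
      # illustrating how process costs roll up to an episode cost.
      phases = [
          # (phase, minutes, personnel cost rate $/min, consumables $)
          ("emergency department",   120, 1.10,  40),
          ("pre-operative floor",     90, 0.80,  15),
          ("operating room",          60, 6.50, 450),
          ("post-anesthesia care",   300, 0.90,  60),
      ]

      def episode_cost(phase_list):
          return sum(minutes * rate + consumables for _, minutes, rate, consumables in phase_list)

      baseline = episode_cost(phases)
      # Example intervention: same-day discharge removes a hypothetical post-operative floor phase
      with_floor = phases + [("post-operative floor", 600, 0.45, 35)]
      print(f"episode cost with post-op floor:      ${episode_cost(with_floor):,.2f}")
      print(f"episode cost with same-day discharge: ${baseline:,.2f}")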

  4. General duality for Abelian-group-valued statistical-mechanics models

    Energy Technology Data Exchange (ETDEWEB)

    Caracciolo, Sergio [Dip. di Fisica and INFN, Universita degli Studi di Milano, via Celoria 16, I-20133 Milan (Italy); Sportiello, Andrea [Dip. di Fisica and INFN, Universita degli Studi di Milano, via Celoria 16, I-20133 Milan (Italy)

    2004-07-30

    We introduce a general class of statistical-mechanics models, taking values in an Abelian group, which includes examples of both spin and gauge models, both ordered and disordered. The model is described by a set of 'variables' and a set of 'interactions'. Each interaction is associated with a linear combination of variables; these are summarized in a matrix J. A Gibbs factor is associated with each variable (one-body term) and with each interaction. Then we introduce a duality transformation for systems in this class. The duality exchanges the Abelian group with its dual, the Gibbs factors with their Fourier transforms and the interactions with the variables. High (low) couplings in the interaction terms are mapped into low (high) couplings in the one-body terms. If the matrix J is interpreted as a vector representation of a matroid, duality exchanges the matroid with its dual. We discuss some physical examples. The main idea is to generalize the known models so as to eventually include randomness in the pattern of interaction. We introduce and study a random Gaussian model, a random Potts-like model and a random variant of discrete scalar QED. Although the classical procedure as given by Kramers and Wannier does not extend in a natural way to such a wider class of systems, our weaker procedure applies to these models, too. We briefly describe the consequences of duality for each example.

  5. Assessing a novel approach to proxy system modeling of speleothem δ18O values

    Science.gov (United States)

    Wong, C. I.; Harman, C. J.; Banner, J.

    2016-12-01

    Assessment of past climate patterns and constraints on the climate processes driving such patterns is enabled by both an expanding network of paleoclimate reconstructions and more and improved simulations of past climate. Paleoclimate-data-model comparisons, however, require the translation of variables simulated by climate models (e.g., precipitation amount and δ18O values) into variables preserved in climate archives (e.g., speleothem δ18O values) using proxy system models. This study presents a new approach for evaluating vadose hydrology and modeling modern cave dripwater δ18O variability, thereby contributing to a component of the proxy system model of the archival of precipitation δ18O by speleothems. We assess the ability of an existing framework, the rank StorAge Selection (rSAS) function, to account for variability in dripwater δ18O values in our expanded, decade-long dataset from two caves in a semi-arid climate (Texas) and an existing 6.5-year dataset from a tropical cave (Borneo). The rSAS approach quantifies the transit time distribution (the probability density function of the age of a population of water parcels exiting a system) using functions that represent the way storage (ranked by age) is selected and released to discharge. Importantly, this framework accounts for the effect of antecedent moisture conditions on shifts in flow pathways and connectivity that determine the age distribution of water exiting the system. Initial results suggest that the rSAS approach can better account for dripwater δ18O variability relative to models based on amount-weighted precipitation averages that are commonly applied in the interpretation of cave dripwater studies. The use of the rSAS approach is especially advantageous when the monitoring interval is brief relative to water residence time in the system and when drip sites are not dominantly supplied by a single reservoir. With respect to Texas, our expanded monitoring results more robustly demonstrate that cave

  6. Exploring spatial change and gravity center movement for ecosystem services value using a spatially explicit ecosystem services value index and gravity model.

    Science.gov (United States)

    He, Yingbin; Chen, Youqi; Tang, Huajun; Yao, Yanmin; Yang, Peng; Chen, Zhongxin

    2011-04-01

    Spatially explicit ecosystem services valuation and change is a newly developing area of research in the field of ecology. Using the Beijing region as a study area, the authors have developed a spatially explicit ecosystem services value index and implemented this to quantify and spatially differentiate ecosystem services value at 1-km grid resolution. A gravity model was developed to trace spatial change in the total ecosystem services value of the Beijing study area from a holistic point of view. Study results show that the total value of ecosystem services for the study area decreased by 19.75% during the period 1996-2006 (3,226.2739 US$×10⁶ in 1996, 2,589.0321 US$×10⁶ in 2006). However, 27.63% of the total area of the Beijing study area increased in ecosystem services value. Spatial differences in ecosystem services values for both 1996 and 2006 are very clear. The center of gravity of total ecosystem services value for the study area moved 32.28 km northwestward over the 10 years due to intensive human intervention taking place in southeast Beijing. The authors suggest that policy-makers should pay greater attention to ecological protection under conditions of rapid socio-economic development and increase the area of green belt in the southeastern part of Beijing.
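
    The gravity-center calculation itself is simple: the centre of gravity of ecosystem services value is the value-weighted mean coordinate over all grid cells. The sketch below applies it to a toy grid with an invented loss pattern, not the Beijing data.

      # Sketch of the gravity-model idea: the centre of gravity of ecosystem services value (ESV)
      # is the value-weighted mean coordinate of the grid cells (toy 1-km grid shown here).
      import numpy as np

      rng = np.random.default_rng(11)
      x, y = np.meshgrid(np.arange(100), np.arange(100))     # toy 100 x 100 km grid
      esv_1996 = rng.gamma(2.0, 1.0, size=x.shape)            # illustrative per-cell ESV surfaces
      esv_2006 = esv_1996 * np.where(x > 60, 0.5, 1.05)       # value loss concentrated in the "southeast"

      def gravity_center(value, xs, ys):
          w = value / value.sum()
          return (w * xs).sum(), (w * ys).sum()

      gx96, gy96 = gravity_center(esv_1996, x, y)
      gx06, gy06 = gravity_center(esv_2006, x, y)
      shift_km = np.hypot(gx06 - gx96, gy06 - gy96)
      print(f"centre of gravity moved {shift_km:.1f} km")
      print(f"total ESV change: {100 * (esv_2006.sum() / esv_1996.sum() - 1):.1f}%")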

  7. The reference model of supply chain operational controlling in value management

    Directory of Open Access Journals (Sweden)

    2010-03-01

    Full Text Available The systemic approach of the controlling function to supporting operations management results from its complex analysis of the supply chain business and operating results and from influencing the operations management factors - products, processes and resources - that determine the achieved result (revenues, costs, profitability and assets turnover as well as the return on invested capital). All product features which stand for customer value and its competitiveness are the basis for designing, planning and controlling the interconnected processes responsible for the manufacturing and delivery of products. The effectiveness of methods applied in developing products, processes and resources depends on the precise analysis and appraisal of the operating conditions that justify their application. Supporting operations management, focused on the product value and on improving the company's financial result, requires - apart from financial, technical and economic analyses - transferring the product value to activity control methods and to developing the resources in the product supply chain already at the planning stage. As a result of an analysis of requirements supporting the development of processes and resources in the supply chain, a reference model of operational controlling in product value management was developed. The multicriterion selection and appropriate application of material flow management methods in the supply chain is each time preceded by an operating and financial analysis as well as by an appraisal of the operating conditions that influence the choice of control methods.

  8. Designing Organizational Effectiveness Model of Selected Iraq’s Sporting Federations Based on Competing Values Framework

    Directory of Open Access Journals (Sweden)

    Hossein Eydi

    2013-01-01

    Full Text Available The aim of the present study was to design an effectiveness model of selected Iraq sport federations based on the competing values framework. The statistical population of the present study included 221 subjects ranging from chairmen, expert staff, national adolescent athletes, and national referees. 180 subjects (81.4 percent) answered the standard questionnaire of Eydi et al (2011) with a five-point Likert scale. Content and face validity of this tool was confirmed by 12 academic professors and its reliability was validated by Cronbach's alpha (r = 0.97). Results of a Structural Equation Model (SEM) based on the path analysis method showed that the factors of expert human resources (0.88), organizational interaction (0.88), productivity (0.87), employees' cohesion (0.84), planning (0.84), organizational stability (0.81), flexibility (0.78), and organizational resources (0.74) had the most effect on organizational effectiveness. Also, findings of factor analysis showed that the patterns of internal procedures and rational goals were the main patterns of the competing values framework and determinants of organizational effectiveness in Iraq's selected sport federations. Moreover, the federations of football, track and field, weightlifting, and basketball had the highest means of organizational effectiveness, respectively. Hence, Iraq sport federations mainly focused on organizational control and internal attention as indices of OE.

  9. Some effects of quiet geomagnetic field changes upon values used for main field modeling

    Science.gov (United States)

    Campbell, W.H.

    1987-01-01

    The effects of three methods of data selection upon the assumed main field levels for geomagnetic observatory records used in main field modeling were investigated for a year of very low solar-terrestrial activity. The first method concerned the differences between the year's average of quiet day field values and the average of all values during the year. For H these differences were 2-3 gammas, for D they were -0.04 to -0.12???, for Z the differences were negligible. The second method of selection concerned the effects of the daytime internal Sq variations upon the daily mean values of field. The midnight field levels when the Sq currents were a minimum deviated from the daily mean levels by as much as 4-7 gammas in H and Z but were negligible for D. The third method of selection was designed to avoid the annual and semi-annual quiet level changes of field caused by the seasonal changes in the magnetosphere. Contributions from these changes were found to be as much as 4-7 gammas in quiet years and expected to be greater than 10 gammas in active years. Suggestions for improved methods of data selection in main field modeling are given. © 1987.

  10. An Investigation of Factors Affecting Elementary School Students’ BMI Values Based on the System Dynamics Modeling

    Directory of Open Access Journals (Sweden)

    Tian-Syung Lan

    2014-01-01

    Full Text Available This study used system dynamics method to investigate the factors affecting elementary school students’ BMI values. The construction of the dynamic model is divided into the qualitative causal loop and the quantitative system dynamics modeling. According to the system dynamics modeling, this study consisted of research on the four dimensions: student’s personal life style, diet-relevant parenting behaviors, advocacy and implementation of school nutrition education, and students’ peer interaction. The results of this study showed that students with more adequate health concepts usually have better eating behaviors and consequently have less chance of becoming obese. In addition, this study also verified that educational attainment and socioeconomic status of parents have a positive correlation with students’ amounts of physical activity, and nutrition education has a prominent influence on changing students’ high-calorie diets.

  11. Dynamic Value at Risk: A Comparative Study Between Heteroscedastic Models and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    José Lamartine Távora Junior

    2006-12-01

    Full Text Available The objective of this paper was to analyze the risk management of a portfolio composed of Petrobras PN, Telemar PN and Vale do Rio Doce PNA stocks. It was verified whether the modeling of Value-at-Risk (VaR) through Monte Carlo simulation with GARCH-family volatility is supported by the hypothesis of an efficient market. The results show that the static evaluation is inferior to the dynamic one, evidencing that the dynamic analysis supports the efficient market hypothesis for the Brazilian stock market, in opposition to some empirical evidence. It was also verified that GARCH volatility models are sufficient to accommodate the variations of the Brazilian stock market, since the model is capable of accommodating the market's strong dynamics.
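
    A minimal sketch of the Monte Carlo VaR idea under GARCH(1,1) volatility is shown below; the GARCH parameters, current variance and return are assumed values, not estimates for the Petrobras/Telemar/Vale portfolio.

      # Illustrative sketch: one-day-ahead VaR by Monte Carlo under a GARCH(1,1) volatility
      # process (parameters and state below are assumed, not estimated from the paper's data).
      import numpy as np

      rng = np.random.default_rng(2024)
      omega, alpha, beta = 1e-6, 0.08, 0.90        # assumed GARCH(1,1) parameters
      sigma2_t, r_t = 2.5e-4, -0.01                # current conditional variance and return

      # Simulate tomorrow's return many times and read off the empirical quantile
      sigma2_next = omega + alpha * r_t**2 + beta * sigma2_t
      simulated_returns = rng.standard_normal(100_000) * np.sqrt(sigma2_next)

      var_95 = -np.quantile(simulated_returns, 0.05)
      var_99 = -np.quantile(simulated_returns, 0.01)
      print(f"1-day 95% VaR: {var_95:.4f}   1-day 99% VaR: {var_99:.4f}")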

  12. An extreme value model for maximum wave heights based on weather types

    Science.gov (United States)

    Rueda, Ana; Camus, Paula; Méndez, Fernando J.; Tomás, Antonio; Luceño, Alberto

    2016-02-01

    Extreme wave heights are climate-related events. Therefore, special attention should be given to the large-scale weather patterns responsible for wave generation in order to properly understand wave climate variability. We propose a classification of weather patterns to statistically downscale daily significant wave height maxima to a local area of interest. The time-dependent statistical model obtained here is based on the convolution of the stationary extreme value model associated to each weather type. The interdaily dependence is treated by a climate-related extremal index. The model's ability to reproduce different time scales (daily, seasonal, and interannual) is presented by means of its application to three locations in the North Atlantic: Mayo (Ireland), La Palma Island, and Coruña (Spain).
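
    As a point of reference for the stationary building block of such a model, the snippet below fits a generalized extreme value (GEV) distribution to synthetic annual maxima and computes a 50-year return level; the weather-type conditioning and extremal index of the paper are not reproduced.

      # Minimal sketch: fit a stationary GEV to annual maxima and compute a return level
      # (synthetic maxima; the paper's model additionally conditions on weather types).
      from scipy import stats

      annual_max_hs = stats.genextreme.rvs(c=-0.1, loc=8.0, scale=1.5, size=40, random_state=9)

      shape, loc, scale = stats.genextreme.fit(annual_max_hs)

      # 50-year return level: the value exceeded on average once every 50 years
      return_level_50 = stats.genextreme.ppf(1 - 1 / 50, shape, loc=loc, scale=scale)
      print(f"fitted shape={shape:.2f}, loc={loc:.2f}, scale={scale:.2f}")
      print(f"50-year significant wave height: {return_level_50:.1f} m")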

  13. The value of stream level observations to constrain low-parameter hydrologic models

    Science.gov (United States)

    Seibert, J.; Vis, M.; Pool, S.

    2014-12-01

    While conceptual runoff models with a low number of model parameters are useful tools to capture the hydrological catchment functioning, these models usually rely on model calibration, which makes their use in ungauged basins challenging. One approach might be to take at least a few measurements. Recent studies demonstrated that a few streamflow measurements, representing data that could be collected with limited effort in an ungauged basin, might be helpful to constrain runoff models for simulations in ungauged basins. While in these previous studies we assumed that a few streamflow measurements were taken, it would obviously also be reasonable to measure stream levels. Several approaches could be used in practice for such stream level observations: water level loggers have become less expensive and easier to install; stream levels will in the near future be increasingly available from satellite remote sensing, resulting in evenly spaced time series; community-based approaches (e.g., crowdhydrology.org), finally, can offer level observations at irregular time intervals. Here we present a study where a runoff model (the HBV model) was calibrated for 600+ gauged basins in the US assuming that only a subset of the data was available. We pretended that only stream level observations at different time intervals, representing the temporal resolution of the different observation approaches mentioned before, were available. The model, which was calibrated based on these data subsets, was then evaluated on the full observed streamflow record. Our results indicate that stream level data alone can already provide surprisingly good model simulation results in humid catchments, whereas in arid catchments some form of quantitative information (streamflow observation or regional average value) is needed to obtain good results. These results are encouraging for hydrological observations in data-scarce regions, as level observations are much easier to obtain than streamflow observations.

  14. Recommended Parameter Values for GENII Modeling of Radionuclides in Routine Air and Water Releases

    Energy Technology Data Exchange (ETDEWEB)

    Snyder, Sandra F.; Arimescu, Carmen; Napier, Bruce A.; Hay, Tristan R.

    2012-11-01

    The GENII v2 code is used to estimate dose to individuals or populations from the release of radioactive materials into air or water. Numerous parameter values are required for input into this code. User-defined parameters cover the spectrum from chemical data, meteorological data, agricultural data, and behavioral data. This document is a summary of parameter values that reflect conditions in the United States. Reasonable regional and age-dependent data are summarized. Data availability and quality vary. The set of parameters described addresses scenarios for chronic air emissions or chronic releases to public waterways. Considerations for the special tritium and carbon-14 models are briefly addressed. GENII v2.10.0 is the current software version that this document supports.

  15. The Relevance of Value Net Integrator and Shared Infrastructure Business Models in Managing Chronic Conditions

    Directory of Open Access Journals (Sweden)

    Susan Lambert

    2005-11-01

    Full Text Available There is widespread support for chronic condition management (CCM) programs that require a multi-disciplinary, care-team approach. Implementation of such programs represents a paradigm shift in primary care service delivery and has significant resource implications for the general practice. Integral to the widespread uptake of care-team based CCM is information collection, storage and dissemination amongst the care-team members. This paper looks to e-business models for assistance in understanding the requirements of general practitioners (GPs) in providing multi-disciplinary team care to patients with chronic conditions. The role required of GPs in chronic condition management is compared to that of a value net integrator. The essential characteristics of value net integrators are identified and compared to those of GPs providing multi-disciplinary team care to patients with chronic conditions. It is further suggested that a shared infrastructure is required.

  16. Global unitary fixing and matrix-valued correlations in matrix models

    CERN Document Server

    Adler, S L; Horwitz, Lawrence P.

    2003-01-01

    We consider the partition function for a matrix model with a global unitary invariant energy function. We show that the averages over the partition function of global unitary invariant trace polynomials of the matrix variables are the same when calculated with any choice of a global unitary fixing, while averages of such polynomials without a trace define matrix-valued correlation functions, that depend on the choice of unitary fixing. The unitary fixing is formulated within the standard Faddeev-Popov framework, in which the squared Vandermonde determinant emerges as a factor of the complete Faddeev-Popov determinant. We give the ghost representation for the FP determinant, and the corresponding BRST invariance of the unitary-fixed partition function. The formalism is relevant for deriving Ward identities obeyed by matrix-valued correlation functions.

  17. Deriving Genomic Breeding Values for Residual Feed Intake from Covariance Functions of Random Regression Models

    DEFF Research Database (Denmark)

    Strathe, Anders B; Mark, Thomas; Nielsen, Bjarne

    Random regression models were used to estimate covariance functions between cumulated feed intake (CFI) and body weight (BW) in 8424 Danish Duroc pigs. Random regressions on second order Legendre polynomials of age were used to describe genetic and permanent environmental curves in BW and CFI. Based on covariance functions, residual feed intake (RFI) was defined and derived as the conditional genetic variance in feed intake given mid-test breeding value for BW and rate of gain. The heritability of RFI over the entire period was 0.36, but more interestingly, the genetic variance of RFI was 6% of the genetic variance in feed intake, revealing that a minor component of feed intake was genetically independent of maintenance and growth. In conclusion, the approach derived herein led to a consistent definition of RFI, where genomic breeding values were easily obtained...

  18. Customer Focused Product Design Using Integrated Model of Target Costing, Quality Function Deployment and Value Engineering

    Directory of Open Access Journals (Sweden)

    Hossein Rezaei Dolatabadi

    2013-01-01

    Full Text Available Target costing, by integrating customer requirements, technical attributes and cost information into the product design phase and eliminating non-value-added functions, plays a vital role in different phases of the product life cycle. Quality Function Deployment (QFD) and Value Engineering (VE) are two techniques which can be used for applying target costing successfully. The purpose of this paper is to propose an integrated model of target costing, QFD and VE to explore the role of target costing in managing product costs while promoting quality specifications meeting customers' needs. Findings indicate that the integration of target costing, QFD and VE is an essential technique in managing the costs of the production process. Findings also imply that integration of the three techniques provides a competitive cost advantage to companies.

  19. The added value of remote sensing products in constraining hydrological models

    Science.gov (United States)

    Hrachowitz, M.; Nijzink, R.; Savenije, H. H. G.

    2016-12-01

    A typical calibration of a hydrological model relies on the availability of discharge data, which is, however, not always present and not the largest outgoing flux in many parts of the world. At the same time, more remote sensing products are becoming available that can aid in deriving model parameters and model structures, but also more traditional analytical approaches (e.g. the Budyko framework) can still be of high value. In this research, models are constrained in a step-wise approach with different combinations of remote sensing data and/or analytical frameworks. For example, the temporal resolution can be a driving principle leading to the formulation of a set of constraints. More specifically, in a first step the Budyko framework can be used as a means to filter out solutions that cannot reproduce the long-term dynamics of the system. In the following steps, remote sensing data of respectively GRACE (monthly resolution), NDII (16-day resolution) and LSA-SAF evaporation (daily) can lead to final parameterizations of a model. Nevertheless, the choice of these driving principles, the applied order of constraints and the strictness of the applied boundaries of the constraints will lead to varying solutions. Therefore, variations in these factors, and thus different combinations with different remote sensing products, should lead to an enhanced understanding of the strengths and weaknesses of the approaches with regard to finding optimal parameter sets for hydrological models.

  20. The application of pharmacoeconomic modelling to estimate a value-based price for new cancer drugs.

    Science.gov (United States)

    Dranitsaris, George; Truter, Ilse; Lubbe, Martie S; Cottrell, Wayne; Spirovski, Biljana; Edwards, Jonathan

    2012-04-01

    Value-based pricing has recently been discussed by international bodies as a means to estimate a drug price that is linked to the benefits it offers patients and society. The World Health Organization (WHO) has recommended using three times a country's per capita gross domestic product (GDP) as the threshold for economic value. Using the WHO criteria, pharmacoeconomic modelling was used to illustrate the application of value-based price towards bevacizumab, a relatively new drug that provides a 1.4-month survival benefit to patients with metastatic colorectal cancer (mCRC). A decision model was developed to simulate outcomes in mCRC patients receiving chemotherapy ± bevacizumab. Clinical data were obtained from randomized trials and costs from Canadian cancer centres. Utility estimates were determined by interviewing 24 oncology nurses and pharmacists. A price per dose of bevacizumab was then estimated using a target threshold of $CAD117,000 per quality adjusted life year gained, which is three times the Canadian per capita GDP. For a 1.4-month survival benefit, a price of $CAD830 per dose would be considered cost-effective from the Canadian public health care perspective. If the drug were able to improve patient quality of life or survival from 1.4 to 3 months, the drug price could increase to $CAD1560 and $CAD2180 and still be considered cost-effective. The use of the WHO criteria for estimating a value-based price is feasible, but a balance between what patients/governments can afford to pay and the commercial viability of the product in the reference country would be required. © 2010 Blackwell Publishing Ltd.
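
    The pricing logic can be sketched as back-of-envelope arithmetic: convert the survival gain into QALYs, multiply by the threshold to get the allowed incremental cost, and divide what remains after other costs by the number of doses. All inputs below are illustrative assumptions, not the paper's decision-model parameters.

      # Back-of-envelope sketch of the value-based pricing logic (all inputs below are
      # assumptions for illustration, not the paper's decision-model parameters).
      threshold_per_qaly = 117_000          # CAD, three times per-capita GDP
      survival_gain_months = 1.4
      utility = 0.75                        # assumed quality-of-life weight
      other_incremental_costs = 2_000       # assumed administration/toxicity costs, CAD
      doses_per_patient = 8                 # assumed number of doses over treatment

      qalys_gained = (survival_gain_months / 12) * utility
      allowed_incremental_cost = threshold_per_qaly * qalys_gained
      value_based_price_per_dose = (allowed_incremental_cost - other_incremental_costs) / doses_per_patient

      print(f"QALYs gained:             {qalys_gained:.3f}")
      print(f"allowed incremental cost: ${allowed_incremental_cost:,.0f}")
      print(f"value-based price/dose:   ${value_based_price_per_dose:,.0f}")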

  1. Fault diagnostics in power transformer model winding for different alpha values

    Directory of Open Access Journals (Sweden)

    G.H. Kusumadevi

    2015-09-01

    Full Text Available Transient overvoltages appearing at the line terminal of power transformer HV windings can cause failure of the winding insulation. The failure can be from winding to ground or between turns or sections of the winding. In most cases, failure from winding to ground can be detected by changes in the wave shape of the surge voltage appearing at the line terminal. However, detection of insulation failure between turns may be difficult due to the intricacies involved in the identification of faults. In this paper, simulation investigations carried out on a power transformer model winding for identifying faults between turns of the winding are reported. The power transformer HV winding has been represented by 8 sections, 16 sections and 24 sections. The neutral current waveform has been analyzed for the same model winding represented by different numbers of sections. The values of α ('α' is the square root of the ratio of total ground capacitance to total series capacitance of the winding) considered for the windings are 5, 10 and 20. A standard lightning impulse voltage (1.2/50 μs) wave shape has been considered for the analysis. Computer simulations have been carried out using the software PSPICE version 10.0. Neutral current and frequency response analysis methods have been used for identification of faults within sections of the transformer model winding.

  2. A CBR-Based and MAHP-Based Customer Value Prediction Model for New Product Development

    Directory of Open Access Journals (Sweden)

    Yu-Jie Zhao

    2014-01-01

    Full Text Available In the fierce market environment, the enterprise which wants to meet customer needs and boost its market profit and share must focus on new product development. To overcome the limitations of previous research, Chan et al. proposed a dynamic decision support system to predict the customer lifetime value (CLV) for new product development. However, to better meet customer needs, there are still some deficiencies in their model, so this study proposes a CBR-based and MAHP-based customer value prediction model for a new product (C&M-CVPM). CBR (case based reasoning) can reduce experts' workload and evaluation time, while MAHP (multiplicative analytic hierarchy process) can use the actual but averaged effectiveness of influencing factors in simulation, and at the same time C&M-CVPM uses dynamic customers' transition probabilities, which is closer to reality. This study not only introduces the realization of CBR and MAHP, but also elaborates C&M-CVPM's three main modules. The application of the proposed model is illustrated and confirmed to be sensible and convincing through a simulation experiment.

  3. Hybrid Models Based on Singular Values and Autoregressive Methods for Multistep Ahead Forecasting of Traffic Accidents

    Directory of Open Access Journals (Sweden)

    Lida Barba

    2016-01-01

    Full Text Available The occurrence of traffic accidents urges the intervention of researchers and society; the human losses and material damage could be abated with scientific studies focused on supporting prevention plans. In this paper prediction strategies based on singular values and autoregressive models are evaluated for multistep-ahead traffic accident forecasting. Three time series of injured people in traffic accidents collected in Santiago de Chile from 2000:1 to 2014:12 were used, which were previously classified by causes related to the behavior of drivers, passengers, or pedestrians and causes not related to behavior, such as road deficiencies, mechanical failures, and undetermined causes. A simplified form of Singular Spectrum Analysis (SSA), combined with the linear autoregressive (AR) method, and a conventional Artificial Neural Network (ANN) are proposed. Additionally, equivalent models that combine Hankel Singular Value Decomposition (HSVD), AR, and ANN are evaluated. The comparative analysis shows that the hybrid models SSA-AR and SSA-ANN reach the highest accuracy, with an average MAPE of 1.5% and 1.9%, respectively, from 1- to 14-step-ahead prediction. However, it was discovered that HSVD-AR shows a higher accuracy in the farthest horizons, from 12- to 14-step-ahead prediction, where it reaches an average MAPE of 2.2%.
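
    A compact sketch of the Hankel-SVD (SSA-like) decomposition followed by an AR forecast is shown below on a synthetic monthly series; window length, number of retained components, and AR order are illustrative choices, not those tuned for the Santiago series.

      # Compact sketch of the Hankel-SVD (SSA-like) decomposition plus AR forecasting idea,
      # applied to a synthetic monthly series (not the Santiago accident data).
      import numpy as np

      rng = np.random.default_rng(15)
      t = np.arange(180)
      series = 100 + 0.1 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, t.size)

      # Build a Hankel (trajectory) matrix and keep the leading singular components
      L = 24
      hankel = np.column_stack([series[i:i + L] for i in range(series.size - L + 1)])
      U, s, Vt = np.linalg.svd(hankel, full_matrices=False)
      low_rank = (U[:, :3] * s[:3]) @ Vt[:3, :]

      # Diagonal averaging recovers the extracted (smoothed) component
      extracted = np.array([np.mean(low_rank[::-1, :].diagonal(k))
                            for k in range(-L + 1, low_rank.shape[1])])

      # Fit a simple AR model on the extracted component via least squares, forecast one step
      p = 12
      X = np.column_stack([extracted[i:-(p - i)] for i in range(p)])
      y = extracted[p:]
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)
      forecast = extracted[-p:] @ coef
      print(f"one-step-ahead forecast: {forecast:.1f}, last observed value: {series[-1]:.1f}")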

  4. A CBR-Based and MAHP-Based Customer Value Prediction Model for New Product Development

    Science.gov (United States)

    Zhao, Yu-Jie; Luo, Xin-xing; Deng, Li

    2014-01-01

    In the fierce market environment, the enterprise which wants to meet customer needs and boost its market profit and share must focus on new product development. To overcome the limitations of previous research, Chan et al. proposed a dynamic decision support system to predict the customer lifetime value (CLV) for new product development. However, to better meet customer needs, there are still some deficiencies in their model, so this study proposes a CBR-based and MAHP-based customer value prediction model for a new product (C&M-CVPM). CBR (case based reasoning) can reduce experts' workload and evaluation time, while MAHP (multiplicative analytic hierarchy process) can use the actual but averaged effectiveness of influencing factors in simulation, and at the same time C&M-CVPM uses dynamic customers' transition probabilities, which is closer to reality. This study not only introduces the realization of CBR and MAHP, but also elaborates C&M-CVPM's three main modules. The application of the proposed model is illustrated and confirmed to be sensible and convincing through a simulation experiment. PMID:25162050

  5. Estimation of Value-at-Risk for Energy Commodities via CAViaR Model

    Science.gov (United States)

    Xiliang, Zhao; Xi, Zhu

    This paper uses the Conditional Autoregressive Value at Risk model (CAViaR) proposed by Engle and Manganelli (2004) to evaluate the value-at-risk for daily spot prices of Brent crude oil and West Texas Intermediate crude oil covering the period May 21st, 1987 to November 18th, 2008. The accuracy of the estimates of the CAViaR model, Normal-GARCH, and GED-GARCH is then compared. The results show that all the methods do a good job at the low confidence level (95%): GED-GARCH is the best for the spot WTI price, and Normal-GARCH and Adaptive-CAViaR are the best for the spot Brent price. However, at the high confidence level (99%), Normal-GARCH does a good job for spot WTI, while GED-GARCH and the four CAViaR specifications do well for the spot Brent price; Normal-GARCH does badly for the spot Brent price. The results seem to suggest that CAViaR performs as well as GED-GARCH, since CAViaR directly models the quantile autoregression, but it does not outperform GED-GARCH, although it does outperform Normal-GARCH.
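
    For the CAViaR idea, a hedged sketch of the Symmetric Absolute Value specification is given below: the 5% return quantile is modeled autoregressively and the parameters are found by minimizing the quantile (pinball) loss on synthetic returns. Initial values and the optimizer choice are assumptions.

      # Sketch of the Symmetric Absolute Value CAViaR specification, estimated by minimizing
      # the quantile (pinball) loss on synthetic returns (illustrative, not the crude-oil data).
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(8)
      returns = rng.standard_t(df=5, size=1500) * 0.02      # synthetic daily returns
      theta = 0.05                                           # 95% VaR corresponds to the 5% quantile

      def sav_quantiles(params, r):
          b0, b1, b2 = params
          q = np.empty_like(r)
          q[0] = np.quantile(r, theta)                       # initialize at the empirical quantile
          for t in range(1, r.size):
              q[t] = b0 + b1 * q[t - 1] + b2 * abs(r[t - 1])
          return q

      def pinball_loss(params, r):
          q = sav_quantiles(params, r)
          u = r - q
          return np.mean(u * (theta - (u < 0)))

      res = minimize(pinball_loss, x0=[-0.01, 0.8, -0.2], args=(returns,), method="Nelder-Mead")
      q_hat = sav_quantiles(res.x, returns)
      print("estimated (beta0, beta1, beta2):", np.round(res.x, 3))
      print("empirical hit rate:", round(float(np.mean(returns < q_hat)), 3), "target:", theta)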

  6. A CBR-based and MAHP-based customer value prediction model for new product development.

    Science.gov (United States)

    Zhao, Yu-Jie; Luo, Xin-xing; Deng, Li

    2014-01-01

    In the fierce market environment, the enterprise which wants to meet customer needs and boost its market profit and share must focus on new product development. To overcome the limitations of previous research, Chan et al. proposed a dynamic decision support system to predict the customer lifetime value (CLV) for new product development. However, to better meet customer needs, there are still some deficiencies in their model, so this study proposes a CBR-based and MAHP-based customer value prediction model for a new product (C&M-CVPM). CBR (case based reasoning) can reduce experts' workload and evaluation time, while MAHP (multiplicative analytic hierarchy process) can use the actual but averaged effectiveness of influencing factors in simulation, and at the same time C&M-CVPM uses dynamic customers' transition probabilities, which is closer to reality. This study not only introduces the realization of CBR and MAHP, but also elaborates C&M-CVPM's three main modules. The application of the proposed model is illustrated and confirmed to be sensible and convincing through a simulation experiment.

  7. Value of river discharge data for global-scale hydrological modeling

    Directory of Open Access Journals (Sweden)

    M. Hunger

    2007-11-01

    Full Text Available This paper investigates the value of observed river discharge data for global-scale hydrological modeling of a number of flow characteristics that are required for assessing water resources, flood risk and habitat alteration of aqueous ecosystems. An improved version of WGHM (WaterGAP Global Hydrology Model) was tuned in a way that simulated and observed long-term average river discharges at each station become equal, using either the 724-station dataset (V1) against which former model versions were tuned or a new dataset (V2) of 1235 stations and often longer time series. WGHM is tuned by adjusting one model parameter (γ) that affects runoff generation from land areas, and, where necessary, by applying one or two correction factors, which correct the total runoff in a sub-basin (areal correction factor) or the discharge at the station (station correction factor). The study results are as follows. (1) Comparing V2 to V1, the global land area covered by tuning basins increases by 5%, while the area where the model can be tuned by only adjusting γ increases by 8% (546 vs. 384 stations). However, the area where a station correction factor (and not only an areal correction factor) has to be applied more than doubles (389 vs. 93 basins), which is a strong drawback as use of a station correction factor makes discharge discontinuous at the gauge and inconsistent with runoff in the basin. (2) The value of additional discharge information for representing the spatial distribution of long-term average discharge (and thus renewable water resources) with WGHM is high, particularly for river basins outside of the V1 tuning area and for basins where the average sub-basin area has decreased by at least 50% in V2 as compared to V1. For these basins, simulated long-term average discharge would differ from the observed one by a factor of, on average, 1.8 and 1.3, respectively, if the additional discharge information were not used for tuning. The value tends to be higher in

  8. The value of soil respiration measurements for interpreting and modeling terrestrial carbon cycling

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Claire L.; Bond-Lamberty, Ben; Desai, Ankur R.; Lavoie, Martin; Risk, Dave; Tang, Jianwu; Todd-Brown, Katherine; Vargas, Rodrigo

    2016-11-16

    A recent acceleration of model-data synthesis activities has leveraged many terrestrial carbon (C) datasets, but utilization of soil respiration (RS) data has not kept pace with other types such as eddy covariance (EC) fluxes and soil C stocks. Here we argue that RS data, including non-continuous measurements from survey sampling campaigns, have unrealized value and should be utilized more extensively and creatively in data synthesis and modeling activities. We identify three major challenges in interpreting RS data, and discuss opportunities to address them. The first challenge is that when RS is compared to ecosystem respiration (RECO) measured from EC towers, it is not uncommon to find substantial mismatch, indicating one or both flux methodologies are unreliable. We argue the most likely cause of mismatch is unreliable EC data, and there is an unrecognized opportunity to utilize RS for EC quality control. The second challenge is that RS integrates belowground heterotrophic (RH) and autotrophic (RA) activity, whereas modelers generally prefer partitioned fluxes, and few models include an explicit RS output. Opportunities exist to use the total RS flux for data assimilation and model benchmarking methods rather than less-certain partitioned fluxes. Pushing for more experiments that not only partition RS but also monitor the age of RA and RH, as well as for the development of belowground RA components in models, would allow for more direct comparison between measured and modeled values. The third challenge is that soil respiration is generally measured at a very different resolution than that needed for comparison to EC or ecosystem- to global-scale models. Measuring soil fluxes with finer spatial resolution and more extensive coverage, and downscaling EC fluxes to match the scale of RS, will improve chamber and tower comparisons. Opportunities also exist to estimate RH at regional scales by implementing decomposition functional types, akin to plant functional

  9. 8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cole, Wesley J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sun, Yinong [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mai, Trieu T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Richards, James [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-01

    Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve demand over the evolution of many years or decades. Various CEM formulations are used to evaluate systems ranging in scale from states or utility service territories to national or multi-national systems. CEMs can be computationally complex, and to achieve acceptable solve times, key parameters are often estimated using simplified methods. In this paper, we focus on two of these key parameters associated with the integration of variable generation (VG) resources: capacity value and curtailment. We first discuss common modeling simplifications used in CEMs to estimate capacity value and curtailment, many of which are based on a representative subset of hours that can miss important tail events or which require assumptions about the load and resource distributions that may not match actual distributions. We then present an alternate approach that captures key elements of chronological operation over all hours of the year without the computationally intensive economic dispatch optimization typically employed within more detailed operational models. The updated methodology characterizes the (1) contribution of VG to system capacity during high load and net load hours, (2) the curtailment level of VG, and (3) the potential reductions in curtailments enabled through deployment of storage and more flexible operation of select thermal generators. We apply this alternate methodology to an existing CEM, the Regional Energy Deployment System (ReEDS). Results demonstrate that this alternate approach provides more accurate estimates of capacity value and curtailments by explicitly capturing system interactions across all hours of the year. This approach could be applied more broadly to CEMs at many different scales where hourly resource and load data is available, greatly improving the representation of challenges
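
    The hourly bookkeeping can be illustrated directly: with an 8760-hour load and variable-generation profile, a simple capacity-value proxy is the average VG output during the highest net-load hours, and curtailment is the VG energy in excess of load. The profiles and thresholds below are synthetic and purely illustrative.

      # Sketch of the hourly (8760) idea: capacity value as the average VG contribution during
      # the top net-load hours, and curtailment as energy above load (all profiles synthetic).
      import numpy as np

      rng = np.random.default_rng(60)
      hours = np.arange(8760)
      load = 30 + 8 * np.sin(2 * np.pi * hours / 24 - 2) + rng.normal(0, 1.5, 8760)   # GW, toy profile
      vg_nameplate = 40.0                                                              # GW, assumed
      vg = np.clip(vg_nameplate * np.sin(2 * np.pi * hours / 24 - 1.5), 0, None) * rng.uniform(0.3, 1.0, 8760)

      net_load = load - vg

      # Capacity value: mean VG output in the 100 highest net-load hours, as a share of nameplate
      top_hours = np.argsort(net_load)[-100:]
      capacity_value = vg[top_hours].mean() / vg_nameplate

      # Curtailment: VG energy that exceeds load (no storage or export assumed)
      curtailment = np.clip(vg - load, 0, None).sum() / vg.sum()

      print(f"capacity value: {capacity_value:.2%} of nameplate")
      print(f"curtailment:    {curtailment:.2%} of available VG energy")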

  10. Linking GIS-based models to value ecosystem services in an Alpine region.

    Science.gov (United States)

    Grêt-Regamey, Adrienne; Bebi, Peter; Bishop, Ian D; Schmid, Willy A

    2008-11-01

    Planning frequently fails to include the valuation of public goods and services. This can have long-term negative economic consequences for a region. This is especially the case in mountainous regions such as the Alps, which depend on tourism and where land-use changes can negatively impact key ecosystem services and hence the economy. In this study, we develop a semi-automatic procedure to value ecosystem goods and services. Several existing process-based models linked to economic valuation methods are integrated into a geographic information system (GIS) platform. The model requires the input of a digital elevation model, a land-cover map, and a spatially explicit temperature dataset. These datasets are available for most regions in Europe. We illustrate the approach by valuing four ecosystem services: avalanche protection, timber production, scenic beauty, and habitat, which are supplied by the "Landschaft Davos", an administrative district in the Swiss Alps. We compare the impacts of a human development scenario and a climate scenario on the value of these ecosystem services. Urban expansion and tourist infrastructure developments have a negative impact on scenic beauty and habitats. These impacts outweigh the benefits of the developments in the long-term. Forest expansion, predictable under a climate change scenario, favours natural avalanche protection and habitats. In general, such non-marketed benefits provided by the case-study region more than compensate for the costs of forest maintenance. Finally, we discuss the advantages and disadvantages of the approach. Despite its limitations, we show how this approach could well help decision-makers balance the impacts of different planning options on the economic accounting of a region, and guide them in selecting sustainable and economically feasible development strategies.

  11. Application of stochastic models to determine customers lifetime value for a Brazilian supermarkets network

    Directory of Open Access Journals (Sweden)

    Annibal Parracho Sant'Anna

    2008-12-01

    Full Text Available This paper studies strategies to assess customer lifetime value (CLV). Traditionally, heuristics based on recency, frequency and monetary value variables (RFM) are used to determine the best customers. Here, some ways of exploring these parameters directly to predict CLV are compared to an approach based on fitting a stochastic model. The model employed is a composition of a model for the number of transactions along the residual lifetime and a model for the value spent. New evidence is raised on the effect of aggregating transactions monthly. The data analyzed refer to two years of purchases by a group of customers from the same entrance cohort of a loyalty-program register of a supermarket network in Rio de Janeiro. Using the first year to calibrate and the second year to validate the models, good fits of both models to the individual data series and coherent CLV predictions are obtained.

  12. Muninn: A versioning flash key-value store using an object-based storage model

    OpenAIRE

    Kang, Y.; Pitchumani, R.; Marlette, T.; Miller, E. L.

    2014-01-01

    While non-volatile memory (NVRAM) devices have the potential to alleviate the trade-off between performance, scalability, and energy in storage and memory subsystems, a block interface and storage subsystems designed for slow I/O devices make it difficult to efficiently exploit NVRAMs in a portable and extensible way. We propose an object-based storage model as a way of addressing the shortfalls of the current interfaces. Through the design of Muninn, an object-based versioning key-value st...

  13. Newton method for determining the optimal replenishment policy for EPQ model with present value

    Directory of Open Access Journals (Sweden)

    Wu Kuo-Jung Jeff

    2008-01-01

    Full Text Available This paper is a response to the paper of Dohi, Kaio and Osaki published in RAIRO: Operations Research, 26, 1-14 (1992), for an EPQ model with present value. The purpose of this paper is threefold. First, the convexity and increasing properties of the first derivative of the objective function are proved. Second, we apply the Newton method to find the optimal cycle time. Third, we provide some numerical examples to demonstrate that the Newton method is more efficient than the bisection method.
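    For readers unfamiliar with the numerical side, the following sketch shows a Newton iteration applied to the first derivative of a cost function to locate an optimal cycle time. The objective used here is a generic convex inventory-style cost, not the EPQ-with-present-value function from the paper, so the functions and constants are placeholders.

```python
def newton_root(df, d2f, t0, tol=1e-10, max_iter=100):
    """Solve df(T) = 0 by Newton's method, given the derivative d2f of df."""
    t = t0
    for _ in range(max_iter):
        step = df(t) / d2f(t)
        t -= step
        if abs(step) < tol:
            break
    return t

# Illustrative convex cost C(T) = A/T + h*T, whose minimiser has the closed
# form sqrt(A/h); Newton on C'(T) should reproduce it.
A, h = 120.0, 3.0
T_opt = newton_root(lambda T: -A / T**2 + h, lambda T: 2 * A / T**3, t0=1.0)
print(round(T_opt, 4), round((A / h) ** 0.5, 4))
```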

  14. A Multiple—Valued Algebra for Modeling MOS VLSI Circuits at Switch—Level

    Institute of Scientific and Technical Information of China (English)

    胡谋

    1992-01-01

    A multiple-valued algebra for modeling MOS VLSI circuits at switch level is proposed in this paper, and its structure and properties are studied. This algebra can be used to transform a MOS digital circuit into a switch-level algebraic expression so as to generate the truth table for the circuit and to derive a Boolean expression for it. Methods to construct a switch-level algebraic expression for a circuit and methods to simplify expressions are given. This algebra provides a new tool for MOS VLSI circuit design and analysis.

  15. A Correlated Random Effects Model for Nonignorable Missing Data in Value-Added Assessment of Teacher Effects

    Science.gov (United States)

    Karl, Andrew T.; Yang, Yan; Lohr, Sharon L.

    2013-01-01

    Value-added models have been widely used to assess the contributions of individual teachers and schools to students' academic growth based on longitudinal student achievement outcomes. There is concern, however, that ignoring the presence of missing values, which are common in longitudinal studies, can bias teachers' value-added scores.…

  16. 8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-03

    Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve load over many years or decades. CEMs can be computationally complex and are often forced to estimate key parameters using simplified methods to achieve acceptable solve times or for other reasons. In this paper, we discuss one of these parameters -- capacity value (CV). We first provide a high-level motivation for and overview of CV. We next describe existing modeling simplifications and an alternate approach for estimating CV that utilizes hourly '8760' data of load and VG resources. We then apply this 8760 method to an established CEM, the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) model (Eurek et al. 2016). While this alternative approach for CV is not itself novel, it contributes to the broader CEM community by (1) demonstrating how a simplified 8760 hourly method, which can be easily implemented in other power sector models when data is available, more accurately captures CV trends than a statistical method within the ReEDS CEM, and (2) providing a flexible modeling framework from which other 8760-based system elements (e.g., demand response, storage, and transmission) can be added to further capture important dynamic interactions, such as curtailment.

  17. Solving inverse problem for Markov chain model of customer lifetime value using flower pollination algorithm

    Science.gov (United States)

    Al-Ma'shumah, Fathimah; Permana, Dony; Sidarto, Kuntjoro Adji

    2015-12-01

    Customer Lifetime Value (CLV) is an important and useful concept in marketing. One of its benefits is to help a company budget marketing expenditure for customer acquisition and customer retention. Many mathematical models have been introduced to calculate CLV under the customer retention/migration classification scheme. A fairly new class of these models, described in this paper, uses Markov Chain Models (MCM). This class of models has the major advantage of being flexible enough to be adapted to several different cases/classification schemes. In this model, the probabilities of customer retention and acquisition play an important role. Following Pfeifer and Carraway (2000), the final CLV formula obtained from an MCM usually contains a nonlinear form of the transition probability matrix. This nonlinearity makes the inverse problem of CLV difficult to solve. This paper aims to solve this inverse problem, yielding approximate transition probabilities for the customers, by applying the Flower Pollination Algorithm, a metaheuristic optimization algorithm developed by Yang (2013). The main use of the obtained transition probabilities is to set goals for marketing teams in keeping the relative frequencies of customer acquisition and customer retention.
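    A minimal sketch of the forward CLV computation that such an inverse problem works against, in the spirit of a Pfeifer-and-Carraway-style customer-state Markov chain: expected discounted margins are accumulated over a finite horizon. The two-state matrix, the margins, and the discount rate are invented for illustration.

```python
import numpy as np

def clv_markov(P, margin, discount, horizon=200):
    """Expected CLV per starting state for a customer-state Markov chain.

    P        : (n, n) one-period transition probability matrix
    margin   : length-n expected profit collected while in each state
    discount : per-period discount rate
    """
    P = np.asarray(P, dtype=float)
    margin = np.asarray(margin, dtype=float)
    value = np.zeros_like(margin)
    Pt = np.eye(len(margin))                      # P**0
    for t in range(horizon):
        value += (Pt @ margin) / (1 + discount) ** t
        Pt = Pt @ P
    return value

# Two states: active customer and lapsed customer (illustrative numbers only).
P = [[0.7, 0.3],   # active -> active / lapsed
     [0.2, 0.8]]   # lapsed -> active / lapsed
print(clv_markov(P, margin=[50.0, 0.0], discount=0.1))
```

    A metaheuristic such as the Flower Pollination Algorithm would then search over candidate transition matrices so that the CLV implied by this forward computation matches an observed or target value.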

  18. [The Quality of Life Questionnaire EQ-5D: modelling and norm values for the general population].

    Science.gov (United States)

    Hinz, Andreas; Klaiberg, Antje; Brähler, Elmar; König, Hans-Helmut

    2006-02-01

    The EQ-5D is a short questionnaire for measuring health related quality of life. The aim of the present study is to test psychometric properties of this instrument when applied to the general population. 2022 subjects between 16 and 93 years of age were randomly selected and tested with the EQ-5D. 60 % of the sample reported no health problems in all items, that is, there was a great ceiling effect. The comparison between two models with preference based index measurement and one simple sum score showed that the sum score is at least as appropriate as the two other more sophisticated models. The visual analogue scale of the EQ-5D proved to be practicable and useful for the application in the general population. Norm values for the visual analogue scale and the EQ-5D are presented.

  19. On the Expected Present Value of Total Dividends in a Risk Model with Potentially Delayed Claims

    Institute of Scientific and Technical Information of China (English)

    Xie Jie-hua; Zou Wei

    2013-01-01

    In this paper, we consider a risk model in which two types of individual claims, main claims and by-claims, are defined. Every by-claim is induced by the main claim randomly and may be delayed for one time period with a certain probability. The dividend policy that certain amount of dividends will be paid as long as the surplus is greater than a constant dividend barrier is also introduced into this delayed claims risk model. By means of the probability generating functions, formulae for the expected present value of total dividend payments prior to ruin are obtained for discrete-type individual claims. Explicit expressions for the corresponding results are derived for Kn claim amount distributions. Numerical illustrations are also given.

  20. Semiparametric models of time-dependent predictive values of prognostic biomarkers.

    Science.gov (United States)

    Zheng, Yingye; Cai, Tianxi; Stanford, Janet L; Feng, Ziding

    2010-03-01

    Rigorous statistical evaluation of the predictive values of novel biomarkers is critical prior to applying novel biomarkers into routine standard care. It is important to identify factors that influence the performance of a biomarker in order to determine the optimal conditions for test performance. We propose a covariate-specific time-dependent positive predictive values curve to quantify the predictive accuracy of a prognostic marker measured on a continuous scale and with censored failure time outcome. The covariate effect is accommodated with a semiparametric regression model framework. In particular, we adopt a smoothed survival time regression technique (Dabrowska, 1997, The Annals of Statistics 25, 1510-1540) to account for the situation where risk for the disease occurrence and progression is likely to change over time. In addition, we provide asymptotic distribution theory and resampling-based procedures for making statistical inference on the covariate-specific positive predictive values. We illustrate our approach with numerical studies and a dataset from a prostate cancer study.

  1. Assessment of Forest Conservation Value Using a Species Distribution Model and Object-based Image Analysis

    Science.gov (United States)

    Jin, Y.; Lee, D. K.; Jeong, S. G.

    2015-12-01

    The ecological and social values of forests have recently been highlighted. Assessments of the biodiversity of forests, as well as their other ecological values, play an important role in regional and national conservation planning. The preservation of habitats is linked to the protection of biodiversity. For mapping habitats, a species distribution model (SDM) is used to predict the suitable habitat of significant species, and such distribution modeling is increasingly being used in conservation science. However, pixel-based analysis does not contain contextual or topological information. In order to provide more accurate habitat predictions, a continuous-field view that better approximates the real world is required. Here we analyze and compare, at different scales, habitats of the yellow marten (Martes flavigula), which is a top predator and also an umbrella species in South Korea. The object scale, which groups pixels with similar spatial and spectral characteristics, and the pixel scale were used for the SDM. Our analysis using the SDM at different scales suggests that object-scale analysis provides a superior representation of continuous habitat, and thus will be useful in forest conservation planning as well as for species habitat monitoring.

  2. Estimation of economic values for traits of dairy sheep: I. Model development.

    Science.gov (United States)

    Wolfová, M; Wolf, J; Krupová, Z; Kica, J

    2009-05-01

    A bioeconomic model was developed to estimate effects of change in production and functional traits on profit of dairy or dual-purpose milked sheep under alternative management systems. The flock structure was described in terms of animal categories and probabilities of transitions among them, and a Markov chain approach was used to calculate the stationary state of the resultant ewe flock. The model included both deterministic and stochastic components. Performance for most traits was simulated as the population average, but variation in several traits was taken into account. Management options included lambing intervals, mating system, and culling strategy for ewes, weaning and marketing strategy for progeny, and feeding system. The present value of profit computed as the difference between total revenues and total costs per ewe per year, both discounted to the birth date of the animals, was used as the criterion for economic efficiency of the production system in the stationary state. Economic values (change in system profit per unit change in the trait) of up to 35 milk production, growth, carcass, wool, and functional traits may be estimated.
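    A minimal sketch of the Markov-chain step mentioned above: given a transition matrix among animal categories, the stationary flock structure can be obtained by repeated multiplication. The three categories and the probabilities are invented for illustration and are not the traits or parameters of the published model.

```python
import numpy as np

def stationary_distribution(P, n_iter=1000):
    """Stationary state of a finite Markov chain by repeated multiplication."""
    P = np.asarray(P, dtype=float)
    pi = np.full(P.shape[0], 1.0 / P.shape[0])    # start from a uniform flock
    for _ in range(n_iter):
        pi = pi @ P
    return pi

# Illustrative categories: ewe lamb, first-parity ewe, mature ewe.
# Culled or lost animals are assumed to be replaced by new ewe lambs,
# so each row sums to one.
P = [[0.10, 0.90, 0.00],
     [0.15, 0.00, 0.85],
     [0.35, 0.00, 0.65]]
print(stationary_distribution(P).round(3))
```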

  3. Modelling and Optimising the Value of a Hybrid Solar-Wind System

    Science.gov (United States)

    Nair, Arjun; Murali, Kartik; Anbuudayasankar, S. P.; Arjunan, C. V.

    2017-05-01

    In this paper, a net present value (NPV) approach for a solar hybrid system is presented. The system in question aims to support an investor by assessing an investment in a solar-wind hybrid system in a given area. The approach follows a combined process of modelling the system and optimizing the major investment-related variables to maximize the financial yield of the investment. The use of a solar-wind hybrid supply presents significant potential for cost reduction. The investment variables concern the location of the solar-wind plant and its sizing. The system is demand driven, meaning that its primary aim is to fully satisfy the energy demand of the customers. Therefore, the model is a practical tool in the hands of an investor to assess and optimize, in financial terms, an investment aimed at covering real energy demand. Optimization is performed subject to various technical and logical constraints. The relation between the maximum power obtained from the individual systems and from the hybrid system as a whole, alongside the net present value of the system, is highlighted.
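    A minimal sketch of the NPV calculation at the core of such an approach; the capital cost, annual revenue, lifetime, and discount rate below are placeholders rather than values from the paper.

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows; cashflows[0] is the year-0 outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative hybrid plant: an up-front capital cost followed by 20 years
# of net revenue from selling the generated energy.
capex = -1_200_000.0
annual_revenue = 160_000.0
flows = [capex] + [annual_revenue] * 20
print(f"NPV at an 8% discount rate: {npv(0.08, flows):,.0f}")
```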

  4. Modelling EuroQol health-related utility values for diabetic complications from CODE-2 data.

    Science.gov (United States)

    Bagust, Adrian; Beale, Sophie

    2005-03-01

    Recent research has employed different analytical techniques to estimate the impact of the various long-term complications of type 2 diabetes on health-related utility and health status. However, limited patient numbers or lack of variety of patient experience has limited their power to discriminate between separate complications and grades of severity. In this study alternative statistical model forms were compared to investigate the influence of various factors on self-assessed health status and calculated utility scores, including the presence and severity of complications, and type of diabetes therapy. Responses to the EuroQol EQ-5D questionnaire from 4641 patients with type 2 diabetes in 5 European countries were analysed. Simple multiple regression analysis was used to model both visual analogue scale (VAS) scores and time trade-off index scores (TTO). Also, two complex models were developed for TTO analysis using a structure suggested by the EuroQol calculation algorithm. Both VAS and TTO models achieved greater explanatory power than in earlier studies. Relative weightings for individual complications differed between VAS and TTO scales, reflecting the strong influence of loss of mobility and severe pain in the EuroQol algorithm. Insulin-based therapy was uniformly associated with a detrimental effect equivalent to an additional moderate complication. Evidence was found that TTO values are not responsive in cases where 3 or more multiple complications are present, and therefore may underestimate utility loss for patients most adversely affected by complex chronic diseases like diabetes.

  5. Analysis of vibroacoustic properties of dimpled beams using a boundary value model

    Science.gov (United States)

    Myers, Kyle R.

    Attention has been given recently to the use of dimples as a means of passively altering the vibroacoustic properties of structures. Because of their geometric complexity, previous studies have modeled dimpled structures using the finite element method. However, the dynamics of dimpled structures are not completely understood. The goal of this study is to provide a better understanding of these structures through the development of a boundary value model (BVM) using Hamilton's Variational Principle. The focus of this study is on dimpled beams, which represent the simplest form of a dimpled structure. A general model of a beam with N dimples in free vibration is developed. Since dimples formed via a stamping process do not change the mass of the beam, the dimple thickness is less than that of the straight segments. Differential equations of motion that describe the normal and axial motion of the dimpled beams are derived. Their numerical solution yields the natural frequencies and analytical mode shapes of a dimpled beam. The accuracy of this model is checked against those obtained using the finite element method, as well as the analytical studies on the vibrations of arches, and shown to be accurate. The effect of dimple placement, dimple angle, its chord length, its thickness, as well as beam boundary conditions on beam natural frequencies and mode shapes are investigated. For beams with axially restrictive boundary conditions, the results.

  6. Interval-valued intuitionistic fuzzy multi-criteria model for design concept selection

    Directory of Open Access Journals (Sweden)

    Daniel Osezua Aikhuele

    2017-09-01

    Full Text Available This paper presents a new approach for design concept selection by using an integrated Fuzzy Analytical Hierarchy Process (FAHP) and an interval-valued intuitionistic fuzzy modified TOPSIS (IVIF-modified TOPSIS) model. The integrated model, which uses the improved score function and a weighted normalized Euclidean distance method to calculate the separation measures of alternatives from the positive and negative intuitionistic ideal solutions, provides a new approach for the computation of intuitionistic fuzzy ideal solutions. The results of the two approaches are integrated using a reflection defuzzification integration formula. To ensure the feasibility and rationality of the integrated model, the method is successfully applied to evaluating and selecting several design-related problems, including a real-life case study for the selection of the best concept design for a new printed circuit board (PCB) and a hypothetical example. The model, which provides a novel alternative, is compared with similar computational methods in the literature.
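    For orientation, the sketch below implements the classic crisp TOPSIS closeness coefficient (weighted normalized decision matrix, ideal and anti-ideal solutions, Euclidean separation measures). The fuzzy AHP weighting and the interval-valued intuitionistic fuzzy extensions of the paper are not reproduced, and the scores and weights are invented.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives by the classic TOPSIS closeness coefficient."""
    X = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    benefit = np.asarray(benefit, dtype=bool)

    V = w * X / np.linalg.norm(X, axis=0)          # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)      # separation from the ideal
    d_neg = np.linalg.norm(V - anti, axis=1)       # separation from the anti-ideal
    return d_neg / (d_pos + d_neg)                 # closeness: larger is better

# Three design concepts scored on four criteria; the last criterion is a cost.
scores = [[7, 8, 6, 3], [9, 6, 7, 5], [6, 9, 8, 4]]
cc = topsis(scores, weights=[0.3, 0.3, 0.2, 0.2], benefit=[True, True, True, False])
print(cc.round(3), "-> best concept:", int(cc.argmax()) + 1)
```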

  7. A modelling approach to find stable and reliable soil organic carbon values for further regionalization.

    Science.gov (United States)

    Bönecke, Eric; Franko, Uwe

    2015-04-01

    Soil organic matter (SOM) and soil organic carbon (SOC) may be the most important components for describing the fertility of agriculturally used soils. SOC is sensitive to temporal and spatial changes due to varying weather conditions, uneven crops and soil management practices, and reliable delineation of its spatial variability remains difficult. Soil organic carbon, furthermore, is an essential initial parameter for dynamic modelling, for example for understanding carbon and nitrogen processes, yet obtaining and using this information requires cost- and time-intensive field and laboratory work. The objective of this study is to assess an approach that reduces the effort of laboratory and field analyses by using a method to find stable initial soil organic carbon values for further soil process modelling and regionalization at the field scale. The demand for strategies, techniques and tools that improve the reliability of high-resolution soil organic carbon maps while reducing cost constraints therefore continues to receive increasing attention in scientific research. Although combining effective sampling schemes with geophysical sensing techniques to describe within-field variability of soil organic carbon is nowadays a widely used practice, it is still challenged by large uncertainties, even at the field scale, in both science and agriculture. Therefore, an analytical and modelling approach might facilitate and improve this strategy at small and large field scales. This study presents a method for finding reliable steady-state values of soil organic carbon at particular points, using the established soil process model CANDY (Franko et al. 1995). It focuses on an iterative algorithm that adjusts the key driving components: soil physical properties, meteorological data and management information, for which we quantified the inputs and losses of soil carbon (manure, crop residues, other organic inputs, decomposition, leaching). Furthermore, this approach can be combined with geophysical

  8. Reference values and physiological characterization of a specific isolated pig kidney perfusion model

    Directory of Open Access Journals (Sweden)

    Meissler Michael

    2007-01-01

    Full Text Available Abstract Background Models of isolated and perfused kidneys are used to study the effects of drugs and hazardous or toxic substances on renal functions. Since physiological and morphological parameters of small laboratory animal kidneys are difficult to compare to human renal parameters, porcine kidney perfusion models have been developed to simulate conditions closer to the human situation, but exact values of renal parameters for different collection and perfusion conditions have not been reported so far. If the organs could be obtained from regular slaughtering processes, animal experiments might be avoided. Methods To assess renal perfusion quality, we analyzed different perfusion settings in a standardized model of porcine kidney hemoperfusion with organs collected in the operating theatre (OP: groups A-D) or in a public abattoir (SLA: group E) and compared the data to in vivo measurements in living animals (CON). Experimental groups had defined preservation periods (0, 2 and 24 hrs), one with additional albumin in the perfusate (C) for edema reduction. Results Varying perfusion settings resulted in different functional values (mean ± SD): blood flow (RBF) [ml/min*100 g]: (A) 339.9 ± 61.1; (C) 244.5 ± 53.5; (D) 92.8 ± 25.8; (E) 153.8 ± 41.5; glomerular filtration (GFR) [ml/min*100 g]: (CON) 76.1 ± 6.2; (A) 59.2 ± 13.9; (C) 25.0 ± 10.6; (D) 1.6 ± 1.3; (E) 16.3 ± 8.2; fractional sodium reabsorption (RFNa) [%]: (CON) 99.8 ± 0.1; (A) 82.3 ± 8.1; (C) 86.8 ± 10.3; (D) 38.4 ± 24.5; (E) 88.7 ± 5.8. Additionally, the tubular coupling ratio of Na-reabsorption/O2-consumption was determined (TNa/O2-cons [mmol-Na/mmol-O2]): (CON) 30.1; (A) 42.0; (C) 80.6; (D) 17.4; (E) 23.8, with OP and SLA organs showing comparable results. Conclusion In the present study functional values for isolated kidneys with different perfusion settings were determined to assess organ perfusion quality. It can be summarized that the hemoperfused porcine kidney can serve as a biological model with

  9. Valuing spatially dispersed environmental goods: A joint revealed and stated preference model to consistently separate use and non-use values

    OpenAIRE

    FERRINI, Silvia; Fezzi, Carlo; Day, Brett H.; BATEMAN, Ian J.

    2008-01-01

    We argue that the literature concerning the valuation of non-market, spatially defined goods (such as those provided by the natural environment) is crucially deficient in two respects. First, it fails to employ a theoretically consistent structural model of utility to the separate and hence correct definition of use and non-use values. Second, applications (particularly those using stated preference methods) typically fail to capture the spatially complex distribution of resources and their s...

  10. Sensitivity analysis of fine sediment models using heterogeneous data

    Science.gov (United States)

    Kamel, A. M. Yousif; Bhattacharya, B.; El Serafy, G. Y.; van Kessel, T.; Solomatine, D. P.

    2012-04-01

    Sediments play an important role in many aquatic systems. Their transportation and deposition have significant implications for morphology, navigability and water quality. Understanding the dynamics of sediment transportation in time and space is therefore important for planning interventions and making management decisions. This research is related to the fine sediment dynamics in the Dutch coastal zone, which is subject to human interference through constructions, fishing, navigation, sand mining, etc. These activities affect the natural flow of sediments and sometimes lead to environmental concerns or affect the siltation rates in harbours and fairways. Numerical models are widely used in studying fine sediment processes. The accuracy of numerical models depends upon the estimation of model parameters through calibration. Studying the model uncertainty related to these parameters is important in improving the spatio-temporal prediction of suspended particulate matter (SPM) concentrations, and in determining the limits of their accuracy. This research deals with the analysis of a 3D numerical model of the North Sea covering the Dutch coast using the Delft3D modelling tool (developed at Deltares, The Netherlands). The methodology in this research was divided into three main phases. The first phase focused on analysing the performance of the numerical model in simulating SPM concentrations near the Dutch coast by comparing the model predictions with SPM concentrations estimated from NASA's MODIS sensors at different time scales. The second phase focused on carrying out a sensitivity analysis of model parameters. Four model parameters were identified for the uncertainty and sensitivity analysis: the sedimentation velocity, the critical shear stress above which re-suspension occurs, the Shields shear stress for re-suspension pick-up, and the re-suspension pick-up factor. By adopting different values of these parameters the numerical model was run and a comparison between the
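    A minimal sketch of a one-at-a-time sensitivity sweep over the four parameters named above. The actual study drives Delft3D and compares against MODIS-derived SPM; here the model runner is a toy response surface and all parameter values are placeholders.

```python
# One-at-a-time sensitivity sweep over the four sediment parameters.
baseline = {
    "sedimentation_velocity": 0.5e-3,   # m/s
    "critical_shear_stress": 0.2,       # N/m^2
    "shields_stress": 0.05,             # dimensionless
    "pickup_factor": 1.0e-4,            # kg/m^2/s
}

def run_model(params):
    """Stand-in for a Delft3D run: a toy mean-SPM response to the parameters."""
    return (100.0
            * params["pickup_factor"] / 1.0e-4
            * 0.5e-3 / params["sedimentation_velocity"]
            * 0.2 / params["critical_shear_stress"]
            * params["shields_stress"] / 0.05)

reference = run_model(baseline)
for name in baseline:
    for factor in (0.5, 2.0):
        perturbed = dict(baseline, **{name: baseline[name] * factor})
        change = (run_model(perturbed) - reference) / reference * 100
        print(f"{name} x{factor}: {change:+.1f}% change in mean SPM")
```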

  11. Old but sexy: Value creation of old technology-based businesses models

    Directory of Open Access Journals (Sweden)

    Oke Christian Beckmann

    2016-12-01

    Full Text Available The purpose of this study is to contribute to a better understanding of the strategic and organisational configurations that companies can use to generate value with product-market systems and their business models that were dominant in the past but have been forced back into niche positions by innovation. The formerly dominant music format, vinyl, was rapidly substituted after the introduction of digital music. However, even today some customers use and buy old technology-based products: vinyl sales have been booming again since 2007. Due to the two-sided nature of the market, customers have to get access to complementary goods. We are thus interested in technologies which have been outdated by the emergence of new technologies. The originality lies in the combination of the two areas: business models and old technologies. Furthermore, vinyl is an example not analysed in depth by scholars so far. We approached this by undertaking an in-depth literature review to generate hypotheses regarding the value-adding activities of old-technology-based businesses as a basis for further research in this area. In addition, the paper gives insights into the constellations to be expected over time for old technology-based business models in platform markets. We here focus on a neglected topic in the strategy literature which, however, bears relevance for many businesses locked into product-market systems that make it hard for them to (completely) switch to a new technology emerging in the market. It is especially valuable to describe the consequences in a systematic fashion.

  12. ARM Cloud Radar Simulator Package for Global Climate Models Value-Added Product

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yuying [North Carolina State Univ., Raleigh, NC (United States); Xie, Shaocheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-05-01

    It has been challenging to directly compare U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility ground-based cloud radar measurements with climate model output because of limitations or features of the observing processes and the spatial gap between the model and the single-point measurements. To facilitate the use of ARM radar data in numerical models, an ARM cloud radar simulator was developed to convert model data into pseudo-ARM cloud radar observations that mimic the instrument view of a narrow atmospheric column (as compared to a large global climate model [GCM] grid cell), thus allowing meaningful comparison between model output and ARM cloud observations. The ARM cloud radar simulator value-added product (VAP) was developed based on the CloudSat simulator contained in the community satellite simulator package, the Cloud Feedback Model Intercomparison Project (CFMIP) Observation Simulator Package (COSP) (Bodas-Salcedo et al., 2011), which has been widely used in climate model evaluation with satellite data (Klein et al., 2013; Zhang et al., 2010). The essential part of the CloudSat simulator is the QuickBeam radar simulator, which is used to produce CloudSat-like radar reflectivity but is capable of simulating reflectivity for other radars (Marchand et al., 2009; Haynes et al., 2007). Adapting QuickBeam to the ARM cloud radar simulator within COSP required two primary changes: one was to set the frequency to 35 GHz for the ARM Ka-band cloud radar, as opposed to the 94 GHz used for the CloudSat W-band radar, and the second was to invert the view from the ground to space so as to attenuate the beam correctly. In addition, the ARM cloud radar simulator uses a finer vertical resolution (100 m compared to 500 m for CloudSat) to resolve the more detailed structure of clouds captured by the ARM radars. The ARM simulator has been developed following the COSP workflow (Figure 1) and using the capabilities available in COSP

  13. Support System Model for Value based Group Decision on Roof System Selection

    Directory of Open Access Journals (Sweden)

    Christiono Utomo

    2011-02-01

    Full Text Available A group decision support system is required for value-based decisions because there are different concerns caused by differing preferences, experiences, and backgrounds. It enables each decision-maker to evaluate and rank the solution alternatives before engaging in negotiation with other decision-makers. Stakeholders in multi-criteria decision-making problems usually evaluate the alternative solutions from different perspectives, making it possible to have a dominant solution among the alternatives. Each stakeholder needs to identify the goals that can be optimized and those that can be compromised in order to reach an agreement with other stakeholders. This paper presents a group decision model involving three decision-makers on the selection of a suitable system for a building's roof. The objective of the research is to find an agreement-options model and coalition algorithms for multi-person decisions with two main value preferences, namely function and cost. The methodology combines a value analysis method using the Function Analysis System Technique (FAST) and Life Cycle Cost analysis, a group decision analysis method based on the Analytical Hierarchy Process (AHP) for satisficing options, and a game-theory-based agent system to develop agreement options and coalition formation for the support system. The support system bridges the theoretical gap between automated design in the construction domain and automated negotiation in the information technology domain by providing a structured methodology which can lead to a systematic support system and automated negotiation. It will contribute to the value management body of knowledge as an advanced method for the creativity and analysis phases, since the practice of this knowledge is teamwork based. In the case of roof system selection, it reveals the start of the first negotiation round. Some of the solutions are not an option because no individual stakeholder or coalition of stakeholders desires to select them. The result indicates

  14. Modeling the Value of Integrated Canadian and U.S. Power Sector Expansion

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Beiter, Philipp; Steinberg, Daniel

    2016-09-08

    The U.S. and Canadian power systems are not isolated. Cross-border transmission and coordination of system operation create an interconnected power system, which results in combined imports and exports of electricity of greater than 70 TWh per year [1]. Currently, over 5 GW of new international transmission lines are in various stages of permitting and development. These lines may enable greater integration and coordination of the U.S. and Canadian systems, which can in turn reduce challenges associated with integration of high penetrations of variable renewables. Furthermore, low-cost Canadian resources, such as wind and hydro, could contribute to compliance with the EPA's recently released Clean Power Plan. Improving integration and coordination internationally will reduce the costs of accessing these resources. This analysis builds on previous work by Ibanez and Zinaman [2]. In this work we seek to better understand the value of additional interconnection between the U.S. and Canadian power systems. Specifically, we quantify the value of additional interconnection and coordination within the Canadian-US integrated power system under scenarios in which large reductions (>80%) in power sector CO2 emissions are achieved. We explore how the ability to add additional cross-border transmission impacts capacity investment, the generation mix, system costs, and the ability of the system to integrate variable renewable energy into the power system. This analysis uses the Regional Energy Deployment System (ReEDS) capacity expansion model [3], [4] to quantify the value of the integrated power system expansion of the United States and Canada. ReEDS is an optimization model that assesses the deployment and operation (including transmission) of the electricity sector of the contiguous United States and Canadian provinces from 2016 through 2050. It has the ability to model the integration of renewable energy technologies into the grid. ReEDS captures renewable

  15. The value of "black-box" neural network modeling in subsurface flow prediction

    Science.gov (United States)

    Paleologos, E.; Skitzi, I.; Katsifarakis, K.

    2012-04-01

    In several hydrologic cases the complexity of the processes involved, tied in with the uncertainty in the subsurface geologic environment, geometries, and boundary conditions, cannot be addressed by constitutive relationships, either in a deterministic or a stochastic framework. "Black-box" models are used routinely in surface hydrologic predictions, but in subsurface hydrology there is still a tendency to rely on physical descriptions, even in problems where the geometry, the medium, the processes, and the boundary conditions are largely unknown. Subsurface flow in karstic environments exemplifies all the above complexities and uncertainties, rendering the use of physical models impractical. The current study uses neural networks to demonstrate that "black-box" models can provide useful predictions even in the absence of physical process descriptions. Daily discharges of two springs lying in a karstic environment were simulated for a period of two and a half years with the use of a multi-layer perceptron back-propagation neural network. Missing discharge values were supplemented by assuming linear relationships during base flow conditions, thus extending the length of the data record during the network's training phase and improving its performance. The time lag between precipitation and spring discharge differed significantly for the two springs, indicating that in karstic environments hydraulic behavior is dominated, even within a few hundred meters, by local conditions. Optimum training results were attained with a Levenberg-Marquardt algorithm resulting in a network architecture consisting of two input layer neurons, four hidden layer neurons, and one output layer neuron, the spring's discharge. The neural network's predictions captured the behavior for both springs and followed very closely the discontinuities in the discharge time series. Under/over-estimation of observed discharges for the two springs remained below 3%, with the exception of a few local maxima where
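    A small sketch of the 2-4-1 network architecture mentioned in the abstract, trained here on synthetic precipitation/discharge data with scikit-learn. The original study trained with a Levenberg-Marquardt algorithm, which scikit-learn does not offer, so the LBFGS solver stands in for it, and all data are fabricated for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Synthetic stand-in for the real record: inputs are the previous day's
# precipitation and discharge; the target is today's spring discharge.
n_days = 900
precip = rng.gamma(shape=0.8, scale=5.0, size=n_days)
discharge = np.zeros(n_days)
for t in range(1, n_days):
    discharge[t] = 0.85 * discharge[t - 1] + 0.4 * precip[t - 1] + rng.normal(0, 0.1)

X = np.column_stack([precip[:-1], discharge[:-1]])   # two input neurons
y = discharge[1:]                                     # one output neuron

model = MLPRegressor(hidden_layer_sizes=(4,), solver="lbfgs",
                     max_iter=5000, random_state=0)   # 2-4-1 architecture
model.fit(X[:700], y[:700])
print(f"Test R^2: {model.score(X[700:], y[700:]):.3f}")
```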

  16. Development of a theoretical model for measuring the perceived value of social responsibility of IPEN

    Energy Technology Data Exchange (ETDEWEB)

    Mutarelli, Rita de Cassia; Lima, Ana Cecilia de Souza; Sabundjian, Gaiane, E-mail: rmutarelli@gmail.com, E-mail: aclima@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Social responsibility has been one of the major topics of discussion in institutional management, and it is an important variable in the strategy and performance of institutions. The Instituto de Pesquisas Energeticas e Nucleares (IPEN) has worked on the development of environmental and social issues, mainly for the benefit of the population. The theory that guides social responsibility practices is always difficult to measure for several reasons. One reason for this difficulty is that social responsibility involves a variety of issues that are converted into rights, obligations and expectations of different audiences that can be internal or external to the organization. In addition, the differing understanding institutions have of social and environmental issues is another source of complexity. Based on the study context, including the topic being researched, the chosen institute and the questions resulting from the research, the aim of this paper is to propose a theoretical model to describe and analyze the social responsibility of IPEN. The main contribution of this study is to develop a model that integrates the dimensions of social responsibility. These dimensions, also called constructs, are composed of indexes and indicators that were previously used in various contexts of empirical research, combined with a theoretical and conceptual review of social responsibility. The construction of the proposed theoretical model was based on research into various methodologies and indicators for measuring social responsibility. The model was statistically tested, analyzed and adjusted, and the end result is a consistent model for measuring the perceived value of the social responsibility of IPEN. This work could also be applied to other institutions. Moreover, it may be improved and become a tool that serves as a thermometer for social and environmental issues and supports decision making in various management processes. (author)

  17. Physical Dating Violence: the potential understating value of a bi-factorial model

    Directory of Open Access Journals (Sweden)

    Carmen Viejo

    2014-01-01

    Full Text Available National and international studies have pointed out the Conflict Tactics Scale (CTS; Straus, 1979, 1996) as one of the most widely used measures for assessing the strategies used in situations of conflict within young couples. Nevertheless, there is no conclusive result about its structure. In particular, the physical dating violence scale has undergone several structural analyses yielding monofactorial and bifactorial structures. The aim of this study was the validation of structural models of the CTS within adolescent couples using confirmatory factor analysis (CFA), considering the differences between boys and girls and between aggressors and victims. 3258 adolescents, aged 15-21, were selected using a stratified random sample and interviewed using an adaptation of the CTS questionnaire. The results indicated that it is not possible to identify a single model fit, but boys and girls, aggressors and victims, show the same pattern: a bifactorial model which establishes different but correlated moderate aggressive behaviors and severe aggressive behaviors. These results are discussed in terms of the potential value of this two-factor structure for understanding the phenomenon.

  18. Evaluation of dynamically downscaled extreme temperature using a spatially-aggregated generalized extreme value (GEV) model

    Science.gov (United States)

    Wang, Jiali; Han, Yuefeng; Stein, Michael L.; Kotamarthi, Veerabhadra R.; Huang, Whitney K.

    2016-11-01

    The Weather Research and Forecasting (WRF) model downscaling skill in extreme maximum daily temperature is evaluated by using the generalized extreme value (GEV) distribution. While the GEV distribution has been used extensively in climatology and meteorology for estimating probabilities of extreme events, accurately estimating GEV parameters based on data from a single pixel can be difficult, even with fairly long data records. This work proposes a simple method assuming that the shape parameter, the most difficult of the three parameters to estimate, does not vary over a relatively large region. This approach is applied to evaluate 31-year WRF-downscaled extreme maximum temperature through comparison with North American regional reanalysis (NARR) data. Uncertainty in GEV parameter estimates and the statistical significance of the differences in estimates between WRF and NARR are accounted for by conducting a novel bootstrap procedure that makes no assumption of temporal or spatial independence within a year, which is especially important for climate data. Despite certain biases over parts of the United States, overall, WRF shows good agreement with NARR in the spatial pattern and magnitudes of GEV parameter estimates. Both WRF and NARR show a significant increase in extreme maximum temperature over the southern Great Plains and southeastern United States in January and over the western United States in July. The GEV model shows clear benefits from the regionally constant shape parameter assumption, for example, leading to estimates of the location and scale parameters of the model that show coherent spatial patterns.
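    A minimal sketch of block-maxima GEV fitting with scipy, first cell by cell and then with one shape parameter shared across nearby cells, which loosely mirrors the regionally constant shape assumption described above. The data are synthetic, the pooling rule is a simple standardize-and-concatenate heuristic rather than the paper's estimator, and no bootstrap is shown. Note that scipy's shape parameter c equals minus the usual GEV shape xi.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic 31-year annual-maximum series for three nearby grid cells that
# share a shape parameter but differ in location and scale.
true_shape = -0.1
cells = [genextreme.rvs(true_shape, loc=mu, scale=s, size=31, random_state=seed)
         for seed, (mu, s) in enumerate([(35.0, 2.0), (36.0, 2.2), (34.0, 1.8)])]

# Per-cell fits: all three GEV parameters from only 31 values each (noisy).
per_cell = [genextreme.fit(x) for x in cells]

# "Regional" fit: estimate one shared shape from pooled standardized data,
# then re-fit each cell's location and scale with the shape held fixed.
pooled = np.concatenate([(x - x.mean()) / x.std() for x in cells])
shared_shape = genextreme.fit(pooled)[0]
regional = [genextreme.fit(x, f0=shared_shape) for x in cells]

print("per-cell shape estimates:", [round(p[0], 2) for p in per_cell])
print("shared shape estimate:  ", round(shared_shape, 2))
print("regional loc/scale fits:", [(round(p[1], 1), round(p[2], 2)) for p in regional])
```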

  19. METHOD OF GREEN FUNCTIONS IN MATHEMATICAL MODELLING FOR TWO-POINT BOUNDARY-VALUE PROBLEMS

    Directory of Open Access Journals (Sweden)

    E. V. Dikareva

    2015-01-01

    Full Text Available Summary. In many applied problems of control, optimization, system theory, theoretical and construction mechanics, for problems with string and rod structures, oscillation theory, theory of elasticity and plasticity, and mechanical problems connected with fracture dynamics and shock waves, the main instrument for studying these problems is the theory of high-order ordinary differential equations. This methodology is also applied to studying mathematical models in graph theory with different partitionings based on differential equations. Such equations are used not only for the theoretical foundation of mathematical models but also for constructing numerical methods and computer algorithms. These models are studied with the use of the Green's function method. The paper first presents the necessary theoretical background on the Green's function method for multi-point boundary-value problems. The main equation is discussed, and the notions of multi-point boundary conditions, boundary functionals, degenerate and non-degenerate problems, and the fundamental matrix of solutions are introduced. In the main part the problem under study is formulated in terms of shocks and deformations in the boundary conditions. After that the main results are formulated. In Theorem 1 conditions for the existence and uniqueness of solutions are proved. In Theorem 2 conditions are proved for strict positivity and equal measureness for a pair of solutions. In Theorem 3 existence and estimates are proved for the least eigenvalue, along with spectral properties and positivity of eigenfunctions. In Theorem 4 weighted positivity is proved for the Green's function. Some possible applications are considered for signal theory and transmutation operators.

  20. A New Homotopy Perturbation Scheme for Solving Singular Boundary Value Problems Arising in Various Physical Models

    Science.gov (United States)

    Roul, Pradip; Warbhe, Ujwal

    2017-08-01

    The classical homotopy perturbation method proposed by J. H. He, Comput. Methods Appl. Mech. Eng. 178, 257 (1999) is useful for obtaining the approximate solutions for a wide class of nonlinear problems in terms of series with easily calculable components. However, in some cases, it has been found that this method results in slowly convergent series. To overcome the shortcoming, we present a new reliable algorithm called the domain decomposition homotopy perturbation method (DDHPM) to solve a class of singular two-point boundary value problems with Neumann and Robin-type boundary conditions arising in various physical models. Five numerical examples are presented to demonstrate the accuracy and applicability of our method, including thermal explosion, oxygen-diffusion in a spherical cell and heat conduction through a solid with heat generation. A comparison is made between the proposed technique and other existing seminumerical or numerical techniques. Numerical results reveal that only two or three iterations lead to high accuracy of the solution and this newly improved technique introduces a powerful improvement for solving nonlinear singular boundary value problems (SBVPs).

  1. Convergence of Extreme Value Statistics in a Two-Layer Quasi-Geostrophic Atmospheric Model

    Directory of Open Access Journals (Sweden)

    Vera Melinda Gálfi

    2017-01-01

    Full Text Available We search for the signature of universal properties of extreme events, theoretically predicted for Axiom A flows, in a chaotic and high-dimensional dynamical system. We study the convergence of GEV (Generalized Extreme Value) and GP (Generalized Pareto) shape parameter estimates to the theoretical value, which is expressed in terms of the partial information dimensions of the attractor. We consider a two-layer quasi-geostrophic atmospheric model of the mid-latitudes, adopt two levels of forcing, and analyse the extremes of different types of physical observables (local energy, zonally averaged energy, and globally averaged energy). We find good agreement in the shape parameter estimates with the theory only in the case of more intense forcing, corresponding to a strong chaotic behaviour, for some observables (the local energy at every latitude). Due to the limited (though very large) data size and to the presence of serial correlations, it is difficult to obtain robust statistics of extremes in the case of the other observables. In the case of weak forcing, which leads to weaker chaotic conditions with regime behaviour, we find, unsurprisingly, worse agreement with the theory developed for Axiom A flows.

  2. Emerging Media Crisis Value Model: A Comparison of Relevant, Timely Message Strategies for Emergency Events

    Directory of Open Access Journals (Sweden)

    Sabrina Page

    2013-01-01

    Full Text Available Communication during an emergency or crisis event is essential for emergency responders, the community involved, and those watching on television as well as receiving information via social media from family members, friends or other community members. The evolution of communication during an emergency/crisis event now includes utilizing social media. To better understand this evolution, the Emerging Media Crisis Value Model (EMCVM) is used to compare two emergency events: Hurricane Irene (2011), a natural disaster, and the theater shooting in Aurora, Colorado (2012), a man-made crisis. The EMCVM provides a foundation for future studies focusing on the use of social media so that emergency responders at the local, state and national levels are better prepared to educate a community, thus counteracting public uncertainty and fear while providing timely, accurate information.

  3. A Multinomial Ordinal Probit Model with Singular Value Decomposition Method for a Multinomial Trait

    Directory of Open Access Journals (Sweden)

    Soonil Kwon

    2012-01-01

    Full Text Available We developed a multinomial ordinal probit model with singular value decomposition for testing a large number of single nucleotide polymorphisms (SNPs) simultaneously for association with multi-disease status when the sample size is much smaller than the number of SNPs. The validity and performance of the method were evaluated via simulation. We applied the method to our real study sample recruited through the Mexican-American Coronary Artery Disease study. We found 3 genes (SORCS1, AMPD1, and PPARα) to be associated with the development of both IGT and IFG, while 5 genes (AMPD2, PRKAA2, C5, TCF7L2, and ITR) were associated with the IGT mechanism only and 6 genes (CAPN10, IL4, NOS3, CD14, GCG, and SORT1) with the IFG mechanism only. These data suggest that IGT and IFG may reflect different physiological mechanisms leading to prediabetes, via different genetic determinants.

  4. Continuous review inventory models under time value of money and crashable lead time consideration

    Directory of Open Access Journals (Sweden)

    Hung Kuo-Chen

    2011-01-01

    Full Text Available A stock is an asset if it can react to economic and seasonal influences in the management of the current assets. The financial manager must calculate the input of funds to the stock intelligently and the amount of money cycled through stocks, taking into account the time factors in the future. The purpose of this paper is to propose an inventory model considering issues of crash cost and current value. The sensitivity analysis of each parameter, in this research, differs from the traditional approach. We utilize a course of deduction with sound mathematics to develop several lemmas and one theorem to estimate optimal solutions. This study first tries to find the optimal order quantity at all lengths of lead time with components crashed at their minimum duration. Second, a simple method to locate the optimal solution unlike traditional sensitivity analysis is developed. Finally, some numerical examples are given to illustrate all lemmas and the theorem in the solution algorithm.

  5. Partner Selection in a Virtual Enterprise: A Group Multiattribute Decision Model with Weighted Possibilistic Mean Values

    Directory of Open Access Journals (Sweden)

    Fei Ye

    2013-01-01

    Full Text Available This paper proposes an extended technique for order preference by similarity to ideal solution (TOPSIS) for partner selection in a virtual enterprise (VE). The imprecise and fuzzy information of the partner candidate and the risk preferences of decision makers are both considered in the group multiattribute decision-making model. The weighted possibilistic mean values are used to handle triangular fuzzy numbers in the fuzzy environment. A ranking procedure for partner candidates is developed to help decision makers with varying risk preferences select the most suitable partners. Numerical examples are presented to reflect the feasibility and efficiency of the proposed TOPSIS. Results show that the varying risk preferences of decision makers play a significant role in the partner selection process in VE under a fuzzy environment.

  6. Greenhouse crop residues: Energy potential and models for the prediction of their higher heating value

    Energy Technology Data Exchange (ETDEWEB)

    Callejon-Ferre, A.J.; Lopez-Martinez, J.A.; Manzano-Agugliaro, F. [Departamento de Ingenieria Rural, Universidad de Almeria, Ctra. Sacramento s/n, La Canada de San Urbano, 04120 Almeria (Spain); Velazquez-Marti, B. [Departamento de Ingenieria Rural y Agroalimentaria, Universidad Politecnica de Valencia, Camino de Vera s/n, 46022 Valencia (Spain)

    2011-02-15

    Almeria, in southeastern Spain, generates some 1,086,261 t year⁻¹ (fresh weight) of greenhouse crop (Cucurbita pepo L., Cucumis sativus L., Solanum melongena L., Solanum lycopersicum L., Phaseolus vulgaris L., Capsicum annuum L., Citrullus vulgaris Schrad. and Cucumis melo L.) residues. The energy potential of this biomass is unclear. The aim of the present work was to accurately quantify this variable, differentiating between crop species while taking into consideration the area they each occupy. This, however, required the direct analysis of the higher heating value (HHV) of these residues, involving very expensive and therefore not commonly available equipment. Thus, a further aim was to develop models for predicting the HHV of these residues, taking into account variables measured by elemental and/or proximate analysis, thus providing an economically attractive alternative to direct analysis. All the analyses in this work involved the use of worldwide-recognised standards and methods. The total energy potential for these plant residues, as determined by direct analysis, was 1,003,497.49 MW h year⁻¹. Twenty univariate and multivariate equations were developed to predict the HHV. The R² and adjusted R² values obtained for the univariate and multivariate models were 0.909 and 0.946 or above respectively. In all cases, the mean absolute percentage error varied between 0.344 and 2.533. These results show that any of these 20 equations could be used to accurately predict the HHV of crop residues. The residues produced by the Almeria greenhouse industry would appear to be an interesting source of renewable energy. (author)
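    A minimal sketch of fitting and scoring a multivariate linear HHV model of the kind described above, with R² and the mean absolute percentage error as the reported metrics. The elemental-analysis data and the underlying coefficients are synthetic, not the published equations.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)

# Synthetic elemental analysis (% C, % H, % N) and a made-up HHV relation [MJ/kg].
n = 40
C = rng.uniform(38, 48, n)
H = rng.uniform(4.5, 6.5, n)
N = rng.uniform(0.5, 3.0, n)
hhv = 0.35 * C + 1.1 * H + 0.1 * N + rng.normal(0, 0.3, n)

X = np.column_stack([C, H, N])
model = LinearRegression().fit(X, hhv)
pred = model.predict(X)

r2 = model.score(X, hhv)
mape = np.mean(np.abs((hhv - pred) / hhv)) * 100
print(f"R^2 = {r2:.3f}, MAPE = {mape:.2f}%")
```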

  7. What Is the Predictive Value of Animal Models for Vaccine Efficacy in Humans? Consideration of Strategies to Improve the Value of Animal Models.

    Science.gov (United States)

    Herati, Ramin Sedaghat; Wherry, E John

    2017-03-27

    Animal models are an essential feature of the vaccine design toolkit. Although animal models have been invaluable in delineating the mechanisms of immune function, their precision in predicting how well specific vaccines work in humans is often suboptimal. There are, of course, many obvious species differences that may limit animal models from predicting all details of how a vaccine works in humans. However, careful consideration of which animal models may have limitations should also allow more accurate interpretations of animal model data and more accurate predictions of what is to be expected in clinical trials. In this article, we examine some of the considerations that might be relevant to cross-species extrapolation of vaccine-related immune responses for the prediction of how vaccines will perform in humans.

  8. The value of using feasibility models in systematic conservation planning to predict landholder management uptake.

    Science.gov (United States)

    Tulloch, Ayesha I T; Tulloch, Vivitskaia J D; Evans, Megan C; Mills, Morena

    2014-12-01

    Understanding the social dimensions of conservation opportunity is crucial for conservation planning in multiple-use landscapes. However, factors that influence the feasibility of implementing conservation actions, such as the history of landscape management, and landholders' willingness to engage are often difficult or time consuming to quantify and rarely incorporated into planning. We examined how conservation agencies could reduce costs of acquiring such data by developing predictive models of management feasibility parameterized with social and biophysical factors likely to influence landholders' decisions to engage in management. To test the utility of our best-supported model, we developed 4 alternative investment scenarios based on different input data for conservation planning: social data only; biological data only; potential conservation opportunity derived from modeled feasibility that incurs no social data collection costs; and existing conservation opportunity derived from feasibility data that incurred collection costs. Using spatially explicit information on biodiversity values, feasibility, and management costs, we prioritized locations in southwest Australia to control an invasive predator that is detrimental to both agriculture and natural ecosystems: the red fox (Vulpes vulpes). When social data collection costs were moderate to high, the most cost-effective investment scenario resulted from a predictive model of feasibility. Combining empirical feasibility data with biological data was more cost-effective for prioritizing management when social data collection costs were low (<4% of the total budget). Calls for more data to inform conservation planning should take into account the costs and benefits of collecting and using social data to ensure that limited funding for conservation is spent in the most cost-efficient and effective manner. © 2014 Society for Conservation Biology.

  9. Demonstrating the value of community-based ('citizen science') observations for catchment modelling and characterisation

    Science.gov (United States)

    Starkey, Eleanor; Parkin, Geoff; Birkinshaw, Stephen; Large, Andy; Quinn, Paul; Gibson, Ceri

    2017-05-01

    Despite there being well-established meteorological and hydrometric monitoring networks in the UK, many smaller catchments remain ungauged. This leaves a challenge for characterisation, modelling, forecasting and management activities. Here we demonstrate the value of community-based ('citizen science') observations for modelling and understanding catchment response as a contribution to catchment science. The scheme implemented within the 42 km2 Haltwhistle Burn catchment, a tributary of the River Tyne in northeast England, has harvested and used quantitative and qualitative observations from the public in a novel way to effectively capture spatial and temporal river response. Community-based rainfall, river level and flood observations have been successfully collected and quality-checked, and used to build and run a physically-based, spatially-distributed catchment model, SHETRAN. Model performance using different combinations of observations is tested against traditionally-derived hydrographs. Our results show how the local network of community-based observations alongside traditional sources of hydro-information supports characterisation of catchment response more accurately than using traditional observations alone over both spatial and temporal scales. We demonstrate that these community-derived datasets are most valuable during local flash flood events, particularly towards peak discharge. This information is often missed or poorly represented by ground-based gauges, or significantly underestimated by rainfall radar, as this study clearly demonstrates. While community-based observations are less valuable during prolonged and widespread floods, or over longer hydrological periods of interest, they can still ground-truth existing traditional sources of catchment data to increase confidence during characterisation and management activities. Involvement of the public in data collection activities also encourages wider community engagement, and provides important

  10. Model of Religious Study and Moral Values in TK Putra Harapan Nalumsari Jepara

    Directory of Open Access Journals (Sweden)

    Mubasyaroh Mubasyaroh

    2016-12-01

    Full Text Available Religious and moral education needs to be instilled from an early age so that children will later have a strong and deep understanding of the norms and teachings of Islam. Early childhood and kindergarten (TK) age is a time for play, so education, and particularly religious education, should be designed carefully by the teacher so that the educational process becomes an active and enjoyable activity. Kindergarten (TK) Putra Harapan Nalumsari Jepara is one of the educational institutions providing early childhood education, and one of its lesson materials is moral and religious education. Moral education is a very important subject because it carries values, morals and religion, and thus becomes a guide for students in later life. The research method used is descriptive qualitative research, describing in detail the subjects and issues studied. The finding of this study is that the cultivation of religious values and morals for kindergarten children revolves around the activities of daily life. In particular, religious values are instilled by laying the foundations of faith, a commendable personality or character, and devotional practices, implemented in accordance with the child's ability. As an early childhood education institution, the school uses several delivery models in its teaching and learning: edutainment, habituation, and uswah hasanah (good example).

  11. Non-equal-interval direct optimizing Verhulst model that x(n) be taken as initial value and its application

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    To overcome the deficiencies of the existing Verhulst GM(1,1) model, and based on existing grey theory, a non-equal-interval direct optimum Verhulst GM(1,1) model is built which chooses a modified n-th component x(n) of X(0) as the starting condition of the grey differential model. It optimizes a modified β value and the background value, and performs the fitting optimization twice. The new model extends equal intervals to non-equal intervals and is suitable for general data modelling and parameter estimation...

  12. Market Value Estimation Models for Marine Surface Vessels with the Use of Multiple Regression Analysis.

    Science.gov (United States)

    1982-12-01

    ...which should be used when quantifying "value." Some common alternative measures may include terms such as book value, net realizable value, current ... brokerage fee) [Ref. 1: p. 9-6]. Depending on whether or not there are any preparation or brokerage costs, the net realizable value may be equivalent ... of petroleum-carrying vessels to the realizable value of their scrap steel. For example, the Motor Vessel EXXON FLORENCE was recently sold in Taiwan...

  13. Quakes Regime In A Heterogeneous Media With Strength Values Obeying The Cauchy Law: Model and Comparison With Seismic Data

    Science.gov (United States)

    Rodkin, M. V.

    Earthquake shear zones are known to consist of rigid asperities and weak zones [Das and Kostrov, 1983; et al.], i.e. rupture zones can be described as random systems consisting of subzones with very different strength values. It seems probable that the distribution of strength values of elements of geophysical media can be described by the Cauchy law [Kagan, 1994; Rodkin, 2001; et al.]. Here, the geophysical medium is modeled as a system of elements with strength values obeying the Cauchy law. A simple algorithm is used to model the generation of rupture zones in such a medium, and a catalog of artificial quakes is obtained as a result of the numerical modeling. An individual rupture zone is described by its length and initial strength, and the set of rupture zones is characterized by the applied stress value and the model b-value. The distributions of rupture-zone lengths obtained this way obey a law similar to the power law observed for real earthquakes. A number of correlations between the model stress value, the number of quakes, and the model b-value were found. Some of the model regularities agree with well-known properties of the seismic regime; others had not been revealed before, but were shown to agree with the world Harvard catalog data. Only one result (a positive correlation between the stress value and the number of quakes) seems to contradict the real seismic regime: in reality, a negative correlation is observed between the density of earthquakes and characteristic apparent stress values. This result testifies to the importance of the weakening of geophysical media during earthquake generation.
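
    The author's algorithm is not spelled out in the record, so the following is only a toy reproduction of the stated idea under explicit assumptions: a one-dimensional medium of elements whose strengths follow a (folded) Cauchy law, a uniform applied stress, and contiguous runs of failed elements counted as rupture zones. The location, scale, stress values and tail-exponent estimator are all illustrative choices.

```python
# Toy reproduction of the idea, not the paper's algorithm: elements with
# Cauchy-distributed strengths fail under an applied stress, and contiguous
# runs of failed elements are treated as rupture zones.
import numpy as np

rng = np.random.default_rng(0)

def rupture_lengths(n_elements, stress, loc=1.0, scale=0.3):
    # Heavy-tailed |Cauchy| strengths, as suggested for geophysical media.
    strength = np.abs(loc + scale * rng.standard_cauchy(n_elements))
    failed = strength < stress
    lengths, run = [], 0
    for f in failed:
        if f:
            run += 1
        elif run:
            lengths.append(run)
            run = 0
    if run:
        lengths.append(run)
    return np.array(lengths)

for stress in (0.6, 0.8, 1.0):
    L = rupture_lengths(200_000, stress)
    big = L[L >= 2]
    log_sum = np.sum(np.log(big / 2.0)) if len(big) else 0.0
    alpha = len(big) / log_sum if log_sum > 0 else float("nan")  # Hill-type tail exponent
    print(f"stress={stress:.1f}: {len(L)} ruptures, max length={L.max()}, "
          f"tail exponent ~ {alpha:.2f}")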

  14. Model grid and infiltration values for the transient ground-water flow model, Death Valley regional ground-water flow system, Nevada and California

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This digital data set defines the model grid and infiltration values simulated in the transient ground-water flow model of the Death Valley regional ground-water...

  16. Goals and Values in School: A Model Developed for Describing, Evaluating and Changing the Social Climate of Learning Environments

    Science.gov (United States)

    Allodi, Mara Westling

    2010-01-01

    This paper defines a broad model of the psychosocial climate in educational settings. The model was developed from a general theory of learning environments, on a theory of human values and on empirical studies of children's evaluations of their schools. The contents of the model are creativity, stimulation, achievement, self-efficacy,…

  17. Cardiopulmonary values in dogs with artificial model of caval syndrome in heartworm disease.

    Science.gov (United States)

    Kuwahara, Y; Kitagawa, H; Sasaki, Y; Ishihara, K

    1991-02-01

    Cardiopulmonary values were determined in dogs with an artificial model of heartworm caval syndrome, produced by inserting heartworm-like silicone tubes into the tricuspid valve orifice and right atrium. Fifteen to 25 tubes with knots were inserted in 6 dogs (knot group), and 7 to 11 tubes (small-number group) or 29 to 37 tubes (large-number group) without knots were inserted in 3 dogs each. After tube insertion, angiographic contrast medium infused into the right ventricle regurgitated into the right atrium in all cases, and the regurgitation was most severe in the large-number group. Electrocardiography showed atrial and/or ventricular premature beats. The heights of the a- and v-waves of the right atrial pressure curves were elevated in all groups, with the v-wave elevation most obvious in the large-number group. The pulmonary arterial pressure tended to fall or rise only slightly, while total pulmonary resistance increased in all groups. Right cardiac output decreased significantly in all cases. The right heart hemodynamics of the model may resemble those of spontaneous cases without disturbed pulmonary circulation.

  18. Modeling sediment transport in the lower Yellow River and dynamic equilibrium threshold value

    Institute of Scientific and Technical Information of China (English)

    HU; Chunhong; GUO; Qingchao

    2004-01-01

    A major problem in the lower Yellow River is the insufficient incoming water and excessive sediment supply, which results in serious deposition, a continuously rising river bed, and a severe flood control situation. To understand the sediment transport behaviour of the lower Yellow River and determine the relationship between sedimentation, incoming water and sediment, and water diversion along the river, a sediment mathematical model suited to the characteristics of the lower Yellow River has been developed. The model is first calibrated and verified against a large quantity of observed data, and it is then used to analyze the reduction in silting of the lower Yellow River achieved by the operation of Xiaolangdi Reservoir, the relationship between water diversion and channel sedimentation, and the critical equilibrium of sedimentation in the lower Yellow River. The threshold values for sedimentation equilibrium in the lower Yellow River are estimated; they suggest that deposition in the lower Yellow River can be effectively reduced by regulating flow and sediment from Xiaolangdi Reservoir, water and soil conservation, and controlling water diversion along the lower Yellow River.

  19. National scale multivariate extreme value modelling of waves, winds and sea levels

    Directory of Open Access Journals (Sweden)

    Gouldby Ben

    2016-01-01

    Full Text Available It has long been recognised that extreme coastal flooding can arise from the joint occurrence of extreme waves, winds and sea levels. The standard simplified joint probability approach used in England and Wales can result in an underestimation of flood risk unless correction factors are applied. This paper describes the application of a state-of-the-art multivariate extreme value model to offshore winds, waves and sea levels around the coast of England; the methodology overcomes the limitations of the traditional method. The output of the new statistical analysis is a Monte Carlo (MC) simulation comprising many thousands of offshore extreme events, and all of these events must be translated into overtopping rates for use as input to flood risk assessments. Because it is computationally impractical to transform all of the MC events from offshore to nearshore with the wave model itself, computationally efficient statistical emulators of the SWAN wave transformation model have been constructed; the emulators transform the thousands of MC events from the offshore to the nearshore. Whilst the methodology has been applied for national flood risk assessment, it has the potential to be implemented for wider use, including climate change impact assessment, nearshore wave climates for detailed local assessments and coastal flood forecasting.
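
    SWAN itself and the authors' emulator are not described in detail here, so the sketch below only illustrates the emulation idea under stated assumptions: a cheap stand-in function plays the role of the expensive wave transformation, a Gaussian process (one plausible emulator choice, via scikit-learn) is fitted to a small design of runs, and the emulator is then applied to thousands of Monte Carlo offshore events. All variable names, ranges and the stand-in transformation are invented.

```python
# Sketch of the emulator idea: fit a fast statistical surrogate to a small
# design of expensive model runs, then push thousands of Monte Carlo offshore
# events through it.  The "expensive_transformation" is a stand-in for SWAN.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def expensive_transformation(offshore):
    """Stand-in for a nearshore wave model: Hs, wind speed, sea level ->
    proxy for an overtopping-relevant nearshore wave height."""
    hs, wind, level = offshore.T
    return 0.6 * hs + 0.02 * wind + 0.3 * np.maximum(level, 0) + 0.05 * hs * level

# Small training design (these would be actual SWAN runs in practice).
design = rng.uniform([1.0, 5.0, -0.5], [8.0, 30.0, 2.0], size=(60, 3))
target = expensive_transformation(design)

kernel = ConstantKernel(1.0) * RBF(length_scale=[2.0, 10.0, 1.0])
emulator = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(design, target)

# Thousands of Monte Carlo offshore extreme events from the joint-probability model.
mc_events = rng.uniform([1.0, 5.0, -0.5], [8.0, 30.0, 2.0], size=(20_000, 3))
nearshore = emulator.predict(mc_events)

print("emulated nearshore values for", len(mc_events), "MC events;",
      "99th percentile =", round(float(np.quantile(nearshore, 0.99)), 2))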

  20. Contact of boundary-value problems and nonlocal problems in mathematical models of heat transfer

    Science.gov (United States)

    Lyashenko, V.; Kobilskaya, O.

    2015-10-01

    In this paper, mathematical models in the form of nonlocal problems for the two-dimensional heat equation are considered. The relation between a nonlocal problem and a boundary value problem that describe the same physical heating process is investigated. These problems arise in the study of the temperature distribution during annealing of a moving wire and strip heated by permanently or periodically operating internal and external heat sources. The first and second nonlocal problems in the moving domain are considered. The stability and convergence of numerical algorithms for the solution of a nonlocal problem with piecewise monotone functions in the equations and boundary conditions are investigated; the piecewise monotone functions characterize the heat sources and the heat transfer conditions at the boundaries of the studied domain. Numerical experiments are conducted and temperature distributions are plotted for operating internal and external heat sources. These experiments confirm the effectiveness of introducing nonlocal terms to describe the thermal processes. The expediency of applying nonlocal problems containing nonlocal conditions, namely thermal balance conditions, to such models is shown; this makes it possible to treat heat and mass transfer quantities, in particular the heat source and the concentration of the substance, as process control parameters. An illustrative formulation of such a problem is sketched below.
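
    The authors' exact problem statements are not reproduced in this record; the following is only an illustrative form of a two-dimensional heat equation with a nonlocal thermal-balance condition of the kind described, with all symbols (the diffusivity a, exchange coefficient alpha, prescribed heat content E(t)) introduced here for the sketch.

```latex
% Illustrative form only (not the authors' exact statement): a two-dimensional
% heat equation for a moving strip with a nonlocal thermal-balance condition
% replacing one of the pointwise boundary conditions.
\begin{align*}
  &\frac{\partial u}{\partial t}
    = a\left(\frac{\partial^{2} u}{\partial x^{2}}
            + \frac{\partial^{2} u}{\partial y^{2}}\right) + f(x,y,t),
    && (x,y)\in\Omega(t),\; t>0, \\
  &u(x,y,0) = u_{0}(x,y),
    && \lambda\,\frac{\partial u}{\partial n} + \alpha\,(u - u_{\mathrm{env}}) = 0
       \ \text{on part of } \partial\Omega(t), \\
  &\int_{\Omega(t)} u(x,y,t)\,dx\,dy = E(t)
    && \text{(nonlocal thermal-balance condition),}
\end{align*}
where $E(t)$ prescribes the heat content supplied by the internal and external
sources; replacing a pointwise boundary condition by this integral condition is
what turns the boundary-value problem into a nonlocal one.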

  1. Value of mink vomit model in study of anti-emetic drugs

    Institute of Scientific and Technical Information of China (English)

    Fang Zhang; Lei Wang; Zhi-Hong Yang; Zhan-Tao Liu; Wang Yue

    2006-01-01

    AIM: To establish a new, reliable vomit model in minks. METHODS: Adult male minks were randomly divided into 8 groups (n=6): a cisplatin (7.5 mg/kg) intraperitoneal injection (ip) group, a copper sulfate (40 mg/kg) intragastric injection (ig) group, an apomorphine (1.6 mg/kg) subcutaneous injection (sc) group, an 18 Gy whole-body X-irradiation group, an ondansetron group (2 mg/kg ip followed 30 min later by cisplatin 7.5 mg/kg ip), a normal saline (NS) ip control group, a metoclopramide group (4 mg/kg ip followed 30 min later by apomorphine 1.6 mg/kg sc), and an NS ig control group. The frequency of retching and vomiting was recorded. After the behavioral experiment, the distribution of 5-HT in the ileum was detected by an immunohistologic method. RESULTS: Cisplatin, apomorphine, copper sulfate and X-irradiation evoked a profound emetic response in the animals. However, retching and vomiting were significantly inhibited by pretreatment with ondansetron and metoclopramide in the cisplatin and copper sulfate groups (P=0.018). The immunohistologic results showed that 5-HT released from enterochromaffin cells (EC cells) was involved in the vomiting mechanism. CONCLUSION: The mink vomit model is of great value for studying the vomiting mechanism and screening new antiemetic drugs.

  2. Estimating the ratios of the stationary distribution values for Markov chains modeling evolutionary algorithms.

    Science.gov (United States)

    Mitavskiy, Boris; Cannings, Chris

    2009-01-01

    The evolutionary algorithm stochastic process is well-known to be Markovian. These have been under investigation in much of the theoretical evolutionary computing research. When the mutation rate is positive, the Markov chain modeling of an evolutionary algorithm is irreducible and, therefore, has a unique stationary distribution. Rather little is known about the stationary distribution. In fact, the only quantitative facts established so far tell us that the stationary distributions of Markov chains modeling evolutionary algorithms concentrate on uniform populations (i.e., those populations consisting of a repeated copy of the same individual). At the same time, knowing the stationary distribution may provide some information about the expected time it takes for the algorithm to reach a certain solution, assessment of the biases due to recombination and selection, and is of importance in population genetics to assess what is called a "genetic load" (see the introduction for more details). In the recent joint works of the first author, some bounds have been established on the rates at which the stationary distribution concentrates on the uniform populations. The primary tool used in these papers is the "quotient construction" method. It turns out that the quotient construction method can be exploited to derive much more informative bounds on ratios of the stationary distribution values of various subsets of the state space. In fact, some of the bounds obtained in the current work are expressed in terms of the parameters involved in all the three main stages of an evolutionary algorithm: namely, selection, recombination, and mutation.
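
    The quotient construction and bounds discussed in this record are not reproduced here; the sketch below is only a toy illustration of the one quantitative fact the abstract states, namely that the stationary distribution concentrates on uniform populations. It uses an invented minimal chain: a population of size 2 over a single bit, fitness-proportional selection (f(0)=1, f(1)=2) and bit-flip mutation, with the stationary distribution computed exactly from the transition matrix.

```python
# Toy illustration (not the paper's quotient construction): as the mutation
# rate shrinks, the exact stationary distribution of a tiny EA Markov chain
# concentrates on the uniform populations (all-0 or all-1).
import numpy as np
from math import comb

def transition_matrix(mu, pop=2, f0=1.0, f1=2.0):
    """States = number of 1-individuals in the population (0..pop)."""
    P = np.zeros((pop + 1, pop + 1))
    for k in range(pop + 1):
        p1 = k * f1 / (k * f1 + (pop - k) * f0)     # P(select a 1-parent)
        q = p1 * (1 - mu) + (1 - p1) * mu           # P(offspring is a 1)
        for k2 in range(pop + 1):
            P[k, k2] = comb(pop, k2) * q**k2 * (1 - q)**(pop - k2)
    return P

def stationary(P):
    w, v = np.linalg.eig(P.T)                       # left Perron eigenvector
    pi = np.real(v[:, np.argmax(np.real(w))])
    return pi / pi.sum()

for mu in (0.2, 0.05, 0.01):
    pi = stationary(transition_matrix(mu))
    print(f"mu={mu:>4}: pi={np.round(pi, 4)}, "
          f"mass on uniform populations={pi[0] + pi[-1]:.4f}")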

  3. Devising a model brand loyalty in tires industry: the adjustment role of customer perceived value

    Directory of Open Access Journals (Sweden)

    Davoud Feiz

    2015-06-01

    Full Text Available Today, brand issues receive considerable attention from companies and market agents; factors such as customers' brand loyalty affect the brand and increase sales and profit. The present paper studies the impact of brand experience, trust and satisfaction on loyalty to the Barez Tire brand in the city of Kerman and provides a model for this case. The research population consists of all Barez Tire consumers in Kerman; the sample size was 171, selected through simple random sampling. The data collection tool was a standard questionnaire whose reliability was measured with Cronbach's alpha. The research is applied in terms of purpose and descriptive-correlational in terms of data collection. To analyze the data, confirmatory factor analysis (CFA) and structural equation modeling (SEM) were performed in SPSS and LISREL. The findings indicate that brand experience, brand trust, and brand satisfaction significantly affect loyalty to the Barez Tire brand in Kerman. Notably, the impact of these factors is greater when the moderating role of perceived value is considered.

  4. Towards a Conceptual Model of HRSS: Leveraging Intellectual Capital Configuration to Create Value for End-Users

    OpenAIRE

    Meijerink, Jeroen; Bondarouk, Tanya; Looise, Jan Kees

    2009-01-01

    Human Resource Shared Services (HRSS) are established to reap the benefits of both centralization and decentralization through bundling intellectual capital and offering HR services that are adapted to the needs of clients and end-users. As a result, HRSS are believed to create value for end-users: employees, managers and HR professionals. However, our understanding of HRSS value creation is limited and therefore, this paper presents a conceptual model that explains value creation of HRSS. Th...

  5. Parameter values for epidemiological models of foot-and-mouth disease in swine

    Directory of Open Access Journals (Sweden)

    Amy C Kinsley

    2016-06-01

    Full Text Available In the event of a foot-and-mouth disease (FMD) incursion, response strategies are required to control, contain and eradicate the pathogen as efficiently as possible. Infectious disease simulation models are widely used tools that mimic disease dispersion in a population and that can be useful in the design and support of prevention and mitigation activities. However, there are often gaps in evidence-based research to supply models with quantities that are necessary to accurately reflect the system of interest. The objective of this study was to quantify values associated with the duration of the stages of FMD infection (latent period, subclinical period, incubation period, and duration of infection), the probability of transmission (within-herd and between-herd via spatial spread), and the diagnosis of a vesicular disease within a herd, using a meta-analysis of the peer-reviewed literature and expert opinion. The latent period ranged from 1 to 7 days and the incubation period ranged from 1 to 9 days; both were influenced by strain. In contrast, the subclinical period ranged from 0 to 6 days and was influenced by sampling method only. The duration of infection ranged from 1 to 10 days. The probability of spatial spread between an infected and a fully susceptible swine farm was estimated to be greatest within 5 km of the infected farm, highlighting the importance of possible long-range transmission through the movement of infected animals. Lastly, while most swine practitioners are confident in their ability to detect a vesicular disease in an average-sized swine herd, a small proportion expect that up to half of the herd would need to show clinical signs before detection via passive surveillance would occur. The results of this study will be useful in within- and between-herd simulation models to develop efficient response strategies in the event of an FMD incursion in swine populations of disease-free countries or regions.
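
    The study supplies parameter ranges rather than a model, so the following is only a minimal within-herd sketch showing where such ranges would plug in: per-animal stage durations are drawn from the reported orders of magnitude (latent 1-7 d, subclinical 0-6 d, infectious 1-10 d), while the herd size, transmission rate and detection threshold are invented assumptions, not values from the paper.

```python
# Minimal stochastic within-herd sketch using stage durations of the order
# reported; herd size, beta and the detection rule are illustrative only.
import numpy as np

rng = np.random.default_rng(7)

def simulate_herd(n=500, beta=0.8, days=60):
    """Chain-binomial SEIR-type simulation; returns daily clinical counts."""
    latent = rng.integers(1, 8, n)        # days exposed before infectious
    subclin = rng.integers(0, 7, n)       # infectious but not yet clinical
    infectious = rng.integers(1, 11, n)   # total infectious duration
    state = np.zeros(n, int)              # 0=S, 1=E, 2=I, 3=R
    clock = np.zeros(n, int)              # days since exposure
    state[0] = 1                          # one exposed animal seeds the herd
    clinical_by_day = []
    for _ in range(days):
        n_inf = np.sum(state == 2)
        p_inf = 1 - np.exp(-beta * n_inf / n)            # daily risk per susceptible
        new_exposed = (state == 0) & (rng.random(n) < p_inf)
        clock[state > 0] += 1
        state[(state == 2) & (clock >= latent + infectious)] = 3
        state[(state == 1) & (clock >= latent)] = 2
        state[new_exposed] = 1
        clock[new_exposed] = 0
        clinical_by_day.append(int(np.sum((state == 2) & (clock >= latent + subclin))))
    return np.array(clinical_by_day)

run = simulate_herd()
detect_threshold = 0.05 * 500             # e.g. passive detection at 5% clinical
first_detect = int(np.argmax(run >= detect_threshold)) if run.max() >= detect_threshold else None
print("peak clinical animals:", run.max(), "| day of passive detection:", first_detect)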

  6. Understanding Students' Motivation in Sport and Physical Education: From the Expectancy-Value Model and Self-Efficacy Theory Perspectives

    Science.gov (United States)

    Gao, Zan; Lee, Amelia M.; Harrison, Louis, Jr.

    2008-01-01

    In this article, the roles of individuals' expectancy beliefs and incentives (i.e., task value, outcome expectancy) in sport and physical education are examined from expectancy-value model and self-efficacy theory perspectives. Overviews of the two theoretical frameworks and the conceptual and measurement issues are provided, followed by a review…

  7. A Personal Value-Based Model of College Students' Aptitudes and Expected Choice Behavior Regarding Retailing Careers.

    Science.gov (United States)

    Shim, Soyeon; Warrington, Patti; Goldsberry, Ellen

    1999-01-01

    A study of 754 retail management students developed a value-based model of career attitude and expected choice behavior. Findings indicate that personal values had an influence on all aspects of retail career attitudes, which then had a direct effect on expected choice behavior. (Contains 55 references.) (Author/JOW)

  9. Does Student Sorting Invalidate Value-Added Models of Teacher Effectiveness? An Extended Analysis of the Rothstein Critique

    Science.gov (United States)

    Koedel, Cory; Betts, Julian R.

    2011-01-01

    Value-added modeling continues to gain traction as a tool for measuring teacher performance. However, recent research questions the validity of the value-added approach by showing that it does not mitigate student-teacher sorting bias (its presumed primary benefit). Our study explores this critique in more detail. Although we find that estimated…

  10. Assessing State Models of Value-Added Teacher Evaluations: Alignment of Policy, Instruments, and Literature-Based Concepts

    Science.gov (United States)

    Hadfield, Timothy E.; Hutchison-Lupardus, Tammy R.; Snyder, Jennifer E.

    2012-01-01

    This problem-based learning project addressed the need to improve the construction and implementation of value-added teacher evaluation policies and instruments. State officials are constructing value-added teacher evaluation models due to accountability initiatives, while ignoring the holes and problems in their implementation. The team's…

  11. Teaching about Values and Goals: Applications of the Circumplex Model to Motivation, Well-Being, and Prosocial Behavior

    Science.gov (United States)

    Kasser, Tim

    2014-01-01

    In this article, I review how people organize values and goals in their minds and I suggest teaching demonstrations, exercises, and assignments to help students learn this material. I pay special attention to circumplex models, which represent the extent of conflict or compatibility between values and goals, and to the well-researched distinction…

  12. A Clustering-Based Model-Building EA for Optimization Problems with Binary and Real-Valued Variables

    NARCIS (Netherlands)

    Sadowski, Krzysztof L.; Bosman, Peter A. N.; Thierens, Dirk

    2015-01-01

    We propose a novel clustering-based model-building evolutionary algorithm to tackle optimization problems that have both binary and real-valued variables. The search space is clustered every generation using a distance metric that considers binary and real-valued variables jointly in order to capture…

  13. Financial modeling in medicine: cash flow, basic metrics, the time value of money, discount rates, and internal rate of return.

    Science.gov (United States)

    Lexa, Frank James; Berlin, Jonathan W

    2005-03-01

    In this article, the authors cover tools for financial modeling. Commonly used time lines and cash flow diagrams are discussed. Commonly used but limited terms such as payback and breakeven are introduced. The important topics of the time value of money and discount rates are introduced to lay the foundation for their use in modeling and in more advanced metrics such as the internal rate of return. Finally, the authors broach the more sophisticated topic of net present value.
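
    The metrics the article introduces (discounting, net present value, internal rate of return) can be illustrated directly; the cash flows below are hypothetical and the IRR is found by a simple bisection, which is one straightforward way to solve NPV(rate) = 0.

```python
# Worked illustration of the metrics discussed: discounting, net present
# value and internal rate of return for a hypothetical investment.
def npv(rate, cash_flows):
    """cash_flows[0] occurs today; cash_flows[t] at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-8):
    """Bisection on NPV(rate) = 0; assumes one sign change in the cash flows."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# Hypothetical equipment purchase followed by five years of net cash inflows.
flows = [-500_000, 120_000, 150_000, 180_000, 200_000, 150_000]
print(f"NPV at 8% discount rate: {npv(0.08, flows):,.0f}")
print(f"IRR: {irr(flows):.2%}")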

  14. Value at risk estimation using independent component analysis-generalized autoregressive conditional heteroscedasticity (ICA-GARCH) models.

    Science.gov (United States)

    Wu, Edmond H C; Yu, Philip L H; Li, W K

    2006-10-01

    We suggest using independent component analysis (ICA) to decompose multivariate time series into statistically independent time series. Then, we propose to use ICA-GARCH models which are computationally efficient to estimate the multivariate volatilities. The experimental results show that the ICA-GARCH models are more effective than existing methods, including DCC, PCA-GARCH, and EWMA. We also apply the proposed models to compute value at risk (VaR) for risk management applications. The backtesting and the out-of-sample tests validate the performance of ICA-GARCH models for value at risk estimation.
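
    The authors' code and data are not available in this record, so the sketch below only outlines the ICA-GARCH idea under stated assumptions: scikit-learn's FastICA decomposes simulated multivariate returns into approximately independent components, a univariate GARCH(1,1) from the third-party `arch` package is fitted to each component, and the component variance forecasts are recombined through the mixing matrix into a one-day portfolio VaR. Package availability, the simulated data and the portfolio weights are all assumptions.

```python
# Sketch of the ICA-GARCH idea: ICA decomposition, per-component GARCH(1,1)
# variance forecasts, recombination into a portfolio VaR.  Assumes the
# scikit-learn and `arch` packages; the return data are simulated.
import numpy as np
from sklearn.decomposition import FastICA
from arch import arch_model
from scipy.stats import norm

rng = np.random.default_rng(3)
T, n_assets = 1500, 4
returns = rng.standard_t(df=6, size=(T, n_assets)) @ rng.uniform(0.2, 1.0, (n_assets, n_assets))

ica = FastICA(n_components=n_assets, random_state=0)
S = ica.fit_transform(returns)          # independent components, T x k
A = ica.mixing_                         # returns ~= S @ A.T + mean

# One-step-ahead variance forecast for each independent component.
h = np.empty(n_assets)
for j in range(n_assets):
    res = arch_model(S[:, j], mean="Zero", vol="Garch", p=1, q=1, rescale=False).fit(disp="off")
    h[j] = res.forecast(horizon=1).variance.values[-1, 0]

cov = A @ np.diag(h) @ A.T              # forecast covariance of asset returns
w = np.full(n_assets, 1 / n_assets)     # equally weighted portfolio (assumption)
sigma_p = np.sqrt(w @ cov @ w)
var_99 = norm.ppf(0.99) * sigma_p       # one-day 99% VaR, expressed as a loss
print(f"forecast portfolio sigma: {sigma_p:.4f}, 99% VaR: {var_99:.4f}")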

  15. A Value Model for Depressive Symptoms and Hopelessness among University Students in Turkey

    Science.gov (United States)

    Bilican, F. Isil; Yapici, Asim; Kutlu, M. Oguz

    2016-01-01

    This study aimed to examine which values predicted depressive symptoms and hopelessness in Turkey. While it was hypothesized that values emphasizing universalism, benevolence, conformity, security, tradition, spirituality, self-direction, and achievement would predict lower levels of depressive symptoms and hopelessness, those values emphasizing…

  16. Education in Crisis: A Value-Based Model of Education Provides Some Guidance

    Science.gov (United States)

    Sankar, Yassin

    2004-01-01

    Modern education is in a state of global crisis partially because of the absence of a value-based design of its strategic functions. Education affects the whole spectrum of human values, namely, creative, experiential, aesthetic, material, instrumental, ethical, social, and spiritual values. A student whose educational experience involves this…

  17. How One School Implements and Experiences Ohio's Value-Added Model: A Case Study

    Science.gov (United States)

    Quattrochi, David

    2009-01-01

    Ohio made value-added law in 2003 and incorporated value-added assessment to its operating standards for teachers and administrators in 2006. Value-added data is used to determine if students are making a year's growth at the end of each school year. Schools and districts receive a rating of "Below Growth, Met Growth, or Above Growth" on…

  18. Residents' values in a rational decision-making model: an interest in academics in emergency medicine.

    Science.gov (United States)

    Burkhardt, John Christian; Smith-Coggins, Rebecca; Santen, Sally

    2016-10-01

    Academic physicians train the next generation of doctors. It is important to understand the factors that lead residents to choose an academic career in order to continue to effectively recruit residents who will join the national medical faculty; a decision-making theory-driven, large-scale assessment of this process has not previously been undertaken. The objective was to examine the factors that predict an Emergency Medicine resident's interest in pursuing an academic career at the conclusion of training. The study employs the ABEM Longitudinal Survey (n = 365). A logistic regression model was estimated with interest in an academic career during residency as the dependent variable. Independent variables included gender, under-represented minority status, survey cohort, number of dependent children, possession of an advanced degree, ongoing research, publications, and the appeal of science, independence, and clinical work in choosing EM. Logistic regression resulted in a statistically significant model (p < 0.001). Residents who chose EM because of the appeal of science, who had peer-reviewed publications, and who had ongoing research were more likely to be interested in an academic career at the end of residency (p < 0.05), whereas an increased number of children (p < 0.05) was negatively associated with an interest in academics. Individual resident career interests, research productivity, and lifestyle can help predict an interest in pursuing an academic career. Recruitment and enrichment of residents who have similar values and behaviors should be considered by programs interested in generating more graduates who enter an academic career.

  19. Technological Innovation and Beyond: Exploring Public Value of University Inventions Based on Contingent Effectiveness Model

    DEFF Research Database (Denmark)

    Milana, Evita; Li-Ying, Jason; Faria, Lourenco

    2017-01-01

    University inventions are traditionally seen as significant input into the development of new technologies and innovations in the market, as they generate growth and regional development (REF). Yet these inventions, developed into new technologies, can simultaneously create public values such as those ... of university inventions. We define four main values: technological, economic, social and environmental, and place the latter two under the concept of public value. The aim of this paper is to expand the understanding of public value and incorporate it into the technology transfer literature. We assign ... to the concept of public value a measurement tool, thus making public value a measurable concept. Therefore, this study not only extends conceptual and theoretical considerations of public value (Jørgensen and Bozeman 2007), but it also provides evidence based on collected data. A unique data set from survey...

  20. No-Impact Threshold Values for NRAP's Reduced Order Models

    Energy Technology Data Exchange (ETDEWEB)

    Last, George V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Murray, Christopher J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brown, Christopher F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jordan, Preston D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sharma, Maneesh [West Virginia Univ., and National Energy Technlogy Lab., Morgantown, WV (United States)

    2013-02-01

    The purpose of this study was to develop methodologies for establishing baseline datasets and statistical protocols for determining statistically significant changes between background concentrations and the predicted concentrations used to represent a contamination plume in the Gen II models being developed by NRAP's Groundwater Protection team. The initial effort examined selected portions of two aquifer systems: the urban shallow unconfined aquifer system of the Edwards-Trinity Aquifer System (used to develop the ROM for carbonate-rock aquifers) and a portion of the High Plains Aquifer (an unconsolidated and semi-consolidated sand and gravel aquifer, used to develop the ROM for sandstone aquifers). Threshold values were determined for Cd, Pb, As, pH, and TDS that could be used to identify contamination due to predicted impacts from carbon sequestration storage reservoirs, based on recommendations found in the EPA's "Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities" (US Environmental Protection Agency 2009). Results from this effort can be used to inform a "no change" scenario with respect to groundwater impacts, rather than relying on an MCL that could be significantly higher than existing concentrations in the aquifer.

  1. Efficient Execution Methods of Pivoting for Bulk Extraction of Entity-Attribute-Value-Modeled Data.

    Science.gov (United States)

    Luo, Gang; Frey, Lewis J

    2016-03-01

    Entity-attribute-value (EAV) tables are widely used to store data in electronic medical records and clinical study data management systems. Before they can be used by various analytical (e.g., data mining and machine learning) programs, EAV-modeled data usually must be transformed into conventional relational table format through pivot operations. This time-consuming and resource-intensive process is often performed repeatedly on a regular basis, e.g., to provide a daily refresh of the content in a clinical data warehouse. Thus, it would be beneficial to make pivot operations as efficient as possible. In this paper, we present three techniques for improving the efficiency of pivot operations: 1) filtering out EAV tuples related to unneeded clinical parameters early on; 2) supporting pivoting across multiple EAV tables; and 3) conducting multi-query optimization. We demonstrate the effectiveness of our techniques through implementation. We show that our optimized execution method of pivoting using these techniques significantly outperforms the current basic execution method of pivoting. Our techniques can be used to build a data extraction tool to simplify the specification of and improve the efficiency of extracting data from the EAV tables in electronic medical records and clinical study data management systems.
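
    The paper's techniques operate at the database level and are not reproduced here; the pandas sketch below only conveys the pivot step the abstract describes, including filtering unneeded clinical parameters before pivoting and pivoting across more than one EAV table. All table and column names are invented.

```python
# Pandas sketch of the EAV-to-wide pivot: filter unneeded attributes early,
# combine two EAV tables, then pivot to one row per entity.
import pandas as pd

labs = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 3],
    "attribute":  ["glucose", "creatinine", "glucose", "sodium", "glucose"],
    "value":      [5.4, 88.0, 6.1, 139.0, 4.9],
})
vitals = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "attribute":  ["heart_rate", "heart_rate", "heart_rate"],
    "value":      [72, 85, 64],
})

needed = {"glucose", "sodium", "heart_rate"}          # 1) filter unneeded parameters
eav = pd.concat([labs, vitals], ignore_index=True)    # 2) pivot across multiple tables
eav = eav[eav["attribute"].isin(needed)]

wide = eav.pivot_table(index="patient_id", columns="attribute",
                       values="value", aggfunc="first")
print(wide)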

  2. The effect of rosemary preparations on the microbial quality and TBARS value of model pork batters

    Directory of Open Access Journals (Sweden)

    Elżbieta Hać-Szymańczuk

    2011-06-01

    Full Text Available Background. Rosemary (Rosmarinus officinalis L.) extracts have potent antioxidant and antibacterial activity and are widely used in the food industry. The effect of added rosemary preparations on the microbiological quality and lipid oxidation of model pork batters was analysed immediately after preparation ("0") and after 1, 3 and 7 days of chilled storage (4-6°C). Material and methods. Experiments were conducted with three types of rosemary preparations: the dried spice, the essential oil and a commercial preparation (TasteGuard P). The experimental material consisted of meat batter produced from porcine musculus longissimus dorsi and water. Microbiological examinations covered the total count of mesophilic aerobic microorganisms, psychrophilic bacteria, coliforms and enterococci; chemical analyses involved determination of the TBARS value. Results. The rosemary preparations did not exhibit antibacterial properties against either aerobic mesophilic or psychrophilic bacteria. The essential rosemary oil inhibited the growth of coliform bacteria and enterococci, whereas the dried spice increased the counts of aerobic mesophilic bacteria, coliforms and enterococci. None of the rosemary preparations halted the lipid oxidation process. Conclusions. The results point to the need for further investigations to determine the dose of rosemary preparations that would inhibit the growth of the microflora most frequently responsible for spoilage of raw materials and products while simultaneously restricting lipid oxidation.

  3. Linear and regressive stochastic models for prediction of daily maximum ozone values at Mexico City atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Bravo, J. L [Instituto de Geofisica, UNAM, Mexico, D.F. (Mexico); Nava, M. M [Instituto Mexicano del Petroleo, Mexico, D.F. (Mexico); Gay, C [Centro de Ciencias de la Atmosfera, UNAM, Mexico, D.F. (Mexico)

    2001-07-01

    We developed a procedure to forecast, 2 or 3 hours in advance, the daily maximum of surface ozone concentrations. It involves fitting Autoregressive Integrated Moving Average (ARIMA) models to daily maximum ozone concentrations at 10 atmospheric monitoring stations in Mexico City over a one-year period. A one-day forecast is made and then adjusted with the meteorological and solar radiation information acquired during the 3 hours preceding the expected occurrence of the maximum value. The relative importance for forecasting of the history of the process and of the meteorological conditions is evaluated. Finally, an estimate of the daily probability of exceeding a given ozone level is made.
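
    The stations' data and the fitted model orders are not given in this record, so the sketch below only shows the general shape of the approach under stated assumptions: an ARIMA model (order chosen arbitrarily here) fitted with statsmodels to a simulated daily-maximum series, a one-day-ahead forecast, and a normal-approximation exceedance probability for an illustrative threshold.

```python
# Sketch: fit an ARIMA model to a daily-maximum ozone series, make a
# one-day-ahead forecast and estimate an exceedance probability.
# Model order, threshold and data are illustrative; statsmodels is assumed.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from scipy.stats import norm

rng = np.random.default_rng(5)
t = np.arange(365)
ozone = 110 + 25 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 15, 365)  # hypothetical maxima

fit = ARIMA(ozone, order=(1, 0, 1)).fit()
fcast = fit.get_forecast(steps=1)
mean = float(fcast.predicted_mean[0])
se = float(fcast.se_mean[0])

threshold = 150.0                                  # illustrative contingency level
p_exceed = 1 - norm.cdf(threshold, loc=mean, scale=se)
print(f"one-day-ahead forecast: {mean:.1f} +/- {se:.1f}")
print(f"P(daily maximum > {threshold:.0f}) = {p_exceed:.3f}")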

  4. SIMILARITIES BETWEEN THE KNOWLEDGE CREATION AND CONVERSION MODEL AND THE COMPETING VALUES FRAMEWORK: AN INTEGRATIVE APPROACH

    Directory of Open Access Journals (Sweden)

    PAULO COSTA

    2016-12-01

    Full Text Available Alongside the successive paradigmatic revolutions that management has undergone since the XVII century, we are now witnessing a new era marked by a structural rupture in the way organizations are perceived. Market globalization, cemented by rapid technological evolution and associated with economic, cultural, political and social transformations, characterizes a reality in which uncertainty is the only certainty for organizations and managers. Knowledge management has been interpreted by managers and academics as a viable alternative in a logic of creating and preserving sustainable competitive advantages. However, there are several barriers to the implementation and development of knowledge management programs in organizations, with organizational culture being one of the most significant. In this article, we analyze and compare the Knowledge Creation and Conversion Model proposed by Nonaka and Takeuchi (1995) and Quinn and Rohrbaugh's Competing Values Framework (1983), since both have convergent conceptual lines that can help managers in different sectors guide their organizations towards productivity, quality and market competitiveness.

  5. A step function model to evaluate the real monetary value of man-sievert with real GDP

    Energy Technology Data Exchange (ETDEWEB)

    Na, Seong H. [Korea Institute of Nuclear Safety, 19 Guseong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of)], E-mail: shna@kins.re.kr; Kim, Sun G. [School of Business, Daejeon University, Yong Woon-dong, Dong-gu, Daejeon 300-716 (Korea, Republic of)], E-mail: sunkim@dju.ac.kr

    2009-07-15

    For use in a cost-benefit analysis to establish optimum levels of radiation protection in Korea under the ALARA principle, we introduce a discrete step function model that evaluates the man-sievert monetary value in real economic terms. The model formula, which is unique and country-specific, is composed of real GDP, the nominal risk coefficient for cancer and hereditary effects, the aversion factor against radiation exposure, and average life expectancy. Unlike previous research on alpha-value assessment, we derive different alpha values in real terms, differentiated with respect to the range of individual doses, which is more realistic and informative for application to radiation protection practice. GDP deflators reflect the economic situation of society. Finally, we suggest that the Korean model can be generalized simply to other countries without normalizing any country-specific factors.
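
    The paper's country-specific formula is not reproduced in this record; the expression below is only a generic illustrative form assembled from the ingredients the abstract lists (real GDP, nominal risk coefficient, aversion factor, life expectancy), with all symbols and the step structure introduced here as assumptions.

```latex
% Generic illustrative form only -- not the paper's formula.  g is real GDP
% per capita, r the nominal risk coefficient (detriment per Sv), a(d) a
% dose-band-dependent aversion factor, and L the average life expectancy
% (years assumed lost per attributable harm).
\[
  \alpha(d) \;\approx\; g \cdot L \cdot r \cdot a(d),
  \qquad
  a(d) \;=\;
  \begin{cases}
    a_1, & d < d_1,\\
    a_2, & d_1 \le d < d_2,\\
    a_3, & d \ge d_2,
  \end{cases}
\]
so that the monetary value of one man-sievert steps up with the individual-dose
band, which is what makes the model a step function of dose.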

  6. Non-Equidistant Multivariable Model MGRM(1,n) Based on Vector-Valued Continued Fraction and Reciprocal Accumulated Generating Operation

    Directory of Open Access Journals (Sweden)

    Youxin Luo

    2013-07-01

    Full Text Available Grey system theory is a scientific theory with wide adaptability for studying systems with poor information. The construction of the background value in the multivariable grey model was analyzed; a trapezoid formula and an extrapolation method using rational interpolation and numerical integration were proposed based on the theory of vector-valued continued fractions, and a non-equidistant multivariable grey model MGRM(1,n) was built by applying the reciprocal accumulated generating operation. The model is suitable for building both equidistant and non-equidistant models, broadens the application range of the grey model, and effectively increases both the fitting and the prediction precision of the model. The applicability and reliability of the model were proven with real cases.

  7. Assessment of the Value, Impact, and Validity of the Jobs and Economic Development Impacts (JEDI) Suite of Models

    Energy Technology Data Exchange (ETDEWEB)

    Billman, L.; Keyser, D.

    2013-08-01

    The Jobs and Economic Development Impacts (JEDI) models, developed by the National Renewable Energy Laboratory (NREL) for the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), use input-output methodology to estimate gross (not net) jobs and economic impacts of building and operating selected types of renewable electricity generation and fuel plants. This analysis provides the DOE with an assessment of the value, impact, and validity of the JEDI suite of models. While the models produce estimates of jobs, earnings, and economic output, this analysis focuses only on jobs estimates. This validation report includes an introduction to JEDI models, an analysis of the value and impact of the JEDI models, and an analysis of the validity of job estimates generated by JEDI model through comparison to other modeled estimates and comparison to empirical, observed jobs data as reported or estimated for a commercial project, a state, or a region.

  8. Using the Expectancy Value Model of Motivation to Understand the Relationship between Student Attitudes and Achievement in Statistics

    Science.gov (United States)

    Hood, Michelle; Creed, Peter A.; Neumann, David L.

    2012-01-01

    We tested a model of the relationship between attitudes toward statistics and achievement based on Eccles' Expectancy Value Model (1983). Participants (n = 149; 83% female) were second-year Australian university students in a psychology statistics course (mean age = 23.36 years, SD = 7.94 years). We obtained demographic details, past performance,…

  9. Cross-National Validation of Prognostic Models Predicting Sickness Absence and the Added Value of Work Environment Variables

    NARCIS (Netherlands)

    Roelen, Corne A. M.; Stapelfeldt, Christina M.; Heymans, Martijn W.; van Rhenen, Willem; Labriola, Merete; Nielsen, Claus V.; Bultmann, Ute; Jensen, Chris

    2015-01-01

    Purpose To validate Dutch prognostic models including age, self-rated health and prior sickness absence (SA) for their ability to predict high SA in Danish eldercare. The added value of work environment variables to the models' risk discrimination was also investigated. Methods 2,562 municipal eldercare workers

  10. INFLUENCE OF VALUES ON LEADERSHIP STYLES: AN ANALYSIS ACCORDING TO BASS’ TRANSFORMATIONAL-TRANSACTIONAL LEADERSHIP MODEL

    Directory of Open Access Journals (Sweden)

    MARTÍN NADER

    2007-11-01

    Full Text Available Data are presented from a study of 224 lower- and higher-level leaders (142 men and 84 women) residing in Buenos Aires and working in small, medium-sized and large companies; its primary aim was to determine the influence that the leader's values exert on leadership style. Self-transcendence and openness-to-change values predicted a transformational leadership style, whereas self-promotion values predicted a transactional leadership style. No relationship was found between conservation values and the transactional leadership style.

  11. A novel model for cost performance evaluation of pulverized coal injected into blast furnace based on effective calorific value

    Institute of Scientific and Technical Information of China (English)

    徐润生; 张建良; 左海滨; 李克江; 宋腾飞; 邵久刚

    2015-01-01

    The combustion of pulverized coal injected into a blast furnace involves many physical and chemical reactions. Based on the combustion behavior of pulverized coal, the concept of the coal effective calorific value, representing the actual thermal energy provided to the blast furnace, was proposed, and a cost-performance evaluation model of coal injection was built for the optimal selection of coals based on the effective calorific value. The model contains two indicators: the coal effective calorific value, which has eight sub-indicators, and the coal injection cost, which includes four sub-indicators. In addition, the calculation principle and the application of the cost-performance evaluation model in a large Chinese iron and steel company are comprehensively introduced. The evaluation results confirm that this novel model is of great significance for the optimal selection of blast furnace pulverized coal.

  12. Global sampling to assess the value of diverse observations in conditioning a real-world groundwater flow and transport model

    Science.gov (United States)

    Delsman, Joost R.; Winters, Pieter; Vandenbohede, Alexander; Oude Essink, Gualbert H. P.; Lebbe, Luc

    2016-03-01

    The use of additional types of observational data has often been suggested to alleviate the ill-posedness inherent to parameter estimation of groundwater models and constrain model uncertainty. Disinformation in observational data caused by errors in either the observations or the chosen model structure may, however, confound the value of adding observational data in model conditioning. This paper uses the global generalized likelihood uncertainty estimation methodology to investigate the value of different observational data types (heads, fluxes, salinity, and temperature) in conditioning a groundwater flow and transport model of an extensively monitored field site in the Netherlands. We compared model conditioning using the real observations to a synthetic model experiment, to demonstrate the possible influence of disinformation in observational data in model conditioning. Results showed that the value of different conditioning targets was less evident when conditioning to real measurements than in a measurement error-only synthetic model experiment. While in the synthetic experiment, all conditioning targets clearly improved model outcomes, minor improvements or even worsening of model outcomes was observed for the real measurements. This result was caused by errors in both the model structure and the observations, resulting in disinformation in the observational data. The observed impact of disinformation in the observational data reiterates the necessity of thorough data validation and the need for accounting for both model structural and observational errors in model conditioning. It further suggests caution when translating results of synthetic modeling examples to real-world applications. Still, applying diverse conditioning data types was found to be essential to constrain uncertainty, especially in the transport of solutes in the model.
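
    The real variable-density groundwater model cannot be reproduced here, so the sketch below only illustrates the global GLUE-style conditioning the abstract describes, with a made-up forward model and invented parameter names: sample parameter sets globally, score each against several observation types (head, flux, salinity), combine the scores, and retain the "behavioral" sets that constrain the parameter ranges.

```python
# Toy illustration of GLUE-style conditioning with multiple observation types.
# The "toy_model", observations and error assumptions are all invented.
import numpy as np

rng = np.random.default_rng(11)

def toy_model(k, rech):
    """Stand-in forward model returning (head, flux, salinity) predictions."""
    head = 2.0 + 5.0 * rech / k
    flux = k * rech * 10.0
    salinity = 30.0 / (1.0 + k)
    return np.array([head, flux, salinity])

obs = np.array([4.0, 6.0, 10.0])               # observed head, flux, salinity
obs_sigma = np.array([0.3, 1.0, 2.0])          # assumed observation errors

n = 50_000
params = np.column_stack([10 ** rng.uniform(-1, 1, n),   # conductivity k
                          rng.uniform(0.1, 2.0, n)])     # recharge

sims = np.array([toy_model(k, r) for k, r in params])
lik = np.exp(-0.5 * ((sims - obs) / obs_sigma) ** 2)      # per-type pseudo-likelihood
combined = lik.prod(axis=1)                               # combine the data types

behavioral = combined > 0.01 * combined.max()             # GLUE acceptance threshold
print(f"{behavioral.sum()} of {n} parameter sets are behavioral")
for name, col in zip(["k", "recharge"], params[behavioral].T):
    print(f"  {name}: 5-95% range {np.percentile(col, 5):.2f} - {np.percentile(col, 95):.2f}")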

  13. Predicting Calcium Values for Gastrointestinal Bleeding Patients in Intensive Care Unit Using Clinical Variables and Fuzzy Modeling

    Directory of Open Access Journals (Sweden)

    G Khalili-Zadeh-Mahani

    2016-07-01

    Full Text Available Introduction: Reducing unnecessary laboratory tests is an essential issue in the Intensive Care Unit, and one solution is to predict the value of a laboratory test in order to decide whether ordering it is necessary. The aim of this paper was to propose a clinical decision support system for predicting laboratory test values. Calcium laboratory tests of three categories of patients, namely upper gastrointestinal bleeding, lower gastrointestinal bleeding, and unspecified hemorrhage of the gastrointestinal tract, were selected as the case studies for this research. Method: The data were collected from the MIMIC-II database. For predicting calcium laboratory values, a fuzzy Takagi-Sugeno model is used, with heart rate and the previous calcium value as the input variables. Results: The results showed that the calcium laboratory values of the studied patients were predictable with acceptable accuracy; on average, the mean absolute errors of the system for the three categories of patients were 0.27, 0.29, and 0.28, respectively. Conclusion: Using fuzzy modeling and the two variables of heart rate and previous calcium value, a clinical decision support system was proposed for predicting the laboratory values of three categories of patients with gastrointestinal bleeding. With these two clinical values as input variables, the results obtained were acceptable and showed the capability of the proposed system in predicting calcium laboratory values. For better results, the impact of more input variables should be studied. Since the proposed system predicts the laboratory values rather than just the necessity of the laboratory tests, it is more general than previous studies and lets specialists make decisions depending on the condition of each patient.

  14. Modelo de previsão de value at risk utilizando volatilidade de longo prazo = Value at Risk prediction model using long term volatility

    Directory of Open Access Journals (Sweden)

    Vinicius Mothé Maia

    2016-07-01

    Full Text Available Given the importance of Value at Risk (VaR) as a risk measure for financial institutions and rating agencies, this study evaluated whether the ARLS model is more accurate than traditional models for calculating long-term VaR, given its greater suitability for volatility forecasting. Since market agents use VaR as a risk measure for portfolio management, its adequate measurement is important. Using daily data from the stock and foreign-exchange markets of the BRICS countries (Brazil, Russia, India, China and South Africa), future volatilities were calculated for 15 days, 1 month and 3 months, and the traditional measures of VaR accuracy were then computed. The results suggest the superiority of the ARLS model for forecasting exchange-rate volatility, correctly predicting the number of violations in 33% of the cases, whereas the traditional models did not perform well. For the stock market, the GARCH and ARLS models performed similarly, with GARCH superior in terms of mean squared loss. These results point to the ARLS model for calculating the VaR of foreign-exchange portfolios because of its greater accuracy, thereby helping market agents to better manage the risk of their portfolios. For the stock market, given the similar performance of the GARCH and ARLS models, the GARCH model is recommended because of its greater simplicity and ease of computational implementation.
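
    The ARLS model itself is not described in this record, so the sketch below only illustrates the violation-count backtest the abstract refers to: count the days on which the realised loss exceeds the forecast VaR and compare the count with its expected value using Kupiec's proportion-of-failures likelihood ratio. The returns and VaR forecasts are simulated placeholders, not output of the ARLS model.

```python
# Sketch of a VaR violation-count backtest with Kupiec's proportion-of-failures
# (POF) test.  Returns and VaR forecasts below are simulated for illustration.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(42)
T, p = 250, 0.01                              # 250 trading days, 99% VaR
returns = rng.normal(0, 0.012, T)             # simulated daily returns
var_forecast = np.full(T, 0.028)              # simulated 99% VaR (positive loss)

x = int(np.sum(-returns > var_forecast))      # observed violations
expected = p * T

# Kupiec POF likelihood ratio (clip to guard the x = 0 and x = T edge cases).
pi_hat = max(min(x / T, 1 - 1e-12), 1e-12)
lr = -2 * ((T - x) * np.log(1 - p) + x * np.log(p)
           - (T - x) * np.log(1 - pi_hat) - x * np.log(pi_hat))
p_value = 1 - chi2.cdf(lr, df=1)

print(f"violations: {x} (expected {expected:.1f}), Kupiec LR = {lr:.2f}, p = {p_value:.3f}")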

  15. Model for determining the change in value of the forest capital during the financial year. Modell för beräkning av skogskapitalets värdeförändring under räkenskapsåret

    Energy Technology Data Exchange (ETDEWEB)

    Haegg, A.

    1993-01-01

    The annual report of a forestry district usually states an operating profit, expressed in monetary units, and a return figure, expressed as a percentage of the capital employed. If the operating profit from logging and silviculture in any given year is to reflect the true profit generated by the district, the profit must equal the value increment of that year. Any difference amounts to a change in the value of the forest capital. The aim of the study is to present a model for calculating the annual value increment and the operating profit from logging and silviculture, expressed in commensurable units. Thus, the model will make it possible to compare these quantities. The model presupposes the existence of a computer based, regularly updated, stand register. The problem analysis carried out has led to the following conclusions: The value increment of the stand ensues from an increase in its anticipated value. It is, accordingly, a calculated value; At the start, the entire holding must be valued stand-by-stand; In the valuations, the real discount rate, as well as in- and outpayments, should be adjusted for taxes; The value changes of the forest capital during the year should be based upon the calculated net profit, not upon the actual profit; and Actually existing stand registers can be used as a basis for the model, even if they do not explicitly state all the data required by the model.
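
    The core calculation described here, the year's value increment as the growth in a stand's anticipated (discounted) value, can be illustrated with a toy example. The cash flows, horizon and 3% real rate below are invented, and taxes are ignored for brevity; this is only a sketch of the idea, not the report's model.

```python
def anticipated_value(net_payments, real_rate):
    """Present value of a stand's expected future net payments (its anticipated value)."""
    return sum(p / (1 + real_rate) ** t for t, p in net_payments)

# Hypothetical stand: thinning revenue in 10 years, final felling in 30 years, 3% real rate
plan_start_of_year = [(10, 12000), (30, 80000)]
plan_end_of_year = [(9, 12000), (29, 80000)]    # one year closer to the same harvests
v0 = anticipated_value(plan_start_of_year, 0.03)
v1 = anticipated_value(plan_end_of_year, 0.03)
print(round(v1 - v0, 2))   # the year's calculated value increment for this stand
```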

  16. Plug-In Hybrid Electric Vehicle Value Proposition Study: Phase 1, Task 2: Select Value Propositions/Business Model for Further Study

    Energy Technology Data Exchange (ETDEWEB)

    Sikes, Karen R [ORNL; Markel, Lawrence C [ORNL; Hadley, Stanton W [ORNL; Hinds, Shaun [Sentech, Inc.

    2008-04-01

    The Plug-In Hybrid Electric Vehicle (PHEV) Value Propositions Workshop held in Washington, D.C. in December 2007 served as the Task 1 Milestone for this study. Feedback from all five Workshop breakout sessions has been documented in a Workshop Summary Report, which can be found at www.sentech.org/phev. In this report, the project team compiled and presented a comprehensive list of potential value propositions that would later serve as a 'grab bag' of business model components in Task 2. After convening with the Guidance and Evaluation Committee and other PHEV stakeholders during the Workshop, several improvements to the technical approach were identified and incorporated into the project plan to present a more realistic and accurate case study and evaluation. The assumptions and modifications that will have the greatest impact on the case study selection process in Task 2 are described in more detail in this deliverable. The objective of Task 2 is to identify the combination of value propositions that is believed to be achievable by 2030 and collectively hold promise for a sustainable PHEV market by 2030. This deliverable outlines what the project team (with input from the Committee) has defined as its primary scenario to be tested in depth for the remainder of Phase 1. Plans for the second and third highest priority/probability business scenarios are also described in this deliverable as proposed follow up case studies in Phase 2. As part of each case study description, the proposed utility system (or subsystem), PHEV market segment, and facilities/buildings are defined.

  17. Estimated Nutritive Value of Low-Price Model Lunch Sets Provided to Garment Workers in Cambodia

    Directory of Open Access Journals (Sweden)

    Jan Makurat

    2017-07-01

    Full Text Available Background: The establishment of staff canteens is expected to improve the nutritional situation of Cambodian garment workers. The objective of this study is to assess the nutritive value of low-price model lunch sets provided at a garment factory in Phnom Penh, Cambodia. Methods: Exemplary lunch sets were served to female workers through a temporary canteen at a garment factory in Phnom Penh. Dish samples were collected repeatedly to examine mean serving sizes of individual ingredients. Food composition tables and NutriSurvey software were used to assess mean amounts and contributions to recommended dietary allowances (RDAs) or adequate intake of energy, macronutrients, dietary fiber, vitamin C (VitC), iron, vitamin A (VitA), folate and vitamin B12 (VitB12). Results: On average, lunch sets provided roughly one third of the RDA or adequate intake of energy, carbohydrates, fat and dietary fiber. Contribution to the RDA of protein was high (46% RDA). The sets contained a high mean share of VitC (159% RDA), VitA (66% RDA), and folate (44% RDA), but were low in VitB12 (29% RDA) and iron (20% RDA). Conclusions: Overall, lunches satisfied recommendations of caloric content and macronutrient composition. Sets on average contained a beneficial amount of VitC, VitA and folate. Adjustments are needed for a higher iron content. Alternative iron-rich foods are expected to be better suited, compared to increasing portions of costly meat/fish components. Lunch provision at Cambodian garment factories holds the potential to improve food security of workers, approximately at costs of <1 USD/person/day at large scale. Data on quantitative total dietary intake as well as physical activity among workers are needed to further optimize the concept of staff canteens.
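
    The percentage contributions quoted above are simple ratios of the amount served to the reference allowance; the snippet below shows the arithmetic with invented lunch totals and placeholder reference values (not the study's food-composition data).

```python
# Hypothetical nutrient totals for one lunch set and illustrative reference allowances
lunch = {"energy_kcal": 750, "protein_g": 23, "vitC_mg": 120, "iron_mg": 5.9}
reference = {"energy_kcal": 2200, "protein_g": 50, "vitC_mg": 75, "iron_mg": 29.4}

for nutrient, amount in lunch.items():
    share = 100 * amount / reference[nutrient]
    print(f"{nutrient}: {share:.0f}% of the reference allowance")
```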

  18. Model Development of Isan Country Song Compositions for Economic, Social and Cultural Value-Added

    Directory of Open Access Journals (Sweden)

    Nipinth Suwanrong

    2010-01-01

    Full Text Available Problem statement: The Country Song is related to history, society, culture and economics. The objectives of this study were: (1) the historical background of composing the Isan Country Song, (2) the current situation and problems of the composing style of the Isan Country Song and (3) the development of a composing pattern for the Isan Country Song for elevating its economic value. Approach: The research area consisted of the Isan Region, including Chiaya Phume, Ubon Rachatani, Amnat Charoen and Sri-Sa-ket Provinces. The samples providing information included 170 persons. The instruments used for collecting data included the Survey Form, Interview Form, Focus Group Discussion and Participatory Workshop. The data were classified into groups. Qualitative data were analyzed according to the specified objectives. The research findings were presented in descriptive analysis. Results: The research findings showed that the historical background of the Isan Country Song reflected lifestyle, social situation, ideals, and Thai culture, with the rhyme developed from the traditional Country Song blending with Big Brand Music Band and mixing with local singing words and rhythm. Regarding the current situation and problems, the Country Song had become increasingly popular, since there were many song composers and chords. There were advertisements through media such as radio, television, mobile phone and the internet. Therefore, the buying and selling of songs were easier and quicker. In song composing, the text of a song did not focus on the rhyme and lacked morality enhancement. The supplementary music sometimes lacked beauty based on aesthetic principles. As for commercial problems, there were many violations of rights. Consequently, the entrepreneur sometimes faced losses. For development, the good points of the Isan Country Song were analyzed, in both text and rhyme. The composed song model was created by bringing the Pentatonic scale mixing with message of Dorian Mode. The

  19. Animating Research with Counseling Values: A Training Model to Address the Research-to-Practice Gap

    Science.gov (United States)

    Lee, Kristi A.; Dewell, John A.; Holmes, Courtney M.

    2014-01-01

    The persistent research-to-practice gap poses a problem for counselor education. The gap may be caused by conflicts between the humanistic values that guide much of counseling and the values that guide research training. In this article, the authors address historical concerns regarding research training for students and the conducting of research…

  20. Risk model for suspected acute coronary syndrome is of limited value in an emergency department

    DEFF Research Database (Denmark)

    Mogensen, Christian Backer; Christiansen, Maja; Jørgensen, Jess Bjerre;

    2015-01-01

    if assigned to the high-risk group was 3.0. Allocation to the high-risk group, male gender and age above 60 years was associated with a higher risk of ACS. For patients fulfilling the high-risk definition, sensitivity was 71%, specificity 55%, negative predictive value 90% and positive predictive value 24...
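
    The quantities quoted for the high-risk rule (sensitivity, specificity, negative and positive predictive value) all come from a 2x2 table of risk assignment against confirmed ACS; a small sketch of that calculation, with invented counts rather than the study's data, is shown below.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 classification table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts for a high-risk triage rule (not the study's actual table)
print(diagnostic_metrics(tp=50, fp=160, fn=20, tn=200))
```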

  1. Business value and performance of a company : Choice for methodology and model sets

    NARCIS (Netherlands)

    Vet, van der M.; Hajdasinski, A.K.

    2009-01-01

    This paper is a continuation of the first article “Business Value Creation of (Transformational) Outsourcing, a Performance Based Approach” published in 2008 (Nyenrode Research Institute). In this paper the authors would like to discuss the distinction between the Value of a Company and the

  2. Design thinking to enhance the sustainable business modelling process - A workshop based on a value mapping process

    OpenAIRE

    Geissdoerfer, Martin; Bocken, Nancy M. P.; Hultink, Erik Jan

    2016-01-01

    This is the author accepted manuscript. The final version is available from Elsevier via http://dx.doi.org/10.1016/j.jclepro.2016.07.020 Sustainable business model innovation is an emerging topic, but only few tools are currently available to assist companies in sustainable business modelling. This paper works towards closing this gap by bringing together ‘design thinking’ and ‘sustainable business model innovation’ to refine the creative process of developing sustainable value proposition...

  3. A new fuzzy regression model based on interval-valued fuzzy neural network and its applications to management

    Directory of Open Access Journals (Sweden)

    Somaye Yeylaghi

    2017-06-01

    Full Text Available In this paper, a novel hybrid method based on an interval-valued fuzzy neural network for approximating interval-valued fuzzy regression models is presented. The work of this paper is an expansion of research on real fuzzy regression models. In this paper the interval-valued fuzzy neural network (IVFNN) can be trained with crisp and interval-valued fuzzy data. Here a neural network is considered as a part of a large field called neural computing or soft computing. Moreover, in order to find the approximate parameters, a simple algorithm based on the cost function of the fuzzy neural network is proposed. Finally, we illustrate our approach by some numerical examples and compare this method with existing methods.
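
    To show what fitting an interval-valued regression can look like in the simplest case, the sketch below represents each interval output by a centre and a radius, both modelled as linear in the input, and fits them by plain gradient descent. This is a stand-in for the paper's interval-valued fuzzy neural network, intended only to illustrate the centre/radius representation; the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 50)
y_centre = 2.0 * x + 1.0 + rng.normal(0, 0.3, 50)     # interval midpoints
y_radius = 0.5 + 0.1 * x + rng.normal(0, 0.05, 50)    # interval half-widths

w = np.zeros(4)            # [a_c, b_c, a_r, b_r]: centre and radius, each linear in x
lr = 0.01
for _ in range(10000):
    c_hat, r_hat = w[0] * x + w[1], w[2] * x + w[3]
    grad_c, grad_r = c_hat - y_centre, r_hat - y_radius
    w -= lr * np.array([np.mean(grad_c * x), np.mean(grad_c),
                        np.mean(grad_r * x), np.mean(grad_r)])
print(w)   # approximately recovers the generating coefficients [2.0, 1.0, 0.1, 0.5]
```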

  4. Gender-based model comparisons of maternal values, monitoring, communication, and early adolescent risk behavior.

    Science.gov (United States)

    Cottrell, Lesley; Yu, Shuli; Liu, Hongjie; Deveaux, Lynette; Lunn, Sonja; Bain, Rosa Mae; Stanton, Bonita

    2007-10-01

    To examine the relationships among maternal values, monitoring knowledge, parent-adolescent communication, and adolescent risk involvement based on adolescent gender. Parental reports of their personal values, monitoring knowledge, and communication with their children were compared with adolescent reports of risk involvement using information gathered from 647 Bahamian mother-adolescent (9-13 years) dyads. Parental values of conservation (e.g., conformity) were positively associated with greater parent-adolescent communication and communication was significantly associated with greater monitoring knowledge for both genders. Among mother-son dyads only, group-based parental values of self-transcendence (e.g., universalism) were significantly associated with greater perceived parental monitoring knowledge; individualized self-enhancement values (e.g., hedonism) were negatively associated with open and supportive parent-adolescent communication. Parental values influence other parenting processes such as monitoring and communication. Parental monitoring, in turn, inversely influences adolescent risk involvement. These influences appear to differ based on the adolescent's gender, as many of the relationships were stronger among mother-son dyads. These findings highlight a need to better understand the nature of the relationship between maternal values, parent-adolescent interactions, and adolescents' risk decisions.

  5. Client value models provide a framework for rational library planning (or, phrasing the answer in the form of a question).

    Science.gov (United States)

    Van Moorsel, Guillaume

    2005-01-01

    Libraries often do not know how clients value their product/ service offerings. Yet at a time when the mounting costs for library support are increasingly difficult to justify to the parent institution, the library's ability to gauge the value of its offerings to clients has never been more critical. Client Value Models (CVMs) establish a common definition of value elements-or a "value vocabulary"-for libraries and their clients, thereby providing a basis upon which to make rational planning decisions regarding product/service acquisition and development. The CVM concept is borrowed from business and industry, but its application has a natural fit in libraries. This article offers a theoretical consideration and practical illustration of CVM application in libraries.

  6. The value of value congruence.

    Science.gov (United States)

    Edwards, Jeffrey R; Cable, Daniel M

    2009-05-01

    Research on value congruence has attempted to explain why value congruence leads to positive outcomes, but few of these explanations have been tested empirically. In this article, the authors develop and test a theoretical model that integrates 4 key explanations of value congruence effects, which are framed in terms of communication, predictability, interpersonal attraction, and trust. These constructs are used to explain the process by which value congruence relates to job satisfaction, organizational identification, and intent to stay in the organization, after taking psychological need fulfillment into account. Data from a heterogeneous sample of employees from 4 organizations indicate that the relationships that link individual and organizational values to outcomes are explained primarily by the trust that employees place in the organization and its members, followed by communication, and, to a lesser extent, interpersonal attraction. Polynomial regression analyses reveal that the relationships emanating from individual and organizational values often deviated from the idealized value congruence relationship that underlies previous theory and research. The authors' results also show that individual and organizational values exhibited small but significant relationships with job satisfaction and organizational identification that bypassed the mediators in their model, indicating that additional explanations of value congruence effects should be pursued in future research. (c) 2009 APA, all rights reserved.
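
    The polynomial regression the abstract refers to is a response-surface regression of the outcome on individual values (I), organisational values (O) and their second-order terms; a congruence effect shows up in the second-order coefficients. A simulated illustration, not the authors' data or estimates:

```python
import numpy as np

rng = np.random.default_rng(3)
I = rng.normal(0, 1, 300)                    # individual values
O = rng.normal(0, 1, 300)                    # organisational values
outcome = 0.3 * I + 0.3 * O - 0.4 * (I - O) ** 2 + rng.normal(0, 0.5, 300)

# Second-order (response-surface) design matrix: 1, I, O, I^2, I*O, O^2
X = np.column_stack([np.ones_like(I), I, O, I ** 2, I * O, O ** 2])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(coef)   # congruence appears as negative I^2 and O^2 terms with a positive I*O term
```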

  7. Applying Ethnic Equivalence and Cultural Values Models to African-American Teens' Perceptions of Parents.

    Science.gov (United States)

    Lamborn, Susie D.; Felbab, Amanda J.

    2003-01-01

    Study evaluated both the parenting styles and family ecologies models with interview responses from African American adolescents. Analyses contrasted each model with a joint model for predicting self esteem, self reliance, work orientation, and ethnic identity. Overall, findings suggest that a joint model that combines elements from both models…

  9. Application of risk-rated profit model functions in estimation of economic values for indigenous chicken breeding.

    Science.gov (United States)

    Okeno, Tobias O; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J

    2012-08-01

    The economic values for productive (egg number, average daily gain, live weight, and mature weight) and functional (fertility, hatchability, broodiness, survival rate, feed intake, and egg weight) traits were derived for three production systems utilizing indigenous chicken in Kenya. The production systems considered were free-range, semi-intensive, and intensive system and were evaluated based on fixed flock size and fixed feed resource production circumstances. A bio-economic model that combined potential performances, feeding strategies, optimum culling strategies, farmer's preferences and accounted for imperfect knowledge concerning risk attitude of farmers and economic dynamics was employed to derive risk-rated economic values. The economic values for all the traits were highest in free-range system under the two production circumstances and decreased with level of intensification. The economic values for egg number, average daily gain, live weight, fertility, hatchability, and survival rate were positive while those for mature weight, broodiness, egg weight, and feed intake were negative. Generally, the economic values estimated under fixed feed resource production circumstances were higher than those derived under fixed flock size. The difference between economic values estimated using simple (traditional) and risk-rated profit model functions ranged from -47.26% to +67.11% indicating that inclusion of risks in estimation of economic values is important. The results of this study suggest that improvement targeting egg number, average daily gain, live weight, fertility, hatchability, and survival rate would have a positive impact on profitability of indigenous chicken production in Kenya.
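
    An economic value in this sense is the marginal change in flock profit per unit change in a trait, holding the other traits constant. The toy profit function and prices below are invented and ignore the risk rating and feeding strategies of the actual bio-economic model; they only show the mechanics of the derivation.

```python
def profit(traits):
    """Toy flock-level profit: egg and meat revenue, scaled by survival, minus feed cost."""
    egg_revenue = traits["egg_number"] * 0.08      # assumed price per egg
    meat_revenue = traits["live_weight"] * 2.5     # assumed price per kg live weight
    feed_cost = traits["feed_intake"] * 0.30       # assumed cost per kg feed
    return traits["survival_rate"] * (egg_revenue + meat_revenue) - feed_cost

def economic_value(traits, trait, delta=1e-3):
    """Marginal profit change per unit increase in one trait (others held constant)."""
    bumped = dict(traits, **{trait: traits[trait] + delta})
    return (profit(bumped) - profit(traits)) / delta

base = {"egg_number": 120, "live_weight": 1.6, "feed_intake": 45, "survival_rate": 0.85}
for t in base:
    print(t, round(economic_value(base, t), 3))
```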

  10. Evaluating the Value of High Spatial Resolution in National Capacity Expansion Models using ReEDS: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat; Cole, Wesley

    2016-07-01

    Power sector capacity expansion models (CEMs) have a broad range of spatial resolutions. This paper uses the Regional Energy Deployment System (ReEDS) model, a long-term national scale electric sector CEM, to evaluate the value of high spatial resolution for CEMs. ReEDS models the United States with 134 load balancing areas (BAs) and captures the variability in existing generation parameters, future technology costs, performance, and resource availability using very high spatial resolution data, especially for wind and solar modeled at 356 resource regions. In this paper we perform planning studies at three different spatial resolutions--native resolution (134 BAs), state-level, and NERC region level--and evaluate how results change under different levels of spatial aggregation in terms of renewable capacity deployment and location, associated transmission builds, and system costs. The results are used to ascertain the value of high geographically resolved models in terms of their impact on relative competitiveness among renewable energy resources.

  11. USING OF NET PRESENT VALUE (NPV) TO TEST THE INTEGRATED MODEL IN BUILDING MANAGEMENT INFORMATION SYSTEMS

    OpenAIRE

    Omar, Mohammad; Abdullah, Khairul

    2017-01-01

    The integrated model is a new model that was recently developed in order to build management information systems (MISs) using the classical approach to system development methodology. The integrated model aims to address the drawbacks of the classical approach, namely the additional time and cost consumed while building MISs. The integrated model was subjected to two tests using mathematical probability theories in order to ensure the validity of the integrated model in it...
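
    Net present value itself is a one-line calculation: each cash flow is discounted back to time zero and the results are summed. The cash flows and 10% rate below are hypothetical and only illustrate how two development approaches might be compared on NPV.

```python
def npv(rate, cash_flows):
    """Net present value, with cash_flows[0] occurring at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical comparison: classical build vs. integrated-model build of an MIS
classical = [-120000, 30000, 40000, 45000, 45000]
integrated = [-90000, 35000, 40000, 45000, 45000]
print(round(npv(0.10, classical), 2), round(npv(0.10, integrated), 2))
```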

  12. Slip Model Used for Prediction of r Value of BCC Metal Sheets from ODF Coefficients

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Different slip models were used for prediction of the r value of BCC metal sheets from ODF coefficients. According to the maximum plastic work theory developed by Bishop and Hill, it is expected that the higher the Taylor factors given by a slip model, the better the prediction obtained from the model. From this point of view, a composed slip model of BCC metals is presented. Based on this model, the agreement of the predicted r values for deep-drawing steels with experimental ones is excellent.

  13. MODELS THAT EVALUATE THE VALUE OF THE FINANCIAL INSTRUMENTS. RECOGNITION AND MEASUREMENT UNDER INTERNATIONAL RULES

    OpenAIRE

    Mihaela Gruiescu; Corina Ioanăs; Adriana Florina Popa

    2010-01-01

    As the present financial markets have broadened and deepened, increasing numbers of firms are utilizing innovative financial instruments to accomplish business objectives and enhance shareholder value. It is crucial for the financial managers to keep abreast of available financial instruments, the business settings in which these instruments can create—and destroy—value, and modern analysis techniques for these instruments. A financial manager also should possess a basic understanding of the ...

  14. Modelling the life insurance needs using the human life value revision method

    Science.gov (United States)

    Hashim, Haslifah; Service, David

    2013-04-01

    There are numerous methods to determine the appropriate amount of life insurance a person needs - they can be scientific or simplistic. Many life insurance agents and financial advisors simply rely on traditional rules of thumb using the multiple-of-income method. The more scientific methods are the needs analysis and the human life value. The needs analysis is regarded as the most commonly used sales tool, and the human life value is the most widely agreed academic expression of the purpose of life insurance. However, there are several weaknesses in using both methods. By using needs analysis as a sales tool, the recommended amount of life insurance would leave a person underinsured. The same applies to the human life value method. Nevertheless, both methods can be improved with a few revisions. The post-death needs under the needs analysis must be revised to incorporate the reality that the family's standard of living changes over time. The projection of a changing standard of living is a part of human life value analysis. Therefore, this research looked into both methods and combines the concepts of needs analysis and human life value to create a powerful methodology that provides adequate life insurance protection - a method we name 'the Human Life Value Revision Method'.
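
    The classic human life value calculation behind the method is the present value of future income net of self-maintenance, here with a simple annual growth term standing in for the changing standard of living. The figures are hypothetical, and this is the textbook calculation rather than the paper's full revised method.

```python
def human_life_value(net_income, growth, discount, years):
    """Present value of future income net of self-maintenance, growing at a constant rate."""
    return sum(net_income * (1 + growth) ** t / (1 + discount) ** (t + 1)
               for t in range(years))

# Hypothetical: 40,000 per year net of personal consumption, 3% growth, 5% discount, 30 years
print(round(human_life_value(40000, 0.03, 0.05, 30)))
```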

  15. The SMM Model as a Boundary Value Problem Using the Discrete Diffusion Equation

    Science.gov (United States)

    Campbell, Joel

    2007-01-01

    A generalized single step stepwise mutation model (SMM) is developed that takes into account an arbitrary initial state to a certain partial difference equation. This is solved in both the approximate continuum limit and the more exact discrete form. A time evolution model is developed for Y DNA or mtDNA that takes into account the reflective boundary modeling minimum microsatellite length and the original difference equation. A comparison is made between the more widely known continuum Gaussian model and a discrete model, which is based on modified Bessel functions of the first kind. A correction is made to the SMM model for the probability that two individuals are related that takes into account a reflecting boundary modeling minimum microsatellite length. This method is generalized to take into account the general n-step model and exact solutions are found. A new model is proposed for the step distribution.
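
    For orientation, the free (unbounded) single-step model referred to above can be written as a discrete diffusion equation whose exact solution involves modified Bessel functions of the first kind and whose continuum limit is Gaussian. The following is the textbook form, stated without the reflecting boundary the paper adds, assuming symmetric mutation at total rate \(\mu\) and a point initial condition.

```latex
% Free single-step SMM as a discrete diffusion, with P_n(0) = \delta_{n0}:
\frac{dP_n}{dt} = \frac{\mu}{2}\,\bigl(P_{n-1} + P_{n+1}\bigr) - \mu P_n,
\qquad
P_n(t) = e^{-\mu t}\, I_n(\mu t),
\qquad
P_n(t) \approx \frac{1}{\sqrt{2\pi\mu t}}\; e^{-n^2/(2\mu t)} \quad (\mu t \gg 1),
```

    where \(I_n\) is the modified Bessel function of the first kind.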

  16. The multi-criteria optimization for the formation of the multiple-valued logic model of a robotic agent

    Science.gov (United States)

    Bykovsky, A. Yu; Sherbakov, A. A.

    2016-08-01

    The C-valued Allen-Givone algebra is an attractive tool for modeling a robotic agent, but it requires the consensus method of minimization to simplify logic expressions. This procedure substitutes the maximal truth value for some undefined states of the function, thus extending the initially given truth table. This in turn creates the problem of different formal representations of the same initially given function. Multi-criteria optimization is proposed for the deliberate choice of undefined states and for model formation.

  17. Applying ethnic equivalence and cultural values models to African-American teens' perceptions of parents.

    Science.gov (United States)

    Lamborn, Susie D; Felbab, Amanda J

    2003-10-01

    This study evaluated both the parenting styles and family ecologies models with interview responses from 93 14- and 15-year-old African-American adolescents. The parenting styles model was more strongly represented in both open-ended and structured interview responses. Using variables from the structured interview as independent variables, regression analyses contrasted each model with a joint model for predicting self-esteem, self-reliance, work orientation, and ethnic identity. Overall, the findings suggest that a joint model that combines elements from both models provides a richer understanding of African-American families.

  18. Multi-stage IT project evaluation: The flexibility value obtained by implementing and resolving Berk, Green and Naik (2004) model

    Science.gov (United States)

    Abid, Fathi; Guermazi, Dorra

    2009-11-01

    In this paper, we evaluate a multi-stage information technology investment project, by implementing and resolving Berk, Green and Naik's (2004) model, which takes into account specific features of IT projects and considers the real option to suspend investment at each stage. We present a particular case of the model where the project value is the solution of an optimal control problem with a single state variable. In this case, the model is more intuitive and tractable. The case study confirms the practical potential of the model and highlights the importance of the real-option approach compared to classical discounted cash flow techniques in the valuation of IT projects.

  19. The Value of Value

    DEFF Research Database (Denmark)

    Sørensen, Asger

    parts of business ethics given prominence to especially one term, namely 'value'. The question that interests me is the following: What does the articulation of ethics and morality in terms of values mean for ethics and morality as such? Or, to put the question in a more fashionable way: What is the value of value for morality and ethics? To make things a bit more precise, we can make use of the common distinction between ethics and morality, i.e. that morality is the immediate, collective and unconscious employment of morals, whereas ethics is the systematic, individual and conscious reflections…

  20. Applying an expectancy-value model to study motivators for work-task based information seeking

    DEFF Research Database (Denmark)

    Sigaard, Karen Tølbøl; Skov, Mette

    2015-01-01

    Purpose: The purpose of this paper is to operationalise and verify a cognitive motivation model that has been adapted to information seeking. The original model was presented within the field of psychology. Design/methodology/approach: An operationalisation of the model is presented based on the ...

  1. A Mean-Value Analysis of Stochastic Petri Net Models of Slotted Rings

    NARCIS (Netherlands)

    Coyle, Andrew; Haverkort, Boudewijn R.; Henderson, William; Pearce, Charles E.M.

    1996-01-01

    In this paper, we analyse Stochastic Petri Net (SPN) models of slotted-ring networks. We show that a simple SPN model of a slotted-ring network, which exhibits a product-form solution, yields similar results to a more detailed SPN model that has to be analysed by numerical means. Furthermore, we dem

  2. Reconsidering the use of rankings in the valuation of health states: a model for estimating cardinal values from ordinal data

    Directory of Open Access Journals (Sweden)

    Salomon Joshua A

    2003-12-01

    Full Text Available Abstract Background: In survey studies on health-state valuations, ordinal ranking exercises often are used as precursors to other elicitation methods such as the time trade-off (TTO) or standard gamble, but the ranking data have not been used in deriving cardinal valuations. This study reconsiders the role of ordinal ranks in valuing health and introduces a new approach to estimate interval-scaled valuations based on aggregate ranking data. Methods: Analyses were undertaken on data from a previously published general population survey study in the United Kingdom that included rankings and TTO values for hypothetical states described using the EQ-5D classification system. The EQ-5D includes five domains (mobility, self-care, usual activities, pain/discomfort and anxiety/depression) with three possible levels on each. Rank data were analysed using a random utility model, operationalized through conditional logit regression. In the statistical model, probabilities of observed rankings were related to the latent utilities of different health states, modeled as a linear function of EQ-5D domain scores, as in previously reported EQ-5D valuation functions. Predicted valuations based on the conditional logit model were compared to observed TTO values for the 42 states in the study and to predictions based on a model estimated directly from the TTO values. Models were evaluated using the intraclass correlation coefficient (ICC) between predictions and mean observations, and the root mean squared error of predictions at the individual level. Results: Agreement between predicted valuations from the rank model and observed TTO values was very high, with an ICC of 0.97, only marginally lower than for predictions based on the model estimated directly from TTO values (ICC = 0.99). Individual-level errors were also comparable in the two models, with root mean squared errors of 0.503 and 0.496 for the rank-based and TTO-based predictions, respectively. Conclusions

  3. Reconsidering the use of rankings in the valuation of health states: a model for estimating cardinal values from ordinal data.

    Science.gov (United States)

    Salomon, Joshua A

    2003-12-19

    BACKGROUND: In survey studies on health-state valuations, ordinal ranking exercises often are used as precursors to other elicitation methods such as the time trade-off (TTO) or standard gamble, but the ranking data have not been used in deriving cardinal valuations. This study reconsiders the role of ordinal ranks in valuing health and introduces a new approach to estimate interval-scaled valuations based on aggregate ranking data. METHODS: Analyses were undertaken on data from a previously published general population survey study in the United Kingdom that included rankings and TTO values for hypothetical states described using the EQ-5D classification system. The EQ-5D includes five domains (mobility, self-care, usual activities, pain/discomfort and anxiety/depression) with three possible levels on each. Rank data were analysed using a random utility model, operationalized through conditional logit regression. In the statistical model, probabilities of observed rankings were related to the latent utilities of different health states, modeled as a linear function of EQ-5D domain scores, as in previously reported EQ-5D valuation functions. Predicted valuations based on the conditional logit model were compared to observed TTO values for the 42 states in the study and to predictions based on a model estimated directly from the TTO values. Models were evaluated using the intraclass correlation coefficient (ICC) between predictions and mean observations, and the root mean squared error of predictions at the individual level. RESULTS: Agreement between predicted valuations from the rank model and observed TTO values was very high, with an ICC of 0.97, only marginally lower than for predictions based on the model estimated directly from TTO values (ICC = 0.99). Individual-level errors were also comparable in the two models, with root mean squared errors of 0.503 and 0.496 for the rank-based and TTO-based predictions, respectively. CONCLUSIONS: Modeling health
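
    The conditional-logit treatment of rankings described in both entries above (sometimes called the rank-ordered or exploded logit) evaluates a ranking as a sequence of best-choices from the states not yet ranked. A minimal sketch with made-up latent utilities, not the estimated EQ-5D valuation function:

```python
import numpy as np

def ranking_loglik(utilities, ranking):
    """Log-likelihood of one best-to-worst ranking under the rank-ordered (exploded) logit:
    a product of conditional-logit choices, each taken from the states not yet ranked."""
    ll, remaining = 0.0, list(ranking)
    for best in ranking:
        u = np.array([utilities[s] for s in remaining])
        ll += utilities[best] - np.log(np.exp(u).sum())
        remaining.remove(best)
    return ll

# Hypothetical latent utilities for three EQ-5D states (higher = better)
utilities = {"11111": 1.0, "11121": 0.4, "22222": -0.6}
print(ranking_loglik(utilities, ["11111", "11121", "22222"]))
```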

  4. The value of physical attractiveness in romantic partners: modeling biological and social variables.

    Science.gov (United States)

    Jonason, Peter K

    2009-04-01

    According to research on physical attractiveness, personal attributes such as gender, height, and self-perception are important in determining how much individuals value physical attractiveness in their romantic partners. In a survey of 228 college-aged participants, the ratings of the physical attractiveness of potential romantic partners were positively correlated with how much participants valued physical attractiveness in their long-term romantic partners. Individuals' partner preferences may be sensitive to their perceptions of themselves. Perceptions of the attractiveness of those in one's local area may also play a part in the development of such partner preferences through exposure.

  5. Value-Focused Thinking Model to Evaluate SHM System Alternatives From Military end User Requirements Point of View

    OpenAIRE

    Klimaszewski Sławomir

    2016-01-01

    The article describes Value-Focused Thinking (VFT) model developed in order to evaluate various alternatives for implementation of Structural Health Monitoring (SHM) system on a military aircraft. Four SHM system alternatives are considered based on: visual inspection (current approach), piezoelectric (PZT) sensors, Fiber Bragg Grating (FBG) sensors and Comparative Vacuum Monitoring (CVM) sensors. A numerical example is shown to illustrate the model capability. Sensitivity analyses are perfor...

  6. Creating shared value through business models based on sustainability and CSR : An empirical study of Swedish companies

    OpenAIRE

    Haskell, Lucas; Pålhed, Johan

    2016-01-01

    Businesses and humans in general have led to a variety of different social and environmental problems. In this thesis we call on businesses to be the solution to these problems. Therefore, we have developed two research questions:   How do Swedish companies incorporate CSR and sustainability into their business models? and How do Swedish companies create shared value through business models based on sustainability and CSR?   The purpose of these research questions is to discover how Swedish c...

  7. The value of adding optics to ecosystem models: a case study

    Directory of Open Access Journals (Sweden)

    M. Fujii

    2007-10-01

    Full Text Available Many ecosystem models have been developed to study the ocean's biogeochemical properties, but most of these models use simple formulations to describe light penetration and spectral quality. Here, an optical model is coupled with a previously published ecosystem model that explicitly represents two phytoplankton (picoplankton and diatoms) and two zooplankton functional groups, as well as multiple nutrients and detritus. Surface ocean color fields and subsurface light fields are calculated by coupling the ecosystem model with an optical model that relates biogeochemical standing stocks with inherent optical properties (absorption, scattering); this provides input to a commercially available radiative transfer model (Ecolight). We apply this bio-optical model to the equatorial Pacific upwelling region, and find the model to be capable of reproducing many measured optical properties and key biogeochemical processes in this region. Our model results suggest that non-algal particles largely contribute to the total scattering or attenuation (>50% at 660 nm) but have a much smaller contribution to particulate absorption (<20% at 440 nm), while picoplankton dominate the total phytoplankton absorption (>95% at 440 nm). These results are consistent with the field observations. In order to achieve such good agreement between data and model results, however, key model parameters, for which no field data are available, have to be constrained. Sensitivity analysis of the model results to optical parameters reveals a significant role played by colored dissolved organic matter through its influence on the quantity and quality of the ambient light. Coupling explicit optics to an ecosystem model provides advantages in generating: (1) a more accurate subsurface light-field, which is important for light sensitive biogeochemical processes such as photosynthesis and photo-oxidation, (2) additional constraints on model parameters that help to reduce uncertainties in

  8. The conservation value of elevation data accuracy and model sophistication in reserve design under sea-level rise.

    Science.gov (United States)

    Zhu, Mingjian; Hoctor, Tom; Volk, Mike; Frank, Kathryn; Linhoss, Anna

    2015-10-01

    Many studies have explored the value of using more sophisticated coastal impact models and higher resolution elevation data in sea-level rise (SLR) adaptation planning. However, we know little about the extent to which improved models and data actually lead to better conservation outcomes under SLR. This is important to know because high-resolution data are unlikely to be available in some data-poor coastal areas of the world, and running more complicated coastal impact models is relatively time-consuming, expensive, and requires assistance from qualified experts and technicians. We address this research question in the context of identifying conservation priorities in response to SLR. Specifically, we investigated the conservation value of using more accurate light detection and ranging (Lidar)-based digital elevation data and process-based coastal land-cover change models (Sea Level Affecting Marshes Model, SLAMM) to identify conservation priorities versus simple "bathtub" models based on the relatively coarse National Elevation Dataset (NED) in a coastal region of northeast Florida. We compared conservation outcomes identified by reserve design software (Zonation) using three different model dataset combinations (Bathtub-NED, Bathtub-Lidar, and SLAMM-Lidar). The comparisons show that the conservation priorities are significantly different with different combinations of coastal impact models and elevation dataset inputs. The research suggests that it is valuable to invest in more accurate coastal impact models and elevation datasets in SLR adaptive conservation planning because this model-dataset combination could improve conservation outcomes under SLR. Less accurate coastal impact models, including ones created using coarser Digital Elevation Model (DEM) data, can still be useful when better data and models are not available or feasible, but results need to be appropriately assessed and communicated. A future research priority is to investigate how

  9. Critical Values for Yen’s Q3: Identification of Local Dependence in the Rasch model using Residual Correlations

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Makransky, Guido; Horton, Mike

    2016-01-01

    The assumption of local independence is central to all IRT models. Violations can lead to inflated estimates of reliability and problems with construct validity. For the most widely used fit statistic Q3 there are currently no well-documented suggestions of the critical values which should be used...... guidelines that researchers and practitioners can follow when making decisions about local dependence during scale development and validation. We propose that a parametric bootstrapping procedure should be implemented in each separate situation in order to obtain the critical value of local dependence...... applicable to the data set, and provide example critical values for a number of data structure situations. The results show that for the Q3 fit statistic no single critical value is appropriate for all situations, as the percentiles in the empirical null distribution are influenced by the number of items...
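
    The parametric bootstrapping idea can be sketched as follows: simulate response matrices from a Rasch model, compute the item-by-item residual correlations (Q3) in each replicate, and read off a high percentile of the maximum as the data-set-specific critical value. For brevity the sketch below treats the item and person parameters as known rather than re-estimating them in each replicate, as a full implementation would; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = rng.normal(0, 1, 500)          # person abilities (treated as known here)
beta = np.linspace(-1.5, 1.5, 10)      # item difficulties (treated as known here)

def simulated_max_q3():
    p = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))   # Rasch success probabilities
    x = rng.binomial(1, p)                                     # one simulated response matrix
    resid = x - p                                              # score residuals
    q3 = np.corrcoef(resid, rowvar=False)                      # item-pair residual correlations
    np.fill_diagonal(q3, np.nan)
    return np.nanmax(q3)

null_distribution = np.array([simulated_max_q3() for _ in range(200)])
print(np.percentile(null_distribution, 95))   # an empirical critical value for the maximum Q3
```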

  10. The impact of MCS models and EFAC values on the dose simulation for a proton pencil beam

    Science.gov (United States)

    Chen, Shih-Kuan; Chiang, Bing-Hao; Lee, Chung-Chi; Tung, Chuan-Jong; Hong, Ji-Hong; Chao, Tsi-Chian

    2017-08-01

    The Multiple Coulomb Scattering (MCS) model plays an important role in accurate MC simulation, especially for small field applications. The Rossi model is used in MCNPX 2.7.0, and the Lewis model in Geant4.9.6.p02. These two models may generate very different angular and spatial distributions in small field proton dosimetry. Besides angular and spatial distributions, step size is also an important issue that causes path length effects. The Energy Fraction (EFAC) value can be used in MCNPX 2.7.0 to control step sizes of MCS. In this study, we use MCNPX 2.7.0, Geant4.9.6.p02, and one pencil beam algorithm to evaluate the effect on dose deposition of different MCS models and different EFAC values in proton disequilibrium situations. Different MCS models agree well with each other under a proton equilibrium situation. Under proton disequilibrium situations, the MCNPX and Geant4 results, however, show a significant deviation (up to 43%). In addition, the path length effects are more significant when EFAC is equal to 0.917 and 0.94 in small field proton dosimetry, and using a 0.97 EFAC value is the best for both accuracy and efficiency.

  11. Adding value(s)

    DEFF Research Database (Denmark)

    Carré, David

    2015-01-01

    Most economic inquiries revolve around agents making decisions. Getting the ‘best value’, it is assumed, drives such decisions: gaining most while risking least. This assumption has been debunked by showing that people do not always choose either maximum benefit or lower risk (Kahneman & Tversky, 1992). In response, behavioral economics (Camerer, 1999) has shown that agents have values other than optimization underpinning their decisions. Therefore, concerns arose regarding which values are guiding the agent, but not about how such values became relevant for the agent. In this presentation, I will explore the consequences of shifting to the latter perspective, i.e. looking for the generative framework of values. Here I argue that economic behavior should also be seen as a sense-making process, guided by values that are chosen/rejected along with fellow human beings, in specific socio…

  12. Rarity in large data sets: Singletons, model values and the location of the species abundance distribution

    NARCIS (Netherlands)

    Straatsma, G.; Egli, S.

    2012-01-01

    Species abundance data in 12 large data sets, holding 10 × 10³ to 125 × 10⁶ individuals in 350 to 10 × 10³ samples, were studied. Samples and subsets, for instance the summarized data of samples over years, and whole sets were analysed. Two methods of the binning of data, assigning abundance values

  13. Bringing Deleuze's Philosophy into Discourse on Values Education and Quality Teaching: An Australian Model

    Science.gov (United States)

    Semetsky, Inna; Lovat, Terence

    2011-01-01

    The article examines the Australian national program of values education via the lens of Deleuze's philosophy. It argues that it is teachers with a genuine level of self-knowledge who can create the conditions conducive to best practice in schools. Both theoretically and empirically, quality teaching has demonstrated the power of the affective…

  14. Beyond Traditional School Value-Added Models: A Multilevel Analysis of Complex School Effects in Chile

    Science.gov (United States)

    Troncoso, Patricio; Pampaka, Maria; Olsen, Wendy

    2016-01-01

    School value-added studies have largely demonstrated the effects of socioeconomic and demographic characteristics of the schools and the pupils on performance in standardised tests. Traditionally, these studies have assessed the variation coming only from the schools and the pupils. However, recent studies have shown that the analysis of academic…

  16. Modelling the locational determinants of house prices: neural network and value tree approaches

    NARCIS (Netherlands)

    Kauko, Tom Johannes

    2002-01-01

    Tom Kauko's book comprises an analysis of the locational element in house prices. Locational features can increase or decrease the value of a house compared with a similar one elsewhere. So far, the problem of isolating this element has been well documented in the literatures on spatial housing mark

  17. The Course Valuation Model and 10 Steps to Increase Course Value: The Business Communication Course

    Science.gov (United States)

    Brown, Lori A.

    2015-01-01

    Communication competence is a leading agent in professional success and the ability most sought after by employers. Educational institutions benefit by producing students with such sought-after skills. However, there is a disconnect between skills practitioner stakeholders desire and what graduates deliver. Strengthening the value of a business…

  18. Intellectual Capital configurations and value creation: towards a conceptual model of HR Shared Services

    NARCIS (Netherlands)

    Meijerink, Jeroen; Bondarouk, Tanya; Looise, Jan Kees

    2010-01-01

    The ways by which intellectual capital (IC) drives value creation is often limitedly understood, particularly in the case of Human Resource Shared Services (HRSS), as its components – human, organizational and social capital – are treated as independent and separate constructs. In this paper, a conf

  19. A review of the modelling of water values in different use sectors in ...

    African Journals Online (AJOL)

    irrigated agriculture as an important use sector in terms of water volumes, ... wide range of topics is to provide information to policy decision-makers on the economics of water management in South .... ET values estimated by the RV method for eucalyptus vary .... cised rights will not be supported for transfer, it appears that.
