WorldWideScience

Sample records for model properly accounts

  1. Explanations, mechanisms, and developmental models: Why the nativist account of early perceptual learning is not a proper mechanistic model

    Directory of Open Access Journals (Sweden)

    Radenović Ljiljana

    2013-01-01

    In the last several decades a number of studies on perceptual learning in early infancy have suggested that even infants seem to be sensitive to the way objects move and interact in the world. In order to explain the early emergence of infants’ sensitivity to causal patterns in the world, some psychologists have proposed that core knowledge of objects and causal relations is innate (Leslie & Keeble, 1987; Carey & Spelke, 1994; Keil, 1995; Spelke et al., 1994). The goal of this paper is to examine the nativist developmental model by investigating the criteria that a mechanistic model needs to fulfill if it is to be explanatory. Craver (2006) put forth a number of such criteria and developed a few very useful distinctions between explanation sketches and proper mechanistic explanations. By applying these criteria to the nativist developmental model I aim to show, firstly, that nativists only partially characterize the phenomenon at stake without giving us the details of when and under which conditions perception and attention in early infancy take place. Secondly, nativists start off with a description of the phenomena to be explained (even if it is only a partial description) but import into it a particular theory of perception that requires further empirical evidence and further defense on its own. Furthermore, I argue that innate knowledge is a good candidate for a filler term (a term that is used to name the still unknown processes and parts of the mechanism) and is likely to become redundant. Recent extensive research on early intermodal perception indicates that the mechanism enabling the perception of regularities and causal patterns in early infancy is grounded in our neurophysiology. However, this mechanism is fairly basic and does not involve highly sophisticated cognitive structures or innate core knowledge. I conclude with a remark that a closer examination of the mechanisms involved in early perceptual learning indicates that the nativism

  2. Generating flexible proper name references in text: Data, models and evaluation

    NARCIS (Netherlands)

    Castro Ferreira, Thiago; Krahmer, Emiel; Wubben, Sander

    2017-01-01

    This study introduces a statistical model able to generate variations of a proper name, taking into account the person to be mentioned, the discourse context and individual variation. The model relies on the REGnames corpus, a dataset with 53,102 proper name references to 1,000 people in different

  3. 26 CFR 1.1016-2 - Items properly chargeable to capital account.

    Science.gov (United States)

    2010-04-01

    ... (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES Basis Rules of General Application § 1.1016-2 Items properly... any applicable provision of law or regulation, is treated as an item not properly chargeable to.... For example, in the case of oil and gas wells no adjustment may be made in respect of any intangible...

  4. THE COMPANY ACCOUNTING EVALUATION – PRELIMINARY PHASE OF THE PROPER ANALYSIS OF FINANCIAL STATEMENTS

    OpenAIRE

    Doina Pacurari; Mircea Muntean

    2008-01-01

    The problem that the accounting information does not always reflect the economic reality may affect the analysis and forecast based on financial statements. This is due both to the limitations of accrual accounting and to the fact that this type of accounting allows for results management. In spite of some disadvantages, accrual accounting is considered superior to cash accounting in measuring performance and determining financial position, as well as in predicting future cash flow. ...

  5. THE COMPANY ACCOUNTING EVALUATION – PRELIMINARY PHASE OF THE PROPER ANALYSIS OF FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Doina Pacurari

    2008-12-01

    The problem that the accounting information does not always reflect the economic reality may affect the analysis and forecast based on financial statements. This is due both to the limitations of accrual accounting and to the fact that this type of accounting allows for results management. In spite of some disadvantages, accrual accounting is considered superior to cash accounting in measuring performance and determining financial position, as well as in predicting future cash flow. In order to limit the negative effects on the results of analysis and forecast based on financial statements, the analysts should evaluate the enterprise accounting and, if necessary, adjust the financial statements so that they reflect the economic reality.

  6. The importance of proper feedback modeling in HWR

    International Nuclear Information System (INIS)

    Saphier, D.; Gorelik, Z.; Shapira, M.

    1996-01-01

    The DSNP simulation language was applied to study the effect of different modeling approximations of feedback phenomena in nuclear power plants. The different methods to model the feedback effects are presented and discussed. It is shown that HWRs are most sensitive to correct modeling, since they usually have at least three feedback effects acting at different time scales; to achieve correct kinetics, a one-dimensional representation is needed with correct modeling of the in-core time delays. The simulation methodology of lumped-parameter and one-dimensional models using the DSNP simulation language is presented (authors)

  7. Kawase & McDermott revisited with a proper ocean model.

    Science.gov (United States)

    Jochum, Markus; Poulsen, Mads; Nuterman, Roman

    2017-04-01

    A suite of experiments with global ocean models is used to test the hypothesis that Southern Ocean (SO) winds can modify the strength of the Atlantic Meridional Overturning Circulation (AMOC). It is found that for 3 and 1 degree resolution models the results are consistent with Toggweiler & Samuels (1995): stronger SO winds lead to a slight increase of the AMOC. In the simulations with 1/10 degree resolution, however, stronger SO winds weaken the AMOC. We show that these different outcomes are determined by the models' representation of topographic Rossby and Kelvin waves. Consistent with previous literature based on theory and idealized models, first baroclinic waves are slower in the coarse resolution models, but still manage to establish a pattern of global response that is similar to the one in the eddy-permitting model. Because of its different stratification, however, the Atlantic signal is transmitted by higher baroclinic modes. In the coarse resolution model these higher modes are dissipated before they reach 30N, whereas in the eddy-permitting model they reach the subpolar gyre undiminished. This inability of non-eddy-permitting ocean models to represent planetary waves with higher baroclinic modes casts doubt on the ability of climate models to represent non-local effects of climate change. Ideas on how to overcome these difficulties will be discussed.

  8. Modeling habitat dynamics accounting for possible misclassification

    Science.gov (United States)

    Veran, Sophie; Kleiner, Kevin J.; Choquet, Remi; Collazo, Jaime; Nichols, James D.

    2012-01-01

    Land cover data are widely used in ecology as land cover change is a major component of changes affecting ecological systems. Landscape change estimates are characterized by classification errors. Researchers have used error matrices to adjust estimates of areal extent, but estimation of land cover change is more challenging, with error in classification being confused with change. We modeled land cover dynamics for a discrete set of habitat states. The approach accounts for state uncertainty to produce unbiased estimates of habitat transition probabilities, using ground information to inform error rates. We consider the case when true and observed habitat states are available for the same geographic unit (pixel) and when true and observed states are obtained at one level of resolution, but transition probabilities are estimated at a different level of resolution (aggregations of pixels). Simulation results showed a strong bias when estimating transition probabilities if misclassification was not accounted for. Scaling up does not necessarily decrease the bias and can even increase it. Analyses of land cover data in the Southeast region of the USA showed that land change patterns appeared distorted if misclassification was not accounted for: the rate of habitat turnover was artificially increased and habitat composition appeared more homogeneous. Not properly accounting for land cover misclassification can produce misleading inferences about habitat state and dynamics and also misleading predictions about species distributions based on habitat. Our models that explicitly account for state uncertainty should be useful in obtaining more accurate inferences about change from data that include errors.
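
    The bias described above can be reproduced with a minimal simulation, assuming a two-state habitat observed through a known confusion matrix (all probabilities below are invented for illustration): naive transition estimates computed from the observed states inflate apparent turnover relative to the true transition matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# True annual transition probabilities between two habitat states (0 and 1).
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])
# Confusion matrix: C[i, j] = Pr(observed state j | true state i).
C = np.array([[0.90, 0.10],
              [0.15, 0.85]])

n_pixels = 50_000
true = np.zeros((n_pixels, 2), dtype=int)
true[:, 0] = rng.integers(0, 2, n_pixels)
true[:, 1] = (rng.random(n_pixels) < P[true[:, 0], 1]).astype(int)   # one Markov step
obs = (rng.random(true.shape) < C[true, 1]).astype(int)              # misclassification

def transition_estimate(states):
    """Naive transition-probability estimate from pairs of consecutive states."""
    counts = np.zeros((2, 2))
    np.add.at(counts, (states[:, 0], states[:, 1]), 1)
    return counts / counts.sum(axis=1, keepdims=True)

print("true transition matrix:\n", P)
print("estimate from true states:\n", transition_estimate(true).round(3))
print("naive estimate from observed states:\n", transition_estimate(obs).round(3))
# The naive estimate inflates the off-diagonal (turnover) probabilities, which is
# the bias that modelling state uncertainty explicitly is meant to correct.
```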

  9. Modelling in Accounting. Theoretical and Practical Dimensions

    Directory of Open Access Journals (Sweden)

    Teresa Szot-Gabryś

    2010-10-01

    Accounting in the theoretical approach is a scientific discipline based on specific paradigms. In the practical aspect, accounting manifests itself through the introduction of a system for measurement of economic quantities which operates in a particular business entity. A characteristic of accounting is its flexibility and ability to adapt to the information needs of information recipients. One of the main currents in the development of accounting theory and practice is to cover, by economic measurement, areas which have not hitherto been covered by any accounting system (it applies, for example, to small businesses, agricultural farms, and human capital), which requires the development of an appropriate theoretical and practical model. The article illustrates the issue of modelling in accounting based on the example of an accounting model developed for small businesses, i.e. economic entities which are not obliged by law to keep accounting records.

  10. Implementing a trustworthy cost-accounting model.

    Science.gov (United States)

    Spence, Jay; Seargeant, Dan

    2015-03-01

    Hospitals and health systems can develop an effective cost-accounting model and maximize the effectiveness of their cost-accounting teams by focusing on six key areas: Implementing an enhanced data model. Reconciling data efficiently. Accommodating multiple cost-modeling techniques. Improving transparency of cost allocations. Securing department manager participation. Providing essential education and training to staff members and stakeholders.

  11. Questioning Stakeholder Legitimacy: A Philanthropic Accountability Model.

    Science.gov (United States)

    Kraeger, Patsy; Robichau, Robbie

    2017-01-01

    Philanthropic organizations contribute to important work that solves complex problems to strengthen communities. Many of these organizations are moving toward engaging in public policy work, in addition to funding programs. This paper raises questions of legitimacy for foundations, as well as issues of transparency and accountability in a pluralistic democracy. Measures of civic health also inform how philanthropic organizations can be accountable to stakeholders. We propose a holistic model for philanthropic accountability that combines elements of transparency and performance accountability, as well as practices associated with the American pluralistic model for democratic accountability. We argue that philanthropic institutions should seek stakeholder and public input when shaping any public policy agenda. This paper suggests a new paradigm, called philanthropic accountability, that can be used for legitimacy and democratic governance of private foundations engaged in policy work. The Philanthropic Accountability Model can be empirically tested and used as a governance tool.

  12. Understanding financial crisis through accounting models

    NARCIS (Netherlands)

    Bezemer, D.J.

    2010-01-01

    This paper presents evidence that accounting (or flow-of-funds) macroeconomic models helped anticipate the credit crisis and economic recession. Equilibrium models ubiquitous in mainstream policy and research did not. This study traces the intellectual pedigrees of the accounting approach as an

  13. Study on proper constitutive model for evaluation of long term mechanical behavior of buffer material

    International Nuclear Information System (INIS)

    Hirai, Takashi; Tanai, Kenji; Shigeno, Yoshimasa; Namikawa, Tsutomu; Takaji, Kazuhiko; Ohnuma, Satoshi

    2004-02-01

    The objective of this report is to propose proper constitutive models and parameters for the evaluation of the long-term mechanical behavior of the buffer material in the engineered barrier system. In the second progress report by JNC, it was reported that the well-designed engineered barrier system is stable and safe with respect to the mechanical support of the overpack, to ensure stability, and with respect to the stress which acts on the overpack, based on analyses using constitutive models popular for general clay soils. However, the buffer material, which has swelling characteristics, is not considered to be an ordinary clay soil, so it is necessary to reselect reliable constitutive models. Therefore, proper models were selected systematically from among the several models which have been used for the assessment of the behavior of clay soils, and simulation analyses of the laboratory tests were carried out using these models. The results of the simulation analyses showed that the two selected models were similarly able to assess the behavior of the buffer material, and that the parameters needed to simulate the consolidation tests differ from those for the triaxial compression tests. Finally, an analysis was conducted to evaluate the effect of the swelling of the overpack due to corrosion and of the self weight which causes sedimentation of the overpack. From the analytical results, it was clarified that two kinds of parameter sets are necessary to evaluate the deformation and the stress of the buffer material in the engineered barrier system. (author)

  14. Model Reduction Using Proper Orthogonal Decomposition and Predictive Control of Distributed Reactor System

    Directory of Open Access Journals (Sweden)

    Alejandro Marquez

    2013-01-01

    This paper studies the application of proper orthogonal decomposition (POD) to reduce the order of distributed reactor models with axial and radial diffusion, and the implementation of model predictive control (MPC) based on discrete-time linear time-invariant (LTI) reduced-order models. In this paper, the control objective is to keep the operation of the reactor at a desired operating condition in spite of disturbances in the feed flow. This operating condition is determined by means of an optimization algorithm that provides the optimal temperature and concentration profiles for the system. Around these optimal profiles, the nonlinear partial differential equations (PDEs) that model the reactor are linearized, and afterwards the linear PDEs are discretized in space, giving as a result a high-order linear model. POD and Galerkin projection are used to derive the low-order linear model that captures the dominant dynamics of the PDEs, which is subsequently used for controller design. An MPC formulation is constructed on the basis of the low-order linear model. The proposed approach is tested through simulation, and it is shown that the results are good with regard to keeping the reactor at the desired operating condition.
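
    As a generic illustration of the snapshot-POD-plus-Galerkin step used in this kind of model reduction (not the authors' reactor code; the 1-D diffusion stand-in and all numbers below are assumptions), the following sketch builds a POD basis from snapshots of a discretized linear PDE and projects the system matrix onto it:

```python
import numpy as np

# Discretized 1-D heat equation du/dt = A u as a stand-in for the high-order
# linearized PDE model (illustrative only).
n = 200
dx = 1.0 / (n + 1)
A = (np.diag(-2 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / dx**2

def simulate(u0, dt=1e-5, steps=400, keep=20):
    """Run the full-order model and collect snapshots as columns."""
    u, snaps = u0.copy(), []
    for k in range(steps):
        u = u + dt * (A @ u)            # explicit Euler time stepping
        if k % keep == 0:
            snaps.append(u.copy())
    return np.array(snaps).T

x = np.linspace(dx, 1 - dx, n)
snapshots = np.hstack([simulate(np.sin(np.pi * k * x)) for k in (1, 2, 3)])

# Snapshot POD: the left singular vectors are the POD modes.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.9999) + 1
Phi = U[:, :r]                          # reduced basis

# Galerkin projection gives the low-order LTI model da/dt = Ar a.
Ar = Phi.T @ A @ Phi
print(f"full order n = {n}, reduced order r = {r}")
# A matrix like Ar is what an MPC formulation would then be built on.
```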

  15. Evaluation of the Township Proper Carrying Capacity over Qinghai-Tibet plateau by CASA model

    Science.gov (United States)

    Wu, Chengyong; Cao, Guangchao; Xue, Huaju; Jiang, Gang; Wang, Qi; Yuan, Jie; Chen, Kelong

    2018-01-01

    Existing studies of proper carrying capacity (PCC) have mostly focused on province or county administrative units, which can only macroscopically capture the quantitative characteristics of PCC and cannot effectively support animal husbandry management measures that are pertinent and operational. At the town scale, this paper used the CASA model to estimate the PCC in the Mongolian Autonomous County of Henan, Qinghai province, China, which suffers serious grassland degradation mainly caused by overgrazing. The results showed that the PCC throughout the county was 950,417 sheep units. Among the townships, the PCC of Saierlong was the largest (247,100 sheep units) and that of Duosong the smallest (82,016 sheep units). This study will provide reference data for developing sustainable town-scale pasture policies and will also help to evaluate the health status of the alpine grassland ecosystem on the Qinghai-Tibet plateau.

  16. Hemodynamics of a Patient-Specific Aneurysm Model with Proper Orthogonal Decomposition

    Science.gov (United States)

    Han, Suyue; Chang, Gary Han; Modarres-Sadeghi, Yahya

    2017-11-01

    Wall shear stress (WSS) and oscillatory shear index (OSI) are two of the most widely studied hemodynamic quantities in cardiovascular systems; they have been shown to elicit biological responses of the arterial wall, which could be used to predict aneurysm development and rupture. In this study, a reduced-order model (ROM) of the hemodynamics of a patient-specific cerebral aneurysm is studied. The snapshot Proper Orthogonal Decomposition (POD) is utilized to construct the reduced-order bases of the flow using a CFD training set with known inflow parameters. It was shown that the area of low WSS and high OSI is correlated with higher POD modes. The resulting ROM can reproduce both WSS and OSI computationally for future parametric studies with significantly less computational cost. Agreement was observed between the WSS and OSI values obtained using direct CFD results and ROM results.

  17. Modelling of functional systems of managerial accounting

    Directory of Open Access Journals (Sweden)

    O.V. Fomina

    2017-12-01

    The modern stage of managerial accounting development takes place under the powerful influence of managerial innovations. The article is aimed at the development of an integration model of budgeting and the system of balanced indices within managerial accounting, which will contribute to increasing the relevance of decisions made by managers at different levels of management. As a result of the study, the author proposes a highly pragmatic integration model of budgeting and the system of balanced indices within managerial accounting, realized through the development of a system for gathering, consolidating, analyzing, and interpreting financial and non-financial information; it contributes to increasing the relevance of managerial decisions on the basis of coordination and the effective, purpose-oriented use of both the strategic and operational resources of an enterprise. The effective integration of the system components makes it possible to distribute limited resources rationally, taking into account prospective purposes and strategic initiatives, to carry

  18. Rate heterogeneity across Squamata, misleading ancestral state reconstruction and the importance of proper null model specification.

    Science.gov (United States)

    Harrington, S; Reeder, T W

    2017-02-01

    The binary-state speciation and extinction (BiSSE) model has been used in many instances to identify state-dependent diversification and reconstruct ancestral states. However, recent studies have shown that the standard procedure of comparing the fit of the BiSSE model to constant-rate birth-death models often inappropriately favours the BiSSE model when diversification rates vary in a state-independent fashion. The newly developed HiSSE model enables researchers to identify state-dependent diversification rates while accounting for state-independent diversification at the same time. The HiSSE model also allows researchers to test state-dependent models against appropriate state-independent null models that have the same number of parameters as the state-dependent models being tested. We reanalyse two data sets that originally used BiSSE to reconstruct ancestral states within squamate reptiles and reached surprising conclusions regarding the evolution of toepads within Gekkota and viviparity across Squamata. We used this new method to demonstrate that there are many shifts in diversification rates across squamates. We then fit various HiSSE submodels and null models to the state and phylogenetic data and reconstructed states under these models. We found that there is no single, consistent signal for state-dependent diversification associated with toepads in gekkotans or viviparity across all squamates. Our reconstructions show limited support for the recently proposed hypotheses that toepads evolved multiple times independently in Gekkota and that transitions from viviparity to oviparity are common in Squamata. Our results highlight the importance of considering an adequate pool of models and null models when estimating diversification rate parameters and reconstructing ancestral states. © 2016 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2016 European Society For Evolutionary Biology.

  19. Media Accountability Systems: Models, proposals and outlooks

    Directory of Open Access Journals (Sweden)

    Fernando O. Paulino

    2007-06-01

    This paper analyzes one of the basic actions of SOS-Imprensa, the mechanism to assure Media Accountability, with the goal of proposing a synthesis of models for the Brazilian reality. The article aims to address the possibilities of creating and improving mechanisms to stimulate the democratic press process and to mark out and assure freedom of speech and personal rights with respect to the media. Based on the Press Social Responsibility Theory, the hypothesis is that the experiences analyzed (Communication Council, Press Council, Ombudsman and Readers Council) are alternatives for accountability, mediation and arbitration, seeking visibility, trust and public support in favor of fairer media.

  20. The OntoREA Accounting Model: Ontology-based Modeling of the Accounting Domain

    Directory of Open Access Journals (Sweden)

    Christian Fischer-Pauzenberger

    2017-07-01

    McCarthy developed a framework for modeling the economic rationale of different business transactions along the enterprise value chain, described in his seminal article “The REA Accounting Model – A Generalized Framework for Accounting Systems in a Shared Data Environment”. Originally, the REA accounting model was specified in the entity-relationship (ER) language. Later on, other languages – especially in the form of generic data models and UML class models (UML language) – were used. Recently, the OntoUML language was developed by Guizzardi and used by Gailly et al. for a metaphysical reengineering of the REA enterprise ontology. Although the REA accounting model originally addressed the accounting domain, it is most successfully applied as a reference framework for the conceptual modeling of enterprise systems. The primary research objective of this article is to anchor the REA-based models more deeply in the accounting domain. In order to achieve this objective, essential primitives of the REA model are identified and conceptualized in the OntoUML language within the Asset Liability Equity (ALE) context of the traditional ALE accounting domain.

  1. Uncertainty in Discount Models and Environmental Accounting

    Directory of Open Access Journals (Sweden)

    Donald Ludwig

    2005-12-01

    Cost-benefit analysis (CBA) is controversial for environmental issues, but is nevertheless employed by many governments and private organizations for making environmental decisions. Controversy centers on the practice of economic discounting in CBA for decisions that have substantial long-term consequences, as do most environmental decisions. Customarily, economic discounting has been calculated at a constant exponential rate, a practice that weights the present heavily in comparison with the future. Recent analyses of economic data show that the assumption of constant exponential discounting should be modified to take into account large uncertainties in long-term discount rates. A proper treatment of this uncertainty requires that we consider returns over a plausible range of assumptions about future discounting rates. When returns are averaged in this way, the schemes with the most severe discounting have a negligible effect on the average after a long period of time has elapsed. This re-examination of economic uncertainty provides support for policies that prevent or mitigate environmental damage. We examine these effects for three examples: a stylized renewable resource, management of a long-lived species (Atlantic Right Whales), and lake eutrophication.
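
    A minimal numerical sketch of the effect described above (illustrative discount rates, not the paper's data): when the long-term rate is uncertain, averaging the discount factors rather than the rates yields an effective rate that declines with the time horizon, so the most severely discounting scenarios stop dominating the average.

```python
import numpy as np

# Equally likely long-run discount rates (an assumption for illustration).
rates = np.array([0.01, 0.03, 0.05, 0.07])
years = np.array([1, 10, 50, 100, 200])

for t in years:
    # Average the discount FACTORS across the uncertain rates...
    avg_factor = np.mean(np.exp(-rates * t))
    # ...and back out the single constant rate that would give the same factor.
    effective_rate = -np.log(avg_factor) / t
    print(f"t = {t:4d} yr  mean factor = {avg_factor:.4f}  "
          f"effective rate = {100 * effective_rate:.2f}%")

# The effective rate falls toward the lowest rate in the set as t grows, which is
# why severe-discounting scenarios have a negligible effect on the long-run average.
```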

  2. Fusion of expertise among accounting faculty. Towards an expertise model for academia in accounting.

    NARCIS (Netherlands)

    Njoku, Jonathan C.; van der Heijden, Beatrice; Inanga, Eno L.

    2010-01-01

    This paper aims to portray an accounting faculty expert. It is argued that neither the academic nor the professional orientation alone appears adequate in developing accounting faculty expertise. The accounting faculty expert is supposed to develop into a so-called ‘flexpert’ (Van der Heijden, 2003)

  3. A simulation model for material accounting systems

    International Nuclear Information System (INIS)

    Coulter, C.A.; Thomas, K.E.

    1987-01-01

    A general-purpose model that was developed to simulate the operation of a chemical processing facility for nuclear materials has been extended to describe material measurement and accounting procedures as well. The model now provides descriptors for material balance areas, a large class of measurement instrument types and their associated measurement errors for various classes of materials, the measurement instruments themselves with their individual calibration schedules, and material balance closures. Delayed receipt of measurement results (as for off-line analytical chemistry assay), with interim use of a provisional measurement value, can be accurately represented. The simulation model can be used to estimate inventory difference variances for processing areas that do not operate at steady state, to evaluate the timeliness of measurement information, to determine process impacts of measurement requirements, and to evaluate the effectiveness of diversion-detection algorithms. Such information is usually difficult to obtain by other means. Use of the measurement simulation model is illustrated by applying it to estimate inventory difference variances for two material balance area structures of a fictitious nuclear material processing line
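
    As a toy illustration of what such a measurement simulation can estimate (not the authors' model; every instrument parameter below is invented for the sketch), one can propagate random and calibration measurement errors through a simple material balance and read off the spread of the resulting inventory difference:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy material balance: ID = beginning inventory + receipts - shipments - ending inventory.
true_kg = {"begin": 100.0, "receipts": 250.0, "shipments": 240.0, "end": 110.0}
sign    = {"begin": +1,    "receipts": +1,    "shipments": -1,    "end": -1}

# Invented relative 1-sigma errors per measured term (random + calibration components).
rel_random = {"begin": 0.005, "receipts": 0.003, "shipments": 0.003, "end": 0.005}
rel_calib  = {"begin": 0.002, "receipts": 0.001, "shipments": 0.001, "end": 0.002}

n_trials = 100_000
inv_diff = np.zeros(n_trials)
for key, true in true_kg.items():
    sigma = true * np.hypot(rel_random[key], rel_calib[key])   # combined 1-sigma
    measured = rng.normal(true, sigma, n_trials)
    inv_diff += sign[key] * measured

print(f"mean ID  = {inv_diff.mean():+.3f} kg (true ID is 0)")
print(f"sigma ID = {inv_diff.std():.3f} kg  -> basis for a detection threshold")
```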

  4. Model Reduction Based on Proper Generalized Decomposition for the Stochastic Steady Incompressible Navier--Stokes Equations

    KAUST Repository

    Tamellini, L.

    2014-01-01

    In this paper we consider a proper generalized decomposition method to solve the steady incompressible Navier-Stokes equations with random Reynolds number and forcing term. The aim of such a technique is to compute a low-cost reduced basis approximation of the full stochastic Galerkin solution of the problem at hand. A particular algorithm, inspired by the Arnoldi method for solving eigenproblems, is proposed for an efficient greedy construction of a deterministic reduced basis approximation. This algorithm decouples the computation of the deterministic and stochastic components of the solution, thus allowing reuse of preexisting deterministic Navier-Stokes solvers. It has the remarkable property of only requiring the solution of m uncoupled deterministic problems for the construction of an m-dimensional reduced basis rather than M coupled problems of the full stochastic Galerkin approximation space, with m ≪ M (up to one order of magnitude for the problem at hand in this work). © 2014 Society for Industrial and Applied Mathematics.

  5. Modelling in Accounting. Theoretical and Practical Dimensions

    OpenAIRE

    Teresa Szot-Gabryś

    2010-01-01

    Accounting in the theoretical approach is a scientific discipline based on specific paradigms. In the practical aspect, accounting manifests itself through the introduction of a system for measurement of economic quantities which operates in a particular business entity. A characteristic of accounting is its flexibility and ability of adaptation to information needs of information recipients. One of the main currents in the development of accounting theory and practice is to cover by economic...

  6. An adaptive proper orthogonal decomposition method for model order reduction of multi-disc rotor system

    Science.gov (United States)

    Jin, Yulin; Lu, Kuan; Hou, Lei; Chen, Yushu

    2017-12-01

    The proper orthogonal decomposition (POD) method is a main and efficient tool for order reduction of high-dimensional complex systems in many research fields. However, the robustness problem of this method remains unsolved, although some modified POD methods have been proposed to address it. In this paper, a new adaptive POD method called the interpolation Grassmann manifold (IGM) method is proposed to address the weakness of the local property of the interpolation tangent-space of Grassmann manifold (ITGM) method in a wider parametric region. This method is demonstrated here on a nonlinear rotor system of 33 degrees of freedom (DOFs) with a pair of liquid-film bearings and a pedestal looseness fault. The motion region of the rotor system is divided into two parts: a simple motion region and a complex motion region. The adaptive POD method is compared with the ITGM method for large and small parameter spans in the two parametric regions to present the advantage of this method and the disadvantage of the ITGM method. The comparisons of the responses are used to verify the accuracy and robustness of the adaptive POD method, and the computational efficiency is also analyzed. As a result, the new adaptive POD method has strong robustness and high computational efficiency and accuracy in a wide scope of parameters.

  7. Display of the information model accounting system

    OpenAIRE

    Matija Varga

    2011-01-01

    This paper presents the accounting information system in public companies, business technology matrix and data flow diagram. The paper describes the purpose and goals of the accounting process, matrix sub-process and data class. Data flow in the accounting process and the so-called general ledger module are described in detail. Activities of the financial statements and determining the financial statements of the companies are mentioned as well. It is stated how the general ledger...

  8. THE ROLE AND THE IMPORTANCE IN CHOOSING THE PROPER MANAGERIAL ACCOUNTING CONCEPTS REGARDING THE NEED FOR INFORMATION ON THE DECISION MAKING FACTORS WITHIN THE COMPANIES

    Directory of Open Access Journals (Sweden)

    Delia David

    2014-09-01

    Both the theory and modern practice of management accounting have adopted two general concepts regarding its organizational process, structuring accounting in either an integrated system or a dualist one. We aim at emphasizing the characteristics, the role and the importance of these concepts with regard to the calculation process and the accounting entries of the costs generated by economic entities with an eye to gaining profit. Choosing one of the previously mentioned concepts must be done taking into consideration the specifics of the company in question as well as the information necessary to the manager, who needs to find the optimum solution in order to achieve the rehabilitation and efficiency of the activity to be carried out. The subject of this work is approached both theoretically and practically, relying on the following research methods: the comparison method, the observation method and the case study method.

  9. DMFCA Model as a Possible Way to Detect Creative Accounting and Accounting Fraud in an Enterprise

    Directory of Open Access Journals (Sweden)

    Jindřiška Kouřilová

    2013-05-01

    The quality of reported accounting data, as well as the quality and behaviour of their users, influences the efficiency of an enterprise’s management; its assessment could therefore change as well. To identify creative accounting and fraud, several methods and tools have been used. In this paper we present our proposal of the DMFCA (Detection Model Material Flow Cost Accounting) balance model, based on environmental accounting and on the MFCA (Material Flow Cost Accounting) as its method. The following balance areas are included: material, financial and legislative. Using an analysis of the strengths and weaknesses of the model, its possible use within a production and business company was assessed. Its possible use for the detection of some creative accounting techniques was also assessed. The model is developed in detail for practical use, and its theoretical aspects are also described.

  10. Assessing Sexual Dicromatism: The Importance of Proper Parameterization in Tetrachromatic Visual Models.

    Directory of Open Access Journals (Sweden)

    Pierre-Paul Bitton

    Perceptual models of animal vision have greatly contributed to our understanding of animal-animal and plant-animal communication. The receptor-noise model of color contrasts has been central to this research, as it quantifies the difference between two colors for any visual system of interest. However, if the properties of the visual system are unknown, assumptions regarding parameter values must be made, generally with unknown consequences. In this study, we conduct a sensitivity analysis of the receptor-noise model using avian visual system parameters to systematically investigate the influence of variation in light environment, photoreceptor sensitivities, photoreceptor densities, and light transmission properties of the ocular media and the oil droplets. We calculated the chromatic contrast of 15 plumage patches to quantify a dichromatism score for 70 species of Galliformes, a group of birds that display a wide range of sexual dimorphism. We found that the photoreceptor densities and the wavelength of maximum sensitivity of the short-wavelength-sensitive photoreceptor 1 (SWS1) can change dichromatism scores by 50% to 100%. In contrast, the light environment, transmission properties of the oil droplets, transmission properties of the ocular media, and the peak sensitivities of the cone photoreceptors had a smaller impact on the scores. By investigating the effect of varying two or more parameters simultaneously, we further demonstrate that improper parameterization could lead to differences between calculated and actual contrasts of more than 650%. Our findings demonstrate that improper parameterization of tetrachromatic visual models can have very large effects on measures of dichromatism scores, potentially leading to erroneous inferences. We urge more complete characterization of avian retinal properties and recommend that researchers either determine whether their species of interest possess an ultraviolet or near-ultraviolet sensitive SWS1
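
    For context, the receptor-noise model referred to here turns receptor quantum catches and channel noise into a color distance in just-noticeable differences. The sketch below implements the standard tetrachromatic form of that model (Vorobyev & Osorio 1998); the quantum catches, cone density ratio and Weber fraction are made-up illustrative values, not data from this study.

```python
import numpy as np
from itertools import combinations

def receptor_noise_contrast(q1, q2, rel_density, weber_ref=0.1):
    """Chromatic contrast (in just-noticeable differences) between two colors
    for a tetrachromatic eye, following the receptor-noise-limited model."""
    q1, q2, rel_density = (np.asarray(v, dtype=float) for v in (q1, q2, rel_density))
    df = np.log(q1) - np.log(q2)                 # Weber-Fechner receptor signals
    # Channel noise scales with 1/sqrt(relative cone density); weber_ref is the
    # Weber fraction of the most abundant cone class.
    e = weber_ref * np.sqrt(rel_density.max() / rel_density)

    idx = range(4)
    num = 0.0
    for i, j in combinations(idx, 2):
        k, l = (m for m in idx if m not in (i, j))   # the complementary pair
        num += (e[i] * e[j]) ** 2 * (df[k] - df[l]) ** 2
    den = sum((e[i] * e[j] * e[k]) ** 2 for i, j, k in combinations(idx, 3))
    return float(np.sqrt(num / den))

# Hypothetical quantum catches (UVS/VS, SWS, MWS, LWS) for two plumage patches
# and a plausible-looking cone density ratio (illustrative numbers only).
patch_a = [0.08, 0.20, 0.45, 0.60]
patch_b = [0.10, 0.18, 0.40, 0.65]
print(receptor_noise_contrast(patch_a, patch_b, rel_density=[1, 2, 2, 4]))
```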

  11. Establishing a Proper Model of Tobacco Dependence: Influence of Age and Tobacco Smoke Constituents

    OpenAIRE

    Gellner, Candice Ann

    2017-01-01

    Cigarette smoking is the leading preventable cause of death in the United States. Of those who smoke, 9 out of 10 report trying their first cigarette before the age of 18. Although most people who initiate tobacco use are teenagers, animal models for studying tobacco dependence have traditionally focused on how adult animals initiate, withdraw from, and relapse to cigarette smoking. Furthermore, cigarette smoke contains more than 7,000 constituents, including nicotine, yet pre-clinical resea...

  12. Display of the information model accounting system

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2011-12-01

    This paper presents the accounting information system in public companies, business technology matrix and data flow diagram. The paper describes the purpose and goals of the accounting process, matrix sub-process and data class. Data flow in the accounting process and the so-called general ledger module are described in detail. Activities of the financial statements and determining the financial statements of the companies are mentioned as well. It is stated how the general ledger module should function and what characteristics it must have. Line graphs will depict indicators of the company’s business success, indebtedness and company’s efficiency coefficients based on financial balance reports, and profit and loss report.

  13. A low-cost, goal-oriented ‘compact proper orthogonal decomposition’ basis for model reduction of static systems

    KAUST Repository

    Carlberg, Kevin

    2010-12-10

    A novel model reduction technique for static systems is presented. The method is developed using a goal-oriented framework, and it extends the concept of snapshots for proper orthogonal decomposition (POD) to include (sensitivity) derivatives of the state with respect to system input parameters. The resulting reduced-order model generates accurate approximations due to its goal-oriented construction and the explicit 'training' of the model for parameter changes. The model is less computationally expensive to construct than typical POD approaches, since efficient multiple right-hand side solvers can be used to compute the sensitivity derivatives. The effectiveness of the method is demonstrated on a parameterized aerospace structure problem. © 2010 John Wiley & Sons, Ltd.

  14. Modelling Seasonal GWR of Daily PM2.5 with Proper Auxiliary Variables for the Yangtze River Delta

    Directory of Open Access Journals (Sweden)

    Man Jiang

    2017-04-01

    Over the past decades, regional haze episodes have frequently occurred in eastern China, especially in the Yangtze River Delta (YRD). Satellite-derived Aerosol Optical Depth (AOD) has been used to retrieve the spatial coverage of PM2.5 concentrations. To improve the retrieval accuracy of the daily AOD-PM2.5 model, various auxiliary variables such as meteorological or geographical factors have been adopted into the Geographically Weighted Regression (GWR) model. However, these variables are always arbitrarily selected without deep consideration of their potentially varying temporal or spatial contributions to the model performance. In this manuscript, we put forward an automatic procedure to select proper auxiliary variables from meteorological and geographical factors and obtain their optimal combinations to construct four seasonal GWR models. We employ two different schemes to comprehensively test the performance of our proposed GWR models: (1) comparison with other regular GWR models by varying the number of auxiliary variables; and (2) comparison with observed ground-level PM2.5 concentrations. The results show that our GWR models of “AOD + 3” with three common meteorological variables generally perform better than all the other GWR models involved. Our models also show powerful prediction capabilities for PM2.5 concentrations with only slight overfitting. The determination coefficients R2 of our seasonal models are 0.8259 in spring, 0.7818 in summer, 0.8407 in autumn, and 0.7689 in winter. Also, the seasonal models in summer and autumn behave better than those in spring and winter. The comparison between seasonal and yearly models further validates the specific seasonal pattern of auxiliary variables of the GWR model in the YRD. We also stress the importance of key variables and propose a selection process in the AOD-PM2.5 model. Our work validates the significance of proper auxiliary variables in modelling the AOD-PM2.5 relationships and
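
    As background on the method itself, a geographically weighted regression fits a separate weighted least-squares model at each location, with weights that decay with distance. The sketch below shows that core step with a Gaussian kernel on synthetic coordinates and predictors (not the YRD data); the spatially varying slope stands in for a location-dependent AOD-PM2.5 relationship.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monitoring sites: coordinates plus an intercept and three predictors.
n = 300
coords = rng.uniform(0, 100, size=(n, 2))
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])  # intercept, "AOD", met1, met2

# The AOD slope drifts across the domain, which is what GWR is meant to capture.
slope = 20.0 + 0.2 * coords[:, 0]
y = 30.0 + slope * X[:, 1] + 5.0 * X[:, 2] - 3.0 * X[:, 3] + rng.normal(0, 2, n)

def gwr_coefficients(u, bandwidth=25.0):
    """Local weighted least-squares fit at location u with a Gaussian kernel."""
    d2 = np.sum((coords - u) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    XtW = X.T * w                       # equivalent to X.T @ diag(w)
    return np.linalg.solve(XtW @ X, XtW @ y)

# Local coefficients differ from place to place; the second entry tracks the
# spatially varying slope (roughly 22 in the west, roughly 38 in the east).
print(gwr_coefficients(np.array([10.0, 50.0])).round(2))
print(gwr_coefficients(np.array([90.0, 50.0])).round(2))
```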

  15. Studi Model Penerimaan Tehnologi (Technology Acceptance Model) Novice Accountant [A study of the Technology Acceptance Model among novice accountants]

    OpenAIRE

    Rustiana, Rustiana

    2006-01-01

    This study investigates adoption or application of behavior information technologyacceptance. Davis' Technology Acceptance Model is employed to explain perceived usefulness, perceived ease of use, and intention to use in information systems. The respondents were 228 accounting students in management information system. Data was collected by questionnaire and then analyzed by using linear regression analysis and independent t-test. The results are in line with most of the hypotheses, only hypo...

  16. Stationary flow fields prediction of variable physical domain based on proper orthogonal decomposition and kriging surrogate model

    Directory of Open Access Journals (Sweden)

    Yasong Qiu

    2015-02-01

    In this paper, a new flow field prediction method, which is independent of the governing equations, is developed to predict stationary flow fields of variable physical domains. Predicted flow fields come from a linear superposition of selected basis modes generated by proper orthogonal decomposition (POD). Instead of traditional projection methods, a kriging surrogate model is used to calculate the superposition coefficients by building approximate function relationships between the profile geometry parameters of the physical domain and these coefficients. In this context, the problem which troubles the traditional POD-projection method due to viscosity and compressibility is avoided in the whole process. Moreover, there are no constraints on the inner product form, so two simple forms are applied to improve computational efficiency and cope with the variable physical domain problem. An iterative algorithm is developed to determine how many of the leading basis modes should be used in the prediction. Testing results prove the feasibility of this new method for subsonic flow fields, but also show that it is not suitable for transonic flow fields because of poorly predicted shock waves.
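
    A minimal sketch of the projection-free idea described above, using toy one-dimensional "flow fields" driven by a single geometry parameter and scikit-learn's Gaussian process regressor as a stand-in kriging surrogate (all of it illustrative, not the authors' setup):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy "flow fields": 1-D profiles that depend smoothly on a geometry parameter p.
x = np.linspace(0.0, 1.0, 200)
def field(p):
    return np.tanh(10 * (x - 0.3 - 0.4 * p)) + 0.5 * p * np.sin(2 * np.pi * x)

p_train = np.linspace(0.0, 1.0, 15)
snapshots = np.column_stack([field(p) for p in p_train])    # columns are snapshots

# POD basis from the (mean-centered) snapshot matrix.
mean = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
Phi = U[:, :5]                                              # keep 5 modes

# Kriging (Gaussian process) surrogate mapping p -> POD coefficients,
# replacing the usual Galerkin projection step.
coeffs = Phi.T @ (snapshots - mean)                         # shape (5, n_train)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-8)
gp.fit(p_train.reshape(-1, 1), coeffs.T)

# Predict the field at an unseen parameter value by superposing the modes.
p_new = 0.37
a = gp.predict(np.array([[p_new]]))[0]
prediction = mean[:, 0] + Phi @ a
print("max abs error vs. true field:", np.max(np.abs(prediction - field(p_new))))
```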

  17. Accountability

    Science.gov (United States)

    Fielding, Michael; Inglis, Fred

    2017-01-01

    This contribution republishes extracts from two important articles published around 2000 concerning the punitive accountability system suffered by English primary and secondary schools. The first concerns the inspection agency Ofsted, and the second managerialism. Though they do not directly address assessment, they are highly relevant to this…

  18. The financial accounting model from a system dynamics' perspective

    NARCIS (Netherlands)

    Melse, E.

    2006-01-01

    This paper explores the foundation of the financial accounting model. We examine the properties of the accounting equation as the principal algorithm for the design and the development of a System Dynamics model. Key to the perspective is the foundational requirement that resolves the temporal
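
    To make the "accounting equation as principal algorithm" idea above concrete, here is a minimal hedged sketch (invented accounts and transactions, not the paper's model) of a double-entry ledger in which every posting preserves the invariant Assets = Liabilities + Equity, the same invariant a stock-and-flow System Dynamics model would carry.

```python
from dataclasses import dataclass, field

@dataclass
class Ledger:
    # Stocks of the accounting equation: Assets = Liabilities + Equity.
    assets: dict = field(default_factory=lambda: {"cash": 0.0, "inventory": 0.0})
    liabilities: dict = field(default_factory=lambda: {"loans": 0.0})
    equity: dict = field(default_factory=lambda: {"capital": 0.0, "retained": 0.0})

    def post(self, debits, credits):
        """Apply a balanced double-entry transaction: total debits == total credits."""
        assert abs(sum(debits.values()) - sum(credits.values())) < 1e-9
        for (side, name), amount in debits.items():
            sign = +1 if side == "assets" else -1       # a debit increases assets,
            getattr(self, side)[name] += sign * amount  # decreases claims
        for (side, name), amount in credits.items():
            sign = -1 if side == "assets" else +1       # a credit does the opposite
            getattr(self, side)[name] += sign * amount

    def check(self):
        a = sum(self.assets.values())
        le = sum(self.liabilities.values()) + sum(self.equity.values())
        assert abs(a - le) < 1e-9, "accounting equation violated"
        return a, le

book = Ledger()
book.post({("assets", "cash"): 1000.0}, {("equity", "capital"): 1000.0})    # owner invests
book.post({("assets", "inventory"): 400.0}, {("assets", "cash"): 400.0})    # buy stock
book.post({("assets", "cash"): 300.0}, {("liabilities", "loans"): 300.0})   # take a loan
print(book.check())   # both sides equal: the invariant holds after every flow
```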

  19. Accountancy Modeling on Intangible Fixed Assets in Terms of the Main Provisions of International Accounting Standards

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2014-12-01

    Intangible fixed assets are of great importance for the progress of economic units. In recent years, new approaches and additions to the old standards have been developed, so that intangible assets have gained prominence both in the economic environment and in academia. We intend to develop a practical study on the main approaches to the accounting modeling of the intangibles that impact the brand development of the research company PRORESEARCH SRL.

  20. Modelling adversary actions against a nuclear material accounting system

    International Nuclear Information System (INIS)

    Lim, J.J.; Huebel, J.G.

    1979-01-01

    A typical nuclear material accounting system employing double-entry bookkeeping is described. A logic diagram is used to model the interactions of the accounting system and the adversary when he attempts to thwart it. Boolean equations are derived from the logic diagram; solution of these equations yields the accounts and records through which the adversary may disguise an SSNM theft and the collusion requirements needed to accomplish this feat. Some technical highlights of the logic diagram are also discussed

  1. Project MAP: Model Accounting Plan for Special Education. Final Report.

    Science.gov (United States)

    Rossi, Robert J.

    The Model Accounting Plan (MAP) is a demographic accounting system designed to meet three major goals related to improving planning, evaluation, and monitoring of special education programs. First, MAP provides local-level data for administrators and parents to monitor the progress, transition patterns, expected attainments, and associated costs…

  2. The Relevance of the CIPP Evaluation Model for Educational Accountability.

    Science.gov (United States)

    Stufflebeam, Daniel L.

    The CIPP Evaluation Model was originally developed to provide timely information in a systematic way for decision making, which is a proactive application of evaluation. This article examines whether the CIPP model also serves the retroactive purpose of providing information for accountability. Specifically, can the CIPP Model adequately assist…

  3. Models and Rules of Evaluation in International Accounting

    Directory of Open Access Journals (Sweden)

    Niculae Feleaga

    2006-04-01

    The accounting procedures cannot be analyzed without a previous evaluation. Value is in general a very subjective issue, usually the result of a monetary evaluation made of a specific asset, group of assets or entities, or of some rendered services. Within the economic sciences, value comes from its very own deep history. In accounting, the concept of value had a late and fragile start. The term value must not be misinterpreted as being the same thing as cost, even though value is frequently measured through costs. At the origin of the international accounting standards lies the framework for preparing, presenting and disclosing the financial statements. The framework stands as a reference matrix, as a standard of standards, as a constitution of financial accounting. According to the international framework, the financial statements use different evaluation bases: the historical cost, the current cost, the realisable (settlement) value, and the present value (the present value of cash flows). Choosing the evaluation basis and the capital maintenance concept will eventually determine the accounting evaluation model used in preparing the financial statements of a company. The many accounting evaluation models differ from one another in the degrees of relevance and reliability of the accounting information, and therefore accountants (the preparers of financial statements) must try to balance these two main qualitative characteristics of financial information.

  4. Models and Rules of Evaluation in International Accounting

    Directory of Open Access Journals (Sweden)

    Liliana Feleaga

    2006-06-01

    The accounting procedures cannot be analyzed without a previous evaluation. Value is in general a very subjective issue, usually the result of a monetary evaluation made of a specific asset, group of assets or entities, or of some rendered services. Within the economic sciences, value comes from its very own deep history. In accounting, the concept of value had a late and fragile start. The term value must not be misinterpreted as being the same thing as cost, even though value is frequently measured through costs. At the origin of the international accounting standards lies the framework for preparing, presenting and disclosing the financial statements. The framework stands as a reference matrix, as a standard of standards, as a constitution of financial accounting. According to the international framework, the financial statements use different evaluation bases: the historical cost, the current cost, the realisable (settlement) value, and the present value (the present value of cash flows). Choosing the evaluation basis and the capital maintenance concept will eventually determine the accounting evaluation model used in preparing the financial statements of a company. The many accounting evaluation models differ from one another in the degrees of relevance and reliability of the accounting information, and therefore accountants (the preparers of financial statements) must try to balance these two main qualitative characteristics of financial information.

  5. Can An Amended Standard Model Account For Cold Dark Matter?

    International Nuclear Information System (INIS)

    Goldhaber, Maurice

    2004-01-01

    It is generally believed that one has to invoke theories beyond the Standard Model to account for cold dark matter particles. However, there may be undiscovered universal interactions that, if added to the Standard Model, would lead to new members of the three generations of elementary fermions that might be candidates for cold dark matter particles

  6. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds on the probability of ruin for the classical risk process extended with a constant interest

  7. Driving Strategic Risk Planning With Predictive Modelling For Managerial Accounting

    DEFF Research Database (Denmark)

    Nielsen, Steen; Pontoppidan, Iens Christian

    Currently, risk management in management/managerial accounting is treated as deterministic. Although it is well known that risk estimates are necessarily uncertain or stochastic, until recently the methodology required to handle stochastic risk-based elements appeared to be impractical and too mathematical. The ultimate purpose of this paper is to “make the risk concept procedural and analytical” and to argue that accountants should now include stochastic risk management as a standard tool. Drawing on mathematical modelling and statistics, this paper methodically develops a risk analysis approach for managerial accounting and shows how it can be used to determine the impact of different types of risk assessment input parameters on the variability of important outcome measures. The purpose is to: (i) point out the theoretical necessity of a stochastic risk framework; (ii) present a stochastic framework...
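
    As a generic illustration of what such a stochastic framework computes (a minimal sketch with invented input distributions, not the authors' model), a Monte Carlo run propagates uncertain risk parameters through a simple profit calculation and reports the variability of the outcome measure:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Uncertain (stochastic) risk inputs instead of single deterministic estimates.
demand     = rng.normal(10_000, 1_500, n)          # units sold
unit_price = rng.triangular(9.0, 10.0, 12.0, n)    # selling price per unit
unit_cost  = rng.normal(6.0, 0.5, n)               # variable cost per unit
fixed_cost = 25_000.0

profit = demand * (unit_price - unit_cost) - fixed_cost

print(f"expected profit : {profit.mean():,.0f}")
print(f"std. deviation  : {profit.std():,.0f}")
print(f"P(loss)         : {(profit < 0).mean():.1%}")
print(f"5th percentile  : {np.percentile(profit, 5):,.0f}")
# Repeating the run while holding one input fixed at its mean shows how much of the
# outcome variability each risk parameter contributes.
```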

  8. Modeling of Accounting Doctoral Thesis with Emphasis on Solution for Financial Problems

    Directory of Open Access Journals (Sweden)

    F. Mansoori

    2015-02-01

    With the end of the instruction period and the increase in graduate students and research budgets, accounting knowledge in Iran has entered the field of research, and a number of accounting projects have been implemented in the real world. As a consequence, varied experience in implementing the accounting standards has been gained. It was expected that this experience would help to solve the country's financial problems; in spite of many research efforts, however, we still have many financial and accounting problems in our country. PhD projects can be considered one of the important means of improving university subjects, including accounting. PhD projects are team work and are legitimized by supervisory teams in universities. It is obvious that applied projects should solve part of the problems in the accounting field, but unfortunately this is not happening in the real world. The questions that come to mind are why the output of applied, knowledge-based projects has not been able to shed light on the mentioned problems, and why politicians in difficult situations prefer to rely on their own previous experience in important decision making instead of using consultants' knowledge-based suggestions. In this research we study the reasons that prevent applied PhD projects from succeeding in the real world, which relate to the view that the policy suggestions produced by knowledge-based projects are not qualified enough for implementation. For this purpose, the indicators of an applied PhD project were considered, 110 experts categorized these indicators, and in a comprehensive study applied PhD accounting projects were compared with one another. As a result, the problems of the studied research projects were identified and a proper, applied model for creating applied research was developed.

  9. Accounting for Business Models: Increasing the Visibility of Stakeholders

    Directory of Open Access Journals (Sweden)

    Colin Haslam

    2015-01-01

    Purpose: This paper conceptualises a firm’s business model employing stakeholder theory as a central organising element to help inform the purpose and objective(s) of business model financial reporting and disclosure. Framework: Firms interact with a complex network of primary and secondary stakeholders to secure the value proposition of a firm’s business model. This value proposition is itself a complex amalgam of value-creating, value-capturing and value-manipulating arrangements with stakeholders. From a financial accounting perspective, the purpose of the value proposition for a firm’s business model is to sustain liquidity and solvency as a going concern. Findings: This article argues that stakeholder relations impact upon the financial viability of a firm’s business model value proposition. However, current financial reporting by function of expenses and the central organising objectives of the accounting conceptual framework conceal firm-stakeholder relations and their impact on reported financials. Practical implications: The practical implication of our paper is that ‘Business Model’ financial reporting would require a reorientation in the accounting conceptual framework that defines the objectives and purpose of financial reporting. This reorientation would involve reporting about stakeholder relations and their impact on a firm's financials, not simply reporting financial information to ‘investors’. Social implications: Business model financial reporting has the potential to be stakeholder inclusive because the numbers and narratives reported by firms in their annual financial statements will increase the visibility of stakeholder relations and how these are being managed. Originality/value of the paper: This paper’s original perspective is that it argues that a firm’s business model is structured out of stakeholder relations. It presents the firm’s value proposition as the product of value creating, capturing and

  10. Accounting for small scale heterogeneity in ecohydrologic watershed models

    Science.gov (United States)

    Burke, W.; Tague, C.

    2017-12-01

    Spatially distributed ecohydrologic models are inherently constrained by the spatial resolution of their smallest units, below which land and processes are assumed to be homogenous. At coarse scales, heterogeneity is often accounted for by computing stores and fluxes of interest over a distribution of land cover types (or other sources of heterogeneity) within spatially explicit modeling units. However, this approach ignores spatial organization and the lateral transfer of water and materials downslope. The challenge is to account both for the role of flow network topology and for fine-scale heterogeneity. We present a new approach that defines two levels of spatial aggregation and integrates a spatially explicit network approach with a flexible representation of finer-scale aspatial heterogeneity. Critically, this solution does not simply increase the resolution of the smallest spatial unit, and so, by comparison, results in improved computational efficiency. The approach is demonstrated by adapting the Regional Hydro-Ecologic Simulation System (RHESSys), an ecohydrologic model widely used to simulate climate, land use, and land management impacts. We illustrate the utility of our approach by showing how the model can be used to better characterize forest thinning impacts on ecohydrology. Forest thinning is typically done at the scale of individual trees, and yet management responses of interest include impacts on watershed-scale hydrology and on downslope riparian vegetation. Our approach allows us to characterize the variability in tree size/carbon reduction and water transfers between neighboring trees while still capturing hillslope- to watershed-scale effects. Our illustrative example demonstrates that accounting for these fine-scale effects can substantially alter model estimates, in some cases shifting the impacts of thinning on downslope water availability from increases to decreases. We conclude by describing other use cases that may benefit from this approach

  11. Accounting for household heterogeneity in general equilibrium economic growth models

    International Nuclear Information System (INIS)

    Melnikov, N.B.; O'Neill, B.C.; Dalton, M.G.

    2012-01-01

    We describe and evaluate a new method of aggregating heterogeneous households that allows for the representation of changing demographic composition in a multi-sector economic growth model. The method is based on a utility and labor supply calibration that takes into account time variations in demographic characteristics of the population. We test the method using the Population-Environment-Technology (PET) model by comparing energy and emissions projections employing the aggregate representation of households to projections representing different household types explicitly. Results show that the difference between the two approaches in terms of total demand for energy and consumption goods is negligible for a wide range of model parameters. Our approach allows the effects of population aging, urbanization, and other forms of compositional change on energy demand and CO2 emissions to be estimated and compared in a computationally manageable manner using a representative household under assumptions and functional forms that are standard in economic growth models.

  12. ACCOUNTING MODELS FOR OUTWARD PROCESSING TRANSACTIONS OF GOODS

    Directory of Open Access Journals (Sweden)

    Lucia PALIU-POPA

    2010-09-01

    Full Text Available In modern international trade, commercial operations, including outward processing transactions of goods, are expanding significantly. The motivations for expanding these international economic affairs, which take place in a complex legal framework, include capitalizing on the production capacity of some partners and on the brand of others, leading to a significant commercial profit and thus increasing the currency contribution, without excluding the high and complex risks involved, both trading and extra-trading. Starting from the content of processing transactions of goods, as part of combined commercial operations, and after clarifying the tax matters which affect the entry in the accounts, we present models for reflecting, in the accounting of an entity established in Romania, the operations of outward processing of goods, whether the provider of such operations belongs to the extra-Community or the Community area.

  13. Principles of Proper Validation

    DEFF Research Database (Denmark)

    Esbensen, Kim; Geladi, Paul

    2010-01-01

    Validation in chemometrics is presented using the exemplar context of multivariate calibration/prediction. A phenomenological analysis of common validation practices in data analysis and chemometrics leads to formulation of a set of generic Principles of Proper Validation (PPV), which is based...

  14. A tulajdonnevek pszicho- és neurolingvisztikája. Vizsgálati szempontok és modellek a tulajdonnevek feldolgozásáról [The psycho- and neurolinguistics of proper names. Aspects and models of analysis on processing proper names]

    Directory of Open Access Journals (Sweden)

    Reszegi, Katalin

    2014-12-01

    Full Text Available This paper provides an overview of the results of psycho- and neurolinguistic examinations into the mental processes involving proper names (i.e. storing, processing, retrieving proper names). We can denote entities of various types with the help of proper names, and although most of these types are universal, there are in fact some cultural differences. In the fields of science concerned, that is, in psycho- and neurolinguistics and in neuropsychology, attention is given almost exclusively to anthroponyms; mental and neurological features of toponyms and other name types are much less examined. Processing names is generally believed to display more difficulties than processing common nouns, and these difficulties present themselves more and more strongly with age. In connection with the special identifying function and semantic features of proper names, many researchers assume that we process the two groups of words in different ways. This paper, reflecting also on these assumptions, summarizes and explains the results of research into (a) selective anomia affecting monolingual speakers (word-finding disturbances); (b) localization; (c) reaction time measurement; and (d) speech disfluency concerning proper names (especially the “tip of the tongue” phenomenon). The author also presents the models of processing proper names, examining to what degree these models can be reconciled with our knowledge of the acquisition of proper names. Finally, the results and possible explanations of the small amount of research into the representation and processing of proper names by bilingual speakers are discussed.

  15. Accommodating environmental variation in population models: metaphysiological biomass loss accounting.

    Science.gov (United States)

    Owen-Smith, Norman

    2011-07-01

    1. There is a pressing need for population models that can reliably predict responses to changing environmental conditions and diagnose the causes of variation in abundance in space as well as through time. In this 'how to' article, it is outlined how standard population models can be modified to accommodate environmental variation in a heuristically conducive way. This approach is based on metaphysiological modelling concepts linking populations within food web contexts and underlying behaviour governing resource selection. Using population biomass as the currency, population changes can be considered at fine temporal scales taking into account seasonal variation. Density feedbacks are generated through the seasonal depression of resources even in the absence of interference competition. 2. Examples described include (i) metaphysiological modifications of Lotka-Volterra equations for coupled consumer-resource dynamics, accommodating seasonal variation in resource quality as well as availability, resource-dependent mortality and additive predation, (ii) spatial variation in habitat suitability evident from the population abundance attained, taking into account resource heterogeneity and consumer choice using empirical data, (iii) accommodating population structure through the variable sensitivity of life-history stages to resource deficiencies, affecting susceptibility to oscillatory dynamics and (iv) expansion of density-dependent equations to accommodate various biomass losses reducing population growth rate below its potential, including reductions in reproductive outputs. Supporting computational code and parameter values are provided. 3. The essential features of metaphysiological population models include (i) the biomass currency enabling within-year dynamics to be represented appropriately, (ii) distinguishing various processes reducing population growth below its potential, (iii) structural consistency in the representation of interacting populations and
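
    As an illustration of the kind of coupled consumer-resource biomass dynamics sketched in point (i) above, the following is a minimal, hypothetical Python example; the functional forms and every parameter value are assumptions chosen for demonstration and are not taken from the paper or its supporting code.

```python
import numpy as np

def seasonal_factor(t_days, amplitude=0.6):
    """Simple sinusoidal proxy for within-year variation in resource growth."""
    return 1.0 + amplitude * np.sin(2.0 * np.pi * t_days / 365.0)

def simulate(days=5 * 365, dt=1.0,
             r=0.05, K=1000.0,          # resource growth rate (1/day) and capacity (biomass)
             a=0.002, h=0.5,            # consumer intake: attack rate and handling time
             conv=0.4,                  # conversion of intake into consumer biomass
             metab=0.01, mort=0.005):   # metabolic loss and baseline mortality (1/day)
    """Euler integration of a seasonally forced consumer-resource biomass model."""
    R, C = 500.0, 50.0                  # initial resource and consumer biomass
    history = []
    for step in range(int(days / dt)):
        t = step * dt
        intake = a * R / (1.0 + a * h * R) * C        # type-II functional response
        dR = r * seasonal_factor(t) * R * (1.0 - R / K) - intake
        dC = conv * intake - (metab + mort) * C       # biomass gains minus losses
        R = max(R + dR * dt, 0.0)
        C = max(C + dC * dt, 0.0)
        history.append((t, R, C))
    return np.array(history)

if __name__ == "__main__":
    traj = simulate()
    print("final resource biomass: %.1f, consumer biomass: %.1f" % (traj[-1, 1], traj[-1, 2]))
```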

  16. Development of a Reduced-Order Model for Reacting Gas-Solids Flow using Proper Orthogonal Decomposition

    Energy Technology Data Exchange (ETDEWEB)

    McDaniel, Dwayne; Dulikravich, George; Cizmas, Paul

    2017-11-27

    This report summarizes the objectives, tasks and accomplishments made during the three-year duration of this research project. The report presents the results obtained by applying advanced computational techniques to develop reduced-order models (ROMs) in the case of reacting multiphase flows based on high-fidelity numerical simulation of gas-solids flow structures in risers and vertical columns obtained by the Multiphase Flow with Interphase eXchanges (MFIX) software. The research includes a numerical investigation of reacting and non-reacting gas-solids flow systems and computational analysis involving model development to accelerate the scale-up process for the design of fluidization systems by providing accurate solutions that match the full-scale models. The computational work contributes to the development of a methodology for obtaining ROMs that is applicable to the system of gas-solid flows. Finally, the validity of the developed ROMs is evaluated by comparing the results against those obtained using the MFIX code. Additionally, the robustness of existing POD-based ROMs for multiphase flows is improved by avoiding non-physical solutions of the gas void fraction and ensuring that the reduced kinetics models used for reactive flows in fluidized beds are thermodynamically consistent.
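
    For readers unfamiliar with how a POD basis is extracted from high-fidelity snapshots, the following generic Python sketch shows the snapshot-SVD step and the projection/reconstruction it enables. It does not reproduce the MFIX-specific workflow or the Galerkin projection of the governing equations; the array shapes, retained-energy criterion and random test data are illustrative assumptions.

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Compute a POD basis from a snapshot matrix.

    snapshots : array of shape (n_dof, n_snapshots), e.g. a field sampled at
                different times from a high-fidelity simulation.
    energy    : fraction of the snapshot 'energy' (squared singular values)
                to retain when truncating the basis.
    """
    mean = snapshots.mean(axis=1, keepdims=True)
    fluctuations = snapshots - mean
    # Thin SVD of the mean-subtracted snapshots; columns of U are POD modes.
    U, s, _ = np.linalg.svd(fluctuations, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cumulative, energy)) + 1
    return mean, U[:, :r], r

def project(field, mean, modes):
    """Reduced coordinates of a full-order field."""
    return modes.T @ (field - mean.ravel())

def reconstruct(coeffs, mean, modes):
    """Approximate full-order field from reduced coordinates."""
    return mean.ravel() + modes @ coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((2000, 80))          # stand-in for simulation snapshots
    mean, modes, r = pod_basis(X, energy=0.95)
    a = project(X[:, 0], mean, modes)
    err = np.linalg.norm(X[:, 0] - reconstruct(a, mean, modes)) / np.linalg.norm(X[:, 0])
    print(f"retained {r} modes, relative reconstruction error {err:.3f}")
```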

  17. Nonlinear model-based control of the Czochralski process III: Proper choice of manipulated variables and controller parameter scheduling

    Science.gov (United States)

    Neubert, M.; Winkler, J.

    2012-12-01

    This contribution continues an article series [1,2] about the nonlinear model-based control of the Czochralski crystal growth process. The key idea of the presented approach is to use a sophisticated combination of nonlinear model-based and conventional (linear) PI controllers for tracking of both crystal radius and growth rate. Using heater power and pulling speed as manipulated variables, several controller structures are possible. The present part tries to systematize the properties of the materials to be grown in order to get unambiguous decision criteria for the most profitable choice of the controller structure. For this purpose a material-specific constant M called interface mobility and a more process-specific constant S called system response number are introduced. While the first one summarizes important material properties like thermal conductivity and latent heat, the latter one characterizes the process by evaluating the average axial thermal gradients at the phase boundary and the actual growth rate at which the crystal is grown. Furthermore these characteristic numbers are useful for establishing a scheduling strategy for the PI controller parameters in order to improve the controller performance. Finally, both numbers give a better understanding of the general thermal system dynamics of the Czochralski technique.

  18. Accounting for model error in Bayesian solutions to hydrogeophysical inverse problems using a local basis approach

    Science.gov (United States)

    Irving, J.; Koepke, C.; Elsheikh, A. H.

    2017-12-01

    Bayesian solutions to geophysical and hydrological inverse problems are dependent upon a forward process model linking subsurface parameters to measured data, which is typically assumed to be known perfectly in the inversion procedure. However, in order to make the stochastic solution of the inverse problem computationally tractable using, for example, Markov-chain-Monte-Carlo (MCMC) methods, fast approximations of the forward model are commonly employed. This introduces model error into the problem, which has the potential to significantly bias posterior statistics and hamper data integration efforts if not properly accounted for. Here, we present a new methodology for addressing the issue of model error in Bayesian solutions to hydrogeophysical inverse problems that is geared towards the common case where these errors cannot be effectively characterized globally through some parametric statistical distribution or locally based on interpolation between a small number of computed realizations. Rather than focusing on the construction of a global or local error model, we instead work towards identification of the model-error component of the residual through a projection-based approach. In this regard, pairs of approximate and detailed model runs are stored in a dictionary that grows at a specified rate during the MCMC inversion procedure. At each iteration, a local model-error basis is constructed for the current test set of model parameters using the K-nearest-neighbour entries in the dictionary, which is then used to separate the model error from the other error sources before computing the likelihood of the proposed set of model parameters. We demonstrate the performance of our technique on the inversion of synthetic crosshole ground-penetrating radar traveltime data for three different subsurface parameterizations of varying complexity. The synthetic data are generated using the eikonal equation, whereas a straight-ray forward model is assumed in the inversion.
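
    A highly simplified sketch of the local model-error treatment described above might look as follows; this is a schematic reconstruction under stated assumptions rather than the authors' code, and the dictionary construction, basis size, Gaussian likelihood and synthetic forward models are all illustrative choices.

```python
import numpy as np

def local_error_basis(theta, dict_params, dict_approx, dict_detail, k=10, n_modes=5):
    """Build a local model-error basis from the K nearest dictionary entries.

    dict_params : (N, n_par) parameter sets already stored in the dictionary
    dict_approx : (N, n_data) fast (approximate) forward-model outputs
    dict_detail : (N, n_data) detailed forward-model outputs for the same parameters
    """
    dist = np.linalg.norm(dict_params - theta, axis=1)
    nearest = np.argsort(dist)[:k]
    errors = dict_detail[nearest] - dict_approx[nearest]    # model-error realizations
    U, _, _ = np.linalg.svd(errors.T, full_matrices=False)  # orthonormal error directions
    return U[:, :n_modes]

def log_likelihood(theta, data, approx_model, basis, sigma=1.0):
    """Gaussian log-likelihood after removing the projected model-error component."""
    residual = data - approx_model(theta)
    model_error_part = basis @ (basis.T @ residual)          # projection onto error basis
    clean_residual = residual - model_error_part
    return -0.5 * np.sum((clean_residual / sigma) ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_par, n_data, n_dict = 4, 60, 200
    A = rng.standard_normal((n_data, n_par))
    bias = 0.5 * np.sin(np.linspace(0.0, 3.0 * np.pi, n_data))  # structured model error
    detailed = lambda th: A @ th                                # stand-in "detailed" model
    approx = lambda th: A @ th + bias                           # stand-in "fast" model
    dict_params = rng.standard_normal((n_dict, n_par))
    dict_detail = np.array([detailed(p) for p in dict_params])
    dict_approx = np.array([approx(p) for p in dict_params])

    theta = rng.standard_normal(n_par)
    data = detailed(theta) + 0.05 * rng.standard_normal(n_data)
    basis = local_error_basis(theta, dict_params, dict_approx, dict_detail)
    print("log-likelihood with model-error basis: %.1f"
          % log_likelihood(theta, data, approx, basis, sigma=0.05))
```

    In an MCMC loop such a likelihood would replace the standard data misfit, with the dictionary growing as new approximate/detailed model pairs are computed.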

  19. Accounting for Water Insecurity in Modeling Domestic Water Demand

    Science.gov (United States)

    Galaitsis, S. E.; Huber-lee, A. T.; Vogel, R. M.; Naumova, E.

    2013-12-01

    Water demand management uses price elasticity estimates to predict consumer demand in relation to water pricing changes, but studies have shown that many additional factors affect water consumption. Development scholars document the need for water security; however, much of the water security literature focuses on broad policies which can influence water demand. Previous domestic water demand studies have not considered how water security can affect a population's consumption behavior. This study is the first to model the influence of water insecurity on water demand. A subjective indicator scale measuring water insecurity among consumers in the Palestinian West Bank is developed and included as a variable to explore how perceptions of control, or lack thereof, impact consumption behavior and resulting estimates of price elasticity. A multivariate regression model demonstrates the significance of a water insecurity variable for data sets encompassing disparate water access. When accounting for insecurity, the R-squared value improves and the marginal price a household is willing to pay becomes a significant predictor of household quantity consumption. The model denotes that, with all other variables held equal, a household will buy more water when the users are more water insecure. Though the reasons behind this trend require further study, the findings suggest broad policy implications by demonstrating that water distribution practices in scarcity conditions can promote consumer welfare and efficient water use.
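
    A minimal sketch of the kind of multivariate demand regression described here, with a subjective water-insecurity index added as a regressor, could look like the following Python snippet; the variable names, log-linear functional form and synthetic data are illustrative assumptions, not the West Bank survey data.

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares with an R-squared diagnostic."""
    X1 = np.column_stack([np.ones(len(y)), X])           # add intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    r2 = 1.0 - resid.var() / y.var()
    return beta, r2

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 500
    marginal_price = rng.uniform(0.5, 3.0, n)             # price per unit of water
    income = rng.lognormal(mean=7.0, sigma=0.4, size=n)
    insecurity = rng.uniform(0.0, 1.0, n)                 # subjective insecurity index
    # Synthetic demand: insecure households buy more, all else equal.
    log_quantity = (2.0 - 0.3 * np.log(marginal_price) + 0.2 * np.log(income)
                    + 0.4 * insecurity + rng.normal(0.0, 0.2, n))

    X_base = np.column_stack([np.log(marginal_price), np.log(income)])
    X_full = np.column_stack([X_base, insecurity])

    _, r2_base = fit_ols(X_base, log_quantity)
    beta_full, r2_full = fit_ols(X_full, log_quantity)
    print(f"R-squared without insecurity: {r2_base:.3f}, with insecurity: {r2_full:.3f}")
    print(f"estimated insecurity coefficient: {beta_full[3]:.3f}")
```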

  20. A note concerning the proper choice for Markov model order for daily precipitation in the humid tropics: a case study in Costa Rica

    Science.gov (United States)

    Harrison, Michael; Waylen, Peter

    2000-11-01

    The use of chain-dependent hydroclimatological models (sometimes referred to as combined models or two-part models) in analysing daily precipitation requires that rainfall be modelled using both occurrence and intensity statistics. Markov processes in the context of precipitation climatology have been studied in such regions as monsoonal Asia, sub-Saharan Africa and South America. Many studies have indicated that the use of a first-order Markov model is often adequate when describing daily precipitation occurrences, particularly when working in temperate regions, but relatively little work has been done in the humid tropics regarding proper Markov model order, particularly in the western hemisphere. This research examines the occurrence characteristics of Costa Rican daily precipitation by comparing the Akaike and Bayesian information criteria (AIC and BIC) for three long-term meteorological stations. It is found that the most parsimonious models generally are those of first order (winter) or zero order (summer). Overall, the BIC yields less ambiguous results than the AIC, and thus a higher level of model confidence is achieved when using the BIC as the model-order selection criterion.
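
    To make the model-order comparison concrete, here is a generic Python sketch that fits zero-, first- and second-order Markov chains to a binary wet/dry occurrence series and compares AIC and BIC; the synthetic series stands in for the station records used in the study, and the penalty formulas are the standard ones (AIC = -2 log L + 2k, BIC = -2 log L + k ln n).

```python
import numpy as np
from itertools import product

def markov_loglik(series, order):
    """Log-likelihood and parameter count of an order-k Markov chain
    fitted to a 0/1 occurrence series by maximum likelihood."""
    if order == 0:
        p = series.mean()
        n1 = series.sum()
        n0 = len(series) - n1
        return n1 * np.log(p) + n0 * np.log(1.0 - p), 1
    counts = {state: np.zeros(2) for state in product((0, 1), repeat=order)}
    for t in range(order, len(series)):
        state = tuple(series[t - order:t])
        counts[state][series[t]] += 1
    ll, n_par = 0.0, 0
    for state, c in counts.items():
        total = c.sum()
        if total == 0:
            continue
        p1 = c[1] / total
        n_par += 1                         # one free probability per visited state
        for outcome, n_obs in enumerate(c):
            p = p1 if outcome == 1 else 1.0 - p1
            if n_obs > 0 and p > 0:
                ll += n_obs * np.log(p)
    return ll, n_par

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    wet = (rng.uniform(size=3650) < 0.35).astype(int)      # synthetic wet/dry series
    n = len(wet)
    for k in (0, 1, 2):
        ll, n_par = markov_loglik(wet, k)
        aic = -2 * ll + 2 * n_par
        bic = -2 * ll + n_par * np.log(n)
        print(f"order {k}: logL={ll:.1f}  AIC={aic:.1f}  BIC={bic:.1f}")
```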

  1. Proper Islamic Consumption

    DEFF Research Database (Denmark)

    Fischer, Johan

    '[T]his book is an excellent study that is lucidly written, strongly informed by theory, rich in ethnography, and empirically grounded. It has blazed a new trail in employing the tools of both religious studies and cultural studies to dissect the complex subject of “proper Islamic consumption” ... mobile, religiously committed communities to the opportunities and perils presented by modernisation. It also tells us something about the debates concerning the meanings and practices of Islam within an aggressive, globalised, secularised modernity. In Malaysia this is an especially intriguing issue ... because it is the Malay-dominated state which has been crucial in generating and shaping a particular kind of modernity in order to address the problems posed for nation-building by a quite radical form of ethnic pluralism.' Reviewed by V.T. (Terry) King, University of Leeds, ASEASUK News 46, 2009. 'In...

  2. Hubble Space Telescope Proper Motion (HSTPROMO) Catalogs of Galactic Globular Clusters. V. The Rapid Rotation of 47 Tuc Traced and Modeled in Three Dimensions

    Science.gov (United States)

    Bellini, A.; Bianchini, P.; Varri, A. L.; Anderson, J.; Piotto, G.; van der Marel, R. P.; Vesperini, E.; Watkins, L. L.

    2017-08-01

    High-precision proper motions of the globular cluster 47 Tuc have allowed us to measure for the first time the cluster rotation in the plane of the sky and the velocity anisotropy profile from the cluster core out to about 13′. These profiles are coupled with prior measurements along the line of sight (LOS) and the surface brightness profile and fit all together with self-consistent models specifically constructed to describe quasi-relaxed stellar systems with realistic differential rotation, axisymmetry, and pressure anisotropy. The best-fit model provides an inclination angle i between the rotation axis and the LOS direction of 30° and is able to simultaneously reproduce the full three-dimensional kinematics and structure of the cluster, while preserving a good agreement with the projected morphology. Literature models based solely on LOS measurements imply a significantly different inclination angle (i = 45°), demonstrating that proper motions play a key role in constraining the intrinsic structure of 47 Tuc. Our best-fit global dynamical model implies an internal rotation higher than previous studies have shown and suggests a peak of the intrinsic V/σ ratio of ∼0.9 at around two half-light radii, with a nonmonotonic intrinsic ellipticity profile reaching values up to 0.45. Our study unveils a new degree of dynamical complexity in 47 Tuc, which may be leveraged to provide new insights into the formation and evolution of globular clusters. Based on archival observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS 5-26555.

  3. The origins of the spanish railroad accounting model: a qualitative study of the MZA'S operating account (1856-1874

    Directory of Open Access Journals (Sweden)

    Beatriz Santos

    2014-12-01

    Full Text Available The lack of external regulation about the form and substance of the financial statements that railroad companies had to report during the implementation phase of the Spanish railway meant that each company developed its own accounting model. In this study we have described, analysed and interpreted the most relevant changes in the accounting information in relation to the business result. Using the analysis of a historical case, we developed an ad-hoc research tool for recording all the changes of the operating account. The results of the study show that MZA’s operating account reflected the particularities of the railway business, although subject to limitations, and that the reported information improved during the study period in terms of relevance and reliability.

  4. Accrual based accounting implementation: An approach for modelling major decisions

    Directory of Open Access Journals (Sweden)

    Ratno Agriyanto

    2016-12-01

    Full Text Available Over the last three decades, the implementation of accrual-based accounting in government institutions has been a central issue in Indonesia. Implementation of accrual-based accounting in government institutions takes place amid debate about the usefulness of accounting information for decision-making. Empirical study shows that accrual-based accounting information in government institutions is not used for decision making. The research objective was to determine the impact of implementing accrual-based accounting on the use of accrual-based accounting information for decision making. We used survey questionnaires. The data were processed by SEM using the statistical software WarpPLS. The results showed that the implementation of accrual-based accounting in the City Government of Semarang is significantly and positively associated with decision-making. Another important finding is that officials of the City Government of Semarang have a personality trait, low tolerance of ambiguity, that negatively affects the relationship between the implementation of accrual-based accounting and decision making.

  5. Bedrijfsrisico's van de accountant en het Audit Risk Model

    NARCIS (Netherlands)

    Wallage, Ph.; Klijnsmit, P.; Sodekamp, M.

    2003-01-01

    In recent years the business risk of the auditing accountant has increased sharply. The accountant's business risks are increasingly becoming an obstacle to the acceptance of engagements. This article pays attention to the way in which the business risks

  6. Creative Accounting and Financial Reporting: Model Development and Empirical Testing

    OpenAIRE

    Fizza Tassadaq; Qaisar Ali Malik

    2015-01-01

    This paper empirically and critically investigates the issue of creative accounting in financial reporting. It not only analyzes the ethical responsibility of creative accounting but also focuses on other factors which influence financial reporting, such as the role of auditors, the role of government regulations or international standards, the impact of manipulative behaviors and the impact of the ethical values of an individual. Data have been collected through a structured questionnaire from the industrial sector. D...

  7. Creative Accounting & Financial Reporting: Model Development & Empirical Testing

    OpenAIRE

    Tassadaq, Fizza; Malik, Qaisar Ali

    2015-01-01

    This paper empirically and critically investigates the issue of creative accounting in financial reporting. It not only analyzes the ethical responsibility of creative accounting but also focuses on other factors which influence financial reporting, such as the role of auditors, the role of government regulations or international standards, the impact of manipulative behaviors and the impact of the ethical values of an individual. Data have been collected through a structured questionnaire from the industrial sector. Descri...

  8. A Model Driven Approach to domain standard specifications examplified by Finance Accounts receivable/ Accounts payable

    OpenAIRE

    Khan, Bahadar

    2005-01-01

    This thesis was written as part of a master's degree at the University of Oslo. The thesis work was conducted at SINTEF. The work was carried out in the period between November 2002 and April 2005. This thesis may be of interest to anyone interested in a Domain Standard Specification Language developed using the MDA approach to software development. The Model Driven Architecture (MDA) allows the system functionality specification to be separated from its implementation on any specific technolo...

  9. Models and Rules of Evaluation in International Accounting

    OpenAIRE

    Liliana Feleaga; Niculae Feleaga

    2006-01-01

    The accounting procedures cannot be analyzed without a prior evaluation. Value is in general a very subjective issue, usually the result of a monetary evaluation made of a specific asset, group of assets or entities, or of some rendered services. Within the economic sciences, value comes from its very own deep history. In accounting, the concept of value had a late and fragile start. The term value must not be misinterpreted as being the same thing as cost, even though value is freque...

  10. Accounting for heterogeneity of public lands in hedonic property models

    Science.gov (United States)

    Charlotte Ham; Patricia A. Champ; John B. Loomis; Robin M. Reich

    2012-01-01

    Open space lands, national forests in particular, are usually treated as homogeneous entities in hedonic price studies. Failure to account for the heterogeneous nature of public open spaces may result in inappropriate inferences about the benefits of proximate location to such lands. In this study the hedonic price method is used to estimate the marginal values for...

  11. Modelling Financial-Accounting Decisions by Means of OLAP Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena CODREAN

    2011-03-01

    Full Text Available At present, one can say that a company’s smooth running largely depends on the quantity and quality of the information it relies on when making decisions. The information needed to underpin decisions can be obtained thanks to a high-performing information system which makes it possible for data to be presented quickly, synthetically and accurately, while also providing the opportunity for complex analyses and predictions. In such circumstances, computerized accounting systems, too, have grown in complexity by means of data-analysis solutions such as OLAP and Data Mining, which make it possible to perform multidimensional analyses of financial-accounting data, detect potential fraud, reveal information hidden in the data and establish trends for certain indicators, thus providing useful information for a company’s decision making.

  12. The Charitable Trust Model: An Alternative Approach For Department Of Defense Accounting

    Science.gov (United States)

    2016-12-01

    ... unqualified opinion creates accountability issues that extend beyond the agency by making an audit of the U.S. consolidated financial statements challenging ... the foundation of contemporary reporting. The chapter then discusses the establishment and purpose of the Federal Accounting Standards Advisory ... The Charitable Trust Model: An Alternative Approach for Department of Defense Accounting, by Gerald V. Weers Jr., December 2016. Thesis Advisor: Philip J...

  13. Resource Allocation Models and Accountability: A Jamaican Case Study

    Science.gov (United States)

    Nkrumah-Young, Kofi K.; Powell, Philip

    2008-01-01

    Higher education institutions (HEIs) may be funded privately, by the state or by a mixture of the two. Nevertheless, any state financing of HE necessitates a mechanism to determine the level of support and the channels through which it is to be directed; that is, a resource allocation model. Public funding, through resource allocation models,…

  14. A 2D finite element procedure for magnetic field analysis taking into account a vector Preisach model

    Directory of Open Access Journals (Sweden)

    Dupré Luc R.

    1997-01-01

    Full Text Available The main purpose of this paper is to incorporate a refined hysteresis model, viz. a vector Preisach model, into 2D magnetic field computations. To this end the governing Maxwell equations are rewritten in a suitable way, which makes it possible to take into account the proper magnetic material parameters and, moreover, to pass to a variational formulation. The variational problem is solved numerically by an FE approximation, using a quadratic mesh, followed by time discretisation based upon a modified Crank-Nicolson algorithm. The latter includes a suitable iteration procedure to deal with the nonlinear hysteresis behaviour. Finally, the effectiveness of the presented mathematical tool has been confirmed by several numerical experiments.

  15. Modeling Antibiotic Tolerance in Biofilms by Accounting for Nutrient Limitation

    OpenAIRE

    Roberts, Mark E.; Stewart, Philip S.

    2004-01-01

    A mathematical model of biofilm dynamics was used to investigate the protection from antibiotic killing that can be afforded to microorganisms in biofilms based on a mechanism of localized nutrient limitation and slow growth. The model assumed that the rate of killing by the antibiotic was directly proportional to the local growth rate. Growth rates in the biofilm were calculated by using the local concentration of a single growth-limiting substrate with Monod kinetics. The concentration prof...
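
    As a rough illustration of the mechanism described (antibiotic killing proportional to the local, nutrient-limited growth rate), here is a minimal Python sketch of a one-dimensional quasi-steady substrate profile with Monod kinetics; the geometry, parameter values and numerical scheme are illustrative assumptions rather than the published model.

```python
import numpy as np

def steady_substrate(S_bulk, biomass, dx, D, mu_max, Ks, Y, n_iter=30):
    """Quasi-steady substrate profile D*S'' = uptake(S), with the Monod uptake
    linearised around the previous iterate (Picard iteration)."""
    nx = len(biomass)
    S = np.full(nx, S_bulk)
    for _ in range(n_iter):
        c = mu_max * biomass / (Y * (Ks + S))      # effective uptake coefficient (1/h)
        A = np.zeros((nx, nx))
        b = np.zeros(nx)
        for i in range(1, nx - 1):
            A[i, i - 1] = A[i, i + 1] = D / dx**2
            A[i, i] = -2.0 * D / dx**2 - c[i]
        A[0, 0], A[0, 1] = 1.0, -1.0               # no-flux at the substratum
        A[-1, -1] = 1.0                            # fixed bulk concentration at the surface
        b[-1] = S_bulk
        S = np.linalg.solve(A, b)
    return S

def simulate(hours=24.0, dt=1.0, L=200e-6, nx=100,
             D=8e-10 * 3600.0,                     # substrate diffusivity (m^2/h)
             S_bulk=8.0, mu_max=0.3, Ks=2.0,       # mg/L, 1/h, mg/L
             X=10000.0, Y=0.5, k_kill=2.0):        # biomass (mg/L), yield, kill factor
    dx = L / (nx - 1)
    viable = np.ones(nx)                           # viable fraction across the depth
    for _ in range(int(hours / dt)):
        S = steady_substrate(S_bulk, X * viable, dx, D, mu_max, Ks, Y)
        mu = mu_max * S / (Ks + S)                 # local specific growth rate (1/h)
        viable *= np.exp(-k_kill * mu * dt)        # killing proportional to growth
    return S, viable

if __name__ == "__main__":
    S, viable = simulate()
    print("surviving fraction: surface %.3f, substratum %.3f" % (viable[-1], viable[0]))
```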

  16. A novel Atoh1 "self-terminating" mouse model reveals the necessity of proper Atoh1 level and duration for hair cell differentiation and viability.

    Directory of Open Access Journals (Sweden)

    Ning Pan

    Full Text Available Atonal homolog1 (Atoh1) is a bHLH transcription factor essential for inner ear hair cell differentiation. Targeted expression of Atoh1 at various stages in development can result in hair cell differentiation in the ear. However, the level and duration of Atoh1 expression required for proper hair cell differentiation and maintenance remain unknown. We generated an Atoh1 conditional knockout (CKO) mouse line using Tg(Atoh1-cre), in which the cre expression is driven by an Atoh1 enhancer element that is regulated by Atoh1 protein to "self-terminate" its expression. The mutant mice show transient, limited expression of Atoh1 in all hair cells in the ear. In the organ of Corti, reduction and delayed deletion of Atoh1 result in progressive loss of almost all the inner hair cells and the majority of the outer hair cells within three weeks after birth. The remaining cells express hair cell marker Myo7a and attract nerve fibers, but do not differentiate normal stereocilia bundles. Some Myo7a-positive cells persist in the cochlea into adult stages in the position of outer hair cells, flanked by a single row of pillar cells and two to three rows of disorganized Deiters cells. Gene expression analyses of Atoh1, Barhl1 and Pou4f3, genes required for survival and maturation of hair cells, reveal earlier and higher expression levels in the inner compared to the outer hair cells. Our data show that Atoh1 is crucial for hair cell mechanotransduction development, viability, and maintenance and also suggest that Atoh1 expression level and duration may play a role in inner vs. outer hair cell development. These genetically engineered Atoh1 CKO mice provide a novel model for establishing critical conditions needed to regenerate viable and functional hair cells with Atoh1 therapy.

  17. Accountability: a missing construct in models of adherence behavior and in clinical practice.

    Science.gov (United States)

    Oussedik, Elias; Foy, Capri G; Masicampo, E J; Kammrath, Lara K; Anderson, Robert E; Feldman, Steven R

    2017-01-01

    Piano lessons, weekly laboratory meetings, and visits to health care providers have in common an accountability that encourages people to follow a specified course of action. The accountability inherent in the social interaction between a patient and a health care provider affects patients' motivation to adhere to treatment. Nevertheless, accountability is a concept not found in adherence models, and is rarely employed in typical medical practice, where patients may be prescribed a treatment and not seen again until a return appointment 8-12 weeks later. The purpose of this paper is to describe the concept of accountability and to incorporate accountability into an existing adherence model framework. Based on the Self-Determination Theory, accountability can be considered in a spectrum from a paternalistic use of duress to comply with instructions (controlled accountability) to patients' autonomous internal desire to please a respected health care provider (autonomous accountability), the latter expected to best enhance long-term adherence behavior. Existing adherence models were reviewed with a panel of experts, and an accountability construct was incorporated into a modified version of Bandura's Social Cognitive Theory. Defining accountability and incorporating it into an adherence model will facilitate the development of measures of accountability as well as the testing and refinement of adherence interventions that make use of this critical determinant of human behavior.

  18. Accounting Fundamentals and Variations of Stock Price: Methodological Refinement with Recursive Simultaneous Model

    OpenAIRE

    Sumiyana, Sumiyana; Baridwan, Zaki

    2013-01-01

    This study investigates the association between accounting fundamentals and variations of stock prices using a recursive simultaneous equation model. The accounting fundamentals consist of earnings yield, book value, profitability, growth opportunities and discount rate. The prior single-relationship model has been investigated by Chen and Zhang (2007), Sumiyana (2011) and Sumiyana et al. (2010). They assume that all accounting fundamentals associate direct-linearly to the stock returns. This study ...

  19. ACCOUNTING FUNDAMENTALS AND VARIATIONS OF STOCK PRICE: METHODOLOGICAL REFINEMENT WITH RECURSIVE SIMULTANEOUS MODEL

    OpenAIRE

    Sumiyana, Sumiyana; Baridwan, Zaki

    2015-01-01

    This study investigates the association between accounting fundamentals and variations of stock prices using a recursive simultaneous equation model. The accounting fundamentals consist of earnings yield, book value, profitability, growth opportunities and discount rate. The prior single-relationship model has been investigated by Chen and Zhang (2007), Sumiyana (2011) and Sumiyana et al. (2010). They assume that all accounting fundamentals associate direct-linearly to the stock returns. This study ...

  20. Applying the International Medical Graduate Program Model to Alleviate the Supply Shortage of Accounting Doctoral Faculty

    Science.gov (United States)

    HassabElnaby, Hassan R.; Dobrzykowski, David D.; Tran, Oanh Thikie

    2012-01-01

    Accounting has been faced with a severe shortage in the supply of qualified doctoral faculty. Drawing upon the international mobility of foreign scholars and the spirit of the international medical graduate program, this article suggests a model to fill the demand in accounting doctoral faculty. The underlying assumption of the suggested model is…

  1. A cellular automation model accounting for bicycle's group behavior

    Science.gov (United States)

    Tang, Tie-Qiao; Rui, Ying-Xu; Zhang, Jian; Shang, Hua-Yan

    2018-02-01

    Recently, the bicycle has once again become an important means of transportation in China. Due to the merits of the bicycle, group behavior widely exists in urban traffic systems. However, little effort has been made to explore the impacts of group behavior on bicycle flow. In this paper, we propose a CA (cellular automaton) model with group behavior to explore the complex traffic phenomena caused by shoulder group behavior and following group behavior on an open road. The numerical results illustrate that the proposed model can qualitatively describe the impacts of the two kinds of group behaviors on bicycle flow and that the effects are related to the mode and size of the group behaviors. The results can help us to better understand the impacts of bicycles' group behaviors on urban traffic systems and to effectively control bicycle group behavior.

  2. Accounting for latent classes in movie box office modeling

    OpenAIRE

    Antipov, Evgeny; Pokryshevskaya, Elena

    2010-01-01

    This paper addresses the issue of unobserved heterogeneity in the influence of film characteristics on box office. We argue that the analysis of pooled samples, most common among researchers, does not shed light on underlying segmentations and leads to significantly different estimates being obtained by researchers running similar regressions for movie success modeling. For instance, it may be expected that a restrictive MPAA rating is box office poison for a family comedy, while it insignificantly infl...

  3. Biblical Scriptures impact on six ethical models influencing accounting practices

    OpenAIRE

    Rodgers, Waymond; Gago Rodríguez, Susana

    2006-01-01

    The recent frauds in organizations have been a point of reflection among researchers and practitioners regarding the lack of morality in certain decision-making. We argue for a modification of the decision-making models that have been accepted in organizations, with stronger links to ethics and morality. With this aim we propose a return to the base values of Christianity, supported by Bible scriptures, underlying six dominant ethical approaches that drive practices in organizations.

  4. Spherical Detector Device Mathematical Modelling with Taking into Account Detector Module Symmetry

    International Nuclear Information System (INIS)

    Batyj, V.G.; Fedorchenko, D.V.; Prokopets, S.I.; Prokopets, I.M.; Kazhmuradov, M.A.

    2005-01-01

    A mathematical model for a spherical detector device accounting for symmetry properties is considered. An exact algorithm for the simulation of the measurement procedure with multiple radiation sources is developed. The modelling results are shown to be in perfect agreement with calibration measurements.

  5. Computer-Based Resource Accounting Model for Automobile Technology Impact Assessment

    Science.gov (United States)

    1976-10-01

    A computer-implemented resource accounting model has been developed for assessing resource impacts of future automobile technology options. The resources tracked are materials, energy, capital, and labor. The model has been used in support of the Int...

  6. Financial Organization Information Security System Development using Modeling, IT assets and Accounts Classification Processes

    Directory of Open Access Journals (Sweden)

    Anton Sergeevich Zaytsev

    2013-12-01

    Full Text Available This article deals with the processes of modeling and of IT asset and account classification. The key principles of configuring these processes are pointed out. A model of the organization of the Russian Federation's banking system is also developed.

  7. Nurse-directed care model in a psychiatric hospital: a model for clinical accountability.

    Science.gov (United States)

    E-Morris, Marlene; Caldwell, Barbara; Mencher, Kathleen J; Grogan, Kimberly; Judge-Gorny, Margaret; Patterson, Zelda; Christopher, Terrian; Smith, Russell C; McQuaide, Teresa

    2010-01-01

    The focus on recovery for persons with severe and persistent mental illness is leading state psychiatric hospitals to transform their method of care delivery. This article describes a quality improvement project involving a hospital's administration and multidisciplinary state-university affiliation that collaborated in the development and implementation of a nursing care delivery model in a state psychiatric hospital. The quality improvement project team instituted a new model to promote the hospital's vision of wellness and recovery through utilization of the therapeutic relationship and greater clinical accountability. Implementation of the model was accomplished in 2 phases: first, the establishment of a structure to lay the groundwork for accountability and, second, the development of a mechanism to provide a clinical supervision process for staff in their work with clients. Effectiveness of the model was assessed by surveys conducted at baseline and after implementation. Results indicated improvement in clinical practices and client living environment. As a secondary outcome, these improvements appeared to be associated with increased safety on the units evidenced by reduction in incidents of seclusion and restraint. Restructuring of the service delivery system of care so that clients are the center of clinical focus improves safety and can enhance the staff's attention to work with clients on their recovery. The role of the advanced practice nurse can influence the recovery of clients in state psychiatric hospitals. Future research should consider the impact on clients and their perceptions of the new service models.

  8. Econometric modelling of Serbian current account determinants: Jackknife Model Averaging approach

    Directory of Open Access Journals (Sweden)

    Petrović Predrag

    2014-01-01

    Full Text Available This research aims to model Serbian current account determinants for the period Q1 2002 - Q4 2012. Taking into account the majority of relevant determinants, using the Jackknife Model Averaging approach, 48 different models have been estimated, where 1254 equations needed to be estimated and averaged for each of the models. The results of selected representative models indicate moderate persistence of the CA and positive influence of: fiscal balance, oil trade balance, terms of trade, relative income and real effective exchange rates, where we should emphasise: (i) a rather strong influence of relative income, (ii) the fact that the worsening of the oil trade balance results in worsening of other components (probably the non-oil trade balance) of the CA and (iii) that the positive influence of terms of trade reveals the functionality of the Harberger-Laursen-Metzler effect in Serbia. On the other hand, negative influence is evident in the case of: relative economic growth, gross fixed capital formation, net foreign assets and trade openness. What particularly stands out is the strong effect of relative economic growth that, most likely, reveals high citizens' future income growth expectations, which has a negative impact on the CA.
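
    For readers unfamiliar with Jackknife Model Averaging, the following compact Python sketch shows the weight-selection step (leave-one-out residuals for each candidate model, then weights on the unit simplex minimising the cross-validation criterion); the candidate models and data are synthetic placeholders, not the Serbian quarterly series used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def loo_residuals(X, y):
    """Leave-one-out residuals of an OLS fit, e_i / (1 - h_ii)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    e = y - X @ beta
    h = np.einsum('ij,jk,ik->i', X, XtX_inv, X)    # leverage values
    return e / (1.0 - h)

def jma_weights(candidates, y):
    """Weights on the simplex minimising the jackknife (CV) criterion."""
    E = np.column_stack([loo_residuals(X, y) for X in candidates])   # n x M
    M = E.shape[1]
    objective = lambda w: np.sum((E @ w) ** 2)
    constraints = {'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0}
    bounds = [(0.0, 1.0)] * M
    res = minimize(objective, np.full(M, 1.0 / M),
                   bounds=bounds, constraints=constraints, method='SLSQP')
    return res.x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 44                                          # e.g. quarterly data, 2002Q1-2012Q4
    Z = rng.standard_normal((n, 4))                 # candidate regressors
    y = 0.8 * Z[:, 0] - 0.5 * Z[:, 1] + rng.normal(0.0, 0.3, n)
    ones = np.ones((n, 1))
    candidates = [np.hstack([ones, Z[:, :k]]) for k in range(1, 5)]   # nested models
    print("JMA weights:", np.round(jma_weights(candidates, y), 3))
```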

  9. Towards accounting for dissolved iron speciation in global ocean models

    Directory of Open Access Journals (Sweden)

    A. Tagliabue

    2011-10-01

    Full Text Available The trace metal iron (Fe) is now routinely included in state-of-the-art ocean general circulation and biogeochemistry models (OGCBMs) because of its key role as a limiting nutrient in regions of the world ocean important for carbon cycling and air-sea CO2 exchange. However, the complexities of the seawater Fe cycle, which impact its speciation and bioavailability, are simplified in such OGCBMs due to gaps in understanding and to avoid high computational costs. In a similar fashion to inorganic carbon speciation, we outline a means by which the complex speciation of Fe can be included in global OGCBMs in a reasonably cost-effective manner. We construct an Fe speciation model based on hypothesised relationships between rate constants and environmental variables (temperature, light, oxygen, pH, salinity) and assumptions regarding the binding strengths of Fe-complexing organic ligands, and test hypotheses regarding their distributions. As a result, we find that the global distribution of different Fe species is tightly controlled by spatio-temporal environmental variability and the distribution of Fe-binding ligands. Impacts on bioavailable Fe are highly sensitive to assumptions regarding which Fe species are bioavailable and how those species vary in space and time. When forced by representations of future ocean circulation and climate we find large changes to the speciation of Fe governed by pH-mediated changes to redox kinetics. We speculate that these changes may exert selective pressure on phytoplankton Fe uptake strategies in the future ocean. In future work, more information on the sources and sinks of ocean Fe ligands, their bioavailability, the cycling of colloidal Fe species and kinetics of Fe-surface coordination reactions would be invaluable. We hope our modeling approach can provide a means by which new observations of Fe speciation can be tested against hypotheses of the processes present in governing the ocean Fe cycle in an

  10. An extended lattice model accounting for traffic jerk

    Science.gov (United States)

    Redhu, Poonam; Siwach, Vikash

    2018-02-01

    In this paper, a flux difference lattice hydrodynamic model is extended by considering the traffic jerk effect, which arises from the motion of non-motorized vehicles. The effect of traffic jerk has been examined through linear stability analysis, and it is shown that it can significantly enlarge the unstable region of the phase diagram. To describe the phase transition of traffic flow, the mKdV equation near the critical point is derived through nonlinear stability analysis. The theoretical findings have been verified by numerical simulation, which confirms that the jerk parameter plays an important role in efficiently stabilizing traffic jams by sensing the flux difference of leading sites.

  11. MODEL OF ACCOUNTING ENGINEERING IN VIEW OF EARNINGS MANAGEMENT IN POLAND

    Directory of Open Access Journals (Sweden)

    Leszek Michalczyk

    2012-10-01

    Full Text Available The article introduces the theoretical foundations of the author’s original concept of accounting engineering. We assume a theoretical premise whereby accounting engineering is understood as a system of accounting practice utilising differences in the recording of economic events resulting from the use of divergent accounting methods. Unlike, for instance, creative or praxeological accounting, accounting engineering is composed only, and under all circumstances, of lawful activities and adheres to the current regulations of the balance sheet law. The aim of the article is to construct a model of accounting engineering that exploits the differences inherently present in variant accounting. These differences result in disparate financial results for identical economic events. Given that, regardless of which variant is used in accounting, all settlements are eventually equal to one another, a new class of differences emerges - the accounting engineering potential. It is transferred to subsequent reporting (balance sheet) periods. In the end, the profit “made” in a given period reduces the financial result of future periods. This effect is due to the “transfer” of costs from one period to another. Such actions may have sundry consequences and are especially dangerous whenever many individuals are concerned with the profit of a given company, e.g. on a stock exchange. The reverse may be observed when a company is privatised and its value is being intentionally reduced by a controlled recording of accounting provisions, depending on the degree to which they are justified. The reduction of a company’s goodwill in Balcerowicz’s model of no-tender privatisation makes it possible to justify the low value of the purchased company. These are only some of many manifestations of variant accounting which accounting engineering employs. A theoretical model of the latter is presented in this article.

  12. A new model in achieving Green Accounting at hotels in Bali

    Science.gov (United States)

    Astawa, I. P.; Ardina, C.; Yasa, I. M. S.; Parnata, I. K.

    2018-01-01

    The concept of green accounting has become a matter of debate in terms of its implementation in a company. The results of previous studies indicate that there is no standard model for its implementation to support performance. This research aims to create a green accounting model different from other models by using local cultural elements as the variables in building it. The research is conducted in two steps. The first step is designing the model based on theoretical studies, considering the main and supporting elements in building the concept of green accounting. The second step is conducting a model test at 60 five-star hotels, starting with data collection through questionnaires and followed by data processing using descriptive statistics. The results indicate that the hotels’ owners have implemented green accounting attributes, which supports previous studies. Another result, which is a new finding, shows that the presence of local culture, government regulation, and the awareness of hotels’ owners plays an important role in the development of the green accounting concept. The results of the research contribute to accounting science in terms of green reporting. Hotel management should adopt local culture in building the character of the accountants hired in the accounting department.

  13. Secular perturbation theory and computation of asteroid proper elements

    Science.gov (United States)

    Milani, Andrea; Knezevic, Zoran

    1991-01-01

    A new theory for the calculation of proper elements is presented. This theory defines an explicit algorithm applicable to any chosen set of orbits and accounts for the effect of shallow resonances on secular frequencies. The proper elements are computed with an iterative algorithm and the behavior of the iteration can be used to define a quality code.

  14. An Integrative Model of the Strategic Management Accounting at the Enterprises of Chemical Industry

    Directory of Open Access Journals (Sweden)

    Aleksandra Vasilyevna Glushchenko

    2016-06-01

    Full Text Available Currently, the issues of information and analytical support for strategic management, enabling timely and high-quality management decisions, are extremely relevant. Conflicting and poor information, haphazardly collected in the practice of large companies from unreliable sources, affects the effective implementation of their development strategies and carries the threat of risk, given the increasing instability of the external environment. The chemical industry occupies one of the central places in Russian industry and, of course, has its own specificity in the formation of the information support system. Such an information system, suitable for the development and implementation of strategic directions, changes the recognized competitive advantages of strategic management accounting. Addressing the issues of the lack of requirements for strategic accounting information and of its inconsistency, resulting from simultaneous accumulation in different units using different methods of calculation and assessment of indicators, is impossible without a well-constructed model of the organization of strategic management accounting. The purpose of this study is to develop such a model, the implementation of which will make it possible to achieve strategic goals by harmonizing information from the individual objects of strategic accounting to increase the functional effectiveness of management decisions with a focus on strategy. The case study was based on dialectical logic and methods of system analysis, identifying causal relationships in building a model of strategic management accounting that contributes to forecasts of its development. The study proposes the implementation of an integrative model of the organization of strategic management accounting. A phased implementation of this model defines the objects and tools of strategic management accounting. Moreover, it is determined that, from the point of view of increasing the usefulness of management

  15. Boltzmann babies in the proper time measure

    Energy Technology Data Exchange (ETDEWEB)

    Bousso, Raphael; Bousso, Raphael; Freivogel, Ben; Yang, I-Sheng

    2007-12-20

    After commenting briefly on the role of the typicality assumption in science, we advocate a phenomenological approach to the cosmological measure problem. Like any other theory, a measure should be simple, general, well defined, and consistent with observation. This allows us to proceed by elimination. As an example, we consider the proper time cutoff on a geodesic congruence. It predicts that typical observers are quantum fluctuations in the early universe, or Boltzmann babies. We sharpen this well-known youngness problem by taking into account the expansion and open spatial geometry of pocket universes. Moreover, we relate the youngness problem directly to the probability distribution for observables, such as the temperature of the cosmic background radiation. We consider a number of modifications of the proper time measure, but find none that would make it compatible with observation.

  16. Chinese Basic Pension Substitution Rate: A Monte Carlo Demonstration of the Individual Account Model

    OpenAIRE

    Dong, Bei; Zhang, Ling; Lu, Xuan

    2008-01-01

    At the end of 2005, the State Council of China passed “The Decision on Adjusting the Individual Account of the Basic Pension System”, which adjusted the individual account in the 1997 basic pension system. In this essay, we analyze the adjustment above and use life annuity actuarial theory to establish a basic pension substitution rate model. Monte Carlo simulation is also used to demonstrate the rationality of the model. Some suggestions are put forward associated with the substitution rate ac...
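
    The following is a heavily simplified, hypothetical Python sketch of a Monte Carlo calculation of an individual-account substitution rate; the contribution rate, wage growth, investment returns, retirement age and annuity divisor are all illustrative assumptions and not the actuarial parameters analysed in the essay.

```python
import numpy as np

def simulate_substitution_rate(n_sims=10000, start_age=25, retire_age=60,
                               start_wage=30000.0,       # annual wage (illustrative)
                               contrib_rate=0.08,        # share of wage paid into the account
                               wage_growth_mu=0.05, wage_growth_sd=0.02,
                               return_mu=0.04, return_sd=0.03,
                               annuity_months=139):      # months used to annuitise the balance
    """Monte Carlo sketch: accumulate an individual account over a career and
    express the first-year pension as a share of the final year's wage."""
    rng = np.random.default_rng(42)
    years = retire_age - start_age
    rates = np.empty(n_sims)
    for s in range(n_sims):
        wage, balance = start_wage, 0.0
        for _ in range(years):
            balance *= 1.0 + rng.normal(return_mu, return_sd)    # investment return
            balance += contrib_rate * wage                       # this year's contribution
            wage *= 1.0 + rng.normal(wage_growth_mu, wage_growth_sd)
        final_wage = wage / (1.0 + wage_growth_mu)               # approx. last working-year wage
        annual_pension = balance / annuity_months * 12.0
        rates[s] = annual_pension / final_wage
    return rates

if __name__ == "__main__":
    rates = simulate_substitution_rate()
    print("individual-account substitution rate: mean %.1f%%, 5th-95th pct %.1f%%-%.1f%%"
          % (100 * rates.mean(), 100 * np.percentile(rates, 5), 100 * np.percentile(rates, 95)))
```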

  17. Fitting a code-red virus spread model: An account of putting theory into practice

    NARCIS (Netherlands)

    Kolesnichenko, A.V.; Haverkort, Boudewijn R.H.M.; Remke, Anne Katharina Ingrid; de Boer, Pieter-Tjerk

    This paper is about fitting a model for the spreading of a computer virus to measured data, contributing not only the fitted model but, equally important, an account of the process of getting there. Over recent years, there has been an increased interest in epidemic models to study the speed of
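
    As a generic illustration of fitting an epidemic-style growth curve to measured infection counts (not the specific model or dataset used in the paper), a Python sketch with scipy's curve_fit might look like this:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: K = total susceptible hosts, r = infection rate, t0 = midpoint."""
    return K / (1.0 + np.exp(-r * (t - t0)))

if __name__ == "__main__":
    # Synthetic 'measured' infection counts over 24 hours (stand-in for scan data).
    rng = np.random.default_rng(3)
    t = np.arange(0.0, 24.0, 0.5)
    truth = logistic(t, K=350000.0, r=0.7, t0=14.0)
    observed = truth + rng.normal(0.0, 5000.0, size=t.size)

    # Fit the three parameters; p0 gives rough starting guesses.
    popt, pcov = curve_fit(logistic, t, observed, p0=[3e5, 0.5, 12.0])
    perr = np.sqrt(np.diag(pcov))
    for name, val, err in zip(("K", "r", "t0"), popt, perr):
        print(f"{name} = {val:.3g} +/- {err:.2g}")
```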

  18. Mitigating BeiDou Satellite-Induced Code Bias: Taking into Account the Stochastic Model of Corrections

    Directory of Open Access Journals (Sweden)

    Fei Guo

    2016-06-01

    Full Text Available The BeiDou satellite-induced code biases have been confirmed to be orbit type-, frequency-, and elevation-dependent. Such code-phase divergences (code bias variations) severely affect absolute precise applications which use code measurements. To reduce their adverse effects, an improved correction model is proposed in this paper. Different from the model proposed by Wanninger and Beer (2015), more datasets (a time span of almost two years) were used to produce the correction values. More importantly, the stochastic information, i.e., the precision indexes, were given together with correction values in the improved model. However, only correction values were given while the precision indexes were completely missing in the traditional model. With the improved correction model, users may have a better understanding of their corrections, especially the uncertainty of corrections. Thus, it is helpful for refining the stochastic model of code observations. Validation tests in precise point positioning (PPP) reveal that a proper stochastic model is critical. The actual precision of the corrected code observations can be reflected in a more objective manner if the stochastic model of the corrections is taken into account. As a consequence, PPP solutions with the improved model outperform the traditional one in terms of positioning accuracy, as well as convergence speed. In addition, the Melbourne-Wübbena (MW) combination which serves for ambiguity fixing was verified as well. The uncorrected MW values show strong systematic variations with an amplitude of half a wide-lane cycle, which prevents precise ambiguity determination and successful ambiguity resolution. After application of the code bias correction models, the systematic variations can be greatly removed, and the resulting wide-lane ambiguities are more likely to be fixed. Moreover, the code residuals show more reasonable distributions after code bias corrections with either the traditional or the
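
    As a schematic illustration of applying an elevation-dependent code bias correction together with its precision index, and of propagating that precision into the observation weighting, consider the Python sketch below; the tabulated numbers are placeholders, not the published BeiDou correction values.

```python
import numpy as np

# Hypothetical elevation-binned corrections (metres) and their standard deviations
# for one satellite group / frequency; real values would come from the correction model.
ELEV_NODES = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 75.0, 90.0])      # degrees
CORRECTION = np.array([-0.45, -0.30, -0.15, 0.00, 0.10, 0.20, 0.25])  # metres
CORR_SIGMA = np.array([0.08, 0.06, 0.05, 0.04, 0.04, 0.05, 0.06])     # metres

def correct_pseudorange(raw_range, elevation_deg, sigma_code=0.3):
    """Apply the interpolated code bias correction and propagate its uncertainty
    into the observation variance, so the stochastic model reflects both the
    raw code noise and the precision of the correction itself."""
    corr = np.interp(elevation_deg, ELEV_NODES, CORRECTION)
    corr_sigma = np.interp(elevation_deg, ELEV_NODES, CORR_SIGMA)
    corrected = raw_range - corr
    total_sigma = np.hypot(sigma_code, corr_sigma)      # combined standard deviation
    weight = 1.0 / total_sigma**2                       # for least-squares / PPP filtering
    return corrected, total_sigma, weight

if __name__ == "__main__":
    rng_m, sig, w = correct_pseudorange(raw_range=23_456_789.123, elevation_deg=37.5)
    print(f"corrected range: {rng_m:.3f} m, sigma: {sig:.3f} m, weight: {w:.1f}")
```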

  19. Mutual Calculations in Creating Accounting Models: A Demonstration of the Power of Matrix Mathematics in Accounting Education

    Science.gov (United States)

    Vysotskaya, Anna; Kolvakh, Oleg; Stoner, Greg

    2016-01-01

    The aim of this paper is to describe the innovative teaching approach used in the Southern Federal University, Russia, to teach accounting via a form of matrix mathematics. It thereby contributes to disseminating the technique of teaching to solve accounting cases using mutual calculations to a worldwide audience. The approach taken in this course…
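
    As a generic illustration of the matrix view of double-entry bookkeeping that underlies such mutual-calculation approaches (the chart of accounts and transactions below are invented for demonstration and are not the course materials described in the paper):

```python
import numpy as np

# A tiny chart of accounts and a batch of transactions (debit account, credit account, amount).
ACCOUNTS = ["Cash", "Inventory", "Equity", "Revenue", "Expenses"]
TRANSACTIONS = [
    ("Cash", "Equity", 10000.0),      # owner invests cash
    ("Inventory", "Cash", 4000.0),    # purchase inventory for cash
    ("Cash", "Revenue", 6500.0),      # cash sale
    ("Expenses", "Cash", 1200.0),     # pay operating expenses
]

def transaction_matrix(accounts, transactions):
    """Cross-correspondence matrix T where T[d, c] accumulates the amounts
    debited to account d and credited to account c."""
    idx = {name: i for i, name in enumerate(accounts)}
    T = np.zeros((len(accounts), len(accounts)))
    for debit, credit, amount in transactions:
        T[idx[debit], idx[credit]] += amount
    return T

if __name__ == "__main__":
    T = transaction_matrix(ACCOUNTS, TRANSACTIONS)
    debit_totals = T.sum(axis=1)      # row sums: total debited to each account
    credit_totals = T.sum(axis=0)     # column sums: total credited to each account
    balances = debit_totals - credit_totals
    for name, bal in zip(ACCOUNTS, balances):
        print(f"{name:10s} {bal:10.2f}")
    assert abs(balances.sum()) < 1e-9  # double entry: balances sum to zero
```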

  20. Kosambi and Proper Orthogonal Decomposition

    Indian Academy of Sciences (India)

    In 1943 Kosambi published a paper titled 'Statistics in function space' in the Journal of the Indian Mathematical Society. This paper was the first to propose the technique of statistical analysis often called proper orthogonal decomposition today. This article describes the contents of that paper and Kosambi's approach to ...

  1. Accountability: a missing construct in models of adherence behavior and in clinical practice

    Directory of Open Access Journals (Sweden)

    Oussedik E

    2017-07-01

    Full Text Available Elias Oussedik,1 Capri G Foy,2 E J Masicampo,3 Lara K Kammrath,3 Robert E Anderson,1 Steven R Feldman1,4,5 1Center for Dermatology Research, Department of Dermatology, Wake Forest School of Medicine, Winston-Salem, NC, USA; 2Department of Social Sciences and Health Policy, Wake Forest School of Medicine, Winston-Salem, NC, USA; 3Department of Psychology, Wake Forest University, Winston-Salem, NC, USA; 4Department of Pathology, Wake Forest School of Medicine, Winston-Salem, NC, USA; 5Department of Public Health Sciences, Wake Forest School of Medicine, Winston-Salem, NC, USA Abstract: Piano lessons, weekly laboratory meetings, and visits to health care providers have in common an accountability that encourages people to follow a specified course of action. The accountability inherent in the social interaction between a patient and a health care provider affects patients’ motivation to adhere to treatment. Nevertheless, accountability is a concept not found in adherence models, and is rarely employed in typical medical practice, where patients may be prescribed a treatment and not seen again until a return appointment 8–12 weeks later. The purpose of this paper is to describe the concept of accountability and to incorporate accountability into an existing adherence model framework. Based on the Self-Determination Theory, accountability can be considered in a spectrum from a paternalistic use of duress to comply with instructions (controlled accountability) to patients’ autonomous internal desire to please a respected health care provider (autonomous accountability), the latter expected to best enhance long-term adherence behavior. Existing adherence models were reviewed with a panel of experts, and an accountability construct was incorporated into a modified version of Bandura’s Social Cognitive Theory. Defining accountability and incorporating it into an adherence model will facilitate the development of measures of accountability as well

  2. N-body modeling of globular clusters: detecting intermediate-mass black holes by non-equipartition in HST proper motions

    Science.gov (United States)

    Trenti, Michele

    2010-09-01

    Intermediate Mass Black Holes (IMBHs) are objects of considerable astrophysical significance. They have been invoked as possible remnants of Population III stars, precursors of supermassive black holes, sources of ultra-luminous X-ray emission, and emitters of gravitational waves. The centers of globular clusters, where they may have formed through runaway collapse of massive stars, may be our best chance of detecting them. HST studies of velocity dispersions have provided tentative evidence, but the measurements are difficult and the results have been disputed. It is thus important to explore and develop additional indicators of the presence of an IMBH in these systems. In a Cycle 16 theory project we focused on the fingerprints of an IMBH derived from HST photometry. We showed that an IMBH leads to a detectable quenching of mass segregation. Analysis of HST-ACS data for NGC 2298 validated the method, and ruled out an IMBH of more than 300 solar masses. We propose here to extend the search for IMBH signatures from photometry to kinematics. The velocity dispersion of stars in collisionally relaxed stellar systems such as globular clusters scales with main-sequence mass as sigma ∝ m^alpha. A value alpha = -0.5 corresponds to equipartition. Mass-dependent kinematics can now be measured from HST proper motion studies (e.g., alpha = -0.21 for Omega Cen). Preliminary analysis shows that the value of alpha can be used as an indicator of the presence of an IMBH. In fact, the quenching of mass segregation is a result of the degree of equipartition that the system attains. However, detailed numerical simulations are required to quantify this. Therefore we propose (a) to carry out a new, larger set of realistic N-body simulations of star clusters with IMBHs, primordial binaries and stellar evolution to predict in detail the expected kinematic signatures and (b) to compare these predictions to datasets that are becoming available. Considerable HST resources have been invested in
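
    The scaling sigma ∝ m^alpha can be estimated directly from proper-motion data by binning stars in mass and fitting a power law; the following Python sketch shows the idea on simulated data (all numbers are invented, not HST measurements).

```python
import numpy as np

# Hypothetical catalogue: stellar masses (Msun) and 1-D proper-motion
# velocities (km/s) for cluster members; real data would come from HST.
rng = np.random.default_rng(1)
mass = rng.uniform(0.3, 0.9, 5000)
alpha_true = -0.3                       # partial equipartition
v = rng.normal(0.0, 8.0 * mass**alpha_true)

# Bin by mass, measure the dispersion in each bin, fit sigma ∝ m^alpha.
bins = np.linspace(0.3, 0.9, 7)
idx = np.digitize(mass, bins)
m_bin, sig_bin = [], []
for k in range(1, len(bins)):
    sel = idx == k
    m_bin.append(mass[sel].mean())
    sig_bin.append(v[sel].std(ddof=1))

slope, intercept = np.polyfit(np.log(m_bin), np.log(sig_bin), 1)
print(f"recovered alpha = {slope:.2f} (alpha = -0.5 would be full equipartition)")
```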

  3. A Social Accountable Model for Medical Education System in Iran: A Grounded-Theory

    Directory of Open Access Journals (Sweden)

    Mohammadreza Abdolmaleki

    2017-10-01

    Full Text Available Social accountability has been increasingly discussed over the past three decades in various fields providing service to the community and has been expressed as a goal for various areas. In the medical education system, like other social accountability areas, it is considered as one of the main objectives globally. The aim of this study was to seek a social accountability theory in the medical education system that is capable of identifying all the standards, norms, and conditions within the country related to the study subject and recognizing their relationships. In this study, a total of eight experts in the field of social accountability in the medical education system with executive or study experience were interviewed personally. After analysis of interviews, 379 codes, 59 secondary categories, 16 subcategories, and 9 main categories were obtained. The resulting data was collected and analyzed at three levels of open coding, axial coding, and selective coding in the form of a grounded theory study of “Accountability model of medical education in Iran”, which can be used in education system’s policies and planning for social accountability, given that almost all effective components of social accountability in the higher education health system with causal and facilitator associations were determined. Keywords: SOCIAL ACCOUNTABILITY, COMMUNITY–ORIENTED MEDICINE, COMMUNITY MEDICINE, EDUCATION SYSTEM, GROUNDED THEORY

  4. Toward a Useful Model for Group Mentoring in Public Accounting Firms

    Directory of Open Access Journals (Sweden)

    Steven J. Johnson

    2013-07-01

    Full Text Available Today’s public accounting firms face a number of challenges in relation to their most valuable resource and primary revenue generator, human capital. Expanding regulations, technology advances, increased competition and high turnover rates are just a few of the issues confronting public accounting leaders in today’s complex business environment. In recent years, some public accounting firms have attempted to combat low retention and high burnout rates with traditional one-to-one mentoring programs, with varying degrees of success. Many firms have found that they lack the resources necessary to successfully implement and maintain such programs. In other industries, organizations have used a group mentoring approach in an attempt to remove potential barriers to mentoring success. Although the research regarding group mentoring shows promise for positive organizational outcomes, no cases could be found in the literature regarding its usage in a public accounting firm. Because of the unique challenges associated with public accounting firms, this paper attempts to answer two questions: (1) Does group mentoring provide a viable alternative to traditional mentoring in a public accounting firm? (2) If so, what general model might be used for implementing such a program? In answering these questions, a review of the group mentoring literature is provided, along with a suggested model for the implementation of group mentoring in a public accounting firm.

  5. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    DEFF Research Database (Denmark)

    Jensen, Ninna Reitzel; Schomacker, Kristian Juul

    2015-01-01

    Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death......, disability, etc. In our treatment of participating life insurance, we have special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance...... and unit-linked insurance. By use of a two-account model, we are able to illustrate general concepts without making the model too abstract. To allow for complicated financial markets without dramatically increasing the mathematical complexity, we focus on economic scenarios. We illustrate the use of our...

  6. A simulation model of hospital management based on cost accounting analysis according to disease.

    Science.gov (United States)

    Tanaka, Koji; Sato, Junzo; Guo, Jinqiu; Takada, Akira; Yoshihara, Hiroyuki

    2004-12-01

    Since a little before 2000, hospital cost accounting has been increasingly performed at Japanese national university hospitals. At Kumamoto University Hospital, for instance, departmental costs have been analyzed since 2000, and since 2003 the cost balance has been obtained by disease in preparation for the Diagnosis-Related Group and Prospective Payment System. On the basis of these experiences, we have constructed a simulation model of hospital management. The program has worked correctly in repeated trials and with satisfactory speed. Although there is room for improvement in the detailed accounts and the cost accounting engine, the basic model has proved satisfactory. The hospital management model is based on the financial data of an existing hospital. We will later improve the program's structure and use a wider variety of hospital management data. A prospective outlook may then be obtained for the practical application of this hospital management model.

  7. Facility level SSAC for model country - an introduction and material balance accounting principles

    International Nuclear Information System (INIS)

    Jones, R.J.

    1989-01-01

    A facility level State System of Accounting for and Control of Nuclear Materials (SSAC) for a model country and the principles of materials balance accounting relating to that country are described. The seven principal elements of a SSAC are examined and a facility level system based on them discussed. The seven elements are organization and management; nuclear material measurements; measurement quality; records and reports; physical inventory taking; material balance closing; containment and surveillance. 11 refs., 19 figs., 5 tabs
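
    The material balance closing mentioned among the seven elements can be illustrated with a minimal calculation of material unaccounted for (MUF); the figures below are invented, and the sketch ignores measurement uncertainty, which a real SSAC would evaluate alongside MUF.

```python
# Minimal material-balance closing for one balance period (figures invented).
# MUF (material unaccounted for) = book ending inventory minus physical inventory.
beginning_physical_inventory = 125.40   # kg of nuclear material
receipts = 48.75                        # additions to inventory during the period
shipments = 52.10                       # removals (shipments, measured discards)
ending_physical_inventory = 121.65      # from the physical inventory taking (PIT)

book_ending_inventory = beginning_physical_inventory + receipts - shipments
muf = book_ending_inventory - ending_physical_inventory
print(f"book ending inventory : {book_ending_inventory:7.2f} kg")
print(f"physical inventory    : {ending_physical_inventory:7.2f} kg")
print(f"MUF                   : {muf:7.2f} kg")
# In practice MUF is compared with its measurement uncertainty (sigma_MUF)
# before drawing any safeguards conclusion.
```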

  8. Assimilation of tourism satellite accounts and applied general equilibrium models to inform tourism policy analysis

    OpenAIRE

    Rossouw, Riaan; Saayman, Melville

    2011-01-01

    Historically, tourism policy analysis in South Africa has posed challenges to accurate measurement. The primary reason for this is that tourism is not designated as an 'industry' in standard economic accounts. This paper therefore demonstrates the relevance and need for applied general equilibrium (AGE) models to be completed and extended through an integration with tourism satellite accounts (TSAs) as a tool for policy makers (especially tourism policy makers) in South Africa. The paper sets...

  9. Analysis Social Security System Model in South Sulawesi Province: On Accounting Perspective

    OpenAIRE

    Mediaty,; Said, Darwis; Syahrir,; Indrijawati, Aini

    2015-01-01

    This research aims to analyze poverty, education, and health in a social security system model from an accounting perspective, using an empirical study of South Sulawesi Province. The issuance of Law No. 40 of 2004 regarding the National Social Security System is one expression of the government's attention to social welfare. Accounting, as a social science, is well placed to help create social security mechanisms, and one of the crucial mechanisms is the social security system. This research is a grounded exploratory research w...

  10. School Board Improvement Plans in Relation to the AIP Model of Educational Accountability: A Content Analysis

    Science.gov (United States)

    van Barneveld, Christina; Stienstra, Wendy; Stewart, Sandra

    2006-01-01

    For this study we analyzed the content of school board improvement plans in relation to the Achievement-Indicators-Policy (AIP) model of educational accountability (Nagy, Demeris, & van Barneveld, 2000). We identified areas of congruence and incongruence between the plans and the model. Results suggested that the content of the improvement…

  11. Lessons learned for spatial modelling of ecosystem services in support of ecosystem accounting

    NARCIS (Netherlands)

    Schroter, M.; Remme, R.P.; Sumarga, E.; Barton, D.N.; Hein, L.G.

    2015-01-01

    Assessment of ecosystem services through spatial modelling plays a key role in ecosystem accounting. Spatial models for ecosystem services try to capture spatial heterogeneity with high accuracy. This endeavour, however, faces several practical constraints. In this article we analyse the trade-offs

  12. Accounting for differences in dieting status: steps in the refinement of a model.

    Science.gov (United States)

    Huon, G; Hayne, A; Gunewardene, A; Strong, K; Lunn, N; Piira, T; Lim, J

    1999-12-01

    The overriding objective of this paper is to outline the steps involved in refining a structural model to explain differences in dieting status. Cross-sectional data (representing the responses of 1,644 teenage girls) derive from the preliminary testing in a 3-year longitudinal study. A battery of measures assessed social influence, vulnerability (to conformity) disposition, protective (social coping) skills, and aspects of positive familial context as core components in a model proposed to account for the initiation of dieting. Path analyses were used to establish the predictive ability of those separate components and their interrelationships in accounting for differences in dieting status. Several components of the model were found to be important predictors of dieting status. The model incorporates significant direct, indirect (or mediated), and moderating relationships. Taking all variables into account, the strongest prediction of dieting status was from peer competitiveness, using a new scale developed specifically for this study. Systematic analyses are crucial for the refinement of models to be used in large-scale multivariate studies. In the short term, the model investigated in this study has been shown to be useful in accounting for cross-sectional differences in dieting status. The refined model will be most powerfully employed in large-scale time-extended studies of the initiation of dieting to lose weight. Copyright 1999 by John Wiley & Sons, Inc.

  13. The Effect of Platelet-Rich Plasma on Survival of the Composite Graft and the Proper Time of Injection in a Rabbit Ear Composite Graft Model

    Directory of Open Access Journals (Sweden)

    Hyun Nam Choi

    2014-11-01

    Full Text Available Background: Administration of growth factors has been associated with increased viability of composite grafts greater than 1-cm in diameter. Platelet-rich plasma (PRP) contains many of the growth factors studied. In this study, we evaluate the effect of PRP injection on composite graft viability and the proper time for injection. Methods: A total of 24 New Zealand White rabbits were divided into four groups. Autologous PRP was injected into the recipient sites three days before grafting in group 1, on the day of grafting in group 2, and three days after grafting in group 3. Group 4 served as a control without PRP administration. Auricular composite grafts of 3-cm diameter were harvested and grafted back into place after being rotated 180 degrees. Median graft viability and microvessel density were evaluated at day 21 after grafting via macroscopic photographs and immunofluorescent staining, respectively. Results: The median graft survival rate was 97.8% in group 1, 69.2% in group 2, 55.7% in group 3, and 40.8% in the control group. The median vessel counts were 34 (per ×200 HPF) in group 1, 24.5 in group 2, 19.5 in group 3, and 10.5 in the control group. Conclusions: This study demonstrates that PRP administration is associated with increased composite graft viability. All experimental groups showed a significantly higher survival rate and microvessel density, compared with the control group. Pre-administration of PRP was followed by the highest graft survival rate and revascularization. PRP treatments are minimally invasive, fast, easily applicable, and inexpensive, and offer a potential clinical pathway to larger composite grafts.

  14. Modelling characteristics of photovoltaic panels with thermal phenomena taken into account

    International Nuclear Information System (INIS)

    Krac, Ewa; Górecki, Krzysztof

    2016-01-01

    In the paper a new form of the electrothermal model of photovoltaic panels is proposed. This model takes into account the optical, electrical and thermal properties of the considered panels, as well as electrical and thermal properties of the protecting circuit and thermal inertia of the considered panels. The form of this model is described and some results of measurements and calculations of mono-crystalline and poly-crystalline panels are presented

  15. Accounting for uncertainty in ecological analysis: the strengths and limitations of hierarchical statistical modeling.

    Science.gov (United States)

    Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K

    2009-04-01

    Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
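
    A minimal empirical-Bayes sketch of the kind of two-level (data/process) hierarchy the authors advocate is given below: site-level means are partially pooled toward the grand mean according to the ratio of sampling to between-site variance. The data are simulated and the method-of-moments variance estimate is a deliberate simplification.

```python
import numpy as np

# Two-level (hierarchical) normal model fitted by empirical Bayes:
# site-level means are partially pooled toward the grand mean.
rng = np.random.default_rng(42)
n_sites, n_obs = 12, 8
true_site_means = rng.normal(20.0, 3.0, n_sites)                     # process level
data = rng.normal(true_site_means[:, None], 5.0, (n_sites, n_obs))   # data level

y_bar = data.mean(axis=1)                              # site sample means
s2_within = data.var(axis=1, ddof=1).mean() / n_obs    # sampling variance of y_bar
tau2 = max(y_bar.var(ddof=1) - s2_within, 1e-9)        # between-site variance
grand_mean = y_bar.mean()

shrinkage = s2_within / (s2_within + tau2)
theta_hat = shrinkage * grand_mean + (1 - shrinkage) * y_bar   # partial pooling

print("shrinkage factor:", round(shrinkage, 2))
print("no pooling RMSE :", np.sqrt(np.mean((y_bar - true_site_means) ** 2)))
print("partial pooling :", np.sqrt(np.mean((theta_hat - true_site_means) ** 2)))
```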

  16. PREDICTIVE CAPACITY OF INSOLVENCY MODELS BASED ON ACCOUNTING NUMBERS AND DESCRIPTIVE DATA

    Directory of Open Access Journals (Sweden)

    Rony Petson Santana de Souza

    2012-09-01

    Full Text Available In Brazil, research into models to predict insolvency started in the 1970s, with most authors using discriminant analysis as a statistical tool in their models. In more recent years, authors have increasingly tried to verify whether it is possible to forecast insolvency using descriptive data contained in firms’ reports. This study examines the capacity of some insolvency models to predict the failure of Brazilian companies that have gone bankrupt. The study is descriptive in nature with a quantitative approach, based on documentary research. The sample is composed of 13 companies that were declared bankrupt between 1997 and 2003. The results indicate that the majority of the insolvency prediction models tested showed high rates of correct forecasts. The models relying on descriptive reports on average were more likely to succeed than those based on accounting figures. These findings demonstrate that although some studies indicate a lack of validity of predictive models created in different business settings, some of these models have good capacity to forecast insolvency in Brazil. We can conclude that both models based on accounting numbers and those relying on descriptive reports can predict the failure of firms. Therefore, it can be inferred that the majority of bankruptcy prediction models that make use of accounting numbers can succeed in predicting the failure of firms.
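
    As an illustration of an insolvency model based purely on accounting numbers, the sketch below implements the classic Altman (1968) Z-score, which is not one of the Brazilian models tested in the paper; the input figures are invented.

```python
# Classic Altman (1968) Z-score as an illustration of an accounting-numbers
# based insolvency model (not one of the Brazilian models tested in the paper).
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, total_liabilities, sales, total_assets):
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

# Invented figures for a hypothetical firm (in millions).
z = altman_z(working_capital=15, retained_earnings=40, ebit=22,
             market_value_equity=120, total_liabilities=90,
             sales=210, total_assets=200)
zone = "safe" if z > 2.99 else "distress" if z < 1.81 else "grey"
print(f"Z = {z:.2f} -> {zone} zone")
```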

  17. The Anachronism of the Local Public Accountancy Determinate by the Accrual European Model

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2009-01-01

    Full Text Available Placing the European accrual model against the cash accountancy model presently used in Romania at the level of the local communities makes it possible for the anachronism of the latter to manifest itself, concentrating the discussion on whether the accrual model should be included in everyday public practice. The basis of the accrual model was first defined in the law regarding commercial societies adopted in Great Britain in 1985, which determined that all income and taxes referring to the financial year “will be taken into consideration without any boundary to the reception or payment date.”1 The accrual model in accountancy requires the recording of the non-cash effects of transactions or financial events in the periods in which they arise, and not in the periods in which any cash is generated, received or paid. The development of business was the basis for the “sophistication” of the recording of transactions and financial events, being a prerequisite for recording the debtors’ or creditors’ sums.

  18. Multiple Imputation to Account for Measurement Error in Marginal Structural Models.

    Science.gov (United States)

    Edwards, Jessie K; Cole, Stephen R; Westreich, Daniel; Crane, Heidi; Eron, Joseph J; Mathews, W Christopher; Moore, Richard; Boswell, Stephen L; Lesko, Catherine R; Mugavero, Michael J

    2015-09-01

    Marginal structural models are an important tool for observational studies. These models typically assume that variables are measured without error. We describe a method to account for differential and nondifferential measurement error in a marginal structural model. We illustrate the method estimating the joint effects of antiretroviral therapy initiation and current smoking on all-cause mortality in a United States cohort of 12,290 patients with HIV followed for up to 5 years between 1998 and 2011. Smoking status was likely measured with error, but a subset of 3,686 patients who reported smoking status on separate questionnaires composed an internal validation subgroup. We compared a standard joint marginal structural model fit using inverse probability weights to a model that also accounted for misclassification of smoking status using multiple imputation. In the standard analysis, current smoking was not associated with increased risk of mortality. After accounting for misclassification, current smoking without therapy was associated with increased mortality (hazard ratio [HR]: 1.2 [95% confidence interval [CI] = 0.6, 2.3]). The HR for current smoking and therapy [0.4 (95% CI = 0.2, 0.7)] was similar to the HR for no smoking and therapy (0.4; 95% CI = 0.2, 0.6). Multiple imputation can be used to account for measurement error in concert with methods for causal inference to strengthen results from observational studies.
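
    The pooling step of multiple imputation can be illustrated with Rubin's rules on the log hazard-ratio scale; in the sketch below the per-imputation estimates are invented, and the imputation of smoking status and the inverse-probability weighting are assumed to have been done beforehand, once per imputed dataset.

```python
import numpy as np

# Rubin's rules on the log hazard-ratio scale (illustrative numbers only).
log_hr = np.array([0.21, 0.15, 0.26, 0.18, 0.23])      # estimates, M imputations
se = np.array([0.30, 0.28, 0.31, 0.29, 0.30])          # their standard errors
M = len(log_hr)

q_bar = log_hr.mean()                       # pooled point estimate
within = np.mean(se**2)                     # within-imputation variance
between = log_hr.var(ddof=1)                # between-imputation variance
total_var = within + (1 + 1 / M) * between  # Rubin's total variance
se_pooled = np.sqrt(total_var)

hr = np.exp(q_bar)
ci = np.exp(q_bar + np.array([-1.96, 1.96]) * se_pooled)
print(f"pooled HR = {hr:.2f} (95% CI {ci[0]:.2f}, {ci[1]:.2f})")
```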

  19. Model Application of Accounting Information Systems of Spare Parts Sales and Purchase on Car Service Company

    Directory of Open Access Journals (Sweden)

    Lianawati Christian

    2015-09-01

    Full Text Available The purpose of this research is to analyze the accounting information systems for sales and purchases of spare parts in general car service companies and to identify the problems encountered and the information needs. This research used a literature study to collect data, a field study with observation, and a design phase using UML (Unified Modeling Language), with activity diagrams, class diagrams, use case diagrams, database design, form design, display design, and draft reports. The result achieved is an application model of accounting information systems for the sales and purchases of spare parts in general car service companies. In conclusion, the accounting information system for sales and purchases makes it easy for management to obtain information quickly, and supports fast and accurate reporting.

  20. Accounting for environmental variability, modeling errors, and parameter estimation uncertainties in structural identification

    Science.gov (United States)

    Behmanesh, Iman; Moaveni, Babak

    2016-07-01

    This paper presents a Hierarchical Bayesian model updating framework to account for the effects of ambient temperature and excitation amplitude. The proposed approach is applied for model calibration, response prediction and damage identification of a footbridge under changing environmental/ambient conditions. The concrete Young's modulus of the footbridge deck is the considered updating structural parameter, with its mean and variance modeled as functions of temperature and excitation amplitude. The identified modal parameters over 27 months of continuous monitoring of the footbridge are used to calibrate the updating parameters. One of the objectives of this study is to show that by increasing the levels of information in the updating process, the posterior variation of the updating structural parameter (concrete Young's modulus) is reduced. To this end, the calibration is performed at three information levels using (1) the identified modal parameters, (2) modal parameters and ambient temperatures, and (3) modal parameters, ambient temperatures, and excitation amplitudes. The calibrated model is then validated by comparing the model-predicted natural frequencies and those identified from measured data after a deliberate change to the structural mass. It is shown that accounting for modeling error uncertainties is crucial for reliable response prediction, and accounting for only the estimated variability of the updating structural parameter is not sufficient for accurate response predictions. Finally, the calibrated model is used for damage identification of the footbridge.

  1. Accounting for Co-Teaching: A Guide for Policymakers and Developers of Value-Added Models

    Science.gov (United States)

    Isenberg, Eric; Walsh, Elias

    2015-01-01

    We outline the options available to policymakers for addressing co-teaching in a value-added model. Building on earlier work, we propose an improvement to a method of accounting for co-teaching that treats co-teachers as teams, with each teacher receiving equal credit for co-taught students. Hock and Isenberg (2012) described a method known as the…

  2. Proper motion survey for solar nearby stars

    International Nuclear Information System (INIS)

    Goldman, Bertrand

    2001-01-01

    For its microlensing observations EROS 2 built one of the largest CCD mosaics, operating since 1996. This instrument allowed us to survey a large area of the sky, to look for faint, cool compact objects in the Solar neighborhood that may contribute to the Dark Matter revealed by flat rotation curves of spiral galaxies and the Milky Way. We imaged over 400 square degrees, at least three times over four years, with a single, stable instrument. The aim of this work is the reduction, the analysis and the detection of high proper motion objects that would look like those expected in a dark halo. We selected and analyzed thousands of images taken in two bands, visible and near-infrared, and obtained a catalogue of several thousand stars with proper motion typically higher than 80 milli-arc-seconds per year. None of these candidates displays the expected properties of the halo objects: very high proper motion and faintness. The second part of our work was to put constraints on the contributions of white dwarfs and brown dwarfs to the halo. To do that, we simulated our data set and estimated our sensitivity to halo objects. We compared our results about moderately high proper motion stars with existing Galactic models, and confirmed the robustness of these models. We deduced an upper limit to the contribution of M_V = 17.5 white dwarfs to the standard halo of 10% (at the 95% confidence level), or 5% of a 14 Gyr old halo, and to the contribution of brown dwarfs of 7% (95% C.L.). Finally, among our candidates, several interesting objects that do not belong to the halo but are among the coolest and faintest known have been discovered. This systematic search for faint, nearby objects thus led us to study disk L dwarfs, as well as old white dwarfs of the disk. (author) [fr

  3. Accounting assessment

    Directory of Open Access Journals (Sweden)

    Kafka S.М.

    2017-03-01

    Full Text Available The proper evaluation of accounting objects essentially influences the reliability of assessing the financial situation of a company; the problem of accounting estimation is therefore quite relevant. The works of domestic and foreign scholars on the assessment of accounting objects, together with the regulatory and legal acts of Ukraine governing accounting and the compilation of financial reporting, form the methodological basis for the research. The author uses theoretical methods of cognition (abstraction and generalization, analysis and synthesis, induction and deduction) and other methods producing conceptual knowledge for the synthesis of theoretical and methodological principles in the evaluation of assets, liabilities and equity. Tabular presentation and information comparison methods are used for the analytical research. The article considers modern approaches to the evaluation of accounting objects and financial statement items. The expedience of keeping records at historical value is demonstrated, while the items of the financial statements are to be presented according to their valuation at the reporting date. In connection with this evaluation, the depreciation of fixed assets is considered as a process of systematically returning into circulation the funds previously advanced for the purchase (production, improvement) of fixed assets and intangible assets, by including the amount of wear in production costs. It is therefore proposed to amortize only the actual costs incurred, i.e. not to depreciate fixed assets received free of charge or surpluses from revaluations of various kinds.

  4. Towards New Empirical Versions of Financial and Accounting Models Corrected for Measurement Errors

    OpenAIRE

    Francois-Éric Racicot; Raymond Théoret; Alain Coen

    2006-01-01

    In this paper, we propose a new empirical version of the Fama and French Model based on the Hausman (1978) specification test and aimed at discarding measurement errors in the variables. The proposed empirical framework is general enough to be used for correcting other financial and accounting models of measurement errors. Removing measurement errors is important at many levels, such as information disclosure, corporate governance and the protection of investors.

  5. The Ruling Discourse on Proper Womanhood in the Hungarian Parliament

    Directory of Open Access Journals (Sweden)

    Annus Irén

    2014-09-01

    Full Text Available Starting with a debate in September 2012 on the incorporation of domestic violence as a distinct offence in Hungary’s new Criminal Code, the issue of gender and proper womanhood has regularly re-surfaced in statements made by ruling coalition MPs in parliamentary debates. Drawing on discourse analysis, this study investigates a selection of these statements in the context of the government’s current policy and public discourse. The paper argues that these discourses outline an essentialist model reflective of a dominant ideology that is traditional, Christian, patriarchal and heteronormative, which, by hinting at women’s accountability for certain social ills, also allows for a chain of associations that ultimately results in the subversion of the overall social status of women, dividing and marginalising them further and discrediting any claims or actions aimed at establishing a more egalitarian society in the country.

  6. Mass estimates from stellar proper motions: the mass of ω Centauri

    Science.gov (United States)

    D'Souza, Richard; Rix, Hans-Walter

    2013-03-01

    We lay out and apply methods to use proper motions of individual kinematic tracers for estimating the dynamical mass of star clusters. We first describe a simple projected mass estimator and then develop an approach that evaluates directly the likelihood of the discrete kinematic data given the model predictions. Those predictions may come from any dynamical modelling approach, and we implement an analytic King model, a spherical isotropic Jeans equation model and an axisymmetric, anisotropic Jeans equation model. This maximum likelihood modelling (MLM) provides a framework for a model-data comparison, and a resulting mass estimate, which accounts explicitly for the discrete nature of the data for individual stars, the varying error bars for proper motions of differing signal-to-noise ratio, and for data incompleteness. Both of these two methods are evaluated for their practicality and are shown to provide an unbiased and robust estimate of the cluster mass. We apply these approaches to the enigmatic globular cluster ω Centauri, combining the proper motion from van Leeuwen et al. with improved photometric cluster membership probabilities. We show that all mass estimates based on spherical isotropic models yield (4.55 ± 0.1) × 10^6 M⊙ [D/(5.5 ± 0.2 kpc)]^3, where our modelling allows us to show how the statistical precision of this estimate improves as more proper motion data of lower signal-to-noise ratio are included. MLM predictions, based on an anisotropic axisymmetric Jeans model, indicate for ω Cen that the inclusion of anisotropies is not important for the mass estimates, but that accounting for the flattening is: flattened models imply (4.05 ± 0.1) × 10^6 M⊙ [D/(5.5 ± 0.2 kpc)]^3, 10 per cent lower than when restricting the analysis to a spherical model. The best current distance estimates imply an additional uncertainty in the mass estimate of 12 per cent.
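
    A minimal version of the projected mass estimator described above is sketched below; the constant C = 16/pi is the classic isotropic, line-of-sight value and is used here only as a placeholder, since the appropriate constant for proper-motion data depends on the assumed anisotropy and geometry. The tracer sample is simulated, not the van Leeuwen et al. catalogue.

```python
import numpy as np

G = 4.301e-3  # gravitational constant in pc (km/s)^2 / Msun

def projected_mass_estimator(R_pc, v_kms, C=16.0 / np.pi):
    """Projected mass estimator M = C/(G N) * sum(v_i^2 * R_i).

    R_pc  : projected radii of the tracer stars [pc]
    v_kms : one velocity component (here: a proper-motion velocity) [km/s]
    C     : orbit/geometry constant; 16/pi is the classic isotropic,
            line-of-sight value, so treat the result as indicative only."""
    R = np.asarray(R_pc, dtype=float)
    v = np.asarray(v_kms, dtype=float)
    return C / (G * len(R)) * np.sum(v**2 * R)

# Invented tracer sample standing in for HST proper motions of cluster stars.
rng = np.random.default_rng(3)
R = rng.uniform(0.5, 10.0, 2000)            # pc
v = rng.normal(0.0, 12.0, 2000)             # km/s (1-D proper-motion velocity)
print(f"M ~ {projected_mass_estimator(R, v):.3e} Msun")
```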

  7. One Model Fits All: Explaining Many Aspects of Number Comparison within a Single Coherent Model-A Random Walk Account

    Science.gov (United States)

    Reike, Dennis; Schwarz, Wolf

    2016-01-01

    The time required to determine the larger of 2 digits decreases with their numerical distance, and, for a given distance, increases with their magnitude (Moyer & Landauer, 1967). One detailed quantitative framework to account for these effects is provided by random walk models. These chronometric models describe how number-related noisy…

  8. Theoretical analysis of a hybrid traffic model accounting for safe velocity

    Science.gov (United States)

    Wang, Yu-Qing; Zhou, Chao-Fan; Yan, Bo-Wen; Zhang, De-Chen; Wang, Ji-Xin; Jia, Bin; Gao, Zi-You; Wu, Qing-Song

    2017-04-01

    A hybrid traffic-flow model [the Wang-Zhou-Yan (WZY) model] is presented in this paper. In the WZY model, the global equilibrium velocity is replaced by the local equilibrium one, which emphasizes that the modification of vehicle velocity is based on safe driving rather than on global deployment. From the safe-driving viewpoint, the effect of drivers’ estimation is taken into account. Moreover, a linear stability analysis of the traffic model has been performed. Furthermore, in order to test the robustness of the system, the evolution of the density wave and the velocity wave of the traffic flow has been calculated numerically.
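
    A generic optimal-velocity car-following sketch is given below to illustrate the idea of relaxing each vehicle toward a local (safe) equilibrium velocity set by its headway; it is not the WZY model itself, whose update rules and estimation effect are not specified in the abstract.

```python
import numpy as np

# Generic optimal-velocity car-following model on a ring road (Bando-type),
# standing in for the local/safe equilibrium velocity idea.
n_cars, road_length, dt, a = 30, 300.0, 0.1, 1.0

def v_opt(gap, v_max=30.0, h_c=25.0):
    """Local 'safe' equilibrium velocity as a function of the headway."""
    return 0.5 * v_max * (np.tanh(gap - h_c) + np.tanh(h_c))

x = np.linspace(0.0, road_length, n_cars, endpoint=False)
x += np.random.default_rng(0).normal(0.0, 0.5, n_cars)   # small perturbation
v = np.full(n_cars, v_opt(road_length / n_cars))

for _ in range(5000):
    gap = (np.roll(x, -1) - x) % road_length    # headway to the leading car
    v += a * (v_opt(gap) - v) * dt              # relax toward the safe velocity
    v = np.clip(v, 0.0, None)
    x = (x + v * dt) % road_length

print("mean speed:", v.mean(), " speed std (jam indicator):", v.std())
```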

  9. Fundamental Principles of Proper Space Kinematics

    Science.gov (United States)

    Wade, Sean

    It is desirable to understand the movement of both matter and energy in the universe based upon fundamental principles of space and time. Time dilation and length contraction are features of Special Relativity derived from the observed constancy of the speed of light. Quantum Mechanics asserts that motion in the universe is probabilistic and not deterministic. While the practicality of these dissimilar theories is well established through widespread application, inconsistencies in their marriage persist, marring their utility, and preventing their full expression. After identifying an error in perspective, the current theories are tested by modifying logical assumptions to eliminate paradoxical contradictions. Analysis of simultaneous frames of reference leads to a new formulation of space and time that predicts the motion of both kinds of particles. Proper Space is a real, three-dimensional space clocked by proper time that is undergoing a densification at the rate of c. Coordinate transformations to a familiar object space and a mathematical stationary space clarify the counterintuitive aspects of Special Relativity. These symmetries demonstrate that within the local universe stationary observers are a forbidden frame of reference; all is in motion. In lieu of Quantum Mechanics and Uncertainty the use of the imaginary number i is restricted for application to the labeling of mass as either material or immaterial. This material phase difference accounts for both the perceived constant velocity of light and its apparent statistical nature. The application of Proper Space Kinematics will advance more accurate representations of microscopic, macroscopic, and cosmological processes and serve as a foundation for further study and reflection, thereafter leading to greater insight.

  10. Accounting of inter-electron correlations in the model of mobile electron shells

    International Nuclear Information System (INIS)

    Panov, Yu.D.; Moskvin, A.S.

    2000-01-01

    The basic features of the model of mobile electron shells for a multielectron atom or cluster are studied. A variational technique for taking electron correlations into account is proposed, in which the coordinates of the centre of a single-particle atomic orbital serve as variational parameters. This makes it possible to interpret a dramatic variation of the electron density distribution under an anisotropic external effect in terms of a limited initial basis. Specific correlated states that might make a correlation contribution to the orbital current are studied. The paper presents a generalization of the typical MO-LCAO scheme with a limited set of single-particle functions, enabling additional multipole-multipole interactions in the cluster to be taken into account [ru

  11. Accounting for Uncertainty in Decision Analytic Models Using Rank Preserving Structural Failure Time Modeling: Application to Parametric Survival Models.

    Science.gov (United States)

    Bennett, Iain; Paracha, Noman; Abrams, Keith; Ray, Joshua

    2018-01-01

    Rank Preserving Structural Failure Time models are one of the most commonly used statistical methods to adjust for treatment switching in oncology clinical trials. The method is often applied in a decision analytic model without appropriately accounting for additional uncertainty when determining the allocation of health care resources. The aim of the study is to describe novel approaches to adequately account for uncertainty when using a Rank Preserving Structural Failure Time model in a decision analytic model. Using two examples, we tested and compared the performance of the novel Test-based method with the resampling bootstrap method and with the conventional approach of no adjustment. In the first example, we simulated life expectancy using a simple decision analytic model based on a hypothetical oncology trial with treatment switching. In the second example, we applied the adjustment method on published data when no individual patient data were available. Mean estimates of overall and incremental life expectancy were similar across methods. However, the bootstrapped and test-based estimates consistently produced greater estimates of uncertainty compared with the estimate without any adjustment applied. Similar results were observed when using the test-based approach on published data, showing that failing to adjust for uncertainty led to smaller confidence intervals. Both the bootstrapping and test-based approaches provide a solution to appropriately incorporate uncertainty, with the benefit that the latter can be implemented by researchers in the absence of individual patient data. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
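
    The resampling bootstrap idea can be sketched as follows: patient-level data are resampled, the survival model is refitted on each replicate, and the decision-model output (here, incremental life expectancy under a simple exponential model) is recomputed, so that its percentile interval reflects the survival-model uncertainty. All data and the exponential assumption are invented for illustration.

```python
import numpy as np

# Bootstrap propagation of survival-model uncertainty into a decision output.
rng = np.random.default_rng(7)
n = 200
t_ctrl = rng.exponential(2.0, n);  d_ctrl = rng.random(n) < 0.8   # time, event flag
t_trt  = rng.exponential(2.8, n);  d_trt  = rng.random(n) < 0.8

def exp_mean_survival(t, d):
    """MLE of the mean under an exponential model with right censoring."""
    return t.sum() / max(d.sum(), 1)

def incremental_le(idx_c, idx_t):
    return exp_mean_survival(t_trt[idx_t], d_trt[idx_t]) - \
           exp_mean_survival(t_ctrl[idx_c], d_ctrl[idx_c])

point = incremental_le(np.arange(n), np.arange(n))
boots = np.array([incremental_le(rng.integers(0, n, n), rng.integers(0, n, n))
                  for _ in range(2000)])
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"incremental life expectancy = {point:.2f} years (95% CI {lo:.2f}, {hi:.2f})")
```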

  12. Accounting for covariate measurement error in a Cox model analysis of recurrence of depression.

    Science.gov (United States)

    Liu, K; Mazumdar, S; Stone, R A; Dew, M A; Houck, P R; Reynolds, C F

    2001-01-01

    When a covariate measured with error is used as a predictor in a survival analysis using the Cox model, the parameter estimate is usually biased. In clinical research, covariates measured without error such as treatment procedure or sex are often used in conjunction with a covariate measured with error. In a randomized clinical trial of two types of treatments, we account for the measurement error in the covariate, log-transformed total rapid eye movement (REM) activity counts, in a Cox model analysis of the time to recurrence of major depression in an elderly population. Regression calibration and two variants of a likelihood-based approach are used to account for measurement error. The likelihood-based approach is extended to account for the correlation between replicate measures of the covariate. Using the replicate data decreases the standard error of the parameter estimate for log(total REM) counts while maintaining the bias reduction of the estimate. We conclude that covariate measurement error and the correlation between replicates can affect results in a Cox model analysis and should be accounted for. In the depression data, these methods render comparable results that have less bias than the results when measurement error is ignored.
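
    A regression-calibration step of the kind described above can be sketched as follows for a classical additive error model with replicate measurements; the calibrated covariate then replaces the noisy one in the Cox regression (fitted with any Cox implementation, e.g. lifelines' CoxPHFitter). Data and parameter values are simulated.

```python
import numpy as np

# Regression calibration using replicate measurements of an error-prone covariate.
rng = np.random.default_rng(11)
n, k = 300, 2                             # subjects, replicates per subject
x_true = rng.normal(5.0, 1.0, n)          # e.g. log(total REM) activity counts
W = x_true[:, None] + rng.normal(0.0, 0.7, (n, k))   # replicate measurements

w_bar = W.mean(axis=1)
sigma_u2 = W.var(axis=1, ddof=1).mean()                  # within-person (error) variance
sigma_x2 = max(w_bar.var(ddof=1) - sigma_u2 / k, 1e-9)   # between-person variance
lam = sigma_x2 / (sigma_x2 + sigma_u2 / k)               # calibration (attenuation) factor

x_hat = w_bar.mean() + lam * (w_bar - w_bar.mean())      # E[X | replicates]
print(f"calibration factor lambda = {lam:.2f}")
# x_hat now enters the Cox model in place of the error-prone measurement.
```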

  13. Using a Rasch Model to Account for Guessing as a Source of Low Discrimination.

    Science.gov (United States)

    Humphry, Stephen

    2015-01-01

    The most common approach to modelling item discrimination and guessing for multiple-choice questions is the three parameter logistic (3PL) model. However, proponents of Rasch models generally avoid using the 3PL model because to model guessing entails sacrificing the distinctive property and advantages of Rasch models. One approach to dealing with guessing based on the application of Rasch models is to omit responses in which guessing appears to play a significant role. However, this approach entails loss of information and it does not account for variable item discrimination. It has been shown, though, that provided specific constraints are met, it is possible to parameterize discrimination while preserving the distinctive property of Rasch models. This article proposes an approach that uses Rasch models to account for guessing on standard multiple-choice items simply by treating it as a source of low item discrimination. Technical considerations are noted although a detailed examination of such considerations is beyond the scope of this article.
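
    The item characteristic curves below illustrate the point numerically: a 3PL curve with a guessing parameter resembles a flatter (lower-discrimination) curve, which is the behaviour the proposed approach absorbs into the discrimination parameter. The parameter values are arbitrary.

```python
import numpy as np

# Item characteristic curves: Rasch, a discrimination-parameterised curve,
# and the 3PL with a guessing parameter (illustration only).
def rasch(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def two_pl(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def three_pl(theta, a, b, c):
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

b = 0.0
print(" theta   Rasch   lowDisc   3PL(c=0.2)")
for t in np.linspace(-4, 4, 9):
    print(f"{t:6.1f}  {rasch(t, b):6.3f}   {two_pl(t, 0.6, b):6.3f}    "
          f"{three_pl(t, 1.0, b, 0.2):6.3f}")
```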

  14. Accounting for Local Dependence with the Rasch Model: The Paradox of Information Increase.

    Science.gov (United States)

    Andrich, David

    Test theories imply statistical, local independence. Where local independence is violated, models of modern test theory that account for it have been proposed. One violation of local independence occurs when the response to one item governs the response to a subsequent item. Expanding on a formulation of this kind of violation between two items in the dichotomous Rasch model, this paper derives three related implications. First, it formalises how the polytomous Rasch model for an item constituted by summing the scores of the dependent items absorbs the dependence in its threshold structure. Second, it shows that as a consequence the unit when the dependence is accounted for is not the same as if the items had no response dependence. Third, it explains the paradox, known, but not explained in the literature, that the greater the dependence of the constituent items the greater the apparent information in the constituted polytomous item when it should provide less information.

  15. Cost accounting models used for price-setting of health services: an international review.

    Science.gov (United States)

    Raulinajtys-Grzybek, Monika

    2014-12-01

    The aim of the article was to present and compare cost accounting models which are used in the area of healthcare for pricing purposes in different countries. Cost information generated by hospitals is further used by regulatory bodies for setting or updating prices of public health services. The article presents a set of examples from different countries of the European Union, Australia and the United States and concentrates on DRG-based payment systems as they primarily use cost information for pricing. Differences between countries concern the methodology used, as well as the data collection process and the scope of the regulations on cost accounting. The article indicates that the accuracy of the calculation is only one of the factors that determine the choice of the cost accounting methodology. Important aspects are also the selection of the reference hospitals, precise and detailed regulations and the existence of complex healthcare information systems in hospitals. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  16. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    Science.gov (United States)

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
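
    The parameter-uncertainty part of such an analysis is commonly handled by probabilistic sensitivity analysis; the sketch below runs a toy two-strategy model with invented distributions purely to show the mechanics (it does not address methodological or structural uncertainty).

```python
import numpy as np

# Toy probabilistic sensitivity analysis for a two-strategy decision model.
rng = np.random.default_rng(2024)
n_sim = 10_000

# Parameter uncertainty expressed as distributions (all values invented).
p_event_std = rng.beta(30, 70, n_sim)                     # event risk, standard care
rr_new      = rng.lognormal(np.log(0.8), 0.1, n_sim)      # relative risk, new treatment
cost_event  = rng.gamma(shape=20, scale=500, size=n_sim)  # cost per event
cost_new_tx = 2_000.0
qaly_loss_event = rng.beta(20, 80, n_sim)

def strategy(p_event, extra_cost):
    cost = extra_cost + p_event * cost_event
    qaly = 1.0 - p_event * qaly_loss_event
    return cost, qaly

c_std, q_std = strategy(p_event_std, 0.0)
c_new, q_new = strategy(p_event_std * rr_new, cost_new_tx)

icer = (c_new - c_std).mean() / (q_new - q_std).mean()
wtp = 50_000.0
prob_ce = np.mean((q_new - q_std) * wtp - (c_new - c_std) > 0)
print(f"ICER ~ {icer:,.0f} per QALY; P(cost-effective at {wtp:,.0f}) = {prob_ce:.2f}")
```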

  17. The self-consistent field model for Fermi systems with account of three-body interactions

    Directory of Open Access Journals (Sweden)

    Yu.M. Poluektov

    2015-12-01

    Full Text Available On the basis of a microscopic model of the self-consistent field, the thermodynamics of the many-particle Fermi system at finite temperatures is constructed with account of three-body interactions, and the quasiparticle equations of motion are obtained. It is shown that the delta-like three-body interaction makes no contribution to the self-consistent field, and the description of three-body forces requires their nonlocality to be taken into account. The spatially uniform system is considered in detail, and on the basis of the developed microscopic approach general formulas are derived for the fermion's effective mass and the system's equation of state with account of the contribution from three-body forces. The effective mass and pressure are numerically calculated for the potential of "semi-transparent sphere" type at zero temperature. Expansions of the effective mass and pressure in powers of density are obtained. It is shown that, with account of only pair forces, the interaction of repulsive character reduces the quasiparticle effective mass relative to the mass of a free particle, and the attractive interaction raises the effective mass. The question of the thermodynamic stability of the Fermi system is considered, and the three-body repulsive interaction is shown to extend the region of stability of a system with interparticle pair attraction. The quasiparticle energy spectrum is calculated with account of three-body forces.

  18. Proper alignment of the microscope.

    Science.gov (United States)

    Rottenfusser, Rudi

    2013-01-01

    The light microscope is merely the first element of an imaging system in a research facility. Such a system may include high-speed and/or high-resolution image acquisition capabilities, confocal technologies, and super-resolution methods of various types. Yet more than ever, the proverb "garbage in-garbage out" remains a fact. Image manipulations may be used to conceal a suboptimal microscope setup, but an artifact-free image can only be obtained when the microscope is optimally aligned, both mechanically and optically. Something else is often overlooked in the quest to get the best image out of the microscope: Proper sample preparation! The microscope optics can only do its job when its design criteria are matched to the specimen or vice versa. The specimen itself, the mounting medium, the cover slip, and the type of immersion medium (if applicable) are all part of the total optical makeup. To get the best results out of a microscope, understanding the functions of all of its variable components is important. Only then one knows how to optimize these components for the intended application. Different approaches might be chosen to discuss all of the microscope's components. We decided to follow the light path which starts with the light source and ends at the camera or the eyepieces. To add more transparency to this sequence, the section up to the microscope stage was called the "Illuminating Section", to be followed by the "Imaging Section" which starts with the microscope objective. After understanding the various components, we can start "working with the microscope." To get the best resolution and contrast from the microscope, the practice of "Koehler Illumination" should be understood and followed by every serious microscopist. Step-by-step instructions as well as illustrations of the beam path in an upright and inverted microscope are included in this chapter. A few practical considerations are listed in Section 3. Copyright © 2013 Elsevier Inc. All rights

  19. Analysis of a microscopic model of taking into account 2p2h configurations

    International Nuclear Information System (INIS)

    Kamerdzhiev, S.P.; Tkachev, V.N.

    1986-01-01

    The Green's-function method has been used to obtain a general equation for the effective field in a nucleus, taking into account both 1p1h and 2p2h configurations. This equation has been used as the starting point for derivation of a previously developed microscopic model of taking 1p1h+phonon configurations into account in magic nuclei. The equation for the density matrix is analyzed in this model. It is shown that the number of quasiparticles is conserved. An equation is obtained for the effective field in the coordinate representation, which provides a formulation of the problem in the 1p1h+2p2h+continuum approximation. The equation is derived and quantitatively analyzed in the space of one-phonon states

  20. A Simple Accounting-based Valuation Model for the Debt Tax Shield

    Directory of Open Access Journals (Sweden)

    Andreas Scholze

    2010-05-01

    Full Text Available This paper describes a simple way to integrate the debt tax shield into an accounting-based valuation model. The market value of equity is determined by forecasting residual operating income, which is calculated by charging operating income for the operating assets at a required return that accounts for the tax benefit that comes from borrowing to raise cash for the operations. The model assumes that the firm maintains a deterministic financial leverage ratio, which tends to converge quickly to typical steady-state levels over time. From a practical point of view, this characteristic is of particular help, because it allows a continuing value calculation at the end of a short forecast period.
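
    A generic residual-operating-income valuation is sketched below; the paper's key step of embedding the debt tax shield in the required return on operating assets is represented only schematically by the single rate r_required, and all figures and growth assumptions are invented.

```python
# Generic residual-operating-income valuation sketch (all figures invented).
net_operating_assets = 1_000.0          # book value of operating assets at t = 0
forecast_oi = [120.0, 130.0, 138.0, 144.0, 148.0]   # operating income forecasts
r_required = 0.09                       # required return (tax-shield-adjusted rate)
growth_terminal = 0.02                  # long-run growth of residual income

value = net_operating_assets
noa = net_operating_assets
for t, oi in enumerate(forecast_oi, start=1):
    residual_oi = oi - r_required * noa          # charge OI for the opening assets
    value += residual_oi / (1 + r_required) ** t
    noa *= 1.04                                  # assumed growth of operating assets

# Continuing (terminal) value of residual income beyond the forecast horizon.
last_ri = forecast_oi[-1] - r_required * noa / 1.04
terminal = last_ri * (1 + growth_terminal) / (r_required - growth_terminal)
value += terminal / (1 + r_required) ** len(forecast_oi)
print(f"estimated value of operations ~ {value:,.0f}")
```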

  1. Accounting outsourcing and some problems of selected software for accounting

    OpenAIRE

    Turková, Lenka

    2009-01-01

    This diploma thesis on accounting outsourcing and key problems of selected accounting software deals with accounting outsourcing. The work focuses on the question of the proper selection of an accounting firm and on the conditions of cooperation with it. The reader is also acquainted with some accounting software packages and with their advantages and disadvantages.

  2. An enhanced temperature index model for debris-covered glaciers accounting for thickness effect

    Science.gov (United States)

    Carenzo, M.; Pellicciotti, F.; Mabillard, J.; Reid, T.; Brock, B. W.

    2016-08-01

    Debris-covered glaciers are increasingly studied because it is assumed that debris cover extent and thickness could increase in a warming climate, with more regular rockfalls from the surrounding slopes and more englacial melt-out material. Debris energy-balance models have been developed to account for the melt rate enhancement/reduction due to a thin/thick debris layer, respectively. However, such models require a large amount of input data that are not often available, especially in remote mountain areas such as the Himalaya, and can be difficult to extrapolate. Due to their lower data requirements, empirical models have been used extensively in clean glacier melt modelling. For debris-covered glaciers, however, they generally simplify the debris effect by using a single melt-reduction factor which does not account for the influence of varying debris thickness on melt and prescribe a constant reduction for the entire melt across a glacier. In this paper, we present a new temperature-index model that accounts for debris thickness in the computation of melt rates at the debris-ice interface. The model empirical parameters are optimized at the point scale for varying debris thicknesses against melt rates simulated by a physically-based debris energy balance model. The latter is validated against ablation stake readings and surface temperature measurements. Each parameter is then related to a plausible set of debris thickness values to provide a general and transferable parameterization. We develop the model on Miage Glacier, Italy, and then test its transferability on Haut Glacier d'Arolla, Switzerland. The performance of the new debris temperature-index (DETI) model in simulating the glacier melt rate at the point scale is comparable to the one of the physically based approach, and the definition of model parameters as a function of debris thickness allows the simulation of the nonlinear relationship of melt rate to debris thickness, summarised by the
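
    The structure of a debris-dependent temperature-index model can be sketched as below, with empirical factors that peak under thin debris and decay under thick debris; the functional forms and numbers are hypothetical placeholders, not the calibrated DETI parameters from Miage Glacier.

```python
import numpy as np

# Enhanced temperature-index melt sketch with debris-thickness-dependent factors.
def melt_factors(debris_m):
    """Return (temperature factor, shortwave radiation factor) vs debris depth."""
    damping = np.exp(-debris_m / 0.10)                   # suppression by thick debris
    enhance = 1.0 + 2.0 * debris_m / (debris_m + 0.03)   # warming effect of thin debris
    tf0, srf0 = 0.05, 0.0094                             # clean-ice-like base values
    return tf0 * damping * enhance, srf0 * damping * enhance

def hourly_melt(T_air, sw_in, albedo, debris_m, T_threshold=1.0):
    """Melt (mm w.e. per hour) from air temperature and shortwave radiation."""
    tf, srf = melt_factors(debris_m)
    if T_air <= T_threshold:
        return 0.0
    return tf * T_air + srf * (1.0 - albedo) * sw_in

for h in [0.0, 0.02, 0.05, 0.20, 0.50]:
    m = hourly_melt(T_air=6.0, sw_in=600.0, albedo=0.15, debris_m=h)
    print(f"debris = {h*100:5.1f} cm -> melt ~ {m:.2f} mm w.e./h")
```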

  3. [Application of detecting and taking overdispersion into account in Poisson regression model].

    Science.gov (United States)

    Bouche, G; Lepage, B; Migeot, V; Ingrand, P

    2009-08-01

    Researchers often use the Poisson regression model to analyze count data. Overdispersion can occur when a Poisson regression model is used, resulting in an underestimation of the variance of the regression model parameters. Our objective was to take overdispersion into account and assess its impact with an illustration based on the data of a study investigating the relationship between use of the Internet to seek health information and number of primary care consultations. Three methods, overdispersed Poisson, a robust estimator, and negative binomial regression, were performed to take overdispersion into account in explaining variation in the number (Y) of primary care consultations. We tested overdispersion in the Poisson regression model using the ratio of the sum of squared Pearson residuals to the number of degrees of freedom (chi(2)/df). We then fitted the three models and compared parameter estimation to the estimations given by the Poisson regression model. Variance of the number of primary care consultations (Var[Y]=21.03) was greater than the mean (E[Y]=5.93) and the chi(2)/df ratio was 3.26, which confirmed overdispersion. Standard errors of the parameters varied greatly between the Poisson regression model and the three other regression models. Interpretation of estimates from two variables (using the Internet to seek health information and single parent family) would have changed according to the model retained, with significance levels of 0.06 and 0.002 (Poisson), 0.29 and 0.09 (overdispersed Poisson), 0.29 and 0.13 (use of a robust estimator) and 0.45 and 0.13 (negative binomial) respectively. Different methods exist to solve the problem of underestimating variance in the Poisson regression model when overdispersion is present. The negative binomial regression model seems to be particularly accurate because of its theoretical distribution; in addition this regression is easy to perform with ordinary statistical software packages.
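
    Assuming statsmodels is available, the detection of overdispersion via the Pearson chi-square/df ratio and the refit with a negative binomial family can be sketched as follows on simulated counts (the variable names only mimic the study's setting).

```python
import numpy as np
import statsmodels.api as sm

# Detecting overdispersion and refitting with a negative binomial model
# (simulated data standing in for consultation counts).
rng = np.random.default_rng(5)
n = 1000
internet_user = rng.integers(0, 2, n)
single_parent = rng.integers(0, 2, n)
X = sm.add_constant(np.column_stack([internet_user, single_parent]))

mu = np.exp(1.2 + 0.15 * internet_user + 0.25 * single_parent)
y = rng.negative_binomial(n=2, p=2 / (2 + mu))        # overdispersed counts

poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit()
dispersion = poisson.pearson_chi2 / poisson.df_resid  # >> 1 signals overdispersion
print(f"Pearson chi2 / df = {dispersion:.2f}")

negbin = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(poisson.bse)   # standard errors are too small under overdispersion
print(negbin.bse)    # larger, more honest standard errors
```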

  4. Modeling Laterally Loaded Single Piles Accounting for Nonlinear Soil-Pile Interactions

    Directory of Open Access Journals (Sweden)

    Maryam Mardfekri

    2013-01-01

    Full Text Available The nonlinear behavior of a laterally loaded monopile foundation is studied using the finite element method (FEM to account for soil-pile interactions. Three-dimensional (3D finite element modeling is a convenient and reliable approach to account for the continuity of the soil mass and the nonlinearity of the soil-pile interactions. Existing simple methods for predicting the deflection of laterally loaded single piles in sand and clay (e.g., beam on elastic foundation, p-y method, and SALLOP are assessed using linear and nonlinear finite element analyses. The results indicate that for the specific case considered here the p-y method provides a reasonable accuracy, in spite of its simplicity, in predicting the lateral deflection of single piles. A simplified linear finite element (FE analysis of piles, often used in the literature, is also investigated and the influence of accounting for the pile diameter in the simplified linear FE model is evaluated. It is shown that modeling the pile as a line with beam-column elements results in a reduced contribution of the surrounding soil to the lateral stiffness of the pile and an increase of up to 200% in the predicted maximum lateral displacement of the pile head.

  5. Bayesian model accounting for within-class biological variability in Serial Analysis of Gene Expression (SAGE)

    Directory of Open Access Journals (Sweden)

    Brentani Helena

    2004-08-01

    Full Text Available Abstract Background An important challenge for transcript counting methods such as Serial Analysis of Gene Expression (SAGE), "Digital Northern" or Massively Parallel Signature Sequencing (MPSS), is to carry out statistical analyses that account for the within-class variability, i.e., variability due to the intrinsic biological differences among sampled individuals of the same class, and not only variability due to technical sampling error. Results We introduce a Bayesian model that accounts for the within-class variability by means of mixture distribution. We show that the previously available approaches of aggregation in pools ("pseudo-libraries") and the Beta-Binomial model are particular cases of the mixture model. We illustrate our method with a brain tumor vs. normal comparison using SAGE data from public databases. We show examples of tags regarded as differentially expressed with high significance if the within-class variability is ignored, but clearly not so significant if one accounts for it. Conclusion Using available information about biological replicates, one can transform a list of candidate transcripts showing differential expression to a more reliable one. Our method is freely available, under GPL/GNU copyleft, through a user friendly web-based on-line tool or as R language scripts at supplemental web-site.

  6. Bayesian model accounting for within-class biological variability in Serial Analysis of Gene Expression (SAGE).

    Science.gov (United States)

    Vêncio, Ricardo Z N; Brentani, Helena; Patrão, Diogo F C; Pereira, Carlos A B

    2004-08-31

    An important challenge for transcript counting methods such as Serial Analysis of Gene Expression (SAGE), "Digital Northern" or Massively Parallel Signature Sequencing (MPSS), is to carry out statistical analyses that account for the within-class variability, i.e., variability due to the intrinsic biological differences among sampled individuals of the same class, and not only variability due to technical sampling error. We introduce a Bayesian model that accounts for the within-class variability by means of mixture distribution. We show that the previously available approaches of aggregation in pools ("pseudo-libraries") and the Beta-Binomial model are particular cases of the mixture model. We illustrate our method with a brain tumor vs. normal comparison using SAGE data from public databases. We show examples of tags regarded as differentially expressed with high significance if the within-class variability is ignored, but clearly not so significant if one accounts for it. Using available information about biological replicates, one can transform a list of candidate transcripts showing differential expression to a more reliable one. Our method is freely available, under GPL/GNU copyleft, through a user friendly web-based on-line tool or as R language scripts at supplemental web-site.
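
    A minimal numerical illustration of the Beta-Binomial likelihood that the paper treats as a special case of its within-class mixture model is sketched below; the tag counts, library sizes and optimizer choice are assumptions for illustration, not the authors' R implementation.

    ```python
    # Minimal illustration of the Beta-Binomial likelihood that the paper treats as
    # a special case of its within-class mixture model. Counts and library sizes
    # are made-up toy numbers; this is not the authors' implementation.
    import numpy as np
    from scipy.special import betaln, gammaln
    from scipy.optimize import minimize

    # Tag counts x_i out of total sequenced tags n_i for replicate libraries of one class.
    x = np.array([12, 30, 7, 22])
    n = np.array([50_000, 80_000, 40_000, 60_000])

    def neg_loglik(params):
        """Negative Beta-Binomial log-likelihood; params are log(a), log(b)."""
        a, b = np.exp(params)
        # log C(n, x) + log B(x+a, n-x+b) - log B(a, b)
        log_comb = gammaln(n + 1) - gammaln(x + 1) - gammaln(n - x + 1)
        return -np.sum(log_comb + betaln(x + a, n - x + b) - betaln(a, b))

    fit = minimize(neg_loglik, x0=np.log([1.0, 1000.0]), method="Nelder-Mead")
    a_hat, b_hat = np.exp(fit.x)
    print(f"estimated Beta(a, b): a={a_hat:.3f}, b={b_hat:.1f}")
    print(f"mean expression proportion: {a_hat / (a_hat + b_hat):.2e}")
    ```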

  7. A multiscale active structural model of the arterial wall accounting for smooth muscle dynamics.

    Science.gov (United States)

    Coccarelli, Alberto; Edwards, David Hughes; Aggarwal, Ankush; Nithiarasu, Perumal; Parthimos, Dimitris

    2018-02-01

    Arterial wall dynamics arise from the synergy of passive mechano-elastic properties of the vascular tissue and the active contractile behaviour of smooth muscle cells (SMCs) that form the media layer of vessels. We have developed a computational framework that incorporates both these components to account for vascular responses to mechanical and pharmacological stimuli. To validate the proposed framework and demonstrate its potential for testing hypotheses on the pathogenesis of vascular disease, we have employed a number of pharmacological probes that modulate the arterial wall contractile machinery by selectively inhibiting a range of intracellular signalling pathways. Experimental probes used on ring segments from the rabbit central ear artery are: phenylephrine, a selective α1-adrenergic receptor agonist that induces vasoconstriction; cyclopiazonic acid (CPA), a specific inhibitor of sarcoplasmic/endoplasmic reticulum Ca2+-ATPase; and ryanodine, a diterpenoid that modulates Ca2+ release from the sarcoplasmic reticulum. These interventions were able to delineate the role of membrane versus intracellular signalling, previously identified as main factors in smooth muscle contraction and the generation of vessel tone. Each SMC was modelled by a system of nonlinear differential equations that account for intracellular ionic signalling, and in particular Ca2+ dynamics. Cytosolic Ca2+ concentrations formed the catalytic input to a cross-bridge kinetics model. Contractile output from these cellular components forms the input to the finite-element model of the arterial rings under isometric conditions that reproduces the experimental conditions. The model does not account for the role of the endothelium, as the nitric oxide production was suppressed by the action of L-NAME, and also due to the absence of shear stress on the arterial ring, as the experimental set-up did not involve flow. Simulations generated by the integrated model closely matched experimental

  8. Modeling 2-alternative forced-choice tasks: Accounting for both magnitude and difference effects.

    Science.gov (United States)

    Ratcliff, Roger; Voskuilen, Chelsea; Teodorescu, Andrei

    2018-03-01

    We present a model-based analysis of two-alternative forced-choice tasks in which two stimuli are presented side by side and subjects must make a comparative judgment (e.g., which stimulus is brighter). Stimuli can vary on two dimensions, the difference in strength of the two stimuli and the magnitude of each stimulus. Differences between the two stimuli produce typical RT and accuracy effects (i.e., subjects respond more quickly and more accurately when there is a larger difference between the two). However, the overall magnitude of the pair of stimuli also affects RT and accuracy. In the more common two-choice task, a single stimulus is presented and the stimulus varies on only one dimension. In this two-stimulus task, if the standard diffusion decision model is fit to the data with only drift rate (evidence accumulation rate) differing among conditions, the model cannot fit the data. However, if either of two variability parameters is allowed to change with stimulus magnitude, the model can fit the data. This results in two models that are extremely constrained, with about one-tenth as many parameters as data points, while at the same time the models account for accuracy and correct and error RT distributions. While both of these versions of the diffusion model can account for the observed data, the model that allows across-trial variability in drift to vary might be preferred for theoretical reasons. The diffusion model fits are compared to the leaky competing accumulator model, which did not perform as well. Copyright © 2018 Elsevier Inc. All rights reserved.
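
    The toy simulation below sketches a drift-diffusion process with across-trial drift variability, the kind of variability parameter the abstract allows to change with stimulus magnitude; all parameter values are illustrative assumptions rather than fitted values from the study.

    ```python
    # Toy Euler-Maruyama simulation of a drift-diffusion decision process with
    # across-trial variability in drift rate -- the kind of variability parameter
    # the abstract allows to change with stimulus magnitude. Parameter values are
    # illustrative, not fitted values from the study.
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_trials(v_mean, eta, a=0.12, ter=0.3, s=0.1, dt=0.001, n_trials=1000):
        """Return accuracy and mean RT for drift ~ N(v_mean, eta), boundary separation a."""
        rts, correct = [], []
        for _ in range(n_trials):
            v = rng.normal(v_mean, eta)        # across-trial drift variability
            x, t = a / 2.0, 0.0                # start midway between boundaries 0 and a
            while 0.0 < x < a:
                x += v * dt + s * np.sqrt(dt) * rng.normal()
                t += dt
            rts.append(t + ter)                # add non-decision time
            correct.append(x >= a)
        return np.mean(correct), np.mean(rts)

    for eta in (0.05, 0.15):                   # larger eta standing in for larger magnitude
        acc, rt = simulate_trials(v_mean=0.2, eta=eta)
        print(f"eta={eta}: accuracy={acc:.2f}, mean RT={rt:.2f} s")
    ```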

  9. Model of inventory replenishment in periodic review accounting for the occurrence of shortages

    Directory of Open Access Journals (Sweden)

    Stanisław Krzyżaniak

    2014-03-01

    Full Text Available Background: Despite the development of alternative concepts of goods flow management, inventory management under conditions of random variations of demand is still an important issue, both from the point of view of inventory keeping and replenishment costs and of the service level measured as the level of inventory availability. There are a number of inventory replenishment systems used in these conditions, but they are mostly developments of two basic systems: reorder point-based and periodic review-based. The paper deals with the latter system. Numerous studies indicate the need to improve the classical models describing that system, mainly because of the necessity to adapt the model better to actual conditions. This allows a correct selection of the parameters that control the inventory replenishment system used and - as a result - the achievement of the expected economic effects. Methods: This research aimed at building a model of the periodic review system that reflects the relations (observed during simulation tests) between the volume of inventory shortages and the degree of accounting for so-called deferred demand, and the service level expressed as the probability of satisfying the demand in the review and inventory replenishment cycle. The following model building and testing method has been applied: numerical simulation of inventory replenishment - detailed analysis of simulation results - construction of the model taking into account the regularities observed during the simulations - determination of principles of solving the system of relations creating the model - verification of the results obtained from the model using the results from simulation. Results: Presented are selected results of calculations based on classical formulas and on the developed model, which describe the relations between the service level and the parameters controlling the discussed inventory replenishment system. The results are compared to the simulation
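
    As a rough companion to the simulation-based approach described above, the sketch below runs a Monte-Carlo simulation of a periodic-review, order-up-to policy with backordered (deferred) demand and estimates a cycle service level; the policy parameters and normally distributed demand are assumptions, and the code is not the analytical model developed in the paper.

    ```python
    # Simple Monte-Carlo sketch of a periodic-review, order-up-to policy with
    # backordered ("deferred") demand, estimating the probability of satisfying all
    # demand within a review cycle. Parameters are illustrative assumptions; this
    # is not the analytical model developed in the paper.
    import numpy as np

    rng = np.random.default_rng(42)

    def simulate(S=120, review_period=7, lead_time=3, demand_mean=10.0,
                 demand_sd=4.0, n_days=70_000):
        on_hand, backorders, pipeline = float(S), 0.0, []   # pipeline: (arrival_day, qty)
        stockout_cycles, cycles, cycle_short = 0, 0, False
        for day in range(n_days):
            # receive outstanding orders that arrive today
            arrived = sum(q for t, q in pipeline if t == day)
            pipeline = [(t, q) for t, q in pipeline if t != day]
            on_hand += arrived
            served = min(on_hand, backorders)                # serve deferred demand first
            on_hand, backorders = on_hand - served, backorders - served
            # daily demand
            d = max(0.0, rng.normal(demand_mean, demand_sd))
            if d <= on_hand:
                on_hand -= d
            else:
                backorders += d - on_hand
                on_hand = 0.0
                cycle_short = True
            # periodic review: order up to S based on inventory position
            if day % review_period == 0:
                position = on_hand + sum(q for _, q in pipeline) - backorders
                pipeline.append((day + lead_time, S - position))
                cycles += 1
                stockout_cycles += cycle_short
                cycle_short = False
        return 1.0 - stockout_cycles / cycles

    print(f"estimated cycle service level: {simulate():.3f}")
    ```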

  10. A two-phase moisture transport model accounting for sorption hysteresis in layered porous building constructions

    DEFF Research Database (Denmark)

    Johannesson, Björn; Janz, Mårten

    2009-01-01

    and exhibits different transport properties. A successful model of such a case may shred light on the performance of different constructions with regards to, for example, mould growth and freeze thaw damages. For this purpose a model has been developed which is based on a two phase flow, vapor and liquid water......, with account also to sorption hysteresis. The different materials in the considered layered construction are assigned different properties, i.e. vapor and liquid water diffusivities and boundary (wetting and drying) sorption curves. Further, the scanning behavior between wetting and drying boundary curves...

  11. @AACAnatomy twitter account goes live: A sustainable social media model for professional societies.

    Science.gov (United States)

    Benjamin, Hannah K; Royer, Danielle F

    2018-05-01

    Social media, with its capabilities of fast, global information sharing, provides a useful medium for professional development, connecting and collaborating with peers, and outreach. The goals of this study were to describe a new, sustainable model for Twitter use by professional societies, and analyze its impact on @AACAnatomy, the Twitter account of the American Association of Clinical Anatomists. Under supervision of an Association committee member, an anatomy graduate student developed a protocol for publishing daily tweets for @AACAnatomy. Five tweet categories were used: Research, Announcements, Replies, Engagement, and Community. Analytics from the 6-month pilot phase were used to assess the impact of the new model. @AACAnatomy had a steady average growth of 33 new followers per month, with less than 10% likely representing Association members. Research tweets, based on Clinical Anatomy articles with an abstract link, were the most shared, averaging 5,451 impressions, 31 link clicks, and nine #ClinAnat hashtag clicks per month. However, tweets from non-Research categories accounted for the highest impression and engagement metrics in four out of six months. For all tweet categories, monthly averages show consistent interaction of followers with the account. Daily tweet publication resulted in a 103% follower increase. An active Twitter account successfully facilitated regular engagement with @AACAnatomy followers and the promotion of clinical anatomy topics within a broad community. This Twitter model has the potential for implementation by other societies as a sustainable medium for outreach, networking, collaboration, and member engagement. Clin. Anat. 31:566-575, 2018. © 2017 Wiley Periodicals, Inc.

  12. MODELING ENERGY EXPENDITURE AND OXYGEN CONSUMPTION IN HUMAN EXPOSURE MODELS: ACCOUNTING FOR FATIGUE AND EPOC

    Science.gov (United States)

    Human exposure and dose models often require a quantification of oxygen consumption for a simulated individual. Oxygen consumption is dependent on the modeled individual's physical activity level as described in an activity diary. Activity level is quantified via standardized val...

  13. Accounting for standard errors of vision-specific latent trait in regression models.

    Science.gov (United States)

    Wong, Wan Ling; Li, Xiang; Li, Jialiang; Wong, Tien Yin; Cheng, Ching-Yu; Lamoureux, Ecosse L

    2014-07-11

    To demonstrate the effectiveness of the Hierarchical Bayesian (HB) approach in a modeling framework for association effects that accounts for SEs of vision-specific latent traits assessed using Rasch analysis. A systematic literature review was conducted in four major ophthalmic journals to evaluate Rasch analysis performed on vision-specific instruments. The HB approach was used to synthesize the Rasch model and the multiple linear regression model for the assessment of association effects related to vision-specific latent traits. This novel HB one-stage "joint-analysis" approach allows all model parameters to be estimated simultaneously; its effectiveness was compared with the frequently used two-stage "separate-analysis" approach (Rasch analysis followed by traditional statistical analyses without adjustment for the SE of the latent trait) in our simulation study. Sixty-six reviewed articles performed evaluation and validation of vision-specific instruments using Rasch analysis, and 86.4% (n = 57) performed further statistical analyses on the Rasch-scaled data using traditional statistical methods; none took into consideration the SEs of the estimated Rasch-scaled scores. On real data, the two models differed in effect size estimations and in the identification of "independent risk factors." Simulation results showed that our proposed HB one-stage "joint-analysis" approach produces greater accuracy (an average 5-fold decrease in bias) with comparable power and precision in the estimation of associations when compared with the frequently used two-stage "separate-analysis" procedure, despite accounting for greater uncertainty due to the latent trait. Patient-reported data analyses using Rasch analysis techniques do not take into account the SE of the latent trait in association analyses. The HB one-stage "joint-analysis" is a better approach, producing accurate effect size estimations and information about the independent association of exposure variables with vision-specific latent traits

  14. Comparing dark matter models, modified Newtonian dynamics and modified gravity in accounting for galaxy rotation curves

    Science.gov (United States)

    Li, Xin; Tang, Li; Lin, Hai-Nan

    2017-05-01

    We compare six models (including the baryonic model, two dark matter models, two modified Newtonian dynamics models and one modified gravity model) in accounting for galaxy rotation curves. For the dark matter models, we assume NFW profile and core-modified profile for the dark halo, respectively. For the modified Newtonian dynamics models, we discuss Milgrom’s MOND theory with two different interpolation functions, the standard and the simple interpolation functions. For the modified gravity, we focus on Moffat’s MSTG theory. We fit these models to the observed rotation curves of 9 high-surface brightness and 9 low-surface brightness galaxies. We apply the Bayesian Information Criterion and the Akaike Information Criterion to test the goodness-of-fit of each model. It is found that none of the six models can fit all the galaxy rotation curves well. Two galaxies can be best fitted by the baryonic model without involving nonluminous dark matter. MOND can fit the largest number of galaxies, and only one galaxy can be best fitted by the MSTG model. Core-modified model fits about half the LSB galaxies well, but no HSB galaxies, while the NFW model fits only a small fraction of HSB galaxies but no LSB galaxies. This may imply that the oversimplified NFW and core-modified profiles cannot model the postulated dark matter haloes well. Supported by Fundamental Research Funds for the Central Universities (106112016CDJCR301206), National Natural Science Fund of China (11305181, 11547305 and 11603005), and Open Project Program of State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Chinese Academy of Sciences, China (Y5KF181CJ1)
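
    A hedged sketch of the information-criterion comparison described above: two toy rotation-curve models are fitted to synthetic data and ranked by AIC and BIC. The functional forms and data are illustrative stand-ins, not the baryonic, dark matter, MOND or MSTG fits of the paper.

    ```python
    # Sketch of comparing competing rotation-curve models with information
    # criteria, as described in the abstract. The two toy models and the synthetic
    # data below are illustrative stand-ins, not the paper's model fits.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(7)
    r = np.linspace(1, 20, 25)                      # radius [kpc]
    v_obs = 180 * r / np.sqrt(r**2 + 3.0**2) + rng.normal(0, 5, r.size)
    sigma = np.full(r.size, 5.0)                    # velocity uncertainty [km/s]

    def model_one(r, v0, rc):                       # 2-parameter toy halo model
        return v0 * r / np.sqrt(r**2 + rc**2)

    def model_two(r, v0):                           # 1-parameter flat curve
        return np.full_like(r, v0)

    def information_criteria(model, p0):
        popt, _ = curve_fit(model, r, v_obs, p0=p0, sigma=sigma)
        chi2 = np.sum(((v_obs - model(r, *popt)) / sigma) ** 2)
        k, n = len(popt), r.size
        return chi2 + 2 * k, chi2 + k * np.log(n)   # AIC, BIC (up to a constant)

    for name, model, p0 in [("2-parameter halo", model_one, [150, 2]),
                            ("flat curve", model_two, [150])]:
        aic, bic = information_criteria(model, p0)
        print(f"{name}: AIC={aic:.1f}, BIC={bic:.1f}")
    ```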

  15. Assessing and accounting for the effects of model error in Bayesian solutions to hydrogeophysical inverse problems

    Science.gov (United States)

    Koepke, C.; Irving, J.; Roubinet, D.

    2014-12-01

    Geophysical methods have gained much interest in hydrology over the past two decades because of their ability to provide estimates of the spatial distribution of subsurface properties at a scale that is often relevant to key hydrological processes. Because of an increased desire to quantify uncertainty in hydrological predictions, many hydrogeophysical inverse problems have recently been posed within a Bayesian framework, such that estimates of hydrological properties and their corresponding uncertainties can be obtained. With the Bayesian approach, it is often necessary to make significant approximations to the associated hydrological and geophysical forward models such that stochastic sampling from the posterior distribution, for example using Markov-chain-Monte-Carlo (MCMC) methods, is computationally feasible. These approximations lead to model structural errors, which, so far, have not been properly treated in hydrogeophysical inverse problems. Here, we study the inverse problem of estimating unsaturated hydraulic properties, namely the van Genuchten-Mualem (VGM) parameters, in a layered subsurface from time-lapse, zero-offset-profile (ZOP) ground penetrating radar (GPR) data, collected over the course of an infiltration experiment. In particular, we investigate the effects of assumptions made for computational tractability of the stochastic inversion on model prediction errors as a function of depth and time. These assumptions are that (i) infiltration is purely vertical and can be modeled by the 1D Richards equation, and (ii) the petrophysical relationship between water content and relative dielectric permittivity is known. Results indicate that model errors for this problem are far from Gaussian and independently identically distributed, which has been the common assumption in previous efforts in this domain. In order to develop a more appropriate likelihood formulation, we use (i) a stochastic description of the model error that is obtained through
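
    As a minimal illustration of the stochastic posterior sampling referred to above, the sketch below implements a random-walk Metropolis sampler with a deliberately simple forward model and a Gaussian, i.i.d. error assumption (precisely the kind of likelihood assumption the study questions); it is not the Richards-equation/GPR inversion of the paper.

    ```python
    # Minimal random-walk Metropolis sampler of the kind used for stochastic
    # posterior sampling in Bayesian inverse problems. The exponential "forward
    # model" and Gaussian i.i.d. error assumption are deliberately simplistic
    # stand-ins, not the 1D Richards / GPR forward models of the paper.
    import numpy as np

    rng = np.random.default_rng(3)

    def forward(m):
        t = np.linspace(0, 1, 50)
        return m[0] * np.exp(-m[1] * t)

    m_true = np.array([2.0, 3.0])
    d_obs = forward(m_true) + rng.normal(0, 0.05, 50)   # synthetic observations

    def log_posterior(m, sigma=0.05):
        if np.any(m <= 0) or np.any(m > 10):            # uniform prior bounds
            return -np.inf
        resid = d_obs - forward(m)
        return -0.5 * np.sum((resid / sigma) ** 2)      # Gaussian i.i.d. likelihood

    m = np.array([1.0, 1.0])
    logp = log_posterior(m)
    samples = []
    for _ in range(20_000):
        prop = m + rng.normal(0, 0.05, size=2)          # random-walk proposal
        logp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < logp_prop - logp:    # Metropolis acceptance rule
            m, logp = prop, logp_prop
        samples.append(m)

    samples = np.array(samples[5000:])                  # discard burn-in
    print("posterior means:", samples.mean(axis=0).round(3))
    ```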

  16. An analytical model accounting for tip shape evolution during atom probe analysis of heterogeneous materials.

    Science.gov (United States)

    Rolland, N; Larson, D J; Geiser, B P; Duguay, S; Vurpillot, F; Blavette, D

    2015-12-01

    An analytical model describing the field evaporation dynamics of a tip made of a thin layer deposited on a substrate is presented in this paper. The difference in evaporation field between the materials is taken into account in this approach, in which the tip shape is modeled at a mesoscopic scale. It was found that the absence of a sharp edge on the surface is a sufficient condition to derive the morphological evolution during successive evaporation of the layers. This modeling gives an instantaneous and smooth analytical representation of the surface that shows good agreement with finite difference simulation results, and a specific regime of evaporation was highlighted when the substrate is a low evaporation field phase. In addition, the model makes it possible to calculate the analyzed volume of the tip theoretically, potentially opening up new horizons for atom probe tomographic reconstruction. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Regional Balance Model of Financial Flows through Sectoral Approaches System of National Accounts

    Directory of Open Access Journals (Sweden)

    Ekaterina Aleksandrovna Zaharchuk

    2017-03-01

    Full Text Available The main purpose of the study reported in this article is to provide a theoretical and methodological basis for building a regional balance model of financial flows consistent with the principles of the System of National Accounts (SNA). The paper summarizes the international experience of building regional accounts in the SNA, as well as the advantages and disadvantages of existing techniques for constructing a Social Accounting Matrix. The authors propose an approach to building the regional balance model of financial flows based on disaggregated tables of the formation, distribution and use of the value added of a territory within the institutional sectors of the SNA (corporations, public administration, households). To resolve the problem of transferring value added from industries to sectors, the authors offer an approach to accounting for the formation, distribution and use of value added within the institutional sectors of territories. The methods of calculation are based on the publicly available information base of statistics agencies and federal services. The authors provide a scheme of the interrelations between the indicators of the regional balance model of financial flows. It allows the movement of regional resources to be mutually coordinated across the «corporations», «public administration» and «households» sectors, and the cash flows of the region to be coordinated by sector and direction of use. As a result, they form a single account of the formation and distribution of territorial financial resources, which is a regional balance model of financial flows. This matrix shows the distribution of financial resources by income sources and sectors, where the components of the formation (compensation, taxes and gross profit), distribution (transfers and payments) and use (final consumption, accumulation of value added are

  18. Accounting for Zero Inflation of Mussel Parasite Counts Using Discrete Regression Models

    Directory of Open Access Journals (Sweden)

    Emel Çankaya

    2017-06-01

    Full Text Available In many ecological applications, the absences of species are inevitable due to either detection faults in samples or uninhabitable conditions for their existence, resulting in a high number of zero counts or abundances. The usual practice for modelling such data is regression modelling of log(abundance+1), and it is well known that the resulting model is inadequate for prediction purposes. New discrete models accounting for zero abundances, namely zero-inflated regression (ZIP and ZINB), Hurdle-Poisson (HP) and Hurdle-Negative Binomial (HNB), amongst others, are widely preferred to the classical regression models. Because mussels are one of the most economically important aquatic products of Turkey, the purpose of this study is to examine the performance of these four models in determining the significant biotic and abiotic factors affecting the occurrence of the parasite Nematopsis legeri, which harms Mediterranean mussels (Mytilus galloprovincialis L.). The data collected from three coastal regions of Sinop city in Turkey showed that, on average, more than 50% of the parasite counts are zero-valued, and model comparisons were based on information criteria. The results showed that the occurrence of this parasite is best modelled by the ZINB or HNB models, and the influential factors of the models were found to correspond with the ecological differences between the regions.
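
    A hedged sketch of fitting zero-inflated count models of the kind compared in the paper is given below, using Python's statsmodels (the authors' software is not specified here); the parasite counts and the covariates `temp` and `length` are simulated placeholders, and only the ZIP and ZINB specifications are shown.

    ```python
    # Sketch of fitting zero-inflated count models of the kind compared in the
    # paper, using statsmodels (not necessarily the authors' software). The
    # parasite counts and covariates below are simulated placeholders.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import (ZeroInflatedPoisson,
                                                  ZeroInflatedNegativeBinomialP)

    rng = np.random.default_rng(0)
    n = 300
    temp = rng.normal(20, 3, n)                  # hypothetical abiotic covariate
    length = rng.normal(6, 1, n)                 # hypothetical biotic covariate
    mu = np.exp(-1.0 + 0.08 * temp + 0.15 * length)
    structural_zero = rng.uniform(size=n) < 0.5  # ~50% excess zeros, as described in the data
    counts = np.where(structural_zero, 0, rng.poisson(mu))

    X = sm.add_constant(np.column_stack([temp, length]))
    inflate = np.ones((n, 1))                    # constant-only zero-inflation part
    zip_fit = ZeroInflatedPoisson(counts, X, exog_infl=inflate).fit(disp=False, maxiter=200)
    zinb_fit = ZeroInflatedNegativeBinomialP(counts, X, exog_infl=inflate).fit(disp=False, maxiter=200)

    # Compare competing specifications with an information criterion, as in the paper.
    print("ZIP  AIC:", round(zip_fit.aic, 1))
    print("ZINB AIC:", round(zinb_fit.aic, 1))
    ```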

  19. A predictive coding account of bistable perception - a model-based fMRI study.

    Science.gov (United States)

    Weilnhammer, Veith; Stuke, Heiner; Hesselmann, Guido; Sterzer, Philipp; Schmack, Katharina

    2017-05-01

    In bistable vision, subjective perception wavers between two interpretations of a constant ambiguous stimulus. This dissociation between conscious perception and sensory stimulation has motivated various empirical studies on the neural correlates of bistable perception, but the neurocomputational mechanism behind endogenous perceptual transitions has remained elusive. Here, we recurred to a generic Bayesian framework of predictive coding and devised a model that casts endogenous perceptual transitions as a consequence of prediction errors emerging from residual evidence for the suppressed percept. Data simulations revealed close similarities between the model's predictions and key temporal characteristics of perceptual bistability, indicating that the model was able to reproduce bistable perception. Fitting the predictive coding model to behavioural data from an fMRI-experiment on bistable perception, we found a correlation across participants between the model parameter encoding perceptual stabilization and the behaviourally measured frequency of perceptual transitions, corroborating that the model successfully accounted for participants' perception. Formal model comparison with established models of bistable perception based on mutual inhibition and adaptation, noise or a combination of adaptation and noise was used for the validation of the predictive coding model against the established models. Most importantly, model-based analyses of the fMRI data revealed that prediction error time-courses derived from the predictive coding model correlated with neural signal time-courses in bilateral inferior frontal gyri and anterior insulae. Voxel-wise model selection indicated a superiority of the predictive coding model over conventional analysis approaches in explaining neural activity in these frontal areas, suggesting that frontal cortex encodes prediction errors that mediate endogenous perceptual transitions in bistable perception. Taken together, our current work

  20. A predictive coding account of bistable perception - a model-based fMRI study.

    Directory of Open Access Journals (Sweden)

    Veith Weilnhammer

    2017-05-01

    Full Text Available In bistable vision, subjective perception wavers between two interpretations of a constant ambiguous stimulus. This dissociation between conscious perception and sensory stimulation has motivated various empirical studies on the neural correlates of bistable perception, but the neurocomputational mechanism behind endogenous perceptual transitions has remained elusive. Here, we recurred to a generic Bayesian framework of predictive coding and devised a model that casts endogenous perceptual transitions as a consequence of prediction errors emerging from residual evidence for the suppressed percept. Data simulations revealed close similarities between the model's predictions and key temporal characteristics of perceptual bistability, indicating that the model was able to reproduce bistable perception. Fitting the predictive coding model to behavioural data from an fMRI-experiment on bistable perception, we found a correlation across participants between the model parameter encoding perceptual stabilization and the behaviourally measured frequency of perceptual transitions, corroborating that the model successfully accounted for participants' perception. Formal model comparison with established models of bistable perception based on mutual inhibition and adaptation, noise or a combination of adaptation and noise was used for the validation of the predictive coding model against the established models. Most importantly, model-based analyses of the fMRI data revealed that prediction error time-courses derived from the predictive coding model correlated with neural signal time-courses in bilateral inferior frontal gyri and anterior insulae. Voxel-wise model selection indicated a superiority of the predictive coding model over conventional analysis approaches in explaining neural activity in these frontal areas, suggesting that frontal cortex encodes prediction errors that mediate endogenous perceptual transitions in bistable perception. Taken together
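
    The deliberately minimal toy below caricatures the mechanism described in these two records: residual evidence for the suppressed percept generates a prediction error that accumulates until it triggers a perceptual switch. The update rule, weights and threshold are ad hoc assumptions, not the authors' predictive coding model.

    ```python
    # Deliberately minimal toy of the mechanism described above: residual evidence
    # for the suppressed percept produces a prediction error that accumulates and
    # eventually triggers a perceptual switch. This is an illustrative caricature,
    # not the authors' predictive coding model.
    import numpy as np

    rng = np.random.default_rng(0)
    stabilization = 0.9       # weight of the current percept (ad hoc analogue of the model parameter)
    threshold = 5.0           # accumulated prediction error that triggers a transition
    percept, error, switches = +1, 0.0, []

    for t in range(5000):
        evidence = rng.normal(0.0, 1.0)              # ambiguous input: zero-mean sensory evidence
        prediction = stabilization * percept
        error += abs(evidence - prediction) * 0.01   # residual evidence against the current percept
        if error > threshold:                        # endogenous perceptual transition
            percept, error = -percept, 0.0
            switches.append(t)

    durations = np.diff(switches)
    print(f"{len(switches)} switches; mean dominance duration = {durations.mean():.1f} steps")
    ```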

  1. Accounting for and predicting the influence of spatial autocorrelation in water quality modeling

    Science.gov (United States)

    Miralha, L.; Kim, D.

    2017-12-01

    Although many studies have attempted to investigate the spatial trends of water quality, more attention is yet to be paid to the consequences of considering and ignoring the spatial autocorrelation (SAC) that exists in water quality parameters. Several studies have mentioned the importance of accounting for SAC in water quality modeling, as well as the differences in outcomes between models that account for and ignore SAC. However, the capacity to predict the magnitude of such differences is still ambiguous. In this study, we hypothesized that the SAC inherently possessed by a response variable (i.e., water quality parameter) influences the outcomes of spatial modeling. We evaluated whether the level of inherent SAC is associated with changes in R-squared, Akaike Information Criterion (AIC), and residual SAC (rSAC) after accounting for SAC during the modeling procedure. The main objective was to analyze whether water quality parameters with higher Moran's I values (a measure of inherent SAC) undergo a greater increase in R² and a greater reduction in both AIC and rSAC. We compared a non-spatial model (OLS) to two spatial regression approaches (spatial lag and error models). Predictor variables were the principal components of topographic (elevation and slope), land cover, and hydrological soil group variables. We acquired these data from federal online sources (e.g. USGS). Ten watersheds were selected, each in a different state of the USA. Results revealed that water quality parameters with higher inherent SAC showed a substantial increase in R² and a decrease in rSAC after performing spatial regressions. However, AIC values did not show significant changes. Overall, the higher the level of inherent SAC in water quality variables, the greater the improvement in model performance. This indicates a linear and direct relationship between the spatial model outcomes (R² and rSAC) and the degree of SAC in each water quality variable. Therefore, our study suggests that the inherent level of
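
    A short sketch of computing Moran's I, the measure of inherent SAC used in the study, is given below; the site coordinates, water quality values and the inverse-distance, row-standardized weight matrix are synthetic assumptions, not the study's data or weighting scheme.

    ```python
    # Sketch of computing Moran's I -- the measure of inherent spatial
    # autocorrelation used in the study -- for a water quality variable observed at
    # n monitoring sites. The coordinates and values are synthetic placeholders,
    # and a simple inverse-distance, row-standardized weight matrix is assumed.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 50
    coords = rng.uniform(0, 100, size=(n, 2))             # site locations
    y = 2.0 + 0.05 * coords[:, 0] + rng.normal(0, 1, n)   # value with a spatial trend

    # Inverse-distance spatial weights, zero diagonal, row-standardized.
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    W = np.where(dist > 0, 1.0 / np.where(dist > 0, dist, 1.0), 0.0)
    W /= W.sum(axis=1, keepdims=True)

    # Moran's I = (n / S0) * sum_ij w_ij z_i z_j / sum_i z_i^2, with z = y - mean(y)
    z = y - y.mean()
    morans_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
    print(f"Moran's I = {morans_i:.3f}")   # > 0 indicates positive spatial autocorrelation
    ```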

  2. Towards proper name generation : A corpus analysis

    NARCIS (Netherlands)

    Castro Ferreira, Thiago; Wubben, Sander; Krahmer, Emiel

    We introduce a corpus for the study of proper name generation. The corpus consists of proper name references to people in webpages, extracted from the Wikilinks corpus. In our analyses, we aim to identify the different ways, in terms of length and form, in which proper names are produced

  3. ACCOUNTING AND AUDIT OPERATIONS ON CURRENT ACCOUNT

    Directory of Open Access Journals (Sweden)

    Koblyanska Olena

    2018-03-01

    Full Text Available Introduction. The article is devoted to theoretical, methodological and practical issues of accounting and auditing of operations on the current account. The purpose of the study is to deepen and consolidate theoretical and practical knowledge of accounting and auditing of operations on the current account, to identify practical problems in the methodology and organization of such accounting and auditing, and to develop recommendations for eliminating deficiencies and improving accounting and auditing. Results. The relevance of proper accounting and auditing of transactions on the current account in the bank is considered. Typical operations on the current account are examined using the method of their reflection in the accounts, illustrated with practical examples. Features of the audit of transactions on the current account are examined, the procedure for its implementation is presented, and types of abuses and violations that occur while performing operations on the current account are identified. The legal regulation of accounting, analysis and control of cash operations on current accounts is considered. Problem issues related to organizing and conducting the audit of funds in bank accounts are analyzed, and directions for their solution are determined. Proposals for determining the sequence of the auditor's actions when checking cash flows on bank accounts are provided. Conclusions. The article addresses theoretical, methodological and practical issues of accounting and auditing of operations on the current account in the bank; typical cash operations on the current account were studied using the method of their reflection in the accounts, together with the features of auditing cash on the account.

  4. Possible Relativistic Definitions of Parallax, Proper Motion and Radial Velocity

    National Research Council Canada - National Science Library

    Klioner, S

    2000-01-01

    .... In this paper, the authors briefly describe a relativistic model of space-based optical positional observations valid at a high level of accuracy, and suggest definitions of parallax, proper motion...

  5. A synthesis of literature on evaluation of models for policy applications, with implications for forest carbon accounting

    Science.gov (United States)

    Stephen P. Prisley; Michael J. Mortimer

    2004-01-01

    Forest modeling has moved beyond the realm of scientific discovery into the policy arena. The example that motivates this review is the application of models for forest carbon accounting. As negotiations determine the terms under which forest carbon will be accounted, reported, and potentially traded, guidelines and standards are being developed to ensure consistency,...

  6. Green accounts for sulphur and nitrogen deposition in Sweden. Implementation of a theoretical model in practice

    International Nuclear Information System (INIS)

    Ahlroth, S.

    2001-01-01

    This licentiate thesis tries to bridge the gap between the theoretical and the practical studies in the field of environmental accounting. In the paper, I develop an optimal control theory model for adjusting NDP for the effects of SO2 and NOx emissions, and subsequently insert empirically estimated values. The model includes correction entries for the effects on welfare, real capital, health and the quality and quantity of renewable natural resources. In the empirical valuation study, production losses were estimated with dose-response functions. Recreational and other welfare values were estimated by the contingent valuation (CV) method. Effects on capital depreciation are also included. For comparison, abatement costs and environmental protection expenditures for reducing sulfur and nitrogen emissions were estimated. The theoretical model was then utilized to calculate the adjustment to NDP in a consistent manner

  7. Green accounts for sulphur and nitrogen deposition in Sweden. Implementation of a theoretical model in practice

    Energy Technology Data Exchange (ETDEWEB)

    Ahlroth, S.

    2001-01-01

    This licentiate thesis tries to bridge the gap between the theoretical and the practical studies in the field of environmental accounting. In the paper, I develop an optimal control theory model for adjusting NDP for the effects of SO2 and NOx emissions, and subsequently insert empirically estimated values. The model includes correction entries for the effects on welfare, real capital, health and the quality and quantity of renewable natural resources. In the empirical valuation study, production losses were estimated with dose-response functions. Recreational and other welfare values were estimated by the contingent valuation (CV) method. Effects on capital depreciation are also included. For comparison, abatement costs and environmental protection expenditures for reducing sulfur and nitrogen emissions were estimated. The theoretical model was then utilized to calculate the adjustment to NDP in a consistent manner.

  8. Mathematical modeling of halophytic plant productivity taking into account the temperature factor and soil salinity level

    Science.gov (United States)

    Natalia, Slyusar; Pisman, Tamara; Pechurkin, Nikolai S.

    Among the most challenging tasks faced by contemporary ecology is the modeling of the biological production process in different plant communities. The difficulty of the task is determined by the complexity of the study material. Models showing the influence of climate and climate change on plant growth, which would also involve soil site parameters, could be of both practical and theoretical interest. In this work a mathematical model has been constructed to describe the growth dynamics of different plant communities of halophytic meadows as dependent upon the temperature factor and soil salinity level, which could be further used to predict yields of these plant communities. The study was performed on plants of halophytic meadows in the coastal area of Lake of the Republic of Khakasia in 2004 - 2006. Every plant community grew on soil with a different level of salinity - the amount of the solid residue of the saline soil aqueous extract. The mathematical model was analyzed using field data from 2004 and 2006, years of contrasting air temperatures. Results of the model investigations show that there is a correlation between plant growth and air temperature for plant communities growing on the soils with the lowest salinity (0.1…). Thus, results of our study, in which we used a mathematical model describing the development of plant communities of halophytic meadows and field measurements, suggest that both climate conditions (temperature) and ecological factors of the plants' habitat (soil salinity level) should be taken into account when constructing models for predicting crop yields.

  9. The Dynamics of the Accounting Models and Their Impact upon the Financial Risk Evaluation

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2015-03-01

    Full Text Available All companies are exposed to risks, and circumstances can take an unexpected turn at some point in time. What a company can control is how these risks are managed and, first of all, the steps taken to avoid them. The way in which scientific expertise, data and advice on devising risk strategies are understood, represented and incorporated into a structured system has visibly evolved from the 19th century to the present, along with the accounting models and the main factors that have triggered greater concern in this sector.

  10. Internet accounting dictionaries

    DEFF Research Database (Denmark)

    Nielsen, Sandro; Mourier, Lise

    2005-01-01

    An examination of existing accounting dictionaries on the Internet reveals a general need for a new type of dictionary. In contrast to the dictionaries now accessible, the future accounting dictionaries should be designed as proper Internet dictionaries based on a functional approach so they can...

  11. A three-dimensional model of mammalian tyrosinase active site accounting for loss of function mutations.

    Science.gov (United States)

    Schweikardt, Thorsten; Olivares, Concepción; Solano, Francisco; Jaenicke, Elmar; García-Borrón, José Carlos; Decker, Heinz

    2007-10-01

    Tyrosinases are the first and rate-limiting enzymes in the synthesis of melanin pigments responsible for colouring hair, skin and eyes. Mutation of tyrosinases often decreases melanin production resulting in albinism, but the effects are not always understood at the molecular level. Homology modelling of mouse tyrosinase based on recently published crystal structures of non-mammalian tyrosinases provides an active site model accounting for loss-of-function mutations. According to the model, the copper-binding histidines are located in a helix bundle comprising four densely packed helices. A loop containing residues M374, S375 and V377 connects the CuA and CuB centres, with the peptide oxygens of M374 and V377 serving as hydrogen acceptors for the NH-groups of the imidazole rings of the copper-binding His367 and His180. Therefore, this loop is essential for the stability of the active site architecture. A double substitution (374)MS(375) --> (374)GG(375) or a single M374G mutation lead to a local perturbation of the protein matrix at the active site affecting the orientation of the H367 side chain, that may be unable to bind CuB reliably, resulting in loss of activity. The model also accounts for loss of function in two naturally occurring albino mutations, S380P and V393F. The hydroxyl group in S380 contributes to the correct orientation of M374, and the substitution of V393 for a bulkier phenylalanine sterically impedes correct side chain packing at the active site. Therefore, our model explains the mechanistic necessity for conservation of not only active site histidines but also adjacent amino acids in tyrosinase.

  12. What can Gaia proper motions tell us about Milky Way dwarf galaxies?

    NARCIS (Netherlands)

    Jin, S.; Helmi, A.; Breddels, M.

    We present a proper-motion study on models of the dwarf spheroidal galaxy Sculptor, based on the predicted proper-motion accuracy of Gaia measurements. Gaia will measure proper motions of several hundreds of stars for a Sculptor-like system. Even with an uncertainty on the proper motion of order 1.5

  13. Palaeomagnetic dating method accounting for post-depositional remanence and its application to geomagnetic field modelling

    Science.gov (United States)

    Nilsson, A.; Suttie, N.

    2016-12-01

    Sedimentary palaeomagnetic data may exhibit some degree of smoothing of the recorded field due to the gradual processes by which the magnetic signal is `locked-in' over time. Here we present a new Bayesian method to construct age-depth models based on palaeomagnetic data, taking into account and correcting for potential lock-in delay. The age-depth model is built on the widely used "Bacon" dating software by Blaauw and Christen (2011, Bayesian Analysis 6, 457-474) and is designed to combine both radiocarbon and palaeomagnetic measurements. To our knowledge, this is the first palaeomagnetic dating method that addresses the potential problems related to post-depositional remanent magnetisation acquisition in age-depth modelling. Age-depth models, including site-specific lock-in depth and lock-in filter function, produced with this method are shown to be consistent with independent results based on radiocarbon wiggle-match dated sediment sections. Besides its primary use as a dating tool, our new method can also be used specifically to identify the most likely lock-in parameters for a specific record. We explore the potential to use these results to construct high-resolution geomagnetic field models based on sedimentary palaeomagnetic data, adjusting for smoothing induced by post-depositional remanent magnetisation acquisition. Potentially, this technique could enable reconstructions of the Holocene geomagnetic field with the same amplitude of variability observed in archaeomagnetic field models for the past three millennia.

  14. Accountability and pediatric physician-researchers: are theoretical models compatible with Canadian lived experience?

    Directory of Open Access Journals (Sweden)

    Czoli Christine

    2011-10-01

    Full Text Available Abstract Physician-researchers are bound by professional obligations stemming from both the role of the physician and the role of the researcher. Currently, the dominant models for understanding the relationship between physician-researchers' clinical duties and research duties fit into three categories: the similarity position, the difference position and the middle ground. The law may be said to offer a fourth "model" that is independent from these three categories. These models frame the expectations placed upon physician-researchers by colleagues, regulators, patients and research participants. This paper examines the extent to which the data from semi-structured interviews with 30 physician-researchers at three major pediatric hospitals in Canada reflect these traditional models. It seeks to determine the extent to which existing models align with the described lived experience of the pediatric physician-researchers interviewed. Ultimately, we find that although some physician-researchers make references to something like the weak version of the similarity position, the pediatric-researchers interviewed in this study did not describe their dual roles in a way that tightly mirrors any of the existing theoretical frameworks. We thus conclude that either physician-researchers are in need of better training regarding the nature of the accountability relationships that flow from their dual roles or that models setting out these roles and relationships must be altered to better reflect what we can reasonably expect of physician-researchers in a real-world environment.

  15. Response Mixture Modeling: Accounting for Heterogeneity in Item Characteristics across Response Times.

    Science.gov (United States)

    Molenaar, Dylan; de Boeck, Paul

    2018-02-01

    In item response theory modeling of responses and response times, it is commonly assumed that the item responses have the same characteristics across the response times. However, heterogeneity might arise in the data if subjects resort to different response processes when solving the test items. These differences may be within-subject effects, that is, a subject might use a certain process on some of the items and a different process with different item characteristics on the other items. If the probability of using one process over the other process depends on the subject's response time, within-subject heterogeneity of the item characteristics across the response times arises. In this paper, the method of response mixture modeling is presented to account for such heterogeneity. Contrary to traditional mixture modeling where the full response vectors are classified, response mixture modeling involves classification of the individual elements in the response vector. In a simulation study, the response mixture model is shown to be viable in terms of parameter recovery. In addition, the response mixture model is applied to a real dataset to illustrate its use in investigating within-subject heterogeneity in the item characteristics across response times.

  16. A Modified Model to Estimate Building Rental Multipiers Accounting for Advalorem Operating Expenses

    Directory of Open Access Journals (Sweden)

    Smolyak S.A.

    2016-09-01

    Full Text Available To develop ideas on building element valuation contained in the first article on the subject published in REMV, we propose an elaboration of the approach accounting for ad valorem expenses incidental to property management, such as land taxes, income/capital gains tax, and insurance premium costs; all such costs, being of an ad valorem nature in the first instance, cause circularity in the logic of the model, which, however, is not intractable under the proposed approach. The resulting formulas for carrying out practical estimation of building rental multipliers and, in consequence, of building values, turn out to be somewhat modified, and we demonstrate the sensitivity of the developed approach to the impact of these ad valorem factors. On the other hand, it is demonstrated that accounting for building depreciation charges, which should seemingly be included among the considered ad valorem factors, cancels out and does not have any impact on the resulting estimates. However, treating the depreciation of buildings in quantifiable economic terms as a reduction in derivable operating benefits over time (instead of mere physical indications, such as age), we also demonstrate that the approach has implications for estimating the economic service lives of buildings and can be practical when used in conjunction with the market-related approach to valuation – from which the requisite model inputs can be extracted as shown in the final part of the paper.
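
    The tiny example below illustrates, in a generic direct-capitalization setting, the circularity that ad valorem expenses introduce and how it can be resolved either iteratively or in closed form; the figures and the simple setup are assumptions for illustration only, not the authors' rental-multiplier formulas.

    ```python
    # Tiny numerical illustration of the circularity that ad valorem expenses
    # introduce into income-based value estimates: property value depends on net
    # operating income, which itself depends on value through a tax levied on it.
    # The figures and the direct-capitalization setup are illustrative assumptions,
    # not the authors' rental-multiplier formulas.
    gross_rent = 100_000        # annual gross rent
    other_expenses = 20_000     # non-ad-valorem operating expenses
    tax_rate = 0.02             # ad valorem (property) tax rate, applied to value V
    cap_rate = 0.08             # capitalization rate

    # Fixed-point iteration on V = (gross_rent - other_expenses - tax_rate*V) / cap_rate
    V = 0.0
    for _ in range(50):
        V = (gross_rent - other_expenses - tax_rate * V) / cap_rate

    # The same circular definition solved in closed form:
    V_closed = (gross_rent - other_expenses) / (cap_rate + tax_rate)
    print(f"iterative: {V:,.0f}   closed-form: {V_closed:,.0f}")
    ```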

  17. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    Science.gov (United States)

    Rozanov, V. V.; Dinter, T.; Rozanov, A. V.; Wolanin, A.; Bracher, A.; Burrows, J. P.

    2017-06-01

    SCIATRAN is a comprehensive software package which is designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which takes into account accurately inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean-atmosphere radiative transfer solver presented by Rozanov et al. [61] we have implemented the simulation of the rotational Raman scattering, vibrational Raman scattering, chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how some selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with those published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package along with a detailed User's Guide is made available for scientists and students, who are undertaking their own research typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de.

  18. Accounting for measurement error in human life history trade-offs using structural equation modeling.

    Science.gov (United States)

    Helle, Samuli

    2018-03-01

    Revealing causal effects from correlative data is very challenging and a contemporary problem in human life history research owing to the lack of experimental approaches. Problems with causal inference arising from measurement error in independent variables, whether related to inaccurate measurement technique or to the validity of the measurements, seem not to be well known in this field. The aim of this study is to show how structural equation modeling (SEM) with latent variables can be applied to account for measurement error in independent variables when the researcher has recorded several indicators of a hypothesized latent construct. As a simple example of this approach, measurement error in lifetime allocation of resources to reproduction in Finnish preindustrial women is modelled in the context of the survival cost of reproduction. In humans, lifetime energetic resources allocated to reproduction are almost impossible to quantify with precision and, thus, typically used measures of lifetime reproductive effort (e.g., lifetime reproductive success and parity) are likely to be plagued by measurement error. These results are contrasted with those obtained from a traditional regression approach where the single best proxy of lifetime reproductive effort available in the data is used for inference. As expected, the inability to account for measurement error in women's lifetime reproductive effort resulted in the underestimation of its underlying effect size on post-reproductive survival. This article emphasizes the advantages that the SEM framework can provide in handling measurement error via multiple-indicator latent variables in human life history studies. © 2017 Wiley Periodicals, Inc.
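
    The simulation sketch below reproduces the attenuation problem described above: regressing an outcome on a single noisy proxy of a latent construct underestimates the true effect, while a crude composite of several indicators (standing in for a full SEM latent variable) reduces the bias. All quantities are simulated assumptions, not the Finnish data or the authors' SEM.

    ```python
    # Simulation sketch of the attenuation problem the paper addresses: regressing
    # an outcome on a single noisy proxy of a latent construct underestimates the
    # true effect, while a composite of several indicators (a crude stand-in for an
    # SEM latent variable) reduces the bias. All quantities are simulated.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 2000
    effort = rng.normal(0, 1, n)                       # latent lifetime reproductive effort
    survival = -0.5 * effort + rng.normal(0, 1, n)     # true effect = -0.5 (survival cost)

    # Three noisy indicators of the latent construct (hypothetical proxies).
    indicators = np.column_stack([effort + rng.normal(0, 1.0, n) for _ in range(3)])

    def slope(x, y):
        c = np.cov(x, y)
        return c[0, 1] / c[0, 0]

    naive = slope(indicators[:, 0], survival)          # single best proxy
    composite = indicators.mean(axis=1)                # crude latent score
    latent_like = slope(composite, survival)
    print(f"true effect: -0.50, single proxy: {naive:.2f}, composite score: {latent_like:.2f}")
    ```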

  19. Implementation of a cost-accounting model in a biobank: practical implications.

    Science.gov (United States)

    Gonzalez-Sanchez, Maria Beatriz; Lopez-Valeiras, Ernesto; García-Montero, Andres C

    2014-01-01

    Given the state of global economy, cost measurement and control have become increasingly relevant over the past years. The scarcity of resources and the need to use these resources more efficiently is making cost information essential in management, even in non-profit public institutions. Biobanks are no exception. However, no empirical experiences on the implementation of cost accounting in biobanks have been published to date. The aim of this paper is to present a step-by-step implementation of a cost-accounting tool for the main production and distribution activities of a real/active biobank, including a comprehensive explanation on how to perform the calculations carried out in this model. Two mathematical models for the analysis of (1) production costs and (2) request costs (order management and sample distribution) have stemmed from the analysis of the results of this implementation, and different theoretical scenarios have been prepared. Global analysis and discussion provides valuable information for internal biobank management and even for strategic decisions at the research and development governmental policies level.

  20. Accounting for rainfall evaporation using dual-polarization radar and mesoscale model data

    Science.gov (United States)

    Pallardy, Quinn; Fox, Neil I.

    2018-02-01

    Implementation of dual-polarization radar should allow for improvements in quantitative precipitation estimates, because dual-polarization capability allows retrieval of the second moment of the gamma drop size distribution. Knowledge of the shape of the DSD can then be used in combination with mesoscale model data to estimate the motion and evaporation of each size of drop falling from the height at which precipitation is observed by the radar to the surface. Using data from Central Missouri, at a range between 130 and 140 km from the operational National Weather Service radar, a raindrop tracing scheme was developed to account for the effects of evaporation, in which individual raindrops hitting the ground were traced to the point in space and time where they interacted with the radar beam. The results indicated that evaporation played a significant role in radar rainfall estimation in situations where the atmosphere was relatively dry. Improvements in radar-estimated rainfall were also found in these situations by accounting for evaporation. The conclusion was made that the effects of raindrop evaporation were significant enough to warrant further research into the inclusion of high-resolution model data in the radar rainfall estimation process for appropriate locations.

  1. An extended car-following model accounting for the average headway effect in intelligent transportation system

    Science.gov (United States)

    Kuang, Hua; Xu, Zhi-Peng; Li, Xing-Li; Lo, Siu-Ming

    2017-04-01

    In this paper, an extended car-following model is proposed to simulate traffic flow by considering the average headway of the preceding vehicle group in an intelligent transportation systems environment. The stability condition of this model is obtained by using linear stability analysis. The phase diagram can be divided into three regions classified as the stable, the metastable and the unstable ones. The theoretical result shows that the average headway plays an important role in improving the stability of the traffic system. The mKdV equation near the critical point is derived to describe the evolution properties of traffic density waves by applying the reductive perturbation method. Furthermore, through simulation of the space-time evolution of the vehicle headway, it is shown that traffic jams can be suppressed efficiently by taking into account the average headway effect, and the analytical result is consistent with the simulation one.
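
    A hedged sketch of an optimal-velocity-type car-following model in which the optimal velocity is evaluated on the average headway of the preceding vehicle group, the mechanism described above, is given below; the OV function, parameters and ring-road setup are common textbook choices, not the paper's exact formulation.

    ```python
    # Sketch of an optimal-velocity-type car-following model on a ring road in
    # which the optimal velocity is evaluated on the average headway of the m
    # preceding vehicles. OV function, parameters and setup are common
    # illustrative choices, not the paper's exact formulation.
    import numpy as np

    N, L = 50, 200.0          # vehicles, ring-road length [m]
    a = 1.0                   # driver sensitivity [1/s]
    m = 3                     # number of preceding vehicles averaged over
    dt, steps = 0.1, 5000

    def V(h):                 # a typical optimal velocity function, vmax/2 = 1.0, hc = 4.0
        return 1.0 * (np.tanh(h - 4.0) + np.tanh(4.0))

    rng = np.random.default_rng(0)
    x = np.linspace(0, L, N, endpoint=False) + rng.normal(0, 0.1, N)   # perturbed positions
    v = np.full(N, V(L / N))                                           # equilibrium speeds

    for _ in range(steps):
        headway = (np.roll(x, -1) - x) % L                 # distance to the vehicle ahead
        avg_h = sum(np.roll(headway, -k) for k in range(m)) / m   # average headway of the group
        dv = a * (V(avg_h) - v)                            # relaxation towards the OV of avg_h
        v = v + dv * dt
        x = (x + v * dt) % L

    print(f"headway standard deviation after {steps} steps: {headway.std():.3f} m")
    ```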

  2. MODELLING OF THERMOELASTIC TRANSIENT CONTACT INTERACTION FOR BINARY BEARING TAKING INTO ACCOUNT CONVECTION

    Directory of Open Access Journals (Sweden)

    Igor KOLESNIKOV

    2016-12-01

    Full Text Available Serviceability of metal-polymeric "dry-friction" sliding bearings depends on many parameters, including the rotational speed, friction coefficient, thermal and mechanical properties of the bearing system and, as a result, the value of the contact temperature. The objective of this study is to develop a computational model for the metal-polymer bearing, to determine on the basis of this model the temperature distribution and the equivalent and contact stresses for the elements of the bearing arrangement, and to select the optimal parameters of the bearing system so as to achieve thermal balance. The static problem for the combined sliding bearing, taking into account heat generation due to friction, has been studied in [1]; the dynamic thermoelastic problems of shaft rotation in single- and double-layer bronze bearings were investigated in [2, 3].

  3. Accounting for genetic interactions improves modeling of individual quantitative trait phenotypes in yeast.

    Science.gov (United States)

    Forsberg, Simon K G; Bloom, Joshua S; Sadhu, Meru J; Kruglyak, Leonid; Carlborg, Örjan

    2017-04-01

    Experiments in model organisms report abundant genetic interactions underlying biologically important traits, whereas quantitative genetics theory predicts, and data support, the notion that most genetic variance in populations is additive. Here we describe networks of capacitating genetic interactions that contribute to quantitative trait variation in a large yeast intercross population. The additive variance explained by individual loci in a network is highly dependent on the allele frequencies of the interacting loci. Modeling of phenotypes for multilocus genotype classes in the epistatic networks is often improved by accounting for the interactions. We discuss the implications of these results for attempts to dissect genetic architectures and to predict individual phenotypes and long-term responses to selection.

  4. Proper body mechanics from an engineering perspective.

    Science.gov (United States)

    Mohr, Edward G

    2010-04-01

    The economic viability of the manual therapy practitioner depends on the number of massages/treatments that can be given in a day or week. Fatigue or injuries can have a major impact on the income potential and could ultimately reach the point which causes the practitioner to quit the profession, and seek other, less physically demanding, employment. Manual therapy practitioners in general, and massage therapists in particular, can utilize a large variety of body postures while giving treatment to a client. The hypothesis of this paper is that there is an optimal method for applying force to the client, which maximizes the benefit to the client, and at the same time minimizes the strain and effort required by the practitioner. Two methods were used to quantifiably determine the effect of using "poor" body mechanics (Improper method) and "best" body mechanics (Proper/correct method). The first approach uses computer modeling to compare the two methods. Both postures were modeled, such that the biomechanical effects on the practitioner's elbow, shoulder, hip, knee and ankle joints could be calculated. The force applied to the client, along with the height and angle of application of the force, was held constant for the comparison. The second approach was a field study of massage practitioners (n=18) to determine their maximal force capability, again comparing methods using "Improper and Proper body mechanics". Five application methods were tested at three different application heights, using a digital palm force gauge. Results showed that there was a definite difference between the two methods, and that the use of correct body mechanics can have a large impact on the health and well being of the massage practitioner over both the short and long term. Copyright 2009 Elsevier Ltd. All rights reserved.

  5. Calibration of an experimental model of tritium storage bed designed for 'in situ' accountability

    International Nuclear Information System (INIS)

    Bidica, Nicolae; Stefanescu, Ioan; Bucur, Ciprian; Bulubasa, Gheorghe; Deaconu, Mariea

    2009-01-01

    Full text: Objectives: Tritium accountancy of the storage beds in tritium facilities is an important issue for tritium inventory control. The purpose of our work was to calibrate an experimental model of a tritium storage bed with a special design, using electric heaters to simulate tritium decay, and to evaluate the detection limit of the accountancy method. The objective of this paper is to present the experimental method used for calibration of the storage bed and the experimental results, consisting of calibration curves and the detection limit. Our method is based on a 'self-assaying' tritium storage bed. The basic design of our storage bed consists, in principle, of a uniform distribution of the storage material on several thin copper fins (in order to obtain a uniform temperature field inside the bed), an electrical heat source to simulate the tritium decay heat, a system of thermocouples for measuring the temperature field inside the bed, and good thermal isolation of the bed from the external environment. With this design of the tritium storage bed, the tritium accounting method is based on determining the decay heat of tritium by measuring the temperature increase of the isolated storage bed. The experimental procedure consisted of measuring the temperature field inside the bed for a few values of the power injected with the electrical heat source. Data were collected for a few hours and the temperature increase rate was determined for each value of the injected power. A graphical representation of temperature rise versus injected power was obtained. This method of accounting for tritium inventory stored as metal tritide is a reliable solution for in-situ tritium accountability in a tritium handling facility. Several improvements can be made to the design of the storage bed in order to improve the measurement accuracy and obtain a lower detection limit, for instance the use of more accurate thermocouples or special
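    The calibration step described above can be illustrated with a short sketch: fit the measured temperature-rise rate against injected heater power, then invert the line to convert an observed rate into decay heat and, assuming roughly 0.324 W per gram of tritium, into inventory. The numbers below are placeholders, not the experimental data.

```python
import numpy as np

# Calibration data: injected heater power (W) vs. measured temperature
# rise rate (K/h).  The values below are placeholders, not the paper's data.
power_w = np.array([0.5, 1.0, 2.0, 4.0])
dT_dt   = np.array([0.11, 0.23, 0.44, 0.90])

# Least-squares calibration line dT/dt = a * P + b
a, b = np.polyfit(power_w, dT_dt, 1)

def decay_heat_from_rate(rate_k_per_h):
    """Invert the calibration curve to estimate decay heat (W)."""
    return (rate_k_per_h - b) / a

def tritium_grams(decay_heat_w, specific_heat_w_per_g=0.324):
    """Convert decay heat to tritium mass, assuming ~0.324 W/g for tritium."""
    return decay_heat_w / specific_heat_w_per_g

measured_rate = 0.35                      # K/h observed on a loaded bed (illustrative)
P = decay_heat_from_rate(measured_rate)
print(f"decay heat ~ {P:.2f} W, tritium ~ {tritium_grams(P):.2f} g")
```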

  6. Gaussian covariance graph models accounting for correlated marker effects in genome-wide prediction.

    Science.gov (United States)

    Martínez, C A; Khare, K; Rahman, S; Elzo, M A

    2017-10-01

    Several statistical models used in genome-wide prediction assume uncorrelated marker allele substitution effects, but it is known that these effects may be correlated. In statistics, graphical models have been identified as a useful tool for covariance estimation in high-dimensional problems, and it is an area that has recently experienced great expansion. In Gaussian covariance graph models (GCovGM), the joint distribution of a set of random variables is assumed to be Gaussian, and the pattern of zeros of the covariance matrix is encoded in terms of an undirected graph G. In this study, methods adapting the theory of GCovGM to genome-wide prediction were developed (Bayes GCov, Bayes GCov-KR and Bayes GCov-H). In simulated data sets, improvements in the correlation between phenotypes and predicted breeding values and in the accuracy of predicted breeding values were found. Our models account for correlation of marker effects and permit general covariance structures to be accommodated, as opposed to models proposed in previous studies, which consider spatial correlation only. In addition, they allow biological information to be incorporated in the prediction process through its use when constructing the graph G, and their extension to the multi-allelic loci case is straightforward. © 2017 Blackwell Verlag GmbH.

  7. A New Evapotranspiration Model Accounting for Advection and Its Validation during SMEX02

    Directory of Open Access Journals (Sweden)

    Yongmin Yang

    2013-01-01

    Full Text Available Based on the crop water stress index (CWSI) concept, a new model was proposed to account for advection in estimating evapotranspiration. Both a local-scale evaluation with site observations and a regional-scale evaluation with a remote sensing dataset from Landsat 7 ETM+ were carried out to assess the performance of this model. The local-scale evaluation indicates that this newly developed model can effectively characterize the daily variations of evapotranspiration, and the predicted results show good agreement with the site observations. For all six corn sites, the coefficient of determination (R2) is 0.90 and the root mean square difference (RMSD) is 58.52 W/m2. For all six soybean sites, the R2 and RMSD are 0.85 and 49.46 W/m2, respectively. The regional-scale evaluation shows that the model can capture the spatial variations of evapotranspiration at the Landsat scale. Clear spatial patterns were observed at the Landsat scale and are closely related to the dominant land covers, corn and soybean. Furthermore, the surface resistance derived from the instantaneous CWSI was applied to the Penman-Monteith equation to estimate daily evapotranspiration. Overall, the results indicate that this newly developed model is capable of estimating reliable surface heat fluxes using remotely sensed data.
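    A minimal sketch of the underlying CWSI idea, in which actual evapotranspiration is scaled from potential evapotranspiration by (1 - CWSI); this is a simplification for illustration, not the advection-aware formulation developed in the paper.

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop water stress index: 0 = unstressed (wet bound), 1 = fully stressed."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

def actual_et(et_potential, t_canopy, t_wet, t_dry):
    """Scale potential ET by (1 - CWSI); a simplification of the CWSI concept,
    not the advection-aware model evaluated in the study."""
    c = min(max(cwsi(t_canopy, t_wet, t_dry), 0.0), 1.0)
    return (1.0 - c) * et_potential

# Example: canopy at 27 C between wet (24 C) and dry (32 C) bounds,
# potential ET of 450 W/m2  ->  ~281 W/m2
print(actual_et(450.0, 27.0, 24.0, 32.0))
```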

  8. A model proposal concerning balance scorecard application integrated with resource consumption accounting in enterprise performance management

    Directory of Open Access Journals (Sweden)

    ORHAN ELMACI

    2014-06-01

    Full Text Available The present study investigates the “Balance Scorecard (BSC) model integrated with Resource Consumption Accounting (RCA)”, which helps to evaluate the enterprise as a matrix structure in all its parts. It aims to measure how much the tangible and intangible values (assets) of enterprises contribute to those enterprises; in other words, how effectively, actively and efficiently these values (assets) are used. In short, it aims to measure the sustainable competency of enterprises. Because expressing the effect of tangible and intangible values (assets) of the enterprise on performance with mathematical and statistical methods alone is insufficient, the RCA method integrated with the BSC model is based on matrix structures and control models. The effects of all complex factors in the enterprise on performance (productivity and efficiency) are estimated algorithmically with a cause-and-effect diagram. The contribution of matrix structures to reaching the management's functional targets in enterprises operating in an increasingly competitive market environment is discussed. Thus, in the context of modern management theories, and as a contribution to the BSC approach that is at the foreground of today's administrative science for enterprises with matrix organizational structures, a multidimensional performance evaluation model, RCA integrated with BSC, is proposed as a strategic planning and strategic evaluation instrument.

  9. A Model for Urban Environment and Resource Planning Based on Green GDP Accounting System

    Directory of Open Access Journals (Sweden)

    Linyu Xu

    2013-01-01

    Full Text Available The urban environment and resources are currently on a course that is unsustainable in the long run due to the excessive human pursuit of economic goals. Thus, it is very important to develop a model to analyse the relationship between urban economic development and environmental resource protection during rapid urbanisation. This paper proposes a model to identify the key factors in urban environment and resource regulation based on a green GDP accounting system, which consists of four parts: economy, society, resource, and environment. In this model, the analytic hierarchy process (AHP) method and a modified Pearl curve model were combined to allow for dynamic evaluation, with a higher green GDP value as the planning target. The model was applied to the environmental and resource planning problem of Wuyishan City, and the results showed that energy use was a key factor influencing urban environment and resource development. Biodiversity and air quality were the most sensitive factors influencing the value of green GDP in the city. According to the analysis, urban environment and resource planning could be improved to promote sustainable development in Wuyishan City.
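    The AHP step mentioned above can be illustrated by deriving criterion weights from a pairwise comparison matrix via its principal eigenvector; the judgments in the matrix below are illustrative only, not those used for Wuyishan City.

```python
import numpy as np

# Pairwise comparison matrix for four criteria (economy, society,
# resource, environment); the judgments below are illustrative only.
A = np.array([
    [1.0, 2.0, 3.0, 1.0],
    [1/2, 1.0, 2.0, 1/2],
    [1/3, 1/2, 1.0, 1/3],
    [1.0, 2.0, 3.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # normalized AHP weights

# Consistency ratio check (Saaty random index 0.90 for n = 4)
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print("weights:", w.round(3), "CR:", round(ci / 0.90, 3))
```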

  10. Air quality modeling for accountability research: Operational, dynamic, and diagnostic evaluation

    Science.gov (United States)

    Henneman, Lucas R. F.; Liu, Cong; Hu, Yongtao; Mulholland, James A.; Russell, Armistead G.

    2017-10-01

    Photochemical grid models play a central role in air quality regulatory frameworks, including in air pollution accountability research, which seeks to demonstrate the extent to which regulations causally impacted emissions, air quality, and public health. There is a need, however, to develop and demonstrate appropriate practices for model application and evaluation in an accountability framework. We employ a combination of traditional and novel evaluation techniques to assess four years (2001-02, 2011-12) of simulated pollutant concentrations across a decade of major emissions reductions using the Community Multiscale Air Quality (CMAQ) model. We have grouped our assessments into three categories: operational evaluation investigates how well CMAQ captures absolute concentrations; dynamic evaluation investigates how well CMAQ captures changes in concentrations across the decade of changing emissions; diagnostic evaluation investigates how CMAQ attributes variability in concentrations and sensitivities to emissions between meteorology and emissions, and how well this attribution compares to empirical statistical models. In this application, CMAQ captures O3 and PM2.5 concentrations and their change over the decade in the Eastern United States similarly to past CMAQ applications and in line with model evaluation guidance; however, some PM2.5 species (EC, OC, and sulfate in particular) exhibit high biases in various months. CMAQ-simulated PM2.5 has a high bias in winter months and a low bias in the summer, mainly due to a high bias in OC during the cold months and low biases in OC and sulfate during the summer. Simulated O3 and PM2.5 changes across the decade have normalized mean biases of less than 2.5% and 17%, respectively. Detailed comparisons suggest biased EC emissions, negative wintertime SO42- sensitivities to mobile source emissions, and incomplete capture of OC chemistry in the summer and winter. Photochemical grid model-simulated O3 and PM2.5 responses to emissions and
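    The operational-evaluation statistics quoted (normalized mean bias, and the related normalized mean error) can be computed as in the short sketch below; the paired values are illustrative, not CMAQ output.

```python
import numpy as np

def normalized_mean_bias(model, obs):
    """NMB = sum(model - obs) / sum(obs), usually reported in percent."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return np.sum(model - obs) / np.sum(obs)

def normalized_mean_error(model, obs):
    """NME = sum(|model - obs|) / sum(obs)."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return np.sum(np.abs(model - obs)) / np.sum(obs)

# Illustrative daily PM2.5 pairs (ug/m3)
obs  = [12.0, 18.5, 9.3, 22.1]
cmaq = [13.1, 16.9, 11.0, 20.4]
print(f"NMB = {100 * normalized_mean_bias(cmaq, obs):.1f}%",
      f"NME = {100 * normalized_mean_error(cmaq, obs):.1f}%")
```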

  11. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    International Nuclear Information System (INIS)

    Rozanov, V.V.; Dinter, T.; Rozanov, A.V.; Wolanin, A.; Bracher, A.; Burrows, J.P.

    2017-01-01

    SCIATRAN is a comprehensive software package designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18–40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean–atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean–atmosphere radiative transfer solver presented by Rozanov et al. we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available to scientists and students undertaking their own research, typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen (http://www.iup.physik.uni-bremen.de). - Highlights: • A new version of the software package SCIATRAN is presented. • Inelastic scattering in water and atmosphere is implemented in SCIATRAN. • Raman scattering and fluorescence can be included in radiative transfer calculations. • Comparisons to other radiative transfer models show excellent agreement. • Comparisons to observations show consistent results.

  12. Modeling of ethylbenzene dehydrogenation kinetics process taking into account deactivation of catalyst bed of the reactor

    Directory of Open Access Journals (Sweden)

    V. K. Bityukov

    2017-01-01

    Full Text Available The styrene synthesis process occurring in a two-stage continuous adiabatic reactor is a complex chemical engineering system. It is characterized by indeterminacy and nonstationarity and is subject to permanent uncontrolled disturbances. Therefore, the task of developing a predictive control system for the concentration of the main product of the dehydrogenation reaction, styrene, that maintains this value within a predetermined range throughout the period of operation is important. This is impossible without a process model based on a revised kinetic scheme that takes into account the drop in the activity of the reactor catalyst bed due to coke formation on its surface. The article justifies and proposes a dependence describing the drop in catalyst bed activity as a function of reactor block operation time, together with an improved model of the chemical reaction kinetics. The synthesized mathematical model of the process is a system of ordinary differential equations and allows one to calculate the concentration profiles of the reaction mixture components during the passage of the charge through the adiabatic reactor stage, and to determine the contact gas composition at the outlet of the reactor stages throughout the cycle of the catalytic system, taking into account temperature changes and the drop in catalyst bed activity. The decreased catalyst bed activity is compensated by raising the temperature in the reactor block over the course of operation. The values of the chemical reaction rate constants are estimated, and the concentrations of the main and by-products of the dehydrogenation reactions at the outlet of the reactor plant are calculated and analysed. Simulation results show that changing the reactor temperature according to an exponential law that accounts for deactivation of the catalyst bed keeps the yield within the range specified by the process regulations throughout the operation cycle of the reactor block.
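    A minimal sketch of the general approach described: a first-order dehydrogenation rate scaled by a catalyst activity that decays with time on stream, integrated over the residence time in the bed. The rate constant, the exponential deactivation law and all numbers are placeholders, not the model identified in the article.

```python
import numpy as np
from scipy.integrate import solve_ivp

def activity(t_onstream_h, k_d=2.0e-4):
    """Assumed exponential decay of catalyst bed activity with time on stream."""
    return np.exp(-k_d * t_onstream_h)

def rhs(tau, c, k1, a):
    """d[EB]/dtau = -a*k1*[EB];  d[ST]/dtau = +a*k1*[EB].
    tau is the residence time in the bed; a is the current bed activity."""
    eb, st = c
    r = a * k1 * eb
    return [-r, r]

def outlet(c_eb_in, k1=3.5, tau_total=0.8, t_onstream_h=1000.0):
    """Ethylbenzene and styrene concentrations at the reactor outlet."""
    a = activity(t_onstream_h)
    sol = solve_ivp(rhs, (0.0, tau_total), [c_eb_in, 0.0], args=(k1, a))
    return sol.y[:, -1]

print(outlet(c_eb_in=1.0))
```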

  13. A margin model to account for respiration-induced tumour motion and its variability

    International Nuclear Information System (INIS)

    Coolens, Catherine; Webb, Steve; Evans, Phil M; Shirato, H; Nishioka, K

    2008-01-01

    In order to reduce the sensitivity of radiotherapy treatments to organ motion, compensation methods are being investigated such as gating of treatment delivery, tracking of tumour position, 4D scanning and planning of the treatment, etc. An outstanding problem that would occur with all these methods is the assumption that breathing motion is reproducible throughout the planning and delivery process of treatment. This is obviously not a realistic assumption and is one that will introduce errors. A dynamic internal margin model (DIM) is presented that is designed to follow the tumour trajectory and account for the variability in respiratory motion. The model statistically describes the variation of the breathing cycle over time, i.e. the uncertainty in motion amplitude and phase reproducibility, in a polar coordinate system from which margins can be derived. This allows accounting for an additional gating window parameter for gated treatment delivery as well as minimizing the area of normal tissue irradiated. The model was illustrated with abdominal motion for a patient with liver cancer and tested with internal 3D lung tumour trajectories. The results confirm that the respiratory phases around exhale are most reproducible and have the smallest variation in motion amplitude and phase (approximately 2 mm). More importantly, the margin area covering normal tissue is significantly reduced by using trajectory-specific margins (as opposed to conventional margins) as the angular component is by far the largest contributor to the margin area. The statistical approach to margin calculation, in addition, offers the possibility for advanced online verification and updating of breathing variation as more data become available

  14. FPLUME-1.0: An integral volcanic plume model accounting for ash aggregation

    Science.gov (United States)

    Folch, Arnau; Costa, Antonio; Macedonio, Giovanni

    2016-04-01

    Eruption Source Parameters (ESP) characterizing volcanic eruption plumes are crucial inputs for atmospheric tephra dispersal models used for hazard assessment and risk mitigation. We present FPLUME-1.0, a steady-state 1D cross-section-averaged eruption column model based on the Buoyant Plume Theory (BPT). The model accounts for plume bending by wind, entrainment of ambient moisture, effects of water phase changes, particle fallout and re-entrainment, a new parameterization for the air entrainment coefficients, and a model for wet aggregation of ash particles in the presence of liquid water or ice. When wet aggregation occurs, the model predicts an "effective" grain size distribution depleted in fines with respect to that erupted at the vent. Given a wind profile, the model can be used to determine the column height from the eruption mass flow rate or vice versa. The ultimate goal is to improve ash cloud dispersal forecasts by better constraining the ESP (column height, eruption rate and vertical distribution of mass) and the "effective" particle grain size distribution resulting from any wet aggregation within the plume. As test cases we apply the model to the eruptive phase-B of the 4 April 1982 El Chichón eruption (México) and to the 6 May 2010 phase of the Eyjafjallajökull eruption (Iceland). The modular structure of the code will facilitate the implementation, in future code versions, of more quantitative ash aggregation parameterizations as further observational and experimental data become available for better constraining ash aggregation processes.

  15. A Unifying Modeling of Plant Shoot Gravitropism With an Explicit Account of the Effects of Growth

    Directory of Open Access Journals (Sweden)

    Renaud eBastien

    2014-04-01

    Full Text Available Gravitropism, the slow reorientation of plant growth in response to gravity, is a major determinant of the form and posture of land plants. Recently a universal model of shoot gravitropism, the AC model, was presented, in which the dynamics of the tropic movement is determined only by the contradictory controls of (i) graviception, which tends to curve the plant towards the vertical, and (ii) proprioception, which tends to keep the stem straight. This model was found valid over a large range of species and over two orders of magnitude in organ size. However, the motor of the movement, elongation, was neglected in the AC model. Taking explicit growth effects into account, however, requires consideration of the material derivative, i.e. the rate of change of curvature bound to an expanding and convected organ element. Here we show that it is possible to rewrite the material equation of curvature in a compact simplified form that expresses the curvature variation directly as a function of the median elongation and of the distribution of differential growth. Through this extended model, called the ACE model, two main destabilizing effects of growth on the tropic movement are identified: (i) passive orientation drift, which occurs when a curved element elongates without differential growth, and (ii) fixed curvature, which occurs when an element leaves the elongation zone and is no longer able to change its curvature actively. By comparing the AC and ACE models to experiments, these two effects were found to be negligible, revealing a probable selection for rapid convergence to the steady-state shape during the tropic movement so as to escape the destabilizing effects of growth, involving in particular a selection on proprioceptive sensitivity. The simplified AC model can therefore be used to analyze gravitropism and posture control in actively elongating plant organs without significant loss of information.

  16. Material Protection, Accounting, and Control Technologies (MPACT): Modeling and Simulation Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Cipiti, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dunn, Timothy [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Durbin, Samual [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Durkee, Joe W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); England, Jeff [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, Robert [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Ketusky, Edward [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Li, Shelly [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lindgren, Eric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Meier, David [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Miller, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Osburn, Laura Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pereira, Candido [Argonne National Lab. (ANL), Argonne, IL (United States); Rauch, Eric Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Scaglione, John [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Scherer, Carolynn P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sprinkle, James K. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Yoo, Tae-Sic [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-05

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal. This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling. To aid in framing its long-term goal, during FY16, a modeling and simulation roadmap is being developed for three major areas of investigation: (1) radiation transport and sensors, (2) process and chemical models, and (3) shock physics and assessments. For each area, current modeling approaches are described, and gaps and needs are identified.

  17. Accounting for exhaust gas transport dynamics in instantaneous emission models via smooth transition regression.

    Science.gov (United States)

    Kamarianakis, Yiannis; Gao, H Oliver

    2010-02-15

    Collecting and analyzing high-frequency emission measurements has become very common during the past decade, as significantly more information about formation conditions can be collected than from regulated bag measurements. A challenging issue for researchers is the accurate time alignment between tailpipe measurements and engine operating variables. An alignment procedure should take into account both the reaction time of the analyzers and the dynamics of gas transport in the exhaust and measurement systems. This paper discusses a statistical modeling framework that compensates for variable exhaust transport delay while relating tailpipe measurements to engine operating covariates. Specifically, it is shown that some variants of the smooth transition regression model allow for transport delays that vary smoothly as functions of the exhaust flow rate. These functions are characterized by a pair of coefficients that can be estimated via a least-squares procedure. The proposed models can be adapted to encompass inherent nonlinearities that were implicit in previous instantaneous emissions modeling efforts. This article describes the methodology and presents an illustrative application using data collected from a diesel bus under real-world driving conditions.
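    The core alignment idea can be sketched as follows: assume a transport delay that varies smoothly with exhaust flow rate and shift each engine covariate by its own delay before regressing it against the tailpipe signal. The delay function d(q) = a + b/q and its coefficients are assumptions for illustration, not the estimated smooth transition specification.

```python
import numpy as np

def transport_delay(flow_rate, a=1.5, b=40.0):
    """Delay (s) that shrinks smoothly as exhaust flow rate grows;
    d(q) = a + b/q is an assumed form with illustrative coefficients."""
    return a + b / np.maximum(flow_rate, 1e-6)

def align(engine_var, flow_rate, dt=1.0):
    """Shift each engine-covariate sample forward by its own transport delay
    so that it lines up with the tailpipe analyzer reading it produced."""
    t = np.arange(len(engine_var)) * dt
    arrival = t + transport_delay(flow_rate)      # assumed monotonic for these values
    # resample the delayed covariate back onto the analyzer time grid
    return np.interp(t, arrival, engine_var)

rng = np.random.default_rng(0)
flow = 20.0 + 5.0 * rng.random(300)               # exhaust flow rate (arbitrary units)
engine_load = np.sin(np.arange(300) / 15.0)       # an engine operating covariate
aligned_load = align(engine_load, flow)
# aligned_load can now be regressed against second-by-second tailpipe measurements
```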

  18. Comprehensive impedance model of cobalt deposition in sulfate solutions accounting for homogeneous reactions and adsorptive effects

    International Nuclear Information System (INIS)

    Vazquez-Arenas, Jorge; Pritzker, Mark

    2011-01-01

    A comprehensive physicochemical model for cobalt deposition onto a cobalt rotating disk electrode in sulfate-borate (pH 3) solutions is derived and statistically fit to experimental EIS spectra obtained over a range of CoSO4 concentrations, overpotentials and rotation speeds. The model accounts for H+ and water reduction, homogeneous reactions and mass transport within the boundary layer. Based on a thermodynamic analysis, the species CoSO4(aq), B(OH)3(aq), B3O3(OH)4-, H+ and OH- and two homogeneous reactions (B(OH)3(aq) hydrolysis and water dissociation) are included in the model. Kinetic and transport parameters are estimated by minimizing the sum-of-squares error between the model and experimental measurements using a simplex method. The electrode response is affected most strongly by parameters associated with the first step of Co(II) reduction, reflecting its control of the rate of Co deposition, and is moderately sensitive to the parameters for H+ reduction and the Co(II) diffusion coefficient. Water reduction is found not to occur to any significant extent under the conditions studied. These trends are consistent with those obtained by fitting equivalent electrical circuits to the experimental spectra. The simplest circuit that best fits the data consists of two RQ elements (resistor-constant phase element) in parallel or in series with the solution resistance.

  19. Modelling of gas-metal arc welding taking into account metal vapour

    Energy Technology Data Exchange (ETDEWEB)

    Schnick, M; Fuessel, U; Hertel, M; Haessler, M [Institute of Surface and Manufacturing Technology, Technische Universitaet Dresden, D-01062 Dresden (Germany); Spille-Kohoff, A [CFX Berlin Software GmbH, Karl-Marx-Allee 90, 10243 Berlin (Germany); Murphy, A B [CSIRO Materials Science and Engineering, PO Box 218, Lindfield NSW 2070 (Australia)

    2010-11-03

    The most advanced numerical models of gas-metal arc welding (GMAW) neglect vaporization of metal, and assume an argon atmosphere for the arc region, as is also common practice for models of gas-tungsten arc welding (GTAW). These models predict temperatures above 20 000 K and a temperature distribution similar to GTAW arcs. However, spectroscopic temperature measurements in GMAW arcs demonstrate much lower arc temperatures. In contrast to measurements of GTAW arcs, they have shown the presence of a central local minimum of the radial temperature distribution. This paper presents a GMAW model that takes into account metal vapour and that is able to predict the local central minimum in the radial distributions of temperature and electric current density. The influence of different values for the net radiative emission coefficient of iron vapour, which vary by up to a factor of a hundred, is examined. It is shown that these net emission coefficients cause differences in the magnitudes, but not in the overall trends, of the radial distribution of temperature and current density. Further, the influence of the metal vaporization rate is investigated. We present evidence that, for higher vaporization rates, the central flow velocity inside the arc is decreased and can even change direction so that it is directed from the workpiece towards the wire, although the outer plasma flow is still directed towards the workpiece. In support of this thesis, we have attempted to reproduce the measurements of Zielinska et al. for spray-transfer mode GMAW numerically, and have obtained reasonable agreement.

  20. Integrated Approach Model of Risk, Control and Auditing of Accounting Information Systems

    Directory of Open Access Journals (Sweden)

    Claudiu BRANDAS

    2013-01-01

    Full Text Available The use of IT in financial and accounting processes is growing fast, and this leads to an increase in research and professional concerns about the risks, control and audit of Accounting Information Systems (AIS). In this context, the risk and control of AIS approach is a central component of processes for IT audit, financial audit and IT Governance. Recent studies in the literature on the concepts of risk, control and auditing of AIS outline two approaches: (1) a professional approach in which we can fit ISA, COBIT, IT Risk, COSO and SOX, and (2) a research-oriented approach that emphasizes research on continuous auditing and fraud using information technology. Starting from the limits of existing approaches, our study aims to develop and test an Integrated Approach Model of Risk, Control and Auditing of AIS covering three business process cycles: the purchases cycle, the sales cycle and the cash cycle, in order to improve the efficiency of IT Governance and to ensure the integrity, reality, accuracy and availability of financial statements.

  1. Pretransformation strain modulations in proper ferroelastics

    Energy Technology Data Exchange (ETDEWEB)

    Saxena, A. (Los Alamos National Lab., NM (United States)); Barsch, G.R. (Pennsylvania State Univ., University Park, PA (United States))

    1992-01-01

    Within the framework of the Landau-Ginzburg model for the O_h-D_4h proper ferroelastic transformation, pretransformation structural modulations may arise as (possibly defect-stabilized) pseudo-critical fluctuations that can be described by an effective elastic φ^4 model. Here we show that the strain amplitude and temperature dependence of an intermediate tetragonal phase observed above, and concurrently with, the face-centered cubic to face-centered tetragonal transition in Fe_0.7Pd_0.3 can be understood semi-quantitatively in terms of the crossover behavior from the Ising to the displacive limit by applying the crossover phase diagram of Beale, Sarker and Krumhansl to the effective φ^4 model. The fact that this type of pretransformation modulation has not been observed with X-rays for other ferroelastic transformations can be attributed to large values of the order-parameter gradient coefficient in these cases.

  2. Pretransformation strain modulations in proper ferroelastics

    Energy Technology Data Exchange (ETDEWEB)

    Saxena, A. [Los Alamos National Lab., NM (United States); Barsch, G.R. [Pennsylvania State Univ., University Park, PA (United States)

    1992-12-31

    Within the framework of the Landau-Ginzburg model for the O_h-D_4h proper ferroelastic transformation, pretransformation structural modulations may arise as (possibly defect-stabilized) pseudo-critical fluctuations that can be described by an effective elastic φ^4 model. Here we show that the strain amplitude and temperature dependence of an intermediate tetragonal phase observed above, and concurrently with, the face-centered cubic to face-centered tetragonal transition in Fe_0.7Pd_0.3 can be understood semi-quantitatively in terms of the crossover behavior from the Ising to the displacive limit by applying the crossover phase diagram of Beale, Sarker and Krumhansl to the effective φ^4 model. The fact that this type of pretransformation modulation has not been observed with X-rays for other ferroelastic transformations can be attributed to large values of the order-parameter gradient coefficient in these cases.

  3. Accounting for treatment use when validating a prognostic model: a simulation study.

    Science.gov (United States)

    Pajouheshnia, Romin; Peelen, Linda M; Moons, Karel G M; Reitsma, Johannes B; Groenwold, Rolf H H

    2017-07-14

    Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance and present the uses and limitations of available analytical methods to account for this using simulated data. We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and the use of inverse probability weighting (IPW) on the estimated model discrimination (c-index) and calibration (observed:expected ratio and calibration plots) in scenarios with different patterns and effects of treatment use. Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk when the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals, and should not be ignored. When treatment use is random, treated
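    A minimal sketch of the IPW-then-exclude strategy discussed above: model the probability of remaining untreated, keep only untreated individuals, and weight them by the inverse of that probability when computing discrimination and calibration. The helper below is illustrative, not the authors' code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def weighted_validation(pred_risk, outcome, treated, covariates):
    """Validate a 'risk without treatment' model on a partly treated cohort:
    model treatment use, keep only the untreated, and weight each of them by
    1 / P(untreated | X).  A sketch of the IPW-then-exclude strategy."""
    pred_risk, outcome, treated = (np.asarray(a, float) for a in
                                   (pred_risk, outcome, treated))
    ps = LogisticRegression(max_iter=1000).fit(covariates, treated)
    p_untreated = ps.predict_proba(covariates)[:, list(ps.classes_).index(0.0)]
    keep = treated == 0
    w, r, y = 1.0 / p_untreated[keep], pred_risk[keep], outcome[keep]

    oe = np.sum(w * y) / np.sum(w * r)          # weighted observed:expected ratio

    cases, controls = np.where(y == 1)[0], np.where(y == 0)[0]
    num = den = 0.0                             # weighted c-index over case/control pairs
    for i in cases:
        for j in controls:
            pw = w[i] * w[j]
            num += pw * ((r[i] > r[j]) + 0.5 * (r[i] == r[j]))
            den += pw
    return oe, num / den
```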

  4. Accounting for treatment use when validating a prognostic model: a simulation study

    Directory of Open Access Journals (Sweden)

    Romin Pajouheshnia

    2017-07-01

    Full Text Available Abstract Background Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance and present the uses and limitations of available analytical methods to account for this using simulated data. Methods We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and the use of inverse probability weighting (IPW) on the estimated model discrimination (c-index) and calibration (observed:expected ratio and calibration plots) in scenarios with different patterns and effects of treatment use. Results Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk when the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. Conclusions When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals, and

  5. Accounting for detectability in fish distribution models: an approach based on time-to-first-detection

    Directory of Open Access Journals (Sweden)

    Mário Ferreira

    2015-12-01

    Full Text Available Imperfect detection (i.e., failure to detect a species when the species is present) is increasingly recognized as an important source of uncertainty and bias in species distribution modeling. Although methods have been developed to solve this problem by explicitly incorporating variation in detectability in the modeling procedure, their use in freshwater systems remains limited. This is probably because most methods imply repeated sampling (≥ 2) of each location within a short time frame, which may be impractical or too expensive in most studies. Here we explore a novel approach to control for detectability based on the time-to-first-detection, which requires only a single sampling occasion and so may find more general applicability in freshwaters. The approach uses a Bayesian framework to combine conventional occupancy modeling with techniques borrowed from parametric survival analysis, jointly modeling the factors affecting the probability of occupancy and the time required to detect a species. To illustrate the method, we modeled large-scale factors (elevation, stream order and precipitation) affecting the distribution of six fish species in a catchment located in north-eastern Portugal, while accounting for factors potentially affecting detectability at sampling points (stream depth and width). Species detectability was most influenced by depth and to a lesser extent by stream width, and tended to increase over time for most species. Occupancy was consistently affected by stream order, elevation and annual precipitation. These species presented a widespread distribution with higher uncertainty in tributaries and upper stream reaches. This approach can be used to estimate sampling efficiency and provides a practical framework to incorporate variation in detection rates into fish distribution models.
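    A simplified maximum-likelihood analogue of the approach (the paper itself works in a Bayesian framework, and covariates are omitted here): each site is occupied with probability psi and, if occupied, the time to first detection is exponential with rate lambda, so an unsuccessful search of length T contributes psi*exp(-lambda*T) + (1 - psi) to the likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def nll(params, t_detect, t_max):
    """Joint occupancy / time-to-first-detection negative log-likelihood.
    t_detect: time of first detection per site (np.nan if never detected)
    t_max   : total search time per site."""
    psi = expit(params[0])              # occupancy probability
    lam = np.exp(params[1])             # detection rate (per unit time)
    detected = ~np.isnan(t_detect)
    # detected sites: occupied and first detected at t  ->  psi * lam * exp(-lam t)
    ll = np.sum(np.log(psi * lam) - lam * t_detect[detected])
    # undetected sites: occupied but missed, or truly absent
    ll += np.sum(np.log(psi * np.exp(-lam * t_max[~detected]) + (1 - psi)))
    return -ll

# illustrative data: first-detection times in minutes, 30-minute searches
t_detect = np.array([3.0, 12.0, np.nan, 7.5, np.nan, 1.0])
t_max    = np.full(6, 30.0)
fit = minimize(nll, x0=[0.0, -2.0], args=(t_detect, t_max))
psi_hat, lam_hat = expit(fit.x[0]), np.exp(fit.x[1])
print(round(psi_hat, 2), round(lam_hat, 3))
```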

  6. A Thermodamage Strength Theoretical Model of Ceramic Materials Taking into Account the Effect of Residual Stress

    Directory of Open Access Journals (Sweden)

    Weiguo Li

    2012-01-01

    Full Text Available A thermodamage strength theoretical model taking into account the effect of residual stress was established and applied to each temperature phase, based on a study of the effects of various physical mechanisms on the fracture strength of ultrahigh-temperature ceramics. The effects of SiC particle size, crack size, and SiC particle volume fraction on strength at different temperatures were studied in detail. This study showed that when the flaw size is not large, a bigger SiC particle size results in a greater effect of the tensile residual stress in the matrix grains on strength reduction, and this prediction coincides with experimental results; the residual stress and the combined effect of particle size and crack size play important roles in controlling material strength.

  7. A hybrid mode choice model to account for the dynamic effect of inertia over time

    DEFF Research Database (Denmark)

    Cherchi, Elisabetta; Börjesson, Maria; Bierlaire, Michel

    The influence of habits, giving rise to inertia effect, in the choice process has been intensely debated in the literature. Typically inertia is accounted for by letting the indirect utility functions of the alternatives of the choice situation at time t depend on the outcome of the choice made...... gathered over a continuous period of time, six weeks, to study both inertia and the influence of habits. Tendency to stick with the same alternative is measured through lagged variables that link the current choice with the previous trip made with the same purpose, mode and time of day. However, the lagged...... effect of the previous trips is not constant but it depends on the individual propensity to undertake habitual trips which is captured by the individual specific latent variable. And the frequency of the trips in the previous week is used as an indicator of the habitual behavior. The model estimation...

  8. REGRESSION MODEL FOR RISK REPORTING IN FINANCIAL STATEMENTS OF ACCOUNTING SERVICES ENTITIES

    Directory of Open Access Journals (Sweden)

    Mirela NICHITA

    2015-06-01

    Full Text Available The purpose of financial reports is to provide useful information to users; the utility of information is defined through its qualitative characteristics (fundamental and enhancing). The financial crisis emphasized the limits of financial reporting, which was unable to warn investors about the risks they were facing. Due to current changes in the business environment, managers have been highly motivated to rethink and improve the risk governance philosophy, processes and methodologies. The lack of quality, timely data and adequate systems to capture, report and measure the right information across the organization is a fundamental challenge for implementing and sustaining all aspects of effective risk management. Starting with the 1980s, investors have been more interested in narratives (notes to the financial statements) than in the primary reports (financial position and performance). The research applies a regression model for the assessment of risk reporting by the professional (accounting and taxation) services for major companies from Romania during the period 2009 – 2013.

  9. Hydrodynamic modeling of urban flooding taking into account detailed data about city infrastructure

    Science.gov (United States)

    Belikov, Vitaly; Norin, Sergey; Aleksyuk, Andrey; Krylenko, Inna; Borisova, Natalya; Rumyantsev, Alexey

    2017-04-01

    Flood waves moving across urban areas have specific features. Linear infrastructure objects (such as embankments, roads and dams) can change the direction of flow or block the water movement; on the other hand, paved avenues and wide streets contribute to the concentration of flood waters. Buildings create additional resistance to the movement of water, which depends on the urban density and the type of construction; this effect cannot be completely described by Manning's resistance law. In addition, the part of the earth's surface occupied by buildings is excluded from the flooded area, which results in a substantial increase of the flooding depth relative to undeveloped areas, especially under unsteady flow conditions. An approach to the numerical simulation of urban flooding is proposed that consists in directly resolving all buildings and structures on the computational grid. This can be done almost fully automatically with modern software, and the real geometry of all infrastructure objects can be taken into account on the basis of highly detailed digital maps and satellite images. The calculations are based on the two-dimensional Saint-Venant equations solved on irregular adaptive computational meshes, which can contain millions of cells and take into account tens of thousands of buildings and other infrastructure objects. Flood maps obtained as a result of the modeling are the basis for damage and risk assessment in urban areas. The main advantages of the developed method are high-precision calculation, realistic modeling results and appropriate graphical display of the flood dynamics and dam-break wave propagation over urban areas. Verification of the method has been performed against experimental data and simulations of real events, including the catastrophic flooding of the city of Krymsk in 2012.

  10. Accounting for seasonal isotopic patterns of forest canopy intercepted precipitation in streamflow modeling

    Science.gov (United States)

    Stockinger, Michael P.; Lücke, Andreas; Vereecken, Harry; Bogena, Heye R.

    2017-12-01

    Forest canopy interception alters the isotopic tracer signal of precipitation, leading to significant isotopic differences between open precipitation (δOP) and throughfall (δTF). This has important consequences for the tracer-based modeling of streamwater transit times. Some studies have suggested using a simple static correction to δOP by uniformly increasing it, because δTF is rarely available for hydrological modeling. Here, we used data from a 38.5 ha spruce-forested headwater catchment where three years of δOP and δTF were available to develop a data-driven method that accounts for canopy effects on δOP. Changes in isotopic composition, defined as the difference δTF-δOP, varied seasonally with higher values during winter and lower values during summer. We used this pattern to derive a corrected δOP time series and analyzed the impact of using (1) δOP, (2) reference throughfall data (δTFref) and (3) the corrected δOP time series (δOPSine) in estimating the fraction of young water (Fyw), i.e., the percentage of streamflow younger than two to three months. We found that Fyw derived from δOPSine came closer to that derived from δTFref than Fyw derived from δOP did. Thus, a seasonally varying correction for δOP can be successfully used to infer δTF where it is not available and is superior to using a fixed correction factor. Seasonal isotopic enrichment patterns should be accounted for when estimating Fyw and, more generally, in catchment hydrology studies using other tracer methods, to reduce uncertainty.
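    A sketch of one way to build such a seasonally varying correction: fit an annual-period sinusoid to observed δTF-δOP differences and add it to δOP where throughfall is not measured. The functional form and the sample values are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def seasonal_offset(doy, amplitude, phase, mean_offset):
    """Annual-period sinusoid describing the throughfall enrichment dTF - dOP."""
    return mean_offset + amplitude * np.sin(2 * np.pi * doy / 365.25 + phase)

# day of year and observed dTF - dOP (permil); the values are illustrative
doy_obs  = np.array([15, 74, 135, 196, 258, 319])
diff_obs = np.array([1.4, 1.1, 0.5, 0.3, 0.7, 1.2])

params, _ = curve_fit(seasonal_offset, doy_obs, diff_obs, p0=[0.5, 0.0, 0.8])

def corrected_op(delta_op, doy):
    """dOPSine-style value: open precipitation shifted by the fitted seasonal offset."""
    return delta_op + seasonal_offset(doy, *params)

print(corrected_op(-8.2, 30))
```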

  11. Large proper motions in the Orion nebula

    International Nuclear Information System (INIS)

    Cudworth, K.M.; Stone, R.C.

    1977-01-01

    Several nebular features, as well as one faint star, with large proper motions were identified within the Orion nebula. The measured proper motions correspond to tangential velocities of up to approximately 70 km s^-1. One new probable variable star was also found.

  12. Ethnology and the Study of Proper Names.

    Science.gov (United States)

    Bean, Susan S.

    1980-01-01

    Discusses the importance of uncovering the universal features of proper names and relating them to different naming systems. Suggests that this viewpoint may lead to an appreciation of proper names as a sociolinguistic universal and a cultural variable, beyond the particulars on which most of the literature has focused. (MES)

  13. Economic Enpowerment of Nigerian women through proper ...

    African Journals Online (AJOL)

    This paper is an attempt to make a modest contribution on how to finance Nigerian women's economic empowerment programmes. It also proposes measures for the proper monitoring and channeling of those resources in order to achieve the targets they are meant for. The writer is of the view that if proper monitoring is done, ...

  14. Biological parametric mapping accounting for random regressors with regression calibration and model II regression.

    Science.gov (United States)

    Yang, Xue; Lauzon, Carolyn B; Crainiceanu, Ciprian; Caffo, Brian; Resnick, Susan M; Landman, Bennett A

    2012-09-01

    Massively univariate regression and inference in the form of statistical parametric mapping have transformed the way in which multi-dimensional imaging data are studied. In functional and structural neuroimaging, the de facto standard "design matrix"-based general linear regression model and its multi-level cousins have enabled investigation of the biological basis of the human brain. With modern study designs, it is possible to acquire multi-modal three-dimensional assessments of the same individuals, e.g., structural, functional and quantitative magnetic resonance imaging, alongside functional and ligand binding maps with positron emission tomography. Largely, current statistical methods in the imaging community assume that the regressors are non-random. For more realistic multi-parametric assessment (e.g., voxel-wise modeling), distributional consideration of all observations is appropriate. Herein, we discuss two unified regression and inference approaches, model II regression and regression calibration, for use in massively univariate inference with imaging data. These methods use the design matrix paradigm and account for both random and non-random imaging regressors. We characterize these methods in simulation and illustrate their use on an empirical dataset. Both methods have been made readily available as a toolbox plug-in for the SPM software. Copyright © 2012 Elsevier Inc. All rights reserved.
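    One of the two approaches named, model II regression, can be illustrated in its simplest form as a Deming regression with an assumed error-variance ratio; the SPM toolbox applies this voxel-wise, whereas the sketch below handles a single pair of noisy variables.

```python
import numpy as np

def deming(x, y, delta=1.0):
    """Model II (Deming) regression of y on x, treating both as noisy.
    delta is the assumed ratio of error variances var(err_y)/var(err_x);
    delta = 1 gives orthogonal regression."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = ((syy - delta * sxx)
             + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

rng = np.random.default_rng(1)
truth = rng.normal(size=200)
x = truth + 0.3 * rng.normal(size=200)   # noisy regressor (e.g., one imaging modality)
y = 2.0 * truth + 0.3 * rng.normal(size=200)
print(deming(x, y))                      # slope near 2, unlike attenuated ordinary least squares
```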

  15. Context-Specific Proportion Congruency Effects: An Episodic Learning Account and Computational Model.

    Science.gov (United States)

    Schmidt, James R

    2016-01-01

    In the Stroop task, participants identify the print color of color words. The congruency effect is the observation that response times and errors are increased when the word and color are incongruent (e.g., the word "red" in green ink) relative to when they are congruent (e.g., "red" in red). The proportion congruent (PC) effect is the finding that congruency effects are reduced when trials are mostly incongruent rather than mostly congruent. This PC effect can be context-specific. For instance, if trials are mostly incongruent when presented in one location and mostly congruent when presented in another location, the congruency effect is smaller for the former location. Typically, PC effects are interpreted in terms of strategic control of attention in response to conflict, termed conflict adaptation or conflict monitoring. In the present manuscript, however, an episodic learning account is presented for context-specific proportion congruent (CSPC) effects. In particular, it is argued that context-specific contingency learning can explain part of the effect, and context-specific rhythmic responding can explain the rest. Both contingency-based and temporal-based learning can parsimoniously be conceptualized within an episodic learning framework. An adaptation of the Parallel Episodic Processing model is presented. This model successfully simulates CSPC effects, both for contingency-biased and contingency-unbiased (transfer) items. The same fixed-parameter model can explain a range of other findings from the learning, timing, binding, practice, and attentional control domains.

  16. Modeling Lung Carcinogenesis in Radon-Exposed Miner Cohorts: Accounting for Missing Information on Smoking.

    Science.gov (United States)

    van Dillen, Teun; Dekkers, Fieke; Bijwaard, Harmen; Brüske, Irene; Wichmann, H-Erich; Kreuzer, Michaela; Grosche, Bernd

    2016-05-01

    Epidemiological miner cohort data used to estimate lung cancer risks related to occupational radon exposure often lack cohort-wide information on exposure to tobacco smoke, a potential confounder and important effect modifier. We have developed a method to project data on smoking habits from a case-control study onto an entire cohort by means of a Monte Carlo resampling technique. As a proof of principle, this method is tested on a subcohort of 35,084 former uranium miners employed at the WISMUT company (Germany), with 461 lung cancer deaths in the follow-up period 1955-1998. After applying the proposed imputation technique, a biologically-based carcinogenesis model is employed to analyze the cohort's lung cancer mortality data. A sensitivity analysis based on a set of 200 independent projections with subsequent model analyses yields narrow distributions of the free model parameters, indicating that parameter values are relatively stable and independent of individual projections. This technique thus offers a possibility to account for unknown smoking habits, enabling us to unravel risks related to radon, to smoking, and to the combination of both. © 2015 Society for Risk Analysis.
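    A sketch of the projection idea: for each Monte Carlo draw, sample every cohort member's smoking category from the empirical distribution of a matching stratum in the case-control data, then re-analyse each imputed data set. The column names ('birth_decade', 'smoking') and the stratification are placeholders, not the WISMUT variables.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

def impute_smoking(cohort, case_control, stratum="birth_decade", n_draws=200):
    """Project smoking habits from a case-control subset onto the full cohort.
    For each draw, every cohort member receives a smoking category sampled from
    the empirical distribution of his stratum in the case-control data.
    Returns a list of imputed cohort copies (one per Monte Carlo draw)."""
    probs = (case_control.groupby(stratum)["smoking"]
             .value_counts(normalize=True).unstack(fill_value=0.0))
    draws = []
    for _ in range(n_draws):
        imputed = cohort.copy()
        for s, p in probs.iterrows():
            mask = imputed[stratum] == s
            imputed.loc[mask, "smoking"] = rng.choice(
                p.index, size=mask.sum(), p=p.values)
        draws.append(imputed)
    return draws

# Each imputed data set would then be analysed with the carcinogenesis model,
# and the spread of the fitted parameters across draws gives the sensitivity.
```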

  17. The Models of Distance Forms of Learning in National Academy of Statistics, Accounting and Audit

    Directory of Open Access Journals (Sweden)

    L. V.

    2017-03-01

    Full Text Available Finding solutions to the problems faced by the Ukrainian education system requires an adequate organizing structure for the education system, enabling the transition to the principle of life-long education. The best option for this is a distance learning system (DLS), which leading Ukrainian universities regard as a high-performance information technology in modern education, envisaged by the National Informatization Program, in line with the goals of reforming higher education in Ukraine in the context of joining the European educational area. The experience of implementing the DLS "Prometheus" and Moodle and the main directions of distance learning development at the National Academy of Statistics, Accounting and Audit (NASAA) are analyzed and summarized. The emphasis is placed on the need to improve the skills of teachers through open distance courses and the gradual preparation of students for the learning process in the new conditions. The structure of distance courses for different forms of education (full-time, part-time, and blended) is built. The forms of blended learning (face-to-face driver, rotation model, flex model, etc.) are analyzed. A dynamic version of implementing blended learning models in NASAA using the DLS "Prometheus" and Moodle is presented. It is concluded that the experience of NASAA shows that the blended form of distance learning based on the Moodle platform is the most adequate to the requirements of Ukraine's development within the framework of European education.

  18. A common signal detection model accounts for both perception and discrimination of the watercolor effect.

    Science.gov (United States)

    Devinck, Frédéric; Knoblauch, Kenneth

    2012-03-21

    Establishing the relation between perception and discrimination is a fundamental objective in psychophysics, with the goal of characterizing the neural mechanisms mediating perception. Here, we show that a procedure for estimating a perceptual scale based on a signal detection model also predicts discrimination performance. We use a recently developed procedure, Maximum Likelihood Difference Scaling (MLDS), to measure the perceptual strength of a long-range, color, filling-in phenomenon, the Watercolor Effect (WCE), as a function of the luminance ratio between the two components of its generating contour. MLDS is based on an equal-variance, gaussian, signal detection model and yields a perceptual scale with interval properties. The strength of the fill-in percept increased 10-15 times the estimate of the internal noise level for a 3-fold increase in the luminance ratio. Each observer's estimated scale predicted discrimination performance in a subsequent paired-comparison task. A common signal detection model accounts for both the appearance and discrimination data. Since signal detection theory provides a common metric for relating discrimination performance and neural response, the results have implications for comparing perceptual and neural response functions.
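
    Under the equal-variance Gaussian signal detection model that underlies MLDS, a perceptual scale expressed in units of internal noise directly predicts paired-comparison discrimination performance via Phi(delta_psi / sqrt(2)). The sketch below illustrates that link; the scale values are hypothetical (chosen only to span the 10-15 noise-unit range reported in the abstract) and none of this is the authors' code.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical MLDS scale values (in units of internal noise SD) for five
# luminance ratios of the WCE-inducing contour; chosen only to span the
# 10-15 noise-unit range reported in the abstract.
luminance_ratio = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
psi = np.array([0.0, 3.5, 7.0, 10.5, 14.0])

def p_discriminate(psi_a, psi_b):
    """Probability of judging stimulus b as stronger than a under an
    equal-variance Gaussian model: the decision variable psi_b - psi_a
    plus noise has variance 2, hence Phi(delta_psi / sqrt(2))."""
    return norm.cdf((psi_b - psi_a) / np.sqrt(2.0))

# Predicted paired-comparison performance for adjacent points on the scale.
for a, b in zip(range(4), range(1, 5)):
    print(f"ratio {luminance_ratio[a]:.1f} vs {luminance_ratio[b]:.1f}: "
          f"P(correct) = {p_discriminate(psi[a], psi[b]):.3f}")
```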

  19. Accounting for selection bias in species distribution models: An econometric approach on forested trees based on structural modeling

    Science.gov (United States)

    Ay, Jean-Sauveur; Guillemot, Joannès; Martin-StPaul, Nicolas K.; Doyen, Luc; Leadley, Paul

    2015-04-01

    Species distribution models (SDMs) are widely used to study and predict the outcome of global change on species. In human-dominated ecosystems the presence of a given species is the result of both its ecological suitability and the human footprint on nature, such as land use choices. Land use choices may thus be responsible for a selection bias in the presence/absence data used in SDM calibration. We present a structural modelling approach (i.e. based on structural equation modelling) that accounts for this selection bias. The new structural species distribution model (SSDM) estimates simultaneously land use choices and species responses to bioclimatic variables. A land use equation based on an econometric model of landowner choices was joined to an equation of species response to bioclimatic variables. The SSDM allows the residuals of both equations to be dependent, taking into account the possibility of shared omitted variables and measurement errors. We provide a general description of the statistical theory and a set of applications to forested trees over France using databases of climate and forest inventory at different spatial resolutions (from 2 km to 8 km). We also compared the output of the SSDM with the outputs of a classical SDM in terms of bioclimatic response curves and potential distribution under the current climate. Depending on the species and the spatial resolution of the calibration dataset, the shapes of the bioclimatic response curves and the modelled species distribution maps differed markedly between the SSDM and classical SDMs. The magnitude and direction of these differences depended on the correlations between the errors from both equations and were highest for higher spatial resolutions. A first conclusion is that the use of classical SDMs can potentially lead to strong mis-estimation of the actual and future probability of presence modelled. Beyond this selection bias, the SSDM we propose represents a crucial step to account for economic constraints on tree
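
    The selection-bias mechanism at the heart of the SSDM can be illustrated by a toy simulation in which the errors of a land-use equation and of a species-response equation are correlated, so that calibrating an SDM on the land-use-filtered sample biases the estimated prevalence. The covariates, coefficients, and correlation value below are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Toy covariates: one bioclimatic variable and one land-rent variable.
climate = rng.normal(size=n)
rent = rng.normal(size=n)

# Correlated errors shared between the land-use and species-response equations.
rho = 0.6
errors = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
e_land, e_species = errors.T

# Landowner keeps the plot forested when the latent utility is positive.
forested = (0.5 - 1.0 * rent + e_land) > 0
# The species is present where its latent ecological suitability is positive.
present = (0.3 + 0.8 * climate + e_species) > 0

# A classical SDM calibrated only on forested plots sees a biased prevalence,
# because conditioning on the land-use choice also conditions on e_species.
print("true prevalence:              ", round(present.mean(), 3))
print("prevalence on forested sample:", round(present[forested].mean(), 3))
```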

  20. Accounting comparability and the accuracy of peer-based valuation models

    NARCIS (Netherlands)

    Young, S.; Zeng, Y.

    2015-01-01

    We examine the link between enhanced accounting comparability and the valuation performance of pricing multiples. Using the warranted multiple method proposed by Bhojraj and Lee (2002, Journal of Accounting Research), we demonstrate how enhanced accounting comparability leads to better peer-based

  1. The cyclicality of loan loss provisions under three different accounting models: the United Kingdom, Spain, and Brazil

    Directory of Open Access Journals (Sweden)

    Antônio Maria Henri Beyle de Araújo

    2017-11-01

    Full Text Available A controversy involving loan loss provisions in banks concerns their relationship with the business cycle. While the international accounting standard for recognizing provisions (the incurred loss model) would presumably be pro-cyclical, accentuating the effects of the current economic cycle, an alternative model, the expected loss model, has countercyclical characteristics, acting as a buffer against economic imbalances caused by expansionary or contractionary phases in the economy. In Brazil, a mixed accounting model exists, whose behavior is not known to be pro-cyclical or countercyclical. The aim of this research is to analyze the behavior of these accounting models in relation to the business cycle, using an econometric model consisting of financial and macroeconomic variables. The study allowed us to identify the impact of credit risk behavior, earnings management, capital management, Gross Domestic Product (GDP) behavior, and the behavior of the unemployment rate on provisions in countries that use different accounting models. Data from commercial banks in the United Kingdom (incurred loss), in Spain (expected loss), and in Brazil (mixed model) were used, covering the period from 2001 to 2012. Despite the accounting models of the three countries being formed by very different rules regarding possible effects on the business cycle, the results revealed a pro-cyclical behavior of provisions in each country, indicating that when GDP grows, provisions tend to fall and vice versa. The results also revealed other factors influencing the behavior of loan loss provisions, such as earnings management.
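
    The cyclicality test described above boils down to regressing loan loss provisions on macroeconomic and bank-level variables, where a negative GDP-growth coefficient indicates pro-cyclical behaviour. Below is a minimal sketch on synthetic data using statsmodels; the variable names, coefficients, and sample size are invented and the specification is far simpler than the panel model used in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400  # synthetic bank-year observations

gdp_growth = rng.normal(2.0, 2.0, n)        # %
unemployment = rng.normal(7.0, 1.5, n)      # %
earnings = rng.normal(1.0, 0.5, n)          # pre-provision earnings / assets, %

# Synthetic pro-cyclical provisions: they fall when GDP grows and rise with
# unemployment and with earnings (the smoothing/earnings-management channel).
llp = (0.8 - 0.10 * gdp_growth + 0.05 * unemployment + 0.15 * earnings
       + rng.normal(0.0, 0.2, n))

X = sm.add_constant(pd.DataFrame({"gdp_growth": gdp_growth,
                                  "unemployment": unemployment,
                                  "earnings": earnings}))
fit = sm.OLS(llp, X).fit()
print(fit.params.round(3))  # a negative gdp_growth coefficient signals pro-cyclicality
```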

  2. KINERJA PENGELOLAAN LIMBAH HOTEL PESERTA PROPER DAN NON PROPER DI KABUPATEN BADUNG, PROVINSI BALI [Waste Management Performance of PROPER and Non-PROPER Participant Hotels in Badung Regency, Bali Province]

    Directory of Open Access Journals (Sweden)

    Putri Nilakandi Perdanawati Pitoyo

    2016-07-01

    Full Text Available Bali tourism development can lead to positive and negative impacts that threaten environmental sustainability. This research evaluates hotel performance in waste management, covering the management of wastewater, emissions, hazardous waste, and solid waste, for hotels that participate in PROPER and those that do not (non-PROPER). The research uses a qualitative descriptive method. Not all non-PROPER hotels test their wastewater quality and chimney emission quality, keep an inventory of hazardous waste, or sort their solid waste. Wastewater discharge of the PROPER hotels ranged from 290.9 to 571.8 m3/day and of the non-PROPER hotels from 8.4 to 98.1 m3/day, with NH3 parameter values exceeding the quality standards. The quality of chimney emissions was still below the quality standard. The volume of hazardous waste of the PROPER hotels ranged from 66.1 to 181.9 kg/month and of the non-PROPER hotels from 5.003 to 103.42 kg/month. Hazardous waste from the PROPER hotels has been stored in hazardous waste temporary storage (TPS) facilities. The volume of solid waste of the PROPER hotels ranged from 342.34 to 684.54 kg/day and of the non-PROPER hotels from 4.83 to 181.51 kg/day. Neither the PROPER nor the non-PROPER hotels sort their solid waste. Performance in terms of wastewater, emission, hazardous waste, and solid waste management is better at the PROPER hotels than at the non-PROPER participants.

  3. Investigation of a new model accounting for rotors of finite tip-speed ratio in yaw or tilt

    DEFF Research Database (Denmark)

    Branlard, Emmanuel; Gaunaa, Mac; Machefaux, Ewan

    2014-01-01

    The main results from a recently developed vortex model are implemented into a Blade Element Momentum(BEM) code. This implementation accounts for the effect of finite tip-speed ratio, an effect which was not considered in standard BEM yaw-models. The model and its implementation are presented. Data...

  4. Modelling of a mecanum wheel taking into account the geometry of road rollers

    Science.gov (United States)

    Hryniewicz, P.; Gwiazda, A.; Banaś, W.; Sękala, A.; Foit, K.

    2017-08-01

    During process planning in a company, one of the basic factors associated with production costs is the operation time for particular technological jobs. The operation time consists of time units associated with the machining tasks of a workpiece as well as the time associated with loading, unloading, and the transport operations of this workpiece between machining stands. Full automation of manufacturing in industrial companies tends towards a maximal reduction in machine downtimes, thereby simultaneously decreasing fixed costs. The new construction of wheeled vehicles using Mecanum wheels reduces the transport time of materials and workpieces between machining stands. These vehicles are able to move simultaneously along two axes and thus can be positioned more rapidly relative to the machining stand. The Mecanum wheel construction implies placing free rollers around the wheel, mounted at an angle of 45°, which allow the vehicle to move not only along its own axis but also perpendicular to it. Improper selection of the rollers can cause unwanted vertical movement of the vehicle, which may make it difficult to position the vehicle in relation to the machining stand and create a need for stabilisation. Hence the proper design of the free rollers is essential in designing the whole Mecanum wheel construction, as it allows the disadvantageous and unwanted vertical vibrations of a vehicle with these wheels to be avoided. In the article, the process of modelling the free rollers in order to obtain the desired unchanging, horizontal trajectory of the vehicle is presented. This shape depends on the desired diameter of the whole Mecanum wheel, together with the road rollers, and on the width of the drive wheel. Another factor related to the curvature of the trajectory is the length of the road roller and the decrease of its diameter with distance from its centre. The additional factor, limiting construction of
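
    Although the paper focuses on roller geometry, the omnidirectional motion that motivates it is commonly summarized by the standard inverse kinematics of a four-wheel Mecanum platform, sketched below. The sign pattern depends on the roller orientation and wheel numbering convention, and the wheel radius and half-spacings used here are placeholder values, not taken from the article.

```python
import numpy as np

def mecanum_wheel_speeds(vx, vy, wz, r=0.05, lx=0.20, ly=0.15):
    """Inverse kinematics of a four-Mecanum-wheel platform in the common
    "X" roller arrangement.

    vx, vy : desired body-frame linear velocities [m/s]
    wz     : desired yaw rate [rad/s]
    r      : wheel radius [m]; lx, ly : half wheelbase and half track [m]
    Returns wheel angular speeds (front-left, front-right, rear-left, rear-right).
    The sign pattern assumes 45-degree rollers; other conventions flip signs.
    """
    k = lx + ly
    w_fl = (vx - vy - k * wz) / r
    w_fr = (vx + vy + k * wz) / r
    w_rl = (vx + vy - k * wz) / r
    w_rr = (vx - vy + k * wz) / r
    return np.array([w_fl, w_fr, w_rl, w_rr])

# Pure sideways translation: wheels spin in an alternating pattern.
print(mecanum_wheel_speeds(vx=0.0, vy=0.3, wz=0.0))
```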

  5. A hill-type muscle model expansion accounting for effects of varying transverse muscle load.

    Science.gov (United States)

    Siebert, Tobias; Stutzig, Norman; Rode, Christian

    2018-01-03

    Recent studies demonstrated that uniaxial transverse loading (F_G) of a rat gastrocnemius medialis muscle resulted in a considerable reduction of maximum isometric muscle force (ΔF_im). A hill-type muscle model assuming an identical gearing G between both ΔF_im and F_G as well as lifting height of the load (Δh) and longitudinal muscle shortening (Δl_CC) reproduced experimental data for a single load. Here we tested if this model is able to reproduce experimental changes in ΔF_im and Δh for increasing transverse loads (0.64 N, 1.13 N, 1.62 N, 2.11 N, 2.60 N). Three different gearing ratios were tested: (I) constant G_c representing the idea of a muscle specific gearing parameter (e.g. predefined by the muscle geometry), (II) G_exp determined in experiments with varying transverse load, and (III) G_f that reproduced experimental ΔF_im for each transverse load. Simulations using G_c overestimated ΔF_im (up to 59%) and Δh (up to 136%) for increasing load. Although the model assumption (equal G for forces and length changes) held for the three lower loads using G_exp and G_f, simulations resulted in underestimation of ΔF_im by 38% and overestimation of Δh by 58% for the largest load, respectively. To simultaneously reproduce experimental ΔF_im and Δh for the two larger loads, it was necessary to reduce F_im by 1.9% and 4.6%, respectively. The model seems applicable to account for effects of muscle deformation within a range of transverse loading when using a linear load-dependent function for G. Copyright © 2017 Elsevier Ltd. All rights reserved.
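
    The gearing assumption itself is compact enough to state directly: the same ratio G maps the transverse load to the loss of isometric force (ΔF_im = G·F_G) and the contractile-component shortening to the lifting height (Δh = G·Δl_CC). The sketch below evaluates these two relations for the transverse loads listed in the abstract; the gearing value and the shortening value are placeholders, not fitted parameters from the study.

```python
import numpy as np

def gearing_prediction(f_g, delta_l_cc, gearing):
    """Gearing assumption of the expanded Hill-type model: one ratio G maps
    the transverse load F_G to the reduction of maximum isometric force, and
    the contractile-component shortening to the lifting height of the load."""
    delta_f_im = gearing * f_g          # predicted loss of isometric force [N]
    delta_h = gearing * delta_l_cc      # predicted lifting height of the load [mm]
    return delta_f_im, delta_h

transverse_loads = np.array([0.64, 1.13, 1.62, 2.11, 2.60])  # N, from the abstract
G = 0.3            # placeholder gearing ratio, not a value fitted in the study
dF, dh = gearing_prediction(transverse_loads, delta_l_cc=2.0, gearing=G)  # 2.0 mm assumed

print("predicted force loss [N]  :", dF.round(2))
print("predicted lift height [mm]:", round(dh, 2))
```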

  6. A mass-density model can account for the size-weight illusion

    Science.gov (United States)

    Bergmann Tiest, Wouter M.; Drewing, Knut

    2018-01-01

    When judging the heaviness of two objects with equal mass, people perceive the smaller and denser of the two as being heavier. Despite the large number of theories, covering bottom-up and top-down approaches, none of them can fully account for all aspects of this size-weight illusion and thus for human heaviness perception. Here we propose a new maximum-likelihood estimation model which describes the illusion as the weighted average of two heaviness estimates with correlated noise: One estimate derived from the object’s mass, and the other from the object’s density, with estimates’ weights based on their relative reliabilities. While information about mass can directly be perceived, information about density will in some cases first have to be derived from mass and volume. However, according to our model at the crucial perceptual level, heaviness judgments will be biased by the objects’ density, not by its size. In two magnitude estimation experiments, we tested model predictions for the visual and the haptic size-weight illusion. Participants lifted objects which varied in mass and density. We additionally varied the reliability of the density estimate by varying the quality of either visual (Experiment 1) or haptic (Experiment 2) volume information. As predicted, with increasing quality of volume information, heaviness judgments were increasingly biased towards the object’s density: Objects of the same density were perceived as more similar and big objects were perceived as increasingly lighter than small (denser) objects of the same mass. This perceived difference increased with an increasing difference in density. In an additional two-alternative forced choice heaviness experiment, we replicated that the illusion strength increased with the quality of volume information (Experiment 3). Overall, the results highly corroborate our model, which seems promising as a starting point for a unifying framework for the size-weight illusion and human heaviness
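
    The weighted-average rule with correlated noise has a standard closed form: the weight on the mass estimate is w_m = (s_d^2 - rho*s_m*s_d) / (s_m^2 + s_d^2 - 2*rho*s_m*s_d), and the density weight is 1 - w_m. The sketch below evaluates this combination for a few assumed noise levels to show how better volume information (a more reliable density estimate) pulls heaviness judgments toward density; all numbers are invented and this is not the authors' model code.

```python
import numpy as np

def combine_correlated(est_mass, est_density, sd_mass, sd_density, rho):
    """Minimum-variance (maximum-likelihood) linear combination of two
    correlated, unbiased heaviness estimates.  The mass weight is
    w_m = (s_d^2 - rho*s_m*s_d) / (s_m^2 + s_d^2 - 2*rho*s_m*s_d)."""
    var_m, var_d = sd_mass**2, sd_density**2
    cov = rho * sd_mass * sd_density
    w_m = (var_d - cov) / (var_m + var_d - 2.0 * cov)
    w_d = 1.0 - w_m
    return w_m * est_mass + w_d * est_density, (w_m, w_d)

# Same mass estimate, higher density estimate (the smaller, denser object).
# Better volume information -> lower density noise -> larger density weight.
for sd_density in (2.0, 1.0, 0.5):          # assumed, decreasing noise levels
    heaviness, weights = combine_correlated(est_mass=10.0, est_density=12.0,
                                            sd_mass=1.0, sd_density=sd_density,
                                            rho=0.3)
    print(f"sd_density={sd_density}: heaviness={heaviness:.2f}, "
          f"weights (mass, density)={np.round(weights, 2)}")
```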

  7. Modeling the dynamic behavior of railway track taking into account the occurrence of defects in the system wheel-rail

    OpenAIRE

    Loktev Alexey; Sychev Vyacheslav; Gluzberg Boris; Gridasova Ekaterina

    2017-01-01

    This paper investigates the influence of wheel defects on the development of rail defects up to a state where prompt rail replacement becomes necessary, taking into account different models of the dynamic contact between a wheel and a rail: in particular, the quasistatic Hertz model, the linear elastic model and the elastoplastic Aleksandrov-Kadomtsev model. Based on the model of the wheel-rail contact, the maximum stresses arising in the rail are determined in the presence of wheel de...
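
    As a simplified illustration of the quasistatic Hertz ingredient mentioned above, the classical sphere-on-plane solution below gives the contact radius and peak pressure in closed form. Real wheel-rail contact involves two principal curvatures on each body (and the paper's elastoplastic Aleksandrov-Kadomtsev model goes further still), so the numbers are only order-of-magnitude placeholders.

```python
import numpy as np

def hertz_sphere_on_plane(F, R, E1, nu1, E2, nu2):
    """Classical Hertz contact of an elastic sphere (radius R) on a flat:
    returns the contact radius a and the peak contact pressure p0.  A crude
    stand-in for wheel-rail contact, which involves two principal curvatures
    on each body."""
    E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)   # effective modulus [Pa]
    a = (3.0 * F * R / (4.0 * E_star)) ** (1.0 / 3.0)        # contact radius [m]
    p0 = 3.0 * F / (2.0 * np.pi * a**2)                      # peak pressure [Pa]
    return a, p0

# Rough illustrative numbers: 100 kN wheel load, 0.46 m wheel radius, steel on steel.
a, p0 = hertz_sphere_on_plane(F=1.0e5, R=0.46, E1=210e9, nu1=0.3, E2=210e9, nu2=0.3)
print(f"contact radius ~ {a * 1e3:.1f} mm, peak pressure ~ {p0 / 1e6:.0f} MPa")
```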

  8. Historical Account to the State of the Art in Debris Flow Modeling

    Science.gov (United States)

    Pudasaini, Shiva P.

    2013-04-01

    In this contribution, I present a historical account of debris flow modelling leading to the state of the art in simulations and applications. A generalized two-phase model is presented that unifies existing avalanche and debris flow theories. The new model (Pudasaini, 2012) covers both the single-phase and two-phase scenarios and includes many essential and observable physical phenomena. In this model, the solid-phase stress is closed by Mohr-Coulomb plasticity, while the fluid stress is modeled as a non-Newtonian viscous stress that is enhanced by the solid-volume-fraction gradient. A generalized interfacial momentum transfer includes viscous drag, buoyancy and virtual mass forces, and a new generalized drag force is introduced to cover both solid-like and fluid-like drags. Strong couplings between solid and fluid momentum transfer are observed. The two-phase model is further extended to describe the dynamics of rock-ice avalanches with new mechanical models. This model explains dynamic strength weakening and includes internal fluidization, basal lubrication, and exchanges of mass and momentum. The advantages of the two-phase model over classical (effectively single-phase) models are discussed. Advection and diffusion of the fluid through the solid are associated with non-linear fluxes. Several exact solutions are constructed, including the non-linear advection-diffusion of fluid, kinematic waves of debris flow front and deposition, phase-wave speeds, and velocity distribution through the flow depth and through the channel length. The new model is employed to study two-phase subaerial and submarine debris flows, the tsunami generated by the debris impact at lakes/oceans, and rock-ice avalanches. Simulation results show that buoyancy enhances flow mobility. The virtual mass force alters flow dynamics by increasing the kinetic energy of the fluid. Newtonian viscous stress substantially reduces flow deformation, whereas non-Newtonian viscous stress may change the

  9. ACCOUNTING HARMONIZATION AND HISTORICAL COST ACCOUNTING

    OpenAIRE

    Valentin Gabriel CRISTEA

    2017-01-01

    There is a huge interest in accounting harmonization and historical cost accounting, and in what they offer us. In this article, different valuation models are discussed. Although one notices the movement from historical cost accounting to fair value accounting, each one has its advantages.

  10. ACCOUNTING HARMONIZATION AND HISTORICAL COST ACCOUNTING

    Directory of Open Access Journals (Sweden)

    Valentin Gabriel CRISTEA

    2017-05-01

    Full Text Available There is a huge interest in accounting harmonization and historical cost accounting, and in what they offer us. In this article, different valuation models are discussed. Although one notices the movement from historical cost accounting to fair value accounting, each one has its advantages.

  11. Development of the Mathematical Model of Diesel Fuel Catalytic Dewaxing Process Taking into Account Factors of Nonstationarity

    Directory of Open Access Journals (Sweden)

    Frantsina Evgeniya

    2016-01-01

    Full Text Available The paper describes the results of mathematical modelling of the diesel fuel catalytic dewaxing process, performed taking into account the factors of process nonstationarity driven by changes in process technological parameters, feedstock composition and catalyst deactivation. The error in the calculation of hydrocarbon contents with the developed model does not exceed 1.6 wt.%. This makes it possible to apply the model to optimization and forecasting problems occurring in catalytic systems under industrial conditions. Model calculations showed that the temperature in the dewaxing reactor without catalyst deactivation would be 19 °C lower than the actual temperature, and that the degree of catalyst deactivation amounts to 32%.

  12. Accounting for spatial correlation errors in the assimilation of GRACE into hydrological models through localization

    Science.gov (United States)

    Khaki, M.; Schumacher, M.; Forootan, E.; Kuhn, M.; Awange, J. L.; van Dijk, A. I. J. M.

    2017-10-01

    Assimilation of terrestrial water storage (TWS) information from the Gravity Recovery And Climate Experiment (GRACE) satellite mission can provide significant improvements in hydrological modelling. However, the rather coarse spatial resolution of GRACE TWS and its spatially correlated errors pose considerable challenges for achieving realistic assimilation results. Consequently, successful data assimilation depends on rigorous modelling of the full error covariance matrix of the GRACE TWS estimates, as well as realistic error behavior for hydrological model simulations. In this study, we assess the application of local analysis (LA) to maximize the contribution of GRACE TWS in hydrological data assimilation. For this, we assimilate GRACE TWS into the World-Wide Water Resources Assessment system (W3RA) over the Australian continent while applying LA and accounting for existing spatial correlations using the full error covariance matrix. GRACE TWS data are applied with different spatial resolutions including 1° to 5° grids, as well as basin averages. The ensemble-based sequential filtering technique of the Square Root Analysis (SQRA) is applied to assimilate TWS data into W3RA. For each spatial scale, the performance of the data assimilation is assessed through comparison with independent in-situ groundwater and soil moisture observations. Overall, the results demonstrate that LA is able to stabilize the inversion process (within the implementation of the SQRA filter), leading to smaller errors for all spatial scales considered, with an average RMSE improvement of 54% (e.g., 52.23 mm down to 26.80 mm) for all the cases with respect to groundwater in-situ measurements. Validating the assimilated results with groundwater observations indicates that LA leads to 13% better (in terms of RMSE) assimilation results compared to the cases with Gaussian error assumptions. This highlights the great potential of LA and the use of the full error covariance matrix of GRACE TWS
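
    The core of local analysis is to suppress spurious long-range ensemble covariances before the update step. The sketch below applies a distance-based taper (a Gaussian here, for brevity; operational EnKF-type systems usually use the Gaspari-Cohn function) to a synthetic ensemble covariance via a Schur product. The grid spacing, ensemble size, and localization length are invented, and the sketch is not part of the W3RA/SQRA implementation.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic 1-D "TWS" state: 50 grid cells, 30 ensemble members.
n_cells, n_ens = 50, 30
ensemble = rng.normal(size=(n_cells, n_ens))
P = np.cov(ensemble)                        # raw ensemble covariance (sampling noise)

# Distance-based localization matrix.  A Gaussian taper keeps the sketch short;
# the Gaspari-Cohn function is the usual operational choice.
x = np.arange(n_cells) * 100.0              # cell centres every 100 km (assumed)
dist = np.abs(x[:, None] - x[None, :])
L = np.exp(-0.5 * (dist / 500.0) ** 2)      # 500 km localization length scale (assumed)

P_localized = L * P                         # Schur (element-wise) product

print("max |covariance| beyond 2000 km, raw      :",
      round(np.abs(P[dist > 2000]).max(), 3))
print("max |covariance| beyond 2000 km, localized:",
      round(np.abs(P_localized[dist > 2000]).max(), 3))
```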

  13. Short-run analysis of fiscal policy and the current account in a finite horizon model

    OpenAIRE

    Heng-fu Zou

    1995-01-01

    This paper utilizes a technique developed by Judd to quantify the short-run effects of fiscal policies and income shocks on the current account in a small open economy. It is found that: (1) a future increase in government spending improves the short-run current account; (2) a future tax increase worsens the short-run current account; (3) a present increase in the government spending worsens the short-run current account dollar by dollar, while a present increase in the income improves the cu...

  14. Computation of Asteroid Proper Elements: Recent Advances

    Science.gov (United States)

    Knežević, Z.

    2017-12-01

    The recent advances in computation of asteroid proper elements are briefly reviewed. Although not representing real breakthroughs in computation and stability assessment of proper elements, these advances can still be considered as important improvements offering solutions to some practical problems encountered in the past. The problem of getting unrealistic values of perihelion frequency for very low eccentricity orbits is solved by computing frequencies using the frequency-modified Fourier transform. The synthetic resonant proper elements adjusted to a given secular resonance helped to prove the existence of Astraea asteroid family. The preliminary assessment of stability with time of proper elements computed by means of the analytical theory provides a good indication of their poorer performance with respect to their synthetic counterparts, and advocates in favor of ceasing their regular maintenance; the final decision should, however, be taken on the basis of more comprehensive and reliable direct estimate of their individual and sample average deviations from constancy.

  15. Computation of asteroid proper elements: Recent advances

    Directory of Open Access Journals (Sweden)

    Knežević Z.

    2017-01-01

    Full Text Available The recent advances in computation of asteroid proper elements are briefly reviewed. Although not representing real breakthroughs in computation and stability assessment of proper elements, these advances can still be considered as important improvements offering solutions to some practical problems encountered in the past. The problem of getting unrealistic values of perihelion frequency for very low eccentricity orbits is solved by computing frequencies using the frequency-modified Fourier transform. The synthetic resonant proper elements adjusted to a given secular resonance helped to prove the existence of Astraea asteroid family. The preliminary assessment of stability with time of proper elements computed by means of the analytical theory provides a good indication of their poorer performance with respect to their synthetic counterparts, and advocates in favor of ceasing their regular maintenance; the final decision should, however, be taken on the basis of more comprehensive and reliable direct estimate of their individual and sample average deviations from constancy.

  16. ASTEROID PROPER ELEMENTS V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — Proper elements of asteroids are derived from the osculating orbital elements by correcting for the perturbations of the major planets to arrive at elements which...

  17. A near-real-time material accountancy model and its preliminary demonstration in the Tokai reprocessing plant

    International Nuclear Information System (INIS)

    Ikawa, K.; Ihara, H.; Nishimura, H.; Tsutsumi, M.; Sawahata, T.

    1983-01-01

    The study of a near-real-time (n.r.t.) material accountancy system as applied to small or medium-sized spent fuel reprocessing facilities has been carried out since 1978 under the TASTEX programme. In this study, a model of the n.r.t. accountancy system, called the ten-day-detection-time model, was developed and demonstrated in the actual operating plant. The programme was closed in May 1981, but the study has been extended. The effectiveness of the proposed n.r.t. accountancy model was evaluated by means of simulation techniques. The results showed that weekly material balances covering the entire process MBA could provide sufficient information to satisfy the IAEA guidelines for small or medium-sized facilities. The applicability of the model to the actual plant has been evaluated by a series of field tests which covered four campaigns. In addition to the material accountancy data, many valuable operational data with regard to additional locations for an in-process inventory, the time needed for an in-process inventory, etc., have been obtained. A CUMUF (cumulative MUF) chart of the resulting MUF data in the C-1 and C-2 campaigns clearly showed that there had been a measurement bias across the process MBA. This chart gave a dramatic picture of the power of the n.r.t. accountancy concept by showing the nature of this bias, which was not clearly shown in the conventional material accountancy data. (author)
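
    The balances behind a CUMUF chart reduce to a simple bookkeeping identity evaluated once per balance period: MUF = (beginning inventory + receipts) - (ending inventory + shipments), with CUMUF its running sum; a persistent drift in CUMUF is the signature of a measurement bias like the one described above. The weekly figures in the sketch below are made up purely to show the mechanics.

```python
import numpy as np

def muf(begin_inv, receipts, end_inv, shipments):
    """Material Unaccounted For over one balance period:
    MUF = (beginning inventory + receipts) - (ending inventory + shipments)."""
    return (begin_inv + receipts) - (end_inv + shipments)

# Made-up weekly balance data for an illustrative campaign (kg of plutonium).
begin_inv = np.array([10.00, 10.40, 10.10, 10.60, 10.20])
receipts  = np.array([ 5.00,  4.80,  5.20,  4.90,  5.10])
end_inv   = np.array([10.40, 10.10, 10.60, 10.20, 10.30])
shipments = np.array([ 4.55,  5.05,  4.65,  5.25,  4.95])

weekly_muf = muf(begin_inv, receipts, end_inv, shipments)
cumuf = np.cumsum(weekly_muf)
print("weekly MUF:", weekly_muf.round(3))
print("CUMUF     :", cumuf.round(3))   # a steady drift suggests a measurement bias
```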

  18. Development of accounting quality management system

    Directory of Open Access Journals (Sweden)

    Plakhtii T.F.

    2017-08-01

    Full Text Available Accounting organization, as one of the types of practical activity at the enterprise, involves organizing the implementation of various accounting procedures to ensure that the needs of the users of accounting information are met. Therefore, to improve its quality, an owner should use tools, methods and procedures that improve the quality of implementation of accounting methods and technology. The necessity of using a quality management system for the improvement of accounting organization at the enterprise is substantiated. A system of accounting quality management is developed and grounded in the context of ISO 9001:2015, including such processes as the accounting system, leadership, planning, and evaluation. On the basis of the specification and justification of a set of universal requirements (content requirements, formal requirements), a model of the environment of demands for high-quality organization of the computerized accounting system is developed, which improves the process of preparing high-quality financial statements. In order to improve the system of accounting quality management and to justify the main objectives of its further development, namely the elimination of unnecessary characteristics of accounting information, the differences between the current level of accounting information quality and its perfect level are considered, as well as the meeting of new needs of users of accounting information that have not yet been satisfied. The ways of demonstrating leadership in the system of accounting quality management by accounting subjects at the enterprise are substantiated. The relationship between the current level of accounting information quality and its perfect level is considered. The possible types of measures aimed at improving the system of accounting quality management are identified. The paper grounds the need to include the principle of proper management in the current set of accounting

  19. On the Determination of Proper Time

    OpenAIRE

    Hurl, Bing; Zhang, Zhi-Yong; Wang, Hai-Dong

    1998-01-01

    Through the analysis of the definition of the duration of proper time of a particle given by the length of its world line, we show that there is no transitivity of the coordinate time function derived from the definition, so there exists an ambiguity in the determination of the duration of the proper time for the particle. Its physical consequence is illustrated with quantum measurement effect.

  20. A mathematical multiscale model of bone remodeling, accounting for pore space-specific mechanosensation.

    Science.gov (United States)

    Pastrama, Maria-Ioana; Scheiner, Stefan; Pivonka, Peter; Hellmich, Christian

    2018-02-01

    While bone tissue is a hierarchically organized material, mathematical formulations of bone remodeling are often defined on the level of a millimeter-sized representative volume element (RVE), "smeared" over all types of bone microstructures seen at lower observation scales. Thus, there is no explicit consideration of the fact that the biological cells and biochemical factors driving bone remodeling are actually located in differently sized pore spaces: active osteoblasts and osteoclasts can be found in the vascular pores, whereas the lacunar pores host osteocytes - bone cells originating from former osteoblasts which were then "buried" in newly deposited extracellular bone matrix. We here propose a mathematical description which considers size and shape of the pore spaces where the biological and biochemical events take place. In particular, a previously published systems biology formulation, accounting for biochemical regulatory mechanisms such as the RANK-RANKL-OPG pathway, is cast into a multiscale framework coupled to a poromicromechanical model. The latter gives access to the vascular and lacunar pore pressures arising from macroscopic loading. Extensive experimental data on the biological consequences of this loading strongly suggest that the aforementioned pore pressures, together with the loading frequency, are essential drivers of bone remodeling. The novel approach presented here allows for satisfactory simulation of the evolution of bone tissue under various loading conditions, and for different species, including scenarios such as mechanical dis- and overuse of murine and human bone, or in osteocyte-free bone. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Modeling Occupancy of Hosts by Mistletoe Seeds after Accounting for Imperfect Detectability

    Science.gov (United States)

    Fadini, Rodrigo F.; Cintra, Renato

    2015-01-01

    The detection of an organism in a given site is widely used as a state variable in many metapopulation and epidemiological studies. However, failure to detect the species does not necessarily mean that it is absent. Assessing detectability is important for occupancy (presence-absence) surveys, and identifying the factors reducing detectability may help improve survey precision and efficiency. A method was used to estimate the occupancy status of host trees colonized by mistletoe seeds of Psittacanthus plagiophyllus as a function of host covariates: host size and presence of mistletoe infections on the same or on the nearest neighboring host (the cashew tree Anacardium occidentale). The technique also evaluated the effect of taking detectability into account for estimating host occupancy by mistletoe seeds. Individual host trees were surveyed for presence of mistletoe seeds with the aid of two or three observers to estimate detectability and occupancy. Detectability was, on average, 17% higher in focal-host trees with infected neighbors, while it decreased by about 23 to 50% from the smallest to the largest hosts. The presence of mistletoe plants in the sample tree had a negligible effect on detectability. Failure to detect hosts as occupied decreased occupancy by 2.5% on average, with a maximum of 10% for large and isolated hosts. The method presented in this study has potential for use with metapopulation studies of mistletoes, especially those focusing on the seed stage, but also for improving the accuracy of the occupancy model estimates often used for metapopulation dynamics of tree-dwelling plants in general. PMID:25973754
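
    The separation of occupancy from detectability rests on a simple likelihood: a site with at least one detection contributes psi * p^d * (1-p)^(K-d), while a site with none contributes psi * (1-p)^K + (1-psi). The sketch below fits this constant-psi, constant-p version by maximum likelihood on simulated data; it omits the host-size and neighbour covariates used in the study, and all parameter values are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(3)

# Simulate 200 host trees surveyed by 3 observers; true psi = 0.6, true p = 0.4.
n_sites, n_visits, psi_true, p_true = 200, 3, 0.6, 0.4
occupied = rng.random(n_sites) < psi_true
detections = (rng.random((n_sites, n_visits)) < p_true) & occupied[:, None]

def neg_log_lik(params, y):
    psi, p = expit(params)                   # keep both parameters in (0, 1)
    d = y.sum(axis=1)                        # detections per site
    k = y.shape[1]
    lik_detected = psi * p**d * (1 - p)**(k - d)
    lik_never = psi * (1 - p)**k + (1 - psi) # occupied-but-missed, or truly absent
    site_lik = np.where(d > 0, lik_detected, lik_never)
    return -np.sum(np.log(site_lik))

res = minimize(neg_log_lik, x0=[0.0, 0.0], args=(detections,), method="Nelder-Mead")
psi_hat, p_hat = expit(res.x)
print(f"naive occupancy = {detections.any(axis=1).mean():.2f}, "
      f"estimated psi = {psi_hat:.2f}, estimated p = {p_hat:.2f}")
```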

  2. Associative account of self-cognition: extended forward model and multi-layer structure

    Directory of Open Access Journals (Sweden)

    Motoaki eSugiura

    2013-08-01

    Full Text Available The neural correlates of self identified by neuroimaging studies differ depending on which aspects of self are addressed. Here, three categories of self are proposed based on neuroimaging findings and an evaluation of the likely underlying cognitive processes. The physical self, representing self-agency of action, body ownership, and bodily self-recognition, is supported by the sensory and motor association cortices located primarily in the right hemisphere. The interpersonal self, representing the attention or intentions of others directed at the self, is supported by several amodal association cortices in the dorsomedial frontal and lateral posterior cortices. The social self, representing the self as a collection of context-dependent social values, is supported by the ventral aspect of the medial prefrontal cortex and the posterior cingulate cortex. Despite differences in the underlying cognitive processes and neural substrates, all three categories of self are likely to share the computational characteristics of the forward model, which is underpinned by internal schema or learned associations between one’s behavioral output and the consequential input. Additionally, these three categories exist within a hierarchical layer structure based on developmental processes that updates the schema through the attribution of prediction error. In this account, most of the association cortices critically contribute to some aspect of the self through associative learning while the primary regions involved shift from the lateral to the medial cortices in a sequence from the physical to the interpersonal to the social self.

  3. Taking into account hydrological modelling uncertainty in Mediterranean flash-floods forecasting

    Science.gov (United States)

    Edouard, Simon; Vincendon, Béatrice; Ducrocq, Véronique

    2015-04-01

    Mediterranean intense weather events often lead to devastating flash-floods (FF). Increasing the lead time of FF forecasts would make it possible to better anticipate their catastrophic consequences. These events are one part of the Mediterranean hydrological cycle. HyMeX (HYdrological cycle in the Mediterranean EXperiment) aims at a better understanding and quantification of the hydrological cycle and related processes in the Mediterranean. In order to gather data, measurement campaigns were conducted. The first special observing period (SOP1) of these campaigns served as a test-bed for a real-time hydrological ensemble prediction system (HEPS) dedicated to FF forecasting. It produced an ensemble of quantitative discharge forecasts (QDF) using the ISBA-TOP system. ISBA-TOP is a coupling between the surface scheme ISBA and a version of TOPMODEL dedicated to Mediterranean fast-responding rivers. ISBA-TOP was driven with several quantitative precipitation forecast (QPF) ensembles based on the AROME atmospheric convection-permitting model. This made it possible to take into account the uncertainty that affects QPF and that propagates up to the QDF. This uncertainty is major for discharge forecasting, especially in the case of Mediterranean flash-floods. But other sources of uncertainty need to be sampled in HEPS systems. One of them is inherent to the hydrological modelling. The ISBA-TOP coupled system has been improved since the initial version that was used, for instance, during HyMeX SOP1. The initial ISBA-TOP consisted of coupling a TOPMODEL approach with ISBA-3L, which represented the soil stratification with 3 layers. The new version consists of coupling the same TOPMODEL approach with a version of ISBA where more than ten layers describe the soil vertical

  4. Design of a Competency-Based Assessment Model in the Field of Accounting

    Science.gov (United States)

    Ciudad-Gómez, Adelaida; Valverde-Berrocoso, Jesús

    2012-01-01

    This paper presents the phases involved in the design of a methodology to contribute both to the acquisition of competencies and to their assessment in the field of Financial Accounting, within the European Higher Education Area (EHEA) framework, which we call MANagement of COMpetence in the areas of Accounting (MANCOMA). Having selected and…

  5. Cost accounting at GKSS

    International Nuclear Information System (INIS)

    Hinz, R.

    1979-01-01

    The GKSS has a cost accounting system comprising cost type, cost centre and cost unit accounting, which permits comprehensive and detailed supervision of the accrual of costs and the use of funds, makes price setting for outside orders possible, and provides the necessary data for decision-making and planning. It fulfills the requirements for an orderly accounting system: a proper demarcation and transition exists between the financial accounts department and cost accounting, costs are accounted for fully, only on the basis of vouchers and only once, evaluation and distribution are unified, and the principle of causation is observed. Two employees are engaged in cost and services accounting. Although we strive to make adaptations as swiftly as possible, and constantly to adopt refinements and supplementary processes to improve the system, this can only occur within the scope of, and with the exactitude necessary for, the required information. (author)

  6. Pharmacokinetic Modeling of Manganese III. Physiological Approaches Accounting for Background and Tracer Kinetics

    Energy Technology Data Exchange (ETDEWEB)

    Teeguarden, Justin G.; Gearhart, Jeffrey; Clewell, III, H. J.; Covington, Tammie R.; Nong, Andy; Anderson, Melvin E.

    2007-01-01

    assessments (Dixit et al., 2003). With most exogenous compounds, there is often no background exposure and body concentrations are not under active control from homeostatic processes as occurs with essential nutrients. Any complete Mn PBPK model would include the homeostatic regulation as an essential nutritional element and the additional exposure routes by inhalation. Two companion papers discuss the kinetic complexities of the quantitative dose-dependent alterations in hepatic and intestinal processes that control uptake and elimination of Mn (Teeguarden et al., 2006a, b). Radioactive 54Mn has been used to investigate the behavior of the more common 55Mn isotope in the body because the distribution and elimination of tracer doses reflects the overall distributional characteristics of Mn. In this paper, we take the first steps in developing a multi-route PBPK model for Mn. Here we develop a PBPK model to account for tissue concentrations and tracer kinetics of Mn under normal dietary intake. This model for normal levels of Mn will serve as the starting point for more complete model descriptions that include dose-dependencies in both oral uptake and biliary excretion. Material and Methods: Experimental Data. Two studies using the 54Mn tracer were employed in model development (Furchner et al. 1966; Wieczorek and Oberdorster 1989). In Furchner et al. (1966), male Sprague-Dawley rats received an ip injection of carrier-free 54MnCl2 while maintained on standard rodent feed containing ~ 45 ppm Mn. Tissue radioactivity of 54Mn was measured by liquid scintillation counting between post-injection days 1 and 89 and reported as percent of administered dose per kg tissue. 54Mn time courses were reported for liver, kidney, bone, brain, muscle, blood, lung and whole body. Because ip uptake is via the portal circulation to the liver, this data set had information on distribution and clearance behaviors of Mn entering the systemic circulation from the liver.
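
    Tracer retention curves of the kind reported by Furchner et al. are the natural target of a compartmental description. The sketch below integrates a deliberately minimal two-compartment (liver / rest-of-body) system with first-order transfer and biliary loss; the rate constants are invented placeholders and the structure is far simpler than the multi-route PBPK model developed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical first-order rate constants (1/day); illustrative placeholders,
# not parameters of the published Mn PBPK model.
k_liver_to_body = 0.8   # transfer from liver to the rest of the body
k_body_to_liver = 0.1   # return flow to the liver
k_biliary = 0.3         # biliary excretion from the liver

def tracer_ode(t, y):
    """Two-compartment 54Mn tracer kinetics after an ip dose delivered to the liver."""
    liver, body = y
    d_liver = k_body_to_liver * body - (k_liver_to_body + k_biliary) * liver
    d_body = k_liver_to_body * liver - k_body_to_liver * body
    return [d_liver, d_body]

sol = solve_ivp(tracer_ode, t_span=(0.0, 89.0), y0=[1.0, 0.0],   # dose normalized to 1
                t_eval=np.linspace(0.0, 89.0, 8))
whole_body = sol.y.sum(axis=0)
for t, fraction in zip(sol.t, whole_body):
    print(f"day {t:5.1f}: whole-body fraction of dose = {fraction:.3f}")
```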

  7. Materials measurement and accounting in an operating plutonium conversion and purification process. Phase I. Process modeling and simulation

    International Nuclear Information System (INIS)

    Thomas, C.C. Jr.; Ostenak, C.A.; Gutmacher, R.G.; Dayem, H.A.; Kern, E.A.

    1981-04-01

    A model of an operating conversion and purification process for the production of reactor-grade plutonium dioxide was developed as the first component in the design and evaluation of a nuclear materials measurement and accountability system. The model accurately simulates process operation and can be used to identify process problems and to predict the effect of process modifications

  8. Fracture criteria under creep with strain history taken into account, and long-term strength modelling

    Science.gov (United States)

    Khokhlov, A. V.

    2009-08-01

    In the present paper, we continue to study the nonlinear constitutive relation (CR) between the stress and strain proposed in [1] to describe one-dimensional isothermal rheological processes in the case of monotone variation of the strain (in particular, relaxation, creep, plasticity, and superplasticity). We show that this CR together with the strain fracture criterion (FC) leads to theoretical long-term strength curves (LSC) with the same qualitative properties as the typical experimental LSC of viscoelastoplastic materials. We propose two parametric families of fracture criteria in the case of monotone uniaxial strain, which are related to the strain fracture criterion (SFC) but take into account the strain increase history and the dependence of the critical strain on the stress. Instead of the current strain, they use other measures of damage related to the strain history by time-dependent integral operators. For any values of the material parameters, analytic studies of these criteria allowed us to find several useful properties, which confirm that they can be used to describe the creep fracture of different materials. In particular, we prove that, together with the proposed constitutive relations, these FC lead to theoretical long-term strength curves (TLSC) with the same qualitative properties as the experimental LSC. It is important that each of the constructed families of FC forms a monotone and continuous scale of criteria (monotonically and continuously depending on a real parameter) that contains the SFC as the limit case. Moreover, the criteria in the first family always provide a fracture time greater than that given by the SFC, the criteria in the second family always provide a smaller fracture time, and the difference can be made arbitrarily small by choosing the values of the control parameter near the scale end. This property is very useful in finding a more accurate adjustment of the model to the existing experimental data describing the

  9. 25 CFR 87.12 - Insuring the proper performance of approved plans.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Insuring the proper performance of approved plans. 87.12... DISTRIBUTION OF INDIAN JUDGMENT FUNDS § 87.12 Insuring the proper performance of approved plans. A timetable... regarding the maintenance of the timetable, a full accounting of any per capita distribution, and the...

  10. Accounting for water management issues within hydrological simulation: Alternative modelling options and a network optimization approach

    Science.gov (United States)

    Efstratiadis, Andreas; Nalbantis, Ioannis; Rozos, Evangelos; Koutsoyiannis, Demetris

    2010-05-01

    In mixed natural and artificialized river basins, many complexities arise due to anthropogenic interventions in the hydrological cycle, including abstractions from surface water bodies, groundwater pumping or recharge and water returns through drainage systems. Typical engineering approaches adopt a multi-stage modelling procedure, with the aim of handling the complexity of process interactions and the lack of measured abstractions. In such a context, the entire hydrosystem is separated into natural and artificial sub-systems or components; the natural ones are modelled individually, and their predictions (i.e. hydrological fluxes) are transferred to the artificial components as inputs to a water management scheme. To account for the interactions between the various components, an iterative procedure is essential, whereby the outputs of the artificial sub-systems (i.e. abstractions) become inputs to the natural ones. However, this strategy suffers from multiple shortcomings, since it presupposes that pure natural sub-systems can be located and that sufficient information is available for each sub-system modelled, including suitable, i.e. "unmodified", data for calibrating the hydrological component. In addition, implementing such a strategy is ineffective when the entire scheme runs in stochastic simulation mode. To cope with the above drawbacks, we developed a generalized modelling framework, following a network optimization approach. This originates from graph theory, which has been successfully implemented within some advanced computer packages for water resource systems analysis. The user formulates a unified system which comprises the hydrographical network and the typical components of a water management network (aqueducts, pumps, junctions, demand nodes etc.). Input data for the latter include hydraulic properties, constraints, targets, priorities and operation costs. The real-world system is described through a conceptual graph, whose dummy properties

  11. Process Accounting

    OpenAIRE

    Gilbertson, Keith

    2002-01-01

    Standard utilities can help you collect and interpret your Linux system's process accounting data. Describes the uses of process accounting, standard process accounting commands, and example code that makes use of process accounting utilities.

  12. Accountability and non-proliferation nuclear regime: a review of the mutual surveillance Brazilian-Argentine model for nuclear safeguards

    International Nuclear Information System (INIS)

    Xavier, Roberto Salles

    2014-01-01

    The subject of this research is accountability regimes, global governance organizations, and the institutional arrangements of global governance of nuclear non-proliferation and of the Brazilian-Argentine model of mutual vigilance of nuclear safeguards. The starting point is the importance of the institutional model of global governance for the effective control of non-proliferation of nuclear weapons. In this context, the research investigates how the current arrangements for international nuclear non-proliferation are structured and how the Brazilian-Argentine mutual vigilance model of nuclear safeguards performs in relation to the accountability regimes of global governance. To that end, the current literature on three theoretical dimensions was surveyed: accountability, global governance and global governance organizations. The research method used was the case study, and the data treatment technique was content analysis. The results allowed us to establish an evaluation model based on accountability mechanisms; to assess how the Brazilian-Argentine mutual vigilance model of nuclear safeguards behaves against the proposed accountability regime; and to measure the degree to which regional arrangements that work with systems of global governance can strengthen these international systems. (author)

  13. Spinfoam cosmology with the proper vertex amplitude

    Science.gov (United States)

    Vilensky, Ilya

    2017-11-01

    The proper vertex amplitude is derived from the Engle-Pereira-Rovelli-Livine vertex by restricting to a single gravitational sector in order to achieve the correct semi-classical behaviour. We apply the proper vertex to calculate a cosmological transition amplitude that can be viewed as the Hartle-Hawking wavefunction. To perform this calculation we deduce the integral form of the proper vertex and use extended stationary phase methods to estimate the large-volume limit. We show that the resulting amplitude satisfies an operator constraint whose classical analogue is the Hamiltonian constraint of the Friedmann-Robertson-Walker cosmology. We find that the constraint dynamically selects the relevant family of coherent states and demonstrate a similar dynamic selection in standard quantum mechanics. We investigate the effects of dynamical selection on long-range correlations.

  14. Asteroid proper elements and secular resonances

    Science.gov (United States)

    Knezevic, Zoran; Milani, Andrea

    1992-01-01

    In a series of papers (e.g., Knezevic, 1991; Milani and Knezevic, 1990; 1991) we reported on the progress we were making in computing asteroid proper elements, both as regards their accuracy and long-term stability. Additionally, we reported on the efficiency and 'intelligence' of our software. At the same time, we studied the associated problems of resonance effects, and we introduced the new class of 'nonlinear' secular resonances; we determined the locations of these secular resonances in proper-element phase space and analyzed their impact on the asteroid family classification. Here we would like to summarize the current status of our work and possible further developments.

  15. Proper generalized decompositions an introduction to computer implementation with Matlab

    CERN Document Server

    Cueto, Elías; Alfaro, Icíar

    2016-01-01

    This book is intended to help researchers overcome the entrance barrier to Proper Generalized Decomposition (PGD), by providing a valuable tool to begin the programming task. Detailed Matlab Codes are included for every chapter in the book, in which the theory previously described is translated into practice. Examples include parametric problems, non-linear model order reduction and real-time simulation, among others. Proper Generalized Decomposition (PGD) is a method for numerical simulation in many fields of applied science and engineering. As a generalization of Proper Orthogonal Decomposition or Principal Component Analysis to an arbitrary number of dimensions, PGD is able to provide the analyst with very accurate solutions for problems defined in high dimensional spaces, parametric problems and even real-time simulation. .

  16. [Optimization of ecological footprint model based on environmental pollution accounts: a case study in Pearl River Delta urban agglomeration].

    Science.gov (United States)

    Bai, Yu; Zeng, Hui; Wei, Jian-bing; Zhang, Wen-juan; Zhao, Hong-wei

    2008-08-01

    To solve the problem that traditional ecological footprint accounts ignore the calculation of environmental pollution, this paper puts forward an optimized ecological footprint (EF) model that takes the pollution footprint into account. At the same time, the calculation of environmental capacity was added to the system of ecological capacity, and the model was then used for an ecological assessment of the Pearl River Delta urban agglomeration in 2005. The results showed a close fit between the ecological footprint and the development characteristics and spatial pattern of the region, and illustrated that the optimized EF model could better represent environmental pollution in the system and could more comprehensively explain the environmental effects of human activity. The optimized ecological footprint model had better completeness and objectivity than traditional models.

  17. On Risk Charges and Shadow Account Options in Pension Funds

    DEFF Research Database (Denmark)

    Jørgensen, Peter Løchte; Gatzert, Nadine

    2015-01-01

    to equityholders and our paper develops a model in which the influence of risk charges and shadow account options on stakeholders’ value can be quantified and studied. Our numerical results show that the value of shadow account options can be significant and thus come at the risk of expropriating policyholder...... wealth. However, our analysis also shows that this risk can be remedied if proper attention is given to the specific contract design and to the fixing of fair contract parameters at the outset....

  18. Strategy Guideline. Proper Water Heater Selection

    Energy Technology Data Exchange (ETDEWEB)

    Hoeschele, M. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); Springer, D. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); German, A. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); Staller, J. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); Zhang, Y. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States)

    2015-04-09

    This Strategy Guideline on proper water heater selection was developed by the Building America team Alliance for Residential Building Innovation to provide step-by-step procedures for evaluating preferred cost-effective options for energy efficient water heater alternatives based on local utility rates, climate, and anticipated loads.

  19. Strategy Guideline: Proper Water Heater Selection

    Energy Technology Data Exchange (ETDEWEB)

    Hoeschele, M. [Alliance for Residential Building Innovation, Davis, CA (United States); Springer, D. [Alliance for Residential Building Innovation, Davis, CA (United States); German, A. [Alliance for Residential Building Innovation, Davis, CA (United States); Staller, J. [Alliance for Residential Building Innovation, Davis, CA (United States); Zhang, Y. [Alliance for Residential Building Innovation, Davis, CA (United States)

    2015-04-01

    This Strategy Guideline on proper water heater selection was developed by the Building America team Alliance for Residential Building Innovation to provide step-by-step procedures for evaluating preferred cost-effective options for energy efficient water heater alternatives based on local utility rates, climate, and anticipated loads.

  20. Improved Industrial Development In Nigeria Through Proper ...

    African Journals Online (AJOL)

    The paper noted that most industrial development strategies in Nigeria did not give attention to technology education, and that where technology education was recognized by a few of the strategies, it was not properly articulated for the tertiary institutions and was also poorly implemented. Therefore, to put technology and thus ...

  1. Archetypes: the PropeR way

    NARCIS (Netherlands)

    van der Linden, Helma; Grimson, Jane; Tange, Huibert; Talmon, Jan; Hasman, Arie

    2004-01-01

    The PropeR project studies the effect of Decision Support in an Electronic Health Record system (EHR) on the quality of care. One of the applications supports a multidisciplinary primary care team rehabilitating stroke patients in their home environment. This project required an EHR system that

  2. Dynamical systems of proper characteristic 0

    International Nuclear Information System (INIS)

    Ahmad, K.H.; Hamoui, A.

    1991-07-01

    Flows with orbits of proper characteristic 0 exhibit recurrent behaviour, a feature of basic importance in the description of their dynamics. Here, we analyze flows with such orbits, relating them to recurrent flows and to flows that exhibit orbital, Poisson or Lagrange stability. (author). 11 refs

  3. Closing the Gaps: Taking into Account the Effects of Heat Stress and Fatigue Modeling in an Operational Analysis

    NARCIS (Netherlands)

    Woodill, G.; Barbier, R.R.; Fiamingo, C.

    2010-01-01

    Traditional, combat-model-based analysis of Dismounted Combatant Operations (DCO) has focused on the 'lethal' aspects of an engagement, and to a limited extent the environment in which the engagement takes place. These are, however, only two of the factors that should be taken into account when

  4. Accounting for Test Variability through Sizing Local Domains in Sequential Design Optimization with Concurrent Calibration-Based Model Validation

    Science.gov (United States)

    2013-08-01

    Proceedings of IDETC/CIE 2013, ASME 2013 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference. Dorin Drignei, Mathematics and Statistics Department. ... insufficient to achieve the desired validity level. In this paper, we introduce a technique to determine the number of tests required to account for their

  5. Structural equation models using partial least squares: an example of the application of SmartPLS® in accounting research

    Directory of Open Access Journals (Sweden)

    João Carlos Hipólito Bernardes do Nascimento

    2016-08-01

    Full Text Available In view of the Accounting academy's increasing interest in investigating latent phenomena, researchers have turned to robust multivariate techniques. Although Structural Equation Models are frequently used in the international literature, the Accounting academy has made little use of the variant based on Partial Least Squares (PLS-SEM), mostly due to a lack of knowledge about the applicability and benefits of its use for Accounting research. Although the PLS-SEM approach is regularly used with survey data, the method is also appropriate for modeling complex relations with multiple relationships of dependence and independence between latent variables, which makes it useful for application to experiments and archival data. Accordingly, a literature review is presented of Accounting studies that used the PLS-SEM technique. Next, as no specific publications were observed that exemplify the application of the technique in Accounting, a PLS-SEM application is developed to encourage exploratory research by means of the software SmartPLS®, which is particularly useful to graduate students. The main contribution of this article is therefore methodological, given its objective of clearly identifying the guidelines for the appropriate use of PLS. By presenting an example of how to conduct exploratory research using PLS-SEM, the intention is to contribute to researchers' enhanced understanding of how to use and report on the technique in their research.

  6. Accounting Department Chairpersons' Perceptions of Business School Performance Using a Market Orientation Model

    Science.gov (United States)

    Webster, Robert L.; Hammond, Kevin L.; Rothwell, James C.

    2013-01-01

    This manuscript is part of a stream of continuing research examining market orientation within higher education and its potential impact on organizational performance. The organizations researched are business schools and the data collected came from chairpersons of accounting departments of AACSB member business schools. We use a reworded Narver…

  7. 76 FR 29249 - Medicare Program; Pioneer Accountable Care Organization Model: Request for Applications

    Science.gov (United States)

    2011-05-20

    ... application process and selection criteria are described in Section IV of the Request for Applications but in... suppliers with a mechanism for shared governance that have formed an Accountable Care Organization (ACO..., leadership, and commitment to outcomes-based contracts with non- Medicare purchasers. Final selection will be...

  8. A regional-scale, high resolution dynamical malaria model that accounts for population density, climate and surface hydrology.

    Science.gov (United States)

    Tompkins, Adrian M; Ermert, Volker

    2013-02-18

    The relative roles of climate variability and population related effects in malaria transmission could be better understood if regional-scale dynamical malaria models could account for these factors. A new dynamical community malaria model is introduced that accounts for the temperature and rainfall influences on the parasite and vector life cycles which are finely resolved in order to correctly represent the delay between the rains and the malaria season. The rainfall drives a simple but physically based representation of the surface hydrology. The model accounts for the population density in the calculation of daily biting rates. Model simulations of entomological inoculation rate and circumsporozoite protein rate compare well to data from field studies from a wide range of locations in West Africa that encompass both seasonal endemic and epidemic fringe areas. A focus on Bobo-Dioulasso shows the ability of the model to represent the differences in transmission rates between rural and peri-urban areas in addition to the seasonality of malaria. Fine spatial resolution regional integrations for Eastern Africa reproduce the malaria atlas project (MAP) spatial distribution of the parasite ratio, and integrations for West and Eastern Africa show that the model grossly reproduces the reduction in parasite ratio as a function of population density observed in a large number of field surveys, although it underestimates malaria prevalence at high densities probably due to the neglect of population migration. A new dynamical community malaria model is publicly available that accounts for climate and population density to simulate malaria transmission on a regional scale. The model structure facilitates future development to incorporate migration, immunity and interventions.
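
    The abstract does not give the model equations, but the ingredients it lists (temperature-driven sporogony, rainfall-driven vector abundance, and a population-density-dependent biting rate) can be illustrated with a toy Ross-Macdonald-style sketch. Everything below (function names, parameter values, and the degree-day form) is an assumption for illustration, not the published model.

```python
import numpy as np

def sporogony_rate(temp_c):
    """Degree-day style parasite development rate in the mosquito
    (Detinova-type form; constants are illustrative)."""
    return np.maximum(temp_c - 16.0, 0.0) / 111.0            # per day

def malaria_step(state, temp_c, rain_mm, pop_density, dt=1.0):
    """One daily step of a toy climate-driven transmission sketch:
    rainfall scales vector emergence, temperature sets sporogony,
    and population density dilutes the per-person biting rate."""
    m, x = state                            # m: vectors per human, x: human prevalence
    emergence = 0.05 * rain_mm              # new vectors per human per day (illustrative)
    mu_v = 0.12                             # vector mortality, per day
    a = 0.3 / (1.0 + pop_density / 500.0)   # bites per vector per day, density-diluted
    b, c, r = 0.3, 0.5, 1.0 / 150.0         # transmission efficiencies, human recovery
    n_days = 1.0 / max(sporogony_rate(temp_c), 1e-6)          # sporogonic cycle length
    sporozoite_frac = a * c * x / (a * c * x + mu_v) * np.exp(-mu_v * n_days)
    dm = emergence - mu_v * m
    dx = m * a * b * sporozoite_frac * (1.0 - x) - r * x
    return m + dm * dt, x + dx * dt

state = (2.0, 0.05)
for day in range(3 * 365):                  # three seasonal cycles
    temp = 26.0 + 3.0 * np.sin(2 * np.pi * day / 365.0)
    rain = max(0.0, 5.0 * np.sin(2 * np.pi * (day - 60) / 365.0))
    state = malaria_step(state, temp, rain, pop_density=300.0)
print("vectors per human, prevalence:", np.round(state, 3))
```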

  9. Limited-memory adaptive snapshot selection for proper orthogonal decomposition

    Energy Technology Data Exchange (ETDEWEB)

    Oxberry, Geoffrey M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kostova-Vassilevska, Tanya [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Arrighi, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chand, Kyle [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-04-02

    Reduced order models are useful for accelerating simulations in many-query contexts, such as optimization, uncertainty quantification, and sensitivity analysis. However, offline training of reduced order models can have prohibitively expensive memory and floating-point operation costs in high-performance computing applications, where memory per core is limited. To overcome this limitation for proper orthogonal decomposition, we propose a novel adaptive selection method for snapshots in time that limits offline training costs by selecting snapshots according to an error control mechanism similar to that found in adaptive time-stepping ordinary differential equation solvers. The error estimator used in this work is related to theory bounding the approximation error in time of proper orthogonal decomposition-based reduced order models, and memory usage is minimized by computing the singular value decomposition using a single-pass incremental algorithm. Results for a viscous Burgers’ test problem demonstrate convergence in the limit as the algorithm error tolerances go to zero; in this limit, the full order model is recovered to within discretization error. The resulting method can be used on supercomputers to generate proper orthogonal decomposition-based reduced order models, or as a subroutine within hyperreduction algorithms that require taking snapshots in time, or within greedy algorithms for sampling parameter space.
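
    As a rough illustration of the idea (not the report's implementation, which uses a single-pass incremental SVD), the sketch below keeps a snapshot only when its projection error onto the current POD basis exceeds a tolerance, in the spirit of adaptive time-stepping error control. All names and tolerances are assumptions.

```python
import numpy as np

def adaptive_snapshot_pod(snapshots, tol=1e-2, rank=10):
    """Greedy, error-controlled snapshot selection for POD (sketch).

    A snapshot is retained only if its relative projection error onto
    the current POD basis exceeds `tol`; here the thin SVD is simply
    recomputed, whereas the report minimizes memory with a single-pass
    incremental SVD."""
    kept, basis = [], None
    for x in snapshots:                              # x: state vector at one time step
        if basis is not None:
            residual = x - basis @ (basis.T @ x)
            if np.linalg.norm(residual) <= tol * np.linalg.norm(x):
                continue                             # already well represented
        kept.append(x)
        U, _, _ = np.linalg.svd(np.column_stack(kept), full_matrices=False)
        basis = U[:, :min(rank, U.shape[1])]
    return basis, kept

# usage on a toy trajectory of slowly varying states
grid = np.linspace(0.0, np.pi, 50)
snaps = [np.sin(grid * (1.0 + t)) for t in np.linspace(0.0, 1.0, 200)]
basis, kept = adaptive_snapshot_pod(snaps)
print(f"retained {len(kept)} of {len(snaps)} snapshots; basis rank {basis.shape[1]}")
```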

  10. A simple model to quantitatively account for periodic outbreaks of the measles in the Dutch Bible Belt

    Science.gov (United States)

    Bier, Martin; Brak, Bastiaan

    2015-04-01

    In the Netherlands there has been nationwide vaccination against the measles since 1976. However, in small clustered communities of orthodox Protestants there is widespread refusal of the vaccine. After 1976, three large outbreaks with about 3000 reported cases of the measles have occurred among these orthodox Protestants. The outbreaks appear to occur about every twelve years. We show how a simple Kermack-McKendrick-like model can quantitatively account for the periodic outbreaks. Approximate analytic formulae to connect the period, size, and outbreak duration are derived. With an enhanced model we take the latency period in account. We also expand the model to follow how different age groups are affected. Like other researchers using other methods, we conclude that large scale underreporting of the disease must occur.
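
    A minimal Kermack-McKendrick-style sketch of the mechanism (recurrent outbreaks in an unvaccinated subpopulation whose susceptible pool is slowly replenished by births) can be written in a few lines. The parameter values, community size, and initial state below are illustrative assumptions, not the values fitted by the authors.

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 400.0, 36.5     # per year: high contact rate, ~10-day infectious period
mu = 0.04                     # per year: birth/renewal rate of the susceptible pool
N = 250_000                   # size of the vaccine-refusing community (assumed)

def sir(t, y):
    """SIR with demographic turnover: births replenish S, so epidemics
    recur once S crosses the threshold again."""
    S, I, R = y
    new_inf = beta * S * I / N
    return [mu * N - new_inf - mu * S,
            new_inf - (gamma + mu) * I,
            gamma * I - mu * R]

sol = solve_ivp(sir, (0.0, 60.0), [0.05 * N, 10.0, 0.95 * N],
                rtol=1e-8, atol=1e-10, dense_output=True)
years = np.linspace(0.0, 60.0, 6000)
I = sol.sol(years)[1]
# The spacing between successive peaks of I approximates the model's
# inter-epidemic period; outbreak size and duration come from the same run.
peaks = (I[1:-1] > I[:-2]) & (I[1:-1] > I[2:]) & (I[1:-1] > 50)
print("number of outbreak peaks in 60 years:", int(peaks.sum()))
```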

  11. Accounting for Slipping and Other False Negatives in Logistic Models of Student Learning

    Science.gov (United States)

    MacLellan, Christopher J.; Liu, Ran; Koedinger, Kenneth R.

    2015-01-01

    Additive Factors Model (AFM) and Performance Factors Analysis (PFA) are two popular models of student learning that employ logistic regression to estimate parameters and predict performance. This is in contrast to Bayesian Knowledge Tracing (BKT) which uses a Hidden Markov Model formalism. While all three models tend to make similar predictions,…
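
    The Additive Factors Model itself is an ordinary logistic regression with student, skill, and opportunity-count terms, so it can be sketched directly. The data below are synthetic, and the single shared learning rate is a simplification (AFM normally uses one learning rate per skill, and the paper's extension adds a slip/false-negative component not shown here).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
students = rng.integers(0, 50, n)        # student ids
skills = rng.integers(0, 8, n)           # knowledge-component ids
practice = rng.integers(0, 20, n)        # prior opportunities on that skill

# Synthetic data generated from an AFM-like ground truth.
theta = rng.normal(0, 1, 50)             # student proficiencies
beta = rng.normal(0, 1, 8)               # skill easiness
gamma = 0.15                             # learning rate per opportunity
logit = theta[students] + beta[skills] + gamma * practice
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# AFM fit: one-hot student and skill dummies plus the practice count.
X = np.column_stack([np.eye(50)[students], np.eye(8)[skills], practice])
fit = LogisticRegression(max_iter=5000).fit(X, y)
print("estimated learning rate:", fit.coef_[0, -1])
```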

  12. Assessing patient awareness of proper hand hygiene.

    Science.gov (United States)

    Busby, Sunni R; Kennedy, Bryan; Davis, Stephanie C; Thompson, Heather A; Jones, Jan W

    2015-05-01

    The authors hypothesized that patients may not understand the forms of effective hand hygiene employed in the hospital environment. Multiple studies demonstrate the importance of hand hygiene in reducing healthcare-associated infections (HAIs). Extensive research about how to improve compliance has been conducted. Patients' perceptions of proper hand hygiene were evaluated when caregivers used soap and water, waterless hand cleaner, or a combination of these. No significant differences were observed, but many patients reported they did not notice whether their providers cleaned their hands. Educating patients and their caregivers about the protection afforded by proper, consistent hand hygiene practices is important. Engaging patients to monitor healthcare workers may increase compliance, reduce the spread of infection, and lead to better overall patient outcomes. This study revealed a need to investigate the effects of patient education on patient perceptions of hand hygiene. Results of this study appear to indicate a need to focus on patient education and the differences between soap and water versus alcohol-based hand sanitizers as part of proper hand hygiene. Researchers could be asking: "Why have patients not been engaged as members of the healthcare team who have the most to lose?"

  13. Alternative biosphere modeling for safety assessment of HLW disposal taking account of geosphere-biosphere interface of marine environment

    International Nuclear Information System (INIS)

    Kato, Tomoko; Ishiguro, Katsuhiko; Naito, Morimasa; Ikeda, Takao; Little, Richard

    2001-03-01

    In the safety assessment of a high-level radioactive waste (HLW) disposal system, it is required to estimate the radiological impacts on future human beings arising from potential radionuclide releases from a deep repository into the surface environment. In order to estimate the impacts, a biosphere model is developed by reasonably assuming radionuclide migration processes in the surface environment and relevant human lifestyles. It is important to modify the present biosphere models or to develop alternative biosphere models according to the quality and quantity of the information acquired through the siting process for constructing the repository. In this study, alternative biosphere models were developed taking the geosphere-biosphere interface of the marine environment into account. Moreover, the flux-to-dose conversion factors calculated by these alternative biosphere models were compared with those from the present basic biosphere models. (author)

  14. Efficient modeling of sun/shade canopy radiation dynamics explicitly accounting for scattering

    Science.gov (United States)

    Bodin, P.; Franklin, O.

    2012-04-01

    The separation of global radiation (Rg) into its direct (Rb) and diffuse (Rd) constituents is important when modeling plant photosynthesis because a high Rd:Rg ratio has been shown to enhance Gross Primary Production (GPP). To include this effect in vegetation models, the plant canopy must be separated into sunlit and shaded leaves. However, because such models are often too intractable and computationally expensive for theoretical or large-scale studies, simpler sun-shade approaches are often preferred. A widely used and computationally efficient sun-shade model was developed by Goudriaan (1977) (GOU). However, compared to more complex models, this model's realism is limited by its lack of explicit treatment of radiation scattering. Here we present a new model based on the GOU model, but which in contrast explicitly simulates radiation scattering by sunlit leaves and the absorption of this radiation by the canopy layers above and below (2-stream approach). Compared to the GOU model our model predicts significantly different profiles of scattered radiation that are in better agreement with measured profiles of downwelling diffuse radiation. With respect to these data our model's performance is equal to that of a more complex and much slower iterative radiation model, while maintaining the simplicity and computational efficiency of the GOU model.
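
    The sunlit/shaded split that both the GOU model and the new model start from follows from Beer's law; the sketch below shows just that partition (no scattering, which is exactly what the 2-stream extension adds). The extinction coefficient and diffuse fraction are illustrative assumptions.

```python
import numpy as np

def sun_shade_partition(lai, k_b=0.5, rd_frac=0.3, rg=800.0):
    """Beer's-law sunlit/shaded canopy split (Goudriaan-style sketch).

    lai      total leaf area index
    k_b      extinction coefficient for the direct beam (illustrative)
    rd_frac  diffuse fraction Rd/Rg
    rg       global radiation at canopy top, W m^-2
    """
    rb = (1.0 - rd_frac) * rg                   # direct beam component
    rd = rd_frac * rg                           # diffuse component
    lai_sun = (1.0 - np.exp(-k_b * lai)) / k_b  # sunlit leaf area index
    lai_shade = lai - lai_sun
    # Shaded leaves receive only diffuse light; sunlit leaves additionally
    # intercept the direct beam.  Scattering is ignored here, which is the
    # simplification the 2-stream extension removes.
    return lai_sun, lai_shade, rb, rd

print(sun_shade_partition(lai=4.0))
```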

  15. Internet accounting

    NARCIS (Netherlands)

    Pras, Aiko; van Beijnum, Bernhard J.F.; Sprenkels, Ron; Parhonyi, R.

    2001-01-01

    This article provides an introduction to Internet accounting and discusses the status of related work within the IETF and IRTF, as well as certain research projects. Internet accounting is different from accounting in POTS. To understand Internet accounting, it is important to answer questions like

  16. Factors accounting for youth suicide attempt in Hong Kong: a model building.

    Science.gov (United States)

    Wan, Gloria W Y; Leung, Patrick W L

    2010-10-01

    This study aimed at proposing and testing a conceptual model of youth suicide attempt. We proposed a model that began with family factors such as a history of physical abuse and parental divorce/separation. Family relationship, presence of psychopathology, life stressors, and suicide ideation were postulated as mediators, leading to youth suicide attempt. The stepwise entry of the risk factors to a logistic regression model defined their proximity as related to suicide attempt. Path analysis further refined our proposed model of youth suicide attempt. Our originally proposed model was largely confirmed. The main revision was dropping parental divorce/separation as a risk factor in the model due to lack of significant contribution when examined alongside with other risk factors. This model was cross-validated by gender. This study moved research on youth suicide from identification of individual risk factors to model building, integrating separate findings of the past studies.

  17. Accounting for subgrid scale topographic variations in flood propagation modeling using MODFLOW

    DEFF Research Database (Denmark)

    Milzow, Christian; Kinzelbach, W.

    2010-01-01

    To be computationally viable, grid-based spatially distributed hydrological models of large wetlands or floodplains must be set up using relatively large cells (order of hundreds of meters to kilometers). Computational costs are especially high when considering the numerous model runs or model time...

  18. Accounting for the influence of the Earth's sphericity in three-dimensional density modelling

    Science.gov (United States)

    Martyshko, P. S.; Byzov, D. D.; Chernoskutov, A. I.

    2017-11-01

    A method for transformation of the three-dimensional regional "flat" density models of the Earth's crust and upper mantle to the "spherical" models and vice versa is proposed. A computation algorithm and a method of meaningful comparison of the vertical component of the gravity field of both models are presented.

  19. Accounting for imperfect forward modeling in geophysical inverse problems — Exemplified for crosshole tomography

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Holm Jacobsen, Bo

    2014-01-01

    of the modeling error was inferred in the form of a correlated Gaussian probability distribution. The key to the method was the ability to generate many realizations from a statistical description of the source of the modeling error, which in this case is the a priori model. The methodology was tested for two...

  20. Cost accounting in radiation oncology: a computer-based model for reimbursement.

    Science.gov (United States)

    Perez, C A; Kobeissi, B; Smith, B D; Fox, S; Grigsby, P W; Purdy, J A; Procter, H D; Wasserman, T H

    1993-04-02

    The skyrocketing cost of medical care in the United States has resulted in multiple efforts in cost containment. The present work offers a rational computer-based cost accounting approach to determine the actual use of resources in providing a specific service in a radiation oncology center. A procedure-level cost accounting system was developed by using recorded information on actual time and effort spent by individual staff members performing various radiation oncology procedures, and analyzing direct and indirect costs related to staffing (labor), facilities and equipment, supplies, etc. Expenditures were classified as direct or indirect and fixed or variable. A relative value unit was generated to allocate specific cost factors to each procedure. Different costs per procedure were identified according to complexity. Whereas there was no significant difference in the treatment time between low-energy (4 and 6 MV) and high-energy (18 MV) accelerators, there were significantly higher costs identified in the operation of a high-energy linear accelerator, a reflection of initial equipment investment, quality assurance and calibration procedures, maintenance costs, service contract, and replacement parts. Utilization of resources was related to the complexity of the procedures performed and whether the treatments were delivered to inpatients or outpatients. In analyzing time motion for physicians and other staff, it was apparent that a greater effort must be made to train the staff to accurately record all times involved in a given procedure, and it is strongly recommended that each institution perform its own time motion studies to more accurately determine operating costs. Sixty-six percent of our facility's global costs were for labor, 20% for other operating expenses, 10% for space, and 4% for equipment. Significant differences were noted in the cost allocation for professional or technical functions, as labor, space, and equipment costs are higher in the latter
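
    The allocation logic described (direct costs from recorded staff minutes, indirect costs spread by relative value units) can be sketched in a few lines; all rates, minutes, volumes, and the indirect pool below are invented for illustration, not the facility's actual figures.

```python
# Hypothetical procedure-level cost sketch: direct labor is costed from
# recorded staff minutes, and indirect (facility, equipment, overhead)
# costs are spread across procedures in proportion to relative value units.
staff_rates = {"physician": 3.0, "therapist": 0.9, "physicist": 1.5}  # $ per minute

procedures = {
    # minutes of staff time per procedure, plus an assigned relative value unit
    "simple_treatment":  {"minutes": {"therapist": 15},                  "rvu": 1.0},
    "complex_treatment": {"minutes": {"therapist": 25, "physician": 10}, "rvu": 2.5},
    "simulation":        {"minutes": {"physician": 30, "physicist": 20,
                                      "therapist": 30},                  "rvu": 4.0},
}
annual_volume = {"simple_treatment": 6000, "complex_treatment": 2500, "simulation": 800}
indirect_pool = 1_200_000.0   # facility, equipment, supplies ($/year)

total_rvu = sum(p["rvu"] * annual_volume[name] for name, p in procedures.items())
for name, p in procedures.items():
    direct = sum(staff_rates[s] * m for s, m in p["minutes"].items())
    indirect = indirect_pool * p["rvu"] / total_rvu   # per-procedure indirect share
    print(f"{name:18s} direct ${direct:7.2f}  indirect ${indirect:9.2f}")
```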

  1. Development of a coal shrinkage-swelling model accounting for water content in the micropores

    Energy Technology Data Exchange (ETDEWEB)

    Prob Thararoop; Zuleima T. Karpyn; Turgay Ertekin [Pennsylvania State University, University Park, PA (United States). Petroleum and Natural Gas Engineering

    2009-07-01

    Changes in cleat permeability of coal seams are influenced by internal stress, and release or adsorption of gas in the coal matrix during production/injection processes. Coal shrinkage-swelling models have been proposed to quantify such changes; however none of the existing models incorporates the effect of the presence of water in the micropores on the gas sorption of coalbeds. This paper proposes a model of coal shrinkage and swelling, incorporating the effect of water in the micropores. The proposed model was validated using field permeability data from San Juan basin coalbeds and compared with coal shrinkage and swelling models existing in the literature.

  2. THE CURRENT ACCOUNT DEFICIT AND THE FIXED EXCHANGE RATE. ADJUSTING MECHANISMS AND MODELS.

    Directory of Open Access Journals (Sweden)

    HATEGAN D.B. Anca

    2010-07-01

    Full Text Available The main purpose of the paper is to explain what measures can be taken in order to fix the trade deficit, and the pressure that such measures place upon a country. The international and the national supply and demand conditions change rapidly, and if a country does not succeed in keeping tight control over its deficit, many factors will affect its wellbeing. In order to reduce the external trade deficit, the government needs to resort to several techniques. The desired result is to have a balanced current account, and therefore the government is free to use measures such as fixing its exchange rate, reducing government spending, etc. We have shown that all these measures will have a certain impact upon an economy, by allowing its exports to thrive and eliminating the danger from excessive imports, or vice versa. The main conclusion of our paper is that government intervention is allowed in order to maintain the balance of the current account.

  3. Nonlinear analysis of a new car-following model accounting for the global average optimal velocity difference

    Science.gov (United States)

    Peng, Guanghan; Lu, Weizhen; He, Hongdi

    2016-09-01

    In this paper, a new car-following model is proposed by considering the global average optimal velocity difference effect on the basis of the full velocity difference (FVD) model. We investigate the influence of the global average optimal velocity difference on the stability of traffic flow by making use of linear stability analysis. It indicates that the stable region will be enlarged by taking the global average optimal velocity difference effect into account. Subsequently, the mKdV equation near the critical point and its kink-antikink soliton solution, which can describe the traffic jam transition, is derived from nonlinear analysis. Furthermore, numerical simulations confirm that the effect of the global average optimal velocity difference can efficiently improve the stability of traffic flow, which show that our new consideration should be taken into account to suppress the traffic congestion for car-following theory.
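
    The full velocity difference (FVD) backbone referred to here is typically written per vehicle as dv/dt = a[V(dx) - v] + lambda*dv, with V an optimal velocity function. The sketch below adds a `kappa` term as a stand-in for the paper's global average optimal velocity difference effect; that functional form and all coefficients are assumptions for illustration only.

```python
import numpy as np

def V_opt(dx):
    """Bando-type optimal velocity function (illustrative constants)."""
    return 6.75 + 7.91 * np.tanh(0.13 * (dx - 5.0) - 1.57)

def fvd_step(x, v, dt=0.1, a=0.41, lam=0.5, kappa=0.1, ring=1500.0):
    """One Euler step of an FVD-type car-following model on a ring road.
    The `kappa` term stands in for a global average optimal velocity
    difference effect (an assumption, not the authors' exact term)."""
    dx = (np.roll(x, -1) - x) % ring           # headway to the preceding car
    dv = np.roll(v, -1) - v                    # velocity difference
    v_star = V_opt(dx)
    acc = a * (v_star - v) + lam * dv + kappa * (v_star.mean() - v)
    return (x + v * dt) % ring, v + acc * dt

rng = np.random.default_rng(1)
n = 50
x = np.linspace(0.0, 1500.0, n, endpoint=False) + rng.normal(0.0, 0.3, n)
v = np.full(n, 14.0)
for _ in range(5000):                          # 500 s of simulated time
    x, v = fvd_step(x, v)
print("speed spread after 500 s:", round(float(v.max() - v.min()), 4))
```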

  4. Accounting for spatial effects in land use regression for urban air pollution modeling.

    Science.gov (United States)

    Bertazzon, Stefania; Johnson, Markey; Eccles, Kristin; Kaplan, Gilaad G

    2015-01-01

    In order to accurately assess air pollution risks, health studies require spatially resolved pollution concentrations. Land-use regression (LUR) models estimate ambient concentrations at a fine spatial scale. However, spatial effects such as spatial non-stationarity and spatial autocorrelation can reduce the accuracy of LUR estimates by increasing regression errors and uncertainty; and statistical methods for resolving these effects--e.g., spatially autoregressive (SAR) and geographically weighted regression (GWR) models--may be difficult to apply simultaneously. We used an alternate approach to address spatial non-stationarity and spatial autocorrelation in LUR models for nitrogen dioxide. Traditional models were re-specified to include a variable capturing wind speed and direction, and re-fit as GWR models. Mean R² values for the resulting GWR-wind models (summer: 0.86, winter: 0.73) showed a 10-20% improvement over traditional LUR models. GWR-wind models effectively addressed both spatial effects and produced meaningful predictive models. These results suggest a useful method for improving spatially explicit models. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. JOMAR - A model for accounting the environmental loads from building constructions

    Energy Technology Data Exchange (ETDEWEB)

    Roenning, Anne; Nereng, Guro; Vold, Mie; Bjoerberg, Svein; Lassen, Niels

    2008-07-01

    The objective of this project was to develop a model as a basis for calculating the environmental profile of whole building constructions, based upon data from databases and general LCA software, in addition to the model structure from the Nordic project on LCC assessment of buildings. The model has been tested on three building constructions: timber based, flexible and heavy, as well as heavy. Total energy consumption and emissions contributing to climate change are calculated in a total life cycle perspective. The developed model and the exemplifying case assessments have shown that a holistic model including the operation phase is both important and possible to implement. The project has shown that the operation phase causes the highest environmental loads for the exemplified impact categories. A suggestion for further development of the model along two different axes, in collaboration with a broader representation from the building sector, is given in the report (author)(tk)

  6. An extended two-lane car-following model accounting for inter-vehicle communication

    Science.gov (United States)

    Ou, Hui; Tang, Tie-Qiao

    2018-04-01

    In this paper, we develop a novel car-following model with inter-vehicle communication to explore each vehicle's movement in a two-lane traffic system when an incident occurs on a lane. The numerical results show that the proposed model can perfectly describe each vehicle's motion when an incident occurs, i.e., no collision occurs while the classical full velocity difference (FVD) model produces collision on each lane, which shows the proposed model is more reasonable. The above results can help drivers to reasonably adjust their driving behaviors when an incident occurs in a two-lane traffic system.

  7. Reconstruction of Arabidopsis metabolic network models accounting for subcellular compartmentalization and tissue-specificity.

    Science.gov (United States)

    Mintz-Oron, Shira; Meir, Sagit; Malitsky, Sergey; Ruppin, Eytan; Aharoni, Asaph; Shlomi, Tomer

    2012-01-03

    Plant metabolic engineering is commonly used in the production of functional foods and quality trait improvement. However, to date, computational model-based approaches have only been scarcely used in this important endeavor, in marked contrast to their prominent success in microbial metabolic engineering. In this study we present a computational pipeline for the reconstruction of fully compartmentalized tissue-specific models of Arabidopsis thaliana on a genome scale. This reconstruction involves automatic extraction of known biochemical reactions in Arabidopsis for both primary and secondary metabolism, automatic gap-filling, and the implementation of methods for determining subcellular localization and tissue assignment of enzymes. The reconstructed tissue models are amenable for constraint-based modeling analysis, and significantly extend upon previous model reconstructions. A set of computational validations (i.e., cross-validation tests, simulations of known metabolic functionalities) and experimental validations (comparison with experimental metabolomics datasets under various compartments and tissues) strongly testify to the predictive ability of the models. The utility of the derived models was demonstrated in the prediction of measured fluxes in metabolically engineered seed strains and the design of genetic manipulations that are expected to increase vitamin E content, a significant nutrient for human health. Overall, the reconstructed tissue models are expected to lay down the foundations for computational-based rational design of plant metabolic engineering. The reconstructed compartmentalized Arabidopsis tissue models are MIRIAM-compliant and are available upon request.
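
    Constraint-based analysis of such reconstructions reduces to a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. The toy three-metabolite network below illustrates only that formulation, not the Arabidopsis model itself.

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis (FBA): maximize the biomass flux subject to
# steady-state mass balance S @ v = 0 and flux bounds.  The network is
# illustrative, not the Arabidopsis reconstruction.
#              uptake  rxn1  rxn2  biomass
S = np.array([[  1,    -1,    0,     0],    # metabolite A
              [  0,     1,   -1,     0],    # metabolite B
              [  0,     0,    1,    -1]])   # metabolite C (biomass precursor)
c = np.array([0.0, 0.0, 0.0, -1.0])         # linprog minimizes, so negate biomass
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]
res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun, "flux vector:", res.x)
```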

  8. Voxelized Model of Brain Infusion That Accounts for Small Feature Fissures: Comparison With Magnetic Resonance Tracer Studies

    Science.gov (United States)

    Dai, Wei; Astary, Garrett W.; Kasinadhuni, Aditya K.; Carney, Paul R.; Mareci, Thomas H.; Sarntinoranont, Malisa

    2016-01-01

    Convection enhanced delivery (CED) is a promising novel technology to treat neural diseases, as it can transport macromolecular therapeutic agents greater distances through tissue by direct infusion. To minimize off-target delivery, our group has developed 3D computational transport models to predict infusion flow fields and tracer distributions based on magnetic resonance (MR) diffusion tensor imaging data sets. To improve the accuracy of our voxelized models, generalized anisotropy (GA), a scalar measure of a higher order diffusion tensor obtained from high angular resolution diffusion imaging (HARDI) was used to improve tissue segmentation within complex tissue regions of the hippocampus by capturing small feature fissures. Simulations were conducted to reveal the effect of these fissures and cerebrospinal fluid (CSF) boundaries on CED tracer diversion and mistargeting. Sensitivity analysis was also conducted to determine the effect of dorsal and ventral hippocampal infusion sites and tissue transport properties on drug delivery. Predicted CED tissue concentrations from this model are then compared with experimentally measured MR concentration profiles. This allowed for more quantitative comparison between model predictions and MR measurement. Simulations were able to capture infusate diversion into fissures and other CSF spaces which is a major source of CED mistargeting. Such knowledge is important for proper surgical planning. PMID:26833078

  9. A Social Audit Model for Agro-biotechnology Initiatives in Developing Countries: Accounting for Ethical, Social, Cultural, and Commercialization Issues

    Directory of Open Access Journals (Sweden)

    Obidimma Ezezika

    2009-10-01

    Full Text Available There is skepticism and resistance to innovations associated with agro-biotechnology projects, leading to the possibility of failure. The source of the skepticism is complex, but partly traceable to how local communities view genetically engineered crops, public perception on the technology’s implications, and views on the role of the private sector in public health and agriculture, especially in the developing world. We posit that a governance and management model in which ethical, social, cultural, and commercialization issues are accounted for and addressed is important in mitigating risk of project failure and improving the appropriate adoption of agro-biotechnology in sub-Saharan Africa. We introduce a social audit model, which we term Ethical, Social, Cultural and Commercialization (ESC2) auditing and which we developed based on feedback from a number of stakeholders. We lay the foundation for its importance in agro-biotechnology development projects and show how the model can be applied to projects run by Public Private Partnerships. We argue that the implementation of the audit model can help to build public trust through facilitating project accountability and transparency. The model also provides evidence on how ESC2 issues are perceived by various stakeholders, which enables project managers to effectively monitor and improve project performance. Although this model was specifically designed for agro-biotechnology initiatives, we show how it can also be applied to other development projects.

  10. A constitutive model accounting for strain ageing effects on work-hardening. Application to a C-Mn steel

    Science.gov (United States)

    Ren, Sicong; Mazière, Matthieu; Forest, Samuel; Morgeneyer, Thilo F.; Rousselier, Gilles

    2017-12-01

    One of the most successful models for describing the Portevin-Le Chatelier effect in engineering applications is the Kubin-Estrin-McCormick model (KEMC). In the present work, the influence of dynamic strain ageing on dynamic recovery due to dislocation annihilation is introduced in order to improve the KEMC model. This modification accounts for additional strain hardening rate due to limited dislocation annihilation by the diffusion of solute atoms and dislocation pinning at low strain rate and/or high temperature. The parameters associated with this novel formulation are identified based on tensile tests for a C-Mn steel at seven temperatures ranging from 20 °C to 350 °C. The validity of the model and the improvement compared to existing models are tested using 2D and 3D finite element simulations of the Portevin-Le Chatelier effect in tension.

  11. Accounting for sex differences in PTSD: A multi-variable mediation model

    DEFF Research Database (Denmark)

    Christiansen, Dorte M.; Hansen, Maj

    2015-01-01

    ABSTRACT Background: Approximately twice as many females as males are diagnosed with posttraumatic stress disorder (PTSD). However, little is known about why females report more PTSD symptoms than males. Prior studies have generally focused on few potential mediators at a time and have often used... methods that were not ideally suited to test for mediation effects. Prior research has identified a number of individual risk factors that may contribute to sex differences in PTSD severity, although these cannot fully account for the increased symptom levels in females when examined individually... and related variables in 73.3% of all Danish bank employees exposed to bank robbery during the period from April 2010 to April 2011. Participants filled out questionnaires 1 week (T1, N = 450) and 6 months after the robbery (T2, N = 368; 61.1% females). Mediation was examined using an analysis designed

  12. Taking individual scaling differences into account by analyzing profile data with the Mixed Assessor Model

    DEFF Research Database (Denmark)

    Brockhoff, Per Bruun; Schlich, Pascal; Skovgaard, Ib

    2015-01-01

    Scale range differences between individual assessors will often constitute a non-trivial part of the assessor-by-product interaction in sensory profile data (Brockhoff, 2003, 1998; Brockhoff and Skovgaard, 1994). We suggest a new mixed model ANOVA analysis approach, the Mixed Assessor Model (MAM...

  13. Development and Evaluation of Model Algorithms to Account for Chemical Transformation in the Nearroad Environment

    Science.gov (United States)

    We describe the development and evaluation of two new model algorithms for NOx chemistry in the R-LINE near-road dispersion model for traffic sources. With increased urbanization, there is increased mobility leading to higher amount of traffic related activity on a global scale. ...

  14. Assessing and accounting for time heterogeneity in stochastic actor oriented models

    NARCIS (Netherlands)

    Lospinoso, Joshua A.; Schweinberger, Michael; Snijders, Tom A. B.; Ripley, Ruth M.

    This paper explores time heterogeneity in stochastic actor oriented models (SAOM) proposed by Snijders (Sociological methodology. Blackwell, Boston, pp 361-395, 2001) which are meant to study the evolution of networks. SAOMs model social networks as directed graphs with nodes representing people,

  15. An individual-based model of Zebrafish population dynamics accounting for energy dynamics

    DEFF Research Database (Denmark)

    Beaudouin, Remy; Goussen, Benoit; Piccini, Benjamin

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) w...

  16. Bioeconomic Modelling of Wetlands and Waterfowl in Western Canada: Accounting for Amenity Values

    NARCIS (Netherlands)

    Kooten, van G.C.; Whitey, P.; Wong, L.

    2011-01-01

    This study reexamines and updates an original bioeconomic model of optimal duck harvest and wetland retention by Hammack and Brown (1974, Waterfowl and Wetlands: Toward Bioeconomic Analysis. Washington, DC: Resources for the Future). It then extends the model to include the nonmarket (in situ) value

  17. Accounting for correlated observations in an age-based state-space stock assessment model

    DEFF Research Database (Denmark)

    Berg, Casper Willestofte; Nielsen, Anders

    2016-01-01

    Fish stock assessment models often rely on size- or age-specific observations that are assumed to be statistically independent of each other. In reality, these observations are not raw observations, but rather they are estimates from a catch-standardization model or similar summary statistics based

  18. Assigned value improves memory of proper names.

    Science.gov (United States)

    Festini, Sara B; Hartley, Alan A; Tauber, Sarah K; Rhodes, Matthew G

    2013-01-01

    Names are more difficult to remember than other personal information such as occupations. The current research examined the influence of assigned point value on memory and metamemory judgements for names and occupations to determine whether incentive can improve recall of proper names. In Experiment 1 participants studied face-name and face-occupation pairs assigned 1 or 10 points, made judgements of learning, and were given a cued recall test. High-value names were recalled more often than low-value names. However, recall of occupations was not influenced by value. In Experiment 2 meaningless nonwords were used for both names and occupations. The name difficulty disappeared, and value influenced recall of both names and occupations. Thus value similarly influenced names and occupations when meaningfulness was held constant. In Experiment 3 participants were required to use overt rote rehearsal for all items. Value did not boost recall of high-value names, suggesting that differential processing could not be implemented to improve memory. Thus incentives may improve memory for proper names by motivating people to engage in selective rehearsal and effortful elaborative processing.

  19. 7 CFR 1735.92 - Accounting considerations.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Accounting considerations. 1735.92 Section 1735.92... All Acquisitions and Mergers § 1735.92 Accounting considerations. (a) Proper accounting shall be... in the absence of such a commission, as required by RUS based on Generally Accepted Accounting...

  20. A Nonlinear Transmission Line Model of the Cochlea With Temporal Integration Accounts for Duration Effects in Threshold Fine Structure

    DEFF Research Database (Denmark)

    Verhey, Jesko L.; Mauermann, Manfred; Epp, Bastian

    2017-01-01

    than for long signals. The present study demonstrates how this effect can be captured by a nonlinear and active model of the cochlea in combination with a temporal integration stage. Since this cochlear model also accounts for fine structure and connected level-dependent effects, it is superior... For normal-hearing listeners, auditory pure-tone thresholds in quiet often show quasi-periodic fluctuations when measured with a high frequency resolution, referred to as threshold fine structure. Threshold fine structure is dependent on the stimulus duration, with smaller fluctuations for short

  1. An improved car-following model accounting for the preceding car's taillight

    Science.gov (United States)

    Zhang, Jian; Tang, Tie-Qiao; Yu, Shao-Wei

    2018-02-01

    During the deceleration process, the preceding car's taillight may influence its following car's driving behavior. In this paper, we propose an extended car-following model with consideration of the preceding car's taillight. Two typical situations are used to simulate each car's movement and study the effects of the preceding car's taillight on the driving behavior. Meanwhile, a sensitivity analysis of the model parameter is discussed in detail. The numerical results show that the proposed model can improve the stability of traffic flow and that traffic safety can be enhanced without a decrease in efficiency, especially when cars pass through a signalized intersection.

  2. An agent-based simulation model of patient choice of health care providers in accountable care organizations.

    Science.gov (United States)

    Alibrahim, Abdullah; Wu, Shinyi

    2018-03-01

    Accountable care organizations (ACO) in the United States show promise in controlling health care costs while preserving patients' choice of providers. Understanding the effects of patient choice is critical in novel payment and delivery models like ACO that depend on continuity of care and accountability. The financial, utilization, and behavioral implications associated with a patient's decision to forego local health care providers for more distant ones to access higher quality care remain unknown. To study this question, we used an agent-based simulation model of a health care market composed of providers able to form ACO serving patients and embedded it in a conditional logit decision model to examine patients capable of choosing their care providers. This simulation focuses on Medicare beneficiaries and their congestive heart failure (CHF) outcomes. We place the patient agents in an ACO delivery system model in which provider agents decide if they remain in an ACO and perform a quality improving CHF disease management intervention. Illustrative results show that allowing patients to choose their providers reduces the yearly payment per CHF patient by $320, reduces mortality rates by 0.12 percentage points and hospitalization rates by 0.44 percentage points, and marginally increases provider participation in ACO. This study demonstrates a model capable of quantifying the effects of patient choice in a theoretical ACO system and provides a potential tool for policymakers to understand implications of patient choice and assess potential policy controls.
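
    The conditional logit component can be sketched directly: each patient's choice probabilities over candidate providers come from utilities that trade off travel distance against quality. The coefficients and the three-provider example below are illustrative assumptions, not estimates from the Medicare CHF data.

```python
import numpy as np

def choice_probabilities(distance_km, quality, beta_dist=-0.08, beta_qual=1.2):
    """Conditional logit over candidate providers for one patient.

    Utility = beta_dist * distance + beta_qual * quality (illustrative
    coefficients, not estimates from the study)."""
    u = beta_dist * np.asarray(distance_km) + beta_qual * np.asarray(quality)
    expu = np.exp(u - u.max())          # subtract max for numerical stability
    return expu / expu.sum()

# three candidate providers: one local, two more distant but higher quality
p = choice_probabilities(distance_km=[5, 40, 60], quality=[0.4, 0.8, 0.9])
print("choice probabilities:", np.round(p, 3))
```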

  3. Accounting for partial sleep deprivation and cumulative sleepiness in the Three-Process Model of alertness regulation.

    Science.gov (United States)

    Akerstedt, Torbjörn; Ingre, Michael; Kecklund, Göran; Folkard, Simon; Axelsson, John

    2008-04-01

    Mathematical models designed to predict alertness or performance have been developed primarily as tools for evaluating work and/or sleep-wake schedules that deviate from the traditional daytime orientation. In general, these models cope well with the acute changes resulting from an abnormal sleep but have difficulties handling sleep restriction across longer periods. The reason is that the function representing recovery is too steep--usually exponentially so--and with increasing sleep loss, the steepness increases, resulting in too rapid recovery. The present study focused on refining the Three-Process Model of alertness regulation. We used an experiment with 4 h of sleep/night (nine participants) that included subjective self-ratings of sleepiness every hour. To evaluate the model at the individual subject level, a set of mixed-effect regression analyses were performed using subjective sleepiness as the dependent variable. These mixed models estimate a fixed effect (group mean) and a random effect that accounts for heterogeneity between participants in the overall level of sleepiness (i.e., a random intercept). Using this technique, a point was sought on the exponential recovery function that would explain maximum variance in subjective sleepiness by switching to a linear function. The resulting point explaining the highest amount of variance was 12.2 on the 1-21 unit scale. It was concluded that the accumulation of sleep loss effects on subjective sleepiness may be accounted for by making the recovery function linear below a certain point on the otherwise exponential function.
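
    The refinement described (an exponential recovery that switches to a linear segment below a breakpoint, so that accumulated sleep debt is paid back more slowly) can be sketched as a piecewise update. The constants below, including the breakpoint, are illustrative; the fitted model places the switch at 12.2 on the 1-21 sleepiness scale.

```python
import numpy as np

def recovery(s0, hours, upper=14.0, tau=4.3, breakpoint=7.0, dt=0.1):
    """Homeostatic recovery during sleep: exponential approach to the
    upper asymptote above `breakpoint`, linear below it, with the linear
    slope matched at the breakpoint for continuity.  Constants are
    illustrative, not the fitted Three-Process Model parameters."""
    linear_rate = (upper - breakpoint) / tau   # exponential slope at the breakpoint
    s, trace = s0, []
    for _ in np.arange(0.0, hours, dt):
        if s >= breakpoint:
            s += (upper - s) / tau * dt        # classic exponential recovery
        else:
            s += linear_rate * dt              # slower, linear payback of large sleep debt
        s = min(s, upper)
        trace.append(s)
    return np.array(trace)

print("alertness after 8 h, well rested vs. sleep deprived:",
      recovery(10.0, 8.0)[-1].round(2), recovery(2.0, 8.0)[-1].round(2))
```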

  4. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    Science.gov (United States)

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that this process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experiences. In contrast, goal-directed behaviors are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective of decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects some kinds of conflict, including a hypothetical cost-conflict one. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented. Copyright © 2015 Elsevier Ltd. All rights reserved.
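
    A common way to formalize the dual-controller idea is to blend model-free Q-learning with one-step model-based values from a learned transition and reward model, with a weighting that plays the role of the arbitration or control signal. The sketch below does only that; the conflict-monitoring component, the ERN/FRN mapping, and all parameter values are assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 2, 2
alpha, gamma_ = 0.1, 0.9

q_mf = np.zeros((n_states, n_actions))                    # habitual (model-free) values
T = np.ones((n_states, n_actions, n_states)) / n_states   # learned transition model
R = np.zeros((n_states, n_actions))                       # learned reward model

def q_mb(q):
    """One-step model-based values computed from the learned world model."""
    return R + gamma_ * T @ q.max(axis=1)

def act(state, w=0.5, beta=3.0):
    """Blend the two controllers; w stands in for the arbitration /
    cognitive-control signal (not the paper's conflict monitor)."""
    q = w * q_mb(q_mf)[state] + (1 - w) * q_mf[state]
    p = np.exp(beta * (q - q.max()))
    p /= p.sum()
    return rng.choice(n_actions, p=p)

# toy probabilistic task: action 0 is rewarded 80% of the time in either state
for _ in range(2000):
    s = rng.integers(n_states)
    a = act(s)
    r = float(rng.random() < (0.8 if a == 0 else 0.2))
    s_next = rng.integers(n_states)
    # model-free TD update plus simple running updates of the world model
    q_mf[s, a] += alpha * (r + gamma_ * q_mf[s_next].max() - q_mf[s, a])
    R[s, a] += alpha * (r - R[s, a])
    T[s, a] += alpha * (np.eye(n_states)[s_next] - T[s, a])

print("model-free Q values:\n", q_mf.round(2))
```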

  5. Toward a formalized account of attitudes: The Causal Attitude Network (CAN) Model

    NARCIS (Netherlands)

    Dalege, J.; Borsboom, D.; Harreveld, F. van; Berg, H. van den; Conner, M.; Maas, H.L.J. van der

    2016-01-01

    This article introduces the Causal Attitude Network (CAN) model, which conceptualizes attitudes as networks consisting of evaluative reactions and interactions between these reactions. Relevant evaluative reactions include beliefs, feelings, and behaviors toward the attitude object. Interactions

  6. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-Individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2018-04-01

    Modern model-based approaches to cardiac safety and efficacy assessment require accurate drug concentration-effect relationship establishment. Thus, knowledge of the active concentration of drugs in heart tissue is desirable along with inter-subject variability influence estimation. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The models were described with literature-derived parameters and written in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in the drug disposition within the heart. The model accounted for inter-individual variability. The estimates of optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model feasibility. The proposed structure can be tested with the goal of improving the patient-specific model-based cardiac safety assessment and offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  7. Accounting for scattering in the Landauer-Datta-Lundstrom transport model

    Directory of Open Access Journals (Sweden)

    Юрій Олексійович Кругляк

    2015-03-01

    Full Text Available Scattering of carriers in the LDL transport model is considered qualitatively, including the changes of the scattering times in the collision processes. The basic relationship between the transmission coefficient T and the average mean free path λ is derived for a 1D conductor. As an example, the experimental data for a Si MOSFET are analyzed with the use of various models of reliability.
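
    For a conductor of length L, the relation usually quoted in the LDL (Landauer-Datta-Lundstrom) framework connects the transmission coefficient to the mean free path for backscattering λ as follows (a standard result restated here for orientation, not a derivation from this particular article):

```latex
% Diffusive-to-ballistic crossover: T -> 1 when L << \lambda (ballistic)
% and T -> \lambda / L when L >> \lambda (diffusive).
T(L) = \frac{\lambda}{\lambda + L}
```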

  8. Carbon accounting and economic model uncertainty of emissions from biofuels-induced land use change.

    Science.gov (United States)

    Plevin, Richard J; Beckman, Jayson; Golub, Alla A; Witcover, Julie; O'Hare, Michael

    2015-03-03

    Few of the numerous published studies of the emissions from biofuels-induced "indirect" land use change (ILUC) attempt to propagate and quantify uncertainty, and those that have done so have restricted their analysis to a portion of the modeling systems used. In this study, we pair a global, computable general equilibrium model with a model of greenhouse gas emissions from land-use change to quantify the parametric uncertainty in the paired modeling system's estimates of greenhouse gas emissions from ILUC induced by expanded production of three biofuels. We find that for the three fuel systems examined--US corn ethanol, Brazilian sugar cane ethanol, and US soybean biodiesel--95% of the results occurred within ±20 g CO2e MJ(-1) of the mean (coefficient of variation of 20-45%), with economic model parameters related to crop yield and the productivity of newly converted cropland (from forestry and pasture) contributing most of the variance in estimated ILUC emissions intensity. Although the experiments performed here allow us to characterize parametric uncertainty, changes to the model structure have the potential to shift the mean by tens of grams of CO2e per megajoule and further broaden distributions for ILUC emission intensities.
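
    The uncertainty-propagation step can be illustrated with a Monte Carlo sketch: draw the uncertain parameters, push each draw through an emissions-intensity calculation, and summarize the spread. The toy formula and all distributions below are assumptions for illustration; the study itself propagates draws through a computable general equilibrium model paired with an emissions factor model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Illustrative uncertain inputs (assumed distributions, not the study's):
land_per_MJ = rng.lognormal(np.log(2e-5), 0.3, n)   # ha converted per MJ/yr of fuel
carbon_per_ha = rng.normal(120.0, 25.0, n)          # one-time release, t CO2e per ha
yield_offset = rng.uniform(0.2, 0.5, n)             # fraction of demand met by yield gains

# One-time land-conversion emissions amortized over a 30-year fuel production
# period, converted from tonnes to grams per MJ.
g_per_MJ = land_per_MJ * (1.0 - yield_offset) * carbon_per_ha * 1e6 / 30.0

lo, hi = np.percentile(g_per_MJ, [2.5, 97.5])
print(f"ILUC intensity ~ {g_per_MJ.mean():.0f} g CO2e/MJ "
      f"(95% interval {lo:.0f}-{hi:.0f}), CV {g_per_MJ.std()/g_per_MJ.mean():.0%}")
```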

  9. An extended continuum model accounting for the driver's timid and aggressive attributions

    International Nuclear Information System (INIS)

    Cheng, Rongjun; Ge, Hongxia; Wang, Jufeng

    2017-01-01

    Considering the driver's timid and aggressive behaviors simultaneously, a new continuum model is put forward in this paper. By applying linear stability theory, we present the analysis of the new model's linear stability. Through nonlinear analysis, the KdV–Burgers equation is derived to describe the density wave near the neutral stability line. Numerical results verify that aggressive driving is better than timid driving because the aggressive driver adjusts his speed in a timely manner according to the leading car's speed. The key improvement of this new model is that timid driving deteriorates traffic stability while aggressive driving enhances traffic stability. The relationship of energy consumption between aggressive and timid driving is also studied. Numerical results show that aggressive driver behavior can not only suppress traffic congestion but also reduce energy consumption. - Highlights: • A new continuum model is developed with consideration of the driver's timid and aggressive behaviors simultaneously. • Applying linear stability theory, the new model's linear stability is obtained. • Through nonlinear analysis, the KdV–Burgers equation is derived. • The energy consumption for this model is studied.

  10. An extended continuum model accounting for the driver's timid and aggressive attributions

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Rongjun; Ge, Hongxia [Faculty of Maritime and Transportation, Ningbo University, Ningbo 315211 (China); Jiangsu Province Collaborative Innovation Center for Modern Urban Traffic Technologies, Nanjing 210096 (China); National Traffic Management Engineering and Technology Research Centre Ningbo University Sub-centre, Ningbo 315211 (China); Wang, Jufeng, E-mail: wjf@nit.zju.edu.cn [Ningbo Institute of Technology, Zhejiang University, Ningbo 315100 (China)

    2017-04-18

    Considering the driver's timid and aggressive behaviors simultaneously, a new continuum model is put forward in this paper. By applying linear stability theory, we present the analysis of the new model's linear stability. Through nonlinear analysis, the KdV–Burgers equation is derived to describe the density wave near the neutral stability line. Numerical results verify that aggressive driving is better than timid driving because the aggressive driver adjusts his speed in a timely manner according to the leading car's speed. The key improvement of this new model is that timid driving deteriorates traffic stability while aggressive driving enhances traffic stability. The relationship of energy consumption between aggressive and timid driving is also studied. Numerical results show that aggressive driver behavior can not only suppress traffic congestion but also reduce energy consumption. - Highlights: • A new continuum model is developed with consideration of the driver's timid and aggressive behaviors simultaneously. • Applying linear stability theory, the new model's linear stability is obtained. • Through nonlinear analysis, the KdV–Burgers equation is derived. • The energy consumption for this model is studied.

  11. A Critical Examination of the Models Proposed to Account for Baryon-Antibaryon Segregation Following the Quark-Hadron Transition

    Science.gov (United States)

    Garfinkle, Moishe

    2015-04-01

    The major concern of the Standard Cosmological Model (SCM) is to account for the continuing existence of the universe in spite of the Standard Particle Model (SPM). According to the SPM, below the quark-hadron temperature (~150 +/- 50 MeV) the rate of baryon-antibaryon pair creation from γ radiation is in equilibrium with the rate of pair annihilation. At freeze-out (~20 +/- 10 MeV) the rate of pair creation ceases. Henceforth only annihilation occurs below this temperature, resulting in a terminal pair ratio B+/γ = B-/γ ~ 10^-18, insufficient to account for the present universe, which would require a pair ratio of at least B+/γ = B-/γ ~ 10^-10. The present universe could therefore not exist according to the SPM unless a mechanism segregated baryons from antibaryons before freeze-out. The SPM can be tweaked to accommodate the first two conditions, but all of the mechanisms proposed over the past sixty years for the third condition have failed: all baryon-number excursions devised were found to be reversible. It is the examination of these possible segregation mechanisms that is the subject of this work.

  12. Accounting for diffusion in agent based models of reaction-diffusion systems with application to cytoskeletal diffusion.

    Science.gov (United States)

    Azimi, Mohammad; Jamali, Yousef; Mofrad, Mohammad R K

    2011-01-01

    Diffusion plays a key role in many biochemical reaction systems seen in nature. Scenarios where diffusion behavior is critical can be seen in the cell and subcellular compartments where molecular crowding limits the interaction between particles. We investigate the application of a computational method for modeling the diffusion of molecules and macromolecules in three-dimensional solutions using agent based modeling. This method allows for realistic modeling of a system of particles with different properties such as size, diffusion coefficients, and affinity as well as the environment properties such as viscosity and geometry. Simulations using these movement probabilities yield behavior that mimics natural diffusion. Using this modeling framework, we simulate the effects of molecular crowding on effective diffusion and have validated the results of our model using Langevin dynamics simulations and note that they are in good agreement with previous experimental data. Furthermore, we investigate an extension of this framework where single discrete cells can contain multiple particles of varying size in an effort to highlight errors that can arise from discretization that lead to the unnatural behavior of particles undergoing diffusion. Subsequently, we explore various algorithms that differ in how they handle the movement of multiple particles per cell and suggest an algorithm that properly accommodates multiple particles of various sizes per cell that can replicate the natural behavior of these particles diffusing. Finally, we use the present modeling framework to investigate the effect of structural geometry on the directionality of diffusion in the cell cytoskeleton with the observation that parallel orientation in the structural geometry of actin filaments of filopodia and the branched structure of lamellipodia can give directionality to diffusion at the filopodia-lamellipodia interface.
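
    One ingredient the abstract describes, reduced effective diffusion under molecular crowding, can be reproduced with a much simpler lattice sketch: agents hop to a neighboring site only if it is empty, so the mean squared displacement (and hence the effective diffusion coefficient) drops as occupancy rises. This illustrates the excluded-volume effect only, not the authors' framework with multiple particles of varying size per cell.

```python
import numpy as np

def crowded_walk(n_agents, size=50, steps=500, seed=0):
    """Random walkers with excluded volume on a periodic lattice;
    returns an effective diffusion coefficient in lattice units."""
    rng = np.random.default_rng(seed)
    occ = np.zeros((size, size), dtype=bool)
    pos = np.empty((n_agents, 2), dtype=int)
    placed = 0
    while placed < n_agents:                       # distinct starting sites
        p = rng.integers(0, size, 2)
        if not occ[p[0], p[1]]:
            occ[p[0], p[1]] = True
            pos[placed] = p
            placed += 1
    moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
    disp = np.zeros((n_agents, 2))                 # unwrapped displacements
    for _ in range(steps):
        for i in rng.permutation(n_agents):
            step = moves[rng.integers(4)]
            tgt = (pos[i] + step) % size           # periodic boundaries
            if not occ[tgt[0], tgt[1]]:            # move only into empty sites
                occ[pos[i][0], pos[i][1]] = False
                occ[tgt[0], tgt[1]] = True
                pos[i] = tgt
                disp[i] += step
    msd = (disp ** 2).sum(axis=1).mean()
    return msd / (4.0 * steps)                     # 2D: MSD = 4 D t

print("effective D at  8% occupancy:", round(crowded_walk(200), 3))
print("effective D at 40% occupancy:", round(crowded_walk(1000), 3))
```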

  13. Mobility of domain walls in proper ferroelastic martensites

    Energy Technology Data Exchange (ETDEWEB)

    Barsch, G.R. [Pennsylvania State Univ., University Park, PA (United States). Materials Research Lab.]|[Pennsylvania State Univ., University Park, PA (United States). Dept. of Physics

    1995-12-01

    Based on the Landau-Ginzburg free energy functional for an O{sub h}-D{sub 4h} proper ferroelastic martensitic transformation the mobility of a (110) twin boundary in a large bicrystal has been calculated by including dissipation in the approximation of the phonon viscosity model and by solving the inverse boundary value problem for the limiting case of strong shear modulus softening. Application to actual materials requires determination of the phonon viscosity tensor from experimental ultrasonic attenuation or low frequency internal friction data after subtraction of "extrinsic" losses, especially those from pretransformation structural strain modulations ("tweed") and dislocations. Numerical application to V{sub 3}Si, the only proper ferroelastic martensite for which such experimental data pertaining to the soft [110]/[1 anti 10] shear mode are available, is discussed. (orig.).
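
    For orientation only, a generic one-component Landau-Ginzburg free-energy functional of the type underlying such proper ferroelastic descriptions can be sketched as follows; the actual O{sub h}-D{sub 4h} functional of the paper involves several symmetry-adapted strain components and gradient terms and is considerably more elaborate.

```latex
% Generic scalar-order-parameter sketch; e denotes a symmetry-adapted strain.
F[e] = \int_{V} \left[ \tfrac{1}{2} A(T)\, e^{2} + \tfrac{1}{4} B\, e^{4}
      + \tfrac{1}{6} C\, e^{6} + \tfrac{1}{2} g\, |\nabla e|^{2} \right] \mathrm{d}V
```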

  14. Accounting standards

    NARCIS (Netherlands)

    Stellinga, B.; Mügge, D.

    2014-01-01

    The European and global regulation of accounting standards have witnessed remarkable changes over the past twenty years. In the early 1990s, EU accounting practices were fragmented along national lines and US accounting standards were the de facto global standards. Since 2005, all EU listed

  15. Accounting outsourcing

    OpenAIRE

    Klečacká, Tereza

    2009-01-01

    This thesis gives a comprehensive view of accounting outsourcing and deals with the outsourcing process from its beginning (conditions of collaboration, drafting of the contract), through the collaboration itself, to its possible ending. The work defines outsourcing and indicates its main advantages, disadvantages and the arguments for using it. The main focus of the thesis is the practical side of accounting outsourcing and the provision of high-quality accounting services.

  16. Accounting for misclassification in electronic health records-derived exposures using generalized linear finite mixture models.

    Science.gov (United States)

    Hubbard, Rebecca A; Johnson, Eric; Chubak, Jessica; Wernli, Karen J; Kamineni, Aruna; Bogart, Andy; Rutter, Carolyn M

    2017-06-01

    Exposures derived from electronic health records (EHR) may be misclassified, leading to biased estimates of their association with outcomes of interest. An example of this problem arises in the context of cancer screening where test indication, the purpose for which a test was performed, is often unavailable. This poses a challenge to understanding the effectiveness of screening tests because estimates of screening test effectiveness are biased if some diagnostic tests are misclassified as screening. Prediction models have been developed for a variety of exposure variables that can be derived from EHR, but no previous research has investigated appropriate methods for obtaining unbiased association estimates using these predicted probabilities. The full likelihood incorporating information on both the predicted probability of exposure-class membership and the association between the exposure and outcome of interest can be expressed using a finite mixture model. When the regression model of interest is a generalized linear model (GLM), the expectation-maximization algorithm can be used to estimate the parameters using standard software for GLMs. Using simulation studies, we compared the bias and efficiency of this mixture model approach to alternative approaches including multiple imputation and dichotomization of the predicted probabilities to create a proxy for the missing predictor. The mixture model was the only approach that was unbiased across all scenarios investigated. Finally, we explored the performance of these alternatives in a study of colorectal cancer screening with colonoscopy. These findings have broad applicability in studies using EHR data where gold-standard exposures are unavailable and prediction models have been developed for estimating proxies.
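
    A hedged sketch of the finite-mixture likelihood described above, for a binary true exposure class C_i with EHR-predicted membership probability p_i and a GLM outcome density f, is (notation illustrative, not taken verbatim from the paper):

```latex
% Observed-data likelihood marginalizing over the unknown exposure class.
L(\beta) = \prod_{i=1}^{n} \Big[ \, p_{i}\, f\big(y_{i} \mid x_{i}, C_{i}=1;\, \beta\big)
          + (1 - p_{i})\, f\big(y_{i} \mid x_{i}, C_{i}=0;\, \beta\big) \Big]
```

    The expectation-maximization algorithm then alternates between updating the posterior class memberships and refitting the GLM with those memberships as weights.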

  17. Accounting for misclassified outcomes in binary regression models using multiple imputation with internal validation data.

    Science.gov (United States)

    Edwards, Jessie K; Cole, Stephen R; Troester, Melissa A; Richardson, David B

    2013-05-01

    Outcome misclassification is widespread in epidemiology, but methods to account for it are rarely used. We describe the use of multiple imputation to reduce bias when validation data are available for a subgroup of study participants. This approach is illustrated using data from 308 participants in the multicenter Herpetic Eye Disease Study between 1992 and 1998 (48% female; 85% white; median age, 49 years). The odds ratio comparing the acyclovir group with the placebo group on the gold-standard outcome (physician-diagnosed herpes simplex virus recurrence) was 0.62 (95% confidence interval (CI): 0.35, 1.09). We masked ourselves to physician diagnosis except for a 30% validation subgroup used to compare methods. Multiple imputation (odds ratio (OR) = 0.60; 95% CI: 0.24, 1.51) was compared with naive analysis using self-reported outcomes (OR = 0.90; 95% CI: 0.47, 1.73), analysis restricted to the validation subgroup (OR = 0.57; 95% CI: 0.20, 1.59), and direct maximum likelihood (OR = 0.62; 95% CI: 0.26, 1.53). In simulations, multiple imputation and direct maximum likelihood had greater statistical power than did analysis restricted to the validation subgroup, yet all 3 provided unbiased estimates of the odds ratio. The multiple-imputation approach was extended to estimate risk ratios using log-binomial regression. Multiple imputation has advantages regarding flexibility and ease of implementation for epidemiologists familiar with missing data methods.

  18. Unsupervised machine learning account of magnetic transitions in the Hubbard model

    Science.gov (United States)

    Ch'ng, Kelvin; Vazquez, Nick; Khatami, Ehsan

    2018-01-01

    We employ several unsupervised machine learning techniques, including autoencoders, random trees embedding, and t -distributed stochastic neighboring ensemble (t -SNE), to reduce the dimensionality of, and therefore classify, raw (auxiliary) spin configurations generated, through Monte Carlo simulations of small clusters, for the Ising and Fermi-Hubbard models at finite temperatures. Results from a convolutional autoencoder for the three-dimensional Ising model can be shown to produce the magnetization and the susceptibility as a function of temperature with a high degree of accuracy. Quantum fluctuations distort this picture and prevent us from making such connections between the output of the autoencoder and physical observables for the Hubbard model. However, we are able to define an indicator based on the output of the t -SNE algorithm that shows a near perfect agreement with the antiferromagnetic structure factor of the model in two and three spatial dimensions in the weak-coupling regime. t -SNE also predicts a transition to the canted antiferromagnetic phase for the three-dimensional model when a strong magnetic field is present. We show that these techniques cannot be expected to work away from half filling when the "sign problem" in quantum Monte Carlo simulations is present.
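
    A minimal sketch of the dimensionality-reduction step described above, using scikit-learn's t-SNE on synthetic spin configurations in place of the authors' Monte Carlo data (the random inputs and parameter values are placeholders):

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(42)

# Placeholder "spin configurations": each row is a flattened +/-1 configuration;
# real inputs would be auxiliary-field or Ising configurations from Monte Carlo.
n_samples, n_sites = 400, 64
configs = rng.choice([-1.0, 1.0], size=(n_samples, n_sites))

# Embed each configuration into two dimensions for unsupervised classification.
embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(configs)

print(embedding.shape)  # (400, 2); points would then be colored by temperature
```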

  19. Precedent Proper Names in Informal Oikonymy

    Directory of Open Access Journals (Sweden)

    Maria V. Akhmetova

    2013-06-01

    Full Text Available The paper deals with the Russian language informal city names (oikonyms) motivated by other toponyms (with reference to Russia and the CIS). The author shows that the motivating proper name can replace the city name (e. g. Глазго < Glasgow ‘Glazov’) or contaminate with it (e. g. Экибостон < Ekibastuz + Boston), the “alien” onym being attracted to construct an informal oikonym due to its phonetic similarity or, on occasion, due to an affinity, either real or imaginary, between the two settlements. The author argues that the phonetic motivation is more characteristic for the modern urban tradition than for popular dialects.

  20. Spatial modelling and ecosystem accounting for land use planning: addressing deforestation and oil palm expansion in Central Kalimantan, Indonesia

    OpenAIRE

    Sumarga, E.

    2015-01-01

    Ecosystem accounting is a new area of environmental economic accounting that aims to measure ecosystem services in a way that is in line with national accounts. The key characteristics of ecosystem accounting include the extension of the valuation boundary of the System of National Accounts, allowing the inclusion of a broader set of ecosystem service types such as regulating services and cultural services. Consistent with the principles of national accounts, ecosystem accounting focuses on asse...

  1. A comparison of land use change accounting methods: seeking common grounds for key modeling choices in biofuel assessments

    DEFF Research Database (Denmark)

    de Bikuna Salinas, Koldo Saez; Hamelin, Lorie; Hauschild, Michael Zwicky

    2018-01-01

    Five currently used methods to account for the global warming (GW) impact of the induced land-use change (LUC) greenhouse gas (GHG) emissions have been applied to four biofuel case studies. Two of the investigated methods attempt to avoid the need of considering a definite occupation -thus...... amortization period by considering ongoing LUC trends as a dynamic baseline. This leads to the accounting of a small fraction (0.8%) of the related emissions from the assessed LUC, thus their validity is disputed. The comparison of methods and contrasting case studies illustrated the need of clearly...... distinguishing between the different time horizons involved in life cycle assessments (LCA) of land-demanding products like biofuels. Absent in ISO standards, and giving rise to several confusions, definitions for the following time horizons have been proposed: technological scope, inventory model, impact...

  2. Model of Environmental Development of the Urbanized Areas: Accounting of Ecological and other Factors

    Science.gov (United States)

    Abanina, E. N.; Pandakov, K. G.; Agapov, D. A.; Sorokina, Yu V.; Vasiliev, E. H.

    2017-05-01

    Modern cities and towns are often characterized by poor administration, which can lead to environmental degradation, growing poverty, declining economic growth and social isolation. In these circumstances it is important to conduct fresh research that shapes new approaches to the sustainable development of administrative districts. Such development of urban areas depends on many interdependent factors: ecological, economic and social. In this article we present theoretical aspects of forming a model of environmental development of urbanized areas. We propose a model containing four levels: the natural resource capacity of the territory, its social features, economic growth, and human impact. We describe the interrelations of the elements of the model and offer a programme of environmental development of a city that could be used in any urban area.

  3. Mathematical modeling of pigment dispersion taking into account the full agglomerate particle size distribution

    DEFF Research Database (Denmark)

    Kiil, Søren

    2017-01-01

    . The only adjustable parameter used was an apparent rate constant for the linear agglomerate erosion rate. Model simulations, at selected values of time, for the full agglomerate particle size distribution were in good qualitative agreement with the measured values. A quantitative match of the experimental...... particle size distribution was simulated. Data from two previous experimental investigations were used for model validation. The first concerns two different yellow organic pigments dispersed in nitrocellulose/ethanol vehicles in a ball mill and the second a red organic pigment dispersed in a solvent...... particle size distributions could be obtained using time-dependent fragment distributions, but this resulted in a very slight improvement in the simulated transient mean diameter only. The model provides a mechanistic understanding of the agglomerate breakage process that can be used, e...

  4. Multiphysics Model of Palladium Hydride Isotope Exchange Accounting for Higher Dimensionality

    Energy Technology Data Exchange (ETDEWEB)

    Gharagozloo, Patricia E.; Eliassi, Mehdi; Bon, Bradley Luis

    2015-03-01

    This report summarizes computational model development and simulation results for a series of isotope exchange dynamics experiments including long and thin isothermal beds similar to the Foltz and Melius beds and a larger non-isothermal experiment on the NENG7 test bed. The multiphysics 2D axi-symmetric model simulates the temperature and pressure dependent exchange reaction kinetics, pressure and isotope dependent stoichiometry, heat generation from the reaction, reacting gas flow through porous media, and non-uniformities in the bed permeability. The new model is now able to replicate the curved reaction front and asymmetry of the exit gas mass fractions over time. The improved understanding of the exchange process and its dependence on the non-uniform bed properties and temperatures in these larger systems is critical to the future design of such systems.

  5. Taking dietary habits into account: A computational method for modeling food choices that goes beyond price.

    Directory of Open Access Journals (Sweden)

    Rahmatollah Beheshti

    Full Text Available Computational models have gained popularity as a predictive tool for assessing proposed policy changes affecting dietary choice. Specifically, they have been used for modeling dietary changes in response to economic interventions, such as price and income changes. Herein, we present a novel addition to this type of model by incorporating habitual behaviors that drive individuals to maintain or conform to prior eating patterns. We examine our method in a simulated case study of food choice behaviors of low-income adults in the US. We use data from several national datasets, including the National Health and Nutrition Examination Survey (NHANES), the US Bureau of Labor Statistics and the USDA, to parameterize our model and develop predictive capabilities in 1) quantifying the influence of prior diet preferences when food budgets are increased and 2) simulating the income elasticities of demand for four food categories. Food budgets can increase because of greater affordability (due to food aid and other nutritional assistance programs), or because of higher income. Our model predictions indicate that low-income adults consume unhealthy diets when they have highly constrained budgets, but that even after budget constraints are relaxed, these unhealthy eating behaviors are maintained. Specifically, diets in this population, before and after changes in food budgets, are characterized by relatively low consumption of fruits and vegetables and high consumption of fat. The model results for income elasticities also show almost no change in consumption of fruit and fat in response to changes in income, which is in agreement with data from the World Bank's International Comparison Program (ICP). Hence, the proposed method can be used in assessing the influences of habitual dietary patterns on the effectiveness of food policies.

  6. Taking dietary habits into account: A computational method for modeling food choices that goes beyond price.

    Science.gov (United States)

    Beheshti, Rahmatollah; Jones-Smith, Jessica C; Igusa, Takeru

    2017-01-01

    Computational models have gained popularity as a predictive tool for assessing proposed policy changes affecting dietary choice. Specifically, they have been used for modeling dietary changes in response to economic interventions, such as price and income changes. Herein, we present a novel addition to this type of model by incorporating habitual behaviors that drive individuals to maintain or conform to prior eating patterns. We examine our method in a simulated case study of food choice behaviors of low-income adults in the US. We use data from several national datasets, including the National Health and Nutrition Examination Survey (NHANES), the US Bureau of Labor Statistics and the USDA, to parameterize our model and develop predictive capabilities in 1) quantifying the influence of prior diet preferences when food budgets are increased and 2) simulating the income elasticities of demand for four food categories. Food budgets can increase because of greater affordability (due to food aid and other nutritional assistance programs), or because of higher income. Our model predictions indicate that low-income adults consume unhealthy diets when they have highly constrained budgets, but that even after budget constraints are relaxed, these unhealthy eating behaviors are maintained. Specifically, diets in this population, before and after changes in food budgets, are characterized by relatively low consumption of fruits and vegetables and high consumption of fat. The model results for income elasticities also show almost no change in consumption of fruit and fat in response to changes in income, which is in agreement with data from the World Bank's International Comparison Program (ICP). Hence, the proposed method can be used in assessing the influences of habitual dietary patterns on the effectiveness of food policies.
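
    As a small, self-contained illustration of the income-elasticity quantity the model simulates, the arc (midpoint) elasticity of demand can be computed as below; the numbers are hypothetical and not taken from the study.

```python
def arc_income_elasticity(q0: float, q1: float, m0: float, m1: float) -> float:
    """Arc elasticity: percent change in quantity over percent change in income,
    using midpoint averages so the result is symmetric in the two points."""
    dq = (q1 - q0) / ((q1 + q0) / 2.0)
    dm = (m1 - m0) / ((m1 + m0) / 2.0)
    return dq / dm

# Hypothetical example: weekly fruit servings before/after a 10% income increase.
print(arc_income_elasticity(q0=5.0, q1=5.1, m0=300.0, m1=330.0))  # ~0.21, i.e. inelastic
```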

  7. Improved signal model for confocal sensors accounting for object depending artifacts.

    Science.gov (United States)

    Mauch, Florian; Lyda, Wolfram; Gronle, Marc; Osten, Wolfgang

    2012-08-27

    The conventional signal model of confocal sensors is well established and has proven to be exceptionally robust especially when measuring rough surfaces. Its physical derivation however is explicitly based on plane surfaces or point like objects, respectively. Here we show experimental results of a confocal point sensor measurement of a surface standard. The results illustrate the rise of severe artifacts when measuring curved surfaces. On this basis, we present a systematic extension of the conventional signal model that is proven to be capable of qualitatively explaining these artifacts.

  8. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models.

    Science.gov (United States)

    Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik

    2017-12-15

    Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point-estimates and distributions of time-to-event and health economic outcomes. To assess sample size impact on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case-study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point-estimates and different cost-effectiveness acceptability curves. Although both approaches performed similar for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
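
    A hedged sketch of the first (bootstrap) approach described above is given below: patient-level data are resampled with replacement, a parametric time-to-event distribution is refitted to each resample, and every bootstrap replicate then supplies one correlated parameter set for a probabilistic sensitivity analysis iteration. The Weibull choice, sample data, and sizes are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Placeholder patient-level time-to-event data (e.g., months); a real analysis
# would use the observed individual patient data instead.
times = stats.weibull_min.rvs(c=1.4, scale=12.0, size=200, random_state=rng)

n_boot = 1000
params = np.empty((n_boot, 2))            # columns: shape, scale
for b in range(n_boot):
    resample = rng.choice(times, size=times.size, replace=True)
    shape, _, scale = stats.weibull_min.fit(resample, floc=0)  # location fixed at 0
    params[b] = (shape, scale)

# Each row of `params` defines one plausible Weibull from which patient-level
# times can be drawn in one probabilistic sensitivity analysis iteration.
print(params.mean(axis=0), params.std(axis=0))
```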

  9. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models

    Directory of Open Access Journals (Sweden)

    Koen Degeling

    2017-12-01

    Full Text Available Abstract Background Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Methods Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point-estimates and distributions of time-to-event and health economic outcomes. To assess sample size impact on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case-study. Results Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point-estimates and different cost-effectiveness acceptability curves. Although both approaches performed similar for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Conclusions Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.

  10. Creative Accounting Model for Increasing Banking Industries’ Competitive Advantage in Indonesia (P.197-207)

    Directory of Open Access Journals (Sweden)

    Supriyati Supriyati

    2017-01-01

    Full Text Available Bank Indonesia demands that the national banks should improve their transparency of financial condition and performance for the public in line with the development of their products and activities. Furthermore, the banks’ financial statements of Bank Indonesia have become the basis for determining the status of their soundness. In fact, they tend to practice earnings management in order that they can meet the criteria required by Bank Indonesia. For internal purposes, the initiative of earning management has a positive impact on the performance of management. However, for the users of financial statements, it may differ, for example for the value of company, length of time the financial audit, and other aspects of tax evasion by the banks. This study tries to find out 1) the effect of GCG on Earnings Management, 2) the effect of earning management on Company value, the Audit Report Lag, and Taxation, and 3) the effect of Audit Report Lag on Corporate Value and Taxation. This is a quantitative research with the data collected from the bank financial statements, GCG implementation report, and the banks’ annual reports of 2003-2013. There were 41 banks taken using purposive sampling, as listed on the Indonesia Stock Exchange. The results showed that the implementation of GCG affects the occurrence of earning management. Accounting policy flexibility through earning management is expected to affect the length of the audit process and the accuracy of the financial statements presentation on public side. This research is expected to provide managerial implications in order to consider the possibility of earnings management practices in the banking industry. In the long term, earning management is expected to improve the banks’ competitiveness through an increase in the value of the company. Explicitly, earning management also affects the tax avoidance; therefore, the banks intend to pay lower taxes without breaking the existing taxation legislation.

  11. Creative Accounting Model for Increasing Banking Industries’ Competitive Advantage in Indonesia

    Directory of Open Access Journals (Sweden)

    Supriyati

    2015-12-01

    Full Text Available Bank Indonesia demands that the national banks should improve their transparency of financial condition and performance for the public in line with the development of their products and activities. Furthermore, the banks’ financial statements of Bank Indonesia have become the basis for determining the status of their soundness. In fact, they tend to practice earnings management in order that they can meet the criteria required by Bank Indonesia. For internal purposes, the initiative of earning management has a positive impact on the performance of management. However, for the users of financial statements, it may differ, for example for the value of company, length of time the financial audit, and other aspects of tax evasion by the banks. This study tries to find out 1) the effect of GCG on Earnings Management, 2) the effect of earning management on Company value, the Audit Report Lag, and Taxation, and 3) the effect of Audit Report Lag on Corporate Value and Taxation. This is a quantitative research with the data collected from the bank financial statements, GCG implementation report, and the banks’ annual reports of 2003-2013. There were 41 banks taken using purposive sampling, as listed on the Indonesia Stock Exchange. The results showed that the implementation of GCG affects the occurrence of earning management. Accounting policy flexibility through earning management is expected to affect the length of the audit process and the accuracy of the financial statements presentation on public side. This research is expected to provide managerial implications in order to consider the possibility of earnings management practices in the banking industry. In the long term, earning management is expected to improve the banks’ competitiveness through an increase in the value of the company. Explicitly, earning management also affects the tax avoidance; therefore, the banks intend to pay lower taxes without breaking the existing taxation legislation.

  12. Current-account effects of a devaluation in an optimizing model with capital accumulation

    DEFF Research Database (Denmark)

    Nielsen, Søren Bo

    1991-01-01

    This article explores the consequences of a devaluation in the context of a ‘real', optimizing model of a small open economy. What provides for real effects of the devaluation is the existence of nominal wage stickiness during a contract period. We show that if this contract period is relatively...... assets and of capital...

  13. Summary of model to account for inhibition of CAM corrosion by porous ceramic coating

    Energy Technology Data Exchange (ETDEWEB)

    Hopper, R., LLNL

    1998-03-31

    Corrosion occurs during five characteristic periods or regimes. These are summarized below. For more detailed discussion, see the attached Memorandum by Robert Hopper entitled "Ceramic Barrier Performance Model, Version 1.0, Description of Initial PA Input" and dated March 30, 1998.

  14. Methods for Accounting for Co-Teaching in Value-Added Models. Working Paper

    Science.gov (United States)

    Hock, Heinrich; Isenberg, Eric

    2012-01-01

    Isolating the effect of a given teacher on student achievement (value-added modeling) is complicated when the student is taught the same subject by more than one teacher. We consider three methods, which we call the Partial Credit Method, Teacher Team Method, and Full Roster Method, for estimating teacher effects in the presence of co-teaching.…

  15. Accounting for false-positive acoustic detections of bats using occupancy models

    Science.gov (United States)

    Clement, Matthew J.; Rodhouse, Thomas J.; Ormsbee, Patricia C.; Szewczak, Joseph M.; Nichols, James D.

    2014-01-01

    1. Acoustic surveys have become a common survey method for bats and other vocal taxa. Previous work shows that bat echolocation may be misidentified, but common analytic methods, such as occupancy models, assume that misidentifications do not occur. Unless rare, such misidentifications could lead to incorrect inferences with significant management implications.

  16. Practical Model for First Hyperpolarizability Dispersion Accounting for Both Homogeneous and Inhomogeneous Broadening Effects.

    Science.gov (United States)

    Campo, Jochen; Wenseleers, Wim; Hales, Joel M; Makarov, Nikolay S; Perry, Joseph W

    2012-08-16

    A practical yet accurate dispersion model for the molecular first hyperpolarizability β is presented, incorporating both homogeneous and inhomogeneous line broadening because these affect the β dispersion differently, even if they are indistinguishable in linear absorption. Consequently, combining the absorption spectrum with one free shape-determining parameter Γinhom, the inhomogeneous line width, turns out to be necessary and sufficient to obtain a reliable description of the β dispersion, requiring no information on the homogeneous (including vibronic) and inhomogeneous line broadening mechanisms involved, providing an ideal model for practical use in extrapolating experimental nonlinear optical (NLO) data. The model is applied to the efficient NLO chromophore picolinium quinodimethane, yielding an excellent fit of the two-photon resonant wavelength-dependent data and a dependable static value β0 = 316 × 10⁻³⁰ esu. Furthermore, we show that including a second electronic excited state in the model does yield an improved description of the NLO data at shorter wavelengths but has only limited influence on β0.

  17. Accountability in Training Transfer: Adapting Schlenker's Model of Responsibility to a Persistent but Solvable Problem

    Science.gov (United States)

    Burke, Lisa A.; Saks, Alan M.

    2009-01-01

    Decades have been spent studying training transfer in organizational environments in recognition of a transfer problem in organizations. Theoretical models of various antecedents, empirical studies of transfer interventions, and studies of best practices have all been advanced to address this continued problem. Yet a solution may not be so…

  18. Small strain multiphase-field model accounting for configurational forces and mechanical jump conditions

    Science.gov (United States)

    Schneider, Daniel; Schoof, Ephraim; Tschukin, Oleg; Reiter, Andreas; Herrmann, Christoph; Schwab, Felix; Selzer, Michael; Nestler, Britta

    2017-08-01

    Computational models based on the phase-field method have become an essential tool in material science and physics in order to investigate materials with complex microstructures. The models typically operate on a mesoscopic length scale resolving structural changes of the material and provide valuable information about the evolution of microstructures and mechanical property relations. For many interesting and important phenomena, such as martensitic phase transformation, mechanical driving forces play an important role in the evolution of microstructures. In order to investigate such physical processes, an accurate calculation of the stresses and the strain energy in the transition region is indispensable. We recall a multiphase-field elasticity model based on the force balance and the Hadamard jump condition at the interface. We show the quantitative characteristics of the model by comparing the stresses, strains and configurational forces with theoretical predictions in two-phase cases and with results from sharp interface calculations in a multiphase case. As an application, we choose the martensitic phase transformation process in multigrain systems and demonstrate the influence of the local homogenization scheme within the transition regions on the resulting microstructures.
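
    For reference, the mechanical jump conditions referred to above can be stated in their standard small-strain form at a singular interface with unit normal n (generic statement, independent of the particular multiphase-field formulation in the paper):

```latex
% Traction continuity (force balance) and the Hadamard compatibility condition,
% with a the jump amplitude vector and n the interface normal.
[\![\boldsymbol{\sigma}]\!]\,\boldsymbol{n} = \boldsymbol{0},
\qquad
[\![\boldsymbol{\varepsilon}]\!] = \tfrac{1}{2}\big( \boldsymbol{a}\otimes\boldsymbol{n}
   + \boldsymbol{n}\otimes\boldsymbol{a} \big)
```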

  19. Shadow Segmentation and Augmentation Using α-overlay Models that Account for Penumbra

    DEFF Research Database (Denmark)

    Nielsen, Michael; Madsen, Claus B.

    2006-01-01

    This paper introduces a new concept within shadow segmentation. Previously, an image is considered to consist of shadow and non-shadow regions. Thus, a binary mask is estimated using various heuristics regarding structural and retinex/color constancy theories. We wish to model natural shadows so...

  20. An Exemplar-Model Account of Feature Inference from Uncertain Categorizations

    Science.gov (United States)

    Nosofsky, Robert M.

    2015-01-01

    In a highly systematic literature, researchers have investigated the manner in which people make feature inferences in paradigms involving uncertain categorizations (e.g., Griffiths, Hayes, & Newell, 2012; Murphy & Ross, 1994, 2007, 2010a). Although researchers have discussed the implications of the results for models of categorization and…

  1. Evaluation of alternative surface runoff accounting procedures using the SWAT model

    Science.gov (United States)

    For surface runoff estimation in the Soil and Water Assessment Tool (SWAT) model, the curve number (CN) procedure is commonly adopted to calculate surface runoff by utilizing antecedent soil moisture condition (SCSI) in field. In the recent version of SWAT (SWAT2005), an alternative approach is ava...
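
    As background on the curve number step mentioned above, a hedged sketch of the standard SCS-CN runoff relation is given below (textbook form in SI units, not SWAT source code):

```python
def scs_cn_runoff(precip_mm: float, cn: float, ia_ratio: float = 0.2) -> float:
    """Standard SCS curve number runoff depth (mm) for a storm of depth precip_mm.

    S is the potential maximum retention and Ia the initial abstraction,
    commonly taken as 0.2*S; runoff is zero until precipitation exceeds Ia.
    """
    s = 25400.0 / cn - 254.0          # retention parameter, mm
    ia = ia_ratio * s                 # initial abstraction, mm
    if precip_mm <= ia:
        return 0.0
    return (precip_mm - ia) ** 2 / (precip_mm - ia + s)

# Hypothetical example: a 50 mm storm on a soil/land-use combination with CN = 75.
print(scs_cn_runoff(50.0, 75.0))      # roughly 9 mm of runoff
```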

  2. Using state-and-transition modeling to account for imperfect detection in invasive species management

    Science.gov (United States)

    Frid, Leonardo; Holcombe, Tracy; Morisette, Jeffrey T.; Olsson, Aaryn D.; Brigham, Lindy; Bean, Travis M.; Betancourt, Julio L.; Bryan, Katherine

    2013-01-01

    Buffelgrass, a highly competitive and flammable African bunchgrass, is spreading rapidly across both urban and natural areas in the Sonoran Desert of southern and central Arizona. Damages include increased fire risk, losses in biodiversity, and diminished revenues and quality of life. Feasibility of sustained and successful mitigation will depend heavily on rates of spread, treatment capacity, and cost–benefit analysis. We created a decision support model for the wildland–urban interface north of Tucson, AZ, using a spatial state-and-transition simulation modeling framework, the Tool for Exploratory Landscape Scenario Analyses. We addressed the issues of undetected invasions, identifying potentially suitable habitat and calibrating spread rates, while answering questions about how to allocate resources among inventory, treatment, and maintenance. Inputs to the model include a state-and-transition simulation model to describe the succession and control of buffelgrass, a habitat suitability model, management planning zones, spread vectors, estimated dispersal kernels for buffelgrass, and maps of current distribution. Our spatial simulations showed that without treatment, buffelgrass infestations that started with as little as 80 ha (198 ac) could grow to more than 6,000 ha by the year 2060. In contrast, applying unlimited management resources could limit 2060 infestation levels to approximately 50 ha. The application of sufficient resources toward inventory is important because undetected patches of buffelgrass will tend to grow exponentially. In our simulations, areas affected by buffelgrass may increase substantially over the next 50 yr, but a large, upfront investment in buffelgrass control could reduce the infested area and overall management costs.

  3. Making Collaborative Innovation Accountable

    DEFF Research Database (Denmark)

    Sørensen, Eva

    The public sector is increasingly expected to be innovative, but the price for a more innovative public sector might be that it becomes difficult to hold public authorities to account for their actions. The article explores the tensions between innovative and accountable governance, describes the foundation for these tensions in different accountability models, and suggests directions to take in analyzing the accountability of collaborative innovation processes.

  4. Choice of Measurement Locations of Nonlinear Structures Using Proper Orthogonal Modes and Effective Independence Distribution Vector

    Directory of Open Access Journals (Sweden)

    T. G. Ritto

    2014-01-01

    Full Text Available This paper proposes a methodology to automatically choose the measurement locations of a nonlinear structure/equipment that needs to be monitored while operating. The response of the computational model (or experimental data) is used to construct the proper orthogonal modes by applying the proper orthogonal decomposition (POD), and the effective independence distribution vector (EIDV) procedure is employed to eliminate, iteratively, locations that contribute less to the independence of the target proper orthogonal modes.
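
    A hedged numerical sketch of the two ingredients named above follows: proper orthogonal modes obtained from response snapshots via the singular value decomposition, and an effective-independence style elimination loop that iteratively drops the candidate location contributing least to the independence of the retained modes. The data are synthetic and the implementation is illustrative, not the authors'.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic snapshot matrix: rows = candidate measurement locations (DOFs),
# columns = time samples of the (nonlinear) response.
n_dofs, n_snapshots, n_modes = 40, 200, 4
snapshots = rng.standard_normal((n_dofs, n_snapshots))

# Proper orthogonal modes = left singular vectors of the snapshot matrix.
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
phi = U[:, :n_modes]                      # one row per candidate location

keep = list(range(n_dofs))
n_sensors = 8
while len(keep) > n_sensors:
    phi_k = phi[keep, :]
    # Effective independence of each remaining location: diagonal of the
    # projection matrix phi_k (phi_k^T phi_k)^-1 phi_k^T.
    fisher_inv = np.linalg.inv(phi_k.T @ phi_k)
    efi = np.einsum("ij,jk,ik->i", phi_k, fisher_inv, phi_k)
    keep.pop(int(np.argmin(efi)))         # drop the least informative location

print("selected measurement locations:", keep)
```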

  5. Conditional models accounting for regression to the mean in observational multi-wave panel studies on alcohol consumption.

    Science.gov (United States)

    Ripatti, Samuli; Mäkelä, Pia

    2008-01-01

    To develop statistical methodology needed for studying whether effects of an acute-onset intervention differ by consumption group that accounts correctly for the effect of regression to the mean (RTM) in observational panel studies with three or more measurement waves. A general statistical modelling framework, based on conditional models, is presented for analysing alcohol panel data with three or more measurements, that models the dependence between initial drinking level and change in consumption controlling for RTM. The method is illustrated by panel data from Finland, southern Sweden and Denmark, where the effects of large changes in alcohol taxes and travellers' allowances were studied. The suggested model allows for drawing statistical inference of the parameters of interest and also the identification of non-linear effects of an intervention by initial consumption using standard statistical software modelling tools. There was no evidence in any of the countries of the changes being larger among heavy drinkers, but in southern Sweden there was evidence that light drinkers raised their level of consumption. Conditional models are a versatile modelling framework that offers a flexible tool for modelling and testing changes due to intervention in consumption by initial consumption while controlling simultaneously for RTM.

  6. Tree biomass in the Swiss landscape: nationwide modelling for improved accounting for forest and non-forest trees.

    Science.gov (United States)

    Price, B; Gomez, A; Mathys, L; Gardi, O; Schellenberger, A; Ginzler, C; Thürig, E

    2017-03-01

    Trees outside forest (TOF) can perform a variety of social, economic and ecological functions including carbon sequestration. However, detailed quantification of tree biomass is usually limited to forest areas. Taking advantage of structural information available from stereo aerial imagery and airborne laser scanning (ALS), this research models tree biomass using national forest inventory data and linear least-square regression and applies the model both inside and outside of forest to create a nationwide model for tree biomass (above ground and below ground). Validation of the tree biomass model against TOF data within settlement areas shows relatively low model performance (R² of 0.44) but still a considerable improvement on current biomass estimates used for greenhouse gas inventory and carbon accounting. We demonstrate an efficient and easily implementable approach to modelling tree biomass across a large heterogeneous nationwide area. The model offers significant opportunity for improved estimates on land use combination categories (CC) where tree biomass has either not been included or only roughly estimated until now. The ALS biomass model also offers the advantage of providing greater spatial resolution and greater within CC spatial variability compared to the current nationwide estimates.

  7. Spatial modelling and ecosystem accounting for land use planning: addressing deforestation and oil palm expansion in Central Kalimantan, Indonesia

    NARCIS (Netherlands)

    Sumarga, E.

    2015-01-01

    Ecosystem accounting is a new area of environmental economic accounting that aims to measure ecosystem services in a way that is in line with national accounts. The key characteristics of ecosystem accounting include the extension of the valuation boundary of the System of National Accounts,

  8. Spatial modelling and ecosystem accounting for land use planning: addressing deforestation and oil palm expansion in Central Kalimantan, Indonesia

    NARCIS (Netherlands)

    Sumarga, E.

    2015-01-01

    Ecosystem accounting is a new area of environmental economic accounting that aims to measure ecosystem services in a way that is in line with national accounts. The key characteristics of ecosystem accounting include the extension of the valuation boundary of the System of National Accounts,

  9. Accountability and non-proliferation nuclear regime: a review of the mutual surveillance Brazilian-Argentine model for nuclear safeguards; Accountability e regime de nao proliferacao nuclear: uma avaliacao do modelo de vigilancia mutua brasileiro-argentina de salvaguardas nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Xavier, Roberto Salles

    2014-08-01

    The regimes of accountability, the organizations of global governance, and the institutional arrangements for the global governance of nuclear non-proliferation and of the Brazilian-Argentine Mutual Vigilance model of Nuclear Safeguards are the subject of this research. The starting point is the importance of the institutional model of global governance for the effective control of the non-proliferation of nuclear weapons. In this context, the research investigates how the current international nuclear non-proliferation arrangements are structured and how the Brazilian-Argentine Mutual Vigilance model of Nuclear Safeguards performs in relation to the accountability regimes of global governance. To that end, the current literature on three theoretical dimensions was reviewed: accountability, global governance, and global governance organizations. The research method was the case study, and the data were treated by content analysis. The results made it possible to establish an evaluation model based on accountability mechanisms; to assess how the Brazilian-Argentine Mutual Vigilance model of Nuclear Safeguards behaves in relation to the proposed accountability regime; and to measure the degree to which regional arrangements that work with systems of global governance can strengthen these international systems. (author)

  10. Refining Sunrise/set Prediction Models by Accounting for the Effects of Refraction

    Science.gov (United States)

    Wilson, Teresa; Bartlett, Jennifer L.

    2016-01-01

    Current atmospheric models used to predict the times of sunrise and sunset have an error of one to four minutes at mid-latitudes (0° - 55° N/S). At higher latitudes, slight changes in refraction may cause significant discrepancies, including even determining whether the Sun appears to rise or set. While different components of refraction are known, how they affect predictions of sunrise/set has not yet been quantified. A better understanding of the contributions from temperature profile, pressure, humidity, and aerosols could significantly improve the standard prediction. Because sunrise/set times and meteorological data from multiple locations will be necessary for a thorough investigation of the problem, we will collect this data using smartphones as part of a citizen science project. This analysis will lead to more complete models that will provide more accurate times for navigators and outdoorsmen alike.

  11. Why does placing the question before an arithmetic word problem improve performance? A situation model account.

    Science.gov (United States)

    Thevenot, Catherine; Devidal, Michel; Barrouillet, Pierre; Fayol, Michel

    2007-01-01

    The aim of this paper is to investigate the controversial issue of the nature of the representation constructed by individuals to solve arithmetic word problems. More precisely, we consider the relevance of two different theories: the situation or mental model theory (Johnson-Laird, 1983; Reusser, 1989) and the schema theory (Kintsch & Greeno, 1985; Riley, Greeno, & Heller, 1983). Fourth-graders who differed in their mathematical skills were presented with problems that varied in difficulty and with the question either before or after the text. We obtained the classic effect of the position of the question, with better performance when the question was presented prior to the text. In addition, this effect was more marked in the case of children who had poorer mathematical skills and in the case of more difficult problems. We argue that this pattern of results is compatible only with the situation or mental model theory, and not with the schema theory.

  12. Mathematical modeling of pigment dispersion taking into account the full agglomerate particle size distribution

    DEFF Research Database (Denmark)

    Kiil, Søren

    2017-01-01

    The purpose of this work is to develop a mathematical model that can quantify the dispersion of pigments, with a focus on the mechanical breakage of pigment agglomerates. The underlying physical mechanism was assumed to be surface erosion of spherical pigment agglomerates. The full agglomerate......-based acrylic vehicle in a three-roll mill. When the linear rate of agglomerate surface erosion was taken to be proportional to the external agglomerate surface area, simulations of the volume-moment mean diameter over time were in good quantitative agreement with experimental data for all three pigments....... The only adjustable parameter used was an apparent rate constant for the linear agglomerate erosion rate. Model simulations, at selected values of time, for the full agglomerate particle size distribution were in good qualitative agreement with the measured values. A quantitative match of the experimental...

  13. Modelling of saturated soil slopes equilibrium with an account of the liquid phase bearing capacity

    Directory of Open Access Journals (Sweden)

    Maltseva Tatyana

    2017-01-01

    Full Text Available The paper presents an original method of solving the problem of a uniformly distributed load acting on a two-phase elastic half-plane with the use of a kinematic model. The kinematic model (Maltsev L.E.) of a two-phase medium is based on two new hypotheses according to which the stress and strain state of the two-phase body is described by a system of linear elliptic equations. These equations differ from the Lamé equations of elasticity theory by two terms in each equation. The terms describe the bearing capacity of the liquid phase, or a decrease in stress in the solid phase. The finite element method has been chosen as the solution method.

  14. Model application of Murabahah financing acknowledgement statement of Sharia accounting standard No 59 Year 2002

    Science.gov (United States)

    Muda, Iskandar; Panjaitan, Rohdearni; Erlina; Ginting, Syafruddin; Maksum, Azhar; Abubakar

    2018-03-01

    The purpose of this research is to observe the murabahah financing implementation model. Observations were made at one of the sharia banks listed publicly in Indonesia. The implementation takes the form of financing granted with appropriate facilities and a maximum financing amount, so the provision of financing should be adjusted to the type, business conditions and business plan of the prospective mudharib. If the financing provided is too low, the mudharib's requirements are not met, the target is not reached, and the financing is not refundable.

  15. An extended heterogeneous car-following model accounting for anticipation driving behavior and mixed maximum speeds

    Science.gov (United States)

    Sun, Fengxin; Wang, Jufeng; Cheng, Rongjun; Ge, Hongxia

    2018-02-01

    The optimal driving speeds of different vehicles may differ for the same headway. In the optimal velocity function of the optimal velocity (OV) model, the maximum speed vmax is an important parameter determining the optimal driving speed: a vehicle with a higher maximum speed is more willing to drive fast than one with a lower maximum speed in a similar situation. By incorporating the anticipation driving behavior of relative velocity and mixed maximum speeds of different percentages into the optimal velocity function, an extended heterogeneous car-following model is presented in this paper. The analytical linear stability condition for this extended heterogeneous traffic model is obtained using linear stability theory. Numerical simulations are carried out to explore the complex phenomena resulting from the cooperation between anticipation driving behavior and heterogeneous maximum speeds in the optimal velocity function. The analytical and numerical results demonstrate that strengthening the driver's anticipation effect can improve the stability of heterogeneous traffic flow, that increasing the lowest value in the mixed maximum speeds will result in more instability, and that increasing the value or proportion of the vehicles already having the higher maximum speed will cause different stabilities at high or low traffic densities.
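
    For context, a commonly used textbook form of the optimal velocity function and car-following law that such extensions build on is sketched below; the heterogeneous model lets vmax,n vary across vehicles and adds an anticipation term in the relative velocity. This is a generic sketch, not necessarily the exact functional used in the paper.

```latex
% Generic optimal-velocity (OV) car-following law with a relative-velocity term.
\frac{dv_{n}}{dt} = a\big[ V_{n}(\Delta x_{n}) - v_{n} \big] + \lambda\, \Delta v_{n},
\qquad
V_{n}(\Delta x) = \frac{v_{\max,n}}{2}\big[ \tanh(\Delta x - h_{c}) + \tanh(h_{c}) \big]
```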

  16. Taking error into account when fitting models using Approximate Bayesian Computation.

    Science.gov (United States)

    van der Vaart, Elske; Prangle, Dennis; Sibly, Richard M

    2018-03-01

    Stochastic computer simulations are often the only practical way of answering questions relating to ecological management. However, due to their complexity, such models are difficult to calibrate and evaluate. Approximate Bayesian Computation (ABC) offers an increasingly popular approach to this problem, widely applied across a variety of fields. However, ensuring the accuracy of ABC's estimates has been difficult. Here, we obtain more accurate estimates by incorporating estimation of error into the ABC protocol. We show how this can be done where the data consist of repeated measures of the same quantity and errors may be assumed to be normally distributed and independent. We then derive the correct acceptance probabilities for a probabilistic ABC algorithm, and update the coverage test with which accuracy is assessed. We apply this method, which we call error-calibrated ABC, to a toy example and a realistic 14-parameter simulation model of earthworms that is used in environmental risk assessment. A comparison with exact methods and the diagnostic coverage test show that our approach improves estimation of parameter values and their credible intervals for both models. © 2017 by the Ecological Society of America.
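
    For readers unfamiliar with the baseline method being extended, a minimal rejection-ABC sketch is given below; the toy model, summaries, and tolerance are illustrative, and the paper's error-calibrated algorithm additionally models measurement error and uses probabilistic acceptance rather than this hard threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta: float, n: int = 20) -> np.ndarray:
    """Toy simulator: n repeated measures of a quantity with mean theta."""
    return rng.normal(theta, 1.0, size=n)

def summary(x: np.ndarray) -> np.ndarray:
    return np.array([x.mean(), x.std()])

observed = simulate(3.0)                  # stand-in for field data
s_obs = summary(observed)

# Basic rejection ABC: keep prior draws whose simulated summaries fall
# within a fixed tolerance of the observed summaries.
tolerance, accepted = 0.5, []
for _ in range(20000):
    theta = rng.uniform(-10.0, 10.0)      # draw from a vague prior
    if np.linalg.norm(summary(simulate(theta)) - s_obs) < tolerance:
        accepted.append(theta)

posterior = np.array(accepted)
print(posterior.mean(), posterior.std(), posterior.size)
```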

  17. A new lattice model accounting for multiple optimal current differences' anticipation effect in two-lane system

    Science.gov (United States)

    Li, Xiaoqin; Fang, Kangling; Peng, Guanghan

    2017-11-01

    This paper extends a two-lane lattice hydrodynamic traffic flow model to take into account the driver's anticipation effect in sensing the multiple optimal current differences. Based on the proposed model, we derive analytically the effect of the driver's anticipation of multiple optimal current differences on the instability of traffic dynamics. Phase diagrams are plotted and discussed, showing that the stability region enlarges with the anticipation effect in sensing multiple optimal current differences. Through simulation, it is found that the oscillation of the density wave around the critical density decreases with an increase in lattice number and anticipation time, for both transient and steady states. The simulation results are in good agreement with the theoretical analysis and show that considering the driver's anticipation of multiple optimal current differences in the two-lane lattice model stabilizes the traffic flow and suppresses traffic jams efficiently.

  18. A method for improving predictive modeling by taking into account lag time: Example of selenium bioaccumulation in a flowing system

    International Nuclear Information System (INIS)

    Beckon, William N.

    2016-01-01

    Highlights: • A method for estimating response time in cause-effect relationships is demonstrated. • Predictive modeling is appreciably improved by taking into account this lag time. • Bioaccumulation lag is greater for organisms at higher trophic levels. • This methodology may be widely applicable in disparate disciplines. - Abstract: For bioaccumulative substances, efforts to predict concentrations in organisms at upper trophic levels, based on measurements of environmental exposure, have been confounded by the appreciable but hitherto unknown amount of time it may take for bioaccumulation to occur through various pathways and across several trophic transfers. The study summarized here demonstrates an objective method of estimating this lag time by testing a large array of potential lag times for selenium bioaccumulation, selecting the lag that provides the best regression between environmental exposure (concentration in ambient water) and concentration in the tissue of the target organism. Bioaccumulation lag is generally greater for organisms at higher trophic levels, reaching times of more than a year in piscivorous fish. Predictive modeling of bioaccumulation is improved appreciably by taking into account this lag. More generally, the method demonstrated here may improve the accuracy of predictive modeling in a wide variety of other cause-effect relationships in which lag time is substantial but inadequately known, in disciplines as diverse as climatology (e.g., the effect of greenhouse gases on sea levels) and economics (e.g., the effects of fiscal stimulus on employment).
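
    A minimal sketch of the lag-selection idea described above follows: scan a grid of candidate lag times, regress tissue concentration on the correspondingly lagged water concentration, and keep the lag that gives the best fit. The data, linear form, and units are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly series: water selenium and a tissue response delayed ~6 months.
n, true_lag = 120, 6
water = 2.0 + np.sin(np.arange(n) / 6.0) + 0.2 * rng.standard_normal(n)
tissue = np.empty(n)
tissue[true_lag:] = 1.5 * water[:n - true_lag] + 0.1 * rng.standard_normal(n - true_lag)
tissue[:true_lag] = tissue[true_lag]

def r_squared(x: np.ndarray, y: np.ndarray) -> float:
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1.0 - residuals.var() / y.var()

# Test a large array of candidate lag times and keep the best-fitting one.
lags = range(0, 25)
scores = [r_squared(water[:n - lag], tissue[lag:]) for lag in lags]
best = int(np.argmax(scores))
print(f"best lag: {best} months, R^2 = {scores[best]:.3f}")
```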

  19. A method for improving predictive modeling by taking into account lag time: Example of selenium bioaccumulation in a flowing system

    Energy Technology Data Exchange (ETDEWEB)

    Beckon, William N., E-mail: William_Beckon@fws.gov

    2016-07-15

    Highlights: • A method for estimating response time in cause-effect relationships is demonstrated. • Predictive modeling is appreciably improved by taking into account this lag time. • Bioaccumulation lag is greater for organisms at higher trophic levels. • This methodology may be widely applicable in disparate disciplines. - Abstract: For bioaccumulative substances, efforts to predict concentrations in organisms at upper trophic levels, based on measurements of environmental exposure, have been confounded by the appreciable but hitherto unknown amount of time it may take for bioaccumulation to occur through various pathways and across several trophic transfers. The study summarized here demonstrates an objective method of estimating this lag time by testing a large array of potential lag times for selenium bioaccumulation, selecting the lag that provides the best regression between environmental exposure (concentration in ambient water) and concentration in the tissue of the target organism. Bioaccumulation lag is generally greater for organisms at higher trophic levels, reaching times of more than a year in piscivorous fish. Predictive modeling of bioaccumulation is improved appreciably by taking into account this lag. More generally, the method demonstrated here may improve the accuracy of predictive modeling in a wide variety of other cause-effect relationships in which lag time is substantial but inadequately known, in disciplines as diverse as climatology (e.g., the effect of greenhouse gases on sea levels) and economics (e.g., the effects of fiscal stimulus on employment).

  20. The occupant response to autonomous braking: a modeling approach that accounts for active musculature.

    Science.gov (United States)

    Östh, Jonas; Brolin, Karin; Carlsson, Stina; Wismans, Jac; Davidsson, Johan

    2012-01-01

    The aim of this study is to model occupant kinematics in an autonomous braking event by using a finite element (FE) human body model (HBM) with active muscles as a step toward HBMs that can be used for injury prediction in integrated precrash and crash simulations. Trunk and neck musculature was added to an existing FE HBM. Active muscle responses were achieved using a simplified implementation of 3 feedback controllers for head angle, neck angle, and angle of the lumbar spine. The HBM was compared with volunteer responses in sled tests with 10 m/s² deceleration over 0.2 s and in 1.4-s autonomous braking interventions with a peak deceleration of 6.7 m/s². The HBM captures the characteristics of the kinematics of volunteers in sled tests. Peak forward displacements have the same timing as for the volunteers, and lumbar muscle activation timing matches data from one of the volunteers. The responses of volunteers in autonomous braking interventions are mainly small head rotations and translational motions. This is captured by the HBM controller objective, which is to maintain the initial angular positions. The HBM response with active muscles is within ±1 standard deviation of the average volunteer response with respect to head displacements and angular rotation. With the implementation of feedback control of active musculature in an FE HBM it is possible to model the occupant response to autonomous braking interventions. The lumbar controller is important for the simulations of lap belt-restrained occupants; it is less important for the kinematics of occupants with a modern 3-point seat belt. Increasing head and neck controller gains provides a better correlation for head rotation, whereas it reduces the vertical head displacement and introduces oscillations.
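
The controller idea summarized above (muscle activation driven by feedback so as to maintain the initial head, neck and lumbar angles) can be illustrated with a toy proportional-derivative controller acting on a single rotational degree of freedom. The one-segment dynamics, gains and braking pulse below are invented for illustration and are not the published HBM implementation:

```python
import numpy as np

# Toy single-segment "head" on a pivot, disturbed by a braking pulse; a
# proportional-derivative feedback controller generates a muscle moment that
# tries to hold the initial angle. All values are illustrative assumptions.
m, L = 4.5, 0.25                   # segment mass (kg) and lever arm (m)
I = m * L**2                       # point-mass moment of inertia (kg m^2)
kp, kd = 60.0, 6.0                 # feedback gains (assumed)
dt, T = 0.001, 1.4                 # time step and event duration (s)

theta, omega, theta_ref = 0.0, 0.0, 0.0
history = []
for step in range(int(T / dt)):
    t = step * dt
    decel = 6.7 if t < 1.0 else 0.0          # simplified braking pulse (m/s^2)
    disturbance = m * decel * L              # inertial moment pitching the segment forward
    muscle = -kp * (theta - theta_ref) - kd * omega   # feedback "muscle" moment
    omega += (disturbance + muscle) / I * dt
    theta += omega * dt
    history.append(theta)

print(f"peak forward rotation under control: {np.degrees(max(history)):.1f} deg")
```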

  1. What Time is Your Sunset? Accounting for Refraction in Sunrise/set Prediction Models

    Science.gov (United States)

    Wilson, Teresa; Bartlett, Jennifer Lynn; Chizek Frouard, Malynda; Hilton, James; Phlips, Alan; Edgar, Roman

    2018-01-01

    Algorithms that predict sunrise and sunset times currently have an uncertainty of one to four minutes at mid-latitudes (0° - 55° N/S) due to limitations in the atmospheric models they incorporate. At higher latitudes, slight changes in refraction can cause significant discrepancies, including difficulties determining whether the Sun appears to rise or set. While different components of refraction are known, how they affect predictions of sunrise/set has not yet been quantified. A better understanding of the contributions from temperature profile, pressure, humidity, and aerosols could significantly improve the standard prediction. We present a sunrise/set calculator that interchanges the refraction component by varying the refraction model. We then compared these predictions with data sets of observed rise/set times taken from Mount Wilson Observatory in California, University of Alberta in Edmonton, Alberta, and onboard the SS James Franco in the Atlantic. A thorough investigation of the problem requires a more substantial data set of observed rise/set times and corresponding meteorological data from around the world. We have developed a mobile application, Sunrise & Sunset Observer, so that anyone can capture this astronomical and meteorological data using their smartphone video recorder as part of a citizen science project. The Android app for this project is available in the Google Play store. Videos can also be submitted through the project website (riseset.phy.mtu.edu). Data analysis will lead to more complete models that will provide higher accuracy rise/set predictions to benefit astronomers, navigators, and outdoorsmen everywhere.
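
To see where the refraction component enters such a calculator: the Sun is conventionally taken to rise or set when its true altitude equals minus the sum of horizon refraction and solar semidiameter, so swapping refraction models changes that single angle. The sketch below varies the refraction value around the common 34-arcminute standard; the site latitude and declination are arbitrary, and this is only an illustration of the sensitivity, not the calculator described in the abstract:

```python
import numpy as np

def sunset_hour_angle(lat_deg, decl_deg, refraction_arcmin=34.0, semidiam_arcmin=16.0):
    """Hour angle (deg) at which the Sun's centre reaches the apparent horizon.

    The 'sunset' altitude is -(horizon refraction + solar semidiameter); changing
    the refraction value stands in for swapping the refraction model.
    """
    h0 = -(refraction_arcmin + semidiam_arcmin) / 60.0
    lat, dec, h = map(np.radians, (lat_deg, decl_deg, h0))
    cos_h_angle = (np.sin(h) - np.sin(lat) * np.sin(dec)) / (np.cos(lat) * np.cos(dec))
    return np.degrees(np.arccos(np.clip(cos_h_angle, -1.0, 1.0)))

lat, decl = 52.0, 10.0                        # mid-latitude site, springtime declination
for refr in (28.0, 34.0, 40.0):               # plausible spread of horizon refraction (arcmin)
    minutes = sunset_hour_angle(lat, decl, refraction_arcmin=refr) / 15.0 * 60.0
    print(f"refraction {refr:4.1f}'  ->  sunset {minutes:.1f} min after local apparent noon")
```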

  2. An extended macro traffic flow model accounting for multiple optimal velocity functions with different probabilities

    Science.gov (United States)

    Cheng, Rongjun; Ge, Hongxia; Wang, Jufeng

    2017-08-01

    Because the maximum velocity and safe headway distance of different vehicles are not exactly the same, an extended macro model of traffic flow that considers multiple optimal velocity functions with probabilities is proposed in this paper. By means of linear stability theory, the linear stability condition of the new model with multiple-probability optimal velocities is obtained. The KdV-Burgers equation is derived through nonlinear analysis to describe the propagating behavior of the traffic density wave near the neutral stability line. Numerical simulations of the influences of multiple maximum velocities and multiple safety distances on the model's stability and traffic capacity are carried out. The cases of two different maximum speeds with the same safe headway distance, two different safe headway distances with the same maximum speed, and two different maximum velocities combined with two different time gaps are all explored by numerical simulation. The first cases demonstrate that when the proportion of vehicles with a larger vmax increases, the traffic tends to become unstable, which also means that abrupt acceleration and braking are not conducive to traffic stability and more easily result in stop-and-go phenomena. The second cases show that when the proportion of vehicles with greater safety spacing increases, the traffic tends to become unstable, which also means that overly cautious assumptions or weak driving skill are not conducive to traffic stability. The last cases indicate that an increase of the maximum speed is not conducive to traffic stability, while a reduction of the safe headway distance is conducive to traffic stability. Numerical simulation shows that mixed driving and traffic diversion have no effect on traffic capacity when traffic density is low or heavy. Numerical results also show that mixed driving should be chosen to increase the traffic capacity when the traffic density is lower, while traffic diversion should be chosen to increase the traffic capacity when
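
The key ingredient of such an extended model is an effective optimal velocity obtained by weighting several optimal velocity functions, each with its own maximum speed and safe headway, by the proportions of the corresponding vehicle classes. A minimal sketch of that weighting (the tanh form and all parameter values are illustrative assumptions, not the paper's):

```python
import numpy as np

def optimal_velocity(headway, v_max, h_safe):
    """A common tanh-shaped optimal velocity function (illustrative form)."""
    return 0.5 * v_max * (np.tanh(headway - h_safe) + np.tanh(h_safe))

def mixed_optimal_velocity(headway, classes):
    """Probability-weighted optimal velocity over several vehicle classes.

    `classes` is a list of (probability, v_max, h_safe) triples whose probabilities sum to 1.
    """
    return sum(p * optimal_velocity(headway, v, h) for p, v, h in classes)

# 70% ordinary vehicles and 30% faster vehicles sharing the same safe headway.
classes = [(0.7, 30.0, 4.0), (0.3, 40.0, 4.0)]
for hw in (2.0, 4.0, 8.0):
    print(f"headway {hw:4.1f} m -> mixed optimal velocity {mixed_optimal_velocity(hw, classes):6.2f} m/s")
```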

  3. Where's the problem? Considering Laing and Esterson's account of schizophrenia, social models of disability, and extended mental disorder.

    Science.gov (United States)

    Cooper, Rachel

    2017-08-01

    In this article, I compare and evaluate R. D. Laing and A. Esterson's account of schizophrenia as developed in Sanity, Madness and the Family (1964), social models of disability, and accounts of extended mental disorder. These accounts claim that some putative disorders (schizophrenia, disability, certain mental disorders) should not be thought of as reflecting biological or psychological dysfunction within the afflicted individual, but instead as external problems (to be located in the family, or in the material and social environment). In this article, I consider the grounds on which such claims might be supported. I argue that problems should not be located within an individual putative patient in cases where there is some acceptable test environment in which there is no problem. A number of cases where such an argument can show that there is no internal disorder are discussed. I argue, however, that Laing and Esterson's argument-that schizophrenia is not within diagnosed patients-does not work. The problem with their argument is that they fail to show that the diagnosed women in their study function adequately in any environment.

  4. Method for determining the duration of construction based on evolutionary modeling taking into account random organizational expectations

    Directory of Open Access Journals (Sweden)

    Alekseytsev Anatoliy Viktorovich

    2016-10-01

    One of the problems of construction planning is failure to meet time constraints and the resulting increase of workflow duration. In recent years information technologies have been used efficiently to solve the problem of estimating the construction period. The article considers the problem of optimal estimation of the duration of construction, taking into account possible organizational expectations. To solve this problem, an iterative scheme of evolutionary modeling is developed in which random values of organizational expectations are used as variable parameters. Adjustable genetic operators are used to improve the efficiency of the search for solutions. The reliability of the proposed approach is illustrated by an example of the formation of construction schedules for monolithic building foundations, taking into account possible disruptions in the supply of concrete and reinforcement cages. Application of the presented methodology enables automated generation of several alternative construction schedules in accordance with the standard or directive duration. Application of this computational procedure also has the prospect of taking into account construction downtime due to weather, accidents related to construction machinery breakdowns, or local emergency collapses of the structures being erected.
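
The evolutionary search described above can be reduced to its essentials: candidate schedules are scored under randomly sampled organizational delays (for example, late concrete or reinforcement deliveries), and a genetic loop with mutation keeps the better candidates. The toy below chooses between a fast-but-unreliable and a slow-but-reliable option for each activity; all numbers and the fitness function are invented, and this is not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two options per activity: (planned duration, mean random organizational delay), in days.
options = np.array([
    [[5.0, 2.0], [7.0, 0.3]],      # e.g. formwork: fast/unreliable crew vs slow/reliable crew
    [[10.0, 3.0], [13.0, 0.5]],    # reinforcement cages
    [[7.0, 2.5], [9.0, 0.4]],      # concrete pours
    [[4.0, 1.5], [5.0, 0.2]],      # curing / stripping
])
directive = 40.0                   # directive project duration (days)

def fitness(genome, n_samples=300):
    """Expected duration plus a penalty for exceeding the directive duration."""
    plan = options[np.arange(len(genome)), genome]             # chosen option per activity
    delays = rng.exponential(plan[:, 1], size=(n_samples, len(genome)))
    totals = (plan[:, 0] + delays).sum(axis=1)
    return totals.mean() + 10.0 * np.mean(totals > directive)

def evolve(pop_size=40, generations=60, mutation_rate=0.1):
    pop = rng.integers(0, 2, size=(pop_size, len(options)))    # 0/1 option choice per activity
    for _ in range(generations):
        scores = np.array([fitness(g) for g in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]     # keep the better half
        flips = rng.random(parents.shape) < mutation_rate      # adjustable mutation operator
        children = np.where(flips, 1 - parents, parents)
        pop = np.vstack([parents, children])
    return pop[np.argmin([fitness(g) for g in pop])]

print("selected option (0 = fast/unreliable, 1 = slow/reliable) per activity:", evolve())
```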

  5. Optimization model of energy mix taking into account the environmental impact

    International Nuclear Information System (INIS)

    Gruenwald, O.; Oprea, D.

    2012-01-01

    At present, the energy system in the Czech Republic needs to resolve some important issues regarding limited fossil resources, greater efficiency in the production of electrical energy, and reduction of pollutant emission levels. These problems can be addressed only by formulating and implementing an energy mix that is rational, reliable, sustainable and competitive. The aim of this article is to find a new way of determining an optimal mix for the energy system in the Czech Republic. To achieve this aim, a linear optimization model comprising several economic, environmental and technical aspects is applied. (Authors)
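
A linear optimization of an energy mix of the kind described here can be posed as minimizing total generation cost subject to a demand constraint, an emissions cap, and capacity bounds. The technology data below are invented placeholders and the formulation is a generic sketch, not the authors' model:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical technologies and data (placeholder values, not Czech statistics).
techs = ["coal", "gas", "nuclear", "renewables"]
cost = np.array([45.0, 60.0, 55.0, 70.0])     # EUR per MWh
co2 = np.array([0.95, 0.40, 0.0, 0.0])        # t CO2 per MWh (equivalently Mt per TWh)
capacity = [80.0, 60.0, 30.0, 25.0]           # TWh upper bound per technology
demand = 120.0                                # TWh to be supplied
emission_cap = 60.0                           # Mt CO2 allowed

res = linprog(
    c=cost,                                   # minimize total generation cost
    A_ub=[co2], b_ub=[emission_cap],          # keep emissions below the cap
    A_eq=[[1.0] * len(techs)], b_eq=[demand], # meet demand exactly
    bounds=[(0.0, cap) for cap in capacity],
    method="highs",
)
print("optimal mix (TWh):", dict(zip(techs, res.x.round(1))))
```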

  6. Research on ice destruction under dynamic loading. Part 1. Modeling of explosive loading of the ice cover taking into account temperature

    Directory of Open Access Journals (Sweden)

    Bogomolov Gennady N.

    2017-01-01

    In this research, the behavior of ice under shock and explosive loads is analyzed. Full-scale experiments were carried out. It is established that the results of 2013 practically coincide with the results of 2017, which is explained by the temperature of the formation of river ice. Two research objects are considered, including freshwater ice and river ice cover. The Taylor test was simulated numerically. The results of the Taylor test are presented. Ice is described by an elastoplastic model of continuum mechanics. The process of explosive loading of ice by emulsion explosives is numerically simulated. The destruction of the ice cover under detonation products is analyzed in detail.

  7. Modeling liquid-vapor equilibria with an equation of state taking into account dipolar interactions and association by hydrogen bonding

    International Nuclear Information System (INIS)

    Perfetti, E.

    2006-11-01

    Modelling fluid-rock interactions as well as mixing and unmixing phenomena in geological processes requires robust equations of state (EOS) which must be applicable to systems containing water and gases over a broad range of temperatures and pressures. Cubic equations of state based on the Van der Waals theory (e.g. Soave-Redlich-Kwong or Peng-Robinson) allow simple modelling from the critical parameters of the fluid components studied. However, the accuracy of such equations becomes poor when water is a major component of the fluid, since neither association through hydrogen bonding nor dipolar interactions are accounted for. The Helmholtz energy of a fluid may be written as the sum of different energetic contributions by factorization of the partition function. The model developed in this thesis for pure H2O and H2S considers three contributions. The first contribution represents the reference Van der Waals fluid, which is modelled by the SRK cubic EOS. The second contribution accounts for association through hydrogen bonding and is modelled by a term derived from Cubic Plus Association (CPA) theory. The third contribution corresponds to dipolar interactions and is modelled by the Mean Spherical Approximation (MSA) theory. The resulting CPA-MSA equation has six adjustable parameters, of which three represent physical terms whose values are close to their experimental counterparts. This equation results in a better reproduction of the thermodynamic properties of pure water along the vapour-liquid equilibrium than the classical CPA equation. In addition, extrapolation to higher temperatures and pressures is satisfactory. Similarly, taking into account dipolar interactions together with the SRK cubic equation of state for calculating the molar volume of H2S as a function of pressure and temperature results in a significant improvement compared to the SRK equation alone. Simple mixing rules between dipolar molecules are proposed to model the H2O-H2S

  8. Photoproduction of pions on nucleons in the chiral bag model with account of recoil nucleon motion effects

    International Nuclear Information System (INIS)

    Dorokhov, A.E.; Kanokov, Z.; Musakhanov, M.M.; Rakhimov, A.M.

    1989-01-01

    Pion production on a nucleon is studied in the chiral bag model (CBM). A CBM version is investigated in which the pions get into the bag and interact with quarks in a pseudovector way in the entire volume. Charged pion photoproduction amplitudes are found taking into account the recoil nucleon motion effects. Angular and energy distributions of charged pions, polarization of the recoil nucleon, and multipoles are calculated. The recoil effects are shown to give an additional contribution to the static approximation of the order of 10-20%. At a bag radius value of R = 1 fm the calculations are consistent with the experimental data

  9. Accounting emergy flows to determine the best production model of a coffee plantation

    Energy Technology Data Exchange (ETDEWEB)

    Giannetti, B.F.; Ogura, Y.; Bonilla, S.H. [Universidade Paulista, Programa de Pos Graduacao em Engenharia de Producao, R. Dr. Bacelar, 1212 Sao Paulo SP (Brazil); Almeida, C.M.V.B., E-mail: cmvbag@terra.com.br [Universidade Paulista, Programa de Pos Graduacao em Engenharia de Producao, R. Dr. Bacelar, 1212 Sao Paulo SP (Brazil)

    2011-11-15

    Cerrado, a savannah region, is Brazil's second largest ecosystem after the Amazon rainforest and is also threatened with imminent destruction. In the present study emergy synthesis was applied to assess the environmental performance of a coffee farm located in Coromandel, Minas Gerais, in the Brazilian Cerrado. The effects of land use on sustainability were evaluated by comparing the emergy indices over ten years in order to assess the energy flows driving the production process, and to determine the best production model combining productivity and environmental performance. The emergy indices are presented as a function of the annual crop. Results show that Santo Inacio farm should produce approximately 20 bags of green coffee per hectare to accomplish its best performance regarding both production efficiency and the environment. The evaluation of the coffee trade complements the results obtained by contrasting productivity and environmental performance, and despite the variation in market prices, the optimum interval for Santo Inacio's farm is between 10 and 25 coffee bags/ha. - Highlights: > Emergy synthesis is used to assess the environmental performance of a coffee farm in Brazil. > The effects of land use on sustainability were evaluated over ten years. > The energy flows driving the production process were assessed. > The best production model combining productivity and environmental performance was determined.

  10. Accounting emergy flows to determine the best production model of a coffee plantation

    International Nuclear Information System (INIS)

    Giannetti, B.F.; Ogura, Y.; Bonilla, S.H.; Almeida, C.M.V.B.

    2011-01-01

    Cerrado, a savannah region, is Brazil's second largest ecosystem after the Amazon rainforest and is also threatened with imminent destruction. In the present study emergy synthesis was applied to assess the environmental performance of a coffee farm located in Coromandel, Minas Gerais, in the Brazilian Cerrado. The effects of land use on sustainability were evaluated by comparing the emergy indices over ten years in order to assess the energy flows driving the production process, and to determine the best production model combining productivity and environmental performance. The emergy indices are presented as a function of the annual crop. Results show that Santo Inacio farm should produce approximately 20 bags of green coffee per hectare to accomplish its best performance regarding both production efficiency and the environment. The evaluation of the coffee trade complements the results obtained by contrasting productivity and environmental performance, and despite the variation in market prices, the optimum interval for Santo Inacio's farm is between 10 and 25 coffee bags/ha. - Highlights: → Emergy synthesis is used to assess the environmental performance of a coffee farm in Brazil. → The effects of land use on sustainability were evaluated over ten years. → The energy flows driving the production process were assessed. → The best production model combining productivity and environmental performance was determined.

  11. The Influence of Feedback on Task-Switching Performance: A Drift Diffusion Modeling Account

    Directory of Open Access Journals (Sweden)

    Russell Cohen Hoffing

    2018-02-01

    Task-switching is an important cognitive skill that facilitates our ability to choose appropriate behavior in a varied and changing environment. Task-switching training studies have sought to improve this ability by practicing switching between multiple tasks. However, an efficacious training paradigm has been difficult to develop in part due to findings that small differences in task parameters influence switching behavior in a non-trivial manner. Here, for the first time we employ the Drift Diffusion Model (DDM) to understand the influence of feedback on task-switching and investigate how drift diffusion parameters change over the course of task switch training. We trained 316 participants on a simple task where they alternated sorting stimuli by color or by shape. Feedback differed in six different ways between subject groups, ranging from No Feedback (NFB) to a variety of manipulations addressing trial-wise vs. Block Feedback (BFB), rewards vs. punishments, payment bonuses and different payouts depending upon the trial type (switch/non-switch). While overall performance was found to be affected by feedback, no effect of feedback was found on task-switching learning. Drift Diffusion Modeling revealed that the reductions in reaction time (RT) switch cost over the course of training were driven by a continually decreasing decision boundary. Furthermore, feedback effects on RT switch cost were also driven by differences in decision boundary, but not in drift rate. These results reveal that participants systematically modified their task-switching performance without yielding an overall gain in performance.
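
In the drift diffusion account invoked here, each choice is noisy evidence accumulating at a drift rate until it hits a decision boundary; lowering the boundary yields faster but less cautious responses, which is the mechanism the authors propose for the shrinking RT switch cost. A toy simulation of that mechanism (arbitrary parameter values; not the fitting procedure used in the study):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ddm(drift, boundary, n_trials=2000, dt=0.001, noise=1.0, non_decision=0.3):
    """First-passage times of a symmetric drift diffusion process (toy simulation)."""
    rts, hits_upper = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < boundary:
            x += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += dt
        rts.append(t + non_decision)
        hits_upper.append(x > 0)
    return np.mean(rts), np.mean(hits_upper)

# Same drift rate, two boundary settings: a lower boundary gives faster responses
# at some cost in accuracy -- the mechanism invoked for the shrinking switch cost.
for a in (1.2, 0.8):
    rt, acc = simulate_ddm(drift=1.5, boundary=a)
    print(f"boundary {a:.1f}: mean RT {rt * 1000:.0f} ms, accuracy {acc:.2f}")
```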

  12. Modeling co-occurrence of northern spotted and barred owls: accounting for detection probability differences

    Science.gov (United States)

    Bailey, Larissa L.; Reid, Janice A.; Forsman, Eric D.; Nichols, James D.

    2009-01-01

    Barred owls (Strix varia) have recently expanded their range and now encompass the entire range of the northern spotted owl (Strix occidentalis caurina). This expansion has led to two important issues of concern for management of northern spotted owls: (1) possible competitive interactions between the two species that could contribute to population declines of northern spotted owls, and (2) possible changes in vocalization behavior and detection probabilities of northern spotted owls induced by presence of barred owls. We used a two-species occupancy model to investigate whether there was evidence of competitive exclusion between the two species at study locations in Oregon, USA. We simultaneously estimated detection probabilities for both species and determined if the presence of one species influenced the detection of the other species. Model selection results and associated parameter estimates provided no evidence that barred owls excluded spotted owls from territories. We found strong evidence that detection probabilities differed for the two species, with higher probabilities for northern spotted owls that are the object of current surveys. Non-detection of barred owls is very common in surveys for northern spotted owls, and detection of both owl species was negatively influenced by the presence of the congeneric species. Our results suggest that analyses directed at hypotheses of barred owl effects on demographic or occupancy vital rates of northern spotted owls need to deal adequately with imperfect and variable detection probabilities for both species.

  13. On the influence of debris in glacier melt modelling: a new temperature-index model accounting for the debris thickness feedback

    Science.gov (United States)

    Carenzo, Marco; Mabillard, Johan; Pellicciotti, Francesca; Reid, Tim; Brock, Ben; Burlando, Paolo

    2013-04-01

    The increase of rockfalls from the surrounding slopes and of englacial melt-out material has led to an increase of the debris cover extent on Alpine glaciers. In recent years, distributed debris energy-balance models have been developed to account for the melt rate enhancing/reduction due to a thin/thick debris layer, respectively. However, such models require a large amount of input data that are often not available, especially in remote mountain areas such as the Himalaya. Some of the input data, such as wind or temperature, are also difficult to extrapolate from station measurements. Due to their lower data requirements, empirical models have been used in glacier melt modelling. However, they generally simplify the debris effect by using a single melt-reduction factor which does not account for the influence of debris thickness on melt. In this paper, we present a new temperature-index model accounting for the debris thickness feedback in the computation of melt rates at the debris-ice interface. The empirical parameters (temperature factor, shortwave radiation factor, and lag factor accounting for the energy transfer through the debris layer) are optimized at the point scale for several debris thicknesses against melt rates simulated by a physically-based debris energy balance model. The latter has been validated against ablation stake readings and surface temperature measurements. Each parameter is then related to a plausible set of debris thickness values to provide a general and transferable parameterization. The new model is developed on Miage Glacier, Italy, a debris-covered glacier whose ablation area is mantled in a near-continuous layer of rock. Subsequently, its transferability is tested on Haut Glacier d'Arolla, Switzerland, where debris is thinner and its extent has been seen to expand in the last decades. The results show that the performance of the new debris temperature-index model (DETI) in simulating the glacier melt rate at the point scale
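
The structure of a debris temperature-index model of this kind is an hourly melt rate built from a temperature term and a shortwave term, with the empirical factors damped by debris thickness and the forcing delayed by a conduction lag. The parameterization below is a schematic stand-in with invented coefficients, not the calibrated DETI parameters:

```python
import numpy as np

def deti_melt(temp, swrad, debris_m, lag_hours, tf=0.05, srf=0.0094):
    """Schematic debris temperature-index melt rate (mm w.e. per hour).

    temp, swrad : hourly air temperature (deg C) and shortwave radiation (W m-2)
    debris_m    : debris thickness (m); thicker debris damps the melt signal
    lag_hours   : lag standing in for heat conduction through the debris layer
    tf, srf     : temperature and shortwave factors (invented clean-ice values)
    """
    damping = np.exp(-debris_m / 0.1)          # assumed thickness-dependent reduction
    temp_lag = np.roll(temp, lag_hours)        # delayed forcing (wraps around: sketch only)
    sw_lag = np.roll(swrad, lag_hours)
    return np.maximum(damping * (tf * np.maximum(temp_lag, 0.0) + srf * sw_lag), 0.0)

hours = np.arange(48)
temp = 5.0 + 4.0 * np.sin(2 * np.pi * (hours - 8) / 24)              # synthetic diurnal cycle
swrad = np.maximum(0.0, 600.0 * np.sin(2 * np.pi * (hours - 6) / 24))

for thickness, lag in ((0.05, 1), (0.30, 5)):
    total = deti_melt(temp, swrad, thickness, lag).sum()
    print(f"debris {thickness:.2f} m, lag {lag} h -> {total:.1f} mm w.e. over two days")
```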

  14. Securing radioactive sources through a proper management

    Energy Technology Data Exchange (ETDEWEB)

    Mourao, Rogerio Pimenta [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Gerencia de Rejeitos], e-mail: mouraor@cdtn.br

    2009-07-01

    The safety and security of radioactive sources have become a hot issue for the nuclear community in the last two decades. The Goiania accident in Brazil and the September 11th attack alerted governments and nuclear agencies around the world to the vulnerability of the thousands of disused radioactive sources ill-stored or misplaced in a myriad of ways, especially in countries with less developed infra-structure. Once the threat of environmental contamination or malevolent use of these sources became clear, the International Atomic Energy Agency and the American Government spawned initiatives to reduce this risk, basically stimulating the proper conditioning of the sources and, whenever possible, seeking their repatriation to the countries of origin. Since 1996 Brazil has been participating actively in this effort, having carried out hands-on operations to condition old radium sources in Latin American and Caribbean countries and also repatriated its own neutron sources to the United States. A new operation is presently being organized: the reconditioning of the high activity sources contained in teletherapy units stored in the country using a mobile hot cell developed in South Africa. Also an agreement is being negotiated between the US National Nuclear Security Agency and the Brazilian CNEN to repatriate hundreds of radioactive gauges presently stored at CNEN's source storage buildings. (author)

  15. Securing radioactive sources through a proper management

    International Nuclear Information System (INIS)

    Mourao, Rogerio Pimenta

    2009-01-01

    The safety and security of radioactive sources have become a hot issue for the nuclear community in the last two decades. The Goiania accident in Brazil and the September 11th attack alerted governments and nuclear agencies around the world to the vulnerability of the thousands of disused radioactive sources ill-stored or misplaced in a myriad of ways, especially in countries with less developed infra-structure. Once the threat of environmental contamination or malevolent use of these sources became clear, the International Atomic Energy Agency and the American Government spawned initiatives to reduce this risk, basically stimulating the proper conditioning of the sources and, whenever possible, seeking their repatriation to the countries of origin. Since 1996 Brazil has been participating actively in this effort, having carried out hands-on operations to condition old radium sources in Latin American and Caribbean countries and also repatriated its own neutron sources to the United States. A new operation is presently being organized: the reconditioning of the high activity sources contained in teletherapy units stored in the country using a mobile hot cell developed in South Africa. Also an agreement is being negotiated between the US National Nuclear Security Agency and the Brazilian CNEN to repatriate hundreds of radioactive gauges presently stored at CNEN's source storage buildings. (author)

  16. Proper Treatment of Acute Mesenteric Ischemia

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Kwan; Han, Young Min [Dept. of Radiology, Chonbuk National University Hospital and School of Medicine, Jeonju (Korea, Republic of); Kwak, Hyo Sung [Research Institute of Clinical Medicine, Chonbuk National University Hospital and School of Medicine, Jeonju (Korea, Republic of); Yu, Hee Chul [Dept. of Radiology, Chonbuk National University Hospital and School of Medicine, Jeonju (Korea, Republic of)

    2011-10-15

    To evaluate the effectiveness of treatment options for Acute Mesenteric Ischemia and establish proper treatment guidelines. From January 2007 to May 2010, 14 patients (13 men and 1 woman, mean age: 52.1 years) with acute mesenteric ischemia were enrolled in this study. All of the lesions were detected by CT scan and angiography. Initially, 4 patients underwent conservative treatment. Eleven patients were managed by endovascular treatment. We evaluated the therapeutic success and survival rate of each patient. The causes of ischemia included thromboembolism in 6 patients and dissection in 8 patients. Nine patients showed bowel ischemia on CT scans, 4 dissection patients underwent conservative treatment, 3 patients had recurring symptoms, and 5 dissection patients underwent endovascular treatment. Overall success and survival rate was 100%. However, overall success was 83% and survival rate was 40% in the 6 thromboembolism patients. The choice of 20 hours as the critical time in which the procedure is ideally performed was statistically significant (p = 0.0476). A percutaneous endovascular procedure is an effective treatment for acute mesenteric ischemia, especially in patients who underwent treatment within 20 hours. However, further study and a long term follow-up are needed.

  17. Model of investment appraisal of high-rise construction with account of cost of land resources

    Science.gov (United States)

    Okolelova, Ella; Shibaeva, Marina; Trukhina, Natalya

    2018-03-01

    The article considers the problems and potential of high-rise construction as part of global urbanization. The results of theoretical and practical studies on the appraisal of investments in high-rise construction are provided. High-rise construction has a number of apparent upsides under the modern conditions of megapolis development, and primarily it is economically efficient. Amid a serious lack of construction sites, skyscrapers successfully meet the need for manufacturing, office and living premises. Nevertheless, there are plenty of issues related to high-rise construction, and only thorough scrutiny of them allows the real economic efficiency of this branch to be estimated. The article focuses on the question of the economic efficiency of high-rise construction. The suggested model allows adjusting the parameters of a facility under construction that set the tone for its market value, as well as the coefficient for appreciation of the construction net cost, which depends on the number of storeys, in the form of a function or discrete values.

  18. A coupled surface/subsurface flow model accounting for air entrapment and air pressure counterflow

    DEFF Research Database (Denmark)

    Delfs, Jens Olaf; Wang, Wenqing; Kalbacher, Thomas

    2013-01-01

    This work introduces the soil air system into integrated hydrology by simulating the flow processes and interactions of surface runoff, soil moisture and air in the shallow subsurface. The numerical model is formulated as a coupled system of partial differential equations for hydrostatic (diffusive wave) shallow flow and two-phase flow in a porous medium. The simultaneous mass transfer between the soil, overland, and atmosphere compartments is achieved by upgrading a fully established leakance concept for overland-soil liquid exchange to an air exchange flux between soil and atmosphere. In a new ... the mass exchange between compartments. A benchmark test, which is based on a classic experimental data set on infiltration excess (Horton) overland flow, identified a feedback mechanism between surface runoff and soil air pressures. Our study suggests that air compression in soils amplifies surface runoff ...

  19. Does Reading Cause Later Intelligence? Accounting for Stability in Models of Change.

    Science.gov (United States)

    Bailey, Drew H; Littlefield, Andrew K

    2017-11-01

    This study reanalyzes data presented by Ritchie, Bates, and Plomin (2015) who used a cross-lagged monozygotic twin differences design to test whether reading ability caused changes in intelligence. The authors used data from a sample of 1,890 monozygotic twin pairs tested on reading ability and intelligence at five occasions between the ages of 7 and 16, regressing twin differences in intelligence on twin differences in prior intelligence and twin differences in prior reading ability. Results from a state-trait model suggest that reported effects of reading ability on later intelligence may be artifacts of previously uncontrolled factors, both environmental in origin and stable during this developmental period, influencing both constructs throughout development. Implications for cognitive developmental theory and methods are discussed. © 2016 The Authors. Child Development © 2016 Society for Research in Child Development, Inc.

  20. Modelling the distribution of fish accounting for spatial correlation and overdispersion

    DEFF Research Database (Denmark)

    Lewy, Peter; Kristensen, Kasper

    2009-01-01

    The spatial distribution of cod (Gadus morhua) in the North Sea and the Skagerrak was analysed over a 24-year period using the Log Gaussian Cox Process (LGCP). In contrast to other spatial models of the distribution of fish, LGCP avoids problems with zero observations and includes the spatial correlation between observations. It is therefore possible to predict and interpolate unobserved densities at any location in the area. This is important for obtaining unbiased estimates of stock concentration and other measures depending on the distribution in the entire area. Results show that the spatial correlation and dispersion of cod catches remained unchanged during winter throughout the period, in spite of a drastic decline in stock abundance and a movement of the centre of gravity of the distribution towards the northeast in the same period. For the age groups considered, the concentration of the stock...

  1. The application of multilevel modelling to account for the influence of walking speed in gait analysis.

    Science.gov (United States)

    Keene, David J; Moe-Nilssen, Rolf; Lamb, Sarah E

    2016-01-01

    Differences in gait performance can be explained by variations in walking speed, which is a major analytical problem. Some investigators have standardised speed during testing, but this can result in an unnatural control of gait characteristics. Other investigators have developed test procedures where participants walk at their self-selected slow, preferred and fast speeds, with computation of gait characteristics at a standardised speed. However, this analysis is dependent upon an overlap in the ranges of gait speed observed within and between participants, and this is difficult to achieve under self-selected conditions. In this report a statistical analysis procedure is introduced that utilises multilevel modelling to analyse data from walking tests at self-selected speeds, without requiring an overlap in the range of speeds observed or the routine use of data transformations. Copyright © 2015 Elsevier B.V. All rights reserved.
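
One way to implement the multilevel idea (repeated walks nested within participants, with walking speed as a within-person covariate) is a linear mixed model with a random intercept per participant. The synthetic data and model formula below are only a sketch of that set-up, not the authors' analysis:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Synthetic data: 30 participants, 9 walks each at self-selected slow/preferred/fast
# speeds; the gait outcome depends on speed plus a participant-specific offset.
rows = []
for pid in range(30):
    offset = rng.normal(0.0, 0.5)
    for _ in range(9):
        speed = rng.uniform(0.6, 1.8)                        # m/s, self-selected range
        outcome = 2.0 - 0.8 * speed + offset + rng.normal(0.0, 0.2)
        rows.append({"participant": pid, "speed": speed, "outcome": outcome})
data = pd.DataFrame(rows)

# Random intercept per participant; walking speed enters as a fixed effect, so its
# influence is estimated while respecting the clustering of walks within people.
fit = smf.mixedlm("outcome ~ speed", data, groups=data["participant"]).fit()
print(fit.summary())
```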

  2. Pore Network Modeling: Alternative Methods to Account for Trapping and Spatial Correlation

    KAUST Repository

    De La Garza Martinez, Pablo

    2016-05-01

    Pore network models have served as a predictive tool for soil and rock properties with a broad range of applications, particularly in oil recovery, geothermal energy from underground reservoirs, and pollutant transport in soils and aquifers [39]. They rely on the representation of the void space within porous materials as a network of interconnected pores with idealised geometries. Typically, a two-phase flow simulation of a drainage (or imbibition) process is employed, and by averaging the physical properties at the pore scale, macroscopic parameters such as capillary pressure and relative permeability can be estimated. One of the most demanding tasks in these models is to include the possibility that fluids remain trapped inside the pore space. In this work I proposed a trapping rule which uses information from neighboring pores instead of a search algorithm. This approximation reduces the simulation time significantly and does not perturb the accuracy of results. Additionally, I included spatial correlation to generate the pore sizes using a matrix decomposition method. Results show higher relative permeabilities and smaller values for irreducible saturation, which emphasizes the effects of ignoring the intrinsic correlation seen in pore sizes from actual porous media. Finally, I implemented the algorithm from Raoof et al. (2010) [38] to generate the topology of a Fontainebleau sandstone by solving an optimization problem using the steepest descent algorithm with a stochastic approximation for the gradient. A drainage simulation is performed on this representative network and relative permeability is compared with published results. The limitations of this algorithm are discussed and other methods are suggested to create a more faithful representation of the pore space.
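
The spatial-correlation step mentioned above (generating pore sizes with a prescribed correlation through a matrix decomposition) can be illustrated with a Cholesky factorization of an exponential covariance over pore locations. This is a generic sketch of the technique, not the thesis code:

```python
import numpy as np

rng = np.random.default_rng(3)

# Pore locations along a 1-D line for simplicity; covariance decays exponentially.
n = 200
x = np.linspace(0.0, 1.0, n)
corr_length = 0.1
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_length)

# Matrix decomposition step: with L L^T = C, the field L z (z ~ N(0, I)) has covariance C.
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))      # small jitter for numerical stability
field = L @ rng.standard_normal(n)

# Map the correlated Gaussian field onto lognormally distributed pore radii (microns).
radii = np.exp(np.log(20.0) + 0.4 * field)
print(f"mean radius {radii.mean():.1f} um, "
      f"neighbour correlation {np.corrcoef(radii[:-1], radii[1:])[0, 1]:.2f}")
```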

  3. Accounting for disturbance history in models: using remote sensing to constrain carbon and nitrogen pool spin-up.

    Science.gov (United States)

    Hanan, Erin J; Tague, Christina; Choate, Janet; Liu, Mingliang; Kolden, Crystal; Adam, Jennifer

    2018-03-24

    Disturbances such as wildfire, insect outbreaks, and forest clearing play an important role in regulating carbon, nitrogen, and hydrologic fluxes in terrestrial watersheds. Evaluating how watersheds respond to disturbance requires understanding mechanisms that interact over multiple spatial and temporal scales. Simulation modeling is a powerful tool for bridging these scales; however, model projections are limited by uncertainties in the initial state of plant carbon and nitrogen stores. Watershed models typically use one of two methods to initialize these stores: spin-up to steady state, or remote sensing with allometric relationships. Spin-up involves running a model until vegetation reaches equilibrium based on climate; this approach assumes that vegetation across the watershed has reached maturity and is of uniform age, which fails to account for landscape heterogeneity and non-steady state conditions. By contrast, remote sensing can provide data for initializing such conditions. However, methods for assimilating remote sensing into model simulations can also be problematic. They often rely on empirical allometric relationships between a single vegetation variable and modeled carbon and nitrogen stores. Because allometric relationships are species- and region-specific, they do not account for the effects of local resource limitation, which can influence carbon allocation (to leaves, stems, roots, etc.). To address this problem, we developed a new initialization approach using the catchment-scale ecohydrologic model RHESSys. The new approach merges the mechanistic stability of spin-up with the spatial fidelity of remote sensing. It uses remote sensing to define spatially explicit targets for one or several vegetation state variables, such as leaf area index, across a watershed. The model then simulates the growth of carbon and nitrogen stores until the defined targets are met for all locations. We evaluated this approach in a mixed pine-dominated watershed in
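
The target-driven initialization can be caricatured as growing modeled carbon stores forward in time until a remotely sensed variable (here leaf area index) reaches its observed value in each location, then freezing the associated pools. The growth rule and allometry below are toy assumptions standing in for the RHESSys implementation:

```python
import numpy as np

def spin_up_to_target(lai_target, sla=0.01, rel_growth=0.10, max_years=500):
    """Grow a leaf-carbon pool until modeled LAI meets the remote-sensing target.

    sla        : specific leaf area (m2 leaf per g C), illustrative value
    rel_growth : assumed net annual growth rate of the leaf-carbon pool
    """
    leaf_c = 10.0                                   # initial leaf carbon (g C m-2)
    for year in range(1, max_years + 1):
        if leaf_c * sla >= lai_target:
            return leaf_c, year                     # freeze the pool once the target is met
        leaf_c *= 1.0 + rel_growth
    return leaf_c, max_years

# Spatially explicit targets, e.g. one remotely sensed LAI value per location.
for target in (1.2, 3.5, 5.1):
    c, yrs = spin_up_to_target(target)
    print(f"LAI target {target:.1f}: leaf C initialized at {c:.0f} g C m-2 after {yrs} model years")
```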

  4. Hysteresis modelling of GO laminations for arbitrary in-plane directions taking into account the dynamics of orthogonal domain walls

    Energy Technology Data Exchange (ETDEWEB)

    Baghel, A.P.S.; Sai Ram, B. [Department of Electrical Engineering, Indian Institute of Technology Bombay, Mumbai 400076 (India); Chwastek, K. [Department of Electrical Engineering Czestochowa University of Technology (Poland); Daniel, L. [Group of Electrical Engineering-Paris (GeePs), CNRS(UMR8507)/CentraleSupelec/UPMC/Univ Paris-Sud, 11 rue Joliot-Curie, 91192 Gif-sur-Yvette (France); Kulkarni, S.V. [Department of Electrical Engineering, Indian Institute of Technology Bombay, Mumbai 400076 (India)

    2016-11-15

    The anisotropy of magnetic properties in grain-oriented steels is related to their microstructure. It results from the anisotropy of the single crystal properties combined with crystallographic texture. The magnetization process along arbitrary directions can be explained using phase equilibrium for domain patterns, which can be described using Neel's phase theory. According to this theory, the fractions of 180° and 90° domain walls depend on the direction of magnetization. This paper presents an approach to model hysteresis loops of grain-oriented steels along arbitrary in-plane directions. The considered description is based on a modification of the Jiles–Atherton model. It includes a modified expression for the anhysteretic magnetization which takes into account the contributions of the two types of domain walls. The computed hysteresis curves for different directions are in good agreement with experimental results. - Highlights: • An extended Jiles–Atherton description is used to model hysteresis loops in GO steels. • The model stresses the role of material anisotropy and different contributions of the two types of domain walls. • Hysteresis loops can be modeled along arbitrary in-plane directions. • Modeling results are in good agreement with experiments.
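
The modification described above enters through the anhysteretic magnetization, written as a weighted sum of contributions associated with 180° and 90° domain walls, with weights depending on the in-plane direction. A schematic of such a two-component anhysteretic curve (Langevin-type components and an invented direction-dependent weighting, not the authors' parameter set):

```python
import numpy as np

def langevin(x):
    """L(x) = coth(x) - 1/x, with the small-argument limit x/3 handled explicitly."""
    x = np.asarray(x, dtype=float)
    small = np.abs(x) < 1e-6
    safe = np.where(small, 1.0, x)
    return np.where(small, x / 3.0, 1.0 / np.tanh(safe) - 1.0 / safe)

def anhysteretic(h, angle_deg, ms=1.7e6, a180=40.0, a90=120.0):
    """Two-component anhysteretic magnetization for an in-plane field direction.

    The weights of the 180-degree and 90-degree wall contributions are assumed to
    vary with the angle from the rolling direction (a crude illustrative choice).
    """
    w180 = np.cos(np.radians(angle_deg)) ** 2
    w90 = 1.0 - w180
    return ms * (w180 * langevin(h / a180) + w90 * langevin(h / a90))

h = np.linspace(-500.0, 500.0, 5)                 # applied field (A/m)
for angle in (0, 55, 90):
    print(f"{angle:2d} deg from RD:", np.round(anhysteretic(h, angle) / 1e6, 2), "MA/m")
```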

  5. Probabilistic modelling in urban drainage – two approaches that explicitly account for temporal variation of model errors

    DEFF Research Database (Denmark)

    Löwe, Roland; Del Giudice, Dario; Mikkelsen, Peter Steen

    ... of input uncertainties observed in the models. The explicit inclusion of such variations in the modelling process will lead to a better fulfilment of the assumptions made in formal statistical frameworks, thus reducing the need to resort to informal methods. The two approaches presented here...

  6. Editorial: The proper place for knowledge

    Directory of Open Access Journals (Sweden)

    Yngve Nordkvelle

    2009-11-01

    Full Text Available Knowledge is an interesting word, which never goes out of fashion. In the political context, knowledge is something everyone hails and cherishes. An example is that the Socialist Government in Norway renamed its "Ministry of Education and Research" to the "Ministry of Knowledge". It would probably be politically wrong to defy the word "knowledge". The word "knowledge" stirs, however, different sentiments in people. In modern education, the word signifies something notable, discernable, visual or at least possible to distinguish from what it is not. In learning in higher education, knowledge is most often considered as the raw material for learning, with the little extra that distinguishes it from "information". Knowledge is information with a direction, a purpose and meaning, but without the implied cultivation of a teaching and learning process. Given knowledge is used for educational purposes, the processing of knowledge from its basic concepts to embodied and reflected knowledge, properly understood and reconceptualised by the learner, transforms not only the learner, but also the knowledge. In a peripatetic tradition, one likes to think of knowledge as foundation elements for constructions of ethical wisdom as its highest reflective level. Probably we will never see a "Ministry of Wisdom" established, because what is "wisdom" is probably so much more politically charged than "Knowledge". One can only wonder why anyone would degrade a ministry for education to something less. In Europe a rewriting of university curricula is underway all over the continent, because "knowledge" is a key concept in the writing of "learning outcomes". It appears every college is absorbed in sorting out what knowledge is and how knowledge can be classified in categories and levels, and then composed to readable descriptions of syllabi, course descriptions and schemes. Let us hope they are more able than what has been the case. Professor Ronald Barnett of the

  7. An EMG-driven biomechanical model that accounts for the decrease in moment generation capacity during a dynamic fatigued condition.

    Science.gov (United States)

    Rao, Guillaume; Berton, Eric; Amarantini, David; Vigouroux, Laurent; Buchanan, Thomas S

    2010-07-01

    Although it is well known that fatigue can greatly reduce muscle forces, it is not generally included in biomechanical models. The aim of the present study was to develop an electromyographic-driven (EMG-driven) biomechanical model to estimate the contributions of flexor and extensor muscle groups to the net joint moment during a nonisokinetic functional movement (squat exercise) performed in nonfatigued and in fatigued conditions. A methodology that aims at balancing the decreased muscle moment production capacity following fatigue was developed. During an isometric fatigue session, a linear regression was created linking the decrease in force production capacity of the muscle (normalized force/EMG ratio) to the EMG mean frequency. Using the decrease in mean frequency estimated through wavelet transforms between dynamic squats performed before and after the fatigue session as input to the previous linear regression, a coefficient accounting for the presence of fatigue in the quadriceps group was computed. This coefficient was used to constrain the moment production capacity of the fatigued muscle group within an EMG-driven optimization model dedicated to estimate the contributions of the knee flexor and extensor muscle groups to the net joint moment. During squats, our results showed significant increases in the EMG amplitudes with fatigue (+23.27% in average) while the outputs of the EMG-driven model were similar. The modifications of the EMG amplitudes following fatigue were successfully taken into account while estimating the contributions of the flexor and extensor muscle groups to the net joint moment. These results demonstrated that the new procedure was able to estimate the decrease in moment production capacity of the fatigued muscle group.
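
The fatigue-adjustment chain can be summarized as: regress the normalized force/EMG ratio against EMG mean frequency during the isometric fatigue session, then use the wavelet-estimated drop in mean frequency during the dynamic task to predict a coefficient that scales down the fatigued muscle group's moment-generating capacity. A schematic of that chain with synthetic numbers (not the study's data or regression):

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(4)

# Isometric fatigue session: as EMG mean frequency (Hz) drops, the normalized
# force/EMG ratio (capacity) drops too. Synthetic calibration data.
mean_freq = np.linspace(90.0, 60.0, 30)
capacity = 0.40 + 0.0075 * mean_freq + rng.normal(0.0, 0.01, 30)
calib = linregress(mean_freq, capacity)

# Dynamic squats: wavelet-estimated mean frequency before and after fatigue (assumed).
freq_pre, freq_post = 85.0, 68.0
cap_pre = calib.intercept + calib.slope * freq_pre
cap_post = calib.intercept + calib.slope * freq_post
fatigue_coeff = cap_post / cap_pre                 # scales the fatigued group's capacity

max_extensor_moment = 250.0                        # N m in the non-fatigued model (assumed)
print(f"fatigue coefficient = {fatigue_coeff:.2f} -> "
      f"constrained extensor capacity {fatigue_coeff * max_extensor_moment:.0f} N m")
```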

  8. Accounting for intracell flow in models with emphasis on water table recharge and stream-aquifer interaction: 2. A procedure

    Science.gov (United States)

    Jorgensen, Donald G.; Signor, Donald C.; Imes, Jeffrey L.

    1989-01-01

    Intercepted intracell flow, especially if the cell includes water table recharge and a stream (sink), can result in significant model error if not accounted for. A procedure utilizing net flow per cell (Fn) that accounts for intercepted intracell flow can be used for both steady state and transient simulations. Germane to the procedure is the determination of the ratio of the area of influence of the interior sink to the area of the cell (Ai/Ac). Ai is the area in which water table recharge has the potential to be intercepted by the sink. Determining Ai/Ac requires either a detailed water table map or observation of stream conditions within the cell. A proportioning parameter M, which is equal to 1 or slightly less and is a function of cell geometry, is used to determine how much of the water that has potential for interception is intercepted by the sink within the cell. Also germane to the procedure is the determination of the flow across the streambed (Fs), which is not directly a function of cell size, due to the difference in head between the water level in the stream and the potentiometric surface of the aquifer underlying the streambed. The use of Fn for steady state simulations allows simulation of water levels without utilizing head-dependent or constant head boundary conditions which tend to constrain the model-calculated water levels, an undesirable result if a comparison of measured and calculated water levels is being made. Transient simulations of streams usually utilize a head-dependent boundary condition and a leakance value to model a stream. Leakance values for each model cell can be determined from a steady state simulation, which used the net flow per cell procedure. For transient simulation, Fn would not include Fs. Also, for transient simulation it is necessary to check Fn at different time intervals because M and Ai/Ac are not constant and change with time. The procedure was used successfully in two different models of the aquifer system
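
One plausible reading of the bookkeeping defined above: the net flow assigned to a cell combines the recharge that escapes interception (governed by M and Ai/Ac) with the head-dependent streambed flow Fs. The sketch below uses invented numbers, and the way the terms are combined and signed is an assumption for illustration, not the published procedure verbatim:

```python
def net_flow_per_cell(recharge, area_cell, area_influence, M, F_s):
    """Net flow Fn for a cell containing both water-table recharge and a stream sink.

    recharge       : recharge rate over the cell (m/d)
    area_cell      : cell area Ac (m^2)
    area_influence : area Ai in which recharge can be intercepted by the sink (m^2)
    M              : proportioning parameter (1 or slightly less), from cell geometry
    F_s            : streambed flow from the stream/aquifer head difference (m^3/d)
    """
    total_recharge = recharge * area_cell
    intercepted = M * recharge * area_influence    # recharge captured by the in-cell sink
    return total_recharge - intercepted + F_s

# Hypothetical 1 km x 1 km cell; 40% of its area drains to the stream reach inside it.
Fn = net_flow_per_cell(recharge=0.001, area_cell=1.0e6, area_influence=0.4e6, M=0.95, F_s=-150.0)
print(f"net flow applied to the cell (steady state): {Fn:.0f} m^3/d")
```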

  9. The Army Did Not Properly Account For and Manage Force Provider Equipment in Afghanistan

    Science.gov (United States)

    2014-07-31

    ... modules and add-on kits in Southwest Asia. The Army deployed 48.25 FP module equivalents to Afghanistan from 2001 to 2010; however, 401st AFSB only...

  10. Modeling the dynamic behavior of railway track taking into account the occurrence of defects in the system wheel-rail

    Directory of Open Access Journals (Sweden)

    Loktev Alexey

    2017-01-01

    This paper investigates the influence of wheel defects on the development of rail defects up to a state where prompt rail replacement becomes necessary, taking into account different models of the dynamic contact between a wheel and a rail: in particular, the quasistatic Hertz model, the linear elastic model and the elastoplastic Aleksandrov-Kadomtsev model. Based on the wheel-rail contact model, the maximum stresses that occur in the rail in the presence of wheel defects (e.g. flat spot, weld-on deposit, etc.) are determined. The paper also presents the solution of the inverse problem, i.e., an investigation of the influence of the strength of a wheel impact upon the rails on wheel defects, as well as an evaluation of the stresses emerging in the rails. During the motion of a railway vehicle, the wheel pair position in relation to the rails changes significantly, which causes various combinations of wheel-rail contact areas. Even with a constant axial load, the normal stresses will change substantially due to the differences in the radii of curvature of the contact surfaces of these areas, as well as in the movement velocities of railway vehicles.
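
For the quasistatic Hertz option mentioned above, the classical point-contact formulas give the contact radius and peak pressure for two elastic bodies pressed together. The sketch evaluates the circular-contact special case for an idealized wheel on a flat rail head; the load, radius and material constants are illustrative, and real wheel-rail contacts are elliptical rather than circular:

```python
import numpy as np

def hertz_sphere_on_plane(force, radius, E=210e9, nu=0.3):
    """Contact radius and peak pressure for an elastic sphere pressed onto a plane (Hertz)."""
    E_star = E / (2.0 * (1.0 - nu**2))                          # both bodies steel
    a = (3.0 * force * radius / (4.0 * E_star)) ** (1.0 / 3.0)  # contact radius (m)
    p_max = 3.0 * force / (2.0 * np.pi * a**2)                  # peak contact pressure (Pa)
    return a, p_max

# Idealized wheel of 0.46 m rolling radius carrying a 100 kN static wheel load.
a, p_max = hertz_sphere_on_plane(force=100e3, radius=0.46)
print(f"contact radius {a * 1000:.1f} mm, peak contact pressure {p_max / 1e9:.2f} GPa")
```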

  11. Diarrhea Morbidities in Small Areas: Accounting for Non-Stationarity in Sociodemographic Impacts using Bayesian Spatially Varying Coefficient Modelling.

    Science.gov (United States)

    Osei, F B; Stein, A

    2017-08-30

    Model-based estimation of diarrhea risk and understanding of its dependency on sociodemographic factors are important for prioritizing interventions. It is unsuitable to calibrate a regression model with a single set of coefficients, especially for large spatial domains. For this purpose, we developed a Bayesian hierarchical varying coefficient model to account for non-stationarity in the covariates. We used the integrated nested Laplace approximation for parameter estimation. Diarrhea morbidities in Ghana motivated our empirical study. Results indicated improvement regarding model fit and epidemiological benefits. The findings highlighted substantial spatial, temporal, and spatio-temporal heterogeneities in both diarrhea risk and the coefficients of the sociodemographic factors. Diarrhea risk in peri-urban and urban districts was 13.2% and 10.8% higher than in rural districts, respectively. The varying coefficient model indicated further details, as the coefficients varied across districts. A unit increase in the proportion of inhabitants with unsafe liquid waste disposal was found to increase diarrhea risk by 11.5%, with higher percentages within the south-central parts through to the south-western parts. Districts with safe and unsafe drinking water sources unexpectedly had a similar risk, as did districts with safe and unsafe toilets. The findings show that site-specific interventions need to consider the varying effects of sociodemographic factors.

  12. Low Energy Atomic Models Suggesting a Pilus Structure that could Account for Electrical Conductivity of Geobacter sulfurreducens Pili.

    Science.gov (United States)

    Xiao, Ke; Malvankar, Nikhil S; Shu, Chuanjun; Martz, Eric; Lovley, Derek R; Sun, Xiao

    2016-03-22

    The metallic-like electrical conductivity of Geobacter sulfurreducens pili has been documented with multiple lines of experimental evidence, but there is only a rudimentary understanding of the structural features which contribute to this novel mode of biological electron transport. In order to determine if it was feasible for the pilin monomers of G. sulfurreducens to assemble into a conductive filament, theoretical energy-minimized models of Geobacter pili were constructed with a previously described approach, in which pilin monomers are assembled using randomized structural parameters and distance constraints. The lowest energy models from a specific group of predicted structures lacked a central channel, in contrast to previously existing pili models. In half of the no-channel models the three N-terminal aromatic residues of the pilin monomer are arranged in a potentially electrically conductive geometry, sufficiently close to account for the experimentally observed metallic like conductivity of the pili that has been attributed to overlapping pi-pi orbitals of aromatic amino acids. These atomic resolution models capable of explaining the observed conductive properties of Geobacter pili are a valuable tool to guide further investigation of the metallic-like conductivity of the pili, their role in biogeochemical cycling, and applications in bioenergy and bioelectronics.

  13. Diagnostic and prognostic simulations with a full Stokes model accounting for superimposed ice of Midtre Lovénbreen, Svalbard

    Directory of Open Access Journals (Sweden)

    T. Zwinger

    2009-11-01

    We present steady state (diagnostic) and transient (prognostic) simulations of Midtre Lovénbreen, Svalbard, performed with the thermo-mechanically coupled full-Stokes code Elmer. This glacier has an extensive data set of geophysical measurements available spanning several decades, which allows for constraints on model descriptions. Consistent with this data set, we included a simple model accounting for the formation of superimposed ice. Diagnostic results indicated that a dynamic adaptation of the free surface is necessary to prevent non-physically high velocities in a region of underdetermined bedrock depths. Observations from ground penetrating radar of the basal thermal state agree very well with model predictions, while the dip angles of isochrones in radar data also match reasonably well with modelled isochrones, despite the numerical deficiencies of estimating ages with a steady state model.

    Prognostic runs for 53 years, using a constant accumulation/ablation pattern and starting from the steady state solution obtained for the configuration of the 1977 DEM, show that: 1) the unrealistic velocities in the underdetermined parts of the DEM quickly damp out; 2) the free surface evolution matches the measured elevation changes well; 3) the retreat of the glacier under this scenario continues, with the glacier tongue in a projection to 2030 being situated ≈500 m behind its position in 1977.

  14. Accounting for rigid support at the boundary in a mixed finite element model in problems of ice cover destruction

    Directory of Open Access Journals (Sweden)

    V. V. Knyazkov

    2014-01-01

    Evaluating the force needed to damage ice covers is necessary for estimating the icebreaking capability of vessels, the hull strength of icebreakers, and the navigation of ships in ice conditions. On the other hand, the use of the ice cover as a support for construction works carried out from the ice is also of practical interest. By the present moment a great deal of research on ice cover deformation has been carried out, usually resulting in approximate calculation formulas obtained after making a variety of assumptions. Nevertheless, we believe that further improvement of the calculations is possible. The application of numerical methods, for example the FEM, makes it possible to avoid numerous drawbacks of analytical methods dealing with complex boundaries, load application areas and other peculiarities of the problem. The article considers the application of mixed FEM models for investigating ice cover deformation. A simple flexible triangular element of mixed type was taken to solve this problem. The vector of generalized coordinates of the element contains the deflections at its apices and the normal bending moments in the middle of its sides. Compared to other elements, mixed models easily satisfy compatibility requirements on the boundary of adjacent elements and do not require numerical differentiation of displacements to obtain bending moments, because the bending moments are included in the vector of element generalized coordinates. A method of accounting for rigid support of the plate is proposed. The resulting relation, taking into account the "stiffening", reduces the size of the resolving system of equations by the number of elements on the plate contour. To evaluate the results of the numerical solution of the ice cover stress-strain problem, it is necessary to check whether the calculation results correspond to the accurate solution. Using the example of a circular plate, the convergence of numerical solutions to analytical solutions is shown. The article

  15. Model cortical association fields account for the time course and dependence on target complexity of human contour perception.

    Directory of Open Access Journals (Sweden)

    Vadas Gintautas

    2011-10-01

    Full Text Available Can lateral connectivity in the primary visual cortex account for the time dependence and intrinsic task difficulty of human contour detection? To answer this question, we created a synthetic image set that prevents sole reliance on either low-level visual features or high-level context for the detection of target objects. Rendered images consist of smoothly varying, globally aligned contour fragments (amoebas) distributed among groups of randomly rotated fragments (clutter). The time course and accuracy of amoeba detection by humans was measured using a two-alternative forced choice protocol with self-reported confidence and variable image presentation time (20-200 ms), followed by an image mask optimized so as to interrupt visual processing. Measured psychometric functions were well fit by sigmoidal functions with exponential time constants of 30-91 ms, depending on amoeba complexity. Key aspects of the psychophysical experiments were accounted for by a computational network model, in which simulated responses across retinotopic arrays of orientation-selective elements were modulated by cortical association fields, represented as multiplicative kernels computed from the differences in pairwise edge statistics between target and distractor images. Comparing the experimental and the computational results suggests that each iteration of the lateral interactions takes at least [Formula: see text] ms of cortical processing time. Our results provide evidence that cortical association fields between orientation-selective elements in early visual areas can account for important temporal and task-dependent aspects of the psychometric curves characterizing human contour perception, with the remaining discrepancies postulated to arise from the influence of higher cortical areas.
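
    As a minimal illustration of fitting accuracy against presentation time with an exponential time constant, the sketch below fits a saturating curve that rises from chance in a two-alternative task toward an asymptote limited by a lapse rate. The functional form, parameter names, and data values are illustrative assumptions, not the study's model or data.

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(t_ms, tau, lapse):
    """Accuracy vs. presentation time: rises from chance (0.5 in a 2-AFC task)
    toward (1 - lapse) with exponential time constant tau (ms)."""
    return 0.5 + (0.5 - lapse) * (1.0 - np.exp(-t_ms / tau))

# Hypothetical accuracies for one amoeba-complexity condition.
t = np.array([20, 40, 60, 80, 120, 160, 200], dtype=float)   # presentation time, ms
acc = np.array([0.55, 0.68, 0.78, 0.84, 0.90, 0.92, 0.93])

(tau_hat, lapse_hat), _ = curve_fit(psychometric, t, acc, p0=(50.0, 0.05))
print(f"time constant: {tau_hat:.1f} ms, lapse rate: {lapse_hat:.3f}")
```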

  16. Accounting for geochemical alterations of caprock fracture permeability in basin-scale models of leakage from geologic CO2 reservoirs

    Science.gov (United States)

    Guo, B.; Fitts, J. P.; Dobossy, M.; Bielicki, J. M.; Peters, C. A.

    2012-12-01

    Climate mitigation, public acceptance, and energy markets demand that the potential CO2 leakage rates from geologic storage reservoirs be predicted to be low and be known with a high level of certainty. Current approaches to predicting CO2 leakage rates assume constant permeability of leakage pathways (e.g., wellbores, faults, fractures). A reactive transport model was developed to account for geochemical alterations that result in permeability evolution of leakage pathways. The one-dimensional reactive transport model was coupled with the basin-scale Estimating Leakage Semi-Analytical (ELSA) model to simulate CO2 and brine leakage through vertical caprock pathways for different CO2 storage reservoir sites and injection scenarios within the Mt. Simon and St. Peter sandstone formations of the Michigan basin. In the numerical reactive transport model, calcite dissolution driven by CO2-acidified brine expands leakage pathways and increases their permeability. A geochemical model compared kinetic and equilibrium treatments of calcite dissolution within each grid block for each time step. For a single fracture, we investigated the effect of the reactions on leakage by performing sensitivity analyses of fracture geometry, CO2 concentration, calcite abundance, initial permeability, and pressure gradient. Assuming that calcite dissolution reaches equilibrium at each time step produces unrealistic scenarios of buffering and permeability evolution within fractures. Therefore, the reactive transport model with a kinetic treatment of calcite dissolution was coupled to the ELSA model and used to compare brine and CO2 leakage rates at a variety of potential geologic storage sites within the Michigan basin. The results are used to construct maps based on the susceptibility to geochemically driven increases in leakage rates. These maps should provide useful and easily communicated inputs into decision-making processes for siting geologic CO2
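
    To make the kinetic treatment concrete, the sketch below advances a single fracture aperture under calcite dissolution that is first order in undersaturation and converts the aperture to an equivalent permeability with the parallel-plate (cubic-law) relation. The rate constant, molar volume, saturation state, and time stepping are illustrative assumptions, not values from the study.

```python
import numpy as np

K_RATE = 1e-6    # calcite dissolution rate constant, mol m^-2 s^-1 (illustrative)
V_M = 3.69e-5    # molar volume of calcite, m^3 mol^-1

def step_aperture(aperture, omega, dt):
    """Advance fracture aperture by kinetic calcite dissolution; the rate is
    first order in undersaturation (1 - omega) and both walls retreat."""
    rate = K_RATE * max(1.0 - omega, 0.0)        # mol m^-2 s^-1
    return aperture + 2.0 * rate * V_M * dt      # m

def cubic_law_permeability(aperture):
    """Equivalent permeability of a smooth parallel-plate fracture."""
    return aperture ** 2 / 12.0                  # m^2

a = 1e-4                                         # initial aperture: 100 micrometres
for _ in range(365):                             # one year of daily steps
    a = step_aperture(a, omega=0.2, dt=86400.0)  # CO2-acidified, undersaturated brine
print(f"aperture: {a * 1e6:.1f} um, permeability: {cubic_law_permeability(a):.2e} m^2")
```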

  17. Serviceability limit state related to excessive lateral deformations to account for infill walls in the structural model

    Directory of Open Access Journals (Sweden)

    G. M. S. ALVA

    Full Text Available Brazilian Codes NBR 6118 and NBR 15575 provide practical values for interstory drift limits applied to conventional modeling in order to prevent negative effects in masonry infill walls caused by excessive lateral deformability; however, these codes do not account for infill walls in the structural model. The inclusion of infill walls in the proposed model allows for a quantitative evaluation of structural stresses in these walls and an assessment of cracking in these elements (sliding shear, diagonal tension, and diagonal compression cracking). This paper presents the results of simulations of single-story, one-bay infilled R/C frames. The main objective is to show how to check the serviceability limit states under lateral loads when the infill walls are included in the modeling. The results of numerical simulations allowed for an evaluation of stresses and the probable cracking pattern in infill walls. The results also allowed an identification of some advantages and limitations of the NBR 6118 practical procedure based on interstory drift limits.

  18. Analytical model and design of spoke-type permanent-magnet machines accounting for saturation and nonlinearity of magnetic bridges

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Peixin; Chai, Feng [State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001 (China); Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China); Bi, Yunlong [Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China); Pei, Yulong, E-mail: peiyulong1@163.com [Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China); Cheng, Shukang [State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001 (China); Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China)

    2016-11-01

    Based on a subdomain model, this paper presents an analytical method for predicting the no-load magnetic field distribution, back-EMF and torque in general spoke-type motors with magnetic bridges. Taking into account the saturation and nonlinearity of the magnetic material, the magnetic bridges are treated as equivalent fan-shaped saturation regions. To obtain standard boundary conditions, a lumped-parameter magnetic circuit model and an iterative method are employed to calculate the permeability. The final field domain is divided into five types of simple subdomains. Based on the method of separation of variables, the analytical expression for each subdomain is derived. The analytical results for the magnetic field distribution, back-EMF and torque are verified by the finite element method, which confirms the validity of the proposed model for facilitating motor design and optimization. - Highlights: • The no-load magnetic field of spoke-type motors is first calculated by an analytical method. • A magnetic circuit model and an iterative method are employed to calculate the permeability. • The analytical expression for each subdomain is derived. • The proposed method can effectively reduce the duration of the predesign stages.

  19. Ensemble-based flash-flood modelling: Taking into account hydrodynamic parameters and initial soil moisture uncertainties

    Science.gov (United States)

    Edouard, Simon; Vincendon, Béatrice; Ducrocq, Véronique

    2018-05-01

    Intense precipitation events in the Mediterranean often lead to devastating flash floods (FF). FF modelling is affected by several kinds of uncertainties, and Hydrological Ensemble Prediction Systems (HEPS) are designed to take those uncertainties into account. The major source of uncertainty comes from rainfall forcing, and convective-scale meteorological ensemble prediction systems can manage it for forecasting purposes. But other sources are related to the hydrological modelling part of the HEPS. This study focuses on the uncertainties arising from the hydrological model parameters and initial soil moisture, with the aim of designing an ensemble-based version of a hydrological model dedicated to simulating Mediterranean fast-responding rivers, the ISBA-TOP coupled system. The first step consists of identifying the parameters that have the strongest influence on FF simulations by assuming perfect precipitation. A sensitivity study is carried out first using a synthetic framework and then for several real events and several catchments. Perturbation methods varying the most sensitive parameters as well as initial soil moisture allow the design of an ensemble-based version of ISBA-TOP. The first results of this system on some real events are presented. The direct perspective of this work will be to drive this ensemble-based version with the members of a convective-scale meteorological ensemble prediction system to design a complete HEPS for FF forecasting.
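
    The sketch below shows one way such an ensemble could be generated: each member perturbs a multiplicative factor on a sensitive hydrodynamic parameter and adds bounded noise to an initial soil-wetness index. The parameter names, distributions, and spreads are hypothetical placeholders, not the ISBA-TOP configuration.

```python
import numpy as np

rng = np.random.default_rng(42)
N_MEMBERS = 50

# Nominal values for two (hypothetical) sensitive inputs.
nominal = {"sat_hydraulic_cond": 1e-6,    # m/s, perturbed multiplicatively
           "initial_soil_wetness": 0.55}  # fraction of saturation, perturbed additively

def draw_member():
    """One ensemble member: log-normal factor on conductivity,
    bounded additive noise on the initial soil-moisture index."""
    member = dict(nominal)
    member["sat_hydraulic_cond"] *= rng.lognormal(mean=0.0, sigma=0.5)
    member["initial_soil_wetness"] = float(
        np.clip(member["initial_soil_wetness"] + rng.normal(0.0, 0.1), 0.0, 1.0))
    return member

# Each member would drive one hydrological run; the spread of the simulated
# discharges quantifies the parameter and initial-moisture uncertainty.
ensemble = [draw_member() for _ in range(N_MEMBERS)]
print(ensemble[0])
```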

  20. A statistical human rib cage geometry model accounting for variations by age, sex, stature and body mass index.

    Science.gov (United States)

    Shi, Xiangnan; Cao, Libo; Reed, Matthew P; Rupp, Jonathan D; Hoff, Carrie N; Hu, Jingwen

    2014-07-18

    In this study, we developed a statistical rib cage geometry model accounting for variations by age, sex, stature and body mass index (BMI). Thorax CT scans were obtained from 89 subjects approximately evenly distributed among 8 age groups and both sexes. Threshold-based CT image segmentation was performed to extract the rib geometries, and a total of 464 landmarks on the left side of each subject's rib cage were collected to describe the size and shape of the rib cage as well as the cross-sectional geometry of each rib. Principal component analysis and multivariate regression analysis were conducted to predict rib cage geometry as a function of age, sex, stature, and BMI, all of which showed strong effects on rib cage geometry. Except for BMI, all parameters also showed significant effects on rib cross-sectional area using a linear mixed model. This statistical rib cage geometry model can serve as a geometric basis for developing a parametric human thorax finite element model for quantifying effects from different human attributes on thoracic injury risks. Copyright © 2014 Elsevier Ltd. All rights reserved.
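
    The combination of principal component analysis and multivariate regression described here can be sketched in a few lines: PCA compresses the landmark matrix, regression maps age, sex, stature and BMI to PC scores, and a prediction is reconstructed in landmark space. The random stand-in data, number of retained components, and covariate values are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: 89 subjects, 464 landmarks x 3 coordinates (flattened).
n_subj, n_coord = 89, 464 * 3
X = rng.normal(size=(n_subj, n_coord))
covars = np.column_stack([rng.uniform(20, 80, n_subj),   # age (years)
                          rng.integers(0, 2, n_subj),    # sex (0/1)
                          rng.normal(170, 10, n_subj),   # stature (cm)
                          rng.normal(26, 4, n_subj)])    # BMI

# 1) PCA of the landmark matrix via SVD.
X_mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - X_mean, full_matrices=False)
n_pc = 10
scores = U[:, :n_pc] * s[:n_pc]                          # subject scores on leading PCs

# 2) Multivariate regression of PC scores on age, sex, stature, BMI.
A = np.column_stack([np.ones(n_subj), covars])
B, *_ = np.linalg.lstsq(A, scores, rcond=None)           # (5 x n_pc) coefficients

# 3) Predict the rib cage geometry of a new individual.
new = np.array([1.0, 45.0, 1.0, 175.0, 24.0])            # intercept, age, sex, stature, BMI
predicted_landmarks = X_mean + (new @ B) @ Vt[:n_pc]
print(predicted_landmarks.shape)                         # (1392,)
```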

  1. Model selection for semiparametric marginal mean regression accounting for within-cluster subsampling variability and informative cluster size.

    Science.gov (United States)

    Shen, Chung-Wei; Chen, Yi-Hau

    2018-03-13

    We propose a model selection criterion for semiparametric marginal mean regression based on generalized estimating equations. The work is motivated by a longitudinal study on the physical frailty outcome in the elderly, where the cluster size, that is, the number of the observed outcomes in each subject, is "informative" in the sense that it is related to the frailty outcome itself. The new proposal, called Resampling Cluster Information Criterion (RCIC), is based on the resampling idea utilized in the within-cluster resampling method (Hoffman, Sen, and Weinberg, 2001, Biometrika 88, 1121-1134) and accommodates informative cluster size. The implementation of RCIC, however, is free of performing actual resampling of the data and hence is computationally convenient. Compared with the existing model selection methods for marginal mean regression, the RCIC method incorporates an additional component accounting for variability of the model over within-cluster subsampling, and leads to remarkable improvements in selecting the correct model, regardless of whether the cluster size is informative or not. Applying the RCIC method to the longitudinal frailty study, we identify being female, old age, low income and life satisfaction, and chronic health conditions as significant risk factors for physical frailty in the elderly. © 2018, The International Biometric Society.

  2. Multivariate space-time modelling of multiple air pollutants and their health effects accounting for exposure uncertainty.

    Science.gov (United States)

    Huang, Guowen; Lee, Duncan; Scott, E Marian

    2018-03-30

    The long-term health effects of air pollution are often estimated using a spatio-temporal ecological areal unit study, but this design leads to the following statistical challenges: (1) how to estimate spatially representative pollution concentrations for each areal unit; (2) how to allow for the uncertainty in these estimated concentrations when estimating their health effects; and (3) how to simultaneously estimate the joint effects of multiple correlated pollutants. This article proposes a novel 2-stage Bayesian hierarchical model for addressing these 3 challenges, with inference based on Markov chain Monte Carlo simulation. The first stage is a multivariate spatio-temporal fusion model for predicting areal level average concentrations of multiple pollutants from both monitored and modelled pollution data. The second stage is a spatio-temporal model for estimating the health impact of multiple correlated pollutants simultaneously, which accounts for the uncertainty in the estimated pollution concentrations. The novel methodology is motivated by a new study of the impact of both particulate matter and nitrogen dioxide concentrations on respiratory hospital admissions in Scotland between 2007 and 2011, and the results suggest that both pollutants exhibit substantial and independent health effects. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  3. MODEL OF DISTRIBUTION OF THE BUDGET OF THE PORTFOLIO OF IT PROJECTS TAKING INTO ACCOUNT THEIR PRIORITY

    Directory of Open Access Journals (Sweden)

    Anita V. Sotnikova

    2015-01-01

    Full Text Available The article is devoted to the problem of effectively distributing the overall budget of a portfolio among its constituent IT projects, taking into account their priority. The problem is topical in view of the poor performance of consulting companies in the information technology sphere. To determine the priority of IT projects, the analytic network method developed by T. Saaty is used. For the purpose of applying this method, a system of criteria (indicators) is developed that reflects the influence of the portfolio's IT projects on the most significant goals of their implementation. As the system of criteria, the key performance indicators defined when developing the Balanced Scorecard, which meet the above-mentioned requirements, are used. The essence of the analytic network method consists of pairwise comparisons of the key performance indicators with respect to the goal of implementing the portfolio and of the IT projects that form the portfolio. The result of applying the analytic network method is a priority coefficient for each IT project in the portfolio. The obtained priority coefficients are used in the proposed model for distributing the portfolio budget among the IT projects. Thus, the budget of the IT project portfolio is distributed among the projects taking into account not only the income from implementing each IT project, but also other criteria important for the IT company, for example: the degree to which an IT project corresponds to the strategic objectives of the IT company, which determines the expediency of its implementation; and the implementation term set by the customer. The developed model for distributing the portfolio budget among IT projects is tested on the example of a portfolio consisting of three IT projects. Taking into account the received
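
    The core of the prioritisation step is the derivation of priority coefficients from pairwise comparisons and their use as budget-allocation weights. The sketch below uses the principal-eigenvector prioritisation familiar from Saaty's approach on a single hypothetical comparison matrix; the matrix entries, project names and total budget are illustrative, and the full method in the article additionally works with a network of Balanced Scorecard criteria.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three IT projects
# (Saaty scale; entry [i, j] says how strongly project i is preferred to j).
C = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])

# Priority vector = principal right eigenvector, normalised to sum to one.
eigvals, eigvecs = np.linalg.eig(C)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
priorities = principal / principal.sum()

total_budget = 1_000_000.0
for name, share in zip(["IT project A", "IT project B", "IT project C"],
                       priorities * total_budget):
    print(f"{name}: {share:,.0f}")
```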

  4. Population Modeling of Modified Risk Tobacco Products Accounting for Smoking Reduction and Gradual Transitions of Relative Risk.

    Science.gov (United States)

    Poland, Bill; Teischinger, Florian

    2017-11-01

    As suggested by the Food and Drug Administration (FDA) Modified Risk Tobacco Product (MRTP) Applications Draft Guidance, we developed a statistical model based on public data to explore the effect on population mortality of an MRTP resulting in reduced conventional cigarette smoking. Many cigarette smokers who try an MRTP persist as dual users while smoking fewer conventional cigarettes per day (CPD). Lower-CPD smokers have lower mortality risk based on large cohort studies. However, with little data on the effect of smoking reduction on mortality, predictive modeling is needed. We generalize prior assumptions of gradual, exponential decay of Excess Risk (ER) of death, relative to never-smokers, after quitting or reducing CPD. The same age-dependent slopes are applied to all transitions, including initiation to conventional cigarettes and to a second product (MRTP). A Monte Carlo simulation model generates random individual product use histories, including CPD, to project cumulative deaths through 2060 in a population with versus without the MRTP. Transitions are modeled to and from dual use, which affects CPD and cigarette quit rates, and to MRTP use only. Results in a hypothetical scenario showed high sensitivity of long-run mortality to CPD reduction levels and moderate sensitivity to ER transition rates. Models to project population effects of an MRTP should account for possible mortality effects of reduced smoking among dual users. In addition, studies should follow dual-user CPD histories and quit rates over long time periods to clarify long-term usage patterns and thereby improve health impact projections. We simulated mortality effects of a hypothetical MRTP accounting for cigarette smoking reduction by smokers who add MRTP use. Data on relative mortality risk versus CPD suggest that this reduction may have a substantial effect on mortality rates, unaccounted for in other models. This effect is weighed with additional hypothetical effects in an example.
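
    A central modelling assumption described above is the gradual, exponential decay of the excess risk of death after a smoker quits or reduces cigarettes per day. A minimal sketch of that assumption is given below; the half-life, the excess-risk levels for 20 and 10 CPD, and the transition scenario are hypothetical numbers chosen only to show the mechanics.

```python
import numpy as np

def excess_risk(er_start, er_target, years_since_transition, halflife_years=10.0):
    """Excess risk of death (relative to never-smokers) decaying exponentially
    from its pre-transition level toward the level implied by the new CPD."""
    decay = np.exp(-np.log(2.0) * years_since_transition / halflife_years)
    return er_target + (er_start - er_target) * decay

# Hypothetical example: a 20-CPD smoker becomes a dual user smoking 10 CPD.
er_20cpd, er_10cpd = 2.0, 1.4   # illustrative excess risks
for years in (0, 5, 10, 20):
    print(years, round(excess_risk(er_20cpd, er_10cpd, years), 3))
```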

  5. Tritium accountancy

    International Nuclear Information System (INIS)

    Avenhaus, R.; Spannagel, G.

    1995-01-01

    Conventional accountancy means that for a given material balance area and a given interval of time the tritium balance is established so that at the end of that interval of time the book inventory is compared with the measured inventory. In this way, an optimal effectiveness of accountancy is achieved. However, there are still further objectives of accountancy, namely the timely detection of anomalies as well as the localization of anomalies in a major system. It can be shown that each of these objectives can be optimized only at the expense of the others. Recently, Near-Real-Time Accountancy procedures have been studied; their methodological background as well as their merits will be discussed. (orig.)
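
    The balance test described here (book inventory versus measured inventory at the end of a balance period) can be written down directly. The sketch below computes the material unaccounted for (MUF) and flags an anomaly when it exceeds a multiple of the measurement uncertainty; the quantities and the 3-sigma threshold are illustrative assumptions, not prescribed values.

```python
def material_unaccounted_for(beginning_inventory, receipts, shipments, ending_measured):
    """Book inventory minus measured inventory for one balance period."""
    book = beginning_inventory + receipts - shipments
    return book - ending_measured

def anomaly(muf, sigma_muf, k=3.0):
    """Flag the period if MUF exceeds k standard deviations of measurement error."""
    return abs(muf) > k * sigma_muf

muf = material_unaccounted_for(120.0, 15.0, 10.0, 124.2)   # grams of tritium, say
print(round(muf, 2), anomaly(muf, sigma_muf=0.5))
```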

  6. How robust are the estimated effects of air pollution on health? Accounting for model uncertainty using Bayesian model averaging.

    Science.gov (United States)

    Pannullo, Francesca; Lee, Duncan; Waclawski, Eugene; Leyland, Alastair H

    2016-08-01

    The long-term impact of air pollution on human health can be estimated from small-area ecological studies in which the health outcome is regressed against air pollution concentrations and other covariates, such as socio-economic deprivation. Socio-economic deprivation is multi-factorial and difficult to measure, and includes aspects of income, education, and housing as well as others. However, these variables are potentially highly correlated, meaning one can either create an overall deprivation index or use the individual characteristics, which can result in a variety of pollution-health effects. Other aspects of model choice may also affect the pollution-health estimate, such as the estimation of pollution and the spatial autocorrelation model. Therefore, we propose a Bayesian model averaging approach to combine the results from multiple statistical models to produce a more robust representation of the overall pollution-health effect. We investigate the relationship between nitrogen dioxide concentrations and cardio-respiratory mortality in West Central Scotland between 2006 and 2012. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
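
    The combination step of model averaging can be illustrated compactly: given per-model effect estimates, standard errors, and a model score, weights are formed from the score differences and the averaged estimate carries both within-model and between-model variability. The numbers and the Akaike-type weighting below are illustrative assumptions; the study itself works with posterior model probabilities in a Bayesian setting.

```python
import numpy as np

# Hypothetical per-model results: NO2 effect estimate, its standard error,
# and a model score (lower = better), e.g. an information criterion.
estimates = np.array([0.012, 0.018, 0.009, 0.015])
std_errs = np.array([0.004, 0.005, 0.004, 0.006])
scores = np.array([2101.3, 2099.8, 2104.6, 2100.5])

# Weights from score differences.
delta = scores - scores.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()

# Model-averaged estimate; the variance adds the between-model spread.
beta_avg = np.sum(weights * estimates)
var_avg = np.sum(weights * (std_errs ** 2 + (estimates - beta_avg) ** 2))
print(f"averaged effect: {beta_avg:.4f} +/- {np.sqrt(var_avg):.4f}")
```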

  7. Modeling the Photoelectron Spectra of MoNbO2(-) Accounting for Spin Contamination in Density Functional Theory.

    Science.gov (United States)

    Thompson, Lee M; Hratchian, Hrant P

    2015-08-13

    Spin contamination in density functional studies has been identified as a cause of discrepancies between theoretical and experimental spectra of metal oxide clusters such as MoNbO2. We perform calculations to simulate the photoelectron spectra of the MoNbO2 anion using broken-symmetry density functional theory incorporating recently developed approximate projection methods. These calculations are able to account for the presence of contaminating spin states at single-reference computational cost. Results using these new tools demonstrate the significant effect of spin-contamination on geometries and force constants and show that the related errors in simulated spectra may be largely overcome by using an approximate projection model.

  8. Accounting for pH heterogeneity and variability in modelling human health risks from cadmium in contaminated land

    International Nuclear Information System (INIS)

    Gay, J. Rebecca; Korre, Anna

    2009-01-01

    The authors have previously published a methodology which combines quantitative probabilistic human health risk assessment and spatial statistical methods (geostatistics) to produce an assessment, incorporating uncertainty, of risks to human health from exposure to contaminated land. The model assumes a constant soil-to-plant concentration factor (CFveg) when calculating intake of contaminants. This model is modified here to enhance its use in a situation where CFveg varies according to soil pH, as is the case for cadmium. The original methodology uses sequential indicator simulation (SIS) to map soil concentration estimates for one contaminant across a site. A real, age-stratified population is mapped across the contaminated area, and intake of soil contaminants by individuals is calculated probabilistically using an adaptation of the Contaminated Land Exposure Assessment (CLEA) model. The proposed improvement involves not only the geostatistical estimation of the contaminant concentration, but also that of soil pH, which in turn leads to a variable CFveg estimate which influences the human intake results. The results presented demonstrate that taking pH into account can influence the outcome of the risk assessment greatly. It is proposed that a similar adaptation could be used for other combinations of soil variables which influence CFveg.
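
    The pH dependence enters through the soil-to-plant concentration factor used in the intake calculation. The sketch below shows the shape of such a calculation for the vegetable pathway, with a CFveg that decreases as pH rises; the coefficients, consumption rate and body weight are illustrative assumptions and are not the CLEA parameter values.

```python
def cf_veg_cadmium(soil_ph):
    """Illustrative soil-to-plant concentration factor for cadmium,
    decreasing log-linearly as pH rises (uptake is higher in acid soils)."""
    return 10 ** (0.8 - 0.3 * (soil_ph - 5.0))

def daily_intake(c_soil_mg_kg, soil_ph, veg_consumption_kg_day, body_weight_kg):
    """Vegetable-pathway intake of cadmium, mg per kg body weight per day."""
    c_veg = c_soil_mg_kg * cf_veg_cadmium(soil_ph)
    return c_veg * veg_consumption_kg_day / body_weight_kg

# Same soil concentration, two simulated pH values at the same site.
for ph in (5.5, 7.5):
    print(ph, round(daily_intake(2.0, ph, 0.3, 70.0), 5))
```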

  9. Goals and Psychological Accounting

    DEFF Research Database (Denmark)

    Koch, Alexander Karl; Nafziger, Julia

    We model how people formulate and evaluate goals to overcome self-control problems. People often attempt to regulate their behavior by evaluating goal-related outcomes separately (in narrow psychological accounts) rather than jointly (in a broad account). To explain this evidence, our theory of endogenous narrow or broad psychological accounts combines insights from the literatures on goals and mental accounting with models of expectations-based reference-dependent preferences. By formulating goals the individual creates expectations that induce reference points for task outcomes. These goal-induced reference points make substandard performance psychologically painful and motivate the individual to stick to his goals. How strong the commitment to goals is depends on the type of psychological account. We provide conditions when it is optimal to evaluate goals in narrow accounts. The key intuition...

  10. Proper Tools Helping Sustainability in Logistics Practice

    NARCIS (Netherlands)

    Alrik Stelling; Nico Lamers; Gerard Vos; Reinder Pieters; Stef Weijers; Erik Koekebakker

    2009-01-01

    Proliferation on sustainability is a must for quite a lot of companies. Logisticians could use models in attaining sustainability, or at least in understanding its potential. A sustainable business plan must be based on a clear vision and must be underpinned thoroughly, in order to get the board

  11. An empirical test of Birkett’s competency model for management accountants : A confirmative study conducted in the Netherlands

    NARCIS (Netherlands)

    Bots, J.M.; Groenland, E.A.G.; Swagerman, D.

    2009-01-01

    In 2002, the Accountants-in-Business section of the International Federation of Accountants (IFAC) issued the Competency Profiles for Management Accounting Practice and Practitioners report. This “Birkett Report” presents a framework for competency development during the careers of management

  12. A fundamentalist perspective on accounting and implications for accounting

    OpenAIRE

    Guohua Jiang; Stephen Penman

    2013-01-01

    This paper presents a framework for addressing normative accounting issues for reporting to shareholders. The framework is an alternative to the emerging Conceptual Framework of the International Accounting Standards Board and the Financial Accounting Standards Board. The framework can be broadly characterized as a utilitarian approach to accounting standard setting. It has two main features. First, accounting is linked to valuation models under which shareholders use accounting information t...

  13. Proper Elements and Secular Resonances for Irregular Satellites

    Science.gov (United States)

    Beaugé, C.; Nesvorný, D.

    2007-06-01

    We present results of an analytical study of proper elements and secular resonances for the irregular satellites of the outer planets. In the case of the Jovian system we identify three satellite families, two of them previously known (Carme and Ananke), plus a new agglomeration of four bodies that includes Pasiphae as its largest member. While the distribution of proper elements for Saturn's moons seems to be more random, a small cluster was found for the direct moons formed by Albiorix, Erriapo, and 2004 S1, slightly different from the so-called Gaulish cluster. No significant families are detected in the present study for the Uranian or Neptunian satellite systems. For each satellite system we determine the location of several secular resonances in the proper element space. Apart from the well-known resonance locks of Pasiphae, Sinope, and Siarnaq, a comparison between the resonance locations and proper elements shows that Saturn's satellite Narvi also exhibits temporary librations in the ϖ - ϖ_solar resonance. However, unlike the resonant Jovian moons that are located in the same configuration, Narvi's critical argument librates alternately around values near 90° and 270°. Neither the Uranian nor Neptunian systems seem to have resonant moons. The resonant dynamics of the real satellites in the vicinity of ϖ˙ - ϖ˙_solar = 0 is studied with a simple model for secular resonances based on the restricted three-body problem. Depending on the initial conditions, we show the existence of one or two modes of libration that can occur at different values of the critical angle, showing a good correspondence with the observed behavior of all the resonant moons. Finally, we discuss the global distribution of the real satellites with respect to the secular resonances, as compared with synthetic populations of bodies drawn solely from stability conditions. For Saturn, we find that the present satellite population appears compatible with simple random distributions. Although

  14. Vertical velocities from proper motions of red clump giants

    Science.gov (United States)

    López-Corredoira, M.; Abedi, H.; Garzón, F.; Figueras, F.

    2014-12-01

    Aims: We derive the vertical velocities of disk stars in the range of Galactocentric radii of R = 5 - 16 kpc, within 2 kpc in height from the Galactic plane. This kinematic information is connected to dynamical aspects in the formation and evolution of the Milky Way, such as the passage of satellites and vertical resonances, and determines whether the warp is a long-lived or a transient feature. Methods: We used the PPMXL survey, which contains the USNO-B1 proper motions catalog cross-correlated with the astrometry and near-infrared photometry of the 2MASS point source catalog. To improve the accuracy of the proper motions, the systematic shifts from zero were calculated by using the average proper motions of quasars in this PPMXL survey, and we applied the corresponding correction to the proper motions of the whole survey, which reduces the systematic error. From the color-magnitude diagram K versus (J - K) we selected the standard candles corresponding to red clump giants and used the information of their proper motions to build a map of the vertical motions of our Galaxy. We derived the kinematics of the warp both analytically and through a particle simulation to fit these data. Complementarily, we also carried out the same analysis with red clump giants spectroscopically selected with APOGEE data, and we predict the improvements in accuracy that will be reached with future Gaia data. Results: A simple model of warp with the height of the disk z_w(R,φ) = γ(R - R_⊙) sin(φ - φ_w) fits the vertical motions if γ̇/γ = -34 ± 17 Gyr^-1; the contribution to γ̇ comes from the southern warp and is negligible in the north. If we assume this 2σ detection to be real, the period of this oscillation is shorter than 0.43 Gyr at 68.3% C.L. and shorter than 4.64 Gyr at 95.4% C.L., which excludes with high confidence the slow variations (periods longer than 5 Gyr) that correspond to long-lived features. Our particle simulation also indicates a probable abrupt decrease
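
    In a warp model of the form z_w(R,φ) = γ(R - R_⊙) sin(φ - φ_w), a time-varying amplitude contributes a vertical velocity v_z = γ̇ (R - R_⊙) sin(φ - φ_w), which is linear in γ̇ and can be fitted by ordinary least squares. The sketch below recovers γ̇ from mock red-clump data; the line-of-nodes azimuth, the true γ̇, the sample size, and the velocity noise are all illustrative assumptions, not the paper's values or its full analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
R_SUN, PHI_W = 8.0, 0.0      # kpc; assumed azimuth of the warp line of nodes (rad)

# Mock red clump sample: radius, azimuth, and observed vertical velocity.
R = rng.uniform(5.0, 16.0, 5000)                  # kpc
phi = rng.uniform(-np.pi, np.pi, 5000)            # rad
gamma_dot_true = -0.20                            # Gyr^-1 (illustrative)
noise = rng.normal(0.0, 10.0, R.size)             # kpc/Gyr, roughly km/s
v_z = gamma_dot_true * (R - R_SUN) * np.sin(phi - PHI_W) + noise

# v_z is linear in gamma_dot: a one-parameter least-squares fit.
basis = (R - R_SUN) * np.sin(phi - PHI_W)
gamma_dot_hat = np.sum(basis * v_z) / np.sum(basis ** 2)
sigma = np.sqrt(np.var(v_z - gamma_dot_hat * basis) / np.sum(basis ** 2))
print(f"gamma_dot = {gamma_dot_hat:.3f} +/- {sigma:.3f} Gyr^-1")
```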

  15. Modelling of the physico-chemical behaviour of clay minerals with a thermo-kinetic model taking into account particle morphology in compacted materials.

    Science.gov (United States)

    Sali, D.; Fritz, B.; Clément, C.; Michau, N.

    2003-04-01

    Modelling of fluid-mineral interactions is widely used in Earth Sciences studies to better understand the physicochemical processes involved and their long-term effect on the behaviour of materials. Numerical models simplify the processes but try to preserve their main characteristics. The modelling results therefore depend strongly on the quality of the data describing the initial physicochemical conditions of the rock materials, fluids and gases, and on how realistically the processes are represented. Current geochemical models do not adequately take into account rock porosity and permeability or the particle morphology of clay minerals. In compacted materials like those considered as barriers in waste repositories, low-permeability rocks like mudstones or compacted powders will be used: they contain mainly fine particles, and the geochemical models used for predicting their interactions with fluids tend to misjudge their surface areas, which are fundamental parameters in kinetic modelling. The purpose of this study was to improve the way particle morphology is taken into account in the thermo-kinetic code KINDIS and the reactive transport code KIRMAT. A new function was integrated in these codes, treating the reactive surface area as a volume-dependent parameter, and the calculated evolution of the mass balance in the system was coupled with the evolution of the reactive surface areas. We carried out application exercises for numerical validation of these new versions of the codes and compared the results with those of the pre-existing thermo-kinetic code KINDIS. Several points are highlighted. Taking reactive surface area evolution into account during the simulation modifies the predicted mass transfers related to fluid-mineral interactions. Different secondary mineral phases are also observed during modelling. The evolution of the reactive surface parameter helps to resolve the competition effects between the different phases present in the system, which are all able to fix the chemical

  16. On the importance of accounting for competing risks in pediatric brain cancer: II. Regression modeling and sample size.

    Science.gov (United States)

    Tai, Bee-Choo; Grundy, Richard; Machin, David

    2011-03-15

    To accurately model the cumulative need for radiotherapy in trials designed to delay or avoid irradiation among children with malignant brain tumor, it is crucial to account for competing events and evaluate how each contributes to the timing of irradiation. An appropriate choice of statistical model is also important for adequate determination of sample size. We describe the statistical modeling of competing events (A, radiotherapy after progression; B, no radiotherapy after progression; and C, elective radiotherapy) using proportional cause-specific and subdistribution hazard functions. The procedures of sample size estimation based on each method are outlined. These are illustrated by use of data comparing children with ependymoma and other malignant brain tumors. The results from these two approaches are compared. The cause-specific hazard analysis showed a reduction in hazards among infants with ependymoma for all event types, including Event A (adjusted cause-specific hazard ratio, 0.76; 95% confidence interval, 0.45-1.28). Conversely, the subdistribution hazard analysis suggested an increase in hazard for Event A (adjusted subdistribution hazard ratio, 1.35; 95% confidence interval, 0.80-2.30), but the reduction in hazards for Events B and C remained. Analysis based on subdistribution hazard requires a larger sample size than the cause-specific hazard approach. Notable differences in effect estimates and anticipated sample size were observed between methods when the main event showed a beneficial effect whereas the competing events showed an adverse effect on the cumulative incidence. The subdistribution hazard is the most appropriate for modeling treatment when its effects on both the main and competing events are of interest. Copyright © 2011 Elsevier Inc. All rights reserved.
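
    The distinction drawn above between cause-specific hazards and the cumulative incidence of each event type can be illustrated with a small simulation. With constant cause-specific hazards, event times are exponential in the total hazard and the cause is drawn in proportion to its hazard; the cumulative incidence of one cause then depends on all hazards, which is why effects on competing events matter. The hazard values, horizon and labels below are illustrative assumptions, not the trial's estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 10_000

# Constant cause-specific hazards (per year) for the three competing events:
# A = radiotherapy after progression, B = no radiotherapy after progression,
# C = elective radiotherapy.  Values are illustrative only.
hazards = {"A": 0.10, "B": 0.05, "C": 0.08}
total = sum(hazards.values())

# With constant hazards, the event time is Exp(total) and the cause is drawn
# with probability hazard_k / total, independently of the time.
times = rng.exponential(1.0 / total, N)
causes = rng.choice(list(hazards), size=N, p=[h / total for h in hazards.values()])

def cumulative_incidence(cause, horizon_years):
    """P(an event of this cause occurs by the horizon), competing events respected."""
    return np.mean((causes == cause) & (times <= horizon_years))

for k in hazards:
    print(k, round(cumulative_incidence(k, 3.0), 3))
```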

  17. ABACC - Brazil-Argentina Agency for Accounting and Control of Nuclear Materials, a model of integration and transparence

    International Nuclear Information System (INIS)

    Oliveira, Antonio A.; Do Canto, Odilon Marcusso

    2013-01-01

    Argentina and Brazil began their activities in the nuclear area at about the same time, in the 1950s. The existence of an international nuclear non-proliferation treaty (NPT), seen by Brazil and Argentina as discriminatory and prejudicial to the interests of countries without nuclear weapons, led to the need for a common system of control of nuclear material between the two countries, to somehow provide assurances to the international community of the exclusively peaceful purpose of their nuclear programs. The creation of a common system assured the establishment of uniform procedures to implement safeguards in Argentina and Brazil, so that the same safeguards requirements and procedures took effect in both countries, and the operators of nuclear facilities began to follow the same rules of control of nuclear materials and were subjected to the same type of verification and control. On July 18, 1991, the Bilateral Agreement for the Exclusively Peaceful Use of Nuclear Energy created a binational body, the Argentina-Brazil Agency for Accounting and Control of Nuclear Materials (ABACC), to implement the so-called Common System of Accounting and Control of Nuclear Materials (SCCC). The agreement provided a permanent and clear commitment to use all material and nuclear facilities under the jurisdiction or control of the two countries exclusively for peaceful purposes. The Quadripartite Agreement, signed in December of that year between the two countries, ABACC and the IAEA, completed the legal framework for the implementation of the comprehensive safeguards system. The 'ABACC model' now represents a paradigmatic framework in the long process of economic, political, technological and cultural integration of the two countries. Argentina and Brazil were able to establish a guarantee system that is unique in the world today and that, having consolidated and matured over more than twenty years, has earned the respect of the international community

  18. Comprehensive Revenue and Expense Data Collection Methodology for Teaching Health Centers: A Model for Accountable Graduate Medical Education Financing.

    Science.gov (United States)

    Regenstein, Marsha; Snyder, John E; Jewers, Mariellen Malloy; Nocella, Kiki; Mullan, Fitzhugh

    2018-04-01

    Despite considerable federal investment, graduate medical education financing is neither transparent for estimating residency training costs nor accountable for effectively producing a physician workforce that matches the nation's health care needs. The Teaching Health Center Graduate Medical Education (THCGME) program's authorization in 2010 provided an opportunity to establish a more transparent financing mechanism. We developed a standardized methodology for quantifying the necessary investment to train primary care physicians in high-need communities. The THCGME Costing Instrument was designed utilizing guidance from site visits, financial documentation, and expert review. It collects educational outlays, patient service expenses and revenues from residents' ambulatory and inpatient care, and payer mix. The instrument was fielded from April to November 2015 in 43 THCGME-funded residency programs of varying specialties and organizational structures. Of the 43 programs, 36 programs (84%) submitted THCGME Costing Instruments. The THCGME Costing Instrument collected standardized, detailed cost data on residency labor (n = 36), administration and educational outlays (n = 33), ambulatory care visits and payer mix (n = 30), patient service expenses (n =  26), and revenues generated by residents (n = 26), in contrast to Medicare cost reports, which include only costs incurred by residency programs. The THCGME Costing Instrument provides a model for calculating evidence-based costs and revenues of community-based residency programs, and it enhances accountability by offering an approach that estimates residency costs and revenues in a range of settings. The instrument may have feasibility and utility for application in other residency training settings.

  19. Signaling and Accounting Information

    OpenAIRE

    Stewart C. Myers

    1989-01-01

    This paper develops a signaling model in which accounting information improves real investment decisions. Pure cash flow reporting is shown to lead to underinvestment when managers have superior information but are acting in shareholders' interests. Accounting by prespecified, "objective" rules alleviates the underinvestment problem.

  20. Towards ecosystem accounting

    NARCIS (Netherlands)

    Duku, C.; Rathjens, H.; Zwart, S.J.; Hein, L.

    2015-01-01

    Ecosystem accounting is an emerging field that aims to provide a consistent approach to analysing environment-economy interactions. One of the specific features of ecosystem accounting is the distinction between the capacity and the flow of ecosystem services. Ecohydrological modelling to support

  1. Basis of accountability system

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    The first part of this presentation describes in an introductory manner the accountability design approach which is used for the Model Plant in order to meet US safeguards requirements. The general requirements for the US national system are first presented. Next, the approach taken to meet each general requirement is described. The general concepts and principles of the accountability system are introduced. The second part of this presentation describes some basic concepts and techniques used in the model plant accounting system and relates them to US safeguards requirements. The specifics and mechanics of the model plant accounting system are presented in the third part. The purpose of this session is to enable participants to: (1) understand how the accounting system is designed to meet safeguards criteria for both IAEA and State Systems; (2) understand the principles of materials accounting used to account for element and isotope in the model plant; (3) understand how the computer-based accounting system operates to meet the above objectives

  2. Eye movement control in reading: accounting for initial fixation locations and refixations within the E-Z Reader model.

    Science.gov (United States)

    Reichle, E D; Rayner, K; Pollatsek, A

    1999-10-01

    Reilly and O'Regan (1998, Vision Research, 38, 303-317) used computer simulations to evaluate how well several different word-targeting strategies could account for results which show that the distributions of fixation locations in reading are systematically related to low-level oculomotor variables, such as saccade distance and launch site [McConkie, Kerr, Reddix & Zola, (1988). Vision Research, 28, 1107-1118]. Their simulation results suggested that fixation locations are primarily determined by word length information, and that the processing of language, such as the identification of words, plays only a minimal role in deciding where to move the eyes. This claim appears to be problematic for our model of eye movement control in reading, E-Z Reader [Rayner, Reichle & Pollatsek (1998). Eye movement control in reading: an overview and model. In G. Underwood, Eye guidance in reading and scene perception (pp. 243-268). Oxford, UK: Elsevier; Reichle, Pollatsek, Fisher & Rayner (1998). Psychological Review, 105, 125-157], because it assumes that lexical access is the engine that drives the eyes forward during reading. However, we show that a newer version of E-Z Reader which still assumes that lexical access is the engine driving eye movements also predicts the locations of fixations and within-word refixations, and therefore provides a viable framework for understanding how both linguistic and oculomotor variables affect eye movements in reading.

  3. Accounts Assistant

    Indian Academy of Sciences (India)

    CHITRA

    (Not more than three months old). Annexure 1. Indian Academy of Sciences. C V Raman Avenue, Bengaluru 560 080. Application for the Post of: Accounts Assistant / Administrative Assistant Trainee / Assistant – Official Language. Implementation Policy / Temporary Copy Editor and Proof Reader / Social Media Manager. 1.

  4. Estimating the Societal Benefits of THA After Accounting for Work Status and Productivity: A Markov Model Approach.

    Science.gov (United States)

    Koenig, Lane; Zhang, Qian; Austin, Matthew S; Demiralp, Berna; Fehring, Thomas K; Feng, Chaoling; Mather, Richard C; Nguyen, Jennifer T; Saavoss, Asha; Springer, Bryan D; Yates, Adolph J

    2016-12-01

    Demand for total hip arthroplasty (THA) is high and expected to continue to grow during the next decade. Although much of this growth includes working-aged patients, cost-effectiveness studies on THA have not fully incorporated the productivity effects from surgery. We asked: (1) What is the expected effect of THA on patients' employment and earnings? (2) How does accounting for these effects influence the cost-effectiveness of THA relative to nonsurgical treatment? Taking a societal perspective, we used a Markov model to assess the overall cost-effectiveness of THA compared with nonsurgical treatment. We estimated direct medical costs using Medicare claims data and indirect costs (employment status and worker earnings) using regression models and nonparametric simulations. For direct costs, we estimated average spending 1 year before and after surgery. Spending estimates included physician and related services, hospital inpatient and outpatient care, and postacute care. For indirect costs, we estimated the relationship between functional status and productivity, using data from the National Health Interview Survey and regression analysis. Using regression coefficients and patient survey data, we ran a nonparametric simulation to estimate productivity (probability of working multiplied by earnings if working minus the value of missed work days) before and after THA. We used the Australian Orthopaedic Association National Joint Replacement Registry to obtain revision rates because it contained osteoarthritis-specific THA revision rates by age and gender, which were unavailable in other registry reports. Other model assumptions were extracted from a previously published cost-effectiveness analysis that included a comprehensive literature review. We incorporated all parameter estimates into Markov models to assess THA effects on quality-adjusted life years and lifetime costs. We conducted threshold and sensitivity analyses on direct costs, indirect costs, and revision
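
    A Markov cohort model of the kind described can be reduced to a few lines once states, transition probabilities, utilities, costs and productivity are specified. The sketch below tracks a cohort through post-THA, revision and death states and accumulates discounted QALYs and net societal cost (direct cost minus earnings); every number and the three-state structure are illustrative assumptions, far simpler than the study's model.

```python
import numpy as np

# Annual transition probabilities and state values (all illustrative).
P_REVISION, P_DEATH = 0.01, 0.02
UTILITY = {"post_tha": 0.85, "revision": 0.70, "dead": 0.0}
DIRECT_COST = {"post_tha": 1_000.0, "revision": 25_000.0, "dead": 0.0}
ANNUAL_EARNINGS = {"post_tha": 30_000.0, "revision": 15_000.0, "dead": 0.0}
DISCOUNT, YEARS = 0.03, 20

states = ["post_tha", "revision", "dead"]
T = np.array([[1 - P_REVISION - P_DEATH, P_REVISION, P_DEATH],   # from post_tha
              [0.0, 1 - P_DEATH, P_DEATH],                       # from revision
              [0.0, 0.0, 1.0]])                                  # dead is absorbing

occupancy = np.array([1.0, 0.0, 0.0])    # whole cohort starts in the post-THA state
qalys = net_societal_cost = 0.0
for year in range(YEARS):
    d = 1.0 / (1.0 + DISCOUNT) ** year
    qalys += d * sum(occupancy[i] * UTILITY[s] for i, s in enumerate(states))
    # Societal cost = direct medical cost minus the productivity (earnings) retained.
    net_societal_cost += d * sum(
        occupancy[i] * (DIRECT_COST[s] - ANNUAL_EARNINGS[s]) for i, s in enumerate(states))
    occupancy = occupancy @ T

print(f"discounted QALYs: {qalys:.2f}, net societal cost: {net_societal_cost:,.0f}")
```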

  5. Measuring the safeguards value of material accountability

    International Nuclear Information System (INIS)

    Sicherman, A.

    1988-01-01

    Material accountability (MA) activities focus on providing after-the-fact indication of diversion or theft of special nuclear material (SNM). MA activities include maintaining records for tracking nuclear material and conducting periodic inventories and audits to ensure that loss has not occurred. This paper presents a value model concept for assessing the safeguards benefits of MA activities and for comparing these benefits to those provided by physical protection (PP) and material control (MC) components. The model considers various benefits of MA, which include: 1) providing information to assist in recovery of missing material, 2) providing assurance that physical protection and material control systems have been working, 3) defeating protracted theft attempts, and 4) properly resolving causes of and responding appropriately to anomalies of missing material and external alarms (e.g., hoax). Such a value model can aid decision-makers in allocating safeguards resources among PP, MC, and MA systems

  6. The proper generalized decomposition for advanced numerical simulations a primer

    CERN Document Server

    Chinesta, Francisco; Leygue, Adrien

    2014-01-01

    Many problems in scientific computing are intractable with classical numerical techniques. These fail, for example, in the solution of high-dimensional models due to the exponential increase of the number of degrees of freedom. Recently, the authors of this book and their collaborators have developed a novel technique, called Proper Generalized Decomposition (PGD), that has proven to be a significant step forward. The PGD builds, by means of a successive enrichment strategy, a numerical approximation of the unknown fields in separated form. Although first introduced and successfully demonstrated in the context of high-dimensional problems, the PGD allows for a completely new approach to addressing more standard problems in science and engineering. Indeed, many challenging problems can be efficiently cast into a multi-dimensional framework, thus opening entirely new solution strategies in the PGD framework. For instance, the material parameters and boundary conditions appearing in a particular mathematical mod...
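
    The successive enrichment idea, building the unknown field as a growing sum of products of one-dimensional functions, each obtained by an alternating fixed-point sweep, can be illustrated without solving a PDE. The sketch below greedily approximates a known two-dimensional field by rank-one separated modes; it is a toy approximation problem under assumed grid and target function, not the PGD solver described in the book.

```python
import numpy as np

# Target field on a tensor grid; in a real PGD this would be the unknown solution.
x = np.linspace(0.0, 1.0, 101)
y = np.linspace(0.0, 1.0, 101)
F = np.exp(-((x[:, None] - 0.3) ** 2 + (y[None, :] - 0.7) ** 2) / 0.05) + np.outer(x, y)

def enrich(residual, n_sweeps=20):
    """One enrichment step: find X(x) * Y(y) minimising the residual norm
    by alternating one-dimensional (fixed-point) updates."""
    X = np.ones_like(x)
    Y = np.ones_like(y)
    for _ in range(n_sweeps):
        X = residual @ Y / (Y @ Y)
        Y = residual.T @ X / (X @ X)
    return X, Y

approx = np.zeros_like(F)
for mode in range(5):                     # successive enrichment with separated modes
    X, Y = enrich(F - approx)
    approx += np.outer(X, Y)
    err = np.linalg.norm(F - approx) / np.linalg.norm(F)
    print(f"mode {mode + 1}: relative error {err:.2e}")
```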

  7. AMERICAN ACCOUNTING

    Directory of Open Access Journals (Sweden)

    Mihaela Onica

    2005-01-01

    Full Text Available The International Accounting Standards already contribute to the generation of better and more easily comparable financial information at an international level, thus supporting a more effective allocation of investment resources in the world. Under these circumstances, a consistent application of the standards at a global level becomes necessary. The financial statements are part of the financial reporting process. A complete set of financial statements usually includes a balance sheet, a profit and loss account, a statement of changes in financial position (which can be presented in various ways, for example as a statement of cash flows or of fund flows), and the notes and other explanatory statements and material that form part of the financial statements.

  8. American Accounting

    OpenAIRE

    Mihaela Cristina Onica

    2005-01-01

    The International Accounting Standards already contribute to the generation of better and more easily comparable financial information at an international level, thus supporting a more effective allocation of investment resources in the world. Under these circumstances, a consistent application of the standards at a global level becomes necessary. The financial statements are part of the financial reporting process. A set of complete financial statements usually includes a bala...

  9. Dynamics and Statics of Interfaces in Proper and Improper Ferroelastics

    Science.gov (United States)

    Sujatha, Narasimhan

    1990-01-01

    We have studied the dynamics and statics of twin boundaries and interfaces in proper and improper ferroelastic materials. We have studied three-dimensional improper ferroelastic transitions in perovskites that undergo cubic-trigonal transitions, while confining ourselves to one-dimensional proper ferroelastic transitions. The stability of elastic twin boundaries is studied within the 1D solitary wave elastic phi^4 and phi^6 models developed by Barsch and Falk, respectively. Static twin boundaries, represented by kink-type solutions, are found to be linearly stable, whereas only moving heterophase inclusions are found to be nonlinearly stable in some cases. The correction due to the phase shift suffered by an ultrasonic wave in passing through a twinned bicrystal in elastic constant measurements is evaluated by performing a numerical calculation based on the linear stability equation. The twin boundaries are found to reduce the travel time, thereby contributing a positive correction to the measured elastic constants. This effect is most pronounced close to the transition temperature. Although the periodic solutions of the local phi^4 model, which represent ferroelectric twin bands in 1D, are frequently referred to in the literature as unstable, we are not aware of any proof in the literature and are, therefore, also presenting a proof of the linear instability of these solutions. A Landau-Ginzburg theory for cubic-trigonal improper transitions in ABX_3 perovskites has been developed along similar lines as that for cubic-tetragonal transitions, including the rotation gradient terms. The antiphase and twin boundaries, along with the strain distribution and the stresses required to sustain them, are determined. Specific application to LaAlO_3 is carried out and the results are seen to be in marked contrast with those observed for SrTiO_3 by Cao and Barsch.
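
    The kink-type twin-boundary solutions mentioned above have, in the standard elastic phi^4 model, a hyperbolic-tangent strain profile interpolating between the two variant strains across a wall of finite width. The sketch below evaluates that textbook profile; the amplitude and width parameters are illustrative, and the expression is quoted as the generic phi^4 kink rather than a result specific to this work.

```python
import numpy as np

def kink_strain(x, eta0=1.0, xi=1.0):
    """Static twin-boundary (kink) solution of the elastic phi^4 model: the
    order-parameter strain goes from -eta0 to +eta0 over a wall of width ~ xi."""
    return eta0 * np.tanh(x / (np.sqrt(2.0) * xi))

x = np.linspace(-10.0, 10.0, 9)
print(np.round(kink_strain(x), 3))   # approaches -1 and +1 away from the wall
```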

  10. 18 CFR 367.4250 - Account 425, Miscellaneous amortization.

    Science.gov (United States)

    2010-04-01

    ... GAS ACT Income Statement Chart of Accounts Service Company Operating Income § 367.4250 Account 425... which are properly deductible in determining the income of the service company before interest charges...

  11. Dorsoventral and Proximodistal Hippocampal Processing Account for the Influences of Sleep and Context on Memory (Re)consolidation: A Connectionist Model

    Directory of Open Access Journals (Sweden)

    Justin Lines

    2017-01-01

    Full Text Available The context in which learning occurs is sufficient to reconsolidate stored memories and neuronal reactivation may be crucial to memory consolidation during sleep. The mechanisms of context-dependent and sleep-dependent memory (re)consolidation are unknown but involve the hippocampus. We simulated memory (re)consolidation using a connectionist model of the hippocampus that explicitly accounted for its dorsoventral organization and for CA1 proximodistal processing. Replicating human and rodent (re)consolidation studies yielded the following results. (1) Semantic overlap between memory items and extraneous learning was necessary to explain experimental data and depended crucially on the recurrent networks of dorsal but not ventral CA3. (2) Stimulus-free, sleep-induced internal reactivations of memory patterns produced heterogeneous recruitment of memory items and protected memories from subsequent interference. These simulations further suggested that the decrease in memory resilience when subjects were not allowed to sleep following learning was primarily due to extraneous learning. (3) Partial exposure to the learning context during simulated sleep (i.e., targeted memory reactivation) uniformly increased memory item reactivation and enhanced subsequent recall. Altogether, these results show that the dorsoventral and proximodistal organization of the hippocampus may be important components of the neural mechanisms for context-based and sleep-based memory (re)consolidation.

  12. Accent modulates access to word meaning: Evidence for a speaker-model account of spoken word recognition.

    Science.gov (United States)

    Cai, Zhenguang G; Gilbert, Rebecca A; Davis, Matthew H; Gaskell, M Gareth; Farrar, Lauren; Adler, Sarah; Rodd, Jennifer M

    2017-11-01

    Speech carries accent information relevant to determining the speaker's linguistic and social background. A series of web-based experiments demonstrate that accent cues can modulate access to word meaning. In Experiments 1-3, British participants were more likely to retrieve the American dominant meaning (e.g., hat meaning of "bonnet") in a word association task if they heard the words in an American than a British accent. In addition, results from a speeded semantic decision task (Experiment 4) and sentence comprehension task (Experiment 5) confirm that accent modulates on-line meaning retrieval such that comprehension of ambiguous words is easier when the relevant word meaning is dominant in the speaker's dialect. Critically, neutral-accent speech items, created by morphing British- and American-accented recordings, were interpreted in a similar way to accented words when embedded in a context of accented words (Experiment 2). This finding indicates that listeners do not use accent to guide meaning retrieval on a word-by-word basis; instead they use accent information to determine the dialectic identity of a speaker and then use their experience of that dialect to guide meaning access for all words spoken by that person. These results motivate a speaker-model account of spoken word recognition in which comprehenders determine key characteristics of their interlocutor and use this knowledge to guide word meaning access. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Self-consistent modeling of induced magnetic field in Titan's atmosphere accounting for the generation of Schumann resonance

    Science.gov (United States)

    Béghin, Christian

    2015-02-01

    This model is worked out in the frame of physical mechanisms proposed in previous studies accounting for the generation and the observation of an atypical Schumann Resonance (SR) during the descent of the Huygens Probe in Titan's atmosphere on 14 January 2005. While Titan stays inside the subsonic co-rotating magnetosphere of Saturn, a secondary magnetic field carrying an Extremely Low Frequency (ELF) modulation is shown to be generated through ion-acoustic instabilities of the Pedersen current sheets induced at the interface region between the impacting magnetospheric plasma and Titan's ionosphere. The stronger induced magnetic field components are focused within field-aligned arc-like structures hanging down from the current sheets, with a minimum amplitude of about 0.3 nT throughout the ramside hemisphere from the ionopause down to the Moon's surface, including the icy crust and its interface with a conductive water ocean. The deep penetration of the modulated magnetic field into the atmosphere is thought to be allowed thanks to the force balance between the average temporal variations of thermal and magnetic pressures within the field-aligned arcs. However, there is a first cause of diffusion of the ELF magnetic components, probably due to the feeding of one, or possibly several, SR eigenmodes. A second leakage source is ascribed to a system of eddy (Foucault) currents assumed to be induced through the buried water ocean. The amplitude spectrum distribution of the induced ELF magnetic field components inside the SR cavity is found to be fully consistent with the measurements of the Huygens wave-field strength. Pending the expected future in-situ exploration of Titan's lower atmosphere and surface, the Huygens data are the only experimental means available to date for constraining the proposed model.

  14. Developing a Global Model of Accounting Education and Examining IES Compliance in Australia, Japan, and Sri Lanka

    Science.gov (United States)

    Watty, Kim; Sugahara, Satoshi; Abayadeera, Nadana; Perera, Luckmika

    2013-01-01

    The introduction of International Education Standards (IES) signals a clear move by the International Accounting Education Standards Board (IAESB) to ensure high quality standards in professional accounting education at a global level. This study investigated how IES are perceived and valued by member bodies and academics in three countries:…

  15. Stochastic inverse modelling of hydraulic conductivity fields taking into account independent stochastic structures: A 3D case study

    Science.gov (United States)

    Llopis-Albert, C.; Capilla, J. E.

    2010-09-01

    Major factors affecting groundwater flow through fractured rocks include the geometry of each fracture, its properties and the fracture-network connectivity, together with the porosity and conductivity of the rock matrix. When modelling fractured rocks this translates into characterizing the hydraulic conductivity (K) as adequately as possible, despite its high heterogeneity. This links with the main goal of this paper, which is to present an improvement of a stochastic inverse model, named the Gradual Conditioning (GC) method, to better characterise K in a fractured rock medium by considering different K stochastic structures, belonging to independent K statistical populations (SP) of fracture families and the rock matrix, each one with its own statistical properties. The new methodology is carried out by applying independent deformations to each SP during the conditioning process for constraining stochastic simulations to data. This allows the statistical properties of each SP to tend to be preserved during the iterative optimization process. It is worth mentioning that, so far, no other stochastic inverse modelling technique with the full capabilities implemented in the GC method is able to work with a domain covered by several different stochastic structures while taking into account the independence of the different populations. The GC method is based on a procedure that gradually changes an initial K field, which is conditioned only to K data, to approximate the reproduction of other types of information, i.e., piezometric head and solute concentration data. The approach is applied to the Äspö Hard Rock Laboratory (HRL) in Sweden, where, since the mid-1990s, many experiments have been carried out to increase confidence in alternative radionuclide transport modelling approaches. Because the description of fracture locations and the distribution of hydrodynamic parameters within them are not accurate enough, we address the

  16. Emerging accounting trends accounting for leases.

    Science.gov (United States)

    Valletta, Robert; Huggins, Brian

    2010-12-01

    A new model for lease accounting can have a significant impact on hospitals and healthcare organizations. The new approach proposes a "right-of-use" model that involves complex estimates and significant administrative burden. Hospitals and health systems that draw heavily on lease arrangements should start preparing for the new approach now even though guidance and a final rule are not expected until mid-2011. This article highlights a number of considerations from the lessee point of view.

  17. RCM: a new model accounting for the non-linear chloride binding isotherm and the non-equilibrium conditions between the free- and bound-chloride concentrations

    NARCIS (Netherlands)

    Spiesz, Przemek; Ballari, M.M.; Brouwers, Jos

    2012-01-01

    In this paper a new theoretical model for the Rapid Chloride Migration test is presented. This model accounts for the non-linear chloride binding isotherm and the non-equilibrium conditions between the free- and bound-chloride concentrations in concrete. The new system of equations is solved

  18. Molecular weight/branching distribution modeling of low-density-polyethylene accounting for topological scission and combination termination in continuous stirred tank reactor

    NARCIS (Netherlands)

    Yaghini, N.; Iedema, P.D.

    2014-01-01

    We present a comprehensive model to predict the molecular weight distribution (MWD) and branching distribution of low-density polyethylene (LDPE) for a free radical polymerization system in a continuous stirred tank reactor (CSTR). The model accounts for branching, by branching moment or

  19. Response function theories that account for size distribution effects - A review. [mathematical models concerning composite propellant heterogeneity effects on combustion instability

    Science.gov (United States)

    Cohen, N. S.

    1980-01-01

    The paper presents theoretical models developed to account for the heterogeneity of composite propellants in expressing the pressure-coupled combustion response function. It is noted that the model of Lengelle and Williams (1968) furnishes a viable basis to explain the effects of heterogeneity.

  20. The material control and accounting system model development in the Radiochemical plant of Siberian Chemical Combine (SChC)

    International Nuclear Information System (INIS)

    Kozyrev, A.S.; Purygin, V.Ya.; Skuratov, V.A.; Lapotkov, A.A.

    1999-01-01

    The nuclear material (NM) control and accounting computerized system is designed to automatically account for NM reception, movement and storage at the Radiochemical Plant. The objective of this system development is to provide constant surveillance over process material movement, to improve accountability and administrative work, to upgrade the plant's protection against possible NM theft and diversion, to rule out casual operator errors, and to improve the timeliness and significance (reliability) of information about nuclear materials. The NM control and accounting system at the Radiochemical Plant should be based on a computerized network. It must keep track of all material movements in each Material Balance Area: material receipt from other plants; local material movement within the plant; material shipment to other plants; and generation of required documents about NM movements and their accounting. [ru]

  1. Infrastrukturel Accountability

    DEFF Research Database (Denmark)

    Ubbesen, Morten Bonde

    How does one credibly account for something as diffuse as an entire nation's greenhouse gas emissions? This dissertation investigates that question in an ethnographic study of how Denmark's greenhouse gas inventory is prepared, reported and audited. The study draws on concepts and understandings from 'Science & Technology Studies', and contributes the notion of 'infrastructural accountability' as a new way of understanding and thinking about the work by which highly specialised practices document and account for the quality of their work.

  2. PROPER: Performance visualization for optimizing and comparing ranking classifiers in MATLAB.

    Science.gov (United States)

    Jahandideh, Samad; Sharifi, Fatemeh; Jaroszewski, Lukasz; Godzik, Adam

    2015-01-01

    One of the recent challenges of computational biology is development of new algorithms, tools and software to facilitate predictive modeling of big data generated by high-throughput technologies in biomedical research. To meet these demands we developed PROPER - a package for visual evaluation of ranking classifiers for biological big data mining studies in the MATLAB environment. PROPER is an efficient tool for optimization and comparison of ranking classifiers, providing over 20 different two- and three-dimensional performance curves.

  3. Modeling uranium(VI) adsorption onto montmorillonite under varying carbonate concentrations: A surface complexation model accounting for the spillover effect on surface potential

    Science.gov (United States)

    Tournassat, C.; Tinnacher, R. M.; Grangeon, S.; Davis, J. A.

    2018-01-01

    The prediction of U(VI) adsorption onto montmorillonite clay is confounded by the complexities of: (1) the montmorillonite structure in terms of adsorption sites on basal and edge surfaces, and the complex interactions between the electrical double layers at these surfaces, and (2) U(VI) solution speciation, which can include cationic, anionic and neutral species. Previous U(VI)-montmorillonite adsorption and modeling studies have typically expanded classical surface complexation modeling approaches, initially developed for simple oxides, to include both cation exchange and surface complexation reactions. However, previous models have not taken into account the unique characteristics of electrostatic surface potentials that occur at montmorillonite edge sites, where the electrostatic surface potential of basal plane cation exchange sites influences the surface potential of neighboring edge sites ('spillover' effect). A series of U(VI) - Na-montmorillonite batch adsorption experiments was conducted as a function of pH, with variable U(VI), Ca, and dissolved carbonate concentrations. Based on the experimental data, a new type of surface complexation model (SCM) was developed for montmorillonite, that specifically accounts for the spillover effect using the edge surface speciation model by Tournassat et al. (2016a). The SCM allows for a prediction of U(VI) adsorption under varying chemical conditions with a minimum number of fitting parameters, not only for our own experimental results, but also for a number of published data sets. The model agreed well with many of these datasets without introducing a second site type or including the formation of ternary U(VI)-carbonato surface complexes. The model predictions were greatly impacted by utilizing analytical measurements of dissolved inorganic carbon (DIC) concentrations in individual sample solutions rather than assuming solution equilibration with a specific partial pressure of CO2, even when the gas phase was

  4. Accounting for multimorbidity in pay for performance: a modelling study using UK Quality and Outcomes Framework data.

    Science.gov (United States)

    Ruscitto, Andrea; Mercer, Stewart W; Morales, Daniel; Guthrie, Bruce

    2016-08-01

    The UK Quality and Outcomes Framework (QOF) offers financial incentives to deliver high-quality care for individual diseases, but the single-disease focus takes no account of multimorbidity. To examine variation in QOF payments for two indicators incentivised in ≥1 disease domain. Modelling study using cross-sectional data from 314 general practices in Scotland. Maximum payments that practices could receive under existing financial incentives were calculated for blood pressure (BP) control and influenza immunisation according to the number of coexisting clinical conditions. Payments were recalculated assuming a single new indicator. Payment varied by condition (£4.71-£11.08 for one BP control and £2.09-£5.78 for one influenza immunisation). Practices earned more for delivering the same action in patients with multimorbidity: in patients with 2, 3, and ≥4 conditions mean payments were £13.95, £21.92, and £29.72 for BP control, and £7.48, £11.21, and £15.14 for influenza immunisation, respectively. Practices in deprived areas had more multiple incentivised patients. When recalculated so that each incentivised action was only paid for once, all practices received less for BP control: affluent practices received more and deprived practices received less for influenza immunisation. For patients with single conditions, existing QOF payment methods have more than twofold variation in payment for delivering the same process. Multiple payments were common in patients with multimorbidity. A payment method is required that ensures fairness of rewards while maintaining adequate funding for practices based on actual workload. © British Journal of General Practice 2016.
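
    As a hedged illustration of the payment arithmetic described above (the per-condition rates below are hypothetical, not the QOF tariffs), the contrast between paying for the same action in every disease domain and paying for it once can be sketched as:

        # Illustrative sketch only: hypothetical per-condition rates, not QOF tariffs.
        per_condition_rate = {"CHD": 6.50, "Diabetes": 5.20, "Stroke": 4.80, "CKD": 4.10}

        def payment_existing(conditions):
            # Existing QOF-style logic: one BP reading earns a payment in every
            # incentivised disease domain the patient belongs to.
            return sum(per_condition_rate[c] for c in conditions)

        def payment_single_indicator(conditions, flat_rate=6.00):
            # Recalculated logic: the incentivised action is paid for only once,
            # regardless of how many coexisting conditions the patient has.
            return flat_rate if conditions else 0.0

        patient = ["CHD", "Diabetes", "CKD"]        # a patient with three conditions
        print(payment_existing(patient))            # 15.80, rises with multimorbidity
        print(payment_single_indicator(patient))    # 6.00, paid once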

  5. Accounting for the Uncertainty Related to Building Occupants with Regards to Visual Comfort: A Literature Survey on Drivers and Models

    Directory of Open Access Journals (Sweden)

    Valentina Fabi

    2016-02-01

    The interactions between building occupants and control systems have a high influence on energy consumption and on indoor environmental quality. In the perspective of a future of “nearly-zero” energy buildings, it is crucial to analyse the energy-related interactions deeply to predict realistic energy use during the design stage. Since the reaction to thermal, acoustic, or visual stimuli is not the same for every human being, monitoring the behaviour inside buildings is an essential step to ascertain differences in energy consumption related to different interactions. Reliable information concerning occupants’ behaviours in a building could contribute to a better evaluation of building energy performances and design robustness, as well as supporting the development of occupants’ education to energy awareness. The present literature survey enlarges our understanding of which environmental conditions influence occupants’ manual control of systems in offices and, by consequence, the energy consumption. The purpose of this study was to investigate the possible drivers for light-switching to model occupant behaviour in office buildings. The probability of switching lighting systems on or off was related to the occupancy and differentiated for arrival, intermediate, and departure periods. The switching probability has been reported to be higher during the entering or the leaving time in relation to contextual variables. In the analysis of switch-on actions, users were often clustered between those who take daylight level into account and switch on lights only if necessary and people who totally disregard the natural lighting. This underlines the importance of how individuality is at the base of the definition of the different types of users.
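
    As a rough sketch of the kind of switch-on model this survey refers to, a logistic ("Hunt-type") arrival switch-on probability as a function of daylight illuminance can be written as below; the functional form is a common choice in this literature, and the coefficients are hypothetical placeholders rather than values from any of the surveyed studies.

        import math

        def p_switch_on_at_arrival(work_plane_lux, a=-3.0, b=2.5):
            # Logistic switch-on probability at arrival as a function of the minimum
            # daylight illuminance on the work plane; a and b are hypothetical.
            x = a + b * math.log10(max(work_plane_lux, 1e-6))
            return 1.0 / (1.0 + math.exp(x))

        for lux in (10, 100, 500):   # darker desks -> higher switch-on probability
            print(lux, round(p_switch_on_at_arrival(lux), 2))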

  6. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models

    NARCIS (Netherlands)

    Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik

    2017-01-01

    Background Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive

  7. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models

    NARCIS (Netherlands)

    Degeling, Koen; Ijzerman, Maarten J.; Koopman, Miriam; Koffijberg, Hendrik

    2017-01-01

    Background: Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive

  8. Design Accountability

    DEFF Research Database (Denmark)

    Koskinen, Ilpo; Krogh, Peter

    2015-01-01

    When design research builds on design practice, it may contribute to both theory and practice of design in ways richer than research that treats design as a topic. Such research, however, faces several tensions that it has to negotiate successfully in order not to lose its character as research. This paper looks at constructive design research, which takes the entanglement of theory and practice as its hallmark, and uses it as a test case in exploring how design researchers can work with theory, methodology, and practice without losing their identity as design researchers. The crux of practice-based design research is that where classical research is interested in singling out a particular aspect and exploring it in depth, design practice is characterized by balancing numerous concerns in a heterogeneous and occasionally paradoxical product. It is on this basis that the notion of design accountability…

  9. Nasogastric and feeding tubes. The importance of proper placement.

    Science.gov (United States)

    Gharib, A M; Stern, E J; Sherbin, V L; Rohrmann, C A

    1996-05-01

    The authors' experience in a radiology department suggested to them that there is a wide range of beliefs among practitioners regarding proper placement of nasogastric and feeding tubes. Improper positioning can cause serious problems, as they explain. Indications for different tube positions, complications of incorrect tube placement, and directions for proper positioning are discussed and illustrated.

  10. Chains as proper enrichment for intensively-farmed pigs?

    NARCIS (Netherlands)

    Bracke, M.B.M.

    2017-01-01

    This chapter primarily compiles work in which the author (Marc Bracke) has been involved with providing science-based decision support on the question of what is proper enrichment material for intensively-farmed pigs as required by EC Directive 2001/93/EC. Proper manipulable material should

  11. 29 CFR 1404.20 - Proper use of expedited arbitration.

    Science.gov (United States)

    2010-07-01

    29 CFR 1404.20 (Arbitration Services, Expedited Arbitration): Proper use of expedited arbitration. (a) FMCS reserves the right to cease honoring requests for Expedited Arbitration if a pattern of misuse of this becomes...

  12. The proper name as starting point for basic reading skills

    NARCIS (Netherlands)

    Both-De Vries, Anna C.; Bus, Adriana G

    Does alphabetic-phonetic writing start with the proper name and how does the name affect reading and writing skills? Sixty 4- to 5½-year-old children from middle SES families with Dutch as their first language wrote their proper name and named letters. For each child we created unique sets of

  13. Comparison of Methods of Teaching Children Proper Lifting ...

    African Journals Online (AJOL)

    Objective: This study was designed to determine the effects of three teaching methods on children's ability to demonstrate and recall their mastery of proper lifting techniques. Method: Ninety-three primary five and six public school children who had no knowledge of proper lifting technique were assigned into three equal ...

  14. 32 CFR 536.27 - Identification of a proper claimant.

    Science.gov (United States)

    2010-07-01

    ... contract held valid by state law. (g) Interdepartmental waiver rule. Neither the U.S. government nor any of... is not a proper claimant for loss or damage to its property. A unit of local government other than a state, commonwealth, or territory is a proper claimant. Note to § 536.27: See the parallel discussion at...

  15. The Southern Proper Motion Program. IV. The SPM4 Catalog

    Science.gov (United States)

    Girard, Terrence M.; van Altena, William F.; Zacharias, Norbert; Vieira, Katherine; Casetti-Dinescu, Dana I.; Castillo, Danilo; Herrera, David; Lee, Young Sun; Beers, Timothy C.; Monet, David G.; López, Carlos E.

    2011-07-01

    We present the fourth installment of the Yale/San Juan Southern Proper Motion Catalog, SPM4. The SPM4 contains absolute proper motions, celestial coordinates, and B, V photometry for over 103 million stars and galaxies between the south celestial pole and -20° declination. The catalog is roughly complete to V = 17.5 and is based on photographic and CCD observations taken with the Yale Southern Observatory's double astrograph at Cesco Observatory in El Leoncito, Argentina. The proper-motion precision, for well-measured stars, is estimated to be 2-3 mas yr-1, depending on the type of second-epoch material. At the bright end, proper motions are on the International Celestial Reference System by way of Hipparcos Catalog stars, while the faint end is anchored to the inertial system using external galaxies. Systematic uncertainties in the absolute proper motions are on the order of 1 mas yr-1.

  16. Democratic Model of Public Policy Accountability. Case Study on Implementation of Street Vendors Empowerment Policy in Makassar City

    Directory of Open Access Journals (Sweden)

    Rulinawaty Kasmadsi

    2015-08-01

    Policy accountability is a form of manifestation of public officials’ responsibility to the people. One form of policy accountability discussed here is street vendors policy accountability, because street vendors are a group of citizens who carry out economic activities in public spaces. Despite the existence of this policy, however, the number of street vendors in Makassar City has increased from year to year. Therefore, this study seeks to uncover and explain democratic policy accountability through the street vendors’ responses and expectations to the implementation of the street vendors empowerment policy in Makassar City, and through the stakeholders’ responses and expectations to the implementation of that policy. To achieve these objectives, the study uses democracy theory, which focuses on togetherness in discussing solutions to the various problems of street vendors and in the policy implementation as well. The study used a qualitative design and a case study strategy. Data collection techniques were observation, interview, and documentation. Data were analyzed through case descriptions of their settings. The results of this study point out that the interests and needs of the street vendors are not met through the vendors empowerment policies. This is caused by the absence of an accountability forum as a place of togetherness for all street vendors empowerment stakeholders. The street vendors empowerment policy in Makassar City is designed on a top-down approach, so vendors are considered as objects which must accept all government programs aimed at them.

  17. Determination of a cohesive law for delamination modelling - Accounting for variation in crack opening and stress state across the test specimen width

    DEFF Research Database (Denmark)

    Joki, R. K.; Grytten, F.; Hayman, Brian

    2016-01-01

    The cohesive law for Mode I delamination in glass fibre Non-Crimped Fabric reinforced vinylester is determined for use in finite element models. The cohesive law is derived from a delamination test based on DCB specimens loaded with pure bending moments, taking into account the presence of large… 1) The bridging law is obtained by differentiating the fracture resistance with respect to opening displacement at the initial location of the crack tip, measured at the specimen edge. 2) The bridging law is extended to a cohesive law by accounting for crack tip fracture energy. 3) The cohesive law is fine-tuned through an iterative modelling approach so that the changing state of stress and deformation across the width of the test specimen is taken into account. The changing state of stress and deformation across the specimen width is shown to be significant for small openings (small fracture process zone size). This will also be important for the initial part…
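
    As a sketch of steps 1) and 2), under the usual J-integral treatment of DCB specimens with large-scale bridging (notation assumed here, not taken from the paper):

        \[
        J_R(\delta^{*}) = J_{\mathrm{tip}} + \int_{0}^{\delta^{*}} \sigma(\delta)\,\mathrm{d}\delta
        \quad\Longrightarrow\quad
        \sigma(\delta^{*}) = \frac{\mathrm{d}J_R}{\mathrm{d}\delta^{*}},
        \]

    where J_R is the measured fracture resistance, \delta^{*} the opening at the initial crack tip position and \sigma(\delta) the bridging law obtained by differentiation; adding the crack-tip fracture energy J_tip to the bridging law yields the cohesive law that is then fine-tuned iteratively in the finite element model.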

  18. The Perception of the Accounting Students on the Image of the Accountant and the Accounting Profession

    Directory of Open Access Journals (Sweden)

    Lucian Cernuşca

    2015-01-01

    This study aims to present the perception of accounting students on the image of the accountant and the accounting profession, thus contributing to a better understanding of the option for the field of accounting and the motivations for choosing this profession. The paper consists of the following parts: introduction, literature review, research methodology, research findings, conclusions and bibliography. The accounting profession must adapt to the current conditions of the Romanian accounting system, which is harmonizing with IFRS and European regulations, as well as to the development of information technologies and the transition to the digital era. The role of the accountant is changing from a simple digit operator to a modern one: the accountant will be part of the managerial team, providing strategic and financial advice and effective solutions for the proper functioning of the organization, with the modern stereotype involving creativity in accounting activities. The research aims at understanding the role of the accounting profession as a social identity and as a social phenomenon, and the implications for academia and professional bodies.

  19. Modelling representative and coherent Danish farm types based on farm accountancy data for use in enviromental assessments

    DEFF Research Database (Denmark)

    Dalgaard, Randi; Halberg, Niels; Kristensen, Ib S.

    2006-01-01

    … is established in order to report Danish agro-economical data to the ‘Farm Accountancy Data Network’ (FADN) and to produce ‘The annual Danish account statistics for agriculture’. The farm accounts are selected and weighted to be representative of the Danish agricultural sector, and similar samples of farm…, homegrown feed, manure production, fertilizer use and crop production. The set of farm types was scaled up to national level, thus representing the whole Danish agricultural sector, and the resulting production, resource use and land use were checked against the national statistics. Nutrient balance… The methane emission was higher from dairy farm types compared with all other farm types. In general, the conventional dairy farms emitted more nitrate, ammonia, and nitrous oxide compared with organic dairy farms.

  20. Determination of a cohesive law for delamination modelling - Accounting for variation in crack opening and stress state across the test specimen width

    DEFF Research Database (Denmark)

    Joki, R. K.; Grytten, F.; Hayman, Brian

    2016-01-01

    The cohesive law for Mode I delamination in glass fibre Non-Crimped Fabric reinforced vinylester is determined for use in finite element models. The cohesive law is derived from a delamination test based on DCB specimens loaded with pure bending moments taking into account the presence of large...

  1. Can the Five Factor Model of Personality Account for the Variability of Autism Symptom Expression? Multivariate Approaches to Behavioral Phenotyping in Adult Autism Spectrum Disorder

    Science.gov (United States)

    Schwartzman, Benjamin C.; Wood, Jeffrey J.; Kapp, Steven K.

    2016-01-01

    The present study aimed to: determine the extent to which the five factor model of personality (FFM) accounts for variability in autism spectrum disorder (ASD) symptomatology in adults, examine differences in average FFM personality traits of adults with and without ASD and identify distinct behavioral phenotypes within ASD. Adults (N = 828;…

  2. Accounting concept of inventories in postindustrial economy

    Directory of Open Access Journals (Sweden)

    Pravdyuk N.L.

    2017-06-01

    The accounting of inventories has undergone significant changes over a relatively short period of time. The scientific picture of their definition and classification, measurement, write-offs and reflection in the financial statements has changed. However, these changes have happened without proper interpretation and systematic analysis. And although, at least in general terms, inventory accounting in Ukraine is conducted according to IFRS, this causes some obstacles to the objective reflection of the working capital of enterprises and the transparency of disclosure, and is not conducive to the formation of a proper investment climate. It is established that the information support of inventory control must meet the requirements of the postindustrial economy, given the growing complexity of accounting, the introduction of new forms and their synthesis with the current ones, and a gradual reorganization to meet the needs of consumers and enterprise evaluation. The results of the study substantiate the fundamentals of accounting concepts in the postindustrial economy for the part of circulating capital that forms inventories. The information support of inventory management should be implemented in a hierarchical way: it first analyzes working capital, and then deals with inventories and stocks as its subordinate components. The author considers material goods to be a broader concept than reserves, because they have a dual nature, estimated both as a share of negotiable assets and as the physical component of material costs. The paper gives a definition of this category symbiosis, which is based on P(CBU 9. The general structure of current inventories is of significant importance, with differences across industries, the dominant ones being agriculture, industry, construction, trade and material production. The postindustrial economy has raised the question of differentiating the concepts of "production" and "material

  3. Thin filaments at the Galactic Center: identification and proper motions

    Energy Technology Data Exchange (ETDEWEB)

    Muzic, K [I. Physikalishes Institut, Universitaet zu Koeln, Zuepicher Str. 77, 50937 Cologne (Germany); Eckart, A [I. Physikalishes Institut, Universitaet zu Koeln, Zuepicher Str. 77, 50937 Cologne (Germany); Schoedel, R [I. Physikalishes Institut, Universitaet zu Koeln, Zuepicher Str. 77, 50937 Cologne (Germany); Meyer, L [I. Physikalishes Institut, Universitaet zu Koeln, Zuepicher Str. 77, 50937 Cologne (Germany); Zensus, A [Max-Planck-Institut fuer Radioastronomie, Auf dem Huegel 69, 53121 Bonn (Germany)

    2006-12-15

    L'-band (3.8 μm) images of the Galactic Center show a large number of thin filaments in the mini-spiral, located close to the mini-cavity, along the inner edge of the northern arm and in the vicinity of some stars. We interpret them as shock fronts formed by the interaction of a central wind with the mini-spiral or, in some cases, extended dusty stellar envelopes. The observations have been carried out using the NACO adaptive optics system at the ESO VLT, in 5 subsequent epochs from 2002 to 2006. We present a proper motion study of the thin filaments observed in the central parsec around Sgr A*, obtained using the cross-correlation technique. Our interpretation is consistent with a collimated outflow model from the central few arcseconds. Two possible mechanisms could produce the postulated outflow: stellar winds originating from the high-mass-losing He-star cluster as well as a wind from Sgr A* due to accretion from the surrounding disk of stars.

  4. The mandate for a proper preservation in histopathological tissues.

    Science.gov (United States)

    Comănescu, Maria; Arsene, D; Ardeleanu, Carmen; Bussolati, G

    2012-01-01

    A sequence of technically reproducible procedures is mandatory to guarantee a proper preservation of tissues and to build up the basis for sound diagnoses. However, while the goal of these procedures was, until recently, to assure only structural (histological and cytological) preservation, an appropriate preservation of antigenic properties and of nucleic acid integrity is now additionally requested, in order to permit pathologists to provide the biological information necessary for the adoption of personalized therapies. The present review analyses the sequence of technical steps open to critical variations. Passages such as dehydration, paraffin embedding, sectioning and staining are relatively well standardized and allow the adoption of dedicated (automatic) apparatuses, while other pre-analytical steps, i.e. the time and modalities of transfer of surgical specimens from the surgical theatre to the pathology laboratory (the so-called "ischemia time") and the type and length of fixation, are not standardized and are a potential cause of discrepancies in diagnostic results. Our group is involved in European-funded projects tackling these problems with the concrete objective of implementing a model of effective tumor investigation by high-performance genetic and molecular methodologies. A project addressing the problem of the discrepant quality level of histopathological and cytological preparations involved five European countries and exploited the potential of "virtual slide technology". Concrete issues, techniques and pitfalls, as well as proposed guidelines for processing the tissues, are presented.

  5. Accounting for capacity and flow of ecosystem services: A conceptual model and a case study for Telemark, Norway

    NARCIS (Netherlands)

    Schroter, M.; Barton, D.N.; Remme, R.P.; Hein, L.G.

    2014-01-01

    Understanding the flow of ecosystem services and the capacity of ecosystems to generate these services is an essential element for understanding the sustainability of ecosystem use as well as developing ecosystem accounts. We conduct spatially explicit analyses of nine ecosystem services in Telemark

  6. A properly adjusted forage harvester can save time and money

    Science.gov (United States)

    A properly adjusted forage harvester can save fuel and increase the realizable milk per ton of your silage. This article details the adjustments necessary to minimize energy while maximizing productivity and forage quality....

  7. Whole of Government Accounts

    DEFF Research Database (Denmark)

    Pontoppidan, Caroline Aggestam; Chow, Danny; Day, Ronald

    In our comparative study, we surveyed an emerging literature on the use of consolidation in government accounting and develop a research agenda. We find heterogeneous approaches to the development of consolidation models across the five countries (Australia, New Zealand, UK, Canada and Sweden)… of financial reporting (GAAP)-based reforms when compared with budget-centric systems of accounting, which dominate government decision-making. At a trans-national level, there is a need to examine the embedded or implicit contests or ‘trials of strength’ between nations and/or institutions jockeying for influence. We highlight three arenas where such contests are being played out: 1. Statistical versus GAAP notions of accounting value, which features in all accounting debates over the merits and costs of ex-ante versus ex-post notions of value (i.e., the relevance versus reliability debate); 2. Private…

  8. Fast algorithms for finding proper strategies in game trees

    DEFF Research Database (Denmark)

    Miltersen, Peter Bro; Sørensen, Troels Bjerre

    2008-01-01

    We show how to find a normal form proper equilibrium in behavior strategies of a given two-player zero-sum extensive form game with imperfect information but perfect recall. Our algorithm solves a finite sequence of linear programs and runs in polynomial time. For the case of a perfect information game, we show how to find a normal form proper equilibrium in linear time by a simple backwards induction procedure.

  9. The PMA Catalogue: 420 million positions and absolute proper motions

    Science.gov (United States)

    Akhmetov, V. S.; Fedorov, P. N.; Velichko, A. B.; Shulga, V. M.

    2017-07-01

    We present a catalogue that contains about 420 million absolute proper motions of stars. It was derived from the combination of positions from Gaia DR1 and 2MASS, with a mean difference of epochs of about 15 yr. Most of the systematic zonal errors inherent in the 2MASS Catalogue were eliminated before deriving the absolute proper motions. The absolute calibration procedure (zero-pointing of the proper motions) was carried out using about 1.6 million positions of extragalactic sources. The mean formal error of the absolute calibration is less than 0.35 mas yr-1. The derived proper motions cover the whole celestial sphere without gaps for a range of stellar magnitudes from 8 to 21 mag. In the sky areas where the extragalactic sources are invisible (the avoidance zone), a dedicated procedure was used that transforms the relative proper motions into absolute ones. The rms error of proper motions depends on stellar magnitude and ranges from 2-5 mas yr-1 for stars with 10 mag < G < 17 mag to 5-10 mas yr-1 for faint ones. The present catalogue contains the Gaia DR1 positions of stars for the J2015 epoch. The system of the PMA proper motions does not depend on the systematic errors of the 2MASS positions, and in the range from 14 to 21 mag represents an independent realization of a quasi-inertial reference frame in the optical and near-infrared wavelength range. The Catalogue also contains stellar magnitudes taken from the Gaia DR1 and 2MASS catalogues. A comparison of the PMA proper motions of stars with similar data from certain recent catalogues has been undertaken.
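
    The basic two-epoch computation behind such a catalogue, including an absolute zero point fixed on extragalactic sources, can be sketched as follows; this is a simplified illustration of the procedure described above, with hypothetical input arrays and without the zonal-error corrections.

        import numpy as np

        def absolute_proper_motions(ra1, dec1, ra2, dec2, dt_years, is_extragalactic):
            """Two-epoch proper motions in mas/yr with a zero point tied to
            extragalactic sources; a simplified sketch, not the PMA pipeline."""
            dec_mid = np.deg2rad(0.5 * (dec1 + dec2))
            mu_ra = (ra2 - ra1) * 3.6e6 * np.cos(dec_mid) / dt_years   # deg -> mas
            mu_dec = (dec2 - dec1) * 3.6e6 / dt_years
            # Galaxies and quasars should show zero proper motion, so subtracting
            # their mean puts the catalogue on an absolute, quasi-inertial system.
            mu_ra -= np.mean(mu_ra[is_extragalactic])
            mu_dec -= np.mean(mu_dec[is_extragalactic])
            return mu_ra, mu_dec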

  10. An Entropy Testing Model Research on the Quality of Internal Control and Accounting Conservatism: Empirical Evidence from the Financial Companies of China from 2007 to 2011

    Directory of Open Access Journals (Sweden)

    Zongrun Wang

    2014-01-01

    We set information disclosure of internal control as a starting point to explore the relationship between the quality of internal control and accounting conservatism, and then adopt the entropy testing model to calculate an index of internal control quality from the sample data of Chinese listed companies in the financial industry from 2007 to 2011. Regression results show that earnings conservatism exists. The stronger the internal control is, the higher the accounting conservatism can be. Companies which have enhanced their internal control are more conservative, and these results make no difference with other industries.
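
    For readers unfamiliar with entropy-based indices, the standard entropy-weight construction is sketched below; it is a generic technique that may differ from the authors' exact "entropy testing model", and the indicator matrix is hypothetical.

        import numpy as np

        def entropy_weight_index(X):
            """Composite quality index from an (n_firms x m_indicators) matrix using
            the common entropy-weight method; a sketch, not necessarily the paper's model."""
            X = np.asarray(X, dtype=float)
            Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)  # min-max scaling
            P = (Xn + 1e-12) / (Xn + 1e-12).sum(axis=0)     # column-wise proportions
            k = 1.0 / np.log(X.shape[0])
            e = -k * (P * np.log(P)).sum(axis=0)            # entropy of each indicator
            w = (1.0 - e) / (1.0 - e).sum()                 # low entropy -> high weight
            return Xn @ w                                   # weighted score per firm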

  11. Modal analysis of fluid flows using variants of proper orthogonal decomposition

    Science.gov (United States)

    Rowley, Clarence; Dawson, Scott

    2017-11-01

    This talk gives an overview of several methods for analyzing fluid flows, based on variants of proper orthogonal decomposition. These methods may be used to determine simplified, approximate models that capture the essential features of these flows, in order to better understand the dominant physical mechanisms, and potentially to develop appropriate strategies for model-based flow control. We discuss balanced proper orthogonal decomposition as an approximation of balanced truncation, and explain connections with system identification methods such as the eigensystem realization algorithm. We demonstrate the methods on several canonical examples, including a linearized channel flow and the flow past a circular cylinder. Supported by AFOSR, Grant FA9550-14-1-0289.
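
    The plain (non-balanced) variant referred to above reduces to a singular value decomposition of a mean-subtracted snapshot matrix; a minimal sketch, with the snapshot data assumed to be arranged as one column per time instant:

        import numpy as np

        def pod_modes(snapshots, r):
            """Proper orthogonal decomposition of an (n_points x n_snapshots) matrix.
            Returns the leading r spatial modes, their energies and temporal coefficients."""
            X = snapshots - snapshots.mean(axis=1, keepdims=True)   # subtract mean field
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            modes = U[:, :r]                  # spatial POD modes (orthonormal columns)
            energy = s[:r] ** 2               # modal energy, singular values squared
            coeffs = np.diag(s[:r]) @ Vt[:r]  # temporal coefficients of each mode
            return modes, energy, coeffs

    Balanced POD, by contrast, additionally requires adjoint (or output) snapshots, so that the retained modes balance controllability and observability rather than energy alone.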

  12. Augmented Automated Material Accounting Statistics System (AMASS)

    International Nuclear Information System (INIS)

    Lumb, R.F.; Messinger, M.; Tingey, F.H.

    1983-01-01

    This paper describes an extension of the AMASS methodology which was previously presented at the 1981 INMM annual meeting. The main thrust of the current effort is to develop procedures and a computer program for estimating the variance of an Inventory Difference when many sources of variability, other than measurement error, are admitted in the model. Procedures also are included for the estimation of the variances associated with measurement error estimates and their effect on the estimated limit of error of the inventory difference (LEID). The algorithm for the LEID measurement component uncertainty involves the propagated component measurement variance estimates as well as their associated degrees of freedom. The methodology and supporting computer software is referred to as the augmented Automated Material Accounting Statistics System (AMASS). Specifically, AMASS accommodates five source effects. These are: (1) measurement errors (2) known but unmeasured effects (3) measurement adjustment effects (4) unmeasured process hold-up effects (5) residual process variation A major result of this effort is a procedure for determining the effect of bias correction on LEID, properly taking into account all the covariances that exist. This paper briefly describes the basic models that are assumed; some of the estimation procedures consistent with the model; data requirements, emphasizing availability and other practical considerations; discusses implications for bias corrections; and concludes by briefly describing the supporting computer program

  13. Constructing "proper" ROCs from ordinal response data using weighted power functions.

    Science.gov (United States)

    Mossman, Douglas; Peng, Hongying

    2014-05-01

    Receiver operating characteristic (ROC) analysis is the standard method for describing the accuracy of diagnostic systems where the decision task involves distinguishing between 2 mutually exclusive possibilities. The popular binormal curve-fitting model usually produces ROCs that are improper in that they do not have the ever-decreasing slope required by signal detection theory. Not infrequently, binormal ROCs have visible hooks that falsely imply worse-than-chance diagnostic differentiation where the curve lies below the no-information diagonal. In this article, we present and evaluate a 2-parameter, weighted power function (WPF) model that always results in a proper ROC curve with a positive, monotonically decreasing slope. We used a computer simulation study to compare results from binormal and WPF models. The WPF model produces ROC curves that are less biased and closer to the true values than are curves obtained using the binormal model. The better performance of the WPF model follows from its design constraint as a necessarily proper ROC. The WPF model fits a broader variety of data sets than previously published power function models while maintaining straightforward relationships among the original decision variable, specific operating points, ROC curve contours, and model parameters. Compared with other proper ROC models, the WPF model is distinctive in its simplicity, and it avoids the flaws of the conventional binormal ROC model.
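
    The exact two-parameter WPF form is not reproduced in this record, but the "proper" property itself is easy to see with a one-parameter power-function ROC, TPR = FPR^γ with 0 < γ < 1, whose slope decreases monotonically so the curve never hooks below the chance diagonal; a minimal sketch under that assumed form:

        import numpy as np

        def power_roc(fpr, gamma):
            # Power-function ROC: TPR = FPR**gamma, 0 < gamma < 1. Its slope
            # gamma * FPR**(gamma - 1) decreases monotonically, so the curve is
            # concave ("proper") and stays above the no-information diagonal.
            return fpr ** gamma

        fpr = np.linspace(0.0, 1.0, 501)
        tpr = power_roc(fpr, gamma=0.35)
        auc = np.sum(0.5 * (tpr[1:] + tpr[:-1]) * np.diff(fpr))   # trapezoidal AUC
        print(round(auc, 3))    # ~0.741; analytically AUC = 1 / (gamma + 1)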

  14. A physically meaningful equivalent circuit network model of a lithium-ion battery accounting for local electrochemical and thermal behaviour, variable double layer capacitance and degradation

    Science.gov (United States)

    von Srbik, Marie-Therese; Marinescu, Monica; Martinez-Botas, Ricardo F.; Offer, Gregory J.

    2016-09-01

    A novel electrical circuit analogy is proposed for modelling electrochemical systems under realistic automotive operation conditions. The model is developed for a lithium ion battery and is based on a pseudo 2D electrochemical model. Although cast in the framework familiar to application engineers, the model is essentially an electrochemical battery model: all variables have a direct physical interpretation and there is direct access to all states of the cell via the model variables (concentrations, potentials) for monitoring and control systems design. This is the first Equivalent Circuit Network-type model that tracks directly the evolution of species inside the cell. It accounts for complex electrochemical phenomena that are usually omitted in online battery performance predictors, such as variable double layer capacitance, the full current-overpotential relation and overpotentials due to mass transport limitations. The coupled electrochemical and thermal model accounts for capacity fade via a loss in active species and for power fade via an increase in resistive solid electrolyte passivation layers at both electrodes. The model's capability to simulate cell behaviour under dynamic events is validated against test procedures, such as standard battery testing load cycles for current rates up to 20 C, as well as realistic automotive drive cycle loads.
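
    For contrast with the physically based network described above, the classical first-order Thevenin equivalent circuit, whose abstract RC states are exactly what the paper replaces with electrochemical variables, can be sketched as follows (all parameter values hypothetical):

        import numpy as np

        def simulate_thevenin(i_load, dt, q_ah, r0, r1, c1, ocv_of_soc, soc0=1.0):
            """Classical 1-RC Thevenin battery model, shown only for contrast with the
            physics-based equivalent circuit network of the paper."""
            soc, v_rc, v_out = soc0, 0.0, []
            for i in i_load:                          # discharge current positive, in A
                soc -= i * dt / (q_ah * 3600.0)       # coulomb counting
                v_rc += dt * (-v_rc / (r1 * c1) + i / c1)   # RC branch, forward Euler
                v_out.append(ocv_of_soc(soc) - i * r0 - v_rc)
            return np.array(v_out)

        # Example with a crude linear open-circuit-voltage curve (hypothetical numbers):
        ocv = lambda soc: 3.0 + 1.2 * np.clip(soc, 0.0, 1.0)
        v = simulate_thevenin(np.full(3600, 2.0), dt=1.0, q_ah=5.0,
                              r0=0.02, r1=0.015, c1=2000.0, ocv_of_soc=ocv)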

  15. The Drift Diffusion Model can account for the accuracy and reaction time of value-based choices under high and low time pressure

    Directory of Open Access Journals (Sweden)

    Milica Milosavljevic

    2010-10-01

    An important open problem is how values are compared to make simple choices. A natural hypothesis is that the brain carries out the computations associated with the value comparisons in a manner consistent with the Drift Diffusion Model (DDM), since this model has been able to account for a large amount of data in other domains. We investigated the ability of four different versions of the DDM to explain the data in a real binary food choice task under conditions of high and low time pressure. We found that a seven-parameter version of the DDM can account for the choice and reaction time data with high accuracy, in both the high and low time pressure conditions. The changes associated with the introduction of time pressure could be traced to changes in two key model parameters: the barrier height and the noise in the slope of the drift process.
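
    A minimal simulation of the model class discussed above (not the authors' seven-parameter variant; no starting-point bias or other extensions) shows the basic speed-accuracy trade-off that lowering the barrier under time pressure produces:

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_ddm(drift, barrier, ndt, noise=1.0, dt=0.001, n_trials=500):
            """Simulate choices and reaction times from a basic drift diffusion model."""
            choices, rts = [], []
            for _ in range(n_trials):
                x, t = 0.0, 0.0
                while abs(x) < barrier:               # accumulate evidence to a bound
                    x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
                    t += dt
                choices.append(1 if x > 0 else 0)     # 1 = higher-value item chosen
                rts.append(t + ndt)                   # add non-decision time
            return np.array(choices), np.array(rts)

        # Lowering the barrier mimics time pressure: faster but less accurate choices.
        for barrier in (1.2, 0.6):
            c, rt = simulate_ddm(drift=0.8, barrier=barrier, ndt=0.3)
            print(barrier, round(c.mean(), 2), round(rt.mean(), 2))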

  16. What natural capital disclosure for integrated reporting? Designing & modelling an Integrated Financial - Natural Capital Accounting and Reporting Framework.

    OpenAIRE

    Houdet, Joel; Burritt, Roger; N. Farrell, Katharine; Martin-Ortega, Julia; Ramin, Kurt; Spurgeon, James; Atkins, Jill; Steuerman, David; Jones, Michael; Maleganos, John; Ding, Helen; Ochieng, Cosmas; Naicker, Kiruben; Chikozho, Claudious; Finisdore, John

    2014-01-01

    Business and government leaders from around the world are increasingly sounding the alarm about the need for effective management of business dependencies and impacts on ecosystems. As a consequence, financial institutions have recently made a formal commitment to work towards integrating natural capital considerations into their decision-making processes, including helping improve the accounting and disclosure practices of reporting organisations. Though various frameworks and standards have...

  17. Implementation of a pilot accountable care organization payment model and the use of discretionary and nondiscretionary cardiovascular care.

    Science.gov (United States)

    Colla, Carrie H; Goodney, Philip P; Lewis, Valerie A; Nallamothu, Brahmajee K; Gottlieb, Daniel J; Meara, Ellen

    2014-11-25

    Accountable care organizations (ACOs) seek to reduce growth in healthcare spending while ensuring high-quality care. We hypothesized that accountable care organization implementation would selectively limit the use of discretionary cardiovascular care (defined as care occurring in the absence of indications such as myocardial infarction or stroke), while maintaining high-quality care, such as nondiscretionary cardiovascular imaging and procedures. The intervention group was composed of fee-for-service Medicare patients (n=819 779) from 10 groups participating in a Medicare pilot accountable care organization, the Physician Group Practice Demonstration (PGPD). Matched controls were patients (n=934 621) from nonparticipating groups in the same regions. We compared use of cardiovascular care before (2002-2004) and after (2005-2009) PGPD implementation, studying both discretionary and nondiscretionary carotid and coronary imaging and procedures. Our main outcome measure was the difference in the proportion of patients treated with imaging and procedures among patients of PGPD practices compared with patients in control practices, before and after PGPD implementation (difference-in-difference). For discretionary imaging, the difference-in-difference between PGPD practices and controls was not statistically significant for discretionary carotid imaging (0.17%; 95% confidence interval, -0.51% to 0.85%; P=0.595) or discretionary coronary imaging (-0.19%; 95% confidence interval, -0.73% to 0.35%; P=0.468). Similarly, the difference-in-difference was also minimal for discretionary carotid revascularization (0.003%; 95% confidence interval, -0.008% to 0.002%; P=0.705) and coronary revascularization (-0.02%; 95% confidence interval, -0.11% to 0.07%; P=0.06). The difference-in-difference associated with PGPD implementation was also essentially 0 for nondiscretionary cardiovascular imaging or procedures. Implementation of a pilot accountable care organization did not limit the

  18. The consequences of not accounting for background selection in demographic inference.

    Science.gov (United States)

    Ewing, Gregory B; Jensen, Jeffrey D

    2016-01-01

    Recently, there has been increased awareness of the role of background selection (BGS) in both data analysis and modelling advances. However, BGS is still difficult to take into account because of tractability issues with simulations and difficulty with nonequilibrium demographic models. Often, simple rescaling adjustments of effective population size are used. However, there has been neither a proper characterization of how BGS could bias or shift inference when not properly taken into account, nor a thorough analysis of whether rescaling is a sufficient solution. Here, we carry out extensive simulations with BGS to determine biases and behaviour of demographic inference using an approximate Bayesian approach. We find that results can be positively misleading with significant bias, and describe the parameter space in which BGS models replicate observed neutral nonequilibrium expectations. © 2015 John Wiley & Sons Ltd.

  19. Research on orbit prediction for solar-based calibration proper satellite

    Science.gov (United States)

    Chen, Xuan; Qi, Wenwen; Xu, Peng

    2018-03-01

    Utilizing the mathematical model of orbit mechanics, orbit prediction forecasts a space target's orbit information at a given time based on the orbit at the initial moment. The proper satellite radiometric calibration and the calibration orbit prediction process are introduced briefly. On the basis of research into the calibration space position design method and the radiative transfer model, an orbit prediction method for proper satellite radiometric calibration is proposed to select the appropriate calibration arc for the remote sensor and to predict the orbit information of the proper satellite and the remote sensor. By analyzing the orbit constraints of proper satellite calibration, the GF-1 solar-synchronous orbit is chosen as the proper satellite orbit in order to simulate the calibration visibility duration for different satellites to be calibrated. The results of simulation and analysis provide the basis for improving the radiometric calibration accuracy of the satellite remote sensor, which lays the foundation for high-precision and high-frequency radiometric calibration.

  20. Nuclear material accounting handbook

    International Nuclear Information System (INIS)

    2008-01-01

    The handbook documents existing best practices and methods used to account for nuclear material and to prepare the required nuclear material accounting reports for submission to the IAEA. It provides a description of the processes and steps necessary for the establishment, implementation and maintenance of nuclear material accounting and control at the material balance area, facility and State levels, and defines the relevant terms. This handbook serves the needs of State personnel at various levels, including State authorities, facility operators and participants in training programmes. It can assist in developing and maintaining accounting systems which will support a State's ability to account for its nuclear material such that the IAEA can verify State declarations, and at the same time support the State's ability to ensure its nuclear security. In addition, the handbook is useful for IAEA staff who are closely involved with nuclear material accounting. The handbook includes the steps and procedures a State needs to set up and maintain to provide assurance that it can account for its nuclear material and submit the prescribed nuclear material accounting reports defined in Section 1 and described in Sections 3 and 4 in terms of the relevant agreement(s), thereby enabling the IAEA to discharge its verification function as defined in Section 1 and described in Sections 3 and 4. The contents of the handbook are based on the model safeguards agreement and, where applicable, reference is also made to the model additional protocol. The handbook consists of five sections. In Section 1, definitions or descriptions of terms used are provided in relation to where the IAEA applies safeguards or, for that matter, accounting for and control of nuclear material in a State. The IAEA's approach in applying safeguards in a State is also defined and briefly described, with special emphasis on verification. In Section 2, the obligations of the State

  1. Physiologically motivated time-delay model to account for mechanisms underlying enterohepatic circulation of piroxicam in human beings.

    Science.gov (United States)

    Tvrdonova, Martina; Dedik, Ladislav; Mircioiu, Constantin; Miklovicova, Daniela; Durisova, Maria

    2009-01-01

    The study was conducted to formulate a physiologically motivated time-delay (PM TD) mathematical model for human beings, which incorporates disintegration of a drug formulation, dissolution, discontinuous gastric emptying and enterohepatic circulation (EHC) of a drug. Piroxicam, administered to 24 healthy European individuals as 20 mg capsules (Feldene, Pfizer), was used as a model drug. Plasma was analysed for piroxicam by a validated high-performance liquid chromatography method. The PM TD mathematical model was developed using the measured plasma piroxicam concentration-time profiles of the individuals and the tools of a computationally efficient mathematical analysis and modeling, based on the theory of linear dynamic systems. The constructed model was capable of (i) quantifying the different fractions of the piroxicam dose sequentially disposable for absorption and (ii) estimating the time delays between the time when the piroxicam dose reaches the stomach and the times when individual fractions of the piroxicam dose become disposable for absorption. The model verification was performed through a formal proof, based on comparisons of observed and model-predicted plasma piroxicam concentration-time profiles. The model verification showed an adequate model performance and agreement between the compared profiles. Accordingly, it confirmed that the developed model was an appropriate representative of the piroxicam fate in the individuals enrolled. The presented model provides valuable information on factors that control the dynamic mechanisms of EHC, that is, information unobtainable with the models previously proposed for EHC analysis.
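
    As a purely illustrative toy version of enterohepatic recirculation (far simpler than the published PM TD model, with hypothetical parameter values), a stored biliary fraction can be re-presented to the gut as a bolus after a fixed delay:

        import numpy as np
        from scipy.integrate import solve_ivp

        def one_compartment(t, y, ka, ke):
            gut, plasma = y
            return [-ka * gut, ka * gut - ke * plasma]

        def simulate_ehc(dose=20.0, ka=1.2, ke=0.05, f_ehc=0.3, t_delay=10.0, t_end=48.0):
            """Toy EHC model: a fraction f_ehc of the dose re-enters the gut
            t_delay hours after dosing, producing a secondary plasma peak."""
            s1 = solve_ivp(one_compartment, (0.0, t_delay), [dose, 0.0],
                           args=(ka, ke), max_step=0.1)
            gut1, plasma1 = s1.y[:, -1]
            s2 = solve_ivp(one_compartment, (t_delay, t_end),
                           [gut1 + f_ehc * dose, plasma1],   # biliary bolus back to gut
                           args=(ka, ke), max_step=0.1)
            t = np.concatenate([s1.t, s2.t])
            plasma = np.concatenate([s1.y[1], s2.y[1]])
            return t, plasma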

  2. Account of nonlocal ionization by fast electrons in the fluid models of a direct current glow discharge

    Energy Technology Data Exchange (ETDEWEB)

    Rafatov, I. [Physics Department, Middle East Technical University, Ankara (Turkey); Bogdanov, E. A.; Kudryavtsev, A. A. [Saint Petersburg State University, St. Petersburg (Russian Federation)

    2012-09-15

    We developed and tested a simple hybrid model for a glow discharge, which incorporates nonlocal ionization by fast electrons into the 'simple' and 'extended' fluid frameworks. Calculations have been performed for argon gas. Comparison with the experimental data, as well as with the hybrid (particle) and fluid modelling results, demonstrated good applicability of the proposed model.
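
    The 'simple' fluid framework referred to is typically a drift-diffusion description; adding a nonlocal source term for ionization by fast electrons can be sketched (notation assumed here, not taken from the paper) as

        \[
        \frac{\partial n_e}{\partial t} + \nabla\cdot\boldsymbol{\Gamma}_e
            = S_{\mathrm{loc}}(E/N,\, n_e) + S_{\mathrm{nl}}(\mathbf{r}),
        \qquad
        \boldsymbol{\Gamma}_e = -\mu_e n_e \mathbf{E} - D_e \nabla n_e,
        \]

    coupled to Poisson's equation \nabla\cdot(\varepsilon\mathbf{E}) = e(n_i - n_e), where S_loc is the usual local, field-dependent ionization rate and S_nl the nonlocal contribution computed from the kinetics of the fast electrons.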

  3. A complete soil hydraulic model accounting for capillary and adsorptive water retention, capillary and film conductivity, and hysteresis

    NARCIS (Netherlands)

    Sakai, Masaru; Van Genuchten, Martinus Th|info:eu-repo/dai/nl/31481518X; Alazba, A. A.; Setiawan, Budi Indra; Minasny, Budiman

    2015-01-01

    A soil hydraulic model that considers capillary hysteretic and adsorptive water retention as well as capillary and film conductivity covering the complete soil moisture range is presented. The model was obtained by incorporating the capillary hysteresis model of Parker and Lenhard into the hydraulic

  4. Accounting for the uncertainty related to building occupants with regards to visual comfort: A literature survey on drivers and models

    DEFF Research Database (Denmark)

    Fabi, Valentina; Andersen, Rune Korsholm; Corgnati, Stefano

    2016-01-01

    The probability of switching lighting systems on or off was related to the occupancy and differentiated for arrival, intermediate, and departure periods. The switching probability has been reported to be higher during the entering or the leaving time in relation to contextual variables. In the analysis of switch-on actions, users were often clustered between those who take daylight level into account and switch on lights only if necessary and people who totally disregard the natural lighting. This underlines the importance of how individuality is at the base of the definition of the different types of users.

  5. Micro-mechanical modeling and numerical simulation of creep in concrete taking into account micro-cracking and hygro-thermal effects

    International Nuclear Information System (INIS)

    Thai, M.Q.

    2012-01-01

    Concrete is a complex heterogeneous material whose deformations include a delayed part that is affected by a number of factors such as temperature, relative humidity and microstructure evolution. Taking into account delayed deformations, and in particular creep, is essential in the computation of concrete structures such as those dedicated to radioactive waste storage. The present work aims (1) at elaborating a simple and robust creep model for concrete by using micro-mechanics and accounting for the effects of damage, temperature and relative humidity, and (2) at numerically implementing the developed creep model in a finite element code so as to simulate the behavior of simple structural elements in concrete. To achieve this twofold objective, the work is organized in three parts. In the first part, the cement-based material at the microscopic scale is taken to consist of a linear viscoelastic matrix characterized by a generalized Maxwell model and of particulate phases representing elastic aggregates and pores. The Mori-Tanaka micro-mechanical scheme, the Laplace-Carson transform and its inversion are then used to obtain analytical or numerical estimates of the mechanical and hydro-mechanical parameters of the material. Next, the original micromechanical creep model is coupled to the damage model of Mazars through the concept of pseudo-strains introduced by Schapery. The parameters involved in the creep-damage model thus established are systematically identified using available experimental data. Finally, the effects of temperature and relative humidity are accounted for in the creep-damage model by using the equivalent time method; the efficiency of this approach is demonstrated and discussed in the case of simple creep tests. (author) [fr]
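
    As a point of reference (a standard textbook form, not the calibrated model of the thesis; the symbols are illustrative), the relaxation modulus of a generalized Maxwell matrix used in such schemes is a Prony series, and the Laplace-Carson transform turns the viscoelastic homogenization problem into a symbolic elastic one:

$$
E(t) = E_\infty + \sum_{i=1}^{n} E_i\, e^{-t/\tau_i},
\qquad
\hat{E}(p) = E_\infty + \sum_{i=1}^{n} \frac{E_i\, p\,\tau_i}{1 + p\,\tau_i},
$$

    so that Mori-Tanaka estimates can be evaluated in the transformed (Carson) domain at each value of p and the effective creep response recovered by numerical inversion.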

  6. How Students Learn from Multiple Contexts and Definitions: Proper Time as a Coordination Class

    Science.gov (United States)

    Levrini, Olivia; diSessa, Andrea A.

    2008-01-01

    This article provides an empirical analysis of a single classroom episode in which students reveal difficulties with the concept of proper time in special relativity but slowly make progress in improving their understanding. The theoretical framework used is "coordination class theory," which is an evolving model of concepts and conceptual change.…

  7. A Comparison of Seven Cox Regression-Based Models to Account for Heterogeneity Across Multiple HIV Treatment Cohorts in Latin America and the Caribbean.

    Science.gov (United States)

    Giganti, Mark J; Luz, Paula M; Caro-Vega, Yanink; Cesar, Carina; Padgett, Denis; Koenig, Serena; Echevarria, Juan; McGowan, Catherine C; Shepherd, Bryan E

    2015-05-01

    Many studies of HIV/AIDS aggregate data from multiple cohorts to improve power and generalizability. There are several analysis approaches to account for cross-cohort heterogeneity; we assessed how different approaches can impact results from an HIV/AIDS study investigating predictors of mortality. Using data from 13,658 HIV-infected patients starting antiretroviral therapy from seven Latin American and Caribbean cohorts, we illustrate the assumptions of seven readily implementable approaches to account for across cohort heterogeneity with Cox proportional hazards models, and we compare hazard ratio estimates across approaches. As a sensitivity analysis, we modify cohort membership to generate specific heterogeneity conditions. Hazard ratio estimates varied slightly between the seven analysis approaches, but differences were not clinically meaningful. Adjusted hazard ratio estimates for the association between AIDS at treatment initiation and death varied from 2.00 to 2.20 across approaches that accounted for heterogeneity; the adjusted hazard ratio was estimated as 1.73 in analyses that ignored across cohort heterogeneity. In sensitivity analyses with more extreme heterogeneity, we noted a slightly greater distinction between approaches. Despite substantial heterogeneity between cohorts, the impact of the specific approach to account for heterogeneity was minimal in our case study. Our results suggest that it is important to account for across cohort heterogeneity in analyses, but that the specific technique for addressing heterogeneity may be less important. Because of their flexibility in accounting for cohort heterogeneity, we prefer stratification or meta-analysis methods, but we encourage investigators to consider their specific study conditions and objectives.
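
    As a minimal, hypothetical sketch of two of the approaches compared (pooling that ignores cohort versus stratification by cohort), assuming the Python lifelines package and synthetic data; the variable names and effect sizes are invented and this is not the authors' code:

```python
# Hypothetical sketch: a pooled Cox model that ignores across-cohort heterogeneity
# versus a cohort-stratified Cox model (separate baseline hazard per cohort).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
cohort = rng.integers(0, 7, n)                 # 7 cohorts with different baseline hazards
aids = rng.integers(0, 2, n)                   # AIDS at treatment initiation (0/1)
baseline = 0.05 * (1 + cohort)                 # cohort-specific baseline hazard
time = rng.exponential(1.0 / (baseline * np.exp(0.7 * aids)))
df = pd.DataFrame({"time": np.minimum(time, 5.0),
                   "event": (time < 5.0).astype(int),   # administrative censoring at 5 years
                   "aids": aids, "cohort": cohort})

pooled = CoxPHFitter().fit(df[["time", "event", "aids"]],
                           duration_col="time", event_col="event")          # ignores cohort
stratified = CoxPHFitter().fit(df[["time", "event", "aids", "cohort"]],
                               duration_col="time", event_col="event",
                               strata=["cohort"])                            # cohort-specific baseline

print(pooled.hazard_ratios_["aids"], stratified.hazard_ratios_["aids"])
```

    Stratification is only one of the seven approaches the authors describe; meta-analytic alternatives fit each cohort separately and then pool the log hazard ratios.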

  8. Modeling of the cesium 137 air transfer taking account of dust-making distinction on arable and long-fallow lands

    International Nuclear Information System (INIS)

    Bogdanov, A.P.; Zhmura, G.M.

    1997-01-01

    A mathematical model for the air transfer of cesium-137 from contaminated regions is suggested, which takes into account the difference in dust generation between arable and long-fallow lands. Calculated near-ground concentrations of cesium-137 for several towns in Belarus are presented. The sources of atmospheric contamination at each calculation point have been analysed.

  9. Testing the efficacy of existing force-endurance models to account for the prevalence of obesity in the workforce.

    Science.gov (United States)

    Pajoutan, Mojdeh; Cavuoto, Lora A; Mehta, Ranjana K

    2017-10-01

    This study evaluates whether the existing force-endurance relationship models are predictive of endurance time for overweight and obese individuals, and, if not, provides revised models that can be applied in ergonomics practice. Data were collected from 141 participants (49 normal weight, 50 overweight, 42 obese) who each performed isometric endurance tasks of hand grip, shoulder flexion, and trunk extension at four levels of relative workload. Subject-specific fatigue rates and a general model of the force-endurance relationship were determined and compared to two fatigue models from the literature. There was a lack of fit between the previous models and the current data for the grip task (ICC = 0.8), with a shift toward lower endurance times in the new data. Application of the revised models can facilitate improved workplace design and job evaluation to accommodate the capacities of the current workforce.
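
    Purely for illustration (the functional form and the numbers below are assumptions, not the revised models reported in the study), force-endurance relationships are often summarized by a power law, ET = a * (%MVC)^(-b), which can be fitted to endurance-time data as follows:

```python
# Hypothetical sketch: fit a power-law force-endurance model ET = a * (%MVC)**(-b)
# to made-up endurance-time data.
import numpy as np
from scipy.optimize import curve_fit

def endurance_time(rel_force, a, b):
    """Endurance time (s) as a function of relative force (% MVC)."""
    return a * rel_force ** (-b)

rel_force = np.array([30.0, 45.0, 60.0, 75.0])   # relative workload, % MVC (invented)
et_obs = np.array([290.0, 110.0, 55.0, 32.0])    # observed endurance times, s (invented)

(a_hat, b_hat), _ = curve_fit(endurance_time, rel_force, et_obs, p0=(1e5, 2.0))
print(f"ET ~= {a_hat:.0f} * (%MVC)^(-{b_hat:.2f})")
```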

  10. Determination of proper peaking time for Ultra-LEGe detector

    Energy Technology Data Exchange (ETDEWEB)

    Karabidak, S.M., E-mail: salihm@ktu.edu.t [Department of Physics, Karadeniz Technical University, 61080 Trabzon (Turkey); Department of Engineering Physics, Guemueshane University, 29100 Guemueshane (Turkey); Kaya, S. [Department of Engineering Physics, Guemueshane University, 29100 Guemueshane (Turkey); Cevik, U. [Department of Physics, Karadeniz Technical University, 61080 Trabzon (Turkey); Celik, A. [Department of Physics, Giresun University, 28049 Giresun (Turkey)

    2011-04-15

    Reducing count losses and pile-up effects is necessary for the accuracy of quantitative and qualitative analysis; the proper peaking time for a particular detector system is therefore important. The characteristic X-rays emitted from some pure elements were detected using an Ultra-LEGe detector connected to a Tennelec TC 244 spectroscopy amplifier at different peaking-time settings. Overall pulse widths were determined with an HM 203-7 oscilloscope connected to the amplifier. The proper peaking time for the Ultra-LEGe detector was determined to be 3.84 μs.

  11. Contingent Valuation Method and the beta model: an accounting economic vision for environmental damage in Atlântico Sul Shipyard

    Directory of Open Access Journals (Sweden)

    Silvana Karina de Melo Travassos

    2018-02-01

    The objective of this paper is to apply the beta model as an alternative approach within the Contingent Valuation Method in order to estimate willingness to pay (WTP) for an environmental asset, so that the Tribunal de Contas do Estado de Pernambuco (TCE/PE) can supervise the Atlântico Sul Shipyard (ASS) as a negative environmental externality, which is discussed here from an accounting perspective. Our methodology is exploratory, and the beta regression model was used in the contingent valuation to estimate the value of the environmental asset. The results allowed estimating the value of the Ipojuca mangrove at US$ 134,079,793.50, while the environmental damage caused by the shipyard to the public asset was valued at US$ 61,378,155.37; this latter value is of interest to the inspection body. The final estimated value of the Ipojuca mangrove, however, prompts a discussion of the implications from an accounting point of view, such as the attribution of monetary value to a public asset that has no financial value and problems regarding the conceptualization and valuation of public assets within governmental patrimony. It is concluded that the beta regression model for estimating WTP in contingent valuation contributes to research on accounting measurement techniques for public assets.
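
    Schematically (a generic beta regression specification in the mean-precision parameterization; the authors' exact link function and covariates are not given in this abstract), WTP responses rescaled to the open unit interval are modelled as

$$
y_i \sim \mathrm{Beta}\big(\mu_i \phi,\ (1-\mu_i)\phi\big),
\qquad
\operatorname{logit}(\mu_i) = \mathbf{x}_i^{\top}\boldsymbol{\beta},
\qquad
\operatorname{Var}(y_i) = \frac{\mu_i (1-\mu_i)}{1+\phi},
$$

    so that the fitted means \hat{\mu}_i are mapped back to the monetary scale to obtain individual WTP, and the asset value follows by aggregating over the relevant population.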

  12. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses.

    Science.gov (United States)

    Welton, Nicky J; Soares, Marta O; Palmer, Stephen; Ades, Anthony E; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M

    2015-07-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. © The Author(s) 2015.
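
    In compact notation (standard random-effects meta-regression, not the full network meta-analysis model of the paper), the study-level effect estimates with a potential effect modifier x_i can be written

$$
\hat{\theta}_i \sim N(\delta_i,\ s_i^2),
\qquad
\delta_i \sim N(d + \beta x_i,\ \tau^2),
$$

    and the CEA model for a decision population with covariate value x* may then use the mean effect d + \beta x*, the predictive distribution N(d + \beta x*, \tau^2), or a shrunken study-specific estimate, depending on how the decision setting relates to the included studies.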

  13. Demographic factors and retrieval of object and proper names after age 70.

    Directory of Open Access Journals (Sweden)

    Gitit Kavé

    This research aimed to investigate whether demographic factors are similarly related to the retrieval of object and proper names. The sample included 5,907 individuals above age 70 who participated in the Health and Retirement Study (HRS) between 2004 and 2012. Participants were asked to name two objects as well as the US President and Vice President. Latent growth curve models examined the associations of age, education, and self-rated health with baseline levels and change trajectories in retrieval. Age and education were more strongly related to retrieval of proper names than to retrieval of object names, both for baseline scores and for change trajectories. Similar effects of self-rated health emerged for both types of stimuli. The results show that examining object names and proper names together as an indication of cognitive status in the HRS might overlook important differences between the two types of stimuli, in both baseline performance and longitudinal change.
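
    In outline (a generic linear latent growth curve with time-invariant covariates; the published specification may differ in detail), retrieval scores y_it for person i at occasion t are modelled as

$$
y_{it} = \eta_{0i} + \eta_{1i}\, t + \varepsilon_{it},
\qquad
\eta_{0i} = \gamma_{00} + \gamma_{01}\,\mathrm{age}_i + \gamma_{02}\,\mathrm{edu}_i + \gamma_{03}\,\mathrm{health}_i + \zeta_{0i},
$$

    with an analogous regression for the slope \eta_{1i}, so that the \gamma coefficients capture how age, education and self-rated health relate to baseline level and change, estimated separately for object and proper names.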

  14. Designing a Complete Model for Evaluating Companies in "The Modern Economy" and Refining Financial-Accounting Information

    Directory of Open Access Journals (Sweden)

    Pepi Mitică

    2017-01-01

    The limitations of current evaluation methods call for broadening the approaches used to identify new solutions for representing the value of ICT companies. The features of the "modern economy", the imperative of eliminating inflection points, the need to formulate an equidistant definition of value, and the lack of correlation between the refinement of accounting regulations on intangible assets and economic and social development based on intellectual capital are all arguments for the emergence of a new representation of value. The new FMV (Future Market Value) method provides economic information in its dynamics and value in its evolution. The concerns of practitioners in the field over the last decade are consistent with the premises of our research.

  15. NUMERICAL MODELING OF CONJUGATE HEAT TRANSFER IN AN INSULATED GLASS UNIT (IGU) WITH ACCOUNT FOR ITS DEFORMATION

    Directory of Open Access Journals (Sweden)

    Golubev Stanislav Sergeevich

    2012-12-01

    Different climatic impacts lead to deformation of the glass panes within an IGU (and, correspondingly, of its vertical cavity). Deformation of the glass panes and vertical cavity reduces the thermal resistance of an IGU. A numerical simulation of conjugate heat transfer within an IGU was implemented as part of the research into this phenomenon. Calculations were performed in the ANSYS FLUENT CFD package. The basic equations describing conservation of mass, conservation of momentum (in the Boussinesq approximation) and conservation of energy were solved, and radiation exchange between the cavity walls was taken into account. Vertical walls were treated as non-isothermal, while horizontal walls were adiabatic. Calculations were made for several patterns of glass deformation. The results demonstrate that the heat flow over the vertical walls intensifies as the distance between the centres of the IGU glass panes is reduced, and the temperature in the central area of the hot glass drops.
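
    For reference, the governing equations that such conjugate natural-convection calculations typically solve (written here in generic form; the cited work's exact source terms and radiation model may differ) are

$$
\nabla\cdot\mathbf{u}=0,
\qquad
\rho_0\!\left(\frac{\partial \mathbf{u}}{\partial t}+(\mathbf{u}\cdot\nabla)\mathbf{u}\right)
= -\nabla p + \mu\nabla^{2}\mathbf{u} - \rho_0\,\beta\,(T-T_0)\,\mathbf{g},
\qquad
\rho_0 c_p\!\left(\frac{\partial T}{\partial t}+\mathbf{u}\cdot\nabla T\right) = k\nabla^{2} T,
$$

    with surface-to-surface radiation exchange between the cavity walls entering as a boundary heat flux and conjugate conduction solved in the glass panes.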

  16. Multi-regional input–output model and ecological network analysis for regional embodied energy accounting in China

    International Nuclear Information System (INIS)

    Zhang, Yan; Zheng, Hongmei; Yang, Zhifeng; Su, Meirong; Liu, Gengyuan; Li, Yanxian

    2015-01-01

    Chinese regions frequently exchange materials, but regional differences in economic development create unbalanced flows of these resources. In this study, we assessed embodied energy consumption to describe the energy-flow structure of China's seven regions. Based on multi-regional monetary input–output tables and energy statistical yearbooks for Chinese provinces in 2002 and 2007, we accounted for both direct and indirect energy consumption, as well as the integral input and output of the provinces. Most integral inputs of energy flowed from north to south or from east to west, whereas integral output flows were mainly from northeast to southwest. This differed from the direct flows, which were predominantly from north to south and from west to east, demonstrating the importance of calculating both direct and indirect energy flows. Analysis of the distance and direction traveled by the centers of gravity of energy consumption showed that the centers for embodied energy consumption and inputs moved southeast because of the movements of the centers of the Eastern region, whereas the center for outputs moved northeast because of the movement of the Central region. These analyses provide a basis for identifying how regional economic development policies influence embodied energy consumption and its flows among regions. - Highlights: • We integrated multi-regional input–output analysis with ecological network analysis. • We accounted for both direct and indirect energy consumption. • The centers of gravity for embodied energy flows moved southeast from 2002 to 2007. • The results support planning of energy consumption and energy flows among regions.
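
    As a toy numerical illustration of the embodied-energy accounting step (a hypothetical three-sector economy, not the Chinese multi-regional tables used in the study), total energy intensities follow from the Leontief inverse:

```python
# Hypothetical sketch: embodied energy via environmentally extended input-output analysis.
import numpy as np

A = np.array([[0.10, 0.20, 0.05],     # technical coefficient matrix (toy 3-sector economy)
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.20]])
e = np.array([2.0, 0.5, 1.2])         # direct energy use per unit of output (e.g., MJ per $)
y = np.array([100.0, 250.0, 80.0])    # final demand by sector (e.g., $)

L = np.linalg.inv(np.eye(3) - A)      # Leontief inverse (I - A)^-1
eps = e @ L                           # total (direct + indirect) energy intensities
embodied = eps * y                    # energy embodied in each sector's final demand
print(eps, embodied, embodied.sum())
```

    In a multi-regional setting the same algebra is applied to the block-structured inter-regional coefficient matrix, which is what allows direct and indirect (embodied) flows between regions to be separated.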

  17. Physical-based analytical model of flexible a-IGZO TFTs accounting for both charge injection and transport

    NARCIS (Netherlands)

    Ghittorelli, M.; Torricelli, F.; Steen, J.L. van der; Garripoli, C.; Tripathi, A.; Gelinck, G.H.; Cantatore, E.; Kovacs-Vajna, Z.M.

    2016-01-01

    Here we present a new physical-based analytical model of a-IGZO TFTs. TFTs with channel lengths scaling from L = 200 μm down to L = 15 μm, fabricated on plastic foil, are accurately reproduced with a single set of parameters. The model is used to design a zero-VGS inverter and is a valuable tool for circuit design and technology

  18. A new slurry pH model accounting for effects of ammonia and carbon dioxide volatilization on solution speciation

    DEFF Research Database (Denmark)

    Petersen, V.; Markfoged, R.; Hafner, S. D.

    2014-01-01

    and crucial pH changes at the surface of stored slurry or of slurry applied in the field. In the model, slurry pH is calculated by simultaneously determining: (1) speciation of the acid-base reactions, (2) diffusion of each buffer species, and (3) emission of NH3 and CO2. New features of the model include...
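
    A heavily simplified sketch of the speciation step alone (ammonia and carbonate buffers only, with activities and other buffers such as volatile fatty acids ignored; the constants and concentrations are illustrative and this is not the published model): pH is obtained as the root of the charge balance.

```python
# Hypothetical sketch: solve a simplified charge balance for slurry pH given total
# ammoniacal nitrogen (TAN) and total inorganic carbon (TIC); other buffers ignored.
from scipy.optimize import brentq

KW, KA_NH4, KA1, KA2 = 1e-14, 10**-9.25, 10**-6.35, 10**-10.33  # equilibrium constants, ~25 deg C

def charge_balance(pH, tan, tic):
    h = 10.0 ** (-pH)
    oh = KW / h
    nh4 = tan * h / (h + KA_NH4)                 # NH4+ fraction of TAN
    d = h * h + KA1 * h + KA1 * KA2              # carbonate speciation denominator
    hco3 = tic * KA1 * h / d
    co3 = tic * KA1 * KA2 / d
    return (nh4 + h) - (hco3 + 2.0 * co3 + oh)   # positive minus negative charge

tan, tic = 0.15, 0.10                            # mol/L, made-up values
pH = brentq(charge_balance, 2.0, 13.0, args=(tan, tic))
print(f"simplified slurry pH ~ {pH:.2f}")
```

    In the full model this equilibrium step is coupled to diffusion of each buffer species and to NH3 and CO2 emission at the slurry surface.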

  19. Two-region mass transfer to account for 2D profile scale heterogeneity in a 1D effective plot scale flow model

    Science.gov (United States)

    Filipovic, Vilim; Coquet, Yves; Gerke, Horst H.

    2017-04-01

    In arable soil landscapes, specific spatial heterogeneities related to tillage and trafficking can influence the movement of water and chemicals. The structure of the topsoil is characterized by spatial patterns with locally compacted zones, and the contrasting hydraulic properties of more- and less-compacted soil zones can result in heterogeneous flow fields and preferential flow. Two- or three-dimensional models used to account for soil spatial variability become too complex when local heterogeneities are included in the description of field-scale flow and transport problems. The idea was to reduce the model complexity linked to the explicit description of heterogeneities in 2D or 3D without deteriorating the validity of simulation results. When reducing the spatial dimensionality, the geometry of a 2D, cross-sectional explicit plot description is given up at the expense of an increased complexity of the 1D model, which has two flow domains and mass exchange between them. Our objective was to design a simplified 1D model approach that effectively accounts for plot-scale soil structural variability. In this simplified 1D model, effective soil hydraulic parameters can be assigned to each of the two domains separately. Different theoretical scenarios simulating different shapes, sizes and arrangements of compacted clods in the tilled layer were set up to estimate their effect on solute behaviour. The mass exchange parameters could be determined from structure quantification and by comparing the simplified 1D results with reference 2D results that account for defined soil structural geometries (here, the compacted regions). The mass exchange is strongly related to the geometry of the compacted zones, including their distribution and size within the non-compacted soil. Additionally, the simplified model approach was tested by comparing it with measured results from a field tracer experiment.
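
    For context, first-order mass exchange between the two domains is commonly written in the dual-permeability form of Gerke and van Genuchten (shown here as the generic textbook expressions, not necessarily the exact terms implemented in this study):

$$
\Gamma_w = \alpha_w\,(h_2 - h_1),
\qquad
\Gamma_s = \alpha_s\,(c_2 - c_1) + \Gamma_w\, c^{*},
$$

    where h and c are the pressure heads and solute concentrations in the two domains, \alpha_w and \alpha_s are mass transfer coefficients that lump the geometry (size, shape and spacing of the compacted clods), and c^{*} is the concentration of the domain from which water is transferred.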

  20. Biosphere modelling for safety assessment of geological disposal taking account of denudation of contaminated soils. Research document

    International Nuclear Information System (INIS)

    Kato, Tomoko

    2003-03-01

    Biosphere models for the safety assessment of geological disposal have been developed on the assumption that repository-derived radionuclides reach the surface environment via groundwater. In that modelling, rivers, deep wells and the marine environment have been considered as geosphere-biosphere interfaces (GBIs), and some Japanese-specific 'reference biospheres' have been developed using an approach consistent with the BIOMOVS II/BIOMASS Reference Biosphere Methodology. In this study, it is assumed that repository-derived radionuclides would reach the surface environment in the solid phase through uplift and erosion of contaminated soil and sediment. Radionuclides entering the surface environment by these processes could be distributed between the solid and liquid phases and could spread within the biosphere via both phases. Based on these concepts, a biosphere model that considers the variably saturated zone beneath the surface soil (VSZ) as a GBI was developed for calculating the flux-to-dose conversion factors of three exposure groups (farming, freshwater fishing, marine fishing) based on the Reference Biosphere Methodology. The flux-to-dose conversion factors for the farming exposure group were the highest, and 'inhalation of dust', 'external irradiation from soil' and 'ingestion of soil' were the dominant exposure pathways for most of the radionuclides considered in this model. The flux-to-dose conversion factors calculated by the biosphere model in this study cannot be compared directly with those calculated by the biosphere models developed in previous studies, because the migration processes considered when the radionuclides enter the surface environment through the aquifer differ among the models; in the previous biosphere models it was assumed that repository-derived radionuclides entered GBIs such as rivers, deep wells and the marine environment via groundwater without dilution and retardation in the aquifer. Consequently, one must model the migration of