WorldWideScience

Sample records for economic computer modeling

  1. Computational Economic Modeling of Migration

    OpenAIRE

    Klabunde, Anna

    2014-01-01

    In this paper an agent-based model of endogenously evolving migrant networks is developed to identify the determinants of migration and return decisions. Individuals are connected by links, the strength of which declines over time and distance. Methodologically, this paper combines parameterization using data from the Mexican Migration Project with calibration. It is shown that expected earnings, an idiosyncratic home bias, network ties to other migrants, strength of links to the home country...
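The mechanism described — ties that strengthen with contact and decay with time — can be sketched as a toy agent-based simulation. All numbers below (decay rate, meeting probability, propensity coefficients) are illustrative assumptions, not Klabunde's calibrated values:

```python
import random

def simulate_network(n_agents=30, steps=50, decay=0.05, p_meet=0.1, seed=1):
    """Toy sketch: symmetric link strengths between migrants decay each
    period unless the pair interacts (with probability p_meet)."""
    rng = random.Random(seed)
    strength = [[0.5 if i != j else 0.0 for j in range(n_agents)]
                for i in range(n_agents)]
    for _ in range(steps):
        for i in range(n_agents):
            for j in range(i + 1, n_agents):
                if rng.random() < p_meet:
                    # interaction refreshes the tie
                    strength[i][j] = min(1.0, strength[i][j] + 0.2)
                else:
                    # ties weaken with time apart
                    strength[i][j] *= (1.0 - decay)
                strength[j][i] = strength[i][j]
    return strength

def migration_propensity(strength, i, expected_wage_gap=0.3, home_bias=0.2):
    """Propensity rises with expected earnings and network ties, falls
    with an idiosyncratic home bias (coefficients are illustrative)."""
    ties = sum(strength[i]) / (len(strength) - 1)
    return expected_wage_gap + 0.5 * ties - home_bias

s = simulate_network()
p = migration_propensity(s, 0)
```

A calibrated model would estimate the decay and coefficient values from data such as the Mexican Migration Project, as the abstract describes.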

  2. Computer models for economic and silvicultural decisions

    Science.gov (United States)

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  3. Efficient sampling and meta-modeling in computational economic models

    NARCIS (Netherlands)

    Salle, I.; Yıldızoğlu, M.

    2014-01-01

Extensive exploration of simulation models comes at a high computational cost, all the more when the model involves many parameters. Economists usually rely on random explorations, such as Monte Carlo simulations, and basic econometric modeling to approximate the properties of computational
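A minimal illustration of the random-exploration-plus-meta-model workflow the abstract refers to: draw Monte Carlo samples of a parameter space, run the (here trivial, stand-in) simulation model at each point, then fit a cheap least-squares surrogate. The stand-in model and sample size are assumptions for illustration only:

```python
import random

def model(x1, x2):
    """Stand-in for an expensive simulation model (hypothetical)."""
    return 3.0 * x1 - 2.0 * x2 + 1.0

def monte_carlo_sample(n, seed=0):
    """Random exploration of the unit square of parameters."""
    rng = random.Random(seed)
    return [(rng.uniform(0, 1), rng.uniform(0, 1)) for _ in range(n)]

def fit_linear_metamodel(points, ys):
    """Least-squares fit of y ~ a*x1 + b*x2 + c via the normal
    equations, solved with a tiny Gaussian elimination."""
    rows = [(x1, x2, 1.0) for x1, x2 in points]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    for col in range(3):                       # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, 3):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, 3):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    coef = [0.0] * 3                           # back substitution
    for r in (2, 1, 0):
        coef[r] = (xty[r] - sum(xtx[r][c] * coef[c]
                                for c in range(r + 1, 3))) / xtx[r][r]
    return coef  # [a, b, c]

pts = monte_carlo_sample(200)
ys = [model(x1, x2) for x1, x2 in pts]
a, b, c = fit_linear_metamodel(pts, ys)
```

Once fitted, the surrogate can be evaluated thousands of times at negligible cost — the point of meta-modeling.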

  4. Computer model for economic study of unbleached kraft paperboard production

    Science.gov (United States)

    Peter J. Ince

    1984-01-01

    Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...

  5. Use of a Deterministic Macroeconomic Computer Model as a Teaching Aid in Economic Statistics.

    Science.gov (United States)

    Tedford, John R.

A simple deterministic macroeconomic computer model was tested in a junior- and senior-level economic statistics course to demonstrate how and why some common errors arise when statistical estimation techniques are applied to economic relationships in empirical problem situations. The computer model was treated as the true universe or real-world…

  6. Heterogeneous computing in economics

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

    2014-01-01

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of C++ Accelerated Massive Parallelism (C++ AMP) recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (J Econ Dyn...

  7. The model of localized business community economic development under limited financial resources: computer model and experiment

    Directory of Open Access Journals (Sweden)

    Berg Dmitry

    2016-01-01

Full Text Available Globalization processes now affect, and are affected by, most organizations, many types of resources, and the natural environment. One of the main restrictions these processes impose is financial: money turnover in global markets concentrates money in certain financial centers, leaving local business communities short of it. This work discusses the advantages of introducing a complementary currency into a local economy. Computer simulation with a purpose-built program model, together with a real economic experiment, showed that the complementary currency does not compete with the traditional currency; rather, it operates alongside it, providing conditions for sustainable business community development.

  8. Modeling the economic costs of disasters and recovery: analysis using a dynamic computable general equilibrium model

    Science.gov (United States)

    Xie, W.; Li, N.; Wu, J.-D.; Hao, X.-L.

    2014-04-01

Disaster damages have negative effects on the economy, whereas reconstruction investment has positive effects. The aim of this study is to model the economic costs of disasters and recovery, including the positive effects of reconstruction activities. A computable general equilibrium (CGE) model is a promising approach because it can incorporate these two kinds of shocks into a unified framework and, furthermore, avoid the double-counting problem. In order to factor both shocks into the CGE model, direct loss is set as the amount of capital stock removed on the supply side of the economy; a portion of investment restores the capital stock in an existing period; an investment-driven dynamic model is formulated according to available reconstruction data; and the rest of a given country's saving is set as an endogenous variable to balance the fixed investment. The 2008 Wenchuan Earthquake is selected as a case study to illustrate the model, and three scenarios are constructed: S0 (no disaster occurs), S1 (disaster occurs with reconstruction investment) and S2 (disaster occurs without reconstruction investment). S0 is taken as business as usual, and the differences between S1 and S0 and between S2 and S0 can be interpreted as economic losses including and excluding reconstruction, respectively. Output from S1 is found to be closer to real data than that from S2. Economic loss under S2 is roughly 1.5 times that under S1. The gap in the economic aggregate between S1 and S0 is reduced to 3% at the end of government-led reconstruction activity, a level that would take another four years to reach under S2.
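The three-scenario logic (S0/S1/S2) can be mimicked with a toy one-sector capital-accumulation model rather than a full CGE model: a shock removes part of the capital stock, and reconstruction adds extra investment for a few years afterwards. All parameters below are illustrative, not the paper's Wenchuan calibration:

```python
def simulate(years=10, k0=100.0, s=0.2, delta=0.05, alpha=0.3,
             shock_year=1, shock_frac=0.2, rebuild=0.0):
    """Toy growth path with output y = k**alpha. A disaster removes
    shock_frac of the capital stock; `rebuild` is extra investment per
    year for 3 years after the shock. Illustrative only: a real CGE
    model tracks many sectors, prices, and an endogenous saving rate."""
    k, path = k0, []
    for t in range(years):
        if t == shock_year:
            k *= (1.0 - shock_frac)           # direct loss of capital
        y = k ** alpha
        extra = rebuild if shock_year <= t < shock_year + 3 else 0.0
        k = k * (1.0 - delta) + s * y + extra  # depreciation + investment
        path.append(y)
    return path

s0 = simulate(shock_frac=0.0)   # no disaster (business as usual)
s1 = simulate(rebuild=2.0)      # disaster + reconstruction investment
s2 = simulate(rebuild=0.0)      # disaster, no reconstruction
loss_with_rebuild = sum(a - b for a, b in zip(s0, s1))
loss_without = sum(a - b for a, b in zip(s0, s2))
```

As in the paper's qualitative finding, the cumulative output loss without reconstruction exceeds the loss with it.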

  9. Economic models for management of resources in peer-to-peer and grid computing

    Science.gov (United States)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

The accelerated development of Peer-to-Peer (P2P) and Grid computing has positioned them as promising next-generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development, and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources owned by different organizations or peers, whose owners have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market there exist various economic models for setting the price of goods based on supply and demand and their value to the user; they include the commodity market, posted price, tender, and auction models. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value, and the infrastructure necessary to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed, which contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
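One of the pricing models named above, the commodity market, can be sketched as a Walrasian price-adjustment loop: raise the price of a resource (say, CPU-hours) under excess demand, lower it under excess supply, until the market clears. The demand and supply curves below are hypothetical, not part of the Nimrod/G framework:

```python
def demand(price, budget=100.0):
    """Consumers buy fewer CPU-hours as the price rises (illustrative)."""
    return budget / price

def supply(price, capacity=50.0):
    """Providers offer more capacity at higher prices (illustrative)."""
    return capacity * price

def commodity_market_price(p0=1.0, step=0.01, iters=10000):
    """Tatonnement: adjust price in proportion to excess demand
    until the market (approximately) clears."""
    p = p0
    for _ in range(iters):
        excess = demand(p) - supply(p)
        p += step * excess
        if abs(excess) < 1e-9:
            break
    return p

p_star = commodity_market_price()
```

With these curves the clearing price solves 100/p = 50p, i.e. p = sqrt(2); posted-price, tender, and auction models would replace this loop with their own protocols.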

  10. Economic Assessment of Correlated Energy-Water Impacts using Computable General Equilibrium Modeling

    Science.gov (United States)

    Qiu, F.; Andrew, S.; Wang, J.; Yan, E.; Zhou, Z.; Veselka, T.

    2016-12-01

Many studies on energy and water are rightly interested in the interaction of water and energy and their projected interdependence into the future. Water is an essential input to the power sector, and energy is required to pump water for end use in either household consumption or industrial uses. However, existing studies either qualitatively discuss the issues, particularly how better understanding the interconnectedness of the system is paramount to better policy recommendations, or consider a partial equilibrium framework in which changes in water use and energy use are considered explicitly without regard to other repercussions throughout the regional, national, and international economic landscape. While many studies are beginning to ask the right questions, the lack of numerical rigor raises concerns about the conclusions drawn. Most use life cycle analysis as a method for providing numerical results, though this lacks the flexibility that economics can provide. In this study, we perform economic analysis using computable general equilibrium models with energy-water interdependencies captured as an important factor. We attempt to answer two important questions: how can we characterize the economic choice of energy technology adoptions and their implications for water use in the domestic economy? Moreover, given predictions of reduced rainfall in the near future, how does this impact the water supply in the midst of this energy-water trade-off?

  11. [Economic benefits of overlapping induction: investigation using a computer simulation model].

    Science.gov (United States)

    Hunziker, S; Baumgart, A; Denz, C; Schüpfer, G

    2009-06-01

The aim of this study was to investigate the potential economic benefit of overlapping anaesthesia induction, given that all patient diagnosis-related groups (AP DRG) are used as the model for hospital reimbursement. Due to the resource-intensive production process, the operating room (OR) environment is the most expensive part of the supply chain for surgical disciplines. The economic benefit of a parallel production process (additional personnel, adaptation of the process) compared to a conventional serial layout was assessed. A computer-based simulation method was used, with commercially available simulation software; assumptions for revenues were based on AP DRG reimbursement. Based on a system analysis, a model for the computer simulation was designed in a step-by-step abstraction process. In the model, two operating rooms were used for parallel processing and two operating rooms for a serial production process. Six different types of surgical procedures based on historical case durations were investigated. The contribution margin was calculated as the increased revenues minus the cost of the additional anaesthesia personnel. Over a period of 5 weeks, 41 additional surgical cases were performed under the assumption of a surgery duration of 89+/-4 min (mean+/-SD); the additional contribution margin was CHF 104,588. In the case of longer surgical procedures of 103+/-25 min duration (mean+/-SD), an increase of 36 cases was possible in the same time period and the contribution margin increased by CHF 384,836. When surgical cases with a mean procedural time of 243+/-55 min were simulated, 15 additional cases were possible, with an additional contribution margin of CHF 321,278. Although costs increased in this simulation when a serial production process was changed to a parallel system layout, due to more personnel, an increase of the contribution margin was possible, especially with
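A back-of-the-envelope version of the serial-versus-overlapping comparison: with overlapping induction, anaesthesia for the next patient runs during the current surgery, so after the first case only surgery time occupies the room. The induction time (25 min) is an invented assumption, and the per-case margin (CHF 104,588 / 41, roughly CHF 2,551) is derived from the abstract; the study itself used a stochastic simulation, not this deterministic count:

```python
def cases_per_period(total_minutes, surgery_min, induction_min, overlap):
    """Count how many cases fit into one OR's schedule. With overlap,
    induction of the next patient happens outside the OR, so only the
    first case occupies the room for surgery + induction. Simplified:
    deterministic durations, no turnover time."""
    used, cases = 0, 0
    while True:
        block = surgery_min if (overlap and cases > 0) \
            else surgery_min + induction_min
        if used + block > total_minutes:
            return cases
        used += block
        cases += 1

minutes = 5 * 5 * 8 * 60        # 5 weeks x 5 days x 8 h of OR time
serial = cases_per_period(minutes, 89, 25, overlap=False)
parallel = cases_per_period(minutes, 89, 25, overlap=True)
extra_margin = (parallel - serial) * 2551  # CHF per case, from abstract
```

Even this crude count shows the parallel layout fitting in additional cases, which is where the extra contribution margin comes from.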

  12. New Research Perspectives in the Emerging Field of Computational Intelligence to Economic Modeling

    Directory of Open Access Journals (Sweden)

    Vasile MAZILESCU

    2009-01-01

Full Text Available Computational Intelligence (CI) is a new development paradigm of intelligent systems resulting from a synergy between fuzzy sets, artificial neural networks, evolutionary computation, machine learning, etc., broadening computer science, physics, economics, engineering, mathematics, and statistics. It is imperative to know why these tools can be potentially relevant and effective for economic and financial modeling. After presenting this synergic new paradigm of intelligent systems, the paper presents as a practical case study the fuzzy and temporal properties of the knowledge formalism embedded in an Intelligent Control System (ICS) based on the FT-algorithm. We do not deal with high-level reasoning methods, because we think that real-time problems can only be solved by rather low-level reasoning. Most of the overall run-time of fuzzy expert systems is spent in the match phase. To achieve fast reasoning, the number of fuzzy set operations must be reduced. For this we use a compiled fuzzy knowledge structure, like Rete, because it is required for real-time responses. Solving the match-time predictability problem would allow us to build much more powerful reasoning techniques.
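The match phase discussed above computes rule firing strengths from fuzzy memberships, and its cost grows with the number of fuzzy set operations. A minimal sketch with triangular membership functions and min as the AND operator; the rules and thresholds are invented for illustration, not taken from the ICS described in the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_rule_match(temp, load):
    """Evaluate two toy rules with min as fuzzy AND:
      R1: IF temp is high   AND load is high THEN alarm
      R2: IF temp is medium AND load is high THEN warn
    Returns the firing strengths, the quantity the match phase computes;
    a compiled (Rete-like) structure would share the common 'load is
    high' test between the two rules instead of recomputing it."""
    high_t = tri(temp, 60, 80, 100)
    med_t = tri(temp, 30, 50, 70)
    high_l = tri(load, 0.5, 0.8, 1.1)
    return {"alarm": min(high_t, high_l), "warn": min(med_t, high_l)}

strengths = fuzzy_rule_match(temp=75, load=0.9)
```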

  13. AJAN TABANLI MODELLEME VE HESAPLAMALI İKTİSAT - AGENT-BASED MODELLING AND COMPUTATIONAL ECONOMICS

    Directory of Open Access Journals (Sweden)

    Emrah KELEŞ

    2014-07-01

Full Text Available Assumptions of rationality and homogeneity, and a representative-agent framework that rules out interactions between agents, have led to a decline in confidence in mainstream economics based on dynamic stochastic general equilibrium (DSGE) models. Starting from the late 1990s, the agent-based computational approach has become increasingly popular in the social sciences, especially in financial economics, industrial organization, macroeconomics, political economy, and economic network formation. Finally, the 2008 global financial crisis has caused the mainstream to be debated more loudly and the agent-based approach to be adopted more widely. This new approach enables researchers to construct artificial worlds populated by various agents, ranging from passive physical entities to active decision-makers with states, beliefs, and behavioral rules. In these artificial worlds, the interaction of agents with one another and with their environment allows them to be adaptive and to form a complex adaptive system. This study examines the main elements of the agent-based approach and demonstrates its advantages over DSGE models.

  15. Building and dwelling register in computer-supported decision models of spatial economics

    Directory of Open Access Journals (Sweden)

    Marija Bogataj

    1992-12-01

Full Text Available The theoretical results originating from the theory of urban economics and from other theories of spatial economics can be successfully applied to urban growth control in Slovenia if a GIS with a Building and Dwelling Register (BDR) embedded in it is available. The BDR will enable us to upgrade the relevant mass appraisal and to formalise the skeleton for multicriterion, multilevel decision models. In this paper simple spatial decision results (originating from some of Alonso's basic considerations) are discussed, which have to be studied simultaneously in models of spatial interactions supported by GIS, where the Building and Dwelling Register plays its role. How to construct this register to support the decision models is briefly discussed.

  16. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

Computational models can be found everywhere in present-day science and engineering. Raymond Turner provides a logical framework and foundation for the specification and design of specification languages, and uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications, from programming language semantics and specification languages through to knowledge representation languages and formalisms for natural language semantics. They are al

  17. Computable general equilibrium modelling of economic impacts from volcanic event scenarios at regional and national scale, Mt. Taranaki, New Zealand

    Science.gov (United States)

    McDonald, G. W.; Cronin, S. J.; Kim, J.-H.; Smith, N. J.; Murray, C. A.; Procter, J. N.

    2017-12-01

The economic impacts of volcanism extend well beyond the direct costs of loss of life and asset damage. This paper presents one of the first attempts to assess the economic consequences of disruption associated with volcanic impacts at a range of temporal and spatial scales using multi-regional and dynamic computable general equilibrium (CGE) modelling. Based on the last decade of volcanic research findings at Mt. Taranaki, three volcanic event scenarios (Tahurangi, Inglewood and Opua) differentiated by critical physical thresholds were generated. In turn, the corresponding disruption economic impacts were calculated for each scenario. Under the Tahurangi scenario (annual probability of 0.01-0.02), a small-scale explosive (Volcanic Explosivity Index (VEI) 2-3) and dome-forming eruption, the economic impacts were negligible, with complete economic recovery within a year. The larger Inglewood sub-Plinian to Plinian eruption scenario (VEI > 4, annualised probability of 0.003) produced significant impacts on the Taranaki region economy of NZ$207 million (representing 4.0% of regional gross domestic product (GDP) 1 year after the event, in 2007 New Zealand dollars), which would take around 5 years to recover from. The Opua scenario, the largest-magnitude volcanic hazard modelled, is a major flank collapse and debris avalanche event with an annual probability of 0.00018. The associated economic impacts of this scenario were NZ$397 million (representing 7.7% of regional GDP 1 year after the event), with the Taranaki region economy suffering permanent structural changes. Our dynamic analysis illustrates that different economic impacts play out at different stages in a volcanic crisis. We also discuss the key strengths and weaknesses of our modelling along with potential extensions.

  18. Trade Barrier Elimination, Economics of Scale and Market Competition: Computable General Equilibrium Model

    Directory of Open Access Journals (Sweden)

    Widyastutik Widyastutik

    2017-07-01

Full Text Available The ASEAN and its dialogue partner countries agreed to reduce trade barriers in the services sector, one of which is sea transport services. The purpose of this study is to estimate the tax equivalent of non-tariff barriers in sea transport services and to analyze the economic impacts of eliminating regulatory barriers in the sea transport services of ASEAN and its dialogue partner countries. Using the gravity model, it can be shown that trade barriers in the sea transport services sector of ASEAN and its dialogue partner countries are still relatively high. Additionally, by adopting the IC-IRTS model in the Global CGE Model (GTAP), the simulation results are consistent with the theory of pro-competitive effects. The greater gain from trade is obtained in the CGE model assuming IC-IRTS compared to PC-CRTS. China gains the greatest benefit, indicated by the highest increase in welfare and GDP, followed by Japan and Australia. DOI: 10.15408/sjie.v6i2.5279

  19. Heterogeneous Computing in Economics: A Simplified Approach

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of the C++ Accelerated Massive Parallelism recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011) we document a speed gain together with a simplified programming style that naturally enables parallelization.

  20. Economic modeling using artificial intelligence methods

    CERN Document Server

    Marwala, Tshilidzi

    2013-01-01

    This book examines the application of artificial intelligence methods to model economic data. It addresses causality and proposes new frameworks for dealing with this issue. It also applies evolutionary computing to model evolving economic environments.

  1. Create full-scale predictive economic models on ROI and innovation with performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, Earl C. [IDC Research, Inc., Framingham, MA (United States); Conway, Steve [IDC Research, Inc., Framingham, MA (United States)

    2017-10-27

    The U.S. Department of Energy (DOE), the world's largest buyer and user of supercomputers, awarded IDC Research, Inc. a grant to create two macroeconomic models capable of quantifying, respectively, financial and non-financial (innovation) returns on investments in HPC resources. Following a 2013 pilot study in which we created the models and tested them on about 200 real-world HPC cases, DOE authorized us to conduct a full-out, three-year grant study to collect and measure many more examples, a process that would also subject the methodology to further testing and validation. A secondary, "stretch" goal of the full-out study was to advance the methodology from association toward (but not all the way to) causation, by eliminating the effects of some of the other factors that might be contributing, along with HPC investments, to the returns produced in the investigated projects.

  2. Agent-based computational economics using NetLogo

    CERN Document Server

    Damaceanu, Romulus-Catalin

    2013-01-01

Agent-based Computational Economics using NetLogo explores how researchers can create, use and implement multi-agent computational models in economics using the NetLogo software platform. Problems of economic science can be solved using multi-agent modelling (MAM). This technique uses a computer model to simulate the actions and interactions of autonomous entities in a network, in order to analyze the effects on the entire economic system. MAM combines elements of game theory, complex systems, emergence and evolutionary programming. The Monte Carlo method is also used in this e-book to introduc

  3. Models of Economic Analysis

    Directory of Open Access Journals (Sweden)

    Adrian Ioana

    2013-07-01

Full Text Available The article presents specific aspects of management and models for economic analysis. Thus, we present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, psychological analysis. Also we present the main objects of the analysis: the technological activity analysis of a company, the analysis of the production costs, the economic activity analysis of a company, the analysis of equipment, the analysis of labor productivity, the analysis of the goods flow. Keywords: Economic analysis, Models, Adoption of innovation

  4. To test or to treat? An analysis of influenza testing and antiviral treatment strategies using economic computer modeling.

    Directory of Open Access Journals (Sweden)

    Bruce Y Lee

Full Text Available BACKGROUND: Due to the unpredictable burden of pandemic influenza, the best strategy to manage testing, such as rapid or polymerase chain reaction (PCR) tests, and antiviral medications for patients who present with influenza-like illness (ILI) is unknown. METHODOLOGY/PRINCIPAL FINDINGS: We developed a set of computer simulation models to evaluate the potential economic value of seven strategies under seasonal and pandemic influenza conditions: (1) using clinical judgment alone to guide antiviral use, (2) using PCR to determine whether to initiate antivirals, (3) using a rapid (point-of-care) test to determine antiviral use, (4) using a combination of a point-of-care test and clinical judgment, (5) using clinical judgment and confirming the diagnosis with PCR testing, (6) treating all with antivirals, and (7) not treating anyone with antivirals. For healthy younger adults (<65 years old) and older adults (≥65 years old), in both seasonal and pandemic influenza scenarios, employing PCR was the most cost-effective option, with the closest competitor being clinical judgment (when judgment accuracy ≥ 50%). Point-of-care testing plus clinical judgment was cost-effective with higher probabilities of influenza. Treating all symptomatic ILI patients with antivirals was cost-effective only in older adults. CONCLUSIONS/SIGNIFICANCE: Our study delineated the conditions under which different testing and antiviral strategies may be cost-effective, showing the importance of accuracy, as seen with PCR or highly sensitive clinical judgment.
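The underlying comparison is an expected-cost calculation over test outcomes. A heavily simplified sketch covering three of the seven strategies; all costs, probabilities, and test characteristics below are hypothetical placeholders, not the paper's inputs:

```python
def expected_cost(strategy, p_flu, *, c_pcr=20.0, c_antiviral=100.0,
                  c_untreated_flu=300.0, sensitivity=0.95, specificity=0.95):
    """Expected cost per ILI patient under three simplified strategies:
      'treat_all'  - antivirals for everyone
      'treat_none' - antivirals for no one
      'pcr'        - test everyone, treat only positives
    True positives and false positives incur antiviral cost; false
    negatives incur the cost of untreated influenza."""
    if strategy == "treat_all":
        return c_antiviral
    if strategy == "treat_none":
        return p_flu * c_untreated_flu
    if strategy == "pcr":
        p_tp = p_flu * sensitivity
        p_fn = p_flu * (1 - sensitivity)
        p_fp = (1 - p_flu) * (1 - specificity)
        return c_pcr + (p_tp + p_fp) * c_antiviral + p_fn * c_untreated_flu
    raise ValueError(strategy)

costs = {s: expected_cost(s, p_flu=0.3)
         for s in ("treat_all", "treat_none", "pcr")}
best = min(costs, key=costs.get)
```

With these placeholder numbers PCR happens to win, echoing the abstract's finding; a full analysis would add the remaining strategies, health outcomes (e.g. QALYs), and sensitivity analysis over the inputs.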

  5. Models of Economic Analysis

    OpenAIRE

    Adrian Ioana; Tiberiu Socaciu

    2013-01-01

    The article presents specific aspects of management and models for economic analysis. Thus, we present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, psychological analysis. Also we present the main object of the analysis: the technological activity analysis of a company, the analysis of the production costs, the economic activity analysis of a company, the analysis of equipment, the analysis of labor productivity, the anal...

  6. Computer-aided system for health economic evaluation.

    Science.gov (United States)

    Lin, Wen-Chou; Yen, Amy Ming-Fang; Chang, Chi-Ming; Chao, Chih-Hsung; Chen, Tony Hsiu-Hsi

    2009-10-01

Health policy makers are often hindered by the complicated infrastructure and intensive computation involved in economic evaluation. It is therefore valuable to develop a computer-aided tool that helps health personnel perform economic evaluation with ease. The infrastructure for economic evaluation was first designed. A Markov process with micro-simulation was applied to model the disease natural history or lifetime sequelae and to project effectiveness by comparing all possible decisions. All the essential elements of economic evaluation, together with sensitivity analysis, are encoded in this computer-aided software, written in SAS Screen Control Language with a user-defined menu style. ILLUSTRATION: Screening versus no screening for colorectal cancer was used as an example. The computer-aided model for economic evaluation was developed in this study. It is anticipated that its flexibility and user-defined menu style will facilitate the wide application of economic evaluation to health care intervention programs.
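The core engine described, a Markov process projecting effectiveness under different decisions, can be sketched as a three-state cohort model. The transition probabilities and the assumed effect of screening below are hypothetical, not the colorectal-cancer figures used in the study:

```python
def markov_cohort(p_progress, p_death_sick, cycles=40, screened=False):
    """Three-state Markov cohort (well -> sick -> dead). Screening is
    assumed, purely for illustration, to halve the progression
    probability. Returns life-years accumulated per person."""
    well, sick, dead = 1.0, 0.0, 0.0
    if screened:
        p_progress *= 0.5
    life_years = 0.0
    for _ in range(cycles):
        life_years += well + sick          # alive states earn a life-year
        new_sick = well * p_progress
        new_dead = sick * p_death_sick
        well -= new_sick
        sick += new_sick - new_dead
        dead += new_dead
    return life_years

ly_screen = markov_cohort(0.05, 0.2, screened=True)
ly_no = markov_cohort(0.05, 0.2, screened=False)
incremental_effect = ly_screen - ly_no
# an ICER would divide incremental cost by this incremental effect
```

Micro-simulation, as in the paper, would track individual patients stochastically instead of cohort fractions, which allows patient-level heterogeneity.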

  7. Economic Model For a Return on Investment Analysis of United States Government High Performance Computing (HPC) Research and Development (R & D) Investment

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, Earl C. [IDC Research Inc., Framingham, MA (United States); Conway, Steve [IDC Research Inc., Framingham, MA (United States); Dekate, Chirag [IDC Research Inc., Framingham, MA (United States)

    2013-09-30

This study investigated how high-performance computing (HPC) investments can improve economic success and increase scientific innovation. This research focused on the common good and provided uses for DOE, other government agencies, industry, and academia. The study created two unique economic models and an innovation index: (1) a macroeconomic model that depicts the way HPC investments result in economic advancements in the form of ROI in revenue (GDP), profits (and cost savings), and jobs; (2) a macroeconomic model that depicts the way HPC investments result in basic and applied innovations, looking at variations by sector, industry, country, and organization size; and (3) a new innovation index that provides a means of measuring and comparing innovation levels. Key findings of the pilot study include: IDC collected the required data across a broad set of organizations, with enough detail to create these models and the innovation index. The research also developed an expansive list of HPC success stories.

  8. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine an approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon has a mathematical description of its behavior, from which an economic-mathematical model is drawn up in the following stages: formulation of the problem, analysis of the process being modeled, production of the model, and design verification, validation and implementation of the model. This article presents an economic model and its modeling using mathematical equations and the MatLab software package, which helps us approximate an effective solution. The input data are the net cost, the direct and total cost, and the link between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with a graphic representation and an interpretation of the results achieved in terms of our specific problem.

  9. Economic communication model set

    Science.gov (United States)

    Zvereva, Olga M.; Berg, Dmitry B.

    2017-06-01

    This paper details findings from research investigating economic communications using agent-based models. The agent-based model set was engineered to simulate economic communications. Money in the form of internal and external currencies was introduced into the models to support exchanges in communications. Every model, while based on the same general concept, has its own peculiarities in algorithm and input data set, since each was engineered to solve a specific problem. Several data sets of different origin were used in the experiments: theoretical sets were estimated on the basis of the static Leontief equilibrium equation, and the real set was constructed on the basis of statistical data. During the simulation experiments, the communication process was observed in dynamics and system macroparameters were estimated. This research confirmed that the combination of an agent-based and a mathematical model can produce a synergetic effect.
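As a hedged illustration of the kind of agent-based exchange simulation described (the agent count, endowment, and random-pairing rule are all invented here, not taken from the paper), a minimal model with a conserved internal currency might look like:

```python
import random

# Minimal agent-based exchange sketch (all parameters hypothetical):
# agents hold an internal currency and trade in random pairs; the total
# money supply is conserved, which can be checked as a macroparameter.
def simulate(n_agents=50, n_steps=2000, endowment=100.0, seed=1):
    rng = random.Random(seed)
    money = [endowment] * n_agents
    for _ in range(n_steps):
        i, j = rng.sample(range(n_agents), 2)  # pick a random pair of agents
        amount = rng.uniform(0.0, money[i])    # i pays j part of its balance
        money[i] -= amount
        money[j] += amount
    return money

balances = simulate()
print(round(sum(balances), 6))  # total money supply after the run
```

Observing the evolving distribution of `balances` over time is the kind of "communication process in dynamics" the paper refers to.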

  10. Economic Modelling in Institutional Economic Theory

    National Research Council Canada - National Science Library

    Wadim Strielkowski; Evgeny Popov

    2017-01-01

    Our paper is centered around the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory...

  11. Energy-economic policy modeling

    Science.gov (United States)

    Sanstad, Alan H.

    2018-01-01

    Computational models based on economic principles and methods are powerful tools for understanding and analyzing problems in energy and the environment and for designing policies to address them. Among their other features, some current models of this type incorporate information on sustainable energy technologies and can be used to examine their potential role in addressing the problem of global climate change. The underlying principles and the characteristics of the models are summarized, and examples of this class of model and their applications are presented. Modeling epistemology and related issues are discussed, as well as critiques of the models. The paper concludes with remarks on the evolution of the models and possibilities for their continued development.

  12. Economic Modelling in Institutional Economic Theory

    Directory of Open Access Journals (Sweden)

    Wadim Strielkowski

    2017-06-01

    Full Text Available Our paper is centered around the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of institutional modelling that are increasingly postulated by the classics of institutional theory and find their way into the foundations of institutional economics. We propose scientific ideas concerning new innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of our own original research, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research for formalizing our results and maximising the practical outcome of our paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institutional-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.

  13. The Development of a Distributive Interactive Computing Model in Consumer Economics, Utilizing Jerome S. Bruner's Theory of Instruction.

    Science.gov (United States)

    Morrison, James L.

    A computerized delivery system in consumer economics developed at the University of Delaware uses the PLATO system to provide a basis for analyzing consumer behavior in the marketplace. The 16 sequential lessons, part of the Consumer in the Marketplace Series (CMS), demonstrate consumer economic theory in layman's terms and are structured to focus…

  14. HOMER Economic Models - US Navy

    Energy Technology Data Exchange (ETDEWEB)

    Bush, Jason William [Idaho National Lab. (INL), Idaho Falls, ID (United States); Myers, Kurt Steven [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-02-01

    This letter report has been prepared by Idaho National Laboratory for US Navy NAVFAC EXWC to support testing of pre-commercial SIREN (Simulated Integration of Renewable Energy Networks) computer software models. In logistics mode, SIREN software simulates the combination of renewable power sources (solar arrays, wind turbines, and energy storage systems) in supplying an electrical demand. NAVFAC EXWC will create SIREN logistics models of existing or planned renewable energy projects at five Navy locations (San Nicolas Island, AUTEC, New London, & China Lake), and INL will deliver additional HOMER computer models for comparative analysis. In transient mode, SIREN simulates the short time-scale variation of electrical parameters when a power outage or other destabilizing event occurs. In the HOMER model, a variety of inputs are entered, such as location coordinates, generators, PV arrays, wind turbines, batteries, converters, grid costs/usage, solar resources, wind resources, temperatures, fuels, and electric loads. HOMER's optimization and sensitivity analysis algorithms then evaluate the economic and technical feasibility of these technology options and account for variations in technology costs, electric load, and energy resource availability. The Navy can then compare HOMER's optimization and sensitivity results to those of the SIREN model. The U.S. Department of Energy (DOE) Idaho National Laboratory (INL) possesses unique expertise and experience in the software, hardware, and systems design for the integration of renewable energy into the electrical grid. NAVFAC EXWC will draw upon this expertise to complete mission requirements.

  15. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  16. Economic Growth Models Transition

    Directory of Open Access Journals (Sweden)

    Coralia Angelescu

    2006-03-01

    Full Text Available The transitional recession in the countries of Eastern Europe has been much longer than expected. The legacy of the past and recent policy mistakes have both contributed to the slow progress. As structural reforms and gradual institution building have taken hold, the post-socialist economies have started to recover, with some leading countries building momentum toward faster growth. There is a possibility that, in the wider context of globalization, several of these emerging market economies will be able to catch up with the more advanced industrial economies in a matter of one or two generations. Over the past few years, most candidate countries have made progress in the transition to a competitive market economy, macroeconomic stabilization and structural reform. However, their income levels have remained far below those in the Member States. Measured by per capita income in purchasing power standards, there has been a very limited amount of catching up over the past fourteen years. We first outline the distinctions between the Solow-Swan model and endogenous growth models, and then discuss the interdependence between transition and integration. Finally, some measures of macroeconomic policy for sustainable growth are proposed in correlation with the real macroeconomic situation of the Romanian economy. Our study considers real convergence for the Romanian economy and recommends adequate policies to achieve fast real convergence and sustainable growth.

  17. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biology...

  18. Computer Security Models

    Science.gov (United States)

    1984-09-01

    September 1984, MTR9531. J. K. Millen and C. M. Cerniglia, Computer Security Models. Contract sponsor: OUSDRE/C3I & ESD/ALEE. ... given in Section 5, in the form of questions and answers about security modeling. A glossary of terms used in the context of computer security is ... model, so we will not be able to pursue it in this report. MODEL CHARACTERISTICS: Computer security models are engineering models, giving them somewhat

  19. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis, numerical results are presented, comparisons...

  20. Economic modeling of HIV treatments.

    Science.gov (United States)

    Simpson, Kit N

    2010-05-01

    To review the general literature on microeconomic modeling and the key points that must be considered in the general assessment of economic modeling reports; to discuss the evolution of HIV economic models and identify models that illustrate this development over time, as well as examples of current studies; and to recommend improvements in HIV economic modeling. Recent economic modeling studies of HIV include examinations of scaling up antiretroviral (ARV) therapy in South Africa, screening prior to use of abacavir, preexposure prophylaxis, early start of ARV therapy in developing countries, and cost-effectiveness comparisons of specific ARV drugs using data from clinical trials. These studies all used extensively published second-generation Markov models in their analyses. There have been attempts to simplify approaches to cost-effectiveness estimates by using simple decision trees or cost-effectiveness calculations with short time horizons. However, these approaches leave out important cumulative economic effects that will not appear early in a treatment. Many economic modeling studies were identified in the 'gray' literature, but limited descriptions precluded an assessment of their adherence to modeling guidelines, and thus of the validity of their findings. There is a need to develop third-generation models that accommodate new knowledge about adherence, adverse effects, and viral resistance.
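A minimal sketch of the second-generation Markov cohort approach mentioned above may clarify why short-horizon calculations miss cumulative effects. All states, transition probabilities, costs, and utilities below are hypothetical, not taken from any cited study:

```python
# Hypothetical three-state Markov cohort model (stable / progressed / dead)
# comparing two ARV strategies; costs and utilities are invented per-cycle values.
def run_cohort(trans, costs, utils, cycles=120, cohort=1000.0):
    dist = [cohort, 0.0, 0.0]           # everyone starts in the stable state
    total_cost = total_qaly = 0.0
    for _ in range(cycles):
        total_cost += sum(d * c for d, c in zip(dist, costs))
        total_qaly += sum(d * u for d, u in zip(dist, utils))
        # advance the cohort one cycle through the transition matrix
        dist = [sum(dist[i] * trans[i][j] for i in range(3)) for j in range(3)]
    return total_cost, total_qaly

# invented monthly transition matrices: "new" keeps more patients stable
std = [[0.970, 0.020, 0.010], [0.0, 0.95, 0.05], [0.0, 0.0, 1.0]]
new = [[0.985, 0.010, 0.005], [0.0, 0.96, 0.04], [0.0, 0.0, 1.0]]
c_std, q_std = run_cohort(std, costs=[800, 2000, 0], utils=[0.07, 0.04, 0.0])
c_new, q_new = run_cohort(new, costs=[1100, 2000, 0], utils=[0.07, 0.04, 0.0])
icer = (c_new - c_std) / (q_new - q_std)   # incremental cost per QALY gained
print(round(icer))
```

Running the cohort for many cycles rather than a short horizon is exactly what captures the cumulative cost and benefit differences the review emphasizes.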

  1. Interface of Computation, Game Theory, and Economics (Dagstuhl Seminar 13161)

    OpenAIRE

    Hart, Sergiu; Tardos, Éva; von Stengel, Bernhard

    2013-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 13161 "Interface of Computation, Game Theory, and Economics". The workshop was strongly interdisciplinary, on the leading edge of current topics generally connected to algorithmic game theory: Mechanism design and auctions, interactions in networks, social models, and dynamics and equilibrium in games and markets. We summarize these topics, give the talk abstracts, and comment on experiences related to the organization ...

  2. Rational Expectations and Economic Models.

    Science.gov (United States)

    Sheffrin, Steven M.

    1980-01-01

    Examines how rational expectation models can help describe and predict trends within an economy and explains research needs within the discipline of economics which will enable economists to make more valid predictions. (DB)

  3. The Japanese Economic Model: JEM

    OpenAIRE

    Fujiwara, Ippei; Hara, Naoko; Hirose, Yasuo; Teranishi, Yuki

    2004-01-01

    In this paper, we set out the Japanese Economic Model (JEM), a large-scale macroeconomic model of the Japanese economy. Although the JEM is a theoretical model designed with a view to overcoming the Lucas (1976) critique of traditional large-scale macroeconomic models, it can also be used for both projection and simulation analysis. This is achieved by embedding a mechanism within which "short-run dynamics," basically captured by a vector autoregression model, eventually converge to a "short...

  4. Global Economic Models

    DEFF Research Database (Denmark)

    Fontoynont, Marc; de Boer, Jan; Rötlander, Johan

    This document presents financial data relating to lighting installations before and after retrofit operations. Data are calculated over a large number of years to combine installation costs, maintenance, and energy use. The general principle was to compare the running costs of the “do nothing...... of Ownership (TCO) of lighting installations has been calculated for various types of buildings: offices, schools, homes and industrial buildings. The data we supply attempt to answer the following questions: Which installations are low-hanging fruit (with the shortest payback time)? For which type...... of building are retrofit operations more profitable? How do various parameters influence the payback time (investment costs, efficacy of luminaires and sources, cost of electricity, etc.)? We then investigated various financial models to initiate successful investments in retrofit operations, Direct...

  5. Airline return-on-investment model for technology evaluation. [computer program to measure economic value of advanced technology applied to passenger aircraft

    Science.gov (United States)

    1974-01-01

    This report presents the derivation, description, and operating instructions for a computer program (TEKVAL) which measures the economic value of advanced technology features applied to long-range commercial passenger aircraft. The program consists of three modules: an airplane sizing routine, a direct operating cost (DOC) routine, and an airline return-on-investment (ROI) routine. These modules are linked such that they may be operated sequentially or individually, with one routine generating the input for the next, or with the option of externally specifying the input for either of the economic routines. A very simple airplane sizing technique was previously developed, based on the Breguet range equation. For this program, that sizing technique has been greatly expanded and combined with the formerly separate DOC and ROI programs to produce TEKVAL.
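The Breguet range equation underlying the original sizing routine can be sketched as follows; the cruise speed, TSFC, and lift-to-drag values are illustrative only, not TEKVAL inputs:

```python
import math

# Jet-cruise form of the Breguet range equation:
# Range = (V / c) * (L/D) * ln(W0 / W1),
# where V is cruise speed, c the thrust-specific fuel consumption,
# L/D the lift-to-drag ratio, and W0/W1 the initial/final weight ratio.
def breguet_range_km(v_kmh, tsfc_per_h, lift_to_drag, w_initial, w_final):
    return (v_kmh / tsfc_per_h) * lift_to_drag * math.log(w_initial / w_final)

# e.g. 900 km/h cruise, TSFC 0.6 1/h, L/D of 17, 10% of weight burned as fuel
r = breguet_range_km(900.0, 0.6, 17.0, w_initial=1.0, w_final=0.9)
print(round(r), "km")
```

Inverting this relation for the fuel weight needed to fly a required range is the essence of a simple sizing routine of the kind the report describes expanding.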

  6. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available This article considers a number of methods for the mathematical modelling of economic processes and the possibilities of using Excel spreadsheets to obtain optimum solutions to problems or to calculate financial operations with the help of built-in functions.

  7. Economic-mathematical methods and models under uncertainty

    CERN Document Server

    Aliyev, A G

    2013-01-01

    Brief Information on Finite-Dimensional Vector Space and its Application in Economics. Bases of Piecewise-Linear Economic-Mathematical Models with Regard to Influence of Unaccounted Factors in Finite-Dimensional Vector Space. Piecewise-Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence in Three-Dimensional Vector Space. Piecewise-Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence on a Plane. Bases of Software for Computer Simulation and Multivariant Prediction of Economic Even at Uncertainty Conditions on the Base of N-Comp

  8. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  9. Coupling Climate Models and Forward-Looking Economic Models

    Science.gov (United States)

    Judd, K.; Brock, W. A.

    2010-12-01

    Authors: Dr. Kenneth L. Judd, Hoover Institution, and Prof. William A. Brock, University of Wisconsin. Current climate models range from General Circulation Models (GCM's) with millions of degrees of freedom to models with few degrees of freedom. Simple Energy Balance Climate Models (EBCM's) help us understand the dynamics of GCM's. The same is true in economics with Computable General Equilibrium Models (CGE's), where some models are infinite-dimensional multidimensional differential equations but some are simple models. Nordhaus (2007, 2010) couples a simple EBCM with a simple economic model. One- and two-dimensional EBCM's do better at approximating damages across the globe and positive and negative feedbacks from anthropogenic forcing (North et al. (1981), Wu and North (2007)). A proper coupling of climate and economic systems is crucial for arriving at effective policies. Brock and Xepapadeas (2010) have used Fourier/Legendre based expansions to study the shape of socially optimal carbon taxes over time at the planetary level in the face of damages caused by polar ice cap melt (as discussed by Oppenheimer, 2005), but in only a "one dimensional" EBCM. Economists have used orthogonal polynomial expansions to solve dynamic, forward-looking economic models (Judd, 1992, 1998). This presentation will couple EBCM climate models with basic forward-looking economic models, and examine the effectiveness and scaling properties of alternative solution methods. We will use a two-dimensional EBCM model on the sphere (Wu and North, 2007) and a multicountry, multisector regional model of the economic system. Our aim will be to gain insights into the intertemporal shape of the optimal carbon tax schedule, and its impact on global food production, as modeled by Golub and Hertel (2009). We will initially have limited computing resources and will need to focus on highly aggregated models. However, this will be more complex than existing models with forward
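At the very simplest end of the modeling spectrum described above, a zero-dimensional energy balance model can be coupled to a quadratic damage function and iterated to equilibrium. All constants below are hypothetical, and this is vastly cruder than the two-dimensional spherical EBCM the abstract discusses:

```python
# Zero-dimensional energy balance sketch: C dT/dt = F - lambda * T,
# integrated by forward Euler with dt = 1 (all constants invented).
def step_temperature(T, forcing, heat_capacity=10.0, feedback=1.2):
    return T + (forcing - feedback * T) / heat_capacity

# Nordhaus-style quadratic damage share of economic output (coefficient invented)
def damage_fraction(T, a=0.002):
    return a * T * T

T = 0.0
for year in range(200):                      # relax toward equilibrium T = F / lambda
    T = step_temperature(T, forcing=3.7)     # constant anthropogenic forcing
print(round(T, 2), round(damage_fraction(T), 4))
```

In a forward-looking economic model, the damage fraction at each date would feed back into output and hence into the optimal carbon tax; here it is only evaluated at the climate equilibrium.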

  10. Computational human body models

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Dommelen, J.A.W. van

    2005-01-01

    Computational human body models are widely used for automotive crash-safety research and design and as such have significantly contributed to a reduction of traffic injuries and fatalities. Currently, crash simulations are mainly performed using models based on crash dummies. However, crash dummies

  11. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  12. CMS Computing Model Evolution

    CERN Document Server

    Grandi, Claudio; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined within the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  13. Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology: Connections

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); M.J. McAleer (Michael); W.-K. Wong (Wing-Keung)

    2018-01-01

    The paper provides a review of the literature that connects Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology, and discusses some research that is related to the seven disciplines. Academics could develop theoretical models and subsequent

  14. Computationally modeling interpersonal trust

    Science.gov (United States)

    Lee, Jin Joo; Knox, W. Bradley; Wormwood, Jolie B.; Breazeal, Cynthia; DeSteno, David

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust. PMID:24363649

  15. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    Full Text Available We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.

  16. Model space of economic events

    Science.gov (United States)

    Romanovsky, M. Yu.

    A method for constructing the model, or virtual, space of economic events, in which economic objects can be treated as material ones, is suggested. Changes of share prices over time at stock markets are described as the potential difference over time between attracting bodies in this virtual space. Each share of each enterprise is represented by a single particle with a unit “charge”. It is shown that the random value of the potential difference at the origin of coordinates, measured over a definite time interval, has a probability density coinciding with the known distribution of “Levy flights” or “Levy walks”. A distribution of the change in time of the “Standard and Poor” index value obtained by Mantegna and Stanley (who showed that it is a “Levy walks” distribution too) (Mantegna and Stanley, Nature 376 (1995) 46) is used to determine the dependence of the introduced potential on coordinates in the model space. A simple phenomenological model of the interaction potential is introduced. The potential law of each particle turns out to be close to r^(-2.14) in the minimum possible three-dimensional model space. This model permits calculation of the time of random potential correlations at a certain point of the model space. These correlations could characterize the time period of making a decision by an investor at the stock exchange. It is shown that this time is notably shorter in unstable periods (1987). A “microscopic” model of interaction in the virtual space is also discussed.
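The heavy-tailed ("Levy-like") behaviour invoked above can be illustrated with a simple inverse-transform sampler for a Pareto tail; the tail exponent below is arbitrary, not a value fitted in the paper:

```python
import random

# Inverse-transform sampling of a Pareto tail, P(X > x) ~ x**(-alpha):
# for U uniform on [0, 1), X = (1 - U)**(-1/alpha) has that tail.
def pareto_sample(alpha, rng):
    u = rng.random()
    return (1.0 - u) ** (-1.0 / alpha)

rng = random.Random(42)
samples = sorted(pareto_sample(1.5, rng) for _ in range(100_000))
# heavy tail: the empirical maximum dwarfs the median by orders of magnitude
print(samples[len(samples) // 2], samples[-1])
```

A Gaussian sample of the same size would show no such gap between median and maximum, which is the qualitative signature of Levy-flight-type distributions.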

  17. A method for the assessment of site-specific economic impacts of commercial and industrial biomass energy facilities. A handbook and computer model

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-01

    A handbook on "A Method for the Assessment of Site-specific Economic Impacts of Industrial and Commercial Biomass Energy Facilities" has been prepared by Resource Systems Group Inc. under contract to the Southeastern Regional Biomass Energy Program (SERBEP). The handbook includes a user-friendly Lotus 123 spreadsheet which calculates the economic impacts of biomass energy facilities. The analysis uses a hybrid approach, combining direct site-specific data provided by the user with indirect impact multipliers from the US Forest Service IMPLAN input/output model for each state. Direct economic impacts are determined primarily from site-specific data, and indirect impacts are determined from the IMPLAN multipliers. The economic impacts are given in terms of income, employment, and state and federal taxes generated directly by the specific facility and by the indirect economic activity associated with each project. A worksheet is provided which guides the user in identifying and entering the appropriate financial data on the plant to be evaluated. The IMPLAN multipliers for each state are included in a database within the program. The multipliers are applied automatically after the user has entered the site-specific data and the state in which the facility is located. Output from the analysis includes a summary of direct and indirect income, employment and taxes. Case studies of large and small wood energy facilities and an ethanol plant are provided as examples to demonstrate the method. Although the handbook and program are intended for use by those with no previous experience in economic impact analysis, suggestions are given for the more experienced user who may wish to modify the analysis techniques.
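In its simplest form, the hybrid calculation described above reduces to scaling a site-specific direct impact by a state-level multiplier. A sketch (the multiplier value is a placeholder, not an actual IMPLAN coefficient):

```python
# Hybrid direct/indirect impact sketch: the user supplies the direct,
# site-specific figure; the state multiplier supplies the indirect share.
def total_impact(direct, multiplier):
    """Indirect impact = direct * (multiplier - 1); total = direct + indirect."""
    indirect = direct * (multiplier - 1.0)
    return direct + indirect

# e.g. $2.0M of direct income with a placeholder state income multiplier of 1.8
direct_income = 2_000_000.0
print(total_impact(direct_income, multiplier=1.8))
```

The spreadsheet applies one such multiplier per impact category (income, employment, taxes) for the selected state, which is why the user only has to enter the direct figures.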

  18. "Economic microscope": The agent-based model set as an instrument in an economic system research

    Science.gov (United States)

    Berg, D. B.; Zvereva, O. M.; Akenov, Serik

    2017-07-01

    To create a valid model of a social or economic system one must consider many parameters, conditions and restrictions. Such systems, and consequently the corresponding models, prove to be very complicated. The problem of engineering such system models cannot be solved with mathematical methods alone. A solution can be found in computer simulation. Simulation does not reject mathematical methods; mathematical expressions can become the foundation of a computer model. In this paper a set of agent-based computer models is discussed. All the models in the set simulate communications between productive agents, but every model is geared towards a specific goal and thus has its own algorithm and its own peculiarities. It is shown that computer simulation can discover new features of the agents' behavior that cannot be obtained by analytical solution of mathematical equations, and thus plays the role of a kind of economic microscope.

  19. The ATLAS Computing Model

    CERN Document Server

    Adams, D; Bee, C P; Hawkings, R; Jarp, S; Jones, R; Malon, D; Poggioli, L; Poulard, G; Quarrie, D; Wenaus, T

    2005-01-01

    The ATLAS Offline Computing Model is described. The main emphasis is on the steady state, when normal running is established. The data flow from the output of the ATLAS trigger system through processing and analysis stages is analysed, in order to estimate the computing resources, in terms of CPU power, disk and tape storage and network bandwidth, which will be necessary to guarantee speedy access to ATLAS data to all members of the Collaboration. Data Challenges and the commissioning runs are used to prototype the Computing Model and test the infrastructure before the start of LHC operation. The initial planning for the early stages of data-taking is also presented. In this phase, a greater degree of access to the unprocessed or partially processed raw data is envisaged.

  20. Rediscovering the Economics of Keynes in an Agent-Based Computational Setting

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    The aim of this paper is to use agent-based computational economics to explore the economic thinking of Keynes. Taking his starting point at the macroeconomic level, Keynes argued that economic systems are characterized by fundamental uncertainty - an uncertainty that makes rule-based behaviour...... in common with what we today call complex dynamic systems, and today we may apply the method of agent-based computational economics to the ideas of Keynes. The presented agent-based Keynesian model demonstrates, as argued by Keynes, that the economy can self-organize without relying on price movement...

  1. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 5. Chaos Modelling with Computers: Unpredictable Behaviour of ... Author affiliations: Balakrishnan Ramasamy, T S K V Iyer; Siemens Communication Software, 10th Floor, Raheja Towers, 26-27 M G Road, Bangalore 560 001, India.

  2. The Economics of Online Dating: A Course in Economic Modeling

    Science.gov (United States)

    Monaco, Andrew J.

    2018-01-01

    The author discusses the development of a unique course, The Economics of Online Dating. The course is an upper-level undergraduate course that combines intensive discussion, peer review, and economic theory to teach modeling skills to undergraduates. The course uses the framework of "online dating," interpreted broadly, as a point of…

  3. Understanding student computational thinking with computational modeling

    Science.gov (United States)

    Aiken, John M.; Caballero, Marcos D.; Douglas, Scott S.; Burk, John B.; Scanlon, Erin M.; Thoms, Brian D.; Schatz, Michael F.

    2013-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". 9th Grade students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than observational) terms tended to have more success in the programming exercise.
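The think-aloud task above (a ball in motion modelled in a high-level language) can be sketched outside VPython as well. This minimal Python version (launch parameters invented, air drag omitted) uses the Euler-Cromer update order often taught in introductory computational physics:

```python
import math

def simulate_ball(v0=30.0, angle_deg=45.0, dt=0.01, g=9.8):
    """Euler-Cromer integration of a ball launched from the ground;
    returns the horizontal range in metres."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while True:
        vy -= g * dt       # update velocity from the net (gravitational) force
        x += vx * dt       # then update position from the new velocity
        y += vy * dt
        if y < 0.0:
            return x
```

The causal ordering the study highlights is visible in the loop: the force changes the velocity, and the velocity changes the position, rather than position being computed from a closed-form observation.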

  4. Understanding Student Computational Thinking with Computational Modeling

    CERN Document Server

    Aiken, John M; Douglas, Scott S; Burk, John B; Scanlon, Erin M; Thoms, Brian D; Schatz, Michael F

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than obs...

  5. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  6. Computationally modeling interpersonal trust

    OpenAIRE

    Lee, Jin Joo; Knox, W. Bradley; Wormwood, Jolie B.; Breazeal, Cynthia; DeSteno, David

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our pr...

  7. Economic growth rate management by soft computing approach

    Science.gov (United States)

    Maksimović, Goran; Jović, Srđan; Jovanović, Radomir

    2017-01-01

    Economic growth rate management is a very important process for improving the economic stability of any country. The main goal of the study was to manage the impact of agriculture, manufacturing, industry and services on economic growth rate prediction. A soft computing methodology was used to select the influence of these inputs on the economic growth rate prediction. It is known that economic growth may develop on the basis of a combination of different factors. Gross domestic product (GDP) was used as the economic growth indicator. It was found that services have the highest impact on the GDP growth rate; on the contrary, manufacturing has the smallest impact.
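The input-selection idea can be illustrated with a simple sensitivity ranking. The sketch below stands in for the paper's soft-computing method with a hypothetical linear surrogate model; the weights and base values are invented, chosen only so that the ranking mirrors the reported finding (services highest, manufacturing lowest):

```python
def gdp_model(agriculture, manufacturing, industry, services):
    """Hypothetical linear surrogate for a fitted soft-computing model;
    the weights are illustrative, not taken from the paper."""
    return (0.12 * agriculture + 0.08 * manufacturing
            + 0.20 * industry + 0.60 * services)

base = dict(agriculture=10.0, manufacturing=20.0, industry=25.0, services=45.0)

def sensitivity(name, delta=1.0):
    """Change in predicted GDP growth when one input rises by `delta` points."""
    bumped = dict(base, **{name: base[name] + delta})
    return gdp_model(**bumped) - gdp_model(**base)

# Rank inputs by how strongly a one-point bump moves the prediction.
ranking = sorted(base, key=sensitivity, reverse=True)
```

In the actual study the influence ranking is extracted from the trained model rather than from fixed weights, but the comparison step is the same: perturb one input, hold the rest, and order inputs by the response.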

  8. Economic modelling of pork production-marketing chains

    OpenAIRE

    Ouden, den, W.

    1996-01-01

    The research described in this thesis was focused on the development of economic simulation and optimization computer models to support decision making with respect to pork production- marketing chains. The models include three production stages: pig farrowing, pig fattening and pig slaughtering including cutting of carcasses. Transportation of live pigs between these stages was also considered. The pork chain simulation model was developed and described to simulate technical and economic per...

  9. Economic modelling under conditions of exploitation of cohesive construction minerals

    Directory of Open Access Journals (Sweden)

    Milan Mikoláš

    2011-01-01

    Managers of mining companies use advanced modelling methods and computer simulations for decision-making on the optimization of manufacturing processes. The article proposes and analyses a model of a mining company's production cycle consisting of a three-dimensional quarry model, a technology model and an economic-mathematical model. Based on the latter, an economic simulation model of a quarry has been created in the MS Excel program, currently available on all personal computers, which measures outputs in the form of changes in total and unit costs, according to the generic classification of costs, in response to changes in inputs in the form of technology equipment parameters and other operating parameters. Managers use the economic simulation model of the quarry as decision support to increase the profitability or competitiveness of their product in the construction minerals sector.
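The cost logic such a spreadsheet model encodes can be sketched in a few lines. The fixed and variable cost figures below are invented, not the article's data; the point is only the total-vs-unit-cost relationship as output changes:

```python
def quarry_costs(output_tonnes, fixed_cost, variable_cost_per_tonne):
    """Total and unit production cost for a given annual output."""
    total = fixed_cost + variable_cost_per_tonne * output_tonnes
    return total, total / output_tonnes

# Hypothetical quarry parameters: 50,000 EUR/yr fixed, 4 EUR/t variable.
results = {q: quarry_costs(q, 50_000, 4.0) for q in (10_000, 50_000, 100_000)}
```

As in the Excel model, unit cost falls as output rises because the fixed cost is spread over more tonnes, which is exactly the kind of response a manager reads off when varying input parameters.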

  10. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  11. Production economic models of fisheries

    DEFF Research Database (Denmark)

    Andersen, Jesper Levring

    The overall purpose of this PhD thesis is to investigate different aspects of fishermen’s behaviour using production economic models at the individual and industry levels. Three parts make up this thesis. The first part provides an overview of the thesis. The second part consists of four papers...... be too harsh, because the fishermen are operating in an uncertain environment with variations in fish stocks, weather, etc. Fishermen therefore seek in their ex ante decisions to cope with uncertainty. If the conditions are better than expected, this may result in some inputs not being used. In ex post...... efficiency evaluation, this is interpreted as inefficiency, although it was - in an ex ante perception - rational to bring the inputs along. This type of inefficiency can be denoted rational inefficiency. By further developing the method from Bogetoft and Hougaard (2003), an evaluation of 308 Danish fishing...

  12. Neurobiology of economic choice: a good-based model.

    Science.gov (United States)

    Padoa-Schioppa, Camillo

    2011-01-01

    Traditionally the object of economic theory and experimental psychology, economic choice recently became a lively research focus in systems neuroscience. Here I summarize the emerging results and propose a unifying model of how economic choice might function at the neural level. Economic choice entails comparing options that vary on multiple dimensions. Hence, while choosing, individuals integrate different determinants into a subjective value; decisions are then made by comparing values. According to the good-based model, the values of different goods are computed independently of one another, which implies transitivity. Values are not learned as such, but rather computed at the time of choice. Most importantly, values are compared within the space of goods, independent of the sensorimotor contingencies of choice. Evidence from neurophysiology, imaging, and lesion studies indicates that abstract representations of value exist in the orbitofrontal and ventromedial prefrontal cortices. The computation and comparison of values may thus take place within these regions.
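The core computational claim of the good-based model - values are computed independently per good and then compared as scalars in the space of goods - can be sketched directly. The goods, quantities and subjective weights below are invented for illustration:

```python
def subjective_value(quantity, weight):
    """Value of a good, computed independently of the other options."""
    return weight * quantity

# Hypothetical goods: (quantity offered, subjective weight per unit).
goods = {"juice": (2, 1.0), "water": (3, 0.5), "tea": (1, 1.2)}
values = {g: subjective_value(q, w) for g, (q, w) in goods.items()}

# Decisions compare scalar values in "good space"; because every choice
# reduces to comparing numbers, preferences are automatically transitive.
ranking = sorted(values, key=values.get, reverse=True)
```

The transitivity implication mentioned in the abstract falls out of the representation: if value(A) > value(B) and value(B) > value(C), then value(A) > value(C), with no reference to the sensorimotor contingencies of how the choice is reported.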

  13. NEUROBIOLOGY OF ECONOMIC CHOICE: A GOOD-BASED MODEL

    Science.gov (United States)

    Padoa-Schioppa, Camillo

    2012-01-01

    Traditionally the object of economic theory and experimental psychology, economic choice recently became a lively research focus in systems neuroscience. Here I summarize the emerging results and I propose a unifying model of how economic choice might function at the neural level. Economic choice entails comparing options that vary on multiple dimensions. Hence, while choosing, individuals integrate different determinants into a subjective value; decisions are then made by comparing values. According to the good-based model, the values of different goods are computed independently of one another, which implies transitivity. Values are not learned as such, but rather computed at the time of choice. Most importantly, values are compared within the space of goods, independent of the sensori-motor contingencies of choice. Evidence from neurophysiology, imaging and lesion studies indicates that abstract representations of value exist in the orbitofrontal and ventromedial prefrontal cortices. The computation and comparison of values may thus take place within these regions. PMID:21456961

  14. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  15. Confronting the Economic Model with the Data

    DEFF Research Database (Denmark)

    Johansen, Søren

    2006-01-01

    Econometrics is about confronting economic models with the data. In doing so it is crucial to choose a statistical model that not only contains the economic model as a submodel, but also contains the data generating process. When this is the case, the statistical model can be analyzed by likelihood...

  16. Computational modelling of polymers

    Science.gov (United States)

    Celarier, Edward A.

    1991-01-01

    Polymeric materials and polymer/graphite composites show a very diverse range of material properties, many of which make them attractive candidates for a variety of high-performance engineering applications. Their properties are ultimately determined largely by their chemical structure and the conditions under which they are processed. It is the aim of computational chemistry to be able to simulate candidate polymers on a computer and determine what their likely material properties will be. A number of commercially available software packages purport to predict the material properties of samples, given the chemical structures of their constituent molecules. One such system, Cerius, has been in use at LaRC. It is comprised of a number of modules, each of which performs a different kind of calculation on a molecule in the program's workspace. Of particular interest is evaluating the suitability of this program to aid in the study of microcrystalline polymeric materials. One of the first model systems examined was benzophenone. The results of this investigation are discussed.

  17. Economic modelling of pork production-marketing chains

    NARCIS (Netherlands)

    Ouden, den M.

    1996-01-01

    The research described in this thesis was focused on the development of economic simulation and optimization computer models to support decision making with respect to pork production- marketing chains. The models include three production stages: pig farrowing, pig fattening and pig slaughtering

  18. Dynamics of Population and Economic Growth: A Computer-Based Instruction Program.

    Science.gov (United States)

    Roh, Chaisung; Handler, Paul

    A computer-assisted instructional (CAI) program at the University of Illinois is used to teach the dynamics of population growth. Socio-economic models are also developed to show the consequences of population growth upon variables such as income, productivity, and the demand for food. A one-sex population projection model allows students to…

  19. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    I review some work on models of quantum computing, optical implementations of these models, and the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings, such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work on optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects of complexity theory needed to understand the results surveyed.

  20. The relationship between venture capital investment and macro economic variables via statistical computation method

    Science.gov (United States)

    Aygunes, Gunes

    2017-07-01

    The objective of this paper is to survey and determine the macroeconomic factors affecting the level of venture capital (VC) investments in a country. The literature relates venture capitalists' quality to countries' venture capital investments. This paper examines the relationship between venture capital investment and macroeconomic variables via a statistical computation method. We investigate the countries and macroeconomic variables and, by using this method, derive correlations between venture capital investments and the macroeconomic variables. According to a logistic regression model (logit model), the macroeconomic variables are correlated with each other in three groups. Venture capitalists regard these correlations as an indicator. Finally, we give the correlation matrix of our results.
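The correlation-matrix step described above can be sketched without any statistics library. The annual series below are invented for illustration (they are not the paper's data); the code just computes pairwise Pearson correlations between VC investment and two macro variables:

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical annual series for one country (values invented).
macro = {
    "vc_investment": [0.8, 1.1, 1.5, 2.0, 2.4],
    "gdp_growth":    [1.9, 2.2, 2.8, 3.5, 3.9],
    "interest_rate": [4.0, 3.5, 3.0, 2.2, 1.8],
}
corr = {(a, b): pearson(macro[a], macro[b]) for a in macro for b in macro}
```

In this invented example VC investment co-moves with GDP growth and moves against the interest rate; the paper's logit-model grouping would then cluster variables by the sign and strength of such correlations.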

  1. ECONOMIC-MATHEMATICAL CLUSTER’S MODELS

    Directory of Open Access Journals (Sweden)

    Nikolay Dmitriyevich Naydenov

    2015-11-01

    The article describes economic and mathematical models of cluster formations: a model of a city on a line, a model of network competition among consumers, a one-agent cluster model, a multi-agent game model of cluster growth, a model of the comprehensive income of cluster members, artificial neural networks, a balance cluster model, and a cluster stability model. The article shows that economic-mathematical modelling of clustering processes improves the forecasting, planning and evaluation of the level of clustering in a region. Purpose: show the level of development of economic and mathematical models as a tool for the analysis of clusters as integration associations in the regions. Methodology: economic-mathematical modelling, analysis, synthesis, comparison, statistical surveys. Results: a high level of research activity in the field of economic and mathematical modelling of cluster formations is revealed, and the essential characteristics of cluster formations are investigated using economic and mathematical models. Practical implications: the economic policy of regions, countries and municipalities.

  2. Network models in economics and finance

    CERN Document Server

    Pardalos, Panos; Rassias, Themistocles

    2014-01-01

    Using network models to investigate the interconnectivity in modern economic systems allows researchers to better understand and explain some economic phenomena. This volume presents contributions by known experts and active researchers in economic and financial network modeling. Readers are provided with an understanding of the latest advances in network analysis as applied to economics, finance, corporate governance, and investments. Moreover, recent advances in market network analysis  that focus on influential techniques for market graph analysis are also examined. Young researchers will find this volume particularly useful in facilitating their introduction to this new and fascinating field. Professionals in economics, financial management, various technologies, and network analysis, will find the network models presented in this book beneficial in analyzing the interconnectivity in modern economic systems.

  3. Value Concept and Economic Growth Model

    Directory of Open Access Journals (Sweden)

    Truong Hong Trinh

    2014-12-01

    This paper approaches the value-added method for Gross Domestic Product (GDP) measurement, which explains the interrelationship between the expenditure approach and the income approach. An economic growth model is also proposed, with three key elements: capital accumulation, technological innovation, and institutional reform. Although capital accumulation and technological innovation are two integrated elements driving economic growth, institutional reforms play a key role in creating incentives that affect the transitional and steady-state growth rates in the real-world economy. The paper provides theoretical insight into economic growth to aid understanding of the incentives and driving forces in the economic growth model.
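The interrelationship between the three GDP measurement approaches can be made concrete with a textbook two-firm example (all numbers invented): value added summed across firms equals final expenditure, which equals factor income:

```python
# Hypothetical two-firm economy illustrating the value-added method:
# value added = gross output minus intermediate inputs.
firms = [
    {"name": "farm",   "output": 100.0, "intermediates": 0.0},
    {"name": "bakery", "output": 150.0, "intermediates": 100.0},  # buys the wheat
]
gdp_value_added = sum(f["output"] - f["intermediates"] for f in firms)

# Expenditure approach: only final goods (the bread) enter GDP.
gdp_expenditure = 150.0

# Income approach: factor incomes (wages + profits) paid out in production.
wages, profits = 90.0, 60.0
gdp_income = wages + profits
```

Summing gross output instead (100 + 150 = 250) would double-count the wheat, which is exactly what the value-added method avoids.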

  4. Stated Preference Economic Development Model

    Science.gov (United States)

    2015-02-01

    [Only fragments of the survey instrument survive in this record's extracted text:] "…economic development in your community and Afghanistan. Do you give your consent for me to proceed?" (M-21, Informed Consent, tick box) "…dairy cows, and medicine for livestock. The villagers will form a council in order to estimate the costs and implementation of the project."

  5. Mathematical Models of Economic Growth

    NARCIS (Netherlands)

    J. Tinbergen (Jan); H.C. Bos (Henk)

    1962-01-01

    textabstractEconomics Handbook Series, edited by Seymour E. Harris. In Spanish: Modelos Matematicos del Crecimiento Económico, Series ‘Biblioteca de Ciencias Sociales’, Aguilar, Madrid, 1966, XVI + 165 p. In French: Modèles Mathématiques de Croissance Economique. Series ‘Techniques Economiques

  6. Assessing the economic impact of public investment in Malaysia: a case study on MyRapid Transit project using a dynamic computable general equilibrium model

    OpenAIRE

    Muniandy, Meenachi

    2017-01-01

    The central focus of this thesis is the question of whether public investment in transport infrastructure contributes positively to Malaysia’s economic growth and welfare. Although there are strong analytical reasons to believe that public investment spending is one of the important variables that influence growth, there remains significant uncertainty about its actual degree of influence. In Malaysia, whenever there is a collapse in domestic demand, government spending becomes an important m...

  7. Study on Chinese Low Carbon Economic Model

    OpenAIRE

    Huifeng Li; Xiaofang Wang

    2010-01-01

    Global warming, which is caused by the over-consumption of fossil energy during economic development in human society, threatens the global ecological balance, hampers social and economic development, imperils energy security, ecological safety, and water and food safety, and endangers the entire human race. The low carbon economy is a new economic development model based on low energy consumption, low pollution, and low emissions. This paper analyzes the status quo and limits of the Chinese low carbon ...

  8. Computational modeling in biomechanics

    CERN Document Server

    Mofrad, Mohammad

    2010-01-01

    This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamic and scale coupling methods..

  9. Big data, computational science, economics, finance, marketing, management, and psychology: connections

    OpenAIRE

    Chang, Chia-Lin; McAleer, Michael; Wong, Wing-Keung

    2018-01-01

    textabstractThe paper provides a review of the literature that connects Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology, and discusses some research that is related to the seven disciplines. Academics could develop theoretical models and subsequent econometric and statistical models to estimate the parameters in the associated models, as well as conduct simulation to examine whether the estimators in their theories on estimation and hypothesis testin...

  10. Geometric Modeling for Computer Vision

    Science.gov (United States)

    1974-10-01

    The main contribution of this thesis is the development of a three-dimensional geometric modeling system for application to computer vision. In computer vision, geometric models provide a goal for descriptive image analysis, an origin for verification image synthesis, and a context for spatial

  11. Mathematical Modeling and Computational Thinking

    Science.gov (United States)

    Sanford, John F.; Naidu, Jaideep T.

    2017-01-01

    The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable assistance in learning logical thinking but of less assistance when learning problem-solving skills. The paper is third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…

  12. Modelling Migration and Economic Agglomeration with Active Brownian Particles

    CERN Document Server

    Schweitzer, F

    1999-01-01

    We propose a stochastic dynamic model of migration and economic aggregation in a system of employed (immobile) and unemployed (mobile) agents which respond to local wage gradients. Dependent on the local economic situation, described by a production function which includes cooperative effects, employed agents can become unemployed and vice versa. The spatio-temporal distribution of employed and unemployed agents is investigated both analytically and by means of stochastic computer simulations. We find the establishment of distinct economic centers out of a random initial distribution. The evolution of these centers occurs in two different stages: (i) small economic centers are formed based on the positive feedback of mutual stimulation/cooperation among the agents, (ii) some of the small centers grow at the expense of others, which finally leads to the concentration of the labor force in different extended economic regions. This crossover to large-scale production is accompanied by an increase in the unemploy...

  13. Evaluation of trade influence on economic growth rate by computational intelligence approach

    Science.gov (United States)

    Sokolov-Mladenović, Svetlana; Milovančević, Milos; Mladenović, Igor

    2017-01-01

    In this study, the influence of trade parameters on economic growth forecasting accuracy was analyzed. A computational intelligence method was used for the analysis, since such methods can handle highly nonlinear data. It is known that economic growth can be modeled on the basis of different trade parameters. In this study five input parameters were considered: trade in services, exports of goods and services, imports of goods and services, trade, and merchandise trade. All these parameters were calculated as percentages of gross domestic product (GDP). The main goal was to select the parameters with the greatest impact on the economic growth percentage. GDP was used as the economic growth indicator. Results show that imports of goods and services have the highest influence on economic growth forecasting accuracy.

  14. Computational Efficiency of Economic MPC for Power Systems Operation

    DEFF Research Database (Denmark)

    Standardi, Laura; Poulsen, Niels Kjølstad; Jørgensen, John Bagterp

    2013-01-01

    In this work, we propose an Economic Model Predictive Control (MPC) strategy to operate power systems that consist of independent power units. The controller balances the power supply and demand, minimizing production costs. The control problem is formulated as a linear program that is solved...
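The one-period core of the linear program described above - meet demand at minimum production cost subject to unit capacities - has a simple closed-form solution: load units in order of marginal cost. The unit names, costs and capacities below are invented; a real economic MPC formulation would solve this over a prediction horizon with dynamics and constraints:

```python
def dispatch(units, demand):
    """Merit-order dispatch: load the cheapest units first; for this
    simple one-period structure this is the linear-program optimum."""
    plan, remaining = {}, demand
    for name, marginal_cost, capacity in sorted(units, key=lambda u: u[1]):
        produced = min(capacity, remaining)
        plan[name] = produced
        remaining -= produced
    if remaining > 1e-9:
        raise ValueError("demand exceeds total capacity")
    return plan

# Hypothetical units: (name, marginal cost per MWh, capacity in MW).
units = [("coal", 20.0, 50.0), ("gas", 35.0, 40.0), ("diesel", 80.0, 30.0)]
plan = dispatch(units, 70.0)
total_cost = sum(cost * plan[name] for name, cost, _ in units)
```

The controller in the paper re-solves such a program at every sample time with updated forecasts, which is what makes it a receding-horizon (MPC) strategy rather than a one-shot dispatch.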

  15. MOBILE COMPUTING USING ANDROID WITH AN EMPHASIZES ON ECONOMIC APPLICATION

    Directory of Open Access Journals (Sweden)

    Magnolia TILCA

    2016-12-01

    As a result of the convergence of computers and mobile phones, Android brings the great opportunity of implementing personal mobile applications in a Java language environment. The paper presents two main aspects of this new age of programming. The first aspect refers to the existing library of applications in the economic domain; the paper extracts the most useful financial and business apps that run on Android smartphones. The second aspect presents an example of building an economic application using one of Android's various IDEs (Integrated Development Environments), Android Studio. The developed app calculates a prognosis of business size starting from known statistical data sets and evaluates the effort of implementing the business, using the linear regression method. The app differs from other Android applications by the explicit answers that result from the regression calculus.

  16. Structural modelling of economic growth: Technological changes

    Directory of Open Access Journals (Sweden)

    Sukharev Oleg

    2016-01-01

    Neoclassical and Keynesian theories of economic growth assume the use of modified Cobb-Douglas functions and other aggregate econometric approaches to modelling growth dynamics. In that case, explanations of economic growth are based on the logic of the mathematical ratios used, often including a priori ideas about changes in aggregated values and factors. The idea of assessing factor productivity is fundamental among modern theories of economic growth. Nevertheless, the structural parameters of the economic system, institutions and technological changes are practically not considered within the known approaches, though the latter are reflected in the changing parameters of the production function. At the same time, on the one hand, the ratio of structural elements determines the future value of total factor productivity and, on the other hand, strongly influences the rate of economic growth and its mode of innovative dynamics. Putting the structural parameters of the economic system into growth models, with the possibility of assessing such modes under conditions of interaction between new and old combinations, is an essential step in the development of the theory of economic growth and development. It allows a policy of stimulating economic growth to be formed on the basis of the structural ratios and relations recognized for the economic system. In such models it is most convenient to use logistic functions demonstrating the resource change for the old and new combinations within the economic system. The result of the economy's development depends on the starting conditions and on the institutional parameters governing the velocity of resource borrowing in favour of the new combination and the creation of its own resource. The resource is modelled through the idea of investments into the new and old combinations.
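The logistic resource dynamics the abstract recommends can be sketched in a few lines. The growth rate, resource ceiling and starting share below are illustrative assumptions, not the article's parameters; the new combination grows logistically while the old combination holds the remaining share of a fixed resource:

```python
def simulate_substitution(steps, r=0.3, K=100.0, new0=1.0):
    """The new combination's resource follows a logistic curve (Euler steps);
    the old combination holds the remainder of the fixed resource K."""
    new = new0
    path = [(new, K - new)]
    for _ in range(steps):
        new += r * new * (1.0 - new / K)   # logistic growth increment
        path.append((new, K - new))
    return path

path = simulate_substitution(100)
new_final, old_final = path[-1]
```

The trajectory shows the crossover the article is concerned with: the new combination starts negligible, accelerates while borrowing resource from the old one, and saturates as it approaches the ceiling K.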

  17. JEDI: Jobs and Economic Development Impact Model

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-13

    The Jobs and Economic Development Impact (JEDI) models are user-friendly tools that estimate the economic impacts of constructing and operating power generation and biofuel plants at the local (usually state) level. First developed by NREL's researchers to model wind energy jobs and impacts, JEDI has been expanded to also estimate the economic impacts of biofuels, coal, conventional hydro, concentrating solar power, geothermal, marine and hydrokinetic power, natural gas, photovoltaics, and transmission lines. This fact sheet focuses on JEDI for wind energy projects and is revised with 2017 figures.

  18. Computational investigation of fluid flow and heat transfer of an economizer by porous medium approach

    Science.gov (United States)

    Babu, C. Rajesh; Kumar, P.; Rajamohan, G.

    2017-07-01

Fluid flow and heat transfer in an economizer are simulated by a porous medium approach, with plain tubes in a horizontal in-line, cross-flow arrangement, for a coal-fired thermal power plant. The economizer is a thermo-mechanical device that captures waste heat from the exhaust flue gases through heat transfer surfaces to preheat the boiler feed water. To evaluate the fluid flow and heat transfer over the tubes, a numerical analysis of heat transfer performance is carried out on a 110 t/h MCR (maximum continuous rating) boiler unit. Thermal performance is investigated through computational fluid dynamics (CFD) simulation using ANSYS FLUENT. The fouling factor ε and the overall heat transfer coefficient ψ are employed to evaluate the fluid flow and heat transfer. The model demands significant computational detail in geometric modeling, grid generation, and numerical calculation to evaluate the thermal performance of an economizer. The simulation results show that the overall heat transfer coefficient of 37.76 W/(m2K) and the economizer coil-side pressure drop of 0.2 kg/cm2 are in conformity with the tolerable limits of existing industrial economizer data.

  19. Computational modeling of microstructure

    OpenAIRE

    Luskin, Mitchell

    2003-01-01

    Many materials such as martensitic or ferromagnetic crystals are observed to be in metastable states exhibiting a fine-scale, structured spatial oscillation called microstructure; and hysteresis is observed as the temperature, boundary forces, or external magnetic field changes. We have developed a numerical analysis of microstructure and used this theory to construct numerical methods that have been used to compute approximations to the deformation of crystals with microstructure.

  20. Economic Modeling in Social Work Education

    Directory of Open Access Journals (Sweden)

    Barry R. Cournoyer

    2000-12-01

    Full Text Available Economic modeling provides academic administrators with a logical framework for analyzing costs associated with the processes involved in the delivery of social work education. The specific costs associated with activities such as teaching, research, and service may be determined for a school of social work as a whole or for specific responsibility centers (e.g., programs and services within the school. Economic modeling utilizes modern spreadsheet software that can be configured in relation to the idiosyncratic needs and budgeting strategies that exist in virtually all colleges and universities. As a versatile planning tool, it enables managers to identify specific “cost-drivers” that cause the occurrence of real costs in relation to designated programmatic initiatives. In addition, economic modeling provides academic planners and decision-makers a useful vehicle for considering the economic impact of various projected (“what if” scenarios.

  1. Computational models of syntactic acquisition.

    Science.gov (United States)

    Yang, Charles

    2012-03-01

The computational approach to syntactic acquisition can be fruitfully pursued by integrating results and perspectives from computer science, linguistics, and developmental psychology. In this article, we first review some key results in computational learning theory and their implications for language acquisition. We then turn to examine specific learning models, some of which exploit distributional information in the input while others rely on a constrained space of hypotheses, yet both approaches share a common set of characteristics to overcome the learning problem. We conclude with a discussion of how computational models connect with the empirical study of child grammar, making the case for computationally tractable, psychologically plausible and developmentally realistic models of acquisition. WIREs Cogn Sci 2012, 3:205-213. doi: 10.1002/wcs.1154 For further resources related to this article, please visit the WIREs website. Copyright © 2011 John Wiley & Sons, Ltd.

  2. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP has developed models to provide insights into spaceflight-related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the models.

  3. Economic Models as Devices of Policy Change

    DEFF Research Database (Denmark)

    Henriksen, Lasse Folke

    2013-01-01

Can the emergence of a new policy model be a catalyst for a paradigm shift in the overall interpretative framework of how economic policy is conducted within a society? This paper claims that models are understudied as devices used by actors to induce policy change. This paper explores the role of models in Danish economic policy, where, from the 1970s onwards, executive public servants in this area have exclusively been specialists in model design. To understand changes in economic policy, this paper starts with a discussion of whether the notion of paradigm shift is adequate. It then examines the extent to which the performativity approach can help identify macroscopic changes in policy from seemingly microscopic changes in policy models. The concept of performativity is explored as a means of thinking about the constitution of agency directed at policy change. The paper brings this concept......

  4. Hurricane Sandy Economic Impacts Assessment: A Computable General Equilibrium Approach and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Boero, Riccardo [Los Alamos National Laboratory; Edwards, Brian Keith [Los Alamos National Laboratory

    2017-08-07

    Economists use computable general equilibrium (CGE) models to assess how economies react and self-organize after changes in policies, technology, and other exogenous shocks. CGE models are equation-based, empirically calibrated, and inspired by Neoclassical economic theory. The focus of this work was to validate the National Infrastructure Simulation and Analysis Center (NISAC) CGE model and apply it to the problem of assessing the economic impacts of severe events. We used the 2012 Hurricane Sandy event as our validation case. In particular, this work first introduces the model and then describes the validation approach and the empirical data available for studying the event of focus. Shocks to the model are then formalized and applied. Finally, model results and limitations are presented and discussed, pointing out both the model degree of accuracy and the assessed total damage caused by Hurricane Sandy.

  5. Information security: where computer science, economics and psychology meet.

    Science.gov (United States)

    Anderson, Ross; Moore, Tyler

    2009-07-13

    Until ca. 2000, information security was seen as a technological discipline, based on computer science but with mathematics helping in the design of ciphers and protocols. That perspective started to change as researchers and practitioners realized the importance of economics. As distributed systems are increasingly composed of machines that belong to principals with divergent interests, incentives are becoming as important to dependability as technical design. A thriving new field of information security economics provides valuable insights not just into 'security' topics such as privacy, bugs, spam and phishing, but into more general areas of system dependability and policy. This research programme has recently started to interact with psychology. One thread is in response to phishing, the most rapidly growing form of online crime, in which fraudsters trick people into giving their credentials to bogus websites; a second is through the increasing importance of security usability; and a third comes through the psychology-and-economics tradition. The promise of this multidisciplinary research programme is a novel framework for analysing information security problems-one that is both principled and effective.

  6. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing and it is expected to develop further in the near future.

  7. Trust Models in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2008-01-01

We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  8. Economic aspects and models for building codes

    DEFF Research Database (Denmark)

    Bonke, Jens; Pedersen, Dan Ove; Johnsen, Kjeld

It is the purpose of this bulletin to present an economic model for estimating the consequences of new or changed building codes. The object is to allow comparative analyses in order to improve the basis for decisions in this field. The model is applied in a case study.

  9. Advanced Small Modular Reactor Economics Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Harrison, Thomas J [ORNL

    2014-10-01

The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, it describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo–based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing it in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo–based method: using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses, or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily bring into question the analytical results. In fact, the
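The Monte Carlo treatment of cost uncertainty discussed above can be sketched in a few lines of stand-alone script. The line items and distributions below are illustrative assumptions, not ORNL's actual SMR cost model:

```python
# Sketch: Monte Carlo propagation of uncertainty through a simple plant cost
# estimate. Line items, distributions, and parameters are hypothetical.
import random
import statistics

random.seed(42)  # reproducible draws

def total_cost_sample():
    """Draw one total plant cost ($M) from independent uncertain line items."""
    capital = random.triangular(800, 1400, 1000)   # overnight capital: low, high, mode
    oandm = random.gauss(60, 8)                    # annual O&M
    fuel = random.gauss(25, 4)                     # annual fuel
    return capital + 20 * (oandm + fuel)           # 20-year undiscounted total

samples = [total_cost_sample() for _ in range(10_000)]
mean = statistics.mean(samples)
sd = statistics.stdev(samples)
```

The propagation-of-error alternative mentioned in the report would instead combine the line-item variances analytically, trading the sampling loop for simplifying assumptions such as independence and near-linearity.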

  10. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

Models play important roles in the design and analysis of chemical-based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working...... development and application. The proposed work is part of a project for the development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework, which is based on a modeling methodology that combines...... as the user can then generate many problem-specific models for different applications. The templates are part of the model generation feature of the framework. Also, model development and use for product performance evaluation has been developed. The application of the modeling template is highlighted...

  11. Economic Models and Algorithms for Distributed Systems

    CERN Document Server

    Neumann, Dirk; Altmann, Jorn; Rana, Omer F

    2009-01-01

Distributed computing models for sharing resources, such as Grids, Peer-to-Peer systems, or voluntary computing, are becoming increasingly popular. This book intends to discover fresh avenues of research and amendments to existing technologies, aiming at the successful deployment of commercial distributed systems.

  12. Spatial Economics Model Predicting Transport Volume

    Directory of Open Access Journals (Sweden)

    Lu Bo

    2016-10-01

Full Text Available It is extremely important to predict logistics requirements in a scientific and rational way. In recent years, however, prediction methods have not improved significantly, and traditional statistical prediction methods suffer from low precision and poor interpretability: they can neither theoretically guarantee the generalization ability of the prediction model nor explain the model effectively. Therefore, combining theories from spatial economics, industrial economics, and neo-classical economics, and taking the city of Zhuanghe as the research object, this study identifies the leading industries that generate large cargo volumes and predicts the static logistics generation of Zhuanghe and its hinterland. By integrating the various factors that affect regional logistics requirements, the study establishes a logistics requirements potential model based on spatial economic principles, extending logistics requirements prediction from purely statistical principles into the new area of spatial and regional economics.

  13. Building Better Ecological Machines: Complexity Theory and Alternative Economic Models

    Directory of Open Access Journals (Sweden)

    Jess Bier

    2016-12-01

Full Text Available Computer models of the economy are regularly used to predict economic phenomena and set financial policy. However, conventional macroeconomic models are currently being reimagined after they failed to foresee the current economic crisis, the outlines of which began to be understood only in 2007-2008. In this article we analyze the most prominent of these reimaginings: agent-based models (ABMs). ABMs are an influential alternative to standard economic models, and they are one focus of complexity theory, a discipline that is a more open successor to the conventional chaos and fractal modeling of the 1990s. The modelers who create ABMs claim that their models depict markets as ecologies, and that they are more responsive than conventional models, which depict markets as machines. We challenge this presentation, arguing instead that recent modeling efforts amount to the creation of models as ecological machines. Our paper aims to contribute to an understanding of the organizing metaphors of macroeconomic models, which we argue is relevant conceptually and politically, e.g., when models are used for regulatory purposes.

  14. A review of mathematical models in economic environmental problems

    DEFF Research Database (Denmark)

    Nahorski, Z.; Ravn, H.F.

    2000-01-01

The paper presents a review of mathematical models used in economic analysis of environmental problems. This area of research combines macroeconomic models of growth, dependent on capital, labour, resources, etc., with environmental models describing phenomena such as natural resource exhaustion or pollution accumulation and degradation. In simpler cases the models can be treated analytically and the utility function optimized using tools such as the maximum principle. In more complicated cases, calculation of the optimal environmental policies requires a computer solution.

  15. Economic model predictive control theory, formulations and chemical process applications

    CERN Document Server

    Ellis, Matthew; Christofides, Panagiotis D

    2017-01-01

    This book presents general methods for the design of economic model predictive control (EMPC) systems for broad classes of nonlinear systems that address key theoretical and practical considerations including recursive feasibility, closed-loop stability, closed-loop performance, and computational efficiency. Specifically, the book proposes: Lyapunov-based EMPC methods for nonlinear systems; two-tier EMPC architectures that are highly computationally efficient; and EMPC schemes handling explicitly uncertainty, time-varying cost functions, time-delays and multiple-time-scale dynamics. The proposed methods employ a variety of tools ranging from nonlinear systems analysis, through Lyapunov-based control techniques to nonlinear dynamic optimization. The applicability and performance of the proposed methods are demonstrated through a number of chemical process examples. The book presents state-of-the-art methods for the design of economic model predictive control systems for chemical processes. In addition to being...

  16. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article. Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  17. Computational intelligence paradigms in economic and financial decision making

    CERN Document Server

    Resta, Marina

    2016-01-01

    The book focuses on a set of cutting-edge research techniques, highlighting the potential of soft computing tools in the analysis of economic and financial phenomena and in providing support for the decision-making process. In the first part the textbook presents a comprehensive and self-contained introduction to the field of self-organizing maps, elastic maps and social network analysis tools and provides necessary background material on the topic, including a discussion of more recent developments in the field. In the second part the focus is on practical applications, with particular attention paid to budgeting problems, market simulations, and decision-making processes, and on how such problems can be effectively managed by developing proper methods to automatically detect certain patterns. The book offers a valuable resource for both students and practitioners with an introductory-level college math background.

  18. Sparse High Dimensional Models in Economics.

    Science.gov (United States)

    Fan, Jianqing; Lv, Jinchi; Qi, Lei

    2011-09-01

    This paper reviews the literature on sparse high dimensional models and discusses some applications in economics and finance. Recent developments of theory, methods, and implementations in penalized least squares and penalized likelihood methods are highlighted. These variable selection methods are proved to be effective in high dimensional sparse modeling. The limits of dimensionality that regularization methods can handle, the role of penalty functions, and their statistical properties are detailed. Some recent advances in ultra-high dimensional sparse modeling are also briefly discussed.
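A minimal instance of the penalized least squares methods the review covers is the lasso, solvable by coordinate descent with soft thresholding. The sketch below uses synthetic data and assumed settings; it is illustrative, not the authors' implementation:

```python
# Sketch: coordinate-descent lasso (penalized least squares) recovering a
# sparse coefficient vector from synthetic high-noise data.
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso(X, y, lam, n_iter=200):
    """Minimize 0.5*||y - X beta||^2 + lam*||beta||_1 by coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual excluding j
            beta[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
true = np.zeros(20)
true[:3] = [3.0, -2.0, 1.5]                        # sparse ground truth
y = X @ true + 0.1 * rng.standard_normal(100)
beta = lasso(X, y, lam=10.0)
```

With a suitable penalty, the estimate keeps the few true nonzero coefficients (slightly shrunk toward zero) and sets the rest exactly to zero, the variable-selection behaviour the paper highlights.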

  19. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

Computational and mathematical models provide us with opportunities to investigate the complexities of real-world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and to test our solutions exhaustively before committing expensive resources. This is made possible by assuming parameters within a bounded environment, allowing for controllable experimentation that is not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines, and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window into the novel endeavours of the research communities, presenting their work and highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  20. Modeling Economic Planning for Exhaustible Natural Resources ...

    African Journals Online (AJOL)

    Petroleum economics is a key component of any field development plan (FDP) for crude oil and natural gas fields having finite life. This paper presents an analytical equation to model the relationship between initial speculative fund(s) and investment cost(s) in a project with finite life. We define a utility function for three ...

  1. Computational models of adult neurogenesis

    Science.gov (United States)

    Cecchi, Guillermo A.; Magnasco, Marcelo O.

    2005-10-01

Experimental results in recent years have shown that adult neurogenesis is a significant phenomenon in the mammalian brain. Little is known, however, about the functional role played by the generation and destruction of neurons in the context of an adult brain. Here, we propose two models in which new projection neurons are incorporated. We show that in both models, using incorporation and removal of neurons as a computational tool, it is possible to achieve a higher computational efficiency than in purely static, synapse-learning-driven networks. We also discuss the implications for understanding the role of adult neurogenesis in specific brain areas such as the olfactory bulb and the dentate gyrus.

  2. Introduction to computation and modeling for differential equations

    CERN Document Server

    Edsberg, Lennart

    2008-01-01

An introduction to scientific computing for differential equations. Introduction to Computation and Modeling for Differential Equations provides a unified and integrated view of numerical analysis, mathematical modeling in applications, and programming to solve differential equations, which is essential in problem-solving across many disciplines, such as engineering, physics, and economics. This book successfully introduces readers to the subject through a unique "Five-M" approach: Modeling, Mathematics, Methods, MATLAB, and Multiphysics. This approach facilitates a thorough understanding of h

  3. DFI Computer Modeling Software (CMS)

    Energy Technology Data Exchange (ETDEWEB)

    Cazalet, E.G.; Deziel, L.B. Jr.; Haas, S.M.; Martin, T.W.; Nesbitt, D.M.; Phillips, R.L.

    1979-10-01

The data base management system used to create, edit and store model data and solutions for the LEAP system is described. The software is written entirely in FORTRAN-G for the IBM 370 series of computers and provides an interface with the commercial data base system SYSTEM-2000.

  4. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    of the classical rhetoric term of ’prosopopoeia’ into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon’s notion of a ‘margin of indeterminacy’ vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling...

  5. Economic foundations of LEAP Model 22C

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, J.A.; Becker, M.; Trimble, J.L.

    1981-04-01

The model for long-term energy projections, the DOE/EIA's Long Term Energy Analysis Program (LEAP), is discussed. This report identifies and presents an initial assessment of the major underlying economic assumptions of LEAP Model 22C, the version of LEAP used to prepare the 1978 Annual Report to Congress. The major economic assumptions of Model 22C discussed are: competitive markets; constant input proportions in production processes; constant returns to scale in production processes; unlimited supplies of capital and variable inputs at fixed prices; continuous (infinitely divisible) units of input; investment based on perfect foresight; technical change limited to exogenous forecasts; minimal regional differences (except for coal production); unlimited supplies of foreign oil at exogenously specified prices; very little government regulation; and final demand quantities which are essentially exogenously specified. The solution algorithm and the method of analysis are presented.

  6. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  7. Mathematical model in economic environmental problems

    Energy Technology Data Exchange (ETDEWEB)

    Nahorski, Z. [Polish Academy of Sciences, Systems Research Inst. (Poland); Ravn, H.F. [Risoe National Lab. (Denmark)

    1996-12-31

The report contains a review of basic models and mathematical tools used in economic regulation problems. It starts with a presentation of basic models of capital accumulation, resource depletion, pollution accumulation, and population growth, as well as the construction of utility functions. The one-state-variable model is then discussed in detail. The basic mathematical methods consist of applying the maximum principle and phase-plane analysis of the differential equations obtained as the necessary conditions of optimality. A summary of basic results connected with these methods is given in the appendices. (au) 13 ills.; 17 refs.
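A generic form of the one-state-variable problem described above, together with the maximum-principle conditions, can be written as follows; the functional forms U and f are placeholders, not the report's specific models:

```latex
% Generic one-state-variable control problem (U, f are placeholder forms):
\max_{c(t)} \int_0^\infty e^{-\rho t}\, U\big(c(t), x(t)\big)\, dt
\quad \text{s.t.} \quad \dot{x} = f(x) - c, \quad x(0) = x_0.

% Current-value Hamiltonian and necessary conditions of optimality:
H = U(c, x) + \lambda \big[f(x) - c\big], \qquad
\frac{\partial H}{\partial c} = U_c - \lambda = 0, \qquad
\dot{\lambda} = \rho \lambda - U_x - \lambda f'(x).
```

Phase-plane analysis of the resulting pair of differential equations in $x$ and $\lambda$ (or $c$) is what yields the qualitative optimal policies the report reviews.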

  8. Evaluation of Computer-Assisted Instruction in Principles of Economics

    Directory of Open Access Journals (Sweden)

    Dennis Coates

    2001-04-01

Full Text Available Despite increasing use, little is known about the effectiveness of web-based instructional material. This study assesses the effectiveness of supplementary web-based materials and activities in introductory economics courses. We collected data on 66 students from three principles sections, covering demographic characteristics, use of web-based instructional resources, and performance on graded quizzes and examinations, and use these data to statistically assess the effectiveness of the web-based material. Student utilization of the web-based material was extensive: students frequently used on-line practice quizzes and accessed the web-based material often, and a sizable fraction actively posted and read threaded discussions on the course bulletin board. The statistical analysis shows that both on-line computer-graded practice quizzes and posting to the class bulletin board are positively correlated with student performance on the quizzes and exams, but use of web-based content and passive reading of bulletin board posts ("lurking") are not. These results suggest that faculty should focus more on developing self-test quizzes and effective bulletin board discussion projects and less on generating on-line content.

  9. Regional Economic Integration: The European Model

    Directory of Open Access Journals (Sweden)

    Pierre Jacquet

    2001-12-01

    Full Text Available Regional initiatives have flourished since the end of the Cold War. Given the degree of integration it has achieved, the European Union is often cited as a model of regional integration. This article argues that European integration has been the product of very specific historical conditions, implying that the politics and dynamics of regional integration in Europe may not be appropriate for, or replicable in, other regions. It also discusses European monetary union (EMU) and argues that integration within EMU is still incomplete, in particular because economic policies are not sufficiently coordinated. Finally, it discusses the contribution of regional integration to global governance and suggests that the "European model" can be a source of inspiration for the rest of the world, not so much as a process to emulate as for the lessons that can be learned from the dynamics and experience of European integration about some important issues in managing economic interdependence.

  10. Computational Models of Face Perception.

    Science.gov (United States)

    Martinez, Aleix M

    2017-06-01

    Faces are one of the most important means of communication in humans. For example, a short glance at a person's face provides information on identity and emotional state. What are the computations the brain uses to solve these problems so accurately and seemingly effortlessly? This article summarizes current research on computational modeling, a technique used to answer this question. Specifically, my research studies the hypothesis that this algorithm is tasked to solve the inverse problem of production. For example, to recognize identity, our brain needs to identify shape and shading image features that are invariant to facial expression, pose and illumination. Similarly, to recognize emotion, the brain needs to identify shape and shading features that are invariant to identity, pose and illumination. If one defines the physics equations that render an image under different identities, expressions, poses and illuminations, then gaining invariance to these factors is readily resolved by computing the inverse of this rendering function. I describe our current understanding of the algorithms used by our brains to resolve this inverse problem. I also discuss how these results are driving research in computer vision to design computer systems that are as accurate, robust and efficient as humans.

  11. Economic Modeling of Compressed Air Energy Storage

    Directory of Open Access Journals (Sweden)

    Rui Bo

    2013-04-01

    Full Text Available Due to the variable nature of wind resources, the increasing penetration level of wind power will have a significant impact on the operation and planning of the electric power system. Energy storage systems are considered an effective way to compensate for the variability of wind generation. This paper presents a detailed production cost simulation model to evaluate the economic value of compressed air energy storage (CAES in systems with large-scale wind power generation. The co-optimization of energy and ancillary services markets is implemented in order to analyze the impacts of CAES, not only on energy supply, but also on system operating reserves. Both hourly and 5-minute simulations are considered to capture the economic performance of CAES in the day-ahead (DA and real-time (RT markets. The generalized network flow formulation is used to model the characteristics of CAES in detail. The proposed model is applied on a modified IEEE 24-bus reliability test system. The numerical example shows that besides the economic benefits gained through energy arbitrage in the DA market, CAES can also generate significant profits by providing reserves, compensating for wind forecast errors and intra-hour fluctuation, and participating in the RT market.
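
    The energy-arbitrage component of CAES value described above can be illustrated with a toy calculation: charge when prices are low, discharge when they are high, discounted by a round-trip efficiency. This is a minimal sketch with made-up prices, capacity, and efficiency, not the paper's network-flow co-optimization model:

```python
# Toy energy-arbitrage sketch (illustrative only; a production-cost model
# like the paper's would also co-optimize reserves, fuel for the expansion
# turbine, and 5-minute real-time dispatch). All numbers are assumptions.

def arbitrage_profit(prices, efficiency=0.7, capacity_mwh=100):
    """Charge fully at the cheapest hour, discharge fully at the priciest."""
    buy = min(prices)
    sell = max(prices)
    energy_in = capacity_mwh / efficiency   # MWh bought to store capacity_mwh
    return sell * capacity_mwh - buy * energy_in

hourly_prices = [22, 18, 15, 20, 35, 60, 75, 50]  # $/MWh, hypothetical DA prices
print(round(arbitrage_profit(hourly_prices), 1))
```

    Even this crude sketch shows why storage value grows with price spread: with a flat price profile the arbitrage profit collapses to zero or below, which is why the paper's reserve and real-time revenues matter.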

  12. Cosmic logic: a computational model

    Science.gov (United States)

    Vanchurin, Vitaly

    2016-02-01

    We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take a CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal machines, which halt in finite time, and immortal machines, which run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.

  13. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
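
    The second-order model described above can be sketched directly: for a binary output, the maximum-noise-entropy response constrained by first- and second-order input/output moments is a logistic function of a linear-plus-quadratic form in the stimulus. The parameters a, h, J below are arbitrary placeholders, not values fitted to retinal or thalamic data:

```python
import math

# P(spike = 1 | s) = 1 / (1 + exp(-(a + h·s + s^T J s)))
# a, h, J are illustrative placeholders; in practice they are chosen so the
# model reproduces measured moments such as <r>, <r s>, and <r s s^T>.

def p_spike(s, a, h, J):
    linear = sum(hi * si for hi, si in zip(h, s))
    quadratic = sum(J[i][j] * s[i] * s[j]
                    for i in range(len(s)) for j in range(len(s)))
    return 1.0 / (1.0 + math.exp(-(a + linear + quadratic)))

h = [1.5, -0.5]                    # first-order weights (two input dimensions)
J = [[0.2, 0.1], [0.1, -0.3]]      # second-order weights
print(round(p_spike([1.0, 0.5], -1.0, h, J), 3))
```

    Dropping J recovers a first-order (minimum mutual information) model; the paper's finding is that the quadratic term is both necessary and sufficient for the neural data.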

  14. Computational Models of Rock Failure

    Science.gov (United States)

    May, Dave A.; Spiegelman, Marc

    2017-04-01

    Practitioners in computational geodynamics, as per many other branches of applied science, typically do not analyse the underlying PDE's being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or simply have not been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDE's, we first attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers / risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of

  15. Economic tour package model using heuristic

    Science.gov (United States)

    Rahman, Syariza Abdul; Benjamin, Aida Mauziah; Bakar, Engku Muhammad Nazri Engku Abu

    2014-07-01

    A tour package is a prearranged tour that includes products and services such as food, activities, accommodation, and transportation, which are sold at a single price. Since competitiveness within the tourism industry is very high, many tour agents try to provide attractive tour packages in order to meet tourist satisfaction as much as possible. Some of the criteria considered by tourists are the number of places to be visited and the cost of the tour package. Previous studies indicate that tourists tend to choose economical tour packages and aim to visit as many places as they can cover. Thus, this study proposes a tour-package model using a heuristic approach. The aim is to find economical tour packages and at the same time to propose as many places as possible to be visited by tourists in a given geographical area, particularly on Langkawi Island. The proposed model considers only one starting point, where the tour starts and ends at an identified hotel. This study covers the 31 most attractive places on Langkawi Island from various categories of tourist attractions. In addition, periods for lunch and dinner are allocated in the proposed itineraries, covering 11 popular restaurants around Langkawi Island. In developing the itinerary, the proposed heuristic approach considers a time window for each site (hotel/restaurant/place) so that it represents real-world implementation. We present three itineraries with different time constraints (1-day, 2-day and 3-day tour packages). The aim of the economic model is to minimize the tour-package cost as much as possible by considering the entrance fee of each visited place. We compare the proposed model with the uneconomic model from our previous study. The uneconomic model has no limitation on cost, with the aim of maximizing the number of places to be visited. Comparison between the uneconomic and economic itineraries has shown that the proposed model has successfully achieved the objective that
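
    A cost-minimizing itinerary heuristic of the kind described can be sketched as a greedy loop: repeatedly add the cheapest unvisited site that still fits the remaining time budget. Site names, fees, and visit durations below are hypothetical; the paper's heuristic additionally handles per-site time windows, meal stops, and travel times:

```python
# Greedy sketch of an economical one-day itinerary. All data are invented.

sites = {                 # name: (entrance fee, hours needed)
    "Beach": (0, 2.0),
    "Cable Car": (30, 3.0),
    "Wildlife Park": (20, 2.5),
    "Craft Complex": (0, 1.5),
    "Aquarium": (25, 2.0),
}

def plan_day(sites, hours_available=8.0):
    itinerary, total_cost, time_left = [], 0, hours_available
    remaining = dict(sites)
    while remaining:
        feasible = [(fee, hrs, name) for name, (fee, hrs) in remaining.items()
                    if hrs <= time_left]
        if not feasible:
            break
        fee, hrs, name = min(feasible)        # cheapest feasible site first
        itinerary.append(name)
        total_cost += fee
        time_left -= hrs
        del remaining[name]
    return itinerary, total_cost

tour, cost = plan_day(sites)
print(tour, cost)
```

    The trade-off the paper studies is visible here: the economic objective (minimum fee) and the coverage objective (maximum sites) can pull in different directions once the time budget binds.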

  16. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity extends over a longer period and is often characterized by a degree of uncertainty or insecurity regarding the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the different variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management decision analysis over a realistic economic horizon. Often in such cases, the simulation technique is considered the only available alternative. Using simulation techniques to study real-world systems often requires laborious work. Carrying out a simulation experiment is a process that takes place in several stages.

  17. Rediscovering the Economics of Keynes in an Agent-Based Computational Setting

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2016-01-01

    The aim of this paper is to use agent-based computational economics to explore the economic thinking of Keynes. Taking his starting point at the macroeconomic level, Keynes argued that economic systems are characterized by fundamental uncertainty — an uncertainty that makes rule-based behavior...

  18. Computational fluid dynamics modelling in cardiovascular medicine.

    Science.gov (United States)

    Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2016-01-01

    This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. Published by the BMJ Publishing Group Limited. 

  19. Computer modeling and urban recycling

    Energy Technology Data Exchange (ETDEWEB)

    Biddle, D.C.; Storey, M.

    1989-09-01

    A computer model developed by the Philadelphia Recycling Office (PRO) to determine the operational constraints of various policy choices in planning municipal recycling collection services is described. Such a computer model can provide quick and organized summaries of policy options without overwhelming decision makers with detailed and time-consuming calculations. Named OMAR (Operations Model for the Analysis of Recycling), this program is a Lotus 1-2-3 spreadsheet. Data collected from the city's pilot project are central to some of the indices used by the model. Pilot project data and indices are imported from other files in a somewhat lengthy procedure. There are two components to the structure of the analytical section of OMAR. The first, the material component, is based on the algorithm which estimates the amount of material that is available for collection on a given day. The second, the capacity component, is derived from the algorithm which estimates the amount of material that can be collected by a single crew in a day. Equations for calculating these components are presented. The feasibility of using OMAR as a reporting tool for planners is also discussed.
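
    The two analytical components can be sketched as simple formulas: a material component (tons set out on a collection day) balanced against a capacity component (tons one crew can collect per day). The functional forms and every parameter value below are invented for illustration; the real OMAR indices come from Philadelphia's pilot-project data:

```python
# Hypothetical sketch of OMAR-style material and capacity components.

def material_available(households, setout_rate, lbs_per_household):
    """Material component: tons of recyclables set out on a collection day."""
    return households * setout_rate * lbs_per_household / 2000.0  # lbs -> tons

def crew_capacity(trips_per_day, truck_capacity_tons, fill_fraction):
    """Capacity component: tons a single crew can collect in a day."""
    return trips_per_day * truck_capacity_tons * fill_fraction

supply = material_available(households=10000, setout_rate=0.6, lbs_per_household=15)
per_crew = crew_capacity(trips_per_day=2, truck_capacity_tons=12, fill_fraction=0.9)
crews_needed = -(-supply // per_crew)   # ceiling division
print(supply, per_crew, crews_needed)
```

    Comparing the two components per collection day is exactly the kind of quick policy summary the abstract describes: it tells a planner how many crews a given set-out rate implies.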

  20. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  1. The Specification and Modeling of Computer Security

    Science.gov (United States)

    1990-01-01

    Computer security models are specifications designed, among other things, to limit the damage caused by Trojan Horse programs such as computer... computer security modeling in general, the Bell and LaPadula model in particular, and the limitations of the model. Many of the issues raised are of

  2. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology...... adoption theories, such as Diffusion of Innovations, Technology Acceptance Model, Unified Theory of Acceptance and Use of Technology. Further on, a research model for identification of Cloud Computing Adoption factors from a business model perspective is presented. The following business model building...

  3. A mean-field game economic growth model

    KAUST Repository

    Gomes, Diogo A.

    2016-08-05

    Here, we examine a mean-field game (MFG) that models the economic growth of a population of non-cooperative, rational agents. In this MFG, agents are described by two state variables - the capital and consumer goods they own. Each agent seeks to maximize his/her utility by taking into account statistical data about the whole population. The individual actions drive the evolution of the players, and a market-clearing condition determines the relative price of capital and consumer goods. We study the existence and uniqueness of optimal strategies of the agents and develop numerical methods to compute these strategies and the equilibrium price.

  4. Modeling Computer Virus and Its Dynamics

    OpenAIRE

    Peng, Mei; He, Xing; Huang, Junjian; Dong, Tao

    2013-01-01

    Based on the fact that a computer can be infected by infected and exposed computers, and that some computers in susceptible or exposed status can gain immunity through antivirus ability, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold of the computer virus spreading in the internet, is determined. Second, this model has a virus-free equilibrium P0, which means that th...

  5. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics using the identified objectives of computing which can be used in any platform, any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  6. STUDY OF MULTI-AREA ECONOMIC DISPATCH ON SOFT COMPUTING TECHNIQUES

    OpenAIRE

    Dr. A. Amsavalli; D. Vigneshwaran; M. Lavanya; S. Vijayaraj

    2016-01-01

    This paper analyses the performance of soft computing techniques to solve single and multi-area economic dispatch problems. The paper also includes inter-area flow constraints on the power system network, which are normally ignored in most economic load dispatch problems. Economic load dispatch results for two-area, three-area and four-area systems are presented in the paper, and they demonstrate the importance of multiple-area representation of a system in economic load dispatch. Such repr...

  7. USING GEM - GLOBAL ECONOMIC MODEL IN ACHIEVING A GLOBAL ECONOMIC FORECAST

    Directory of Open Access Journals (Sweden)

    Camelia Madalina Orac

    2013-12-01

    Full Text Available The global economic development model has proved to be insufficiently reliable under the new economic crisis. As a result, the entire theoretical construction about the global economy needs rethinking and reorientation. In this context, it is quite clear that only through effective use of specific techniques and tools of economic-mathematical modeling, statistics, regional analysis and economic forecasting it is possible to obtain an overview of the future economy.

  8. The mineral sector and economic development in Ghana: A computable general equilibrium analysis

    Science.gov (United States)

    Addy, Samuel N.

    A computable general equilibrium (CGE) model is formulated for conducting mineral policy analysis in the context of national economic development for Ghana. The model, called GHANAMIN, places strong emphasis on production, trade, and investment. It can be used to examine both micro and macro economic impacts of policies associated with mineral investment, taxation, and terms of trade changes, as well as mineral sector performance impacts due to technological change or the discovery of new deposits. Its economywide structure enables the study of broader development policy with a focus on individual or multiple sectors, simultaneously. After going through a period of contraction for about two decades, mining in Ghana has rebounded significantly and is currently the main foreign exchange earner. Gold alone contributed 44.7 percent of 1994 total export earnings. GHANAMIN is used to investigate the economywide impacts of mineral tax policies, world market mineral price changes, mining investment, and increased mineral exports. It is also used for identifying key sectors for economic development. Various simulations were undertaken, with the following results: recently implemented mineral tax policies are welfare increasing, but have an accompanying decrease in the output of other export sectors. World mineral price rises stimulate an increase in real GDP; however, this increase is less than the real GDP decreases associated with price declines. Investment in the non-gold mining sector increases real GDP more than investment in gold mining, because of the former's stronger linkages to the rest of the economy. Increased mineral exports are very beneficial to the overall economy. Foreign direct investment (FDI) in mining increases welfare more so than domestic capital, which is very limited. Mining investment and the increased mineral exports since 1986 have contributed significantly to the country's economic recovery, with gold mining accounting for 95 percent of the

  9. Forecasting models for national economic planning

    CERN Document Server

    Heesterman, A R G

    1972-01-01

    This book is about the specification of linear econometric models, and for this reason some important related fields have been deliberately omitted. I did not want to discuss the problems of parameter-estimation, at least not in any detail, as there are other books on these problems written by specialized statisticians. This book is about the models themselves and macro-economic models in particular. A second related subject is the policy decision that can be made with the help of a model. While I did write a chapter on policy decisions, I limited myself to some extent because of my views on planning as such. The logical approach to this problem is in terms of mathematical programming, but our models and our ideas about the policies we want are too crude for its effective utilisation. A realistic formulation of the problem should involve nonlinearities in an essential way; the models I consider (and most existing models) are linear. At the present state of econometrics, I do not really believe in suc...

  10. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models ICC3 2015, organized by PSG College of Technology, Coimbatore, India during December 17 – 19, 2015. The book presents innovations in broad areas of research like computational modeling, computational intelligence and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing design, analysis and modeling of the aforementioned key areas.

  11. Assessing economic impacts of China’s water pollution mitigation measures through a dynamic computable general equilibrium analysis.

    NARCIS (Netherlands)

    Qin, Changbo; Bressers, Johannes T.A.; Su, Zhongbo; Jia, Yangwen; Wang, Hao

    2011-01-01

    In this letter, we apply an extended environmental dynamic computable general equilibrium model to assess the economic consequences of implementing a total emission control policy. On the basis of emission levels in 2007, we simulate different emission reduction scenarios, ranging from 20 to 50%

  12. Computing Models for FPGA-Based Accelerators

    Science.gov (United States)

    Herbordt, Martin C.; Gu, Yongfeng; VanCourt, Tom; Model, Josh; Sukhwani, Bharat; Chiu, Matt

    2011-01-01

    Field-programmable gate arrays are widely considered as accelerators for compute-intensive applications. A critical phase of FPGA application development is finding and mapping to the appropriate computing model. FPGA computing enables models with highly flexible fine-grained parallelism and associative operations such as broadcast and collective response. Several case studies demonstrate the effectiveness of using these computing models in developing FPGA applications for molecular modeling. PMID:21603152

  13. Using Evolutionary Computation to Solve the Economic Load Dispatch Problem

    Directory of Open Access Journals (Sweden)

    Samir SAYAH

    2008-06-01

    Full Text Available This paper reports on an evolutionary algorithm based method for solving the economic load dispatch (ELD) problem. The objective is to minimize the nonlinear function, which is the total fuel cost of thermal generating units, subject to the usual constraints. The IEEE 30-bus test system was used for testing and validation purposes. The results obtained demonstrate the effectiveness of the proposed method for solving the economic load dispatch problem.
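
    The structure of such a method can be sketched with a toy (1+1) evolution strategy: minimize the total quadratic fuel cost subject to a power-balance constraint handled by a penalty term. The cost coefficients, unit limits, and demand below are invented, not IEEE 30-bus data, and the paper's actual evolutionary operators may differ:

```python
import random

# Toy evolutionary solver for economic load dispatch:
# minimize  sum_i (a_i + b_i*P_i + c_i*P_i^2)
# subject to  sum_i P_i = demand  (quadratic penalty) and Pmin <= P_i <= Pmax.

UNITS = [  # (a, b, c, Pmin, Pmax) per generating unit -- illustrative values
    (100, 2.0, 0.010, 50, 200),
    (120, 1.8, 0.012, 40, 180),
    (80,  2.2, 0.008, 30, 150),
]
DEMAND = 300.0

def cost(p):
    fuel = sum(a + b * pi + c * pi * pi
               for (a, b, c, _, _), pi in zip(UNITS, p))
    return fuel + 1000.0 * (sum(p) - DEMAND) ** 2   # penalty for imbalance

def solve(iterations=20000, seed=1):
    rng = random.Random(seed)
    p = [(lo + hi) / 2 for (_, _, _, lo, hi) in UNITS]  # feasible start
    best = cost(p)
    for _ in range(iterations):
        q = [min(max(pi + rng.gauss(0, 2.0), lo), hi)   # mutate within limits
             for pi, (_, _, _, lo, hi) in zip(p, UNITS)]
        if cost(q) < best:
            p, best = q, cost(q)
    return p, best

dispatch, total = solve()
print([round(x, 1) for x in dispatch], round(total, 1))
```

    The penalty weight trades constraint satisfaction against fuel cost; population-based algorithms as used in the paper explore the same landscape with many candidates per generation.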

  14. Computer modelling for ecosystem service assessment: Chapter 4.4

    Science.gov (United States)

    Dunford, Robert; Harrison, Paula; Bagstad, Kenneth J.

    2017-01-01

    Computer models are simplified representations of the environment that allow biophysical, ecological, and/or socio-economic characteristics to be quantified and explored. Modelling approaches differ from mapping approaches (Chapter 5) in that (i) they are not necessarily spatial (although many models do produce spatial outputs); (ii) they focus on understanding and quantifying the interactions between different components of social and/or environmental systems; and (iii)

  15. A Computational Theory of Modelling

    Science.gov (United States)

    Rossberg, Axel G.

    2003-04-01

    A metatheory is developed which characterizes the relationship between a modelled system, which complies with some ``basic theory'', and a model, which does not, and yet reproduces important aspects of the modelled system. A model is represented by an (in a certain sense, s.b.) optimal algorithm which generates data that describe the model's state or evolution complying with a ``reduced theory''. Theories are represented by classes of (in a similar sense, s.b.) optimal algorithms that test if their input data comply with the theory. The metatheory does not prescribe the formalisms (data structure, language) to be used for the description of states or evolutions. Transitions to other formalisms and loss of accuracy, common to theory reduction, are explicitly accounted for. The basic assumption of the theory is that resources such as the code length (~ programming time) and the computation time for modelling and testing are costly, but the relative cost of each resource is unknown. Thus, if there is an algorithm a for which there is no other algorithm b solving the same problem but using less of each resource, then a is considered optimal. For tests (theories), the set X of wrongly admitted inputs is treated as another resource. It is assumed that X1 is cheaper than X2 when X1 ⊂ X2 (X1 ≠ X2). Depending on the problem, the algorithmic complexity of a reduced theory can be smaller or larger than that of the basic theory. The theory might help to distinguish actual properties of complex systems from mere mental constructs. An application to complex spatio-temporal patterns is discussed.

  16. Towards the Epidemiological Modeling of Computer Viruses

    Directory of Open Access Journals (Sweden)

    Xiaofan Yang

    2012-01-01

    Full Text Available Epidemic dynamics of computer viruses is an emerging discipline aiming to understand the way that computer viruses spread on networks. This paper is intended to establish a series of rational epidemic models of computer viruses. First, a close inspection of some common characteristics shared by all typical computer viruses clearly reveals the flaws of previous models. Then, a generic epidemic model of viruses, which is named as the SLBS model, is proposed. Finally, diverse generalizations of the SLBS model are suggested. We believe this work opens a door to the full understanding of how computer viruses prevail on the Internet.
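
    An SLBS-style model of the kind proposed can be sketched as a small ODE system integrated with forward Euler: susceptible (S), latent (L), and breaking-out (B) computers, with both L and B infectious and no permanent immunity. The compartment transitions follow the general SLBS idea, but the specific rate equations and all rate constants below are illustrative assumptions, not the paper's exact formulation:

```python
# Forward-Euler sketch of an SLBS-type computer virus model.
# Fractions: S susceptible, L latent, B breaking-out; S + L + B = 1.

def slbs_step(S, L, B, dt=0.01,
              beta=0.5,    # infection rate (both L and B are infectious)
              alpha=0.2,   # rate at which latent computers break out
              g1=0.1,      # cure rate of latent computers
              g2=0.3):     # cure rate of breaking-out computers
    new_inf = beta * S * (L + B)
    dS = -new_inf + g1 * L + g2 * B   # cured computers become susceptible again
    dL = new_inf - (alpha + g1) * L
    dB = alpha * L - g2 * B
    return S + dt * dS, L + dt * dL, B + dt * dB

S, L, B = 0.99, 0.01, 0.0      # start with 1% latent infections
for _ in range(10000):         # integrate to t = 100
    S, L, B = slbs_step(S, L, B)
print(round(S, 3), round(L, 3), round(B, 3))
```

    With these rates the virus persists at an endemic equilibrium rather than dying out, illustrating the threshold behavior such models are built to analyze.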

  17. Quantum Computation Beyond the Circuit Model

    OpenAIRE

    Jordan, Stephen P.

    2008-01-01

    The quantum circuit model is the most widely used model of quantum computation. It provides both a framework for formulating quantum algorithms and an architecture for the physical construction of quantum computers. However, several other models of quantum computation exist which provide useful alternative frameworks for both discovering new quantum algorithms and devising new physical implementations of quantum computers. In this thesis, I first present necessary background material for a ge...

  18. Computational modeling of epithelial tissues.

    Science.gov (United States)

    Smallwood, Rod

    2009-01-01

    There is an extensive literature on the computational modeling of epithelial tissues at all levels from subcellular to whole tissue. This review concentrates on behavior at the individual cell to whole tissue level, and particularly on organizational aspects, and provides an indication of where information from other areas, such as the modeling of angiogenesis, is relevant. The skin, and the lining of all of the body cavities (lung, gut, cervix, bladder, etc.) are epithelial tissues, which in a topological sense are the boundary between inside and outside the body. They are thin sheets of cells (usually of the order of 0.5 mm thick) without extracellular matrix, have a relatively simple structure, and contain few types of cells. They have important barrier, secretory and transport functions, which are essential for the maintenance of life, so homeostasis and wound healing are important aspects of the behavior of epithelial tissues. Carcinomas originate in epithelial tissues. There are essentially two approaches to modeling tissues--to start at the level of the tissue (i.e., a length scale of the order of 1 mm) and develop generalized equations for behavior (a continuum approach); or to start at the level of the cell (i.e., a length scale of the order of 10 µm) and develop tissue behavior as an emergent property of cellular behavior (an individual-based approach). As will be seen, these are not mutually exclusive approaches, and they come in a variety of flavors.

  19. Model dynamics for quantum computing

    Science.gov (United States)

    Tabakin, Frank

    2017-08-01

    A model master equation suitable for quantum computing dynamics is presented. In an ideal quantum computer (QC), a system of qubits evolves in time unitarily and, by virtue of their entanglement, interfere quantum mechanically to solve otherwise intractable problems. In the real situation, a QC is subject to decoherence and attenuation effects due to interaction with an environment and with possible short-term random disturbances and gate deficiencies. The stability of a QC under such attacks is a key issue for the development of realistic devices. We assume that the influence of the environment can be incorporated by a master equation that includes unitary evolution with gates, supplemented by a Lindblad term. Lindblad operators of various types are explored; namely, steady, pulsed, gate friction, and measurement operators. In the master equation, we use the Lindblad term to describe short time intrusions by random Lindblad pulses. The phenomenological master equation is then extended to include a nonlinear Beretta term that describes the evolution of a closed system with increasing entropy. An external Bath environment is stipulated by a fixed temperature in two different ways. Here we explore the case of a simple one-qubit system in preparation for generalization to multi-qubit, qutrit and hybrid qubit-qutrit systems. This model master equation can be used to test the stability of memory and the efficacy of quantum gates. The properties of such hybrid master equations are explored, with emphasis on the role of thermal equilibrium and entropy constraints. Several significant properties of time-dependent qubit evolution are revealed by this simple study.
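
    The master-equation ingredients described above can be sketched numerically. Below is a minimal one-qubit illustration, assuming a simple diagonal Hamiltonian, an amplitude-damping Lindblad operator, and Euler time stepping; all parameter values are illustrative choices, not parameters from the paper.

```python
import numpy as np

# One-qubit Lindblad master equation (sketch; parameters are assumptions):
#   d(rho)/dt = -i[H, rho] + L rho L^+ - (1/2){L^+ L, rho}
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sm = np.array([[0, 1], [0, 0]], dtype=complex)   # lowering operator |0><1|

H = 0.5 * sz                  # assumed qubit Hamiltonian (hbar = 1)
L = np.sqrt(0.1) * sm         # assumed amplitude-damping operator, rate 0.1

def lindblad_rhs(rho):
    comm = -1j * (H @ rho - rho @ H)
    Ld = L.conj().T
    diss = L @ rho @ Ld - 0.5 * (Ld @ L @ rho + rho @ Ld @ L)
    return comm + diss

rho = np.array([[0, 0], [0, 1]], dtype=complex)  # start in excited state |1>
dt, steps = 0.01, 1000
for _ in range(steps):
    rho = rho + dt * lindblad_rhs(rho)           # simple Euler integration

pop = np.real(rho[1, 1])      # excited-state population decays over time
print(pop)
```

    The Lindblad term drains the excited-state population (roughly as exp(-0.1 t) here) while preserving the trace of the density matrix, which is the basic decoherence mechanism the abstract refers to.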

  20. Economic analysis of HPAI control in the Netherlands I: epidemiological modelling to support economic analysis.

    Science.gov (United States)

    Longworth, N; Mourits, M C M; Saatkamp, H W

    2014-06-01

    Economic analysis of control strategies for contagious diseases is a necessity in the development of contingency plans. Economic impacts arising from epidemics such as highly pathogenic avian influenza (HPAI) consist of direct costs (DC), direct consequential costs (DCC), indirect consequential costs (ICC) and aftermath costs (AC). Epidemiological models to support economic analysis need to provide adequate outputs for these critical economic parameters. Of particular importance for DCC, ICC and AC is the spatial production structure of a region. Spatial simulation models are therefore particularly suited for economic analysis; however, they often require a large number of parameters. The aims of this study are (i) to provide an economic rationale of epidemiological modelling in general, (ii) to provide a transparent description of the parameterization of a spatially based epidemiological model for the analysis of HPAI control in the Netherlands and (iii) to discuss the validity and usefulness of this model for subsequent economic analysis. In the model, HPAI virus transmission occurs via local spread and animal movements. Control mechanisms include surveillance and tracing, movement restrictions and depopulation. Sensitivity analysis of key parameters indicated that the epidemiological outputs with the largest influence on the economic impacts (i.e. epidemic duration and number of farms in the movement restriction zone) were more robust than less influential indicators (i.e. number of infected farms). Economically relevant outputs for strategy comparison were most sensitive to the relative role of the different transmission parameters. The default simulation and results of the sensitivity analysis were consistent with the general outcomes of known HPAI models. Comparison was, however, limited due to the absence of some economically relevant outputs. It was concluded that the model creates economically relevant, adequate and credible output for subsequent use in

  1. Toward Environmentally Sustainable Mobile Computing Through an Economic Framework

    National Research Council Canada - National Science Library

    Joseph, Siny; Namboodiri, Vinod; Dev, Vishnu Cherusola

    2014-01-01

    .... Prior work with energy efficiency in mobile devices has primarily focused on the goal of maximizing battery life of these devices and not on the broader concept of environmentally sustainable mobile computing...

  2. Harrod Model and Modelling of Socio-Economic Processes

    Directory of Open Access Journals (Sweden)

    Chernyshov Sergey I.

    2013-11-01

    Full Text Available The Harrod method in differential form has a discrete character, and the resulting exponential growth of the economy is unsubstantiated. The mistake stems from linking capital and annual income through a constant ratio. This becomes clear once the dimensionality of the quantities involved is examined, a step knowingly avoided in mathematical economics. Representing capital through the intensity of income in the categories of continuous analysis is naturally realised with the help of the Steklov function. This yields a corrected Harrod method (CHM), which, unlike the above-mentioned exponent, implies the inevitability of economic crises; the moments of their appearance, however, are calculable. The Steklov function allows generalisation by means of a component designed for monitoring the economic situation with the aim of refining model parameters. Applying CHM to the balance of the participants of the economic system in cost terms proves quite fruitful. The resulting model is a system of first-order differential equations with variable coefficients. On this basis, the article formulates general principles for the modelling of socio-economic processes.

  3. International Jobs and Economic Development Impacts (I-JEDI) Model

    Energy Technology Data Exchange (ETDEWEB)

    2016-09-01

    International Jobs and Economic Development Impacts (I-JEDI) is a freely available economic model that estimates gross economic impacts from wind, solar, biomass, and geothermal energy projects. Building on a similar model for the United States, I-JEDI was developed by the National Renewable Energy Laboratory under the U.S. government's Enhancing Capacity for Low Emission Development Strategies (EC-LEDS) program to support partner countries in assessing economic impacts of LEDS actions in the energy sector.

  4. Economic weights for feed intake in the growing pig derived from a growth model and an economic model

    NARCIS (Netherlands)

    Hermesch, S.; Kanis, E.; Eissen, J.J.

    2003-01-01

    Economic weights are obtained for feed intake using a growth model and an economic model. The underlying concept of the growth model is the linear plateau model. Parameters of this model are the marginal ratio (MR) of extra fat and extra protein deposition with increasing feed intake (FI) and the
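
    The linear plateau concept underlying the growth model can be sketched as a simple function: protein deposition increases linearly with feed intake above maintenance until it reaches a plateau. The slope, plateau, and maintenance values below are hypothetical illustrations, not the paper's parameters.

```python
# Linear-plateau response (sketch; all numbers are assumed for illustration)
def protein_deposition(feed_intake, slope=0.09, plateau=0.160, maintenance=0.5):
    """Protein deposition (kg/day) as a linear-plateau function of
    feed intake (kg/day); flat at `plateau` once it is reached."""
    return min(slope * max(feed_intake - maintenance, 0.0), plateau)

for fi in (0.2, 1.0, 2.0, 3.0):
    print(fi, protein_deposition(fi))
```

    Below the plateau, extra feed produces extra protein deposition at the marginal ratio; above it, extra feed only adds fat, which is what gives feed intake an economic weight that changes with intake level.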

  5. The impact of the British model on economic growth

    Directory of Open Access Journals (Sweden)

    Simon György Jr.

    2007-01-01

    Full Text Available The paper seeks an answer to the question of how the British model affected economic development in its mother country, the United Kingdom. Statistical analysis, models of mathematical economics and econometric investigation make it plausible to conclude that there was a substantial difference in success between the Thatcherite and the Blairite economic policies; the latter proved more effective. It is particularly remarkable that the Blairite model, which connected privatization with a successful employment policy and reduced unemployment and social sensitivity, not only sped up economic growth but also improved economic equilibrium, curtailing, among other things, the budget deficit.

  6. Economics and computer science of a radio spectrum reallocation.

    Science.gov (United States)

    Leyton-Brown, Kevin; Milgrom, Paul; Segal, Ilya

    2017-07-11

    The recent "incentive auction" of the US Federal Communications Commission was the first auction to reallocate radio frequencies between two different kinds of uses: from broadcast television to wireless Internet access. The design challenge was not just to choose market rules to govern a fixed set of potential trades but also to determine the broadcasters' property rights, the goods to be exchanged, the quantities to be traded, the computational procedures, and even some of the performance objectives. An essential and unusual challenge was to make the auction simple enough for human participants while still ensuring that the computations would be tractable and capable of delivering nearly efficient outcomes.

  7. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  8. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  9. A hybrid agent-based computational economics and optimization approach for supplier selection problem

    Directory of Open Access Journals (Sweden)

    Zahra Pourabdollahi

    2017-12-01

    Full Text Available The supplier evaluation and selection problem is among the most important logistics decisions and has been addressed extensively in supply chain management. The same decision is also important in freight transportation, since it identifies trade relationships between business establishments and determines commodity flows between production and consumption points. The commodity flows are then used as input to freight transportation models to determine cargo movements and their characteristics, including mode choice and shipment size. Various approaches to this problem have been proposed in previous studies. Traditionally, potential suppliers are evaluated and selected using only price/cost as the influential criterion and state-of-practice methods. This paper introduces a hybrid agent-based computational economics and optimization approach for supplier selection. The proposed model combines an agent-based multi-criteria supplier evaluation approach with a multi-objective optimization model to capture both the behavioral and economic aspects of the supplier selection process. The model uses a system of ordered response models to determine the importance weights of the different criteria in supplier evaluation from the buyers’ point of view. The estimated weights are then used to calculate a utility for each potential supplier in the market and to rank them. The calculated utilities are then entered into a mathematical programming model in which the best suppliers are selected by maximizing the total accrued utility for all buyers and minimizing total shipping costs, while balancing the capacity of potential suppliers to ensure market clearing. The proposed model was implemented under an operational agent-based supply chain and freight transportation framework for the Chicago Metropolitan Area.
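
    The weighting-and-ranking step described above can be sketched in a few lines: each supplier receives a utility equal to the weighted sum of its criterion scores, and suppliers are ranked by that utility. The criteria, weights, and scores below are invented for illustration; in the paper the weights are estimated from ordered response models rather than fixed by hand.

```python
# Hypothetical criterion weights (the paper estimates these from data)
weights = {"price": 0.5, "quality": 0.3, "delivery_time": 0.2}

# Hypothetical normalized scores for three candidate suppliers
suppliers = {
    "A": {"price": 0.9, "quality": 0.6, "delivery_time": 0.8},
    "B": {"price": 0.7, "quality": 0.9, "delivery_time": 0.6},
    "C": {"price": 0.5, "quality": 0.8, "delivery_time": 0.9},
}

def utility(scores):
    """Weighted-sum utility of one supplier's criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(suppliers, key=lambda s: utility(suppliers[s]), reverse=True)
print(ranking)
```

    In the full model these utilities feed a mathematical program that also accounts for shipping costs and supplier capacities; the ranking alone corresponds only to the evaluation stage.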

  10. An Alternative Theoretical Model for Economic Reforms in Africa ...

    African Journals Online (AJOL)

    This paper offers an alternative model for economic reforms in Africa. It proposes that Africa can still get on the pathway of sustained economic growth if economic reforms can focus on a key variable, namely, the price of non-tradables. Prices of non-tradables are generally less in Africa than in advanced economies, and the ...

  11. Quantum vertex model for reversible classical computing.

    Science.gov (United States)

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  12. Quantum vertex model for reversible classical computing

    Science.gov (United States)

    Chamon, C.; Mucciolo, E. R.; Ruckenstein, A. E.; Yang, Z.-C.

    2017-05-01

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without `learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.
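
    The thermal-annealing idea in the two records above can be illustrated on a toy system: cool a small ferromagnetic spin chain (a stand-in, assumed here for simplicity, for the far more structured vertex model) and watch it relax toward its ground state.

```python
import math
import random

# Toy simulated annealing on a 1D ferromagnetic spin chain (illustrative
# stand-in for the paper's vertex model; not the actual construction).
def energy(spins):
    """Ferromagnetic coupling: aligned neighbours lower the energy."""
    return -sum(spins[i] * spins[i + 1] for i in range(len(spins) - 1))

random.seed(1)
spins = [random.choice((-1, 1)) for _ in range(20)]
T = 2.0
while T > 0.01:
    i = random.randrange(len(spins))
    old = energy(spins)
    spins[i] *= -1                               # propose a single spin flip
    new = energy(spins)
    if new > old and random.random() > math.exp((old - new) / T):
        spins[i] *= -1                           # reject uphill move
    T *= 0.999                                   # cool slowly

print(energy(spins))   # ground-state energy for 20 spins is -19
```

    Slow cooling lets the chain shed domain walls and approach the ground state, mirroring how the vertex model encodes the computation's solution in its ground state and reflects its complexity in the relaxation dynamics.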

  13. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

    Full Text Available Based on the facts that a computer can be infected by both infected and exposed computers, and that some computers in susceptible or exposed status can gain immunity through antivirus ability, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the spreading of computer viruses on the Internet, is determined. Second, this model has a virus-free equilibrium P0, which means that the infected part of the computers disappears and the virus dies out; P0 is a globally asymptotically stable equilibrium if R0 < 1. If R0 > 1, this model has a unique viral equilibrium P*, which means that the virus persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
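
    The threshold behavior around R0 described in this abstract can be sketched with a compartment model. The sketch below uses a simple SIS-type system with host turnover (not the paper's exact model; the parameter values are invented) to show the virus dying out when R0 < 1 and persisting at an endemic level when R0 > 1.

```python
# SIS-type sketch with host turnover rate mu (illustrative, not the
# paper's model): R0 = beta / (gamma + mu).
def final_infected(beta, gamma, mu=0.05, days=2000, dt=0.1):
    s, i = 0.99, 0.01
    for _ in range(int(days / dt)):
        ds = mu - beta * s * i + gamma * i - mu * s
        di = beta * s * i - gamma * i - mu * i
        s, i = s + dt * ds, i + dt * di          # Euler integration step
    return i

low  = final_infected(beta=0.2, gamma=0.3)   # R0 ~ 0.57 < 1: dies out
high = final_infected(beta=0.8, gamma=0.3)   # R0 ~ 2.29 > 1: endemic
print(low, high)
```

    With R0 > 1 the infected fraction settles near the endemic equilibrium 1 - 1/R0 (about 0.56 here); with R0 < 1 it decays to zero, matching the two stability regimes the abstract describes.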

  14. USING COMPUTER TECHNOLOGY TO ENHANCE AGRICULTURAL ECONOMICS TEACHING

    OpenAIRE

    Monson, Michael J.

    1995-01-01

    Computers are becoming an affordable and effective tool for assisting with classroom instruction. This paper describes experiences utilizing a hypermedia presentation system for a farm management course. Some advantages as well as drawbacks and issues associated with using microcomputer-controlled hypermedia in the classroom are presented. Hopefully, readers will find some assistance in planning the design and implementation of such techniques for their own classes.

  15. Proof of Economic Viability of Blended Learning Business Models

    Science.gov (United States)

    Druhmann, Carsten; Hohenberg, Gregor

    2014-01-01

    The discussion on economically sustainable business models with respect to information technology is lacking in many aspects of proven approaches. In the following contribution the economic viability is valued based on a procedural model for design and evaluation of e-learning business models in the form of a case study. As a case study object a…

  16. The IceCube Computing Infrastructure Model

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Besides the big LHC experiments, a number of mid-size experiments are coming online which need to define new computing models to meet their processing and storage requirements. We present the hybrid computing model of IceCube, which leverages GRID models with a more flexible direct user model, as an example of a possible solution. In IceCube, a central datacenter at UW-Madison serves as Tier-0, with a single Tier-1 datacenter at DESY Zeuthen. We describe the setup of the IceCube computing infrastructure and report on our experience in successfully provisioning the IceCube computing needs.

  17. Computational Models for Nonlinear Aeroelastic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  18. New Perspectives on Economic Modeling for Digital Curation

    DEFF Research Database (Denmark)

    Grindley, Neil; Kejser, Ulla Bøgvad; L'Hours, Hervé

    2014-01-01

    models that represent different aspects of the economic lifecycle based around curation. The framework includes a sustainability model, a cost and benefit model, a business model, and a cost model. The framework provides a common vocabulary and clarifies the roles and responsibilities of managers...... “Collaboration to Clarify the Cost of Curation”, which is bringing together and bridging existing knowledge, models and tools to create a better understanding of the economics of curation....

  19. Computational nanophotonics modeling and applications

    CERN Document Server

    Musa, Sarhan M

    2013-01-01

    This reference offers tools for engineers, scientists, biologists, and others working with the computational techniques of nanophotonics. It introduces the key concepts of computational methods in a manner that is easily digestible for newcomers to the field. The book also examines future applications of nanophotonics in the technical industry and covers new developments and interdisciplinary research in engineering, science, and medicine. It provides an overview of the key computational nanophotonics and describes the technologies with an emphasis on how they work and their key benefits.

  20. Building bridges between perceptual and economic decision-making: neural and computational mechanisms

    Directory of Open Access Journals (Sweden)

    Christopher eSummerfield

    2012-05-01

    Full Text Available Investigation into the neural and computational bases of decision-making has proceeded in two parallel but distinct streams. Perceptual decision making (PDM) is concerned with how observers detect, discriminate and categorise noisy sensory information. Economic decision making (EDM) explores how options are selected on the basis of their reinforcement history. Traditionally, the subfields of PDM and EDM have employed different paradigms, proposed different mechanistic models, explored different brain regions, and disagreed about whether decisions approach optimality. Nevertheless, we argue that there is a common framework for understanding decisions made in both domains, under which an agent has to combine sensory information (what is the stimulus) with value information (what is it worth). We review computational models of the decision process typically used in PDM, based around the idea that decisions involve a serial integration of evidence, and assess their applicability to decisions between goods and gambles. Subsequently, we consider the contribution of three key brain regions – the parietal cortex, the basal ganglia, and the orbitofrontal cortex – to perceptual and economic decision-making, with a focus on the mechanisms by which sensory and reward information are integrated during choice. We find that although the parietal cortex is often implicated in the integration of sensory evidence, there is also evidence for its role in encoding the expected value of a decision. Similarly, although much research has emphasised the role of the striatum and orbitofrontal cortex in value-guided choices, they may also play an important role in the categorisation of perceptual information. In conclusion, we consider how findings from the two fields might be brought together, in order to move towards a general framework for understanding decision-making in humans and other primates.
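
    The serial evidence-integration models mentioned above are typified by the drift-diffusion model, which can be sketched in a few lines. The drift rate, noise level, and bound below are arbitrary illustrative values, not parameters from any study reviewed here.

```python
import random

# Drift-diffusion sketch: accumulate noisy evidence until a bound is hit
# (illustrative parameters only).
def ddm_trial(drift, bound=1.0, noise=1.0, dt=0.001):
    """Return (choice, reaction_time) for one simulated decision."""
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + noise * (dt ** 0.5) * random.gauss(0, 1)
        t += dt
    return (1 if x > 0 else 0), t

random.seed(0)
trials = [ddm_trial(drift=1.5) for _ in range(500)]
accuracy = sum(choice for choice, _ in trials) / len(trials)
print(accuracy)   # positive drift biases choices toward the upper bound
```

    The same accumulator can serve for economic choices by letting the drift reflect a value difference rather than sensory evidence, which is one concrete sense in which the two fields share a framework.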

  1. Using the Economic Balance Model to Teach Supply-Side and Demand-Side Economics.

    Science.gov (United States)

    Pisciotta, John

    1983-01-01

    The Economic Balance model can be used in secondary economics classes to show demand- and supply-sides of the overall economy as well as how the two sides influence each other. Demand-side approaches to recession and inflation and supply-side approaches to expansion of production capacity and inflation are discussed. (AM)

  2. Conceptual model of management steadfast economic development production-economic systems

    OpenAIRE

    Prokhorova, V.

    2010-01-01

    The article is devoted to the development of a conceptual model for managing the sustainable economic development of industrial-economic systems. Its features are defined, an impulse algorithm is offered, and the interconnection of the management contours for the sustainable economic development of industrial-economic systems is investigated.

  3. Projecting global tropical cyclone economic damages with validation of tropical cyclone economic damage model

    Science.gov (United States)

    Iseri, Y.; Iwasaki, A.; Miyazaki, C.; Kanae, S.

    2014-12-01

    Tropical cyclones (TCs) sometimes cause serious damage to human society, and possible future changes in TC properties are therefore a concern. In fact, the Fifth Assessment Report (AR5) of the IPCC (Intergovernmental Panel on Climate Change) notes a likely increase in the intensity and rain rate of TCs. In addition, future changes in socioeconomic conditions (e.g. population growth) might worsen TC impacts. In this study, we therefore developed regression models to estimate economic damages caused by TCs (hereafter, TC damage models) and employed those models to project TC economic damages under several future climate and socioeconomic scenarios. We developed TC damage models for each of four regions: western North Pacific, North American, North Indian, and Southern Hemisphere. The inputs for the TC damage model are tropical cyclone central pressure, population in the area exposed to tropical cyclone winds, and GDP (Gross Domestic Product) per capita. The TC damage models we first developed tended to overestimate very low damages and underestimate very high damages. We therefore modified the structure of the TC damage models to improve performance and then carried out extensive validation of the models. The modified models performed better in estimating very low and very high TC damages. After this modification and validation, we fixed the structure of the TC damage models and projected TC economic damages. The results indicate an increase in TC economic damage at the global scale, while TC economic damage as a share of world GDP would decrease in the future, a result consistent with previous studies.
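
    The kind of damage model described (damage as a function of central pressure, exposed population, and GDP per capita) can be sketched as a log-linear regression. The data below are synthetic and the coefficients invented, purely to illustrate the model structure, not the paper's estimates.

```python
import numpy as np

# Log-linear TC damage regression sketch on synthetic data (all numbers
# are assumptions for illustration only).
rng = np.random.default_rng(42)
n = 200
pressure  = rng.uniform(900, 1000, n)   # central pressure in hPa (lower = stronger)
log_pop   = rng.uniform(10, 16, n)      # log of exposed population
log_gdppc = rng.uniform(7, 11, n)       # log of GDP per capita

# Synthetic "true" relation: damage rises as pressure falls and exposure rises
log_damage = (60 - 0.06 * pressure + 0.8 * log_pop + 0.5 * log_gdppc
              + rng.normal(0, 0.5, n))

# Ordinary least squares fit of the regression coefficients
X = np.column_stack([np.ones(n), pressure, log_pop, log_gdppc])
coef, *_ = np.linalg.lstsq(X, log_damage, rcond=None)
print(coef)   # recovered coefficients approximate the generating ones
```

    Projecting damages then amounts to feeding the fitted model with TC intensities from climate scenarios and exposure/GDP values from socioeconomic scenarios.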

  4. Climate Ocean Modeling on Parallel Computers

    Science.gov (United States)

    Wang, P.; Cheng, B. N.; Chao, Y.

    1998-01-01

    Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.

  5. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, this thesis deals with life insurance where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's

  6. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  7. Applications of computer modeling to fusion research

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling.

  8. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  9. Modeling the Economic Behavior of Households within the Context of Development of Economic Thought

    Directory of Open Access Journals (Sweden)

    Ivanov Roman V.

    2017-04-01

    Full Text Available The main purpose of the publication is to study the formation of household economic behavior modeling in the context of the development of economic thought and of methods of economic-mathematical modeling. The study was carried out under the assumption that, when studying the development of the theoretical and methodological foundations of household economic behavior, one must take into account not only the history of economic theory but also the transformation of attitudes in other areas of human knowledge, in particular the paradigm shifts in scientific thinking. It is noted that the widespread use of mathematical methods in economics is associated with the formation of marginal theory and, at the same time, with the proliferation of marginal analysis. At the present stage, the economic behavior of households is analyzed in terms of concepts such as neoclassicism, institutionalism and behaviorism. By distinguishing the concepts of «individual» and «household», it can be argued that it is institutionalism, in conjunction with a synergetic approach, that provides the basis for elaborating strategies of household economic behavior that ensure households’ economic security.

  10. Modelling management process of key drivers for economic sustainability in the modern conditions of economic development

    Directory of Open Access Journals (Sweden)

    Pishchulina E.S.

    2017-01-01

    Full Text Available The text addresses the management of drivers of manufacturing enterprise economic sustainability, and the assessment of manufacturing enterprise sustainability as the key aspect of managing enterprise economic sustainability. These issues become topical as new requirements for methods of managing manufacturing enterprises arise in the modern conditions of a market economy. The economic sustainability model considered in the article integrates enterprise economic growth, the economic balance of the external and internal environment, and economic sustainability. The method proposed in the study for assessing the economic sustainability of a manufacturing enterprise reveals weaknesses in enterprise performance as well as untapped reserves, which can then be used to improve the economic sustainability and efficiency of the enterprise. Managing manufacturing enterprise economic sustainability is one of the most important factors of business functioning and development in a modern market economy. The relevance of this trend is increasing in line with the objective requirements of growing volumes of production and sales, the increasing complexity of economic relations, and the changing external environment of an enterprise.

  11. CAN “THE IDEAL” ECONOMIC MODELS BE CONSIDERED COMPREHENSIBLE?!

    Directory of Open Access Journals (Sweden)

    A.A. Kuklin

    2006-03-01

    Full Text Available This article opens a cycle of the authors’ publications on theoretical approaches to, and representations of, the development of economic systems. A digression on the origin and development of socio-economic formations is given. An attempt is made to consider the transformation of economic systems with allowance for latent dynamic characteristics, and to reveal the structural-genetic and functional laws governing the formation of a changing economy. Models of the transformation of economic systems, and their influence on the territorial economic security of different levels, are also considered.

  12. Is our economic model compatible with sustainable development?

    DEFF Research Database (Denmark)

    Røpke, Inge

    2003-01-01

    The paper concerns the contradictions between sustainability and the present economic growth model. The discussion relates to the work of Jan Otto Andersson.

  13. Economic modeling and energy policy planning. [technology transfer, market research

    Science.gov (United States)

    Thompson, R. G.; Schwartz, A., Jr.; Lievano, R. J.; Stone, J. C.

    1974-01-01

    A structural economic model is presented for estimating the demand functions for natural gas and crude oil in industry and in steam electric power generation. Extensions of the model to other commodities are indicated.

  14. Model Railroading and Computer Fundamentals

    Science.gov (United States)

    McCormick, John W.

    2007-01-01

    Less than one half of one percent of all processors manufactured today end up in computers. The rest are embedded in other devices such as automobiles, airplanes, trains, satellites, and nearly every modern electronic device. Developing software for embedded systems requires a greater knowledge of hardware than developing for a typical desktop…

  15. Computer models of concrete structures

    OpenAIRE

    Cervenka, Vladimir; Eligehausen, Rolf; Pukl, Radomir

    1991-01-01

    The application of the nonlinear finite element analysis of concrete structures as a design tool is discussed. A computer program for structures in plane stress state is described and examples of its application in the research of fastening technique and in engineering practice are shown.

  16. Economic evaluation in chronic pain: a systematic review and de novo flexible economic model.

    Science.gov (United States)

    Sullivan, W; Hirst, M; Beard, S; Gladwell, D; Fagnani, F; López Bastida, J; Phillips, C; Dunlop, W C N

    2016-07-01

    There is unmet need in patients suffering from chronic pain, yet innovation may be impeded by the difficulty of justifying economic value in a field beset by data limitations and methodological variability. A systematic review was conducted to identify and summarise the key areas of variability and limitations in modelling approaches in the economic evaluation of treatments for chronic pain. The results of the literature review were then used to support the development of a fully flexible open-source economic model structure, designed to test structural and data assumptions and act as a reference for future modelling practice. The key model design themes identified from the systematic review included: time horizon; titration and stabilisation; number of treatment lines; choice/ordering of treatment; and the impact of parameter uncertainty (given reliance on expert opinion). Exploratory analyses using the model to compare a hypothetical novel therapy versus morphine as first-line treatments showed cost-effectiveness results to be sensitive to structural and data assumptions. Assumptions about the treatment pathway and choice of time horizon were key model drivers. Our results suggest structural model design and data assumptions may have driven previous cost-effectiveness results and ultimately decisions based on economic value. We therefore conclude that it is vital that future economic models in chronic pain be fully transparent, and we hope our open-source code will support a common approach to modelling pain that includes robust sensitivity analyses to test structural and parameter uncertainty.
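
As a toy illustration of the one-way sensitivity analysis the review calls for, the following sketch varies each parameter of a minimal cost-effectiveness calculation by ±10% and reports the resulting range of the incremental cost-effectiveness ratio. All costs and QALY values are hypothetical, not taken from the paper's model.

```python
# One-way sensitivity analysis on a minimal cost-effectiveness model.
# All parameter values are hypothetical, for illustration only.

def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

base = dict(cost_new=12000.0, cost_old=8000.0, qaly_new=6.0, qaly_old=5.0)
base_icer = icer(**base)  # (12000 - 8000) / (6 - 5) = 4000 per QALY

# Vary each parameter +/-10% around its base value, holding the others fixed.
results = {}
for name, value in base.items():
    lo, hi = dict(base), dict(base)
    lo[name] = value * 0.9
    hi[name] = value * 1.1
    results[name] = (icer(**lo), icer(**hi))

# Print parameters from most to least influential (tornado-diagram order).
for name, (low, high) in sorted(results.items(),
                                key=lambda kv: -abs(kv[1][1] - kv[1][0])):
    print(f"{name:10s} {low:10.0f} .. {high:10.0f}")
```

Sorting the parameters by the width of the resulting ICER interval reproduces the familiar tornado-diagram ranking of model drivers.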

  17. [Decision modeling for economic evaluation of health technologies].

    Science.gov (United States)

    de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh

    2014-10-01

    Most economic evaluations that participate in decision-making processes for incorporation and financing of technologies of health systems use decision models to assess the costs and benefits of the compared strategies. Despite the large number of economic evaluations conducted in Brazil, there is a pressing need to conduct an in-depth methodological study of the types of decision models and their applicability in our setting. The objective of this literature review is to contribute to the knowledge and use of decision models in the national context of economic evaluations of health technologies. This article presents general definitions about models and concerns with their use; it describes the main models: decision trees, Markov chains, micro-simulation, simulation of discrete and dynamic events; it discusses the elements involved in the choice of model; and exemplifies the models addressed in national economic evaluation studies of diagnostic and therapeutic preventive technologies and health programs.
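
The Markov-chain structure mentioned among the main model types can be sketched in a few lines as a cohort simulation; the states, transition probabilities, costs and utilities below are invented for illustration.

```python
# Minimal three-state Markov cohort model (Well, Sick, Dead), the kind of
# decision model used in health-technology economic evaluations.
# Transition probabilities, costs and utilities are hypothetical.

states = ["Well", "Sick", "Dead"]
# Annual transition probabilities: rows sum to 1.
P = [
    [0.85, 0.10, 0.05],  # from Well
    [0.00, 0.70, 0.30],  # from Sick
    [0.00, 0.00, 1.00],  # Dead is absorbing
]
cost = [100.0, 2000.0, 0.0]   # annual cost per person in each state
utility = [0.9, 0.5, 0.0]     # annual QALY weight in each state

def run_cohort(n_years, discount=0.03):
    dist = [1.0, 0.0, 0.0]    # everyone starts in the Well state
    total_cost = total_qaly = 0.0
    for year in range(n_years):
        d = 1.0 / (1.0 + discount) ** year       # discount factor
        total_cost += d * sum(p * c for p, c in zip(dist, cost))
        total_qaly += d * sum(p * u for p, u in zip(dist, utility))
        # Advance the cohort distribution one cycle: dist = dist * P.
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return total_cost, total_qaly

c, q = run_cohort(20)
print(f"discounted cost {c:.0f}, discounted QALYs {q:.2f}")
```

Running two such cohorts with different transition matrices (treatment vs. comparator) and differencing the totals gives the incremental costs and benefits a decision model reports.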

  18. Mathematical model comparing of the multi-level economics systems

    Science.gov (United States)

    Brykalov, S. M.; Kryanev, A. V.

    2017-12-01

    A mathematical model (scheme) for the multi-level comparison of economic systems, each characterized by a system of indices, is worked out. In this model, expert assessments and forecasts of the indicators of the economic system under consideration can be used, and uncertainty in the estimated parameter values or expert estimates can be taken into account. The model uses a multi-criteria approach based on Pareto solutions.
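
The Pareto-based multi-criteria comparison described in the abstract can be illustrated with a short sketch. The indicator vectors below are hypothetical, and the sketch assumes all indicators are "higher is better".

```python
# Pareto comparison of economic systems described by indicator vectors.
# Indicators are "higher is better"; the data below are made up.

def dominates(a, b):
    """a dominates b if a is at least as good in every indicator and
    strictly better in at least one."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_set(systems):
    """Return the names of systems not dominated by any other system."""
    return [name for name, a in systems.items()
            if not any(dominates(b, a) for other, b in systems.items()
                       if other != name)]

# Hypothetical multi-level indicators: (growth, stability, employment).
systems = {
    "A": (2.1, 0.8, 0.95),
    "B": (1.5, 0.9, 0.90),
    "C": (1.4, 0.7, 0.85),   # dominated by B in all three indicators
    "D": (2.5, 0.6, 0.80),
}
print(pareto_set(systems))  # → ['A', 'B', 'D']
```

The systems in the Pareto set are mutually incomparable: choosing among them requires additional preference information, which is exactly where expert estimates enter such a scheme.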

  19. A Categorisation of Cloud Computing Business Models

    OpenAIRE

    Chang, V; Bacigalupo, D; Wills, G; De Roure, D

    2010-01-01

    This paper reviews current cloud computing business models and presents proposals on how organisations can achieve sustainability by adopting appropriate models. We classify cloud computing business models into eight types: (1) Service Provider and Service Orientation; (2) Support and Services Contracts; (3) In-House Private Clouds; (4) All-In-One Enterprise Cloud; (5) One-Stop Resources and Services; (6) Government funding; (7) Venture Capitals; and (8) Entertainment and Social Networking. U...

  20. Computable general equilibrium model fiscal year 2013 capability development report

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rivera, Michael Kelly [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Boero, Riccardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-17

    This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC extended the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine its simulation properties in more detail.

  1. computer modeling of platinum reforming

    African Journals Online (AJOL)

    eobe

    Usually, the composition of the reformate leaving any stage is assessed by laboratory analysis, which in most cases is avoided because of long ... This approach has considered a computer model as a means of ... bed reactors in platforming ...

  2. Modeling and simulation the computer science of illusion

    CERN Document Server

    Raczynski, Stanislaw

    2006-01-01

    Simulation is the art of using tools - physical or conceptual models, or computer hardware and software, to attempt to create the illusion of reality. The discipline has in recent years expanded to include the modelling of systems that rely on human factors and therefore possess a large proportion of uncertainty, such as social, economic or commercial systems. These new applications make the discipline of modelling and simulation a field of dynamic growth and new research. Stanislaw Raczynski outlines the considerable and promising research that is being conducted to counter the problems of

  3. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows researchers to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: Model establishment under uncertainty; Model selection and parameter fitting; Sensitivity analysis and model adaptation; Model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  4. Podemos prever a taxa de cambio brasileira? Evidência empírica utilizando inteligência computacional e modelos econométricos Can we forecast Brazilian exchange rates? Empirical evidences using computational intelligence and econometric models

    Directory of Open Access Journals (Sweden)

    Leandro dos Santos Coelho

    2008-12-01

    evidence that these computational intelligence models are able to provide a more accurate forecast given their capacity for capturing nonlinearities and other stylized facts of financial time series. Thus, this paper investigates the hypothesis that the mathematical models of multilayer perceptron and radial basis function neural networks (NN), and Takagi-Sugeno (TS) fuzzy systems, are able to provide a more accurate out-of-sample forecast than the traditional AutoRegressive Moving Average (ARMA) and ARMA Generalized AutoRegressive Conditional Heteroskedasticity (ARMA-GARCH) models. Using series of Brazilian exchange rate (R$/US$) returns at 15-minute, 60-minute, 120-minute, daily and weekly frequencies, the one-step-ahead forecast performance is compared. The results indicate that forecast performance is strongly related to the series' frequency, possibly due to nonlinearity effects. Besides, the forecasting evaluation shows that NN models perform better than the ARMA and ARMA-GARCH ones. In the trade strategy based on forecasts, NN models achieved higher returns when compared to a buy-and-hold strategy and to the other models considered in this study.
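
A minimal version of the one-step-ahead, out-of-sample evaluation described above can be sketched on a synthetic series. The expanding-window AR(1) fit and the naive zero-return benchmark below are simple stand-ins for the paper's ARMA and NN models, not reimplementations of them.

```python
# One-step-ahead out-of-sample forecast comparison on a synthetic return
# series. The series and the AR(1) fit are illustrative stand-ins.
import math
import random

random.seed(42)
# Synthetic AR(1) return series: r_t = 0.6 * r_{t-1} + noise.
r = [0.0]
for _ in range(500):
    r.append(0.6 * r[-1] + random.gauss(0.0, 1.0))

split = 400  # in-sample / out-of-sample split point

def ar1_coef(xs):
    """Least-squares slope of x_t on x_{t-1} (no intercept)."""
    num = sum(xs[t - 1] * xs[t] for t in range(1, len(xs)))
    den = sum(xs[t - 1] ** 2 for t in range(1, len(xs)))
    return num / den

errs_ar, errs_naive = [], []
for t in range(split, len(r)):
    phi = ar1_coef(r[:t])              # re-fit on the expanding window
    errs_ar.append((r[t] - phi * r[t - 1]) ** 2)
    errs_naive.append(r[t] ** 2)       # naive forecast: zero return

rmse = lambda es: math.sqrt(sum(es) / len(es))
print(f"AR(1) RMSE {rmse(errs_ar):.3f}  naive RMSE {rmse(errs_naive):.3f}")
```

Because the synthetic series really is autoregressive, the AR(1) forecast beats the naive benchmark; on real exchange-rate returns the gap is far smaller, which is why the paper turns to nonlinear models.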

  5. HTGR Application Economic Model Users' Manual

    Energy Technology Data Exchange (ETDEWEB)

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Application Economic Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Application Economic Model calculates either the required selling price of power and/or heat for a given internal rate of return (IRR), or the IRR for power and/or heat being sold at the market price. The user can generate these economic results for a range of reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first-of-a-kind, or nth-of-a-kind project phases; for up to 16 reactor modules; and for module ratings of 200, 350, or 600 MWt. This user's manual contains the mathematical models and operating instructions for the HTGR Application Economic Model. Instructions, screenshots, and examples are provided to guide the user through the model. It was designed for users who are familiar with the HTGR design, Excel, and engineering economics. Modification of the HTGR Application Economic Model should only be performed by users familiar with the HTGR and its applications, Excel, and Visual Basic.
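
The model's core calculation, finding the selling price that yields a target IRR, amounts to a root-find on net present value. The sketch below shows the idea with bisection; all cash-flow figures are hypothetical and are not taken from the HTGR model.

```python
# Find the selling price of power that yields a target IRR, by bisecting
# on price until NPV at the target rate is zero. Hypothetical numbers.

def npv(rate, cashflows):
    """Net present value of cashflows[t] received at year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def required_price(target_irr, capex, mwh_per_year, opex, years=30):
    """Smallest price ($/MWh) at which NPV at target_irr is zero."""
    def npv_at(price):
        cfs = [-capex] + [price * mwh_per_year - opex] * years
        return npv(target_irr, cfs)
    lo, hi = 0.0, 1000.0        # NPV is increasing in price on this bracket
    for _ in range(100):        # bisection on price
        mid = 0.5 * (lo + hi)
        if npv_at(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return hi

price = required_price(0.10, capex=2.0e9, mwh_per_year=4.5e6, opex=1.0e8)
print(f"required price: {price:.2f} $/MWh")
```

The inverse question the model also answers (the IRR at a given market price) is the same root-find with the roles of `price` and `rate` swapped.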

  6. Computational Model for Corneal Transplantation

    Science.gov (United States)

    Cabrera, Delia

    2003-10-01

    We evaluated the refractive consequences of corneal transplants using a biomechanical model with homogeneous and inhomogeneous Young's modulus distributions within the cornea, taking into account ablation of some stromal tissue. A FEM model was used to simulate corneal transplants in diseased cornea. The diseased cornea was modeled as an axisymmetric structure taking into account a nonlinearly elastic, isotropic formulation. The model simulating the penetrating keratoplasty procedure gives more change in the postoperative corneal curvature when compared to the models simulating the anterior and posterior lamellar graft procedures. When a lenticle shaped tissue was ablated in the graft during the anterior and posterior keratoplasty, the models provided an additional correction of about -3.85 and -4.45 diopters, respectively. Despite the controversy around the corneal thinning disorders treatment with volume removal procedures, results indicate that significant changes in corneal refractive power could be introduced by a corneal transplantation combined with myopic laser ablation.

  7. Modelling Dynamics of Main Economic Indicators of an Enterprise

    Directory of Open Access Journals (Sweden)

    Sherstennykov Yurii V.

    2014-03-01

    Full Text Available The article develops an economic-mathematical model of the dynamics of an enterprise's main economic indicators, reflected in six book-keeping accounts, with consideration of logistics and of the interrelation with current market characteristics and the needs of product consumers. The model is applied to a quantitative study of the influence of an advertising campaign and of seasonality on the indicators of the enterprise's economic activity. The enterprise operation programme includes internal financial and economic procedures, which ensure the production process, as well as connections with suppliers and buyers (customers). By setting different initial conditions, it is possible to trace transitional processes and the enterprise's entry (under favourable conditions) into a stationary mode of operation, or its shutdown (in case of insufficient circulating funds). The developed model contains many parameters, which allow not only studying the dependence of enterprise operation on each of them, but also optimising the economic conditions of its functioning.
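
A toy discrete-time version of this idea tracks a couple of enterprise indicators period by period, with demand boosted by an advertising campaign and modulated by seasonality. Every parameter below is invented for illustration; the article's six-account structure is collapsed to cash and inventory.

```python
# Toy enterprise dynamics: cash and inventory under seasonal demand and
# a temporary advertising campaign. All parameters are hypothetical.
import math

def simulate(periods=52, ad_start=10, ad_len=8):
    cash, inventory = 100.0, 50.0
    history = []
    for t in range(periods):
        season = 1.0 + 0.3 * math.sin(2 * math.pi * t / 52)  # yearly cycle
        ad_boost = 1.5 if ad_start <= t < ad_start + ad_len else 1.0
        demand = 20.0 * season * ad_boost
        sales = min(demand, inventory)          # cannot sell more than stock
        price, unit_cost = 5.0, 3.0
        ad_cost = 15.0 if ad_boost > 1.0 else 0.0
        production = 22.0 if cash > 0.0 else 0.0  # halt if out of funds
        cash += price * sales - unit_cost * production - ad_cost
        inventory += production - sales
        history.append((t, cash, inventory, sales))
    return history

hist = simulate()
final_cash = hist[-1][1]
print(f"final cash after a year: {final_cash:.1f}")
```

Varying `ad_start`, `ad_len` or the initial cash reproduces, in miniature, the article's experiments: tracing the transition into a stationary operating mode or, with insufficient circulating funds, a shutdown.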

  8. Computer modeling of human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    Models of human decision making are reviewed. Models which treat just the cognitive aspects of human behavior are included as well as models which include motivation. Both models which have associated computer programs, and those that do not, are considered. Since flow diagrams, that assist in constructing computer simulation of such models, were not generally available, such diagrams were constructed and are presented. The result provides a rich source of information, which can aid in construction of more realistic future simulations of human decision making.

  9. Models of economic growth and development in the context of ...

    African Journals Online (AJOL)

    Countries like New Zealand, Iceland, and Denmark offer evidence of this. Models of African development such as the Lagos Plan of Action in terms of the whole continent are discussed within the context of existing impediments to such progress. Keywords: economic growth, economic development, human capital, growth ...

  10. Models of economic geography : dynamics, estimation and policy evaluation

    NARCIS (Netherlands)

    Knaap, Thijs

    2004-01-01

    In this thesis we look at economic geography models from a number of angles. We started by placing the theory in a context of preceding theories, both earlier work on spatial economics and other children of the monopolistic competition ‘revolution.’ Next, we looked at the theoretical properties of

  11. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
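
One way to make the "combine model aspects" idea concrete is a weighted combination of set similarities over two aspects: the entities a model references and its network edges. The weights and the tiny example models below are assumptions chosen purely for illustration.

```python
# Sketch of an aspect-combining model similarity: Jaccard score over the
# referenced entities plus Jaccard score over network edges, with
# invented weights. The two example "models" are made up.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def model_similarity(m1, m2, w_entities=0.5, w_network=0.5):
    """Weighted combination of entity-level and structure-level similarity."""
    return (w_entities * jaccard(m1["entities"], m2["entities"])
            + w_network * jaccard(m1["edges"], m2["edges"]))

model_a = {
    "entities": {"glucose", "ATP", "pyruvate", "NADH"},
    "edges": {("glucose", "pyruvate"), ("ATP", "glucose")},
}
model_b = {
    "entities": {"glucose", "ATP", "pyruvate"},
    "edges": {("glucose", "pyruvate")},
}
sim = model_similarity(model_a, model_b)
print(f"similarity: {sim:.3f}")  # → 0.625
```

Swapping the weights, or adding further aspect scores (equations, parameters, dynamics), changes the measure without changing its structure, which is the flexibility the authors argue for.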

  12. Computational fluid dynamics modeling in yarn engineering

    CSIR Research Space (South Africa)

    Patanaik, A

    2011-07-01

    Full Text Available This chapter deals with the application of computational fluid dynamics (CFD) modeling in reducing yarn hairiness during the ring spinning process and thereby “engineering” yarn with desired properties. Hairiness significantly affects the appearance...

  13. A new epidemic model of computer viruses

    Science.gov (United States)

    Yang, Lu-Xing; Yang, Xiaofan

    2014-06-01

    This paper addresses the epidemiological modeling of computer viruses. By incorporating the effect of removable storage media, considering the possibility of connecting infected computers to the Internet, and removing the conservative restriction on the total number of computers connected to the Internet, a new epidemic model is proposed. Unlike most previous models, the proposed model has no virus-free equilibrium and has a unique endemic equilibrium. With the aid of the theory of asymptotically autonomous systems as well as the generalized Poincare-Bendixson theorem, the endemic equilibrium is shown to be globally asymptotically stable. By analyzing the influence of different system parameters on the steady number of infected computers, a collection of policies is recommended to prohibit the virus prevalence.
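
The qualitative behaviour described here, no virus-free equilibrium and a unique, globally stable endemic equilibrium once media-borne infection is present, can be reproduced with a simple Euler-integrated sketch. The single equation and its rate constants are an invented simplification, not the paper's exact system.

```python
# Euler-integrated sketch of an SIS-style computer-virus model with an
# extra infection route representing removable storage media.
# Rate constants are invented for illustration.

def simulate(beta=0.3, theta=0.05, gamma=0.2, i0=0.01, dt=0.01, t_end=200.0):
    """Integrate i' = beta*i*(1-i) + theta*(1-i) - gamma*i, where i is the
    fraction of infected computers, beta the network contact rate, theta
    the removable-media infection rate, and gamma the cure rate."""
    i = i0
    for _ in range(int(t_end / dt)):
        di = beta * i * (1.0 - i) + theta * (1.0 - i) - gamma * i
        i += dt * di
    return i

i_inf = simulate()
print(f"endemic fraction of infected computers: {i_inf:.3f}")  # ≈ 0.5
```

Because `theta > 0` keeps infecting even a clean population, `i = 0` is not an equilibrium, and trajectories started from very different initial fractions converge to the same endemic level, mirroring the global stability result in the abstract.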

  14. Computational Models for Nonlinear Aeroelastic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate a new and efficient computational method of modeling nonlinear aeroelastic systems. The...

  15. Two Computer Programs for Equipment Cost Estimation and Economic Evaluation of Chemical Processes.

    Science.gov (United States)

    Kuri, Carlos J.; Corripio, Armando B.

    1984-01-01

    Describes two computer programs for use in process design courses: an easy-to-use equipment cost estimation program based on latest cost correlations available and an economic evaluation program which calculates two profitability indices. Comparisons between programed and hand-calculated results are included. (JM)

  16. Predictive Models and Computational Embryology

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  17. Enhanced absorption cycle computer model

    Science.gov (United States)

    Grossman, G.; Wilk, M.

    1993-09-01

    Absorption heat pumps have received renewed and increasing attention in the past two decades. The rising cost of electricity has made the particular features of this heat-powered cycle attractive for both residential and industrial applications. Solar-powered absorption chillers, gas-fired domestic heat pumps, and waste-heat-powered industrial temperature boosters are a few of the applications recently subjected to intensive research and development. The absorption heat pump research community has begun to search for both advanced cycles in various multistage configurations and new working fluid combinations with potential for enhanced performance and reliability. The development of working absorption systems has created a need for reliable and effective system simulations. A computer code has been developed for simulation of absorption systems at steady state in a flexible and modular form, making it possible to investigate various cycle configurations with different working fluids. The code is based on unit subroutines containing the governing equations for the system's components and property subroutines containing thermodynamic properties of the working fluids. The user conveys to the computer an image of his cycle by specifying the different subunits and their interconnections. Based on this information, the program calculates the temperature, flow rate, concentration, pressure, and vapor fraction at each state point in the system, and the heat duty at each unit, from which the coefficient of performance (COP) may be determined. This report describes the code and its operation, including improvements introduced into the present version. Simulation results are described for LiBr-H2O triple-effect cycles, LiCl-H2O solar-powered open absorption cycles, and NH3-H2O single-effect and generator-absorber heat exchange cycles. An appendix contains the user's manual.

  18. Computer Model Locates Environmental Hazards

    Science.gov (United States)

    2008-01-01

    Catherine Huybrechts Burton founded San Francisco-based Endpoint Environmental (2E) LLC in 2005 while she was a student intern and project manager at Ames Research Center with NASA's DEVELOP program. The 2E team created the Tire Identification from Reflectance model, which algorithmically processes satellite images using turnkey technology to retain only the darkest parts of an image. This model allows 2E to locate piles of rubber tires, which often are stockpiled illegally and cause hazardous environmental conditions and fires.

  19. Moving forward socio-economically focused models of deforestation.

    Science.gov (United States)

    Dezécache, Camille; Salles, Jean-Michel; Vieilledent, Ghislain; Hérault, Bruno

    2017-09-01

    Whilst high-resolution spatial variables contribute to a good fit of spatially explicit deforestation models, socio-economic processes are often beyond the scope of these models. Such a low level of interest in the socio-economic dimension of deforestation limits the relevancy of these models for decision-making and may be the cause of their failure to accurately predict observed deforestation trends in the medium term. This study aims to propose a flexible methodology for taking into account multiple drivers of deforestation in tropical forested areas, where the intensity of deforestation is explicitly predicted based on socio-economic variables. By coupling a model of deforestation location based on spatial environmental variables with several sub-models of deforestation intensity based on socio-economic variables, we were able to create a map of predicted deforestation over the period 2001-2014 in French Guiana. This map was compared to a reference map for accuracy assessment, not only at the pixel scale but also over cells ranging from 1 to approximately 600 sq. km. Highly significant relationships were explicitly established between deforestation intensity and several socio-economic variables: population growth, the amount of agricultural subsidies, and gold and wood production. Such a precise characterization of socio-economic processes makes it possible to avoid overestimation biases in high deforestation areas, suggesting a better integration of socio-economic processes in the models. Whilst considering deforestation as a purely geographical process produces conservative models unable to effectively assess changes in the socio-economic and political contexts influencing deforestation trends, this explicit characterization of the socio-economic dimension of deforestation is critical for the creation of deforestation scenarios in REDD+ projects. © 2017 John Wiley & Sons Ltd.

  20. A Computational Framework for Realistic Retina Modeling.

    Science.gov (United States)

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.

  1. Model to Implement Virtual Computing Labs via Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Washington Luna Encalada

    2017-07-01

    Full Text Available In recent years, we have seen a significant number of new technological ideas appearing in literature discussing the future of education. For example, E-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in educational literature. One of the greatest challenges presented to e-learning solutions is the reproduction of the benefits of an educational institution’s physical laboratory. For a university without a computing lab, to obtain hands-on IT training with software, operating systems, networks, servers, storage, and cloud computing similar to that which could be received on a university campus computing lab, it is necessary to use a combination of technological tools. Such teaching tools must promote the transmission of knowledge, encourage interaction and collaboration, and ensure students obtain valuable hands-on experience. That, in turn, allows the universities to focus more on teaching and research activities than on the implementation and configuration of complex physical systems. In this article, we present a model for implementing ecosystems which allow universities to teach practical Information Technology (IT) skills. The model utilizes what is called a “social cloud”, which utilizes all cloud computing services, such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Additionally, it integrates the cloud learning aspects of a MOOC and several aspects of social networking and support. Social clouds have striking benefits such as centrality, ease of use, scalability, and ubiquity, providing a superior learning environment when compared to that of a simple physical lab. The proposed model allows students to foster all the educational pillars such as learning to know, learning to be, learning

  2. Decision modeling for economic evaluation of health technologies

    National Research Council Canada - National Science Library

    de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh

    2014-01-01

    Most economic evaluations that participate in decision-making processes for incorporation and financing of technologies of health systems use decision models to assess the costs and benefits of the compared strategies...

  3. Computer Modeling of Direct Metal Laser Sintering

    Science.gov (United States)

    Cross, Matthew

    2014-01-01

    A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with imbedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.

  4. Visual and Computational Modelling of Minority Games

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2017-02-01

    Full Text Available The paper analyses the Minority Game and focuses on the analysis and computational modelling of several variants (variable payoff, coalition-based, and ternary voting) of the Minority Game using the UAREI (User-Action-Rule-Entities-Interface) model. UAREI is a model for the formal specification of software gamification, and the UAREI visual modelling language is a language used for graphical representation of game mechanics. The UAREI model also provides the embedded executable modelling framework to evaluate how the rules of the game will work for the players in practice. We demonstrate the flexibility of the UAREI model for modelling different variants of Minority Game rules for game design.
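The core mechanic of the Minority Game described above can be sketched in a few lines of Python. This is a minimal illustration of the standard game (an odd number of agents with fixed random strategy tables, scored against the minority outcome), not the UAREI formalism or the paper's variants:

```python
import random

def play_minority_game(n_agents=11, memory=3, n_strategies=2, rounds=200, seed=1):
    """Minimal Minority Game: an odd number of agents repeatedly pick side
    0 or 1; agents on the minority side win. Each agent holds a few fixed
    random strategies (lookup tables over the recent outcome history) and
    plays its currently best-scoring one."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    # A strategy maps each possible encoded history -> choice in {0, 1}.
    agents = [[tuple(rng.randint(0, 1) for _ in range(n_hist))
               for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = rng.randrange(n_hist)          # encoded recent outcomes
    attendance = []                          # number choosing side 1 each round
    for _ in range(rounds):
        choices = []
        for a in range(n_agents):
            best = max(range(n_strategies), key=lambda s: scores[a][s])
            choices.append(agents[a][best][history])
        ones = sum(choices)
        minority = 1 if ones < n_agents - ones else 0
        for a in range(n_agents):            # virtual scoring of all strategies
            for s in range(n_strategies):
                if agents[a][s][history] == minority:
                    scores[a][s] += 1
        history = ((history << 1) | minority) % n_hist
        attendance.append(ones)
    return attendance
```

The attendance series is the usual object of study: its fluctuations around n_agents/2 measure how efficiently the agent population coordinates.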

  5. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Likewise, ships and buildings are built by naval and civil architects. While these are useful, they are, in most cases, static models. We are ... The basic theory of transition from one state to another was developed by the Russian mathematician Andrei Markov, and hence the name Markov chains. Andrei Markov [1856-1922] ...
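The Markov-chain idea mentioned in the excerpt (transitions between states governed by fixed probabilities) can be illustrated with a short sketch that power-iterates a transition matrix to its stationary distribution; this is a generic example, not code from the article:

```python
def stationary_distribution(P, iters=1000):
    """Approximate the stationary distribution of a Markov chain by
    repeatedly multiplying a uniform start vector by the row-stochastic
    transition matrix P (given as a list of lists)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        # pi_j <- sum_i pi_i * P[i][j]  (one step of the chain)
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi
```

For a two-state "weather" chain with P = [[0.9, 0.1], [0.5, 0.5]], the iteration settles on the exact stationary distribution [5/6, 1/6].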

  6. Computational aspects of premixing modelling

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, D.F. [Sydney Univ., NSW (Australia). Dept. of Chemical Engineering; Witt, P.J.

    1998-01-01

    In the steam explosion research field there is currently considerable effort being devoted to the modelling of premixing. Practically all models are based on the multiphase flow equations, which treat the mixture as an interpenetrating continuum. Solution of these equations is non-trivial and a wide range of solution procedures are in use. This paper addresses some numerical aspects of this problem. In particular, we examine the effect of the differencing scheme for the convective terms and show that use of hybrid differencing can cause qualitatively wrong solutions in some situations. Calculations are performed for the Oxford tests, the BNL tests, a MAGICO test and to investigate various sensitivities of the solution. In addition, we show that use of a staggered grid can result in a significant error which leads to poor predictions of 'melt' front motion. A correction is given which leads to excellent convergence to the analytic solution. Finally, we discuss the issues facing premixing model developers and highlight the fact that model validation is hampered more by the complexity of the process than by numerical issues. (author)

  7. Computational Modeling of Culture's Consequences

    NARCIS (Netherlands)

    Hofstede, G.J.; Jonker, C.M.; Verwaart, T.

    2010-01-01

    This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture,

  8. Computational modeling of concrete flow

    DEFF Research Database (Denmark)

    Roussel, Nicolas; Geiker, Mette Rica; Dufour, Frederic

    2007-01-01

    particle flow, and numerical techniques allowing the modeling of particles suspended in a fluid. The general concept behind each family of techniques is described. Pros and cons for each technique are given along with examples and references to applications to fresh cementitious materials....

  9. MULTIREGION: a simulation-forecasting model of BEA economic area population and employment. [Bureau of Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, R.J.; Westley, G.W.; Herzog, H.W. Jr.; Kerley, C.R.; Bjornstad, D.J.; Vogt, D.P.; Bray, L.G.; Grady, S.T.; Nakosteen, R.A.

    1977-10-01

    This report documents the development of MULTIREGION, a computer model of regional and interregional socio-economic development. The MULTIREGION model interprets the economy of each BEA economic area as a labor market, measures all activity in terms of people as members of the population (labor supply) or as employees (labor demand), and simultaneously simulates or forecasts the demands and supplies of labor in all BEA economic areas at five-year intervals. In general, the outputs of MULTIREGION are intended to resemble those of the Water Resources Council's OBERS projections and to be put to similar planning and analysis purposes. This report has been written at two levels to serve the needs of multiple audiences. The body of the report serves as a fairly nontechnical overview of the entire MULTIREGION project; a series of technical appendixes provide detailed descriptions of the background empirical studies of births, deaths, migration, labor force participation, natural resource employment, manufacturing employment location, and local service employment used to construct the model.

  10. Economic Impacts of Potential Foot and Mouth Disease Agro-terrorism in the United States: A Computable General Equilibrium Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oladosu, Gbadebo A [ORNL; Rose, Adam [University of Southern California, Los Angeles; Bumsoo, Lee [University of Illinois

    2013-01-01

    The foot and mouth disease (FMD) virus has high agro-terrorism potential because it is contagious, can be easily transmitted via inanimate objects and can be spread by wind. An outbreak of FMD in developed countries results in massive slaughtering of animals (for disease control) and disruptions in meat supply chains and trade, with potentially large economic losses. Although the United States has been FMD-free since 1929, the potential of FMD as a deliberate terrorist weapon calls for estimates of the physical and economic damage that could result from an outbreak. This paper estimates the economic impacts of three alternative scenarios of potential FMD attacks using a computable general equilibrium (CGE) model of the US economy. The three scenarios range from a small outbreak successfully contained within a state to a large multi-state attack resulting in the slaughtering of 30 percent of the national livestock. Overall, the value of total output losses in our simulations ranges between $37 billion (0.15% of 2006 baseline economic output) and $228 billion (0.92%). Major impacts stem from the supply constraint on livestock due to massive animal slaughtering. As expected, the economic losses are heavily concentrated in the agriculture and food manufacturing sectors, with losses ranging from $23 billion to $61 billion in the two industries.

  11. System dynamics modelling and simulating the effects of intellectual capital on economic growth

    Directory of Open Access Journals (Sweden)

    Ivona Milić Beran

    2015-10-01

    Full Text Available System dynamics modelling is one of the best scientific methods for modelling complex, nonlinear natural, economic and technical system dynamics, as it enables both monitoring and assessment of the effects of intellectual capital on economic growth. Intellectual capital is defined as “the ability to transform knowledge and intangible assets into resources to create wealth for a company and a country.” Transformation of knowledge is crucial. Knowledge increases a country’s wealth only if its importance is recognized and it is applied differently from existing work practices. The aim of this paper is to show the efficiency of system dynamics modelling in simulating the effects of intellectual capital on economic growth. A computer simulation of the mathematical model provides practical insight into the dynamic behavior of the observed system, i.e. the analysis of economic growth and the observation of mutual correlations between individual parameters. The results of the simulation are presented in graphical form. The dynamic model of the effects of intellectual capital on Croatia’s economic growth has been verified by comparing simulation results with existing data on economic growth.
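The system-dynamics approach described above (stocks changed by flows, integrated over time) can be caricatured with a two-stock growth sketch. The stocks, flows, production function, and all parameter values below are hypothetical illustrations, not the model or calibration from the paper:

```python
def simulate_growth(years=20, dt=1.0, alpha=0.3, beta=0.2,
                    s_k=0.2, s_h=0.05, delta=0.05):
    """Toy system-dynamics model: two stocks (physical capital K and
    intellectual capital H) accumulate via investment flows out of output Y
    and drain via depreciation, integrated with Euler steps."""
    K, H = 1.0, 1.0
    path = []
    for _ in range(int(years / dt)):
        Y = (K ** alpha) * (H ** beta)      # output from both capital stocks
        dK = s_k * Y - delta * K            # net flow into physical capital
        dH = s_h * Y - delta * H            # net flow into intellectual capital
        K += dt * dK
        H += dt * dH
        path.append(Y)
    return path
```

Plotting the returned output path is the system-dynamics analogue of the graphical simulation results the abstract mentions.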

  12. Modelling the interaction between flooding events and economic growth

    Directory of Open Access Journals (Sweden)

    J. Grames

    2015-06-01

    Full Text Available Socio-hydrology describes the interaction between the socio-economy and water. Recent models analyze the interplay of community risk-coping culture, flooding damage and economic growth (Di Baldassarre et al., 2013; Viglione et al., 2014). These models descriptively explain the feedbacks between socio-economic development and natural disasters like floods. Contrary to these descriptive models, our approach develops an optimization model, where the intertemporal decision of an economic agent interacts with the hydrological system. In order to build this first economic growth model describing the interaction between the consumption and investment decisions of an economic agent and the occurrence of flooding events, we transform an existing descriptive stochastic model into a deterministic optimization model. The intermediate step is to formulate and simulate a descriptive deterministic model. We develop a periodic water function to approximate the former discrete stochastic time series of rainfall events. Due to the non-autonomous exogenous periodic rainfall function, the long-term path of consumption and investment will be periodic.
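The intermediate descriptive-deterministic step mentioned in the abstract can be caricatured as follows. The sinusoidal "water function", the threshold damage rule, and every parameter value are hypothetical stand-ins for illustration, not the authors' specification:

```python
import math

def simulate_flood_economy(T=100.0, dt=0.1, s=0.2, delta=0.05,
                           flood_threshold=0.9, damage=0.3):
    """Toy deterministic flood-growth interaction: capital accumulates via
    saving out of output, and a periodic water level (a sinusoid standing in
    for the paper's periodic rainfall approximation) inflicts proportional
    damage on the capital stock whenever it exceeds a threshold."""
    k = 1.0
    path = []
    for i in range(int(T / dt)):
        t = i * dt
        w = 0.5 * (1 + math.sin(2 * math.pi * t / 10))  # periodic water level
        y = k ** 0.3                                     # output
        k += dt * (s * y - delta * k)                    # capital accumulation
        if w > flood_threshold:
            k *= 1 - damage * dt        # flood damages the capital stock
        path.append(k)
    return path
```

Because the forcing is periodic and deterministic, the long-run capital path inherits that periodicity, which is the qualitative point the abstract makes about consumption and investment.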

  13. Note---A Note on Economic Models for R&D Project Selection in the Presence of Project Interactions

    OpenAIRE

    Paula M. Goldstein; Howard M. Singer

    1986-01-01

    In their paper, "Economic Models for R&D Project Selection in the Presence of Project Interactions," Fox, Baker, and Bryant (Fox, G. E., N. R. Baker, J. L. Bryant. 1984. Economic models for R and D project selection in the presence of project interactions. Management Sci. 30 (July) 890--902.) proposed a model that included benefit interactions, called present value (PV) interactions, between R&D projects. Several computational errors in this paper invalidated the illustrative example which so...

  14. Impact Analysis of Economic Linkages of South Korea with North Korea Using a CGE Model

    OpenAIRE

    Kim, Euijune; Shin, Hyewon

    2014-01-01

    The purpose of this paper is to estimate impacts of core infrastructure investments in North Korea on South and North Koreas. The investment expenditures of core infrastructure projects in North Korea are calibrated as 9.35 billion US$ including highway, railroad and industrial complex. Since South and North Koreas are based on market and planned economies respectively, the Computable General Equilibrium model is applied to the economic analysis of South Korea and an Input-Output Model for th...

  15. Models and relations in economics and econometrics

    DEFF Research Database (Denmark)

    Juselius, Katarina

    1999-01-01

    Based on a money market analysis using the cointegrated VAR model, the paper demonstrates some possible pitfalls in macroeconomic inference as a direct consequence of inadequate stochastic model formulation. A number of questions related to concepts such as empirical and theoretical steady-states, speed of adjustment, feedback and interaction effects, and driving forces are addressed within the framework of the cointegrated VAR model. The interpretation and analysis of common driving trends are related to the notion of shocks or disturbances to a system, distinguishing between permanent...

  16. On the macro-economic impact of bioenergy and biochemicals – Introducing advanced bioeconomy sectors into an economic modelling framework with a case study for the Netherlands

    NARCIS (Netherlands)

    Meijl, van H.; Tsiropoulos, I.; Bartelings, H.; Hoefnagels, R.; Smeets, E.; Tabeau, A.; Faaij, A.

    2018-01-01

    Advanced uses of biomass for bioenergy and biochemicals are being gradually introduced and are expected to grow considerably in regional economies, thus raising questions on their mid-term macro-economic impacts. To assess these impacts, we use a computable general equilibrium model and a regional

  17. Introductory review of computational cell cycle modeling.

    Science.gov (United States)

    Kriete, Andres; Noguchi, Eishi; Sell, Christian

    2014-01-01

    Recent advances in the modeling of the cell cycle through computer simulation demonstrate the power of systems biology. By definition, systems biology aims to connect a parts list, prioritized through experimental observation or high-throughput screens, via the topology of interactions defining intracellular networks, in order to predict system function. Computer modeling of biological systems is often compared to a process of reverse engineering. Indeed, designed or engineered technical systems share many systems-level properties with biological systems; thus studying biological systems within an engineering framework has proven successful. Here we review some aspects of this process as it pertains to cell cycle modeling.

  18. A computational model of the cerebellum

    Energy Technology Data Exchange (ETDEWEB)

    Travis, B.J.

    1990-01-01

    The need for realistic computational models of neural microarchitecture is growing increasingly apparent. While traditional neural networks have made inroads on understanding cognitive functions, more realism (in the form of structural and connectivity constraints) is required to explain processes such as vision or motor control. A highly detailed computational model of mammalian cerebellum has been developed. It is being compared to physiological recordings for validation purposes. The model is also being used to study the relative contributions of each component to cerebellar processing. 28 refs., 4 figs.

  19. Computational modeling of failure in composite laminates

    NARCIS (Netherlands)

    Van der Meer, F.P.

    2010-01-01

    There is no state of the art computational model that is good enough for predictive simulation of the complete failure process in laminates. Already on the single ply level controversy exists. Much work has been done in recent years in the development of continuum models, but these fail to predict

  20. Modeling User Behavior in Computer Learning Tasks.

    Science.gov (United States)

    Mantei, Marilyn M.

    Model building techniques from Artifical Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar which is built by applying grammar induction heuristics on a sequential…

  1. Generating Computational Models for Serious Gaming

    NARCIS (Netherlands)

    Westera, Wim

    2018-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  2. Generating computational models for serious gaming

    NARCIS (Netherlands)

    Westera, Wim

    2014-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  3. A review on economic emission dispatch problems using quantum computational intelligence

    Science.gov (United States)

    Mahdi, Fahad Parvez; Vasant, Pandian; Kallimani, Vish; Abdullah-Al-Wadud, M.

    2016-11-01

    Economic emission dispatch (EED) problems are among the most crucial problems in power systems. Growing energy demand, limited natural resources and global warming have placed this topic at the center of discussion and research. This paper reviews the use of Quantum Computational Intelligence (QCI) in solving Economic Emission Dispatch problems. QCI techniques like the Quantum Genetic Algorithm (QGA) and the Quantum Particle Swarm Optimization (QPSO) algorithm are discussed here. This paper will encourage researchers to use more QCI-based algorithms to obtain better optimal results for solving EED problems.
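For orientation, a toy economic-emission dispatch problem solved with plain classical PSO is sketched below; the quantum variants reviewed in the paper modify the position-update rule, but the problem structure is the same. The two-generator setup and all cost and emission coefficients are invented for illustration:

```python
import random

def pso_dispatch(demand=300.0, n_particles=20, iters=200, seed=0):
    """Toy economic-emission dispatch via classical Particle Swarm
    Optimization: two hypothetical generators with quadratic cost and
    emission curves; the demand balance is enforced by construction
    (the second unit supplies whatever the first does not)."""
    rng = random.Random(seed)

    def objective(p1):
        p2 = demand - p1                       # balance met by construction
        cost = 0.01*p1**2 + 2*p1 + 0.012*p2**2 + 1.8*p2
        emission = 0.004*p1**2 + 0.005*p2**2
        return cost + emission

    lo, hi = 0.0, demand
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]
    gbest = min(xs, key=objective)
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            # standard inertia + cognitive + social velocity update
            vs[i] = 0.7*vs[i] + 1.5*r1*(pbest[i]-xs[i]) + 1.5*r2*(gbest-xs[i])
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            if objective(xs[i]) < objective(pbest[i]):
                pbest[i] = xs[i]
            if objective(xs[i]) < objective(gbest):
                gbest = xs[i]
    return gbest, objective(gbest)
```

For these coefficients the combined cost-plus-emission objective has its minimum near p1 = 161.3 MW, which the swarm locates quickly.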

  4. Do's and Don'ts of Computer Models for Planning

    Science.gov (United States)

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  5. The economic impacts of the September 11 terrorist attacks: a computable general equilibrium analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oladosu, Gbadebo A [ORNL; Rose, Adam [University of Southern California, Los Angeles; Bumsoo, Lee [University of Illinois; Asay, Gary [University of Southern California

    2009-01-01

    This paper develops a bottom-up approach that focuses on behavioral responses in estimating the total economic impacts of the September 11, 2001, World Trade Center (WTC) attacks. The estimation includes several new features. First is the collection of data on the relocation of firms displaced by the attack, the major source of resilience in muting the direct impacts of the event. Second is a new estimate of the major source of impacts off-site -- the ensuing decline of air travel and related tourism in the U.S. due to the social amplification of the fear of terrorism. Third, the estimation is performed for the first time using Computable General Equilibrium (CGE) analysis, including a new approach to reflecting the direct effects of external shocks. This modeling framework has many advantages in this application, such as the ability to include behavioral responses of individual businesses and households, to incorporate features of inherent and adaptive resilience at the level of the individual decision maker and the market, and to gauge quantity and price interaction effects across sectors of the regional and national economies. We find that the total business interruption losses from the WTC attacks on the U.S. economy were only slightly over $100 billion, or less than 1.0% of Gross Domestic Product. The impacts amounted to a loss of only $14 billion of Gross Regional Product for the New York Metropolitan Area.

  6. R.M. Solow Adjusted Model of Economic Growth

    Directory of Open Access Journals (Sweden)

    Ion Gh. Rosca

    2007-05-01

    Full Text Available Besides the models of M. Keynes, R.F. Harrod, E. Domar, D. Romer, Ramsey-Cass-Koopmans etc., the R.M. Solow model is part of the category which characterizes economic growth. The paper proposes the study of the R.M. Solow adjusted model of economic growth, the adjustment consisting of adapting the model to the characteristics of the Romanian economy. The article is the first in a three-paper series dedicated to the macroeconomic modelling theme using the R.M. Solow model; the others are “Measurement of the economic growth and extensions of the R.M. Solow adjusted model” and “Evolution scenarios at the Romanian economy level using the R.M. Solow adjusted model”. The analysis part of the model is based on the study of the equilibrium in the continuous case, with some interpretations of the discrete one, using the state diagram. An optimization problem at the economy level is also used; it is built of a specified number of representative consumers and firms in order to reveal the interaction between these elements.
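The continuous-case equilibrium analysis referenced above rests on the standard Solow capital-accumulation equation k' = s*k^alpha - (n + delta)*k. A minimal Euler-integration sketch, with illustrative parameter values rather than the article's Romanian calibration:

```python
def solow_steady_state(s=0.25, n=0.01, delta=0.05, alpha=0.3,
                       k0=1.0, dt=0.1, steps=20000):
    """Integrate the Solow dynamics k' = s*k^alpha - (n+delta)*k with
    Euler steps until capital per worker settles at its steady state
    k* = (s/(n+delta))**(1/(1-alpha))."""
    k = k0
    for _ in range(steps):
        k += dt * (s * k**alpha - (n + delta) * k)
    return k
```

Because the Euler fixed point coincides with the analytic steady state, the iteration converges to k* = (s/(n+delta))^(1/(1-alpha)) exactly, which makes the scheme easy to verify against the closed form.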

  7. Models of neuromodulation for computational psychiatry.

    Science.gov (United States)

    Iglesias, Sandra; Tomiello, Sara; Schneebeli, Maya; Stephan, Klaas E

    2017-05-01

    Psychiatry faces fundamental challenges: based on a syndrome-based nosology, it presently lacks clinical tests to infer the disease processes that cause symptoms in individual patients and must resort to trial-and-error treatment strategies. These challenges have fueled the recent emergence of a novel field-computational psychiatry-that strives for mathematical models of disease processes at physiological and computational (information processing) levels. This review is motivated by one particular goal of computational psychiatry: the development of 'computational assays' that can be applied to behavioral or neuroimaging data from individual patients and support differential diagnosis and guide patient-specific treatment. Because the majority of available pharmacotherapeutic approaches in psychiatry target neuromodulatory transmitters, models that infer (patho)physiological and (patho)computational actions of different neuromodulatory transmitters are of central interest for computational psychiatry. This article reviews the (many) outstanding questions on the computational roles of neuromodulators (dopamine, acetylcholine, serotonin, and noradrenaline), outlines available evidence, and discusses promises and pitfalls in translating these findings to clinical applications. WIREs Cogn Sci 2017, 8:e1420. doi: 10.1002/wcs.1420 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  8. Recent Advances in Computational Modeling of Thrombosis

    OpenAIRE

    Yesudasan, Sumith; Averett, Rodney D.

    2018-01-01

    The study of thrombosis is crucial to understand and develop new therapies for diseases like deep vein thrombosis, diabetes-related strokes, pulmonary embolism, etc. The last two decades have seen an exponential growth in studies related to blood clot formation using computational tools and through experiments. Despite this growth, the complete mechanism behind thrombus formation and hemostasis is not yet known. The computational models and methods used in this context are diversified i...

  9. Finite difference computing with exponential decay models

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge about computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular.

  10. On the need and use of models to explore the role of economic confidence:a survey.

    Energy Technology Data Exchange (ETDEWEB)

    Sprigg, James A.; Paez, Paul J. (University of New Mexico, Albuquerque, NM); Hand, Michael S. (University of New Mexico, Albuquerque, NM)

    2005-04-01

    Empirical studies suggest that consumption is more sensitive to current income than suggested under the permanent income hypothesis, which raises questions regarding expectations for future income, risk aversion, and the role of economic confidence measures. This report surveys a body of fundamental economic literature as well as burgeoning computational modeling methods to support efforts to better anticipate cascading economic responses to terrorist threats and attacks. This is a three part survey to support the incorporation of models of economic confidence into agent-based microeconomic simulations. We first review broad underlying economic principles related to this topic. We then review the economic principle of confidence and related empirical studies. Finally, we provide a brief survey of efforts and publications related to agent-based economic simulation.

  11. Simulating Quantile Models with Applications to Economics and Management

    Science.gov (United States)

    Machado, José A. F.

    2010-05-01

    The massive increase in the speed of computers over the past forty years has changed the way that social scientists, applied economists and statisticians approach their trades, and also the very nature of the problems that they could feasibly tackle. The new methods that use computer power intensively go by the names of "computer-intensive" or "simulation" methods. My lecture will start with a bird's eye view of the uses of simulation in Economics and Statistics. Then I will turn to my own research on uses of computer-intensive methods. From a methodological point of view, the question I address is how to infer marginal distributions having estimated a conditional quantile process ("Counterfactual Decomposition of Changes in Wage Distributions using Quantile Regression," Journal of Applied Econometrics 20, 2005). Illustrations will be provided of the use of the method to perform counterfactual analysis in several different areas of knowledge.
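The quantile machinery underlying the cited counterfactual method starts from the check (pinball) loss, whose minimizer over a constant is the tau-th quantile. A minimal generic sketch of that building block (not the lecture's code):

```python
def pinball_loss(q, data, tau):
    """Check (pinball) loss: asymmetric absolute loss whose minimizer
    over q is the tau-th quantile of the data."""
    return sum(tau * (x - q) if x >= q else (1 - tau) * (q - x) for x in data)

def empirical_quantile(data, tau):
    """For a finite sample, a minimizer of the pinball loss can always be
    found among the data points themselves, so a direct search suffices."""
    return min(data, key=lambda q: pinball_loss(q, data, tau))
```

Quantile regression generalizes this by minimizing the same loss over a linear function of covariates instead of a constant, which is what produces the conditional quantile process mentioned in the abstract.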

  12. Modeling PPP Economic Benefits for Lunar ISRU

    Science.gov (United States)

    Blair, B.

    2017-10-01

    A new tool is needed for selecting the PPP strategy that could maximize the rate of lunar commercialization by attracting private capital into the development of critical infrastructure and robust capability. A PPP model under development for NASA-ESO will be described.

  13. Economic Modeling and Analysis of Educational Vouchers

    Science.gov (United States)

    Epple, Dennis; Romano, Richard

    2012-01-01

    The analysis of educational vouchers has evolved from market-based analogies to models that incorporate distinctive features of the educational environment. These distinctive features include peer effects, scope for private school pricing and admissions based on student characteristics, the linkage of household residential and school choices in…

  14. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Full Text Available Abstract Background Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. The Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results The workshop, "ESF Exploratory Workshop on Computational Disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  15. CONCEPTUAL MODEL OF ECONOMICAL SAFETY MANAGEMENT OF A PHARMACY ORGANIZATION

    Directory of Open Access Journals (Sweden)

    D. A. Kuznetsov

    2015-01-01

    Full Text Available The active influence of the external environment on pharmaceutical activity, together with internal processes in pharmaceutical organizations, may be accompanied by the appearance of threats to an organization's economic state. In this connection, there is an objective need to manage the economic safety of a pharmaceutical organization on the basis of corresponding knowledge. The purpose of this study is the formation of a conceptual model of economic safety management for a pharmaceutical organization. Using logical analysis and a generalization of literature data and the results of our own studies of the economic safety of pharmaceutical organizations, we identified and characterized the principal elements of the concept of "economic safety of a pharmaceutical organization" and the interconnections between these elements. As the result of this study, we formed a conceptual model of the management of the analyzed subsystem of pharmaceutical management.

  16. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality etc.). To evaluate the economic viability of BS with different business... for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...
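The energy-arbitrage opportunity mentioned above can be illustrated with a back-of-envelope calculation: buy a full charge at the day's cheapest price, sell at the priciest, net of round-trip efficiency and per-cycle degradation. Every number below is a hypothetical illustration, not the NAS case-study data:

```python
def arbitrage_profit(prices, capacity_kwh=100.0, eff=0.85, cycle_cost=5.0):
    """Daily arbitrage economics for one full battery cycle: revenue from
    discharging at the highest hourly price (after round-trip losses) minus
    the cost of charging at the lowest price plus a per-cycle wear cost
    that stands in for cycle-life degradation."""
    buy = min(prices)                          # $/kWh at the cheapest hour
    sell = max(prices)                         # $/kWh at the priciest hour
    revenue = sell * capacity_kwh * eff        # energy sold after losses
    cost = buy * capacity_kwh + cycle_cost     # energy bought + wear cost
    return revenue - cost
```

The cycle_cost term is the simplest possible proxy for the cycle-life effect the abstract calls central: with flat prices the spread cannot cover losses and wear, so cycling is unprofitable.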

  17. A parallel computational model for GATE simulations.

    Science.gov (United States)

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Environmental, social, and economic implications of global reuse and recycling of personal computers.

    Science.gov (United States)

    Williams, Eric; Kahhat, Ramzy; Allenby, Braden; Kavazanjian, Edward; Kim, Junbeum; Xu, Ming

    2008-09-01

    Reverse supply chains for the reuse, recycling, and disposal of goods are globalizing. This article critically reviews the environmental, economic, and social issues associated with the international reuse and recycling of personal computers. Computers and other e-waste are often exported for reuse and recycling abroad. On the environmental side, our analysis suggests that the risk of leaching of toxic materials in computers from well-managed sanitary landfills is very small. On the other hand, there is an increasing body of scientific evidence that the environmental impacts of informal recycling in developing countries are serious. On the basis of existing evidence, informal recycling is the most pressing environmental issue associated with e-waste. Socially, used markets abroad improve access to information technology by making low-priced computers available. Economically, the reuse and recycling sector provides employment. Existing policy efforts to manage e-waste focus on mandating domestic recycling systems and reducing the toxic content of processes. We argue that existing policy directions will mitigate but not solve the problem of the environmental impacts of informal recycling. There are many opportunities yet to be explored to develop policies and technologies for reuse/recycling systems that are environmentally safe, encourage reuse of computers, and provide jobs.

  19. FPL-PELPS : a price endogenous linear programming system for economic modeling, supplement to PELPS III, version 1.1.

    Science.gov (United States)

    Patricia K. Lebow; Henry Spelter; Peter J. Ince

    2003-01-01

    This report provides documentation and user information for FPL-PELPS, a personal computer price endogenous linear programming system for economic modeling. Originally developed to model the North American pulp and paper industry, FPL-PELPS follows its predecessors in allowing the modeling of any appropriate sector to predict consumption, production and capacity by...
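    FPL-PELPS itself is a full price-endogenous linear programming system; as a much-simplified, hedged illustration of the LP core only, here is a two-variable production LP solved by brute-force vertex enumeration (all names and coefficients below are invented, not FPL-PELPS data):

```python
from itertools import combinations

# Constraints of the form a*x + b*y <= c, including the axes x >= 0, y >= 0.
cons = [
    (2.0, 4.0, 100.0),   # hypothetical wood-supply constraint
    (1.0, 1.0, 40.0),    # hypothetical mill-capacity constraint
    (-1.0, 0.0, 0.0),    # x >= 0
    (0.0, -1.0, 0.0),    # y >= 0
]
profit = (30.0, 50.0)    # illustrative unit profits for the two products

def solve_lp(cons, obj):
    """Maximize obj over the feasible region by checking every vertex
    (intersection of two constraint lines).  Fine for tiny toy LPs only."""
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                      # parallel constraint lines
        x = (c1 * b2 - c2 * b1) / det     # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + 1e-9 for a, b, c in cons):
            val = obj[0] * x + obj[1] * y
            if best is None or val > best[0]:
                best = (val, x, y)
    return best
```

    A real PELPS-style model additionally makes prices endogenous by maximizing consumers' plus producers' surplus, which this sketch does not attempt.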

  20. Economic analysis and assessment of syngas production using a modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hakkwan; Parajuli, Prem B.; Yu, Fei; Columbus, Eugene P.

    2011-08-10

    Economic analysis and modeling are essential to the development of current feedstock and process technology for bio-gasification. The objective of this study was to develop an economic model and apply it to predict the unit cost of syngas production from a micro-scale bio-gasification facility. The economic model was programmed in the C++ programming language and developed using a parametric cost approach, which included processes to calculate the total capital costs and the total operating costs. The model used measured economic data from the bio-gasification facility at Mississippi State University. The modeling results showed that the unit cost of syngas production was $1.217 for a 60 Nm³ h⁻¹ capacity bio-gasifier. The operating cost was the major part of the total production cost. The equipment purchase cost and the labor cost were the largest parts of the total capital cost and the total operating cost, respectively. Sensitivity analysis indicated that labor cost ranked highest, followed by equipment cost, loan life, feedstock cost, interest rate, utility cost, and waste treatment cost. The unit cost of syngas production increased with increases in all parameters except loan life. The annual costs of equipment, labor, feedstock, waste treatment, and utilities showed a linear relationship with percent changes, while loan life and annual interest rate showed a non-linear relationship. This study provides useful information for the economic analysis and assessment of syngas production using a modeling approach.
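    The paper's C++ model is not reproduced in the record, but the parametric cost approach it describes — annualize capital over the loan life with a capital-recovery factor, add annual operating costs, divide by annual output — can be sketched as follows (all numbers in the usage are illustrative assumptions, not the Mississippi State data):

```python
def capital_recovery_factor(rate, years):
    """Fraction of a capital cost charged per year over `years` at
    interest `rate`: CRF = r(1+r)^n / ((1+r)^n - 1)."""
    growth = (1 + rate) ** years
    return rate * growth / (growth - 1)

def unit_cost(capital, operating_per_year, rate, years, annual_output_nm3):
    """Unit cost of syngas ($ per Nm^3): annualized capital plus annual
    operating cost, divided by annual syngas output."""
    annual_capital = capital * capital_recovery_factor(rate, years)
    return (annual_capital + operating_per_year) / annual_output_nm3

# Hypothetical facility: $100k capital, $20k/yr operating, 5% interest,
# 10-year loan, 60 Nm^3/h for 8000 h/yr.
cost = unit_cost(100000.0, 20000.0, 0.05, 10, 60 * 8000)
```

    The non-linear effect of loan life noted in the abstract comes directly from the exponent in the capital-recovery factor, while the other cost items enter linearly.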

  1. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adult's computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, that encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.
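    Pathway models of this kind decompose an effect into direct and mediated paths. A minimal sketch with a single mediator (exposure → mediator → symptoms), estimated by ordinary least squares; the data and the variable roles are invented for illustration, not the study's model:

```python
def slope(xs, ys):
    # Ordinary least-squares slope of y on x (with intercept).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Invented example: hours of computer exposure (X), a mediator such as
# sustained attention / "flow" (M), and a symptom score (Y).
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
m = [2 * v for v in x]        # path a: X -> M (slope 2 by construction)
y = [3 * v + 1 for v in m]    # path b: M -> Y (slope 3 by construction)

a = slope(x, m)               # effect of exposure on the mediator
b = slope(m, y)               # effect of the mediator on symptoms
indirect = a * b              # indirect (mediated) effect of X on Y
```

    Real pathway analysis would fit all paths simultaneously and test their significance; this only shows the product-of-coefficients idea behind an indirect effect.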

  2. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state of the art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under mechanical stimulus up to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image-based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease. The book will be of interest to researchers, Ph.D. students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, and experimental analysis.

  3. Modelling the economic tradeoffs between allocating water for crop ...

    African Journals Online (AJOL)

    2009-06-05

    … conditions of water scarcity a tradeoff exists between allocating water for salinity management and production. Currently no model in South Africa is able to model explicitly the impact of salinity management through leaching on the economic efficiency of … Resource constraints. Total production is …

  4. Modelling of the relation of natural disasters and the economic ...

    African Journals Online (AJOL)

    This study, taking some distance from the theoretical models expressed on this matter, has tried to model the relation between the expenses caused by natural disasters and economic growth using a neural network technique. The results of the data analysis with the neural network show that natural disasters and the ...

  5. On the Economic Order Quantity Model With Transportation Costs

    NARCIS (Netherlands)

    S.I. Birbil (Ilker); K. Bulbul; J.B.G. Frenk (Hans); H.M. Mulder (Martyn)

    2009-01-01

    We consider an economic order quantity type model with unit out-of-pocket holding costs, unit opportunity costs of holding, fixed ordering costs and general transportation costs. For these models, we analyze the associated optimization problem and derive an easy procedure for determining
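    For context, the baseline economic order quantity without transportation costs follows the classic square-root formula; the paper's contribution is the extension to general transportation costs, which this sketch deliberately does not capture:

```python
import math

def eoq(demand_per_year, fixed_order_cost, unit_holding_cost):
    """Classic economic order quantity: Q* = sqrt(2*D*K / h), where D is
    annual demand, K the fixed cost per order, and h the holding cost per
    unit per year.  No transportation-cost term is included."""
    return math.sqrt(2 * demand_per_year * fixed_order_cost / unit_holding_cost)

def annual_cost(q, demand_per_year, fixed_order_cost, unit_holding_cost):
    """Ordering plus holding cost per year for order quantity q."""
    return demand_per_year / q * fixed_order_cost + unit_holding_cost * q / 2
```

    For example, with D = 1000 units/year, K = $50 per order and h = $2 per unit per year, the optimal order quantity is about 224 units.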

  6. Criteria for comparing economic impact models of tourism

    NARCIS (Netherlands)

    Klijs, J.; Heijman, W.J.M.; Korteweg Maris, D.; Bryon, J.

    2012-01-01

    There are substantial differences between models of the economic impacts of tourism. Not only do the nature and precision of results vary, but data demands, complexity and underlying assumptions also differ. Often, it is not clear whether the models chosen are appropriate for the specific situation

  7. Perspective: Economic models of pastoral land tenure | Behnke ...

    African Journals Online (AJOL)

    Reviews some existing economic models of pastoral land tenure, and identifies what appear to be the reasons for their appeal to policy makers. Puts forward an alternative model which can more precisely account for the organization of pastoral tenure systems and which suggests new ways of managing African rangeland.

  8. Activity-Based Costing Model for Assessing Economic Performance.

    Science.gov (United States)

    DeHayes, Daniel W.; Lovrinic, Joseph G.

    1994-01-01

    An economic model for evaluating the cost performance of academic and administrative programs in higher education is described. Examples from its application at Indiana University-Purdue University Indianapolis are used to illustrate how the model has been used to control costs and reengineer processes. (Author/MSE)

  9. Diagnostics of Expectations of Economic Agents as an Instrument for the Modelling of Economic Cycles

    Directory of Open Access Journals (Sweden)

    Marat Rashitovich Safiullin

    2017-06-01

    The emerging trends of development are based on rapid institutional transformations and corresponding progressive forms of value-added creation; therefore, the analysis of these trends demands advanced, evidence-based approaches. Earlier, the processes of economic industrialization, the implementation of large-scale solutions, and the high localization of economic processes were the priority directions of development. Nowadays, the major priorities are such strategic focuses of development as the formation and large-scale replication of local low-concentration growth points; the diversification of business activity and the distribution of technological, institutional, and product developments; and the development of the social parameters of economic growth, including those based on the principles of environmental friendliness of business activities. The complexity and multidimensionality of the processes of socio-economic development create a basis for improving the traditional theoretical approaches to modeling and forecasting economic growth and, correspondingly, the cyclic development of the economy. The strengthening of globalization processes, as well as the regionalization of the economy and the formation of complex, mobile dynamic structures generating crisis phenomena, makes the problem of modern regulation of the cyclic development of the economy relevant. Its solution is impossible with the classical methods of cycle theory. The traditional approaches to modeling the cyclic development of the economy can lead to a decline in the quality of predictive models constructed on the basis of extrapolation methods with the application of scenario forecasts of the market and institutional factors which drive the phase changes of a cycle. The above means that the current development of the considered predictive models involves a number of risks connected with the accuracy of prediction and

  10. Analisis Model Manajemen Insiden Berbasis Cloud Computing

    Directory of Open Access Journals (Sweden)

    Anggi Sukamto

    2015-05-01

    Information technology support in an organization requires management so that its use meets the goals for which the technology was adopted. One information technology service management framework an organization can adopt is the Information Technology Infrastructure Library (ITIL). Service support is part of the ITIL process. In general, service support activities are carried out using technology accessible over the internet, a situation that leads naturally to the concept of cloud computing. Cloud computing allows an institution or company to manage resources over the internet. This research focuses on analyzing the processes and actors involved in service support, particularly in the incident management process, and on identifying which actors could potentially be handed over to cloud computing services. Based on the analysis, the proposed cloud-based incident management model can be applied in any organization that already uses computer technology to support its operational activities. Keywords—Cloud computing, ITIL, Incident Management, Service Support, Service Desk.

  11. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    2011-01-01

    This chapter considers the opportunities that are present in developing, extending and applying aspects of computer-aided modelling principles and practice. What are the best tasks to be done by modellers and what needs the application of CAPE tools? How do we efficiently develop model-based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability to easily generate new models from underlying phenomena continues to be a challenge, especially in the face of time and cost constraints. Integrated frameworks that allow flexibility of model development and access to a range of embedded tools are central to future model developments. The challenges...

  12. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  13. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
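    Cell Collective models are logical (Boolean) networks. A toy synchronous Boolean network with negative feedback — the rules below are invented for illustration, not a published Cell Collective model — shows the kind of dynamics students can "build and break":

```python
# Each node's next state is a Boolean function of the previous state.
rules = {
    "signal":    lambda s: s["signal"],                      # held fixed
    "receptor":  lambda s: s["signal"],
    "kinase":    lambda s: s["receptor"] and not s["inhibitor"],
    "inhibitor": lambda s: s["kinase"],                      # negative feedback
}

def step(state):
    # Synchronous update: every node reads the previous global state.
    return {node: bool(rule(state)) for node, rule in rules.items()}

state = {"signal": True, "receptor": False,
         "kinase": False, "inhibitor": False}
trajectory = [state]
for _ in range(6):
    state = step(state)
    trajectory.append(state)
```

    With the negative feedback loop in place the kinase oscillates (on for two steps, off for two steps); deleting the inhibitor rule makes it switch on and stay on, which is exactly the kind of "break it and see" experiment the abstract describes.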

  14. Integrating Interactive Computational Modeling in Biology Curricula

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E.; Dahlquist, Lauren M.; Herek, Tyler A.; Larson, Joshua J.; Rogers, Jim A.

    2015-01-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by “building and breaking it” via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the “Vision and Change” call to action in undergraduate biology education by providing a hands-on approach to biology. PMID:25790483

  15. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  16. Evolutionary modelling of the macro-economic impacts of catastrophic flood events

    NARCIS (Netherlands)

    Safarzynska, K.E.; Brouwer, R.; Hofkes, M.

    2013-01-01

    This paper examines the possible contribution of evolutionary economics to macro-economic modelling of flood impacts to provide guidance for future economic risk modelling. Most macro-economic models start from a neoclassical economic perspective and focus on equilibrium outcomes, either in a static

  17. A Stochastic Dynamic Model of Computer Viruses

    Directory of Open Access Journals (Sweden)

    Chunming Zhang

    2012-01-01

    A stochastic computer virus spread model is proposed and its dynamic behavior is fully investigated. Specifically, we prove the existence and uniqueness of positive solutions, and the stability of the virus-free equilibrium and viral equilibrium by constructing Lyapunov functions and applying Ito's formula. Some numerical simulations are finally given to illustrate our main results.
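    A stochastic virus-spread model of this general shape can be simulated with the Euler-Maruyama scheme. The SIS-type drift and multiplicative noise below are an assumed illustration of the model class, not the paper's exact system, and all parameter values are invented:

```python
import math
import random

def simulate_sis(n=1000.0, beta=0.002, gamma=0.5, sigma=0.1,
                 i0=10.0, dt=0.01, steps=2000, seed=42):
    """Euler-Maruyama sample path of a stochastic SIS-type virus model:
        dI = (beta*I*(N - I) - gamma*I) dt + sigma*I dW.
    Returns the list of infected counts over time, clamped to [0, N]."""
    rng = random.Random(seed)
    i = i0
    path = [i]
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))        # Brownian increment
        i += (beta * i * (n - i) - gamma * i) * dt + sigma * i * dw
        i = min(max(i, 0.0), n)                   # keep the state in [0, N]
        path.append(i)
    return path
```

    With these parameters the deterministic part has an endemic equilibrium at I* = N - gamma/beta = 750, so the noisy path climbs from 10 and fluctuates around that level.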

  18. Computational Failure Modeling of Lower Extremities

    Science.gov (United States)

    2012-01-01


  19. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  20. Computational modelling for dry-powder inhalers

    NARCIS (Netherlands)

    Kröger, Ralf; Woolhouse, Robert; Becker, Michael; Wachtel, Herbert; de Boer, Anne; Horner, Marc

    2012-01-01

    Computational fluid dynamics (CFD) is a simulation tool used for modelling powder flow through inhalers to allow optimisation both of device design and drug powder. Here, Ralf Kröger, Consulting Senior CFD Engineer, ANSYS Germany GmbH; Marc Horner, Lead Technical Services Engineer, Healthcare,

  1. STEW A Nonlinear Data Modeling Computer Program

    CERN Document Server

    Chen, H

    2000-01-01

    A nonlinear data modeling computer program, STEW, employing the Levenberg-Marquardt algorithm, has been developed to model the experimental ²³⁹Pu(n,f) and ²³⁵U(n,f) cross sections. This report presents results of the modeling of the ²³⁹Pu(n,f) and ²³⁵U(n,f) cross-section data. The calculation of the fission transmission coefficient is based on the double-humped-fission-barrier model of Bjornholm and Lynn. Incident neutron energies of up to 5 MeV are considered.
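    STEW's source is not part of the record, but a minimal Levenberg-Marquardt fit of a two-parameter model y = a·exp(b·x) illustrates the algorithm it employs. This is a sketch of the method only, with invented data, not STEW's code or cross-section model:

```python
import math

def fit_exponential(xs, ys, a=1.0, b=0.0, lam=1e-3, iters=200):
    """Levenberg-Marquardt fit of y = a*exp(b*x) to (xs, ys)."""
    for _ in range(iters):
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        sse = sum(v * v for v in r)
        if sse < 1e-18:
            break
        # Jacobian of the residuals: dr/da, dr/db.
        j = [(-math.exp(b * x), -a * x * math.exp(b * x)) for x in xs]
        jtj = [[sum(ji[p] * ji[q] for ji in j) for q in (0, 1)] for p in (0, 1)]
        jtr = [sum(ji[p] * ri for ji, ri in zip(j, r)) for p in (0, 1)]
        # Damped normal equations (J^T J + lam*diag(J^T J)) delta = -J^T r,
        # solved directly for the 2x2 case.
        a11 = jtj[0][0] * (1 + lam)
        a22 = jtj[1][1] * (1 + lam)
        a12 = jtj[0][1]
        det = a11 * a22 - a12 * a12
        if abs(det) < 1e-15:
            break
        da = (-jtr[0] * a22 + jtr[1] * a12) / det
        db = (-jtr[1] * a11 + jtr[0] * a12) / det
        # Accept the step only if it reduces the error; otherwise raise damping.
        trial = sum((y - (a + da) * math.exp((b + db) * x)) ** 2
                    for x, y in zip(xs, ys))
        if trial < sse:
            a, b, lam = a + da, b + db, lam * 0.5
        else:
            lam *= 10.0
    return a, b
```

    On noise-free data generated with a = 2, b = 0.5, the fit recovers both parameters; the damping parameter interpolates between gradient descent (large lam) and Gauss-Newton (small lam), which is the essence of Levenberg-Marquardt.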

  2. Computing Linear Mathematical Models Of Aircraft

    Science.gov (United States)

    Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.

    1991-01-01

    The Derivation and Definition of Linear Aircraft Model (LINEAR) computer program provides the user with a powerful, flexible, standard, documented, and verified software tool for the linearization of mathematical models of aircraft aerodynamics. It is intended for use as a software tool to drive linear analyses of stability and the design of control laws for aircraft. LINEAR is capable of both extracting linearized engine effects, such as net thrust, torque, and gyroscopic effects, and including these effects in the linear model of the system. It is designed to provide easy selection of the state, control, and observation variables used in a particular model, and it provides the flexibility of allowing alternate formulations of both state and observation equations. Written in FORTRAN.
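    The core of such linearization is evaluating the Jacobians A = ∂f/∂x and B = ∂f/∂u of the nonlinear dynamics at a trim point. A finite-difference sketch on a toy pendulum follows; it illustrates the idea only and is not LINEAR's FORTRAN implementation, and the example dynamics are not an aircraft model:

```python
import math

def linearize(f, x0, u0, eps=1e-6):
    """Forward-difference linearization xdot ~= A(x - x0) + B(u - u0)
    of xdot = f(x, u) about the operating point (x0, u0)."""
    n, m = len(x0), len(u0)
    f0 = f(x0, u0)
    A = [[0.0] * n for _ in range(n)]
    B = [[0.0] * m for _ in range(n)]
    for j in range(n):                         # perturb each state
        xp = list(x0); xp[j] += eps
        fj = f(xp, u0)
        for i in range(n):
            A[i][j] = (fj[i] - f0[i]) / eps
    for j in range(m):                         # perturb each control
        up = list(u0); up[j] += eps
        fj = f(x0, up)
        for i in range(n):
            B[i][j] = (fj[i] - f0[i]) / eps
    return A, B

# Toy dynamics: a torque-driven pendulum, x = [angle, rate], u = [torque].
G_OVER_L = 9.81
def pend(x, u):
    return [x[1], -G_OVER_L * math.sin(x[0]) + u[0]]

A, B = linearize(pend, [0.0, 0.0], [0.0])
```

    At the hanging equilibrium this reproduces the analytic result A = [[0, 1], [-g/l, 0]], B = [[0], [1]]; a production tool like LINEAR adds trim solving, observation equations, and central differences.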

  3. A Multilayer Model of Computer Networks

    OpenAIRE

    Shchurov, Andrey A.

    2015-01-01

    The fundamental concept of applying the system methodology to network analysis declares that network architecture should take into account services and applications which this network provides and supports. This work introduces a formal model of computer networks on the basis of the hierarchical multilayer networks. In turn, individual layers are represented as multiplex networks. The concept of layered networks provides conditions of top-down consistency of the model. Next, we determined the...

  4. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science outstanding results are yielded by advanced simulation methods, based on state of the art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extractions from big data, better understanding and predicting of social behaviour and modelling health and environment changes.

  5. Physical-Socio-Economic Modeling of Climate Change

    Science.gov (United States)

    Chamberlain, R. G.; Vatan, F.

    2008-12-01

    Because of the global nature of climate change, any assessment of the effects of plans, policies, and responses to climate change demands a model that encompasses the entire Earth system, including socio-economic factors. Physics-based climate models of the factors that drive global temperatures, rainfall patterns, and sea level are necessary but not sufficient to guide decision making. Actions taken by farmers, industrialists, environmentalists, politicians, and other policy makers may result in large changes to economic factors, international relations, food production, disease vectors, and beyond. These consequences will not be felt uniformly around the globe or even across a given region. Policy models must comprehend all of these considerations. Combining physics-based models of the Earth's climate and biosphere with societal models of population dynamics, economics, and politics is a grand challenge with high stakes. We propose to leverage our recent advances in modeling and simulation of military stability and reconstruction operations in models that address all these areas of concern. Following over twenty years' experience of successful combat simulation, JPL has started developing Minerva, which will add demographic, economic, political, and media/information models to capabilities that already exist. With these new models, for which we have design concepts, it will be possible to address a very wide range of potential national and international problems that were previously inaccessible. Our climate change model builds on Minerva and expands the geographical horizon from playboxes containing regions and neighborhoods to the entire globe. The system consists of a collection of interacting simulation models that specialize in different aspects of the global situation. They will each contribute to and draw from a pool of shared data. The basic models are: the physical model; the demographic model; the political model; the economic model; and the media

  6. Computational social network modeling of terrorist recruitment.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Turnley, Jessica Glicken (Sandia National Laboratories, Albuquerque, NM); Smrcka, Julianne D. (Sandia National Laboratories, Albuquerque, NM); Ko, Teresa H.; Moy, Timothy David (Sandia National Laboratories, Albuquerque, NM); Wu, Benjamin C.

    2004-10-01

    The Seldon terrorist model represents a multi-disciplinary approach to developing organization software for the study of terrorist recruitment and group formation. The need to incorporate aspects of social science added a significant contribution to the vision of the resulting Seldon toolkit. The unique addition of an abstract agent category provided a means for capturing social concepts like cliques, mosques, etc. in a manner that represents their social conceptualization and not simply as physical or economic institutions. This paper provides an overview of the Seldon terrorist model developed to study the formation of cliques, which are used as the major recruitment entity for terrorist organizations.

  7. Testing computational toxicology models with phytochemicals.

    Science.gov (United States)

    Valerio, Luis G; Arvidson, Kirk B; Busta, Emily; Minnier, Barbara L; Kruhlak, Naomi L; Benz, R Daniel

    2010-02-01

    Computational toxicology employing quantitative structure-activity relationship (QSAR) modeling is an evidence-based predictive method being evaluated by regulatory agencies for risk assessment and scientific decision support for toxicological endpoints of interest such as rodent carcinogenicity. Computational toxicology is being tested for its usefulness to support the safety assessment of drug-related substances (e.g. active pharmaceutical ingredients, metabolites, impurities), indirect food additives, and other applied uses of value for protecting public health, including safety assessment of environmental chemicals. The specific use of QSAR as a chemoinformatic tool for estimating the rodent carcinogenic potential of phytochemicals present in botanicals, herbs, and natural dietary sources is investigated here by an external validation study, which is the most stringent scientific method of measuring predictive performance. The external validation statistics for predicting rodent carcinogenicity of 43 phytochemicals, using two computational software programs evaluated at the FDA, are discussed. One software program showed very good performance for predicting non-carcinogens (high specificity), but both exhibited poor performance in predicting carcinogens (sensitivity), which is consistent with the design of the models. When predictions were considered in combination with each other rather than based on any one software program, the performance for sensitivity was enhanced. However, Chi-square values indicated that the overall predictive performance decreases when using the two computational programs with this particular data set. This study suggests that multiple complementary computational toxicology programs need to be carefully selected to improve global QSAR predictions for this complex toxicological endpoint.
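    The sensitivity/specificity bookkeeping and the "either program flags it" combination rule can be sketched with invented toy predictions (not the FDA phytochemical data set):

```python
def confusion_stats(actual, predicted):
    """Sensitivity and specificity for binary labels (1 = carcinogen).
    Assumes both classes are present in `actual`."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Invented toy predictions from two hypothetical QSAR programs.
actual  = [1, 1, 1, 1, 0, 0, 0, 0]
model_a = [1, 0, 0, 0, 0, 0, 0, 1]
model_b = [0, 1, 0, 0, 0, 0, 0, 0]

# "Either model flags it" combination: sensitivity can only go up,
# while specificity can only go down -- the trade-off in the abstract.
combined = [max(a, b) for a, b in zip(model_a, model_b)]
```

    Here each individual program detects only one of the four carcinogens, but the combined call detects two, at no further cost to specificity in this tiny example.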

  8. ONTOLOGICAL MODEL OF STRATEGIC ECONOMIC SECURITY OF ENTERPRISE

    Directory of Open Access Journals (Sweden)

    L. A. Zaporozhtseva

    2014-01-01

    The article explains the necessity of applying the ontological approach to modeling strategic economic security, formalizing the basic categories of the enterprise domain, and recognizes its benefits. Among the advantages of the model are its versatility and its ability to describe various aspects of strategic security - the system of strategies and goals of the organization and its business processes; the possibility of its use at different levels of detail - from a top-level description of the basic categories of management down to the design level of analytic applications; and the adaptability of the model, whose depth on particular aspects is determined by practical necessity rather than regulated by methodology. The model integrates various aspects of the concept of enterprise architecture and organizes the conceptual apparatus. The ontological model is easy to understand and adjust, both for business architects and for specialists in designing systems of economic security, and it offers many categories for the verbal representation of the enterprise domain. The feasibility of a process-functional approach to providing strategic economic security is demonstrated, according to which the components of such security are proposed as business processes, finance, staff and contractors. The article presents the author's ontological model of strategic economic security, including endangered objects, the factors that threaten the security of an object, and the subjects providing security. Further, it is shown that the security subjects act on the object using tools, measures and activities; within the strategy thus formed, managerial decisions are implemented to strengthen strategic economic security.
    The process of diagnosing, detecting and identifying threats to economic security, and the development of enterprise development strategies taking into account the level of economic security, must be under the constant supervision of the process of

  9. Computer Model Of Fragmentation Of Atomic Nuclei

    Science.gov (United States)

    Wilson, John W.; Townsend, Lawrence W.; Tripathi, Ram K.; Norbury, John W.; Khan, Ferdous; Badavi, Francis F.

    1995-01-01

    High Charge and Energy Semiempirical Nuclear Fragmentation Model (HZEFRG1) computer program developed to be computationally efficient, user-friendly, physics-based program for generating data bases on fragmentation of atomic nuclei. Data bases generated used in calculations pertaining to such radiation-transport applications as shielding against radiation in outer space, radiation dosimetry in outer space, cancer therapy in laboratories with beams of heavy ions, and simulation studies for designing detectors for experiments in nuclear physics. Provides cross sections for production of individual elements and isotopes in breakups of high-energy heavy ions by combined nuclear and Coulomb fields of interacting nuclei. Written in ANSI FORTRAN 77.

  10. Economics.

    Science.gov (United States)

    Online-Offline, 1998

    1998-01-01

    This issue focuses on the theme of economics, and presents educational resources for teaching basics to children. Web sites, CD-ROMs and software, videos, books, and additional resources, as well as activities which focus on economics are described. Includes short features on related topics, and the subtopics of trade, money and banking, and…

  11. Modelling the role of forests on water provision services: a hydro-economic valuation approach

    Science.gov (United States)

    Beguería, S.; Campos, P.

    2015-12-01

    Hydro-economic models that allow integrating the ecological, hydrological, infrastructure, economic and social aspects into a coherent, scientifically-informed framework constitute preferred tools for supporting decision making in the context of integrated water resources management. We present a case study of the water regulation and provision services of forests in the Andalusia region of Spain. Our model computes the physical water flows and conducts an economic environmental income and asset valuation of forest surface and underground water yield. Based on available hydrologic and economic data, we develop a comprehensive water account for all the forest lands at the regional scale. This forest water environmental valuation is integrated within a much larger project aiming at providing a robust and easily replicable accounting tool to evaluate yearly the total income and capital of forests, encompassing all measurable sources of private and public incomes (timber and cork production, auto-consumption, recreational activities, biodiversity conservation, carbon sequestration, water production, etc.). We also force our simulation with future socio-economic scenarios to quantify the physical and economic effects of expected trends or simulated public and private policies on future water resources. Only a comprehensive integrated tool may serve as a basis for the development of integrated policies, such as those internationally agreed and recommended for the management of water resources.

  12. Computational modelling of evolution: ecosystems and language

    CERN Document Server

    Lipowski, Adam

    2008-01-01

    Recently, computational modelling became a very important research tool that enables us to study problems that for decades evaded scientific analysis. Evolutionary systems are certainly examples of such problems: they are composed of many units that might reproduce, diffuse, mutate, die, or, in some cases, communicate. These processes might be of some adaptive value, they influence each other and occur on various time scales. That is why such systems are so difficult to study. In this paper we briefly review some computational approaches, as well as our contributions, to the evolution of ecosystems and language. We start from Lotka-Volterra equations and the modelling of simple two-species prey-predator systems. Such systems are a canonical example for studying oscillatory behaviour in competitive populations. Then we describe various approaches to study long-term evolution of multi-species ecosystems. We emphasize the need to use models that take into account both ecological and evolutionary processe...
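The Lotka-Volterra starting point mentioned above can be sketched directly. This is a minimal forward-Euler integration of the classic two-species prey-predator equations; the parameter values are illustrative, not taken from the paper.

```python
# Minimal sketch: forward-Euler integration of the classic Lotka-Volterra
# prey-predator equations dx/dt = alpha*x - beta*x*y, dy/dt = delta*x*y - gamma*y.
def lotka_volterra(x, y, alpha, beta, delta, gamma, dt, steps):
    xs, ys = [x], [y]
    for _ in range(steps):
        dx = (alpha * x - beta * x * y) * dt   # prey: growth minus predation
        dy = (delta * x * y - gamma * y) * dt  # predators: predation minus death
        x, y = x + dx, y + dy
        xs.append(x)
        ys.append(y)
    return xs, ys

prey, pred = lotka_volterra(10.0, 5.0, alpha=1.1, beta=0.4,
                            delta=0.1, gamma=0.4, dt=0.01, steps=5000)
print(f"final prey={prey[-1]:.2f}, predators={pred[-1]:.2f}")
```

Plotting `prey` against `pred` shows the closed-orbit-like oscillations that make this the canonical example of competitive population cycles (a plain Euler step slowly spirals outward, which is why more careful integrators are used in practice).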

  13. Rough – Granular Computing knowledge discovery models

    Directory of Open Access Journals (Sweden)

    Mohammed M. Eissa

    2016-11-01

    Full Text Available Medical domain has become one of the most important areas of research in order to richness huge amounts of medical information about the symptoms of diseases and how to distinguish between them to diagnose it correctly. Knowledge discovery models play vital role in refinement and mining of medical indicators to help medical experts to settle treatment decisions. This paper introduces four hybrid Rough – Granular Computing knowledge discovery models based on Rough Sets Theory, Artificial Neural Networks, Genetic Algorithm and Rough Mereology Theory. A comparative analysis of various knowledge discovery models that use different knowledge discovery techniques for data pre-processing, reduction, and data mining supports medical experts to extract the main medical indicators, to reduce the misdiagnosis rates and to improve decision-making for medical diagnosis and treatment. The proposed models utilized two medical datasets: Coronary Heart Disease dataset and Hepatitis C Virus dataset. The main purpose of this paper was to explore and evaluate the proposed models based on Granular Computing methodology for knowledge extraction according to different evaluation criteria for classification of medical datasets. Another purpose is to make enhancement in the frame of KDD processes for supervised learning using Granular Computing methodology.

  14. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    Energy Technology Data Exchange (ETDEWEB)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data.
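The kinds of calculations these programs automate can be illustrated with a tiny sketch: escalating a historical capital cost estimate with a cost index, then a simple net-present-value screen of lifetime cash flows. This is not the CAPITAL/VENTURE/INDEXER code; all numbers are hypothetical.

```python
# Illustrative sketch (not the CAPITAL/VENTURE/INDEXER programs): escalate a
# capital cost estimate with a cost index, then screen lifetime economics
# with a simple net present value. All numbers are hypothetical.

def escalate(base_cost, index_then, index_now):
    """Scale a historical cost estimate to current dollars via a cost index."""
    return base_cost * index_now / index_then

def npv(rate, cash_flows):
    """Net present value of end-of-year cash flows (years 1, 2, ...)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

capital = escalate(100e6, index_then=180.0, index_now=270.0)
flows = [-capital] + [25e6] * 20      # year-1 outlay, then 20 years of net revenue
print(f"escalated capital: ${capital/1e6:.0f}M, NPV @ 8%: ${npv(0.08, flows)/1e6:.1f}M")
```
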

  15. Economic MPC based on LPV model for thermostatically controlled loads

    DEFF Research Database (Denmark)

    Zemtsov, Nikita; Hlava, Jaroslav; Frantsuzova, Galina

    2017-01-01

    control method which can be used to synchronize the power consumption with undispatchable renewable electricity production. The thermal behavior of TCLs can be described by linear models based on the energy balance of the system. In some cases, parameters of the model may be time-varying. In this work, we present...... a modified economic MPC based on a linear parameter-varying model. In particular, we provide an exact transformation from a standard economic MPC formulation to a linear program. We assume that the variables influencing the model parameters are known (predictable) for the prediction horizon of the controller....... As a case study, we present a control system that minimizes the operational cost of a swimming pool heating system, where parameters of the model depend on the weather forecast. Simulation results demonstrate that the proposed method is able to deal with this kind of system....
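The reduction of an economic MPC problem to a linear program can be illustrated generically. The sketch below is not the paper's LPV formulation: it uses a fixed first-order thermal model T[t+1] = a*T[t] + b*u[t], a comfort lower bound, and forecast-price-weighted energy cost, all with hypothetical numbers, solved with scipy.optimize.linprog.

```python
# Generic sketch of an economic MPC step posed as a linear program (not the
# paper's exact LPV formulation; all parameters are hypothetical).
import numpy as np
from scipy.optimize import linprog

N = 24                                   # prediction horizon (hours)
a, b, T0, Tmin = 0.9, 2.0, 22.0, 20.0    # plant gain/decay, initial/min temp
price = 1.0 + 0.5 * np.sin(np.arange(N) / 24 * 2 * np.pi)  # forecast prices

# T[t] = a**t * T0 + sum_{k<t} a**(t-1-k) * b * u[k]; enforce T[t] >= Tmin,
# i.e. -sum(...)*u <= a**t * T0 - Tmin, which is linear in u.
A_ub = np.zeros((N, N))
b_ub = np.zeros(N)
for t in range(1, N + 1):
    row = np.array([a ** (t - 1 - k) * b if k < t else 0.0 for k in range(N)])
    A_ub[t - 1] = -row
    b_ub[t - 1] = a ** t * T0 - Tmin

res = linprog(price, A_ub=A_ub, b_ub=b_ub, bounds=(0, 3))  # 0 <= u <= 3 kW
print("optimal cost:", round(res.fun, 2), "feasible:", res.success)
```

Because cost and constraints are both linear in the input sequence `u`, the whole horizon is one LP; a time-varying (LPV) model would simply make `a` and `b` step-dependent in the same construction.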

  16. Modeling and Forecasting Mortality With Economic Growth: A Multipopulation Approach.

    Science.gov (United States)

    Boonen, Tim J; Li, Hong

    2017-10-01

    Research on mortality modeling of multiple populations focuses mainly on extrapolating past mortality trends and summarizing these trends by one or more common latent factors. This article proposes a multipopulation stochastic mortality model that uses the explanatory power of economic growth. In particular, we extend the Li and Lee model (Li and Lee 2005) by including economic growth, represented by the real gross domestic product (GDP) per capita, to capture the common mortality trend for a group of populations with similar socioeconomic conditions. We find that our proposed model provides a better in-sample fit and out-of-sample forecast performance. Moreover, it generates lower (higher) forecasted period life expectancy for countries with high (low) GDP per capita than the Li and Lee model.

  17. CONTEMPORARY ECONOMIC GROWTH MODELS AND THEORIES: A LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    Ilkhom SHARIPOV

    2015-11-01

    Full Text Available One of the most important aspects of human development is the ability to have a decent standard of living. The secret of the "economic miracle" of many countries that have a high standard of living is, in fact, simple and quite obvious. All these countries are characterized by high and sustained development of the national economy, a low unemployment rate, and growth of income and consumption. There is no doubt that economic growth leads to an increase in the wealth of the country as a whole, extending its potential in the fight against poverty, unemployment and other social problems. That is why a high level of economic growth is one of the main targets of economic policy in many countries around the world. This brief literature review discusses the main existing theories and models of economic growth, and their endogenous and exogenous aspects. The main purpose of this paper is to determine the current state of development of the economic growth theories and their likely future directions of development.

  18. Physics and financial economics (1776-2014): puzzles, Ising and agent-based models.

    Science.gov (United States)

    Sornette, Didier

    2014-06-01

    This short review presents a selected history of the mutual fertilization between physics and economics--from Isaac Newton and Adam Smith to the present. The fundamentally different perspectives embraced in theories developed in financial economics compared with physics are dissected with the examples of the volatility smile and of the excess volatility puzzle. The role of the Ising model of phase transitions to model social and financial systems is reviewed, with the concepts of random utilities and the logit model as the analog of the Boltzmann factor in statistical physics. Recent extensions in terms of quantum decision theory are also covered. A wealth of models are discussed briefly that build on the Ising model and generalize it to account for the many stylized facts of financial markets. A summary of the relevance of the Ising model and its extensions is provided to account for financial bubbles and crashes. The review would be incomplete if it did not cover the dynamical field of agent-based models (ABMs), also known as computational economic models, of which the Ising-type models are just special ABM implementations. We formulate the 'Emerging Intelligence Market Hypothesis' to reconcile the pervasive presence of 'noise traders' with the near efficiency of financial markets. Finally, we note that evolutionary biology, more than physics, is now playing a growing role to inspire models of financial markets.
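The logit-as-Boltzmann analogy mentioned in the review is easy to sketch: the logit model assigns choice probabilities exp(b*u_i)/sum_j exp(b*u_j), formally identical to a Boltzmann distribution with utility playing the role of (negative) energy and b the inverse temperature. The utilities below are illustrative.

```python
# Sketch of the logit discrete-choice model as the analog of the Boltzmann
# factor: P_i = exp(beta*u_i) / sum_j exp(beta*u_j). Utilities are illustrative.
import math

def logit(utilities, beta):
    weights = [math.exp(beta * u) for u in utilities]
    z = sum(weights)                 # the "partition function"
    return [w / z for w in weights]

u = [1.0, 0.5, 0.0]
for beta in (0.0, 1.0, 10.0):        # "hot" -> uniform, "cold" -> greedy
    print(beta, [round(p, 3) for p in logit(u, beta)])
```

At beta = 0 every alternative is equally likely; as beta grows, probability concentrates on the highest-utility alternative, exactly as a physical system settles into its ground state at low temperature.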

  19. Sustainable economic production quantity models for inventory systems with shortage

    DEFF Research Database (Denmark)

    Taleizadeh, Ata Allah; Soleymanfar, Vahid Reza; Govindan, Kannan

    2018-01-01

    (EPQ). The theoretical sustainable EOQ and EPQ models are basic models that ignore many real-life conditions such as the possibility of stock-out in inventory systems. In this paper, we develop four new sustainable economic production quantity models that consider different shortage situations. To find...... optimal values of inventory system variables, we solve four independent profit maximization problems for four different situations. These proposed models include a basic model in which shortages are not allowed, and when shortages are allowed, the lost sale, full backordering and partial backordering...
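As background to the shortage variants described above, the classic economic production quantity formula (the textbook baseline these models extend, not the paper's sustainable models) can be sketched with illustrative numbers:

```python
# Baseline sketch: the classic economic production quantity (EPQ) without
# shortages, Q* = sqrt(2*K*D / (h*(1 - D/P))). Numbers are illustrative.
import math

def epq(D, P, K, h):
    """D: demand rate, P: production rate (P > D), K: setup cost, h: holding cost."""
    return math.sqrt(2 * K * D / (h * (1 - D / P)))

Q = epq(D=1000, P=4000, K=200, h=5)
print(f"optimal lot size: {Q:.1f} units")
```

The shortage variants in the paper modify this trade-off by adding lost-sale or backorder cost terms before maximizing profit.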

  20. Analysis of a Model for Computer Virus Transmission

    OpenAIRE

    Qin, Peng

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our t...
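The structure of such models can be sketched with a generic SIR-type system with inflow of new computers and removal of old ones; this is in the spirit of the abstract but not the paper's exact equations, and all parameters are illustrative.

```python
# Sketch of an SIR-type virus model with inflow of new computers (rate mu*N)
# and removal of old ones (rate mu). Not the paper's exact model; forward-Euler
# integration with illustrative parameters.
def simulate(beta, gamma, mu, S, I, R, dt=0.01, steps=50000):
    for _ in range(steps):
        N = S + I + R
        dS = mu * N - beta * S * I / N - mu * S   # new machines join susceptible
        dI = beta * S * I / N - gamma * I - mu * I
        dR = gamma * I - mu * R                   # cleaned by antivirus
        S, I, R = S + dS * dt, I + dI * dt, R + dR * dt
    return S, I, R

R0 = 0.5 / (0.2 + 0.05)        # beta / (gamma + mu) = 2.0 > 1 -> endemic
S, I, R = simulate(beta=0.5, gamma=0.2, mu=0.05, S=990, I=10, R=0)
print(f"R0={R0:.1f}, endemic infected fraction ~ {I/(S+I+R):.3f}")
```

With R0 > 1 the system settles at the endemic equilibrium (here S/N = 1/R0); with R0 < 1 the infection dies out, illustrating the stability dichotomy the paper derives.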

  1. Decision-analytical modelling in health-care economic evaluations.

    Science.gov (United States)

    Sun, Xin; Faunce, Thomas

    2008-11-01

    Decision-analytical modelling is widely used in health-care economic evaluations, especially in situations where evaluators lack clinical trial data, and in circumstances where such evaluations factor into reimbursement pricing decisions. This paper aims to improve the understanding and use of modelling techniques in this context, with particular emphasis on Markov modelling. We provide an overview, in this paper, of the principles and methodological details of decision-analytical modelling. We propose a common route for practicing modelling that accommodates any type of decision-analytical modelling techniques. We use the treatment of chronic hepatitis B as an example to indicate the process of development, presentation and analysis of the Markov model, and discuss the strengths, weaknesses and pitfalls of different approaches. Good practice of modelling requires careful planning, conduct and analysis of the model, and needs input from modellers and users.
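The mechanics of a Markov cohort model of the kind described can be sketched generically. The three states, transition probabilities, costs and utilities below are hypothetical placeholders, not the paper's chronic hepatitis B model.

```python
# Generic sketch of a Markov cohort model for economic evaluation (states,
# probabilities, costs and utilities are hypothetical, not the hepatitis B case).
import numpy as np

states = ["well", "chronic", "dead"]
P = np.array([[0.90, 0.08, 0.02],        # yearly transition probabilities
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
cost = np.array([100.0, 2500.0, 0.0])    # annual cost per state
qaly = np.array([1.00, 0.70, 0.00])      # annual utility per state

cohort = np.array([1.0, 0.0, 0.0])       # everyone starts in "well"
total_cost = total_qaly = 0.0
for year in range(40):                   # 40-year horizon, 3% discounting
    disc = 1.03 ** -year
    total_cost += disc * cohort @ cost
    total_qaly += disc * cohort @ qaly
    cohort = cohort @ P                  # advance the cohort one cycle

print(f"discounted cost: {total_cost:.0f}, QALYs: {total_qaly:.2f}")
```

Running the same loop for two treatment strategies (two `P`/`cost` pairs) and comparing the totals yields the incremental cost-effectiveness ratio that such evaluations report.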

  2. [Demo-economic models of development: evolution and recent trends].

    Science.gov (United States)

    Bourcier De Carbon, P

    1983-01-01

    Among the recommendations of the 1974 World Population Conference in Bucharest was the elaboration of empirical and inductive demographic-economic models to assist in planning. One of the disadvantages of existing models and systems of national income accounting was that income distribution was ignored in favor of the total value of production. Demographic variables were not regarded as endogenous. By the early 1970s, the societal changes attendant on rural exodus and urban unemployment, the increasing absorption of traditional structures into the modern sector, and changes in the roles of women and young people had become obvious, and the need for new models that would reflect such changes was clear. Some macromodels designed to assist medium- and long-range planning were first elaborated in the mid 1970s; the Bachue development models were particularly promising because of their improved database. New models were developed which incorporated consumption problems based on basic needs. An increased focus on the interaction of macroeconomic variables with microsociological and demographic variables, the household and family, and employment and the labor market became necessary. The Bachue models, which had been the most successful of recent models in integrating economic and sociodemographic structures and variables, usually include four principal modules which cover demography and the educational system, the economy, employment, and income distribution; basic needs models include a fifth module. The Bachue models are based on a general equilibrium model subject to certain constraints, while the basic needs models are based on dynamic disequilibrium models. A major problem of the models is that technological progress is fundamentally exogenous; there is no intimate link between productivity and elevation of the educational level of the labor force.
To avoid unmanageable complexity, households are reconstructed for each period on the basis of the demographic and economic data

  3. Modelling the interaction between flooding events and economic growth

    Science.gov (United States)

    Grames, Johanna; Fürnkranz-Prskawetz, Alexia; Grass, Dieter; Viglione, Alberto; Blöschl, Günter

    2016-04-01

    Recently, socio-hydrology models have been proposed to analyze the interplay of community risk-coping culture, flooding damage and economic growth. These models descriptively explain the feedbacks between socio-economic development and natural disasters such as floods. Complementary to these descriptive models, we develop a dynamic optimization model, where the inter-temporal decision of an economic agent interacts with the hydrological system. This interdisciplinary approach matches the goals of Panta Rhei, i.e., to understand feedbacks between hydrology and society. It enables new perspectives but also shows the limitations of each discipline. Young scientists need mentors from various scientific backgrounds to learn their different research approaches and how best to combine them such that interdisciplinary scientific work is also accepted by different science communities. In our socio-hydrology model we apply a macro-economic decision framework to a long-term flood scenario. We assume a standard macro-economic growth model where agents derive utility from consumption and output depends on physical capital that can be accumulated through investment. To this framework we add the occurrence of flooding events which destroy part of the capital. We identify two specific periodic long-term solutions and denote them rich and poor economies. Whereas rich economies can afford to invest in flood defense and therefore avoid flood damage and develop high living standards, poor economies prefer consumption instead of investing in flood defense capital and end up facing flood damages every time the water level rises. Nevertheless, they manage to sustain at least a low level of physical capital. We identify optimal investment strategies and compare simulations with more frequent and more intense high-water-level events.
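The qualitative mechanism behind the rich/poor dichotomy can be sketched with a toy simulation (this is an invented illustration, not the paper's optimization model): capital grows through saving, and periodic floods destroy a share of capital unless enough flood-defense capital has been accumulated. All numbers are hypothetical.

```python
# Toy sketch of the mechanism described (not the paper's dynamic optimization):
# capital accumulates via saving y = k**0.5; a flood every 10 years destroys
# 40% of capital unless defense capital exceeds a threshold. Numbers invented.
def grow(defend, years=100, k=1.0):
    s, dep, flood_every, damage = 0.25, 0.05, 10, 0.4
    defense = 0.0
    for t in range(1, years + 1):
        output = k ** 0.5                # simple production function
        invest = s * output
        if defend:                       # divert part of saving to flood defense
            defense += 0.4 * invest
            invest *= 0.6
        k += invest - dep * k
        if t % flood_every == 0 and defense < 0.8:
            k *= (1 - damage)            # undefended flood destroys capital
    return k

print(f"defended economy k={grow(True):.2f}, undefended k={grow(False):.2f}")
```

Despite investing less in productive capital each year, the defending economy ends up richer because it escapes the recurring flood losses, which is the flavor of the "rich" periodic solution in the paper.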

  4. Computational modeling of neurostimulation in brain diseases.

    Science.gov (United States)

    Wang, Yujiang; Hutchings, Frances; Kaiser, Marcus

    2015-01-01

    Neurostimulation as a therapeutic tool has been developed and used for a range of different diseases such as Parkinson's disease, epilepsy, and migraine. However, it is not known why the efficacy of the stimulation varies dramatically across patients or why some patients suffer from severe side effects. This is largely due to the lack of mechanistic understanding of neurostimulation. Hence, theoretical computational approaches to address this issue are in demand. This chapter provides a review of mechanistic computational modeling of brain stimulation. In particular, we will focus on brain diseases, where mechanistic models (e.g., neural population models or detailed neuronal models) have been used to bridge the gap between cellular-level processes of affected neural circuits and the symptomatic expression of disease dynamics. We show how such models have been, and can be, used to investigate the effects of neurostimulation in the diseased brain. We argue that these models are crucial for the mechanistic understanding of the effect of stimulation, allowing for a rational design of stimulation protocols. Based on mechanistic models, we argue that the development of closed-loop stimulation is essential in order to avoid interference with healthy ongoing brain activity. Furthermore, patient-specific data, such as neuroanatomic information and connectivity profiles obtainable from neuroimaging, can be readily incorporated to address the clinical issue of variability in efficacy between subjects. We conclude that mechanistic computational models can and should play a key role in the rational design of effective, fully integrated, patient-specific therapeutic brain stimulation. © 2015 Elsevier B.V. All rights reserved.

  5. Molecular Sieve Bench Testing and Computer Modeling

    Science.gov (United States)

    Mohamadinejad, Habib; DaLee, Robert C.; Blackmon, James B.

    1995-01-01

    The design of an efficient four-bed molecular sieve (4BMS) CO2 removal system for the International Space Station depends on many mission parameters, such as duration, crew size, cost of power, volume, fluid interface properties, etc. A need for space vehicle CO2 removal system models capable of accurately performing extrapolated hardware predictions is inevitable due to changes in the parameters that influence the CO2 removal system capacity. The purpose is to investigate the mathematical techniques required for a model capable of accurate extrapolated performance predictions and to obtain test data required to estimate mass transfer coefficients and verify the computer model. Models have been developed to demonstrate that the finite difference technique can be successfully applied to sorbents and conditions used in spacecraft CO2 removal systems. The nonisothermal, axially dispersed, plug flow model with linear driving force for 5X sorbent and pore diffusion for silica gel are then applied to test data. A more complex model, a non-darcian model (two dimensional), has also been developed for simulation of the test data. This model takes into account the channeling effect on column breakthrough. Four FORTRAN computer programs are presented: a two-dimensional model of flow adsorption/desorption in a packed bed; a one-dimensional model of flow adsorption/desorption in a packed bed; a model of thermal vacuum desorption; and a model of a tri-sectional packed bed with two different sorbent materials. The programs are capable of simulating up to four gas constituents for each process, which can be increased with a few minor changes.

  6. Technical, Legal, Economic and Social Aspects of Biometrics for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jernej Bule

    2014-12-01

    Full Text Available This article addresses technical, legal, economic and social aspects of biometrics for cloud computing, featuring an application example, the gains of such a solution, and current laws, directives and legislation for biometrics and cloud computing. It is primarily based on the Slovenian example, due to common general EU legislation in the field of cloud computing and biometrics. Authentication on the Internet is still mainly done using passwords, while biometrics is practically not used. It is commonly known that everything is moving to the cloud, and biometrics is not an exception. The amount of biometric data is expected to grow significantly over the next few years, and only cloud computing is able to process such amounts of data. Due to these facts and increasing security needs, we propose and implement the use of biometry as a service in the cloud. A challenge regarding the use of biometric solutions in the cloud is the protection of the privacy of individuals and their personal data. In Slovenia privacy legislation is very strong and permits the usage of biometrics only for very specific reasons, but we predict that big players on the market will change this fact globally. One of the important reasons is that biometrics for cloud computing provides some strong benefits and economic incentives; proper deployment can provide significant savings. Such solutions could improve people's quality of life in terms of social development, especially in the sense of more convenient, safer and more reliable identification over multiple government and non-government services.

  7. An Economic Model of Workplace Mobbing in Academe

    Science.gov (United States)

    Faria, Joao Ricardo; Mixon, Franklin G., Jr.; Salter, Sean P.

    2012-01-01

    Workplace bullying or mobbing can be defined as the infliction of various forms of abuse (e.g., verbal, emotional, psychological) against a colleague or subordinate by one or more other members of a workplace. Even in the presence of academic tenure, workplace mobbing remains a prevalent issue in academe. This study develops an economic model that…

  8. Student Migration to Online Education: An Economic Model

    Science.gov (United States)

    Eisenhauer, Joseph G.

    2013-01-01

    The popularity of distance education has increasingly led universities to consider expanding their online offerings. Remarkably few financial models have been presented for online courses, however, and fewer still have investigated the economic consequences of the migration, or cross-over, of students from traditional classes within the…

  9. Nonlinear Economic Model Predictive Control Strategy for Active Smart Buildings

    DEFF Research Database (Denmark)

    Santos, Rui Mirra; Zong, Yi; Sousa, Joao M. C.

    2016-01-01

    Nowadays, the development of advanced and innovative intelligent control techniques for energy management in buildings is a key issue within the smart grid topic. A nonlinear economic model predictive control (EMPC) scheme, based on the branch-and-bound tree search used as optimization algorithm...

  10. R.M. Solow Adjusted Model of Economic Growth

    Directory of Open Access Journals (Sweden)

    Ion Gh. Rosca

    2007-05-01

    The analysis part of the model is based on the study of the equilibrium in the continuous case, with some interpretations of the discrete one, by using the state diagram. The optimization problem at the economic level is also used; it is built up of a specified number of representative consumers and firms in order to reveal the interaction between these elements.
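The continuous-case equilibrium referred to above can be illustrated with the textbook Solow model that the article adjusts: capital per worker follows dk/dt = s*k**alpha - (n + delta)*k, with closed-form steady state k* = (s/(n + delta))**(1/(1 - alpha)). Parameter values below are illustrative.

```python
# Textbook Solow growth model sketch (the baseline the article adjusts):
# dk/dt = s*k**alpha - (n + delta)*k, steady state k* = (s/(n+delta))**(1/(1-alpha)).
def steady_state(s, n, delta, alpha):
    return (s / (n + delta)) ** (1 / (1 - alpha))

def simulate(k, s, n, delta, alpha, dt=0.1, steps=2000):
    for _ in range(steps):
        k += (s * k ** alpha - (n + delta) * k) * dt   # forward-Euler step
    return k

s, n, delta, alpha = 0.3, 0.01, 0.05, 1 / 3
print("analytic k* :", round(steady_state(s, n, delta, alpha), 3))
print("simulated k :", round(simulate(1.0, s, n, delta, alpha), 3))
```

The simulated trajectory converges to the analytic steady state from any positive starting capital, which is the equilibrium property studied via the state diagram.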

  11. Real-Time Optimization for Economic Model Predictive Control

    DEFF Research Database (Denmark)

    Sokoler, Leo Emil; Edlund, Kristian; Frison, Gianluca

    2012-01-01

    In this paper, we develop an efficient homogeneous and self-dual interior-point method for the linear programs arising in economic model predictive control. To exploit structure in the optimization problems, the algorithm employs a highly specialized Riccati iteration procedure. Simulations show...

  12. Aligning the economic modeling of software reuse with reuse practices

    NARCIS (Netherlands)

    Postmus, D.; Meijler, 27696

    In contrast to current practices where software reuse is applied recursively and reusable assets are tailored through parameterization or specialization, existing reuse economic models assume that (i) the cost of reusing a software asset depends on its size and (ii) reusable assets are developed from

  13. Micro-economic panel data models for Dutch dairy farming

    NARCIS (Netherlands)

    Ooms, D.L.

    2007-01-01

    Keywords: micro-economic models, panel data, GMM, CAP reform, dairy farming. In The Netherlands, the dairy sector is the largest agricultural sub-sector based on both gross production value and number of

  14. Using STELLA Simulation Models to Teach Natural Resource Economics

    Science.gov (United States)

    Dissanayake, Sahan T. M.

    2016-01-01

    In this article, the author discusses how graphical simulation models created using STELLA software can be used to present natural resource systems in an intuitive way in undergraduate natural resource economics classes based on his experiences at a leading research university, a state university, and a leading liberal arts college in the United…

  15. Bio-economic household modelling for agricultural intensification

    NARCIS (Netherlands)

    Kruseman, G.

    2000-01-01

    This study contributes to the quest for sustainable agricultural intensification through the development of a quantitative bio-economic modelling framework that allows assessment of new technology and policy measures in terms of household welfare and sustainability indicators. The main aim

  16. Modelling and estimation of dynamic instability in complex economic systems

    NARCIS (Netherlands)

    Wang, J.

    2015-01-01

    This thesis sheds some lights on the on-going discussions on modelling the economy as a complex evolving system. It introduces a complex systems approach and attempts to unfold the underlying mechanisms of dynamic instability in complex economic system. Moreover, it contributes to the ongoing

  17. The economic production lot size model with several production rates

    DEFF Research Database (Denmark)

    Larsen, Christian

    We study an extension of the economic production lot size model, where more than one production rate can be used during a cycle. The production rates and their corresponding runtimes are decision variables. We decompose the problem into two subproblems. First, we show that all production rates...

  18. Review of whole-farm economic modelling for irrigation farming ...

    African Journals Online (AJOL)

    The main objective of this paper is to review the progress that has been made in South Africa with respect to whole-farm economic modelling over the past 2 decades. Farming systems are complex and careful consideration to the stochastic dynamic nature of irrigation farming processes and their linkages with the larger ...

  19. Computational Modeling of Pollution Transmission in Rivers

    Science.gov (United States)

    Parsaie, Abbas; Haghiabi, Amir Hamzeh

    2017-06-01

    Modeling of river pollution contributes to better management of water quality, and this will lead to the improvement of human health. The advection-dispersion equation (ADE) is the governing equation for pollutant transmission in a river. Modeling the pollution transmission includes numerical solution of the ADE and estimation of the longitudinal dispersion coefficient (LDC). In this paper, a novel approach is proposed for numerical modeling of pollution transmission in rivers: the finite volume method as the numerical solver and an artificial neural network (ANN) as a soft computing technique are used together in the simulation. In this approach, the result of the ANN for predicting the LDC was taken as an input parameter for the numerical solution of the ADE. To validate the model performance on real engineering problems, the pollutant transmission in the Severn River has been simulated. Comparison of the final model results with measured data from the Severn River showed that the model performs well. Predicting the LDC with the ANN model significantly improved the accuracy of the computer simulation of pollution transmission in the river.
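The numerical side of the ADE can be sketched with a simple explicit finite-difference scheme (upwind advection, central dispersion). This is not the paper's finite-volume/ANN code: here the dispersion coefficient D is simply fixed, and all parameters are illustrative.

```python
# Simple explicit finite-difference sketch of the 1D advection-dispersion
# equation dC/dt = -u*dC/dx + D*d2C/dx2 (upwind advection for u > 0, central
# dispersion). Not the paper's finite-volume/ANN model; parameters illustrative.
def step(C, u, D, dx, dt):
    new = C[:]
    for i in range(1, len(C) - 1):
        adv = -u * (C[i] - C[i - 1]) / dx
        disp = D * (C[i + 1] - 2 * C[i] + C[i - 1]) / dx ** 2
        new[i] = C[i] + dt * (adv + disp)
    return new

nx, dx, dt, u, D = 100, 10.0, 1.0, 0.5, 5.0   # CFL: u*dt/dx and D*dt/dx**2 small
C = [0.0] * nx
C[10] = 100.0                     # instantaneous pollutant injection at cell 10
for _ in range(500):
    C = step(C, u, D, dx, dt)
peak = max(range(nx), key=lambda i: C[i])
print(f"plume peak at cell {peak}, concentration {C[peak]:.2f}")
```

After 500 s the plume has advected u*t = 250 m (25 cells) downstream while dispersing, so the peak sits near cell 35 at a much lower concentration; in the paper's approach, D would instead be supplied per reach by the trained ANN.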

  20. A Generic Bio-Economic Farm Model for Environmental and Economic Assessment of Agricultural Systems

    Science.gov (United States)

    Louhichi, Kamel; Kanellopoulos, Argyris; Zander, Peter; Flichman, Guillermo; Hengsdijk, Huib; Meuter, Eelco; Andersen, Erling; Belhouchette, Hatem; Blanco, Maria; Borkowski, Nina; Heckelei, Thomas; Hecker, Martin; Li, Hongtao; Oude Lansink, Alfons; Stokstad, Grete; Thorne, Peter; van Keulen, Herman; van Ittersum, Martin K.

    2010-01-01

    Bio-economic farm models (BEFMs) are tools to evaluate ex-post or to assess ex-ante the impact of policy and technology change on agriculture, economics and environment. Recently, various BEFMs have been developed, often for one purpose or location, but hardly any of these models are re-used later for other purposes or locations. The Farm System Simulator (FSSIM) provides a generic framework enabling the application of BEFMs under various situations and for different purposes (generating supply response functions and detailed regional or farm type assessments). FSSIM is set up as a component-based framework with components representing farmer objectives, risk, calibration, policies, current activities, alternative activities and different types of activities (e.g., annual and perennial cropping and livestock). The generic nature of FSSIM is evaluated using five criteria by examining its applications. FSSIM has been applied for different climate zones and soil types (criterion 1) and to a range of different farm types (criterion 2) with different specializations, intensities and sizes. In most applications FSSIM has been used to assess the effects of policy changes and in two applications to assess the impact of technological innovations (criterion 3). In the various applications, different data sources, level of detail (e.g., criterion 4) and model configurations have been used. FSSIM has been linked to an economic and several biophysical models (criterion 5). The model is available for applications to other conditions and research issues, and it is open to be further tested and to be extended with new components, indicators or linkages to other models. PMID:21113782

  1. Regional disaster impact analysis: comparing Input-Output and Computable General Equilibrium models

    NARCIS (Netherlands)

    Koks, E.E.; Carrera, L.; Jonkeren, O.; Aerts, J.C.J.H.; Husby, T.G.; Thissen, M.; Standardi, G.; Mysiak, J.

    2016-01-01

    A variety of models have been applied to assess the economic losses of disasters, of which the most common ones are input-output (IO) and computable general equilibrium (CGE) models. In addition, an increasing number of scholars have developed hybrid approaches: one that combines both or either of

  2. Economism

    Directory of Open Access Journals (Sweden)

    P. Simons

    2010-07-01

    Full Text Available Modern society is characterised not only by a fascination with scientific technology as a means of solving all problems, especially those that stand in the way of material progress (technicism), but also by an obsessive interest in everything that has to do with money (economism or mammonism). The article discusses the relationship between technicism and economism on the basis of their relationship to utilitarian thinking: the quest for the greatest happiness for the greatest number of people. Recent major studies of neo-liberalism (seen as an intensification of utilitarianism) by Laval and Dardot are used as reference points for the development of utilitarianism. It is suggested that the western view of the world, as expressed in economism and technicism with their utilitarian ethics, features three absolutisations: those of theoretical thinking, technology and economics. In a second part, the article draws on the framework of reformational philosophy to suggest an approach that, in principle, is not marred by such absolutisations.

  3. Models of rational decision making in contemporary economic theory

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2015-01-01

    Full Text Available The aim of this paper is to show that economists cannot adequately explain rational behavior if they focus only on the model of full rationality and the model of instrumental rationality; including related models yields a "larger view" that reflects rational behavior more representatively and provides a solid basis for constructing models of decision-making in contemporary economic science. The structure of the paper follows from this aim. In the first part, we define the model of full rationality and its important characteristics. In the second part, we analyze the model of instrumental rationality, starting from the statement, common in economic theory, that the rational actor uses the best means to achieve his or her objectives. In the third part, we consider the basics of the model of value rationality. In the fourth part, we consider the key characteristics of the model of bounded rationality. In the last part, we focus on questioning the basic assumptions of the models of full and instrumental rationality, analyzing in particular the personal and social goal preferences of high school and university students.

  4. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system. Where traditional research methodologies for the human cardiovascular system are challenging due to their invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point for each of the individual disciplines involved and attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  5. Method of generating a computer readable model

    DEFF Research Database (Denmark)

    2008-01-01

    A method of generating a computer readable model of a geometrical object constructed from a plurality of interconnectable construction elements, wherein each construction element has a number of connection elements for connecting the construction element with another construction element. The method comprises encoding a first and a second one of the construction elements as corresponding data structures, each representing the connection elements of the corresponding construction element, and each of the connection elements having associated with it a predetermined connection type. The method...

  6. A hybrid computational model for phagocyte transmigration

    OpenAIRE

    Xue, Jiaxing; Gao, Jean; Tang, Liping

    2008-01-01

    Phagocyte transmigration is the initiation of a series of phagocyte responses that are believed important in the formation of fibrotic capsules surrounding implanted medical devices. Understanding the molecular mechanisms governing phagocyte transmigration is highly desired in order to improve the stability and functionality of the implanted devices. A hybrid computational model that combines control theory and kinetics Monte Carlo (KMC) algorithm is proposed to simulate and predict phagocyte...

  7. An Impulse Model for Computer Viruses

    Directory of Open Access Journals (Sweden)

    Chunming Zhang

    2012-01-01

    Full Text Available A computer virus spread model incorporating an impulsive control strategy is proposed and analyzed. We prove that there exists a globally attractive infection-free periodic solution when the vaccination rate is larger than θ0. Moreover, we show that the system is uniformly persistent if the vaccination rate is less than θ1. Some numerical simulations are finally given to illustrate the main results.
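    The impulsive control strategy described above can be sketched with a minimal SIR-type simulation in which a fraction of susceptible machines is vaccinated at fixed intervals. All parameter values and the specific compartment structure below are illustrative assumptions, not taken from the paper:

```python
def simulate(beta=0.3, gamma=0.1, p=0.4, period=10.0, dt=0.01, t_end=200.0):
    """SIR-type virus spread with impulsive vaccination: a fraction p of
    susceptible machines is patched every `period` time units.
    All parameter values here are illustrative, not from the paper."""
    S, I, R = 0.99, 0.01, 0.0
    t, next_pulse = 0.0, period
    history = []
    while t < t_end:
        dS = -beta * S * I              # new infections remove susceptibles
        dI = beta * S * I - gamma * I
        dR = gamma * I                  # cleaned machines
        S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
        t += dt
        if t >= next_pulse:             # impulsive control: instantaneous vaccination pulse
            S, R = (1 - p) * S, R + p * S
            next_pulse += period
        history.append((t, I))
    return history

final_I = simulate()[-1][1]
print(f"infected fraction at t_end: {final_I:.6f}")
```

    With a sufficiently large pulse vaccination fraction, the infected population decays toward zero, mirroring the globally attractive infection-free periodic solution the abstract describes.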

  8. Patients with newly diagnosed cervical cancer should be screened for anal human papilloma virus and anal dysplasia: Results of a pilot study using a STELLA computer simulation and economic model.

    Science.gov (United States)

    Ehrenpreis, Eli D; Smith, Dylan G

    2017-12-13

    Women with cervical cancer often have anal human papillomavirus (HPV) infection and anal dysplasia. However, the effectiveness of anal HPV screening is unknown. A dynamic model was constructed using STELLA. Populations are represented as "stocks" that change according to model rates. Initial anal cytology in new cervical cancer patients, dysplasia progression and regression, cost of treating high-grade squamous intraepithelial lesions (HSIL), and lifetime costs for anal cancer care were extrapolated from the literature. Local costs of anal HPV testing and cytology were obtained. Outcomes included anal cancer rates, anal cancer deaths, screening costs and cancer care. Benefits in the screened group included a reduction in anal cancers after three years and in anal cancer deaths after four years. After 10 years, predicted costs per anal cancer prevented and per anal cancer death averted were $168,796 and $210,057, and were $98,631 and $210,057 at 20 years. Predicted costs per quality of life year saved at 10 and 20 years were $9785 and $1687. Sensitivity analysis demonstrated cost-effectiveness of screening for a variety of cure rates of HSIL with electrocautery. Screening for anal HPV and treatment of anal HSIL in patients with cervical cancer is cost-effective, prevents anal cancer and reduces anal cancer deaths. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Computational Modeling and Simulation of Developmental ...

    Science.gov (United States)

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems Toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro HTS data from ToxCast. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative predic

  10. Computer Based Modelling and Simulation-Modelling and ...

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation – Modelling and Simulation with Probability and Throwing Dice. N K Srinivasan. Resonance – Journal of Science Education, General Article, Volume 6, Issue 4, April 2001, pp 69-77.

  11. Economics of extreme weather events: Terminology and regional impact models

    Directory of Open Access Journals (Sweden)

    Malte Jahn

    2015-12-01

    Full Text Available Impacts of extreme weather events are relevant for regional (in the sense of subnational) economies, and in particular for cities, in many respects. Cities are the cores of economic activity, and the number of people and assets endangered by extreme weather events is large, even under the current climate. A changing climate with changing extreme weather patterns and the process of urbanization will make the whole issue even more relevant in the future. In this paper, definitions and terminology in the field of extreme weather events are discussed. Possible regional impacts of extreme weather events are collected, focusing on European cities. The human contributions to those impacts are emphasized. Furthermore, methodological aspects of economic impact assessment are discussed along a temporal and a sectoral dimension. Finally, common economic impact models are compared, analyzing their strengths and weaknesses.

  12. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in.Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti

  13. Fuzzy predictive filtering in nonlinear economic model predictive control for demand response

    DEFF Research Database (Denmark)

    Santos, Rui Mirra; Zong, Yi; Sousa, Joao M. C.

    2016-01-01

    The performance of a model predictive controller (MPC) is highly correlated with the model's accuracy. This paper introduces an economic model predictive control (EMPC) scheme based on a nonlinear model, which uses a branch-and-bound tree search for solving the inherent non-convex optimization problem. Moreover, to reduce the computation time and improve the controller's performance, a fuzzy predictive filter is introduced. With the purpose of testing the developed EMPC, a simulation controlling the temperature levels of an intelligent office building (PowerFlexHouse), with and without fuzzy filtering, is performed. The results show that the controller achieves a good performance while keeping the temperature inside the predefined comfort limits. Fuzzy predictive filtering has shown to be an effective tool which is capable of reducing the computational burden and increasing the performance...

  14. Computational acoustic modeling of cetacean vocalizations

    Science.gov (United States)

    Gurevich, Michael Dixon

    A framework for computational acoustic modeling of hypothetical vocal production mechanisms in cetaceans is presented. As a specific example, a model of a proposed source in the larynx of odontocetes is developed. Whales and dolphins generate a broad range of vocal sounds, but the exact mechanisms they use are not conclusively understood. In the fifty years since it has become widely accepted that whales can and do make sound, how they do so has remained particularly confounding. Cetaceans' highly divergent respiratory anatomy, along with the difficulty of internal observation during vocalization have contributed to this uncertainty. A variety of acoustical, morphological, ethological and physiological evidence has led to conflicting and often disputed theories of the locations and mechanisms of cetaceans' sound sources. Computational acoustic modeling has been used to create real-time parametric models of musical instruments and the human voice. These techniques can be applied to cetacean vocalizations to help better understand the nature and function of these sounds. Extensive studies of odontocete laryngeal morphology have revealed vocal folds that are consistently similar to a known but poorly understood acoustic source, the ribbon reed. A parametric computational model of the ribbon reed is developed, based on simplified geometrical, mechanical and fluid models drawn from the human voice literature. The physical parameters of the ribbon reed model are then adapted to those of the odontocete larynx. With reasonable estimates of real physical parameters, both the ribbon reed and odontocete larynx models produce sounds that are perceptually similar to their real-world counterparts, and both respond realistically under varying control conditions. Comparisons of acoustic features of the real-world and synthetic systems show a number of consistencies. While this does not on its own prove that either model is conclusively an accurate description of the source, it

  15. Computational Design Modelling : Proceedings of the Design Modelling Symposium

    CERN Document Server

    Kilian, Axel; Palz, Norbert; Scheurer, Fabian

    2012-01-01

    This book publishes the peer-reviewed proceedings of the third Design Modelling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts examine the computational processes within this field, working to develop a broader and less exotic building practice that bears more subtle but powerful traces of the complex tool set and approaches developed and studied over recent years. The outcome is a set of new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design, guided by responsibility towards both processes and the consequences they initiate.

  16. Toward a computational model of hemostasis

    Science.gov (United States)

    Leiderman, Karin; Danes, Nicholas; Schoeman, Rogier; Neeves, Keith

    2017-11-01

    Hemostasis is the process by which a blood clot forms to prevent bleeding at a site of injury. The formation time, size and structure of a clot depend on the local hemodynamics and the nature of the injury. Our group has previously developed computational models to study intravascular clot formation, a process confined to the interior of a single vessel. Here we present the first stage of an experimentally-validated, computational model of extravascular clot formation (hemostasis) in which blood flowing through a single vessel initially escapes through a hole in the vessel wall and out a separate injury channel. This stage of the model consists of a system of partial differential equations that describe platelet aggregation and hemodynamics, solved via the finite element method. We also present results from the analogous, in vitro, microfluidic model. In both models, formation of a blood clot occludes the injury channel and stops flow from escaping while blood in the main vessel retains its fluidity. We discuss the different biochemical and hemodynamic effects on clot formation using distinct geometries representing intra- and extravascular injuries.

  17. Stochastic effects in a discretized kinetic model of economic exchange

    Science.gov (United States)

    Bertotti, M. L.; Chattopadhyay, A. K.; Modanese, G.

    2017-04-01

    Linear stochastic models and discretized kinetic theory are two complementary analytical techniques used for the investigation of complex systems of economic interactions. The former employ Langevin equations, with an emphasis on stock trade; the latter is based on systems of ordinary differential equations and is better suited for the description of binary interactions, taxation and welfare redistribution. We propose a new framework which establishes a connection between the two approaches by introducing random fluctuations into the kinetic model based on Langevin and Fokker-Planck formalisms. Numerical simulations of the resulting model indicate positive correlations between the Gini index and the total wealth, that suggest a growing inequality with increasing income. Further analysis shows, in the presence of a conserved total wealth, a simultaneous decrease in inequality as social mobility increases, in conformity with economic data.
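    The binary-exchange dynamics and Gini-index analysis described in this abstract can be illustrated with a minimal conservative exchange simulation. The uniform-split exchange rule, agent count and step count below are assumptions for illustration, not the paper's actual kinetic model:

```python
import random

def gini(wealth):
    """Gini coefficient via the sorted-rank identity:
    G = 2 * sum_i(i * w_i) / (n * total) - (n + 1) / n, with 1-based ranks."""
    w = sorted(wealth)
    n = len(w)
    total = sum(w)
    cum = sum((i + 1) * x for i, x in enumerate(w))
    return 2 * cum / (n * total) - (n + 1) / n

def exchange_sim(n=1000, steps=100000, seed=1):
    """Random binary exchanges conserving total wealth: each step, a random
    pair pools their wealth and splits it at a uniform random fraction."""
    random.seed(seed)
    wealth = [1.0] * n
    for _ in range(steps):
        i, j = random.randrange(n), random.randrange(n)
        if i == j:
            continue
        pool = wealth[i] + wealth[j]
        r = random.random()
        wealth[i], wealth[j] = r * pool, (1 - r) * pool
    return wealth

w = exchange_sim()
# theory predicts a roughly exponential equilibrium for this rule (Gini near 0.5)
print(f"Gini after exchanges: {gini(w):.3f}")
```

    Starting from perfect equality (Gini 0), inequality grows toward the equilibrium value while total wealth is conserved, the kind of interplay between wealth conservation and inequality the abstract examines.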

  18. ECONOMIC FORECASTS BASED ON ECONOMETRIC MODELS USING EViews 5

    Directory of Open Access Journals (Sweden)

    Cornelia TomescuDumitrescu,

    2009-05-01

    Full Text Available Forecasting the evolution of economic phenomena is, for the most part, the final objective of econometrics; it also represents a real test of the validity of an elaborated model. Unlike forecasts based on the study of time series, which have a recognizably inertial character, forecasts generated by econometric models with simultaneous equations aim to outline the future of important economic variables through the direct and indirect influences exerted on them by exogenous variables. To ease the calculations required for forecasts based on econometric models, the use of specialized software is indicated. One such program is EViews, which is widely applied because it significantly reduces the time devoted to econometric analysis and ensures high accuracy in both the calculations and the interpretation of results.

  19. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    Full Text Available The development of selective agonists of the δ-opioid receptor, as well as models of ligand interaction with this receptor, are subjects of increased interest. In the absence of crystal structures of opioid receptors, 3D homology models with different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, among recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated by PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. The best correlations for the tested models of DOR were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER, good values from PROCHECK (92.6% of residues in most favored regions) and MolProbity (99.5% in favored regions). The scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with erel of a series of enkephalin analogues, calculated from in vitro experiments. This investigation thus allows us to suggest a reliable model of DOR. The newly generated model could be used further for in silico experiments, enabling faster and more accurate design of selective and effective ligands for the δ-opioid receptor.

  20. Stochastic computations in cortical microcircuit models.

    Directory of Open Access Journals (Sweden)

    Stefan Habenschuss

    Full Text Available Experimental data from neuroscience suggest that a substantial amount of knowledge is stored in the brain in the form of probability distributions over network states and trajectories of network states. We provide a theoretical foundation for this hypothesis by showing that even very detailed models for cortical microcircuits, with data-based diverse nonlinear neurons and synapses, have a stationary distribution of network states and trajectories of network states to which they converge exponentially fast from any initial state. We demonstrate that this convergence holds in spite of the non-reversibility of the stochastic dynamics of cortical microcircuits. We further show that, in the presence of background network oscillations, separate stationary distributions emerge for different phases of the oscillation, in accordance with experimentally reported phase-specific codes. We complement these theoretical results by computer simulations that investigate resulting computation times for typical probabilistic inference tasks on these internally stored distributions, such as marginalization or marginal maximum-a-posteriori estimation. Furthermore, we show that the inherent stochastic dynamics of generic cortical microcircuits enables them to quickly generate approximate solutions to difficult constraint satisfaction problems, where stored knowledge and current inputs jointly constrain possible solutions. This provides a powerful new computing paradigm for networks of spiking neurons, that also throws new light on how networks of neurons in the brain could carry out complex computational tasks such as prediction, imagination, memory recall and problem solving.

  1. Stochastic computations in cortical microcircuit models.

    Science.gov (United States)

    Habenschuss, Stefan; Jonke, Zeno; Maass, Wolfgang

    2013-01-01

    Experimental data from neuroscience suggest that a substantial amount of knowledge is stored in the brain in the form of probability distributions over network states and trajectories of network states. We provide a theoretical foundation for this hypothesis by showing that even very detailed models for cortical microcircuits, with data-based diverse nonlinear neurons and synapses, have a stationary distribution of network states and trajectories of network states to which they converge exponentially fast from any initial state. We demonstrate that this convergence holds in spite of the non-reversibility of the stochastic dynamics of cortical microcircuits. We further show that, in the presence of background network oscillations, separate stationary distributions emerge for different phases of the oscillation, in accordance with experimentally reported phase-specific codes. We complement these theoretical results by computer simulations that investigate resulting computation times for typical probabilistic inference tasks on these internally stored distributions, such as marginalization or marginal maximum-a-posteriori estimation. Furthermore, we show that the inherent stochastic dynamics of generic cortical microcircuits enables them to quickly generate approximate solutions to difficult constraint satisfaction problems, where stored knowledge and current inputs jointly constrain possible solutions. This provides a powerful new computing paradigm for networks of spiking neurons, that also throws new light on how networks of neurons in the brain could carry out complex computational tasks such as prediction, imagination, memory recall and problem solving.
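    The idea of reading out a stored probability distribution by sampling network states, as in the probabilistic inference tasks mentioned above, can be illustrated with a minimal Gibbs sampler on a two-unit binary network. The weights and biases are illustrative toy values, far simpler than the data-based microcircuit models in the paper:

```python
import math
import random

def gibbs_marginal(W, b, burn_in=1000, samples=20000, seed=0):
    """Estimate P(x_0 = 1) for a tiny binary pairwise network (Boltzmann
    machine) by sampling its stationary distribution, a simple stand-in
    for 'neural sampling' from a stored distribution."""
    random.seed(seed)
    n = len(b)
    x = [0] * n
    count = 0
    for t in range(burn_in + samples):
        i = random.randrange(n)
        # conditional probability of unit i being on, given the others
        field = b[i] + sum(W[i][j] * x[j] for j in range(n) if j != i)
        p_on = 1.0 / (1.0 + math.exp(-field))
        x[i] = 1 if random.random() < p_on else 0
        if t >= burn_in:
            count += x[0]
    return count / samples

W = [[0.0, 1.0], [1.0, 0.0]]   # symmetric excitatory coupling
b = [-0.5, -0.5]               # by symmetry, the exact marginal P(x0=1) is 0.5
est = gibbs_marginal(W, b)
print(f"estimated P(x0=1): {est:.3f}")
```

    Marginalization here reduces to counting how often a unit is active in the stationary distribution, which is the flavor of computation whose convergence times the paper investigates.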

  2. Computational model of a copper laser

    Energy Technology Data Exchange (ETDEWEB)

    Boley, C.D.; Molander, W.A.; Warner, B.E.

    1997-03-26

    This report describes a computational model of a copper laser amplifier. The model contains rate equations for copper and the buffer gas species (neon and hydrogen), along with equations for the electron temperature, the laser intensity, and the diffusing magnetic field of the discharge. Rates are given for all pertinent atomic reactions. The radial profile of the gas temperature is determined by the time-averaged power deposited in the gas. The presence of septum inserts, which aid gas cooling, is taken into account. Fields are calculated consistently throughout the plasma and the surrounding insulation. Employed in conjunction with a modulator model, the model is used to calculate comprehensive performance predictions for a high-power operational amplifier.

  3. Computational Modeling of Large Wildfires: A Roadmap

    KAUST Repository

    Coen, Janice L.

    2010-08-01

    Wildland fire behavior, particularly that of large, uncontrolled wildfires, has not been well understood or predicted. Our methodology to simulate this phenomenon uses high-resolution dynamic models made of numerical weather prediction (NWP) models coupled to fire behavior models to simulate fire behavior. NWP models are capable of modeling very high resolution (< 100 m) atmospheric flows. The wildland fire component is based upon semi-empirical formulas for fireline rate of spread, post-frontal heat release, and a canopy fire. The fire behavior is coupled to the atmospheric model such that low level winds drive the spread of the surface fire, which in turn releases sensible heat, latent heat, and smoke fluxes into the lower atmosphere, feeding back to affect the winds directing the fire. These coupled dynamic models capture the rapid spread downwind, flank runs up canyons, bifurcations of the fire into two heads, and rough agreement in area, shape, and direction of spread at periods for which fire location data is available. Yet, intriguing computational science questions arise in applying such models in a predictive manner, including physical processes that span a vast range of scales, processes such as spotting that cannot be modeled deterministically, estimating the consequences of uncertainty, the efforts to steer simulations with field data ("data assimilation"), lingering issues with short term forecasting of weather that may show skill only on the order of a few hours, and the difficulty of gathering pertinent data for verification and initialization in a dangerous environment. © 2010 IEEE.

  4. A Neural Computational Model of Incentive Salience

    Science.gov (United States)

    Zhang, Jun; Berridge, Kent C.; Tindell, Amy J.; Smith, Kyle S.; Aldridge, J. Wayne

    2009-01-01

    Incentive salience is a motivational property with ‘magnet-like’ qualities. When attributed to reward-predicting stimuli (cues), incentive salience triggers a pulse of ‘wanting’ and an individual is pulled toward the cues and reward. A key computational question is how incentive salience is generated during a cue re-encounter, which combines both learning and the state of limbic brain mechanisms. Learning processes, such as temporal-difference models, provide one way for stimuli to acquire cached predictive values of rewards. However, empirical data show that subsequent incentive values are also modulated on the fly by dynamic fluctuation in physiological states, altering cached values in ways requiring additional motivation mechanisms. Dynamic modulation of incentive salience for a Pavlovian conditioned stimulus (CS or cue) occurs during certain states, without necessarily requiring (re)learning about the cue. In some cases, dynamic modulation of cue value occurs during states that are quite novel, never having been experienced before, and even prior to experience of the associated unconditioned reward in the new state. Such cases can include novel drug-induced mesolimbic activation and addictive incentive-sensitization, as well as natural appetite states such as salt appetite. Dynamic enhancement specifically raises the incentive salience of an appropriate CS, without necessarily changing that of other CSs. Here we suggest a new computational model that modulates incentive salience by integrating changing physiological states with prior learning. We support the model with behavioral and neurobiological data from empirical tests that demonstrate dynamic elevations in cue-triggered motivation (involving natural salt appetite, and drug-induced intoxication and sensitization). Our data call for a dynamic model of incentive salience, such as presented here. Computational models can adequately capture fluctuations in cue-triggered ‘wanting’ only by

  5. Modelling the economic consequences of Marine Protected Areas using the BEMCOM model

    DEFF Research Database (Denmark)

    Hoff, A.; Andersen, J.L.; Christensen, Asbjørn

    2013-01-01

    This paper introduces and describes in detail the bioeconomic optimization model BEMCOM (BioEconomic Model to evaluate the COnsequences of Marine protected areas) that has been developed to assess the economic effects of introducing Marine Protected Areas (MPA) for fisheries. BEMCOM answers...

  6. Basin Economic Allocation Model (BEAM): An economic model of water use developed for the Aral Sea Basin

    Science.gov (United States)

    Riegels, Niels; Kromann, Mikkel; Karup Pedersen, Jesper; Lindgaard-Jørgensen, Palle; Sokolov, Vadim; Sorokin, Anatoly

    2013-04-01

    The water resources of the Aral Sea basin are under increasing pressure, particularly from the conflict over whether hydropower or irrigation water use should take priority. The purpose of the BEAM model is to explore the impact of changes to water allocation and investments in water management infrastructure on the overall welfare of the Aral Sea basin. The BEAM model estimates welfare changes associated with changes to how water is allocated between the five countries in the basin (Kazakhstan, Kyrgyz Republic, Tajikistan, Turkmenistan and Uzbekistan; water use in Afghanistan is assumed to be fixed). Water is allocated according to economic optimization criteria; in other words, the BEAM model allocates water across time and space so that the economic welfare associated with water use is maximized. The model is programmed in GAMS. The model addresses the Aral Sea Basin as a whole - that is, the rivers Syr Darya, Amu Darya, Kashkadarya, and Zarafshan, as well as the Aral Sea. The model representation includes water resources, including 14 river sections, 6 terminal lakes, 28 reservoirs and 19 catchment runoff nodes, as well as land resources (i.e., irrigated croplands). The model covers 5 sectors: agriculture (crops: wheat, cotton, alfalfa, rice, fruit, vegetables and others), hydropower, nature, households and industry. The focus of the model is on welfare impacts associated with changes to water use in the agriculture and hydropower sectors. The model aims at addressing the following issues of relevance for economic management of water resources: • Physical efficiency (estimating how investments in irrigation efficiency affect economic welfare). • Economic efficiency (estimating how changes in how water is allocated affect welfare). • Equity (who will gain from changes in allocation of water from one sector to another and who will lose?). Stakeholders in the region have been involved in the development of the model, and about 10 national experts, including
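    The allocation principle described above, distributing water across sectors so that economic welfare is maximized, can be illustrated with a toy greedy equimarginal allocation. The sector names and marginal benefit functions are hypothetical, and this sketch is in no way the BEAM/GAMS model itself:

```python
def allocate(total_water, marginal_value_funcs, step=0.01):
    """Greedy equimarginal allocation: repeatedly give the next unit of water
    to the use with the highest marginal value. For concave benefit functions
    this approaches the welfare-maximizing optimum, where marginal values
    equalize across uses."""
    alloc = {name: 0.0 for name in marginal_value_funcs}
    remaining = total_water
    while remaining > 1e-9:
        best = max(alloc, key=lambda s: marginal_value_funcs[s](alloc[s]))
        alloc[best] += step
        remaining -= step
    return alloc

# Hypothetical diminishing marginal benefits (value per unit of water)
mv = {
    "cotton":     lambda x: 10.0 - 2.0 * x,
    "hydropower": lambda x: 8.0 - 1.0 * x,
    "households": lambda x: 12.0 - 4.0 * x,
}
result = allocate(5.0, mv)
print({k: round(v, 2) for k, v in result.items()})
```

    At the optimum all three uses have (nearly) equal marginal value, which is the economic-efficiency criterion the abstract refers to; the real model solves this kind of problem across 14 river sections, reservoirs and five countries in GAMS.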

  7. Computer modeling for optimal placement of gloveboxes

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.; Olivas, J.D. [Los Alamos National Lab., NM (United States); Finch, P.R. [New Mexico State Univ., Las Cruces, NM (United States)

    1997-08-01

Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex have presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.
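The first-stage evolutionary heuristic can be illustrated with a toy version of the layout problem: order a few workstations along a line so that material movement (flow times distance) is minimized. The station names and flow values are invented for illustration; the paper's actual formulation and heuristic are more elaborate.

```python
import random

# Hypothetical material flows between glovebox operations (units moved).
FLOW = {("cast", "trim"): 10, ("trim", "inspect"): 6, ("cast", "inspect"): 2}
BOXES = ["cast", "trim", "inspect"]

def cost(order):
    """Total material movement: flow weighted by distance along the line."""
    pos = {b: i for i, b in enumerate(order)}
    return sum(f * abs(pos[a] - pos[b]) for (a, b), f in FLOW.items())

def evolve(generations=200, seed=0):
    """(1+1) evolutionary search: mutate by swapping two boxes, keep the fitter."""
    rng = random.Random(seed)
    best = BOXES[:]
    for _ in range(generations):
        child = best[:]
        i, j = rng.sample(range(len(child)), 2)
        child[i], child[j] = child[j], child[i]
        if cost(child) <= cost(best):
            best = child
    return best, cost(best)

layout, movement = evolve()
```

The best layouts found this way would then feed the second-stage simulation, which evaluates them against operational criteria such as radiation exposure.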

  8. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    Full Text Available The article discusses SWOT analysis, a widely used classical method of analysis, forecasting and decision-making in various economic problems. As is well known, it is a qualitative, multi-criteria comparison of the degrees of Strength, Weakness, Opportunity and Threat for different kinds of risks, for forecasting developments in markets, and for assessing the status and development prospects of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the evaluation and analysis of various project management tasks - investment, innovation, marketing, development, design, bringing products to market, and so on. In practical competitive market and economic conditions, however, there are various uncertainties, ambiguities and kinds of vagueness that make the use of SWOT analysis in its classical form insufficiently justified and ineffective. In this case, the authors propose to use a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of assessments in the SWOT analysis. In particular, the mathematical formulation of the corresponding task and the main approaches to its solution are briefly presented. Examples of suitable computer calculations in the specialized software Fuzicalc for processing and operating on fuzzy input data are also given. Finally, considerations for interpreting the results are presented.
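The fuzzy-set idea can be sketched without Fuzicalc: represent each SWOT rating as a triangular fuzzy number (low, mode, high), aggregate ratings component-wise, and defuzzify to a crisp score for comparison. The ratings below are hypothetical; real fuzzy SWOT procedures use richer membership functions and aggregation rules.

```python
def tfn_add(a, b):
    # The sum of two triangular fuzzy numbers is taken component-wise.
    return tuple(x + y for x, y in zip(a, b))

def defuzzify(t):
    # Centroid of a triangular fuzzy number (low, mode, high).
    return sum(t) / 3.0

# Hypothetical expert ratings of two "Strength" factors on a 0-10 scale,
# each given as (pessimistic, most likely, optimistic).
strength = tfn_add((6, 7, 9), (4, 5, 7))
score = defuzzify(strength)   # crisp value for ranking against Weakness etc.
```

The same aggregation would be repeated for the Weakness, Opportunity and Threat axes before the usual SWOT comparison is made on the defuzzified scores.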

  9. Economic value added model upon conditions of banking company

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská

    2008-01-01

    Full Text Available The content of this article is the application of the economic value added (EVA) model under the conditions of a banking company. Due to the character of the banking business, which is reflected in a different balance-sheet structure, it is not possible to use the standard EVA model for a banking company. The article first outlines the basic principles of the EVA model in a non-banking company. The basic dissimilarities of banking activity are then analysed, and a methodological adjustment of the model is proposed so that it can be used for a banking company.
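The standard (non-banking) EVA relation the article starts from is EVA = NOPAT minus the capital charge (WACC times invested capital). A minimal numerical sketch, with hypothetical figures:

```python
def eva(nopat, wacc, capital):
    """Economic value added: after-tax operating profit minus the capital charge."""
    return nopat - wacc * capital

# Hypothetical firm: NOPAT 120, cost of capital 8%, invested capital 1000.
value_added = eva(nopat=120.0, wacc=0.08, capital=1000.0)
```

The article's point is that for a bank, "invested capital" and the cost of capital cannot be read off the balance sheet in this standard way, so the inputs to this formula must be redefined.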

  10. Attribution of hydrologic trends using integrated hydrologic and economic models

    Science.gov (United States)

    Maneta, M. P.; Brugger, D. R.; Silverman, N. L.

    2014-12-01

Hydrologic change has been detected in many regions of the world in the form of trends in annual streamflows, varying depths to the regional water table, or other alterations of the hydrologic balance. Most models used to investigate these changes implement sophisticated descriptions of the physical system but use simplified descriptions of the socioeconomic system. These simplifications come in the form of prescribed water diversions and land use change scenarios, which provide little insight into coupled natural-human systems and have limited predictive capabilities. We present an integrated model that adds realism to the description of the hydrologic system in agricultural regions by incorporating a component that updates the allocation of land and water to crops in response to hydroclimatic conditions (water availability) and economic conditions (prices of commodities and agricultural inputs). This component assumes that farmers allocate resources to maximize their net revenues, thus justifying the use of optimality conditions to constrain the parameters of an empirical production function that captures the economic behavior of farmers. Because the model internalizes the feedback between climate, agricultural markets, and farming activity into the hydrologic system, it can be used to understand to what extent human economic activity can exacerbate or buffer the regional hydrologic impacts of climate change in agricultural regions. It can also help in the attribution of causes of hydrologic change. These are important issues because local policy and management cannot solve climate change, but they can address land use and agricultural water use. We demonstrate the model in a case study.
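The economic component can be illustrated with a toy revenue-maximization problem: farmers choose how much land to put in each crop subject to land and water limits. Crop names, prices, yields, and water demands below are hypothetical, and the brute-force search stands in for the optimality conditions the paper uses.

```python
def net_revenue(x_wheat, x_cotton):
    # (price * yield - input cost) per hectare, hypothetical figures:
    # wheat: 200 * 3.0 - 250 = 350/ha; cotton: 600 * 1.2 - 300 = 420/ha
    return x_wheat * (200 * 3.0 - 250) + x_cotton * (600 * 1.2 - 300)

def best_allocation(land=100.0, water=450.0, w_wheat=3.0, w_cotton=6.0, step=1.0):
    """Grid search over wheat area; cotton fills remaining land and water."""
    best, best_rev = (0.0, 0.0), float("-inf")
    a = 0.0
    while a <= land:
        b = min(land - a, max(0.0, (water - w_wheat * a) / w_cotton))
        rev = net_revenue(a, b)
        if rev > best_rev:
            best, best_rev = (a, b), rev
        a += step
    return best, best_rev

areas, revenue = best_allocation()
```

In the integrated model, this decision responds to the simulated climate and market state each season, which closes the feedback loop into the hydrologic system.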

  11. Economic inequality and mobility in kinetic models for social sciences

    Science.gov (United States)

    Letizia Bertotti, Maria; Modanese, Giovanni

    2016-10-01

Statistical evaluations of the economic mobility of a society are more difficult than measurements of the income distribution, because they require following the evolution of individuals' incomes for at least one or two generations. In micro-to-macro theoretical models of economic exchanges based on kinetic equations, the income distribution depends only on the asymptotic equilibrium solutions, while mobility estimates also involve the detailed structure of the transition probabilities of the model, and are thus an important tool for assessing its validity. Empirical data show a remarkably general negative correlation between economic inequality and mobility, whose explanation is still unclear. It is therefore particularly interesting to study this correlation in analytical models. In previous work we investigated the behavior of the Gini inequality index in kinetic models in dependence on several parameters which define the binary interactions and the taxation and redistribution processes: saving propensity, taxation rates gap, tax evasion rate, welfare means-testing, etc. Here, we check the correlation of mobility with inequality by analyzing the dependence of mobility on the same parameters. According to several numerical solutions, the correlation is confirmed to be negative.
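The Gini index referred to above can be computed from a sample of incomes with the standard rank-based formula; this sketch is generic and not tied to the paper's kinetic model.

```python
def gini(incomes):
    """Gini inequality index of a sample: 0 = perfect equality, -> 1 = maximal."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    # G = 2 * sum_i(i * x_i) / (n * sum(x)) - (n + 1) / n, with 1-based ranks
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n

print(gini([1, 1, 1, 1]))   # 0.0: everyone earns the same
print(gini([0, 0, 0, 8]))   # 0.75: all income held by one agent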

  12. Designing a Hydro-Economic Collaborative Computer Decision Support System: Approaches, Best Practices, Lessons Learned, and Future Trends

    Science.gov (United States)

    Rosenberg, D. E.

    2008-12-01

Designing and implementing a hydro-economic computer model to support or facilitate collaborative decision making among multiple stakeholders or users can be challenging and daunting. Collaborative modeling is distinct from, and more difficult than, non-collaborative efforts because of a large number of users with different backgrounds; disagreement or conflict among stakeholders regarding problem definitions, modeling roles, and analysis methods; plus evolving ideas of model scope and scale and needs for information and analysis as stakeholders interact, use the model, and learn about the underlying water system. This presentation reviews the lifecycle of collaborative model making and identifies some key design decisions that stakeholders and model developers must make to develop robust and trusted, verifiable and transparent, integrated and flexible, and ultimately useful models. It advances some best practices to implement and program these decisions. Among these best practices are 1) modular development of data-aware input, storage, manipulation, results recording and presentation components, plus ways to couple and link to other models and tools, 2) explicit structuring of both input data and the metadata that describe data sources, who acquired the data, gaps, and modifications or translations made to put the data in a form usable by the model, 3) in-line documentation of model inputs, assumptions, calculations, and results, plus ways for stakeholders to document their own model use and share results with others, and 4) flexible programming with graphical object-oriented properties and elements that allow users or the model maintainers to easily see and modify the spatial, temporal, or analysis scope as the collaborative process moves forward. We draw on examples of these best practices from the existing literature, the author's prior work, and some new applications just underway. 
The presentation concludes by identifying some future directions for collaborative

  13. Dynamic ecological-economic modeling approach for management of shellfish aquaculture

    CSIR Research Space (South Africa)

    Nobre, AM

    2008-02-01

    Full Text Available The objective of this report is to conceptualize ecological and economic interactions in mariculture; to implement a dynamic ecological-economic model in order to: simulate the socio-economics of aquaculture production, simulate its effects...

  14. Economic evaluations with agent-based modelling: an introduction.

    Science.gov (United States)

    Chhatwal, Jagpreet; He, Tianhua

    2015-05-01

Agent-based modelling (ABM) is a relatively new technique, which overcomes some of the limitations of other methods commonly used for economic evaluations. These limitations include linearity, homogeneity and stationarity. Agents in ABMs are autonomous entities, who interact with each other and with the environment. ABMs provide an inductive or 'bottom-up' approach, i.e. individual-level behaviours define system-level components. ABMs have the unique ability to capture emergent phenomena that cannot otherwise be predicted from the combination of individual-level interactions. In this tutorial, we discuss the basic concepts and important features of ABMs. We present a case study of an application of a simple ABM to evaluate the cost effectiveness of screening for an infectious disease. We also provide our model, which was developed using an open-source software program, NetLogo. We discuss software, resources, challenges and future research opportunities of ABMs for economic evaluations.
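A toy ABM in the spirit described (the paper's own model is in NetLogo, not reproduced here): autonomous agents meet at random, infection spreads through contacts, and screening removes infections with some probability. All parameters are hypothetical.

```python
import random

def simulate(n=200, steps=50, p_transmit=0.1, p_screen=0.0, seed=1):
    """Return the number of infected agents after `steps` rounds."""
    rng = random.Random(seed)
    infected = [False] * n
    infected[0] = True                       # one initial case
    for _ in range(steps):
        # bottom-up dynamics: each agent meets one random other agent
        for i in range(n):
            j = rng.randrange(n)
            if infected[j] and not infected[i] and rng.random() < p_transmit:
                infected[i] = True
        # screening detects and clears infections
        for i in range(n):
            if infected[i] and rng.random() < p_screen:
                infected[i] = False
    return sum(infected)

base = simulate()                 # no screening
screened = simulate(p_screen=0.2)
```

Comparing `base` and `screened` prevalence (and attaching costs to screening and to infections) is the kind of comparison a cost-effectiveness evaluation builds on; the epidemic outcome emerges from the individual interactions rather than from a closed-form equation.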

  15. Gravity model used for the analysis of economic developments

    Directory of Open Access Journals (Sweden)

    Ion PÂRŢACHI

    2010-04-01

    Full Text Available In this article we study the influence of ethnic fragmentation on foreign trade in the countries of Southeastern Europe in the conditions of recent conflicts. The analysis of trade is done using the gravity model. The explanatory variables of foreign economic relations, such as inter-state social structure and inter-state political relations, play an important role in this region, namely in the context of the economic transition and of EU enlargement to the East. The gravity approach followed in the article allows multilateral resistance to be taken into account through the introduction of fixed effects, and the model is estimated by pseudo-Poisson maximum likelihood, which makes it possible to include zero trade flows. The results show a positive influence of ethnic fractionalization on import flows, as well as a positive influence of the 1990s conflicts on the volume of trade.
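The deterministic core of a gravity model of trade predicts the flow between two countries as proportional to the product of their economic masses divided by the distance between them. This sketch shows only that core and its log-linear form; the article's actual estimation adds fixed effects and pseudo-Poisson maximum likelihood, which are not shown here, and all figures are hypothetical.

```python
import math

def gravity_flow(gdp_i, gdp_j, distance, g=1.0):
    """Predicted bilateral trade flow: T_ij = G * GDP_i * GDP_j / distance."""
    return g * gdp_i * gdp_j / distance

def log_gravity(gdp_i, gdp_j, distance, g=1.0):
    """Log-linear form commonly used as the starting point for estimation."""
    return math.log(g) + math.log(gdp_i) + math.log(gdp_j) - math.log(distance)

flow = gravity_flow(100.0, 50.0, 250.0)
```

Estimating the model in levels by pseudo-Poisson ML (rather than taking logs) is precisely what lets zero trade flows enter the sample, since log(0) is undefined.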

  16. Regressional modeling and forecasting of economic growth for arkhangelsk region

    Directory of Open Access Journals (Sweden)

    Robert Mikhailovich Nizhegorodtsev

    2012-12-01

    Full Text Available Regression models of GRP are constructed in this paper on empirical data for Arkhangelsk region, considering the impact of three main factors: investment in fixed assets, the amount of wages, and, importantly, the innovation factor - expenditures for research and development. That approach permits an explicit evaluation of the contribution of innovation to economic growth. Regression analysis is the main research instrument; all calculations are performed in Microsoft Excel. Meaningful conclusions were drawn regarding the potential of the region's GRP growth by various factors, including the impacts of positive and negative time lags. Adequate and relevant models are the basis for estimating and forecasting values of the dependent variable (GRP) and evaluating their confidence intervals. The proposed method of research can be used in factor assessment and prediction of regional economic growth, including growth by expectations.
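The regression step itself is ordinary least squares of GRP on the three factors. A minimal sketch (the data below are invented, not the Arkhangelsk series):

```python
import numpy as np

# columns: investment in fixed assets, wages, R&D expenditure (arbitrary units)
X = np.array([[10.0, 20.0, 1.0],
              [12.0, 22.0, 1.5],
              [15.0, 25.0, 2.0],
              [18.0, 27.0, 2.5],
              [20.0, 30.0, 3.0]])
grp = np.array([55.0, 62.0, 72.0, 80.0, 88.0])

A = np.column_stack([np.ones(len(grp)), X])      # add an intercept column
coef, *_ = np.linalg.lstsq(A, grp, rcond=None)   # OLS fit
predicted = A @ coef
```

The coefficient on the R&D column is the quantity of interest here: it isolates the marginal contribution of innovation expenditure to GRP, holding investment and wages fixed.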

  17. Functional computational model for optimal color coding.

    Science.gov (United States)

    Romney, A Kimball; Chiao, Chuan-Chin

    2009-06-23

    This paper presents a computational model for color coding that provides a functional explanation of how humans perceive colors in a homogeneous color space. Beginning with known properties of human cone photoreceptors, the model estimates the locations of the reflectance spectra of Munsell color chips in perceptual color space as represented in the CIE L*a*b* color system. The fit between the two structures is within the limits of expected measurement error. Estimates of the structure of perceptual color space for color anomalous dichromats missing one of the normal cone photoreceptors correspond closely to results from the Farnsworth-Munsell color test. An unanticipated outcome of the model provides a functional explanation of why additive lights are always red, green, and blue and provide maximum gamut for color monitors and color television even though they do not correspond to human cone absorption spectra.

  18. Computer models in the design of FXR

    Energy Technology Data Exchange (ETDEWEB)

    Vogtlin, G.; Kuenning, R.

    1980-01-01

    Lawrence Livermore National Laboratory is developing a 15 to 20 MeV electron accelerator with a beam current goal of 4 kA. This accelerator will be used for flash radiography and has a requirement of high reliability. Components being developed include spark gaps, Marx generators, water Blumleins and oil insulation systems. A SCEPTRE model was developed that takes into consideration the non-linearity of the ferrite and the time dependency of the emission from a field emitter cathode. This model was used to predict an optimum charge time to obtain maximum magnetic flux change from the ferrite. This model and its application will be discussed. JASON was used extensively to determine optimum locations and shapes of supports and insulators. It was also used to determine stress within bubbles adjacent to walls in oil. Computer results will be shown and bubble breakdown will be related to bubble size.

  19. Computational fluid dynamic modelling of cavitation

    Science.gov (United States)

    Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability of incorporating thermodynamic effects of cryogenic fluids. Extensions of the current two-dimensional steady-state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward, although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.

  20. Computational models of neurophysiological correlates of tinnitus.

    Science.gov (United States)

    Schaette, Roland; Kempter, Richard

    2012-01-01

The understanding of tinnitus has progressed considerably in the past decade, but the details of the mechanisms that give rise to this phantom perception of sound without a corresponding acoustic stimulus have not yet been pinpointed. It is now clear that tinnitus is generated in the brain, not in the ear, and that it is correlated with pathologically altered spontaneous activity of neurons in the central auditory system. Both increased spontaneous firing rates and increased neuronal synchrony have been identified as putative neuronal correlates of phantom sounds in animal models, and both phenomena can be triggered by damage to the cochlea. Various mechanisms could underlie the generation of such aberrant activity. At the cellular level, decreased synaptic inhibition and increased neuronal excitability, which may be related to homeostatic plasticity, could lead to an over-amplification of natural spontaneous activity. At the network level, lateral inhibition could amplify differences in spontaneous activity, and structural changes such as reorganization of tonotopic maps could lead to self-sustained activity in recurrently connected neurons. However, it is difficult to disentangle the contributions of different mechanisms in experiments, especially since not all changes observed in animal models of tinnitus are necessarily related to tinnitus. Computational modeling presents an opportunity to evaluate these mechanisms and their relation to tinnitus. Here we review the computational models for the generation of neurophysiological correlates of tinnitus that have been proposed so far, evaluate their predictions, and compare them to available data. We also assess the limits of their explanatory power, thus demonstrating where an understanding is still lacking and where further research may be needed. Identifying appropriate models is important for finding therapies, and we therefore also summarize the implications of the models for approaches to treat tinnitus.

  1. Computational models of neurophysiological correlates of tinnitus

    Directory of Open Access Journals (Sweden)

    Roland eSchaette

    2012-05-01

    Full Text Available The understanding of tinnitus has progressed considerably in the past decade, but the details of the mechanisms that give rise to this phantom perception of sound without a corresponding acoustic stimulus have not been pinpointed yet. It is now clear that tinnitus is generated in the brain, not in the ear, and that it is correlated with pathologically altered spontaneous activity of neurons in the central auditory system. Both increased spontaneous firing rates and increased neuronal synchrony have been identified as putative neuronal correlates of phantom sounds in animal models, and both phenomena can be triggered by damage to the cochlea. Various mechanisms could underlie the generation of such aberrant activity. At the cellular level, decreased synaptic inhibition and increased neuronal excitability, which may be related to homeostatic plasticity, could lead to an over-amplification of natural spontaneous activity. At the network level, lateral inhibition could amplify differences in spontaneous activity, and structural changes such as reorganization of tonotopic maps could lead to self-sustained activity in recurrently connected neurons. It is difficult to disentangle the contributions of different mechanisms in experiments, especially since not all changes observed in animal models of tinnitus are necessarily related to tinnitus. Computational modelling presents an opportunity of evaluating these mechanisms and their relation to tinnitus. Here we review the computational models for the generation of neurophysiological correlates of tinnitus that have been proposed so far, evaluate predictions and compare them to available data. We also evaluate the limits of their explanatory power, thus demonstrating where an understanding is still lacking and where further research may be needed. Identifying appropriate models is important for finding therapies and we therefore also summarize the implications of the models for approaches to treat

  2. Computer-aided modeling framework – a generic modeling template

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

This work focuses on the development of a computer-aided modeling framework. The framework is a knowledge-based system that is built on a generic modeling language and structured on workflows for different modeling tasks. The overall objective is to support model developers and users to generate and test models systematically, efficiently and reliably. In this way, development of products and processes can be made faster, cheaper and more efficient. In this contribution, as part of the framework, a generic modeling template for the systematic derivation of problem specific models is presented. The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene.

  3. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

The codes d{sup 3}f and r{sup 3}t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d{sup 3}f and r{sup 3}t began more than 20 years ago. Since that time, significant advancements have taken place in the requirements for safety assessment as well as in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d{sup 3}f and r{sup 3}t were applications of the software platform UG /BAS 94/ whose development began in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d{sup 3}f and r{sup 3}t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d{sup 3}f and r{sup 3}t were combined into one conjoint code d{sup 3}f++. 
A direct estimation of uncertainties for complex groundwater flow models with the

  4. Modeling Reality - How Computers Mirror Life

    Science.gov (United States)

    Bialynicki-Birula, Iwo; Bialynicka-Birula, Iwona

    2005-01-01

The book Modeling Reality covers a wide range of fascinating subjects, accessible to anyone who wants to learn about the use of computer modeling to solve a diverse range of problems, but who does not possess a specialized training in mathematics or computer science. The material presented is pitched at the level of high-school graduates, even though it covers some advanced topics (cellular automata, Shannon's measure of information, deterministic chaos, fractals, game theory, neural networks, genetic algorithms, and Turing machines). These advanced topics are explained in terms of well known simple concepts: cellular automata - the Game of Life, Shannon's formula - the game of twenty questions, game theory - a television quiz, etc. The book is unique in explaining in a straightforward, yet complete, fashion many important ideas related to various models of reality and their applications. Twenty-five programs, written especially for this book, are provided on an accompanying CD. They greatly enhance its pedagogical value and make learning even the more complex topics a pleasure.

  5. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modelling

    Energy Technology Data Exchange (ETDEWEB)

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2015-10-06

In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Due to the fluctuation of solar and wind plants' output, an empirical probabilistic model is developed to predict their hourly output. According to the different characteristics of wind and solar plants, the parameters of the probabilistic distribution are further adjusted individually for each plant type. On the other hand, with the growing trend towards plug-in electric vehicles (PHEVs), an integrated microgrid system must also consider the impact of PHEVs. Not only the charging loads from PHEVs, but also the discharging output via the vehicle-to-grid (V2G) method can greatly affect the economic dispatch of all the micro energy sources in a microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional and renewable power plants as well as PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.
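The dispatch decision itself can be sketched as a merit-order rule: cheapest sources (including V2G capacity offered by plugged-in vehicles) are committed first until the load is met. The unit names, costs, and capacities below are hypothetical, and the paper's actual method also handles the probabilistic renewable forecasts and charging loads.

```python
def dispatch(load, units):
    """Merit-order dispatch. units: list of (name, marginal_cost, capacity).

    Returns a name -> output schedule meeting the load at least cost.
    """
    schedule, remaining = {}, load
    for name, cost, cap in sorted(units, key=lambda u: u[1]):
        out = min(cap, remaining)
        schedule[name] = out
        remaining -= out
    return schedule

units = [("wind", 0.0, 30.0),     # hourly output from the probabilistic model
         ("solar", 0.0, 20.0),
         ("v2g", 45.0, 15.0),     # discharge offered by parked vehicles
         ("diesel", 80.0, 60.0)]
schedule = dispatch(load=90.0, units=units)
```

Because the renewable capacities are random draws from the fitted distributions rather than fixed numbers, the full model repeats this dispatch over many scenarios to evaluate expected cost.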

  6. Computer Model for Economic Study of Unbleached Kraft Paperboard Production.

    Science.gov (United States)

    1984-08-01

(The abstract is garbled in the source scan. Recoverable fragments describe model inputs such as DFOM, the quantity of defoamer additives in the stock-preparation and machine areas (pounds/dry ton); the price of defoamer ($/ton); and PMND, the minimum density of paper or paperboard; together with the material, energy, and labor costs associated with the process.)

  7. Some queuing network models of computer systems

    Science.gov (United States)

    Herndon, E. S.

    1980-01-01

    Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.
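The class of models described (a closed queuing network with a single workload type) can be solved by exact Mean Value Analysis, which is small enough to fit even a programmable calculator. This sketch is a generic MVA, not the SR-52 program from the paper; the service demands in the example are hypothetical.

```python
def mva(visits, service, n_jobs, think=0.0):
    """Exact Mean Value Analysis for a closed, single-class queuing network.

    visits[k]: visit count to device k per job; service[k]: mean service time.
    Returns (system throughput, mean queue length at each device).
    """
    k = len(visits)
    q = [0.0] * k                  # queue lengths with zero jobs in the network
    x = 0.0
    for n in range(1, n_jobs + 1):
        # an arriving job waits behind the queue left by the other n - 1 jobs
        r = [service[i] * (1.0 + q[i]) for i in range(k)]
        x = n / (think + sum(visits[i] * r[i] for i in range(k)))
        q = [x * visits[i] * r[i] for i in range(k)]
    return x, q

# one device visited once per job, mean service time 1.0 s, 3 jobs, no think time
throughput, queues = mva([1.0], [1.0], 3)
```

With a single saturated device the throughput pins at the service rate (1.0 jobs/s here) and all jobs queue there, which matches the intuition for a bottleneck server.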

  8. Optimization and mathematical modeling in computer architecture

    CERN Document Server

    Sankaralingam, Karu; Nowatzki, Tony

    2013-01-01

    In this book we give an overview of modeling techniques used to describe computer systems to mathematical optimization tools. We give a brief introduction to various classes of mathematical optimization frameworks with special focus on mixed integer linear programming which provides a good balance between solver time and expressiveness. We present four detailed case studies -- instruction set customization, data center resource management, spatial architecture scheduling, and resource allocation in tiled architectures -- showing how MILP can be used and quantifying by how much it outperforms t
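A toy instance of the kind of 0/1 integer program such formulations produce: choose a subset of accelerator candidates to maximize total speedup under an area budget. A MILP solver would handle this at scale; here exhaustive search is used for clarity, and all data are invented.

```python
from itertools import product

# Hypothetical candidates: estimated speedup and silicon area per unit.
speedup = [6, 5, 4, 3]
area = [5, 4, 3, 2]
budget = 9

# maximize sum(pick * speedup) subject to sum(pick * area) <= budget,
# pick[i] in {0, 1} -- a 0/1 integer linear program.
best_val, best_pick = 0, ()
for pick in product((0, 1), repeat=len(speedup)):
    a = sum(p * c for p, c in zip(pick, area))
    v = sum(p * s for p, s in zip(pick, speedup))
    if a <= budget and v > best_val:
        best_val, best_pick = v, pick
```

The same objective/constraint structure, handed to a MILP solver instead of brute force, scales to the case studies the book describes (instruction set customization, data center resource management, and so on).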

  9. Investigation of international energy economics. [Use of econometric model EXPLOR

    Energy Technology Data Exchange (ETDEWEB)

    Deonigi, D.E.; Clement, M.; Foley, T.J.; Rao, S.A.

    1977-03-01

    The Division of International Affairs of the Energy Research and Development Administration is assessing the long-range economic effects of energy research and development programs in the U.S. and other countries, particularly members of the International Energy Agency (IEA). In support of this effort, a program was designed to coordinate the capabilities of five research groups--Rand, Virginia Polytechnic Institute, Brookhaven National Laboratory, Lawrence Livermore Laboratory, and Pacific Northwest Laboratory. The program could evaluate the international economics of proposed or anticipated sources of energy. This program is designed to be general, flexible, and capable of evaluating a diverse collection of potential energy (nuclear and nonnuclear) related problems. For example, the newly developed methodology could evaluate the international and domestic economic impact of nuclear-related energy sources, but also existing nonnuclear and potential energy sources such as solar, geothermal, wind, etc. Major items to be included would be the cost of exploration, cost of production, prices, profit, market penetration, investment requirements and investment goods, economic growth, change in balance of payments, etc. In addition, the changes in cost of producing all goods and services would be identified for each new energy source. PNL developed (1) a means of estimating the demands for major forms of energy by country, and (2) a means of identifying results or impacts on each country. The results for each country were then to be compared to assess relative advantages. PNL relied on its existing general econometric model, EXPLOR, to forecast the demand for energy by country. (MCW)

  10. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Full Text Available Nowadays, digital computer systems and networks are the main engineering tools, being used in planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life-maintaining devices. Consequently, computer viruses became one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. A lot of antivirus programs have been developed, but they are limited to detecting and removing infections, based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, a trial on modeling computer virus propagation dynamics relates it to other notable events occurring in the network, permitting preventive policies to be established in network management. Data for three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from other viruses that formerly infected the network.
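The autoregressive identification step can be sketched in its simplest form: fit AR(1) coefficients to an observed infection-count series by least squares, then forecast the next value. The series below is invented (exact geometric growth, so the fit is exact); real virus data would of course be noisy and call for higher AR orders.

```python
def fit_ar1(series):
    """Least-squares fit of y_t = a * y_{t-1} + b via the normal equations."""
    x = series[:-1]
    y = series[1:]
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

counts = [10.0, 15.0, 22.5, 33.75, 50.625]   # hypothetical infection counts
a, b = fit_ar1(counts)
forecast = a * counts[-1] + b                # next-step prediction
```

The paper's point is that coefficients identified on one virus's spread can be reused to anticipate the dynamics of a new virus on the same network.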

  11. Computational modeling of corneal refractive surgery

    Science.gov (United States)

    Cabrera Fernandez, Delia; Niazy, Abdel-Salam M.; Kurtz, Ronald M.; Djotyan, Gagik P.; Juhasz, Tibor

    2004-07-01

    A finite element method was used to study the biomechanical behavior of the cornea and its response to refractive surgery when stiffness inhomogeneities varying with depth are considered. Side-by-side comparisons of different constitutive laws that have been commonly used to model refractive surgery were also performed. To facilitate the comparison, the material property constants were identified from the same experimental data, which were obtained from mechanical tests on corneal strips and membrane inflation experiments. We then validated the resulting model by comparing computed refractive power changes with clinical results. The model provides a much more predictable refractive outcome when the stiffness inhomogeneities of the cornea and the nonlinearities of the deformations are included in the finite element simulations. Thus, the inhomogeneous model is a more accurate representation of the corneal material properties for modeling the biomechanical effects of refractive surgery. The simulations also revealed that the para-central and peripheral parts of the cornea deformed less in response to pressure loading than the central cornea and the limbus. Furthermore, the deformations in response to pressure loading predicted by the non-homogeneous, nonlinear model showed that the para-central region is mechanically enhanced in the meridional direction. This result is in agreement with the regional differences documented experimentally by other investigators.

  12. Computational social dynamic modeling of group recruitment.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Lee, Marinna; Pickett, Marc; Turnley, Jessica Glicken (Sandia National Laboratories, Albuquerque, NM); Smrcka, Julianne D. (Sandia National Laboratories, Albuquerque, NM); Ko, Teresa H.; Moy, Timothy David (Sandia National Laboratories, Albuquerque, NM); Wu, Benjamin C.

    2004-01-01

    The Seldon software toolkit combines concepts from agent-based modeling and social science to create a computational social dynamic model of group recruitment. The underlying recruitment model is based on a unique three-level hybrid agent-based architecture that contains simple agents (level one), abstract agents (level two), and cognitive agents (level three). The uniqueness of this architecture begins with the abstract agents, which permit the model to include social concepts (gang) or institutional concepts (school) in a typical software simulation environment. The future addition of cognitive agents to the recruitment model will provide a unique entity that does not exist in any agent-based modeling toolkit to date. We use social networks to provide an integrated mesh within and between the different levels. This Java-based toolkit is used to analyze different social concepts based on initialization input from the user. The input alters a set of parameters used to influence the values associated with the simple agents, the abstract agents, and the interactions (simple agent-simple agent or simple agent-abstract agent) between these entities. The results of the phase-1 Seldon toolkit provide insight into how certain social concepts apply to scenario development for inner-city gang recruitment.

  13. Economic-Mathematical Models for Studying of Mesolevel of Economy

    Directory of Open Access Journals (Sweden)

    Igor L. Kirilyuk

    2017-09-01

    Full Text Available This article reviews a number of mathematical models applied to the description and analysis of the meso-level of the economy. Criteria are proposed for assigning models to the class of meso-level models, distinguishing them from purely microeconomic or macroeconomic models, and examples of the use of mathematical models in the mesoeconomics literature are given. Classical models such as Leontief's input–output model or game theory, as well as newer models based on systems of nonlinear mappings or differential equations and diverse simulation models, can all be considered meso-level models. Just as the development of nonlinear physics made it possible to describe multi-scale self-organizing structures, the meso-level of the economy, understood as a set of evolving subsystems that interact, compete, and cooperate, generating emergent phenomena such as increasing returns, hyperbolic growth, or self-organized criticality, can appropriately be described with models from econophysics and the principles of synergetics. Also discussed are the prospects for the development of meso-level models and the problem of the conventionality of separating the levels of the economy, due, for example, to signs of scale invariance in some socio-economic systems.

  14. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, both the incorporation of new computers into the network and the removal of old computers from the network are considered, and the computers on the network are equipped with antivirus software. A computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated, and the stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
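
    A minimal sketch of this kind of compartmental model: new computers join at rate lam, all computers leave at rate mu, infection spreads at rate beta, and antivirus software cures at rate gamma. The basic reproduction number R0 decides between the disease-free and endemic equilibria. The specific parameter values below are illustrative, not taken from the paper.

```python
def simulate_sir(lam=1.0, mu=0.1, beta=0.5, gamma=0.1,
                 s0=10.0, i0=1.0, dt=0.01, steps=60000):
    """Euler-integrate S' = lam - beta*S*I - mu*S,
    I' = beta*S*I - (mu + gamma)*I; returns the final (S, I)."""
    s, i = s0, i0
    for _ in range(steps):
        ds = lam - beta * s * i - mu * s
        di = beta * s * i - (mu + gamma) * i
        s, i = s + dt * ds, i + dt * di
    return s, i

def r0(lam, mu, beta, gamma):
    """Secondary infections per infected machine near the
    disease-free equilibrium S* = lam/mu."""
    return beta * lam / (mu * (mu + gamma))
```

    With the defaults R0 = 25 > 1, so the simulation settles at the endemic equilibrium S* = (mu + gamma)/beta, I* = mu*(R0 - 1)/beta; lowering beta until R0 < 1 makes infections die out, which is the control criterion the stability analysis delivers.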

  15. Tax Compliance Models: From Economic to Behavioral Approaches

    Directory of Open Access Journals (Sweden)

    Larissa Margareta BĂTRÂNCEA

    2012-06-01

    Full Text Available The paper reviews models of tax compliance with an emphasis on economic and behavioral perspectives. Although the standard tax evasion model of Allingham and Sandmo and other similar economic models capture some important aspects of tax compliance (i.e., taxpayers’ responses to increases in the tax rate, audit probability, and penalty rate), they do not satisfy the need for an accurate prediction of taxpayers’ behavior. The reason is that they do not offer a comprehensive perspective on the sociological and psychological factors which shape compliance (i.e., attitudes, beliefs, norms, perceptions, motivations). Therefore, researchers have turned to examining taxpayers’ inner motivations, beliefs, perceptions, and attitudes in order to accurately predict taxpayers’ behavior. As a response to this quest, behavioral models of tax compliance have emerged. Among the sociological and psychological factors which shape tax compliance, the ‘slippery slope’ framework singles out trust in authorities and the perception of the power of authorities. The aim of the paper is to contribute to the understanding of why a tax compliance model that incorporates both economic and behavioral features is needed, and why governments and tax authorities should consider these models when designing fiscal policies.

  16. Physics and financial economics (1776-2014): puzzles, Ising and agent-based models

    Science.gov (United States)

    Sornette, Didier

    2014-06-01

    This short review presents a selected history of the mutual fertilization between physics and economics—from Isaac Newton and Adam Smith to the present. The fundamentally different perspectives embraced in theories developed in financial economics compared with physics are dissected with the examples of the volatility smile and of the excess volatility puzzle. The role of the Ising model of phase transitions to model social and financial systems is reviewed, with the concepts of random utilities and the logit model as the analog of the Boltzmann factor in statistical physics. Recent extensions in terms of quantum decision theory are also covered. A wealth of models are discussed briefly that build on the Ising model and generalize it to account for the many stylized facts of financial markets. A summary of the relevance of the Ising model and its extensions is provided to account for financial bubbles and crashes. The review would be incomplete if it did not cover the dynamical field of agent-based models (ABMs), also known as computational economic models, of which the Ising-type models are just special ABM implementations. We formulate the ‘Emerging Intelligence Market Hypothesis’ to reconcile the pervasive presence of ‘noise traders’ with the near efficiency of financial markets. Finally, we note that evolutionary biology, more than physics, is now playing a growing role to inspire models of financial markets.
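
    The analogy the review draws between the logit model and the Boltzmann factor can be made concrete with a mean-field Ising sketch: each trader buys (+1) or sells (-1) with logit probability given an imitation strength beta_J and a news field h, and the average position m then satisfies the self-consistency condition m = tanh(beta_J*m + h), ordering into a collective (bubble-like) state when beta_J > 1. The function and parameter names are illustrative, not from the review.

```python
import math

def logit_buy_probability(beta_J, m, h=0.0):
    """Logit (Boltzmann-factor) probability of choosing +1
    given the social field beta_J*m + h."""
    field = beta_J * m + h
    return math.exp(field) / (math.exp(field) + math.exp(-field))

def mean_field_magnetization(beta_J, h=0.0, iters=500, m0=0.5):
    """Iterate m <- tanh(beta_J*m + h) to its fixed point,
    the mean-field Ising self-consistency equation."""
    m = m0
    for _ in range(iters):
        m = math.tanh(beta_J * m + h)
    return m
```

    Below the critical coupling (beta_J < 1) the only fixed point is m = 0, i.e. no consensus among traders; above it a nonzero m emerges, the simplest caricature of the herding that Ising-type ABMs use to generate bubbles and crashes.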

  17. Energy technologies and energy efficiency in economic modelling

    DEFF Research Database (Denmark)

    Klinge Jacobsen, Henrik

    1998-01-01

    technological development. This paper examines the effect on aggregate energy efficiency of using technological models to describe a number of specific technologies and of incorporating these models in an economic model. Different effects from the technology representation are illustrated. Vintage effects...... illustrates the dependence of average efficiencies and productivity on capacity utilisation rates. In the long run regulation induced by environmental policies are also very important for the improvement of aggregate energy efficiency in the energy supply sector. A Danish policy to increase the share...... of renewable energy and especially wind power will increase the rate of efficiency improvement. A technologically based model in this case indirectly makes the energy efficiency endogenous in the aggregate energy-economy model....

  18. THE VICTIM HANDLING MODEL OF HUMAN TRAFFICKING THROUGH ECONOMIC INDEPENDENCE

    Directory of Open Access Journals (Sweden)

    Henny Nuraeny

    2016-05-01

    Full Text Available Human trafficking is a modern form of slave trading and one of the worst violations of human dignity, one that leaves its victims traumatized. To that end, there should be comprehensive treatment for victims. The problems studied here are which model can be applied to the treatment of victims of trafficking in Cianjur, and how the victim handling models can be disseminated in Cianjur. This study used a normative juridical approach with a descriptive-analytical specification. The results show that an alternative model for handling victims of trafficking in Cianjur is a service model based on inter-institutional cooperation and economic empowerment through the planting of Camelina sativa, with socialization techniques involving local government, the private sector, community leaders and students through legal counseling and advocacy. Keywords: human trafficking, the victim handling model, socialization

  19. A stochastic model for simulation of the economic consequences of bovine virus diarrhoea virus infection in a dairy herd

    DEFF Research Database (Denmark)

    Sørensen, J.T.; Enevoldsen, Carsten; Houe, H.

    1995-01-01

    A dynamic, stochastic model simulating the technical and economic consequences of bovine virus diarrhoea virus (BVDV) infections for a dairy cattle herd for use on a personal computer was developed. The production and state changes of the herd were simulated by state changes of the individual cows...

  20. Computational models of intergroup competition and warfare.

    Energy Technology Data Exchange (ETDEWEB)

    Letendre, Kenneth (University of New Mexico); Abbott, Robert G.

    2011-11-01

    This document reports on the research of Kenneth Letendre, the recipient of a Sandia Graduate Research Fellowship at the University of New Mexico. Warfare is an extreme form of intergroup competition in which individuals make extreme sacrifices for the benefit of their nation or other group to which they belong. Among animals, limited, non-lethal competition is the norm. It is not fully understood what factors lead to warfare. We studied the global variation in the frequency of civil conflict among countries of the world, and its positive association with variation in the intensity of infectious disease. We demonstrated that the burden of human infectious disease is an important predictor of the frequency of civil conflict, and we tested a causal model for this association based on the parasite-stress theory of sociality. We also investigated the organization of social foraging by colonies of harvester ants in the genus Pogonomyrmex, using both field studies and computer models.

  1. Multiscale computational modelling of the heart

    Science.gov (United States)

    Smith, N. P.; Nickerson, D. P.; Crampin, E. J.; Hunter, P. J.

    A computational framework is presented for integrating the electrical, mechanical and biochemical functions of the heart. Finite element techniques are used to solve the large-deformation soft tissue mechanics using orthotropic constitutive laws based on the measured fibre-sheet structure of myocardial (heart muscle) tissue. The reaction-diffusion equations governing electrical current flow in the heart are solved on a grid of deforming material points which access systems of ODEs representing the cellular processes underlying the cardiac action potential. Navier-Stokes equations are solved for coronary blood flow in a system of branching blood vessels embedded in the deforming myocardium and the delivery of oxygen and metabolites is coupled to the energy-dependent cellular processes. The framework presented here for modelling coupled physical conservation laws at the tissue and organ levels is also appropriate for other organ systems in the body and we briefly discuss applications to the lungs and the musculo-skeletal system. The computational framework is also designed to reach down to subcellular processes, including signal transduction cascades and metabolic pathways as well as ion channel electrophysiology, and we discuss the development of ontologies and markup language standards that will help link the tissue and organ level models to the vast array of gene and protein data that are now available in web-accessible databases.

  2. Optimizing Green Computing Awareness for Environmental Sustainability and Economic Security as a Stochastic Optimization Problem

    Directory of Open Access Journals (Sweden)

    Emmanuel Okewu

    2017-10-01

    Full Text Available The role of automation in sustainable development is not in doubt. Computerization in particular has permeated every facet of human endeavour, enhancing the provision of information for decision-making that reduces the cost of operation and promotes productivity, socioeconomic prosperity and cohesion. Hence, a new field called information and communication technology for development (ICT4D) has emerged. Nonetheless, the need to ensure environmentally friendly computing has led to this research study, with particular focus on green computing in Africa. This is against the backdrop that the continent is feared to suffer most from the vulnerability of climate change and the impact of environmental risk. Using Nigeria as a test case, this paper gauges the green computing awareness level of Africans via a sample survey. It also attempts to institutionalize a green computing maturity model with a view to optimizing the level of citizens' awareness amid inherent uncertainties like low bandwidth, poor networks and erratic power in an emerging African market. Consequently, we classified the problem as a stochastic optimization problem and applied a metaheuristic search algorithm to determine the best sensitization strategy. Although there are alternative ways of promoting green computing education, the metaheuristic search we conducted indicated that an online real-time solution that not only drives but preserves timely conversations on electronic waste (e-waste) management and energy saving techniques among the citizenry is cutting edge. The authors therefore reviewed literature, gathered requirements, modelled the proposed solution using the Unified Modelling Language (UML) and developed a prototype. The proposed solution is a web-based multi-tier e-Green computing system that educates computer users on innovative techniques of managing computers and accessories in an environmentally friendly way.
We found out that such a real-time web-based interactive forum does not

  3. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  4. Mathematical modeling in economics, ecology and the environment

    CERN Document Server

    Hritonenko, Natali

    2013-01-01

    Updated to textbook form by popular demand, this second edition discusses diverse mathematical models used in economics, ecology, and the environmental sciences with emphasis on control and optimization. It is intended for graduate and upper-undergraduate course use; however, applied mathematicians, industry practitioners, and a vast number of interdisciplinary academics will find the presentation highly useful. Core topics of this text are:
    - Economic growth and technological development
    - Population dynamics and human impact on the environment
    - Resource extraction and scarcity
    - Air and water contamination
    - Rational management of the economy and environment
    - Climate change and global dynamics
    The step-by-step approach taken is problem-based and easy to follow. The authors aptly demonstrate that the same models may be used to describe different economic and environmental processes and that similar invest...

  5. Financial Transaction Tax: Determination of Economic Impact Under DSGE Model

    Directory of Open Access Journals (Sweden)

    Veronika Solilová

    2015-01-01

    Full Text Available The discussion about the possible taxation of the financial sector started in the European Union as a result of the financial crisis which spread to Europe from the United States in 2008, and of the massive financial interventions made by governments in favour of the financial sector. On 14 February 2013, after rejection of the 2011 draft directive introducing a common system of financial transaction tax, the European Commission introduced the financial transaction tax through enhanced cooperation. The aim of the paper is to research the economic impact of the financial transaction tax on the EU (EU27 or EU11) with respect to the DSGE model which was used for the determination of impacts. Based on our analysis, the DSGE model can be considered to underestimate the impact on economic growth and to overestimate the revenue collection. In particular, the overall impact of the financial transaction tax, considering cascade effects of securities (tax rate 2.2%) and derivatives (tax rate 0.2%), ranges between −4.752 and 1.472 percentage points of GDP, and assuming relocation effects of business/trade averaging 40% causes a decline of expected tax revenues in the amount of EUR 13bn. Thus, at a time of fragile economic growth across the EU and an increased risk of recession in Europe, the introduction of the FTT appears undesirable.

  6. Computer models of vocal tract evolution: an overview and critique

    NARCIS (Netherlands)

    de Boer, B.; Fitch, W. T.

    2010-01-01

    Human speech has been investigated with computer models since the invention of digital computers, and models of the evolution of speech first appeared in the late 1960s and early 1970s. Speech science and computer models have a long shared history because speech is a physical signal and can be

  7. Electric vehicle charge planning using Economic Model Predictive Control

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Poulsen, Niels K.; Madsen, Henrik

    2012-01-01

    Economic Model Predictive Control (MPC) is very well suited for controlling smart energy systems since electricity price and demand forecasts are easily integrated in the controller. Electric vehicles (EVs) are expected to play a large role in the future Smart Grid. They are expected to provide...... grid services, both for peak reduction and for ancillary services, by absorbing short term variations in the electricity production. In this paper the Economic MPC minimizes the cost of electricity consumption for a single EV. Simulations show savings of 50–60% of the electricity costs compared...... to uncontrolled charging from load shifting based on driving pattern predictions. The future energy system in Denmark will most likely be based on renewable energy sources e.g. wind and solar power. These green energy sources introduce stochastic fluctuations in the electricity production. Therefore, energy...
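
    The load-shifting idea behind Economic MPC can be illustrated in a much simplified, deterministic form: given an hourly price forecast, decide when to charge. For a single EV with a linear cost, a power limit, and an energy requirement, filling the cheapest hours first is the optimal solution of the underlying linear program. The prices and parameter values below are hypothetical.

```python
def plan_charging(prices, energy_needed, max_power):
    """Allocate charging energy to the cheapest hours first;
    optimal for a linear cost with per-hour power limits."""
    plan = [0.0] * len(prices)
    remaining = energy_needed
    for t in sorted(range(len(prices)), key=lambda t: prices[t]):
        if remaining <= 0:
            break
        plan[t] = min(max_power, remaining)
        remaining -= plan[t]
    if remaining > 1e-9:
        raise ValueError("horizon too short to deliver the required energy")
    return plan

prices = [0.30, 0.10, 0.25, 0.15]   # EUR/kWh forecast (hypothetical)
plan = plan_charging(prices, energy_needed=2.5, max_power=1.0)  # kWh per hour
cost = sum(p * q for p, q in zip(prices, plan))
```

    A receding-horizon Economic MPC would re-solve this small problem every hour as price and driving-pattern forecasts update, which is where the reported savings over uncontrolled charging come from.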

  8. CLUSTERS AS A MODEL OF ECONOMIC DEVELOPMENT OF SERBIA

    Directory of Open Access Journals (Sweden)

    Marko Laketa

    2013-12-01

    Full Text Available The insufficient competitiveness of small and medium enterprises in Serbia can be significantly improved by a system of business association through clusters, business incubators and technology parks. This connection contributes to the growth and development not only of the cluster members but has a regional and national dimension as well, because without it there is no significant breakthrough onto the international market. The process of association of small and medium enterprises into clusters and other forms of interconnection in Serbia is far from the required and potential level. Awareness of the importance of clusters for local economic development, through their contribution to the advancement of small and medium-sized enterprises, is not yet sufficiently mature. Support for association into clusters, and use of their benefits after the model of highly developed countries, is the basis for a successful economic policy, and in Serbia all the necessary prerequisites for it exist.

  9. Statistics, Computation, and Modeling in Cosmology

    Science.gov (United States)

    Jewell, Jeff; Guiness, Joe; SAMSI 2016 Working Group in Cosmology

    2017-01-01

    Current and future ground and space based missions are designed to not only detect, but map out with increasing precision, details of the universe in its infancy to the present-day. As a result we are faced with the challenge of analyzing and interpreting observations from a wide variety of instruments to form a coherent view of the universe. Finding solutions to a broad range of challenging inference problems in cosmology is one of the goals of the “Statistics, Computation, and Modeling in Cosmology” working groups, formed as part of the year-long program on ‘Statistical, Mathematical, and Computational Methods for Astronomy’, hosted by the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation funded institute. Two application areas have emerged for focused development in the cosmology working group, involving advanced algorithmic implementations of exact Bayesian inference for the Cosmic Microwave Background, and statistical modeling of galaxy formation. The former includes study and development of advanced Markov Chain Monte Carlo algorithms designed to confront challenging inference problems, including inference for spatial Gaussian random fields in the presence of sources of galactic emission (an example of a source separation problem). Extending these methods to future redshift survey data probing the nonlinear regime of large scale structure formation is also included in the working group activities. In addition, the working group is also focused on the study of ‘Galacticus’, a galaxy formation model applied to dark matter-only cosmological N-body simulations operating on time-dependent halo merger trees. The working group is interested in calibrating the Galacticus model to match statistics of galaxy survey observations; specifically stellar mass functions, luminosity functions, and color-color diagrams. The group will use subsampling approaches and fractional factorial designs to statistically and

  10. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight into the performance of the three computers when used to solve large-scale scientific problems...... that involve several types of numerical computations. The computers considered in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216

  11. Dynamic Systems Modeling for Sustainable Economic Empowerment in Cilacap

    Directory of Open Access Journals (Sweden)

    Nurul Anwar

    2011-09-01

    Full Text Available This paper investigates the dynamic problems of the living system in Kampung Laut, Cilacap, which include social problems and ecological changes. The paper uses a dynamic system model to structure the problems. The model simulates various feasible scenarios, from which the best becomes the basis for a policy to empower their sustainable economy. The model conceptualizes variables related to the problem to build a Causal Loop Diagram (CLD), which is then simulated using the Powersim 2.5 software package. Using the scenario of intensification and population control, the paper finds that it can increase the people’s income, with a positive trend until the end of the simulation. Keywords: Dynamic modelling, sustainable economic empowerment, causal loop diagram
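
    The stock-and-flow logic of such a causal-loop model can be reduced to a toy sketch (in Python rather than Powersim, with invented parameters): one stock for population, slowed by population control, and one for production, raised by intensification. Per-capita income rises whenever intensification outpaces net population growth, which is the qualitative behavior the simulated scenario reports.

```python
def simulate_income(years=20, pop=1000.0, prod=1000.0,
                    pop_growth=0.02, control=0.015, intensification=0.03):
    """Yearly update of two stocks; returns the per-capita income path.
    All parameter values are illustrative, not calibrated to Cilacap."""
    per_capita = []
    for _ in range(years):
        pop *= 1 + pop_growth - control   # population control slows growth
        prod *= 1 + intensification       # intensification raises output
        per_capita.append(prod / pop)
    return per_capita
```

    Running the baseline with `control=0.0` and `intensification=0.0` shows per-capita income declining as population outgrows flat production, which is why the combined scenario is selected as the basis for policy.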

  12. Incorporating economic models into seasonal pool conservation planning

    Science.gov (United States)

    Freeman, Robert C.; Bell, Kathleen P.; Calhoun, Aram J.K.; Loftin, Cyndy

    2012-01-01

    Massachusetts, New Jersey, Connecticut, and Maine have adopted regulatory zones around seasonal (vernal) pools to conserve terrestrial habitat for pool-breeding amphibians. Most amphibians require access to distinct seasonal habitats in both terrestrial and aquatic ecosystems because of their complex life histories. These habitat requirements make them particularly vulnerable to land uses that destroy habitat or limit connectivity (or permeability) among habitats. Regulatory efforts focusing on breeding pools without consideration of terrestrial habitat needs will not ensure the persistence of pool-breeding amphibians. We used GIS to combine a discrete-choice, parcel-scale economic model of land conversion with a landscape permeability model based on known habitat requirements of wood frogs (Lithobates sylvaticus) in Maine (USA) to examine permeability among habitat elements for alternative future scenarios. The economic model predicts future landscapes under different subdivision open space and vernal pool regulatory requirements. Our model showed that even “no build” permit zones extending 76 m (250 ft) outward from the pool edge were insufficient to assure permeability among required habitat elements. Furthermore, effectiveness of permit zones may be inconsistent due to interactions with other growth management policies, highlighting the need for local and state planning for the long-term persistence of pool-breeding amphibians in developing landscapes.

  13. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general-education and worldview functions of computer science call for additional research on the…

  14. Computable general equilibrium model fiscal year 2014 capability development report

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Laboratory; Boero, Riccardo [Los Alamos National Laboratory

    2016-05-11

    This report provides an overview of the development of the NISAC CGE economic modeling capability since 2012. This capability enhances NISAC's economic modeling and analysis capabilities, allowing a broader set of questions to be answered than was possible with the previous economic analysis capability. In particular, CGE modeling captures how the different sectors of the economy (for example, households, businesses, and government) interact to allocate resources, and this approach captures these interactions when it is used to estimate the economic impacts of the kinds of events NISAC often analyzes.

  15. Unification and mechanistic detail as drivers of model construction: models of networks in economics and sociology.

    Science.gov (United States)

    Kuorikoski, Jaakko; Marchionni, Caterina

    2014-12-01

    We examine the diversity of strategies of modelling networks in (micro) economics and (analytical) sociology. Field-specific conceptions of what explaining (with) networks amounts to or systematic preference for certain kinds of explanatory factors are not sufficient to account for differences in modelling methodologies. We argue that network models in both sociology and economics are abstract models of network mechanisms and that differences in their modelling strategies derive to a large extent from field-specific conceptions of the way in which a good model should be a general one. Whereas the economics models aim at unification, the sociological models aim at a set of mechanism schemas that are extrapolatable to the extent that the underlying psychological mechanisms are general. These conceptions of generality induce specific biases in mechanistic explanation and are related to different views of when knowledge from different fields should be seen as relevant.

  16. Economic Analysis of HPAI Control in the Netherlands I: Epidemiological modelling to support economic analysis

    NARCIS (Netherlands)

    Longworth, N.J.; Mourits, M.C.M.; Saatkamp, H.W.

    2014-01-01

    Economic analysis of control strategies for contagious diseases is a necessity in the development of contingency plans. Economic impacts arising from epidemics such as highly pathogenic avian influenza (HPAI) consist of direct costs (DC), direct consequential costs (DCC), indirect consequential

  17. Preliminary Phase Field Computational Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, Shenyang Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Ke [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCloy, John S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Bradley R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using monocrystalline Fe (i.e., ferrite) films as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in

  18. Abstract Modelling of the Impact of Activities of Economic Entities on the Social System

    Directory of Open Access Journals (Sweden)

    Dana Bernardová

    2017-01-01

    Full Text Available Economic entities as integral parts of the social system have an impact on it. The complexity of structures and uncertainty of behaviour which are also conditioned by incorporating the human factor are the typical characteristics of economic entities and the social system. The lack of precise measurement data as well as precise information is their typical feature. Methods of creating computer models of such systems must therefore be based on uncertain, incomplete or approximate data and hypothetical assumptions. The paper deals with the synthesis of the abstract model of the expert system for determining the level of corporate social responsibility of an enterprise (CSR with the use of methods of artificial intelligence. The linguistic rule model is built on the basis of the expert determination of the level of CSR based on the level of care for employees, level of supplier‑customer relations, level of its ecological behaviour, and compliance with legal obligations. The linguistic modelling method is based on the theoretical approach to fuzzy set mathematics and fuzzy logic. The aim of the paper is the presentation of the system for determining the level of CSR with the use of non‑conventional non‑numerical methods as well as simulative presentation of the efficiency of its functions. The above‑mentioned expert system is a relevant module of the built hierarchical structure aimed at the research of impacts of activities of economic entities on the social system.

  19. Modelling the lifetime economic consequences of glaucoma in France.

    Science.gov (United States)

    Philippe Nordmann, Jean; Lafuma, Antoine; Berdeaux, Gilles

    2009-03-01

    To estimate the lifetime economic consequences of glaucoma in France. A Markov model estimated the average discounted outcome and cost of glaucoma treatment over a patient's lifetime. Clinical states were defined as first- to fourth-line drugs, no treatment, laser therapy, surgery, blindness and death. After each failure (always after the fourth-line drug) patients could receive either laser treatment or surgery followed by no treatment, or a new treatment. A societal perspective was adopted. Sensitivity analyses were performed. Discounted medical costs were €7,322 for ocular hypertension treatment (OHT) and €8,488 for a glaucoma patient. Social costs of OHT and glaucoma patients exceeded medical costs. First-line use of the most effective drug would reduce medical and social costs. Societal willingness to pay for the vision benefit would equal the medical costs. Treatment initiated with the most effective drug is a cost-saving strategy. Public health decisions in glaucoma treatment should take a broad economic view embracing the lifetime duration of the disease. There is still a place both within and outside the healthcare system for therapeutic innovations with important economic consequences that bring high added value to patients.
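
    A Markov cohort model of this kind can be sketched in a few lines. The states, transition probabilities, annual costs, and discount rate below are all invented placeholders, not the paper's estimates; the point is only the mechanics of accumulating discounted expected cost over a lifetime horizon.

```python
# Sketch of a discounted Markov cohort model for lifetime treatment cost.
# All states, transition probabilities, costs, and the discount rate are
# hypothetical, chosen only to illustrate the method.

STATES = ["first_line", "second_line", "surgery", "blind", "dead"]

# Annual transition probabilities (each row sums to 1.0) -- illustrative only.
P = {
    "first_line":  {"first_line": 0.80, "second_line": 0.15,
                    "surgery": 0.02, "blind": 0.01, "dead": 0.02},
    "second_line": {"second_line": 0.78, "surgery": 0.15,
                    "blind": 0.04, "dead": 0.03},
    "surgery":     {"surgery": 0.90, "blind": 0.06, "dead": 0.04},
    "blind":       {"blind": 0.95, "dead": 0.05},
    "dead":        {"dead": 1.0},
}

ANNUAL_COST = {"first_line": 300.0, "second_line": 450.0,
               "surgery": 1200.0, "blind": 2500.0, "dead": 0.0}

def lifetime_cost(horizon_years=40, discount_rate=0.03):
    """Expected discounted cost for a cohort starting on first-line therapy."""
    dist = {s: 0.0 for s in STATES}
    dist["first_line"] = 1.0
    total = 0.0
    for year in range(horizon_years):
        factor = 1.0 / (1.0 + discount_rate) ** year
        total += factor * sum(dist[s] * ANNUAL_COST[s] for s in STATES)
        nxt = {s: 0.0 for s in STATES}          # advance the cohort one year
        for s, mass in dist.items():
            for t, p in P[s].items():
                nxt[t] += mass * p
        dist = nxt
    return total

print(round(lifetime_cost(), 2))
```

    A real application would add state-specific utilities alongside costs and calibrate the transition matrix from clinical data.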

  20. Final technical report for DOE Computational Nanoscience Project: Integrated Multiscale Modeling of Molecular Computing Devices

    Energy Technology Data Exchange (ETDEWEB)

    Cummings, P. T.

    2010-02-08

    This document reports the outcomes of the Computational Nanoscience Project, "Integrated Multiscale Modeling of Molecular Computing Devices". It includes a list of participants and publications arising from the research supported.

  1. Modeling groundwater flow on massively parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, S.F.; Falgout, R.D.; Fogwell, T.W.; Tompson, A.F.B.

    1994-12-31

    The authors will explore the numerical simulation of groundwater flow in three-dimensional heterogeneous porous media. An interdisciplinary team of mathematicians, computer scientists, hydrologists, and environmental engineers is developing a sophisticated simulation code for use on workstation clusters and MPPs. To date, they have concentrated on modeling flow in the saturated zone (single phase), which requires the solution of a large linear system. They will discuss their implementation of preconditioned conjugate gradient solvers. The preconditioners under consideration include simple diagonal scaling, s-step Jacobi, adaptive Chebyshev polynomial preconditioning, and multigrid. They will present some preliminary numerical results, including simulations of groundwater flow at the LLNL site. They also will demonstrate the code's scalability.
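
    The simplest preconditioner mentioned above, diagonal (Jacobi) scaling, can be illustrated with a minimal preconditioned conjugate gradient solver. This is a pure-Python sketch on a small dense symmetric positive definite matrix; a production groundwater code would use sparse storage and parallel kernels.

```python
# Minimal Jacobi (diagonal-scaling) preconditioned conjugate gradient.

def matvec(A, x):
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def dot(u, v):
    return sum(u_i * v_i for u_i, v_i in zip(u, v))

def pcg(A, b, tol=1e-10, max_iter=200):
    """Solve A x = b for symmetric positive definite A."""
    n = len(b)
    inv_diag = [1.0 / A[i][i] for i in range(n)]   # Jacobi preconditioner
    x = [0.0] * n
    r = b[:]                                        # residual of x = 0
    z = [inv_diag[i] * r[i] for i in range(n)]      # preconditioned residual
    p = z[:]
    rz = dot(r, z)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rz / dot(p, Ap)
        x = [x_i + alpha * p_i for x_i, p_i in zip(x, p)]
        r = [r_i - alpha * ap_i for r_i, ap_i in zip(r, Ap)]
        if dot(r, r) ** 0.5 < tol:
            break
        z = [inv_diag[i] * r[i] for i in range(n)]
        rz_new = dot(r, z)
        beta = rz_new / rz
        p = [z_i + beta * p_i for z_i, p_i in zip(z, p)]
        rz = rz_new
    return x

# Small SPD test system (diagonally dominant, hence positive definite).
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
x = pcg(A, b)
```

    Diagonal scaling is the cheapest of the listed options; the multigrid and polynomial preconditioners in the abstract trade more setup cost for far better convergence on large heterogeneous problems.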

  2. Computational modeling of a forward lunge

    DEFF Research Database (Denmark)

    Alkjær, Tine; Wieland, Maja Rose; Andersen, Michael Skipper

    2012-01-01

    This study investigated the function of the cruciate ligaments during a forward lunge movement. The mechanical roles of the anterior and posterior cruciate ligament (ACL, PCL) during sagittal plane movements, such as forward lunging, are unclear. A forward lunge movement contains a knee joint flexion and extension that is controlled by the quadriceps muscle. The contraction of the quadriceps can cause anterior tibial translation, which may strain the ACL at knee joint positions close to full extension. However, recent findings suggest that it is the PCL rather than the ACL which is strained during forward lunging. Thus, the purpose of the present study was to establish a musculoskeletal model of the forward lunge to computationally investigate the complete mechanical force equilibrium of the tibia during the movement to examine the loading pattern of the cruciate ligaments. A healthy female...

  3. Methodology and basic algorithms of the Livermore Economic Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.B.

    1981-03-17

    The methodology and the basic pricing algorithms used in the Livermore Economic Modeling System (EMS) are described. The report explains the derivations of the EMS equations in detail; however, it could also serve as a general introduction to the modeling system. The first part presents a brief but comprehensive explanation of what EMS is, what it does, and how it does it. The second part examines the basic pricing algorithms currently implemented in EMS. Each algorithm's function is analyzed and a detailed derivation of the actual mathematical expressions used to implement the algorithm is presented. EMS is an evolving modeling system; improvements in existing algorithms are constantly under development and new submodels are being introduced. A snapshot of the standard version of EMS is provided and areas currently under study and development are considered briefly.

  4. Direct regional energy/economic modeling (DREEM) design

    Energy Technology Data Exchange (ETDEWEB)

    Hall, P.D.; Pleatsikas, C.J.

    1979-10-01

    This report summarizes an investigation into the use of regional and multiregional economic models for estimating the indirect and induced impacts of Federally-mandated energy policies. It includes an examination of alternative types of energy policies that can impact regional economies and the available analytical frameworks for measuring the magnitudes and spatial extents of these impacts. One such analytical system, the National Regional Impact Evaluation System (NRIES), currently operational in the Bureau of Economic Analysis (BEA), is chosen for more detailed investigation. The report summarizes the model's capabilities for addressing various energy policy issues and then demonstrates the applicability of the model in specified contexts by developing appropriate input data for three scenarios. These scenarios concern the multi-state impacts of alternative coal-mining-development decisions, multi-regional impacts of macroeconomic change, and the comprehensive effects of an alternative national energy supply trajectory. On the basis of this experience, the capabilities of NRIES for analyzing energy-policy issues are summarized in a concluding chapter.

  5. A Computational Model for Predicting Gas Breakdown

    Science.gov (United States)

    Gill, Zachary

    2017-10-01

    Pulsed-inductive discharges are a common method of producing a plasma. They provide a mechanism for quickly and efficiently generating a large volume of plasma for rapid use and are seen in applications including propulsion, fusion power, and high-power lasers. However, some common designs see a delayed response time due to the plasma forming when the magnitude of the magnetic field in the thruster is at a minimum. New designs are difficult to evaluate due to the amount of time needed to construct a new geometry and the high monetary cost of changing the power generation circuit. To more quickly evaluate new designs and better understand the shortcomings of existing designs, a computational model is developed. This model uses a modified single-electron model as the basis for a Mathematica code to determine how the energy distribution in a system changes with regard to time and location. By analyzing this energy distribution, the approximate time and location of initial plasma breakdown can be predicted. The results from this code are then compared to existing data to show its validity and shortcomings. Missouri S&T APLab.

  6. Computational model of heterogeneous heating in melanin

    Science.gov (United States)

    Kellicker, Jason; DiMarzio, Charles A.; Kowalski, Gregory J.

    2015-03-01

    Melanin particles often present as an aggregate of smaller melanin pigment granules and have a heterogeneous surface morphology. When irradiated with light within the absorption spectrum of melanin, these heterogeneities produce measurable concentrations of the electric field that result in temperature gradients from thermal effects that are not seen with spherical or ellipsoidal modeling of melanin. Modeling melanin without taking into consideration the heterogeneous surface morphology yields results that underestimate the strongest signals or overestimate their spatial extent. We present a new technique to image phase changes induced by heating using a computational model of melanin that exhibits these surface heterogeneities. From this analysis, we demonstrate the heterogeneous energy absorption and resulting heating that occurs at the surface of the melanin granule that is consistent with three-photon absorption. Using the three-photon fluorescence as a beacon, we propose a method for detecting the extents of the melanin granule using photothermal microscopy to measure the phase changes resulting from the heating of the melanin.

  7. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  8. Partial differential equation models in the socio-economic sciences.

    Science.gov (United States)

    Burger, Martin; Caffarelli, Luis; Markowich, Peter A

    2014-11-13

    Mathematical models based on partial differential equations (PDEs) have become an integral part of quantitative analysis in most branches of science and engineering, recently expanding also towards biomedicine and socio-economic sciences. The application of PDEs in the latter is a promising field, but still largely open, leading to a variety of novel mathematical challenges. In this introductory article of the Theme Issue, we will provide an overview of the field and its recent boosting topics. Moreover, we will put the contributions to the Theme Issue in an appropriate perspective. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  9. Germany's socio-economic model and the Euro crisis

    Directory of Open Access Journals (Sweden)

    Michael Dauderstädt

    2013-03-01

    Full Text Available Germany's socio-economic model, the "social market economy", was established in West Germany after World War II and extended to the unified Germany in 1990. During a prolonged recession after the adoption of the Euro in 1998, major reforms (Agenda 2010) were introduced, which many consider the key to Germany's recent success. The reforms had mixed results: employment increased but has consisted to a large extent of precarious low-wage jobs. Growth depended on export surpluses based on an internal real devaluation (low unit labour costs), which makes Germany vulnerable to global recessions as in 2009. Overall inequality increased substantially.

  10. Partial differential equation models in the socio-economic sciences

    KAUST Repository

    Burger, Martin

    2014-10-06

    Mathematical models based on partial differential equations (PDEs) have become an integral part of quantitative analysis in most branches of science and engineering, recently expanding also towards biomedicine and socio-economic sciences. The application of PDEs in the latter is a promising field, but still largely open, leading to a variety of novel mathematical challenges. In this introductory article of the Theme Issue, we will provide an overview of the field and its recent boosting topics. Moreover, we will put the contributions to the Theme Issue in an appropriate perspective.

  11. ANALYSIS OF ECONOMIC MODELS OF POTATO PRODUCTION IN MONTENEGRO

    Directory of Open Access Journals (Sweden)

    Miomir JOVANOVIC

    2014-04-01

    Full Text Available The northern region of Montenegro represents a very important resource for agricultural production. However, depopulation of the analysed area, the pronounced in-kind character of production without significant participation of market producers, a lack of market research, and the absence of stronger vertical and horizontal connections between the primary production and processing sectors all contribute to the low level of competitiveness of agricultural production. Potato production in the analysed area has recorded positive trends in the last ten years. This paper presents economic models of agricultural households in the analysed area from the point of view of potato production.

  12. Boolean Variables in Economic Models Solved by Linear Programming

    Directory of Open Access Journals (Sweden)

    Lixandroiu D.

    2014-12-01

    Full Text Available The article analyses the use of logical variables in economic models solved by linear programming. Focus is given to the presentation of the way logical constraints are obtained and of the definition rules based on predicate logic. Emphasis is also put on the possibility to use logical variables in constructing a linear objective function on intervals. Such functions are encountered when costs or unit revenues differ on disjoint intervals of the production volumes achieved or sold. Other uses of Boolean variables are connected to constraint systems with conditions and the case of a variable which takes values from a finite set of integers.
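
    The classic use of a Boolean variable is tying production to a set-up decision via a big-M constraint: x may be positive only if the binary y equals 1, enforced by x <= M*y. The sketch below, with invented prices and capacity, solves a toy instance by brute-force enumeration rather than an LP solver so that it stays self-contained.

```python
# Toy illustration of a Boolean (0/1) variable in a production model.
# The big-M constraint x <= M * y couples production x to the set-up
# decision y; all prices and the capacity are made up for illustration.

M = 100            # capacity, doubling as the big-M bound on production
PRICE = 5.0        # revenue per unit
VAR_COST = 3.0     # variable cost per unit
SETUP_COST = 120.0 # fixed cost incurred only if we produce at all (y = 1)

def profit(x, y):
    return PRICE * x - VAR_COST * x - SETUP_COST * y

# Enumerate all feasible (x, y) pairs satisfying x <= M * y and pick the best.
best = max(
    ((x, y) for y in (0, 1) for x in range(M + 1) if x <= M * y),
    key=lambda xy: profit(*xy),
)
print(best, profit(*best))  # producing at capacity beats not producing
```

    A MILP solver would treat y as an integer-constrained variable directly; the enumeration here only works because the toy instance is tiny.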

  13. Mars Colony in situ resource utilization: An integrated architecture and economics model

    Science.gov (United States)

    Shishko, Robert; Fradet, René; Do, Sydney; Saydam, Serkan; Tapia-Cortez, Carlos; Dempster, Andrew G.; Coulton, Jeff

    2017-09-01

    This paper reports on our effort to develop an ensemble of specialized models to explore the commercial potential of mining water/ice on Mars in support of a Mars Colony. This ensemble starts with a formal systems architecting framework to describe a Mars Colony and capture its artifacts' parameters and technical attributes. The resulting database is then linked to a variety of "downstream" analytic models. In particular, we integrated an extraction process (i.e., "mining") model, a simulation of the colony's environmental control and life support infrastructure known as HabNet, and a risk-based economics model. The mining model focuses on the technologies associated with in situ resource extraction, processing, storage and handling, and delivery. This model computes the production rate as a function of the systems' technical parameters and the local Mars environment. HabNet simulates the fundamental sustainability relationships associated with establishing and maintaining the colony's population. The economics model brings together market information, investment and operating costs, along with measures of market uncertainty and Monte Carlo techniques, with the objective of determining the profitability of commercial water/ice in situ mining operations. All told, over 50 market and technical parameters can be varied in order to address "what-if" questions, including colony location.
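
    The risk-based economics step can be caricatured as Monte Carlo sampling of uncertain inputs feeding a discounted-cash-flow calculation. Every distribution and figure below is invented for illustration; none comes from the paper.

```python
# Monte Carlo profitability sketch: sample uncertain price and output,
# compute a net present value (NPV), and report the fraction of draws
# that are profitable. All numbers are hypothetical placeholders.

import random

def npv(price, annual_output, opex, capex, years=10, rate=0.08):
    """Discounted cash flow: up-front capex, then net revenue each year."""
    cash = -capex
    for t in range(1, years + 1):
        cash += (price * annual_output - opex) / (1 + rate) ** t
    return cash

def prob_profitable(n=10_000, seed=42):
    rng = random.Random(seed)
    wins = 0
    for _ in range(n):
        price = rng.lognormvariate(0.0, 0.3) * 500.0  # price per tonne, skewed
        output = rng.uniform(80, 120)                 # tonnes per year
        if npv(price, output, opex=30_000, capex=150_000) > 0:
            wins += 1
    return wins / n

print(prob_profitable())
```

    The real model varies over 50 parameters; the same pattern scales by drawing each uncertain parameter from its own distribution inside the loop.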

  14. The Effect of Units Lost Due to Deterioration in Fuzzy Economic Order Quantity (FEOQ) Model

    Directory of Open Access Journals (Sweden)

    M. Pattnaik

    2013-07-01

    Full Text Available For several decades, the Economic Order Quantity (EOQ) model and its variations have received much attention from researchers. Recently, there has been an investigation into an EOQ model incorporating the effect of units lost due to deterioration over an infinite planning horizon in a crisp decision environment. Here, the holding and ordering costs, traditionally treated as known quantities in inventory modeling, are investigated in a fuzzy environment, where they are not precisely known and are defined on a bounded interval of real numbers. The question is how reliable the EOQ models are when stocked items deteriorate over time. This paper introduces a Fuzzy Economic Order Quantity (FEOQ) model in which units lost due to deterioration are included in the objective function to properly model the problem over a finite planning horizon. The numerical analysis shows that an appropriate fuzzy policy can benefit the retailer, and the benefit is significant; for deteriorating items in particular, the fuzzy approach is shown to be superior to crisp decision making. A computational algorithm using LINGO 13.0 and MATLAB (R2009a) software is developed to find the optimal solution. Sensitivity analysis of the optimal solution is also studied, and managerial insights are drawn which show the influence of key model parameters.
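
    The crisp baseline behind any FEOQ variant is the classic EOQ formula Q* = sqrt(2DK/h), where D is annual demand, K the fixed cost per order, and h the holding cost per unit per year. The sketch below computes it with invented figures, plus a crude interval evaluation standing in for genuine fuzzy arithmetic.

```python
# Classic (crisp) economic order quantity, the starting point that fuzzy
# EOQ models generalize. Demand and cost figures are invented.

from math import sqrt

def eoq(demand, order_cost, holding_cost):
    """Classic economic order quantity: Q* = sqrt(2*D*K / h)."""
    return sqrt(2 * demand * order_cost / holding_cost)

# Crisp inputs: annual demand D, fixed cost per order K, holding cost h.
q_star = eoq(demand=1200, order_cost=50.0, holding_cost=6.0)
print(round(q_star, 1))  # optimal lot size in units

# A crude stand-in for fuzziness: evaluate EOQ at the endpoints of an
# interval of plausible holding costs (a true fuzzy model would propagate
# membership functions instead).
q_low, q_high = eoq(1200, 50.0, 7.5), eoq(1200, 50.0, 5.0)
```

    A fuzzy model replaces the point costs with fuzzy numbers and defuzzifies the resulting order quantity; the interval endpoints above only bracket the answer.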

  15. Representing, Running, and Revising Mental Models: A Computational Model.

    Science.gov (United States)

    Friedman, Scott; Forbus, Kenneth; Sherin, Bruce

    2017-12-27

    People use commonsense science knowledge to flexibly explain, predict, and manipulate the world around them, yet we lack computational models of how this commonsense science knowledge is represented, acquired, utilized, and revised. This is an important challenge for cognitive science: Building higher order computational models in this area will help characterize one of the hallmarks of human reasoning, and it will allow us to build more robust reasoning systems. This paper presents a novel assembled coherence (AC) theory of human conceptual change, whereby people revise beliefs and mental models by constructing and evaluating explanations using fragmentary, globally inconsistent knowledge. We implement AC theory with Timber, a computational model of conceptual change that revises its beliefs and generates human-like explanations in commonsense science. Timber represents domain knowledge using predicate calculus and qualitative model fragments, and uses an abductive model formulation algorithm to construct competing explanations for phenomena. Timber then (a) scores competing explanations with respect to previously accepted beliefs, using a cost function based on simplicity and credibility, (b) identifies a low-cost, preferred explanation and accepts its constituent beliefs, and then (c) greedily alters previous explanation preferences to reduce global cost and thereby revise beliefs. Consistency is a soft constraint in Timber; it is biased to select explanations that share consistent beliefs, assumptions, and causal structure with its other, preferred explanations. In this paper, we use Timber to simulate the belief changes of students during clinical interviews about how the seasons change. We show that Timber produces and revises a sequence of explanations similar to those of the students, which supports the psychological plausibility of AC theory. Copyright © 2017 Cognitive Science Society, Inc.
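
    The cost-based selection among competing explanations can be caricatured as follows. The beliefs, weights, and cost function here are invented stand-ins for Timber's actual simplicity and credibility scoring, which operates over predicate calculus and qualitative model fragments.

```python
# Toy version of scoring competing explanations: each explanation is a
# set of beliefs plus a count of assumptions, and its cost combines
# simplicity (fewer assumptions) with credibility (penalizing beliefs
# not already accepted). All content and weights are invented.

ACCEPTED = {"earth_orbits_sun", "axis_is_tilted"}  # previously accepted beliefs

EXPLANATIONS = {
    "tilt_theory": {
        "beliefs": {"earth_orbits_sun", "axis_is_tilted",
                    "tilt_changes_sun_angle"},
        "assumptions": 1,
    },
    "distance_theory": {
        "beliefs": {"earth_orbits_sun", "orbit_is_very_elliptical",
                    "closer_means_hotter"},
        "assumptions": 2,
    },
}

def cost(expl, w_simplicity=1.0, w_credibility=2.0):
    novel = expl["beliefs"] - ACCEPTED  # beliefs not yet accepted
    return w_simplicity * expl["assumptions"] + w_credibility * len(novel)

preferred = min(EXPLANATIONS, key=lambda k: cost(EXPLANATIONS[k]))
print(preferred)  # the tilt explanation wins on cost
```

    Timber's greedy revision step would then re-examine earlier preferences in light of the newly accepted beliefs; this sketch stops at a single selection.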

  16. Artemisinin resistance--modelling the potential human and economic costs.

    Science.gov (United States)

    Lubell, Yoel; Dondorp, Arjen; Guérin, Philippe J; Drake, Tom; Meek, Sylvia; Ashley, Elizabeth; Day, Nicholas P J; White, Nicholas J; White, Lisa J

    2014-11-23

    Artemisinin combination therapy is recommended as first-line treatment for falciparum malaria across the endemic world and is increasingly relied upon for treating vivax malaria where chloroquine is failing. Artemisinin resistance was first detected in western Cambodia in 2007, and is now confirmed in the Greater Mekong region, raising the spectre of a malaria resurgence that could undo a decade of progress in control, and threaten the feasibility of elimination. The magnitude of this threat has not been quantified. This analysis compares the health and economic consequences of two future scenarios occurring once artemisinin-based treatments are available with high coverage. In the first scenario, artemisinin combination therapy (ACT) is largely effective in the management of uncomplicated malaria and severe malaria is treated with artesunate, while in the second scenario ACT are failing at a rate of 30%, and treatment of severe malaria reverts to quinine. The model is applied to all malaria-endemic countries using their specific estimates for malaria incidence, transmission intensity and GDP. The model describes the direct medical costs for repeated diagnosis and retreatment of clinical failures as well as admission costs for severe malaria. For productivity losses, the conservative friction costing method is used, which assumes a limited economic impact for individuals that are no longer economically active until they are replaced from the unemployment pool. Using conservative assumptions and parameter estimates, the model projects an excess of 116,000 deaths annually in the scenario of widespread artemisinin resistance. The predicted medical costs for retreatment of clinical failures and for management of severe malaria exceed US$32 million per year. Productivity losses resulting from excess morbidity and mortality were estimated at US$385 million for each year during which failing ACT remained in use as first-line treatment. These 'ballpark' figures for the

  17. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  18. Economic man as model man: Ideal types, idealization and caricatures

    NARCIS (Netherlands)

    Morgan, M.S.

    2006-01-01

    Economics revolves around a central character: "economic man." As historians, we are all familiar with various episodes in the history of this character, and we appreciate his ever-changing aspect even while many of our colleagues in economics think the rational economic agent of neoclassical

  19. RECON: a computer program for analyzing repository economics. Documentation and user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Clark, L.L.; Cole, B.M.; McNair, G.W.; Schutz, M.E.

    1983-05-01

    From 1981 through 1983 the Pacific Northwest Laboratory has been developing a computer model named RECON to calculate repository costs from parametric data input. The objective of the program has been to develop the capability to evaluate the effect on costs of changes in repository design parameters and operating scenario assumptions. This report documents the development of the model through March of 1983. Included in the report are: (1) descriptions of model development and the underlying equations, assumptions and definitions; (2) descriptions of data input either using card images or an interactive data input program; and (3) detailed listings of the program and definitions of program variables. Cost estimates generated using the model have been verified against independent estimates and good agreement has been obtained.

  20. Process model economics of xanthan production from confectionery industry wastewaters.

    Science.gov (United States)

    Bajić, Bojana Ž; Vučurović, Damjan G; Dodić, Siniša N; Grahovac, Jovana A; Dodić, Jelena M

    2017-12-01

    In this research a process and cost model for a xanthan production facility was developed using process simulation software (SuperPro Designer®). This work represents a novelty in the field for two reasons. One is that xanthan gum has been produced from several wastes but never from wastewaters from confectionery industries. The other, more important, reason is that the aforementioned software, which is intended exclusively for bioprocesses, is used for generating a base case, i.e. a starting point for transferring the technology to industrial scales. Previously acquired experimental knowledge about using confectionery wastewaters from five different factories as substitutes for the commercially used cultivation medium has been incorporated into the process model in order to assess the economic viability of implementing such substrates. A lower initial sugar content in the medium based on wastewater (28.41 g/L) compared to the synthetic medium (30.00 g/L) gave a lower xanthan content at the end of cultivation (23.98 and 26.27 g/L, respectively). Although this resulted in somewhat poorer economic parameters, they were still in the range of being an investment of interest. Also, the possibility of utilizing a cheap resource (waste) and reducing the pollution that would result from its disposal has a positive effect on the environment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Computational modeling of acute myocardial infarction.

    Science.gov (United States)

    Sáez, P; Kuhl, E

    2016-01-01

    Myocardial infarction, commonly known as heart attack, is caused by reduced blood supply and damages the heart muscle because of a lack of oxygen. Myocardial infarction initiates a cascade of biochemical and mechanical events. In the early stages, cardiomyocyte death, wall thinning, collagen degradation, and ventricular dilation are the immediate consequences of myocardial infarction. In the later stages, collagenous scar formation in the infarcted zone and hypertrophy of the non-infarcted zone are auto-regulatory mechanisms to partly correct for these events. Here we propose a computational model for the short-term adaptation after myocardial infarction using the continuum theory of multiplicative growth. Our model captures the effects of cell death initiating wall thinning, and collagen degradation initiating ventricular dilation. Our simulations agree well with clinical observations in early myocardial infarction. They represent a first step toward simulating the progression of myocardial infarction with the ultimate goal to predict the propensity toward heart failure as a function of infarct intensity, location, and size.

  2. Economic Pressure in African American Families: A Replication and Extension of the Family Stress Model.

    Science.gov (United States)

    Conger, Rand D.; Wallace, Lora Ebert; Sun, Yumei; Simons, Ronald L.; McLoyd, Vonnie C.; Brody, Gene H.

    2002-01-01

    Evaluated applicability of family stress model of economic hardship for understanding economic influences on child development among African American families with a 10- or 11-year-old child. Found that economic hardship positively related to economic pressure in families, and to emotional distress of caregivers, which in turn damaged the…

  3. A semantic-web approach for modeling computing infrastructures

    NARCIS (Netherlands)

    Ghijsen, M.; van der Ham, J.; Grosso, P.; Dumitru, C.; Zhu, H.; Zhao, Z.; de Laat, C.

    2013-01-01

    This paper describes our approach to modeling computing infrastructures. Our main contribution is the Infrastructure and Network Description Language (INDL) ontology. The aim of INDL is to provide technology independent descriptions of computing infrastructures, including the physical resources as

  4. Estimating the economic opportunity cost of water use with river basin simulators in a computationally efficient way

    Science.gov (United States)

    Rougé, Charles; Harou, Julien J.; Pulido-Velazquez, Manuel; Matrosov, Evgenii S.

    2017-04-01

    The marginal opportunity cost of water refers to the benefits forgone by not allocating an additional unit of water to its most economically productive use at a specific location in a river basin at a specific moment in time. Estimating the opportunity cost of water is an important contribution to water management, as it can be used for better water allocation or better system operation and can suggest where future water infrastructure could be most beneficial. Opportunity costs can be estimated using 'shadow values' provided by hydro-economic optimization models. Yet such models' reliance on optimization means they have difficulty accurately representing the impact of operating rules and regulatory and institutional mechanisms on actual water allocation. In this work we use more widely available river basin simulation models to estimate opportunity costs. This has been done before by adding a small quantity of water to the model at the place and time where the opportunity cost should be computed, then running a simulation and comparing the difference in system benefits. The added system benefits per unit of water added to the system then provide an approximation of the opportunity cost. This approximation can then be used to design efficient pricing policies that provide incentives for users to reduce their water consumption. Yet this method requires one simulation run per node and per time step, which is computationally demanding for large-scale systems and short time steps (e.g., a day or a week). Moreover, opportunity cost estimates are supposed to reflect the most productive use of an additional unit of water, yet the simulation rules do not necessarily use water that way. In this work, we propose an alternative approach, which computes the opportunity cost through a double backward induction, first recursively from outlet to headwaters within the river network at each time step, then recursively backwards in time. Both backward inductions only require linear
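    The perturbation method this abstract describes (add a small volume of water at one node and time, re-simulate, and difference the system benefits) can be sketched generically. The simulator and benefit function below are hypothetical stand-ins, not the authors' model:

```python
def marginal_opportunity_cost(simulate_benefits, inflows, node, t, delta=1.0):
    """Estimate the opportunity cost of water at (node, t) by finite difference.

    simulate_benefits: function mapping an inflow dict to total system benefits
    inflows: dict {(node, t): volume} of baseline inflows
    delta: small added volume of water (same units as inflows)
    """
    base = simulate_benefits(inflows)
    perturbed = dict(inflows)
    perturbed[(node, t)] = perturbed.get((node, t), 0.0) + delta
    return (simulate_benefits(perturbed) - base) / delta

# Toy quadratic benefit function: diminishing returns to water at each node.
def toy_benefits(inflows):
    return sum(10.0 * v - 0.5 * v * v for v in inflows.values())

baseline = {("reservoir_A", 0): 4.0, ("reservoir_B", 0): 8.0}
# Marginal benefit at reservoir_A is 10 - v = 6 at v = 4.
print(marginal_opportunity_cost(toy_benefits, baseline, "reservoir_A", 0, delta=0.01))
```

    Note that this brute-force estimate needs one simulation per (node, time) pair, which is exactly the computational burden the paper's double backward induction is designed to avoid.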

  5. Fuzzy economic production quantity model with time dependent demand rate

    Directory of Open Access Journals (Sweden)

    Susanta Kumar Indrajitsingha

    2016-09-01

    Background: In this paper, an economic production quantity model is considered under a fuzzy environment. Both the demand and the holding cost are represented by pentagonal fuzzy numbers. The Signed Distance Method is used to defuzzify the total cost function. Methods: The results obtained by these methods are compared with the help of a numerical example. Sensitivity analysis is also carried out to explore the effect of changes in the values of some of the system parameters. Results and conclusions: The fuzzy EPQ model with a time-dependent demand rate was presented together with its possible implementation. The behavior of changes in parameters was analyzed, and a possible extension of the implementation of this method was presented.
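    For context, the crisp (non-fuzzy) economic production quantity that fuzzy EPQ models generalize is the textbook formula Q* = sqrt(2DK / (h(1 - D/P))). The sketch below uses illustrative parameter values, not figures from the paper:

```python
from math import sqrt

def epq(demand, setup_cost, holding_cost, production_rate):
    """Classic (crisp) economic production quantity:
    Q* = sqrt(2*D*K / (h*(1 - D/P))), valid only when P > D."""
    assert production_rate > demand, "production rate must exceed demand"
    return sqrt(2 * demand * setup_cost
                / (holding_cost * (1 - demand / production_rate)))

# Illustrative: D = 1000 units/yr, K = 100 per setup, h = 5 per unit/yr, P = 4000 units/yr
print(round(epq(1000, 100, 5, 4000), 2))  # -> 230.94
```

    A fuzzy variant replaces crisp parameters such as h with fuzzy numbers and defuzzifies the resulting total-cost function (e.g. by the signed distance) before optimizing.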

  6. An economical semi-analytical orbit theory for micro-computer applications

    Science.gov (United States)

    Gordon, R. A.

    1988-01-01

    An economical algorithm is presented for predicting the position of a satellite perturbed by drag and the zonal harmonics J_2 through J_4. Simplicity being of the essence, drag is modeled as a secular decay rate in the semi-major axis (retarded motion), with the zonal perturbations modeled from a modified version of Brouwer's formulas. The algorithm is developed as: an alternative on-board orbit predictor; a backup propagator requiring low energy consumption; or a ground-based propagator for microcomputer applications (e.g., at the foot of an antenna). An O(J_2) secular retarded state partial matrix (matrizant) is also given for use with state estimation. The theory was implemented in BASIC on an inexpensive microcomputer, with the program occupying under 8K bytes of memory. Simulated trajectory data and real tracking data are employed to illustrate the theory's ability to accurately accommodate oblateness and drag effects.

  7. Computational and Modeling Strategies for Cell Motility

    Science.gov (United States)

    Wang, Qi; Yang, Xiaofeng; Adalsteinsson, David; Elston, Timothy C.; Jacobson, Ken; Kapustina, Maryna; Forest, M. Gregory

    A predictive simulation of the dynamics of a living cell remains a fundamental modeling and computational challenge. The challenge does not even make sense unless one specifies the level of detail and the phenomena of interest, whether the focus is on near-equilibrium or strongly nonequilibrium behavior, and on localized, subcellular, or global cell behavior. Therefore, choices have to be made clear at the outset, ranging from distinguishing between prokaryotic and eukaryotic cells, specificity within each of these types, whether the cell is "normal," whether one wants to model mitosis, blebs, migration, division, deformation due to confined flow as with red blood cells, and the level of microscopic detail for any of these processes. The review article by Hoffman and Crocker [48] is both an excellent overview of cell mechanics and an inspiration for our approach. One might be interested, for example, in duplicating the intricate experimental details reported in [43]: "actin polymerization periodically builds a mechanical link, the lamellipodium, connecting myosin motors with the initiation of adhesion sites, suggesting that the major functions driving motility are coordinated by a biomechanical process," or to duplicate experimental evidence of traveling waves in cells recovering from actin depolymerization [42, 35]. Modeling studies of lamellipodial structure, protrusion, and retraction behavior range from early mechanistic models [84] to more recent deterministic [112, 97] and stochastic [51] approaches with significant biochemical and structural detail. Recent microscopic-macroscopic models and algorithms for cell blebbing have been developed by Young and Mitran [116], which update cytoskeletal microstructure via statistical sampling techniques together with fluid variables. Alternatively, whole cell compartment models (without spatial details) of oscillations in spreading cells have been proposed [35, 92, 109] which show positive and negative feedback

  8. Integration of multiple determinants in the neuronal computation of economic values.

    Science.gov (United States)

    Raghuraman, Anantha P; Padoa-Schioppa, Camillo

    2014-08-27

    Economic goods may vary on multiple dimensions (determinants). A central conjecture in decision neuroscience is that choices between goods are made by comparing subjective values computed through the integration of all relevant determinants. Previous work identified three groups of neurons in the orbitofrontal cortex (OFC) of monkeys engaged in economic choices: (1) offer value cells, which encode the value of individual offers; (2) chosen value cells, which encode the value of the chosen good; and (3) chosen juice cells, which encode the identity of the chosen good. In principle, these populations could be sufficient to generate a decision. Critically, previous work did not assess whether offer value cells (the putative input to the decision) indeed encode subjective values as opposed to physical properties of the goods, and/or whether offer value cells integrate multiple determinants. To address these issues, we recorded from the OFC while monkeys chose between risky outcomes. Confirming previous observations, three populations of neurons encoded the value of individual offers, the value of the chosen option, and the value-independent choice outcome. The activity of both offer value cells and chosen value cells encoded values defined by the integration of juice quantity and probability. Furthermore, both populations reflected the subjective risk attitude of the animals. We also found additional groups of neurons encoding the risk associated with a particular option, the risky nature of the chosen option, and whether the trial outcome was positive or negative. These results provide substantial support for the conjecture described above and for the involvement of OFC in good-based decisions. Copyright © 2014 the authors 0270-6474/14/3311583-21$15.00/0.

  9. [Mathematical models for economic evaluation: dynamic models based on differential equations].

    Science.gov (United States)

    Pradas Velasco, Roberto; Villar, Fernando Antoñanzas; Mar, Javier

    2009-01-01

    The joint utilization of both decision trees and epidemiological models based on differential equations is an appropriate method for the economic evaluation of preventative interventions applied to infectious diseases. These models can combine the dynamic pattern of the disease together with health resource consumption. To illustrate this type of model, we adjusted a dynamic system of differential equations to the epidemic behavior of influenza in Spain, with a view to projecting the epidemiologic impact of influenza vaccination. The results of the epidemic model are implemented in a diagram with the structure of a decision tree so that health resource consumption and the economic implications can be calculated.

  10. A computational model of consciousness for artificial emotional agents.

    OpenAIRE

    Kotov Artemy A.

    2017-01-01

    Background. The structure of consciousness has long been a cornerstone problem in the cognitive sciences. Recently it took on applied significance in the design of computer agents and mobile robots. This problem can thus be examined from perspectives of phi­losophy, neuropsychology, and computer modeling. Objective. In the present paper, we address the problem of the computational model of consciousness by designing computer agents aimed at simulating “speech understand­ing” and irony. Fu...

  11. Propagation of Computer Virus under Human Intervention: A Dynamical Model

    OpenAIRE

    Chenquan Gan; Xiaofan Yang; Wanping Liu; Qingyi Zhu; Xulong Zhang

    2012-01-01

    This paper examines the propagation behavior of computer virus under human intervention. A dynamical model describing the spread of computer virus, under which a susceptible computer can become recovered directly and an infected computer can become susceptible directly, is proposed. Through a qualitative analysis of this model, it is found that the virus-free equilibrium is globally asymptotically stable when the basic reproduction number R0≤1, whereas the viral equilibrium is globally asympt...
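    The threshold behavior summarized in this abstract (the virus-free equilibrium is stable for R0 ≤ 1, while an endemic equilibrium takes over for R0 > 1) can be illustrated with a much simpler SIS-type toy model. This is not the paper's actual system, which additionally allows direct susceptible-to-recovered and infected-to-susceptible transitions:

```python
def simulate_sis(beta, gamma, i0=0.1, dt=0.01, steps=200000):
    """Toy SIS dynamics (a simplification, not the paper's exact model):
    dI/dt = beta*I*(1 - I) - gamma*I, with R0 = beta/gamma.
    Returns the infected fraction after forward-Euler integration."""
    i = i0
    for _ in range(steps):
        i += dt * (beta * i * (1 - i) - gamma * i)
    return i

# R0 = 0.5 < 1: infections die out toward the virus-free equilibrium.
print(simulate_sis(beta=0.2, gamma=0.4))   # -> ~0.0
# R0 = 2 > 1: infections persist at the endemic level 1 - 1/R0 = 0.5.
print(simulate_sis(beta=0.6, gamma=0.3))   # -> ~0.5
```

    Human intervention in the paper's model effectively shifts these rates, which is what moves R0 across the critical value 1.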

  12. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data is a very high priority. Following a loss-of-coolant and system depressurization incident, air will enter the core of the high temperature gas-cooled reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate, with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics model developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.

  13. Actor Model of Computation for Scalable Robust Information Systems : One computer is no computer in IoT

    OpenAIRE

    Hewitt, Carl

    2015-01-01

    International audience; The Actor Model is a mathematical theory that treats “Actors” as the universal conceptual primitives of digital computation. Hypothesis: All physically possible computation can be directly implemented using Actors.The model has been used both as a framework for a theoretical understanding of concurrency, and as the theoretical basis for several practical implementations of concurrent systems. The advent of massive concurrency through client-cloud computing and many-cor...

  14. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. Biocellion: accelerating computer simulation of multicellular biological system models

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  16. Applied economic model development algorithm for electronics company

    Directory of Open Access Journals (Sweden)

    Mikhailov I.

    2017-01-01

    The purpose of this paper is to report on experience gained in creating methods and algorithms that simplify the development of applied decision support systems. It describes an algorithm that is the result of two years of research and has had more than a year of practical verification. In the electronic component testing business, the moment of contract conclusion is the point at which the most serious managerial mistakes can be made: at this stage, it is difficult to achieve a realistic assessment of the time limit and the wage fund for future work. Creating an estimating model is a possible way to solve this problem. The article presents an algorithm for the creation of such models, based on the example of developing an analytical model for estimating the amount of work. The paper lists the algorithm's stages and explains their meaning in relation to the participants' goals. The implementation of the algorithm has made possible a twofold acceleration of model development and the fulfilment of management's requirements. The resulting models have had a significant economic effect. A new set of tasks was identified for further theoretical study.

  17. Global economic consequences of selected surgical diseases: a modelling study.

    Science.gov (United States)

    Alkire, Blake C; Shrime, Mark G; Dare, Anna J; Vincent, Jeffrey R; Meara, John G

    2015-04-27

    The surgical burden of disease is substantial, but little is known about the associated economic consequences. We estimate the global macroeconomic impact of the surgical burden of disease due to injury, neoplasm, digestive diseases, and maternal and neonatal disorders from two distinct economic perspectives. We obtained mortality rate estimates for each disease for the years 2000 and 2010 from the Institute of Health Metrics and Evaluation Global Burden of Disease 2010 study, and estimates of the proportion of the burden of the selected diseases that is surgical from a paper by Shrime and colleagues. We first used the value of lost output (VLO) approach, based on the WHO's Projecting the Economic Cost of Ill-Health (EPIC) model, to project annual market economy losses due to these surgical diseases during 2015-30. EPIC attempts to model how disease affects a country's projected labour force and capital stock, which in turn are related to losses in economic output, or gross domestic product (GDP). We then used the value of lost welfare (VLW) approach, which is conceptually based on the value of a statistical life and is inclusive of non-market losses, to estimate the present value of long-run welfare losses resulting from mortality and short-run welfare losses resulting from morbidity incurred during 2010. Sensitivity analyses were performed for both approaches. During 2015-30, the VLO approach projected that surgical conditions would result in losses of 1·25% of potential GDP, or $20·7 trillion (2010 US$, purchasing power parity) in the 128 countries with data available. When expressed as a proportion of potential GDP, annual GDP losses were greatest in low-income and middle-income countries, with up to a 2·5% loss in output by 2030. When total welfare losses are assessed (VLW), the present value of economic losses is estimated to be equivalent to 17% of 2010 GDP, or $14·5 trillion in the 175 countries assessed with this approach. Neoplasm and injury account

  18. Elements of matrix modeling and computing with Matlab

    CERN Document Server

    White, Robert E

    2006-01-01

    As discrete models and computing have become more common, there is a need to study matrix computation and numerical linear algebra. Encompassing a diverse mathematical core, Elements of Matrix Modeling and Computing with MATLAB examines a variety of applications and their modeling processes, showing you how to develop matrix models and solve algebraic systems. Emphasizing practical skills, it creates a bridge from problems with two and three variables to more realistic problems that have additional variables. Elements of Matrix Modeling and Computing with MATLAB focuses on seven basic applicat

  19. Modeling economic costs of disasters and recovery involving positive effects of reconstruction: analysis using a dynamic CGE model

    Science.gov (United States)

    Xie, W.; Li, N.; Wu, J.-D.; Hao, X.-L.

    2013-11-01

    Disaster damages have negative effects on the economy, whereas reconstruction investments have positive effects. The aim of this study is to model the economic costs of disasters and recovery, taking into account the positive effects of reconstruction activities. A computable general equilibrium (CGE) model is a promising approach because it can incorporate these two kinds of shocks into a unified framework and thereby avoid the double-counting problem. In order to factor both shocks into the CGE model, the direct loss is set as the amount by which the capital stock is reduced on the supply side of the economy; a portion of investments restores the capital stock in each period; and an investment-driven dynamic model is formulated from the available reconstruction data, with the rest of the country's saving set as an endogenous variable. The 2008 Wenchuan Earthquake is selected as a case study to illustrate the model, and three scenarios are constructed: S0 (no disaster occurs), S1 (disaster occurs with reconstruction investment) and S2 (disaster occurs without reconstruction investment). S0 is taken as business as usual, and the differences between S1 and S0 and between S2 and S0 can be interpreted as economic losses including and excluding reconstruction, respectively. The study showed that output under S1 is closer to real data than that under S2. S2 overestimates the economic loss by roughly two times that under S1. The gap in economic aggregate between S1 and S0 is reduced to 3% in 2011, a level that would take another four years to reach under S2.
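    The scenario-differencing logic in this abstract (losses measured as S0 minus S1, or S0 minus S2) reduces to simple arithmetic once the scenario output paths are simulated. The GDP figures below are placeholders for illustration only, not the paper's data:

```python
# Hypothetical annual output paths (billions; illustrative numbers only).
s0 = {2008: 100.0, 2009: 104.0, 2010: 108.2, 2011: 112.5}  # no disaster (baseline)
s1 = {2008:  95.0, 2009: 101.0, 2010: 106.0, 2011: 109.1}  # disaster + reconstruction
s2 = {2008:  95.0, 2009:  99.0, 2010: 102.5, 2011: 105.0}  # disaster, no reconstruction

# Economic loss in each year, measured against the business-as-usual path S0.
loss_with_recon    = {y: s0[y] - s1[y] for y in s0}  # loss including reconstruction
loss_without_recon = {y: s0[y] - s2[y] for y in s0}  # loss excluding reconstruction

for y in sorted(s0):
    print(y, round(loss_with_recon[y], 1), round(loss_without_recon[y], 1))
```

    With these toy paths the no-reconstruction scenario overstates later-year losses roughly twofold, mirroring the qualitative pattern the study reports.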

  20. Economic evaluation of newborn hearing screening: modelling costs and outcomes

    Directory of Open Access Journals (Sweden)

    von Voß, Hubertus

    2003-12-01

    Objectives: The prevalence of newborn hearing disorders is 1-3 per 1,000. Crucial for later outcome are correct diagnosis and effective treatment as soon as possible. With BERA and TEOAE, low-risk techniques for early detection are available. Universal screening is recommended but not realised in most European health care systems. The aim of the study was to examine the scientific evidence of newborn hearing screening and to compare the medical outcome and costs of different programmes, differentiated by type of strategy (risk screening, universal screening, no systematic screening). Methods: In an interdisciplinary health technology assessment project, all studies on newborn hearing screening detected in a standardized comprehensive literature search were identified and data on medical outcome, costs, and cost-effectiveness extracted. A Markov model was designed to calculate cost-effectiveness ratios. Results: Economic data were extracted from 20 relevant publications out of 39 publications found. In the model, total costs for screening of 100,000 newborns with a time horizon of ten years were calculated: 2.0 Mio.€ for universal screening (U), 1.0 Mio.€ for risk screening (R), and 0.6 Mio.€ for no screening (N). The costs per child detected were 13,395€ (U), 6,715€ (R), and 4,125€ (N). At 6 months of life the following percentages of cases are detected: U 72%, R 43%, N 13%. Conclusions: A remarkably small number of economic publications, mainly of low methodological quality, was found. In our own model we found reasonable cost-effectiveness ratios also for universal screening. Considering the outcome advantages of higher numbers of detected cases, universal newborn hearing screening is recommended.

  1. Economic evaluation of newborn hearing screening: modelling costs and outcomes

    Science.gov (United States)

    Hessel, Franz; Grill, Eva; Schnell-Inderst, Petra; Siebert, Uwe; Kunze, Silke; Nickisch, Andreas; von Voß, Hubertus; Wasem, Jürgen

    2003-01-01

    Objectives: The prevalence of newborn hearing disorders is 1-3 per 1,000. Crucial for later outcome are correct diagnosis and effective treatment as soon as possible. With BERA and TEOAE, low-risk techniques for early detection are available. Universal screening is recommended but not realised in most European health care systems. The aim of the study was to examine the scientific evidence of newborn hearing screening and to compare the medical outcome and costs of different programmes, differentiated by type of strategy (risk screening, universal screening, no systematic screening). Methods: In an interdisciplinary health technology assessment project, all studies on newborn hearing screening detected in a standardized comprehensive literature search were identified and data on medical outcome, costs, and cost-effectiveness extracted. A Markov model was designed to calculate cost-effectiveness ratios. Results: Economic data were extracted from 20 relevant publications out of 39 publications found. In the model, total costs for screening of 100,000 newborns with a time horizon of ten years were calculated: 2.0 Mio.€ for universal screening (U), 1.0 Mio.€ for risk screening (R), and 0.6 Mio.€ for no screening (N). The costs per child detected were 13,395€ (U), 6,715€ (R), and 4,125€ (N). At 6 months of life the following percentages of cases are detected: U 72%, R 43%, N 13%. Conclusions: A remarkably small number of economic publications, mainly of low methodological quality, was found. In our own model we found reasonable cost-effectiveness ratios also for universal screening. Considering the outcome advantages of higher numbers of detected cases, universal newborn hearing screening is recommended. PMID:19675707

  2. Optimization Model for Economic Evaluation of Wind Farms - How to Optimize a Wind Energy Project Economically and Technically

    Directory of Open Access Journals (Sweden)

    Wagner Sousa de Oliveira

    2012-01-01

    This paper reviews and systematizes methods and techniques of economic evaluation applied to renewable energy projects, specifically wind energy projects. Both project- and cost-based methodologies of economic evaluation are reviewed in order to construct a proposed optimization model with the most appropriate objective function. It is necessary to engage different but complementary approaches: microeconomic project evaluation methods and optimization methods applied to engineering solutions in wind energy converter systems. An optimization model for the economic evaluation of wind farms can serve as a tool for efficient planning and resource management, which is the key to the success of an energy project. Wind energy is one of the most potent alternative energy resources; however, the economics of wind energy is not yet universally favorable enough to place wind on a competitive platform with coal and natural gas (fossil fuels). The economic evaluation models of wind projects developed here would allow investors to better plan their projects, as well as provide valuable insight into the areas that require further development to improve the overall economics of wind energy projects.

  3. A cost modelling system for cloud computing

    OpenAIRE

    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh

    2014-01-01

    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while reducing the cost of doing business as well. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible and reliable as well as cost effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier with no up-front charges but pay per-use flexible payme...

  4. INTEGRATION OF ECONOMIC AND COMPUTER SKILLS AT IMPLEMENTATION OF STUDENTS PROJECT «BUSINESS PLAN PRODUCING IN MICROSOFT WORD»

    Directory of Open Access Journals (Sweden)

    Y.B. Samchinska

    2012-07-01

    The article substantiates the expedience of a complex student project on Informatics and Computer Science for students of economic specialities, based on the creation of a business plan using modern information technologies, and presents methodical recommendations for the implementation of this project.

  5. User's manual for atmospheric fluidized bed combustor system economic performance algorithm computer program. [AFBCIBM

    Energy Technology Data Exchange (ETDEWEB)

    1979-12-01

    The computer program calculates several economic and energy terms, given various performance and cost parameters, for a system composed of coal, a coal beneficiation (cleaning) plant, a combustor plant and an associated flue gas desulfurization (FGD) plant. The combustor can be either an atmospheric fluidized bed combustor (AFBC) or a conventional pulverized (CP) combustor. The FGD system is a lime-slurry system.

  6. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. An economic model of friendship and enmity for measuring social balance in networks

    Science.gov (United States)

    Lee, Kyu-Min; Shin, Euncheol; You, Seungil

    2017-12-01

    We propose a dynamic economic model of networks where agents can be friends or enemies with one another. This is a decentralized relationship model in that agents decide whether to change their relationships so as to minimize their imbalanced triads. In this model, there is a single parameter, which we call social temperature, that captures the degree to which agents care about social balance in their relationships. We show that the global structure of relationship configuration converges to a unique stationary distribution. Using this stationary distribution, we characterize the maximum likelihood estimator of the social temperature parameter. Since the estimator is computationally challenging to calculate from real social network datasets, we provide a simple simulation algorithm and verify its performance with real social network datasets.
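The edge-flip dynamics described in this abstract can be sketched as a simple Metropolis-style simulation on a signed complete graph. This is a minimal illustration under stated assumptions: the logit update rule, the agent count, and all parameter values below are illustrative choices, not the authors' model or estimator.

```python
import itertools, math, random

def imbalance_change(signs, i, j, n):
    """Change in the number of imbalanced triads if edge (i, j) flips.
    A triad is imbalanced (in Heider's sense) when the product of its
    three edge signs is -1; flipping one edge negates every product
    that edge participates in."""
    delta = 0
    for k in range(n):
        if k in (i, j):
            continue
        product = (signs[i, j]
                   * signs[min(i, k), max(i, k)]
                   * signs[min(j, k), max(j, k)])
        delta += 1 if product == 1 else -1  # balanced triads would become imbalanced
    return delta

def simulate(n=10, temperature=0.5, steps=20_000, seed=0):
    """Signed complete graph on n agents; each step one edge may flip,
    with imbalance-reducing flips favoured more strongly as the
    (hypothetical) social temperature falls."""
    random.seed(seed)
    signs = {(i, j): random.choice([1, -1])
             for i, j in itertools.combinations(range(n), 2)}
    for _ in range(steps):
        i, j = sorted(random.sample(range(n), 2))
        d = imbalance_change(signs, i, j, n)
        # logit acceptance: flip probability decreases with the imbalance added
        if random.random() < 1.0 / (1.0 + math.exp(d / temperature)):
            signs[i, j] *= -1
    return signs
```

At low temperature the network settles into configurations with far fewer imbalanced triads than a random signing, consistent with the convergence to a stationary distribution described above.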

  8. Models for regionalizing economic data and their applications within the scope of forensic disaster analyses

    Science.gov (United States)

    Schmidt, Hanns-Maximilian; Wiens, Marcus, Dr. rer. pol.; Schultmann, Frank, Prof. Dr. rer. pol.

    2015-04-01

    The impact of natural hazards on the economic system can be observed in many different regions all over the world. Once the local economic structure is hit by an event direct costs instantly occur. However, the disturbance on a local level (e.g. parts of city or industries along a river bank) might also cause monetary damages in other, indirectly affected sectors. If the impact of an event is strong, these damages are likely to cascade and spread even on an international scale (e.g. the eruption of Eyjafjallajökull and its impact on the automotive sector in Europe). In order to determine these special impacts, one has to gain insights into the directly hit economic structure before being able to calculate these side effects. Especially, regarding the development of a model used for near real-time forensic disaster analyses any simulation needs to be based on data that is rapidly available or easily to be computed. Therefore, we investigated commonly used or recently discussed methodologies for regionalizing economic data. Surprisingly, even for German federal states there is no official input-output data available that can be used, although it might provide detailed figures concerning economic interrelations between different industry sectors. In the case of highly developed countries, such as Germany, we focus on models for regionalizing nationwide input-output table which is usually available at the national statistical offices. However, when it comes to developing countries (e.g. South-East Asia) the data quality and availability is usually much poorer. In this case, other sources need to be found for the proper assessment of regional economic performance. We developed an indicator-based model that can fill this gap because of its flexibility regarding the level of aggregation and the composability of different input parameters. 
Our poster presentation provides a literature review and a summary of potential models that appear useful for this specific task.

  9. Anticipating the uncertain: economic modeling and climate change policy

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, Svenn

    2012-11-01

    With this thesis I wish to contribute to the understanding of how uncertainty and the anticipation of future events by economic actors affect climate policies. The thesis consists of four papers. Two papers are analytical models which explicitly consider that emissions are caused by extracting scarce fossil fuels which in the future must be replaced by clean technologies. The other two are so-called numerical integrated assessment models. Such models represent the world economy, the climate system and the interactions between these two quantitatively, complementing more abstract theoretical work. Should policy makers discriminate between subsidizing renewable energy sources such as wind or solar power, and technologies such as carbon capture and storage (CCS)? Focusing only on the dynamic supply of fossil fuels and hence CO2, we find here that cheaper future renewables cause extraction to speed up, while lower costs of CCS may delay it. CCS hence may dampen the dynamic inefficiency caused by the absence of comprehensive climate policies today. Does it matter whether uncertainty about future damage assessment is due to scientific complexities or stems from the political process? In paper two, I find that political and scientific uncertainties have opposing effects on the incentives to invest in renewables and the extraction of fossil fuels: the prospect of scientific learning about the climate system increases investment incentives and, ceteris paribus, slows extraction down; uncertainty about future political constellations does the opposite. The optimal carbon tax under scientific uncertainty equals expected marginal damages, whereas political uncertainty demands a tax below marginal damages that decreases over time. Does uncertainty about economic growth impact optimal climate policy today? Here we are the first to consistently analyze how uncertainty about future economic growth affects optimal emission reductions and the optimal social cost of carbon. We

  10. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  11. Evolution of Money Distribution in a Simple Economic Model

    Science.gov (United States)

    Liang, X. San; Carter, Thomas J.

    An analytical approach is utilized to study the money evolution in a simple agent-based economic model, where every agent randomly selects someone else and gives the target one dollar unless he runs out of money. (No one is allowed to go into debt.) If originally no agent is in poverty, for most of the time the economy is found to be dominated by a Gaussian money distribution, with a fixed mean and an increasing variance proportional to time. This structure begins to drift toward the left when the tail of the Gaussian hits the left boundary, and the drift becomes faster and faster, until a steady state is reached. The steady state generally follows the Boltzmann-Gibbs distribution, except for the points around the origin. Our result shows that the pdf for the utterly destitute is only half of that predicted by the Boltzmann solution. An implication of this is that the economic structure may be improved by manipulating transaction rules.
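The transaction rule in this abstract is simple enough to simulate directly. A minimal sketch follows; the agent count, initial endowment, and step count are illustrative choices, not values from the paper.

```python
import random

def simulate(num_agents=500, endowment=10, steps=200_000, seed=1):
    """Each step, a randomly chosen agent gives one dollar to another
    randomly chosen agent, unless the giver has no money (no agent is
    allowed to go into debt)."""
    random.seed(seed)
    money = [endowment] * num_agents
    for _ in range(steps):
        giver = random.randrange(num_agents)
        if money[giver] == 0:
            continue  # the no-debt rule: broke agents skip their turn
        receiver = random.randrange(num_agents)
        if receiver != giver:
            money[giver] -= 1
            money[receiver] += 1
    return money

wealth = simulate()
assert sum(wealth) == 500 * 10   # total money is conserved by construction
assert min(wealth) >= 0          # the left boundary at the origin
```

In the long run the histogram of `wealth` approaches the Boltzmann-Gibbs form, with an effective temperature equal to the mean money per agent, matching the steady state the authors describe.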

  12. Computational modeling of ion transport through nanopores

    Science.gov (United States)

    Modi, Niraj; Winterhalter, Mathias; Kleinekathöfer, Ulrich

    2012-09-01

    Nanoscale pores are ubiquitous in biological systems while artificial nanopores are being fabricated for an increasing number of applications. Biological pores are responsible for the transport of various ions and substrates between the different compartments of biological systems separated by membranes while artificial pores are aimed at emulating such transport properties. As an experimental method, electrophysiology has proven to be an important nano-analytical tool for the study of substrate transport through nanopores utilizing ion current measurements as a probe for the detection. Independent of the pore type, i.e., biological or synthetic, and objective of the study, i.e., to model cellular processes of ion transport or electrophysiological experiments, it has become increasingly important to understand the dynamics of ions in nanoscale confinements. To this end, numerical simulations have established themselves as an indispensable tool to decipher ion transport processes through biological as well as artificial nanopores. This article provides an overview of different theoretical and computational methods to study ion transport in general and to calculate ion conductance in particular. Potential new improvements in the existing methods and their applications are highlighted wherever applicable. Moreover, representative examples are given describing the ion transport through biological and synthetic nanopores as well as the high selectivity of ion channels. Special emphasis is placed on the usage of molecular dynamics simulations which already have demonstrated their potential to unravel ion transport properties at an atomic level.

  13. Agricultural climate impacts assessment for economic modeling and decision support

    Science.gov (United States)

    Thomson, A. M.; Izaurralde, R. C.; Beach, R.; Zhang, X.; Zhao, K.; Monier, E.

    2013-12-01

    A range of approaches can be used in the application of climate change projections to agricultural impacts assessment. Climate projections can be used directly to drive crop models, which in turn can be used to provide inputs for agricultural economic or integrated assessment models. These model applications, and the transfer of information between models, must be guided by the state of the science. But the methodology must also account for the specific needs of stakeholders and the intended use of model results beyond pure scientific inquiry, including meeting the requirements of agencies responsible for designing and assessing policies, programs, and regulations. Here we present methodology and results of two climate impacts studies that applied climate model projections from CMIP3 and from the EPA Climate Impacts and Risk Analysis (CIRA) project in a crop model (EPIC - Environmental Policy Indicator Climate) in order to generate estimates of changes in crop productivity for use in an agricultural economic model for the United States (FASOM - Forest and Agricultural Sector Optimization Model). The FASOM model is a forward-looking dynamic model of the US forest and agricultural sector used to assess market responses to changing productivity of alternative land uses. The first study, focused on climate change impacts on the USDA crop insurance program, was designed to use available daily climate projections from the CMIP3 archive. The decision to focus on daily data for this application limited the climate model and time period selection significantly; however, for the intended purpose of assessing impacts on crop insurance payments, consideration of extreme event frequency was critical for assessing periodic crop failures. In a second, coordinated impacts study designed to assess the relative difference in climate impacts under a no-mitigation policy and different future climate mitigation scenarios, the stakeholder specifically requested an assessment of a

  14. Analytical-diagnostic and computing technologies for the attribution, authentication and economic evaluation of art works

    Directory of Open Access Journals (Sweden)

    Salvatore Lorusso

    2014-12-01

    Full Text Available This study addresses the formulation of the economic value of art works in connection with their attribution and authentication. Through this process, which also includes regulatory issues, it follows that the current problem in the field of forgery and, in general, inauthentic works is precisely that of ascertaining, by means of scientific methods, whether a work of art is authentic. With this aim in mind, the following research project has been proposed, which aims to develop a suitable methodological path. It includes a cognitive phase involving aesthetic, stylistic, iconographic, and historical analysis, and concludes with a technical-experimental phase in which diagnostic-analytical and computer technologies are employed. On completion of the research, the combined diagnostic and analytical results, together with software that identifies the style of a particular artist, will provide a valuable contribution to the various and complex issues relating to the attribution and authentication of art works.

  15. Bio-economic modeling of water quality improvements using a dynamic applied general equilibrium approach

    NARCIS (Netherlands)

    Dellink, R.; Brouwer, R.; Linderhof, V.G.M.; Stone, K.

    2011-01-01

    An integrated bio-economic model is developed to assess the impacts of pollution reduction policies on water quality and the economy. Emission levels of economic activities to water are determined based on existing environmental accounts. These emission levels are built into a dynamic economic model

  16. Ecological theories and indicators in economic models of biodiversity loss and conservation: a critical review.

    NARCIS (Netherlands)

    Eppink, F.V.; van den Bergh, J.C.J.M.

    2007-01-01

    We evaluate how well environmental-economic models describe biodiversity loss and conservation issues. Four types of economic models turn out to dominate economic research into biodiversity conservation. For each of these, we assess the extent to which they integrate relevant ecological theories and

  18. A spatial-dynamic value transfer model of economic losses from a biological invasion

    Science.gov (United States)

    Thomas P. Holmes; Andrew M. Liebhold; Kent F. Kovacs; Betsy. Von Holle

    2010-01-01

    Rigorous assessments of the economic impacts of introduced species at broad spatial scales are required to provide credible information to policy makers. We propose that economic models of aggregate damages induced by biological invasions need to link microeconomic analyses of site-specific economic damages with spatial-dynamic models of value change associated with...

  19. Pollution and economic growth in a model of overlapping generations

    Energy Technology Data Exchange (ETDEWEB)

    Fisher, Eric O'N. [Department of Economics, The Ohio State University, Columbus, OH (United States); Van Marrewijk, Charles [Department of Economics, Erasmus University, Rotterdam (Netherlands)

    1994-01-22

    We analyze a model of overlapping generations in which clean air, a pure public consumption good, is used as a private input into production. Although production exhibits constant returns to scale, endogenous growth can occur because the economy has two sectors. In a laissez-faire equilibrium, there is no market for pollution rights, and firms appropriate clean air in an arbitrary manner. Growth occurs only if the marginal propensity to save is high enough and the asymptotic share of pollution in the investment sector is zero. Firms generate quasi-rents that are the value of pollution rights. These quasi-rents crowd out investment and slow economic growth. A laissez-faire equilibrium may not support Pareto optimal allocations, but a Pigouvian tax with lump-sum distribution of the resulting revenues does. Hence, a pollution tax yields a double dividend because it can increase both the static efficiency of the economy and its growth rate. 1 fig., 20 refs.

  20. Computational neurorehabilitation: modeling plasticity and learning to predict recovery

    National Research Council Canada - National Science Library

    Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas

    2016-01-01

    .... Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment...

  1. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments

  2. COMPUTER MODEL FOR ORGANIC FERTILIZER EVALUATION

    Directory of Open Access Journals (Sweden)

    Zdenko Lončarić

    2009-12-01

    seedlings with highest mass and leaf area are produced using growing media with pH close to 6 and with EC lower than 2 dSm-1. It could be concluded that conductivity approx. 3 dSm-1 has an inhibitory effect on lettuce if pH is about 7 or higher. The computer model shows that raising pH and EC resulted in decreasing growth, which could be expressed as an increasing stress index. The lettuce height as a function of pH and EC is incorporated into the model as a stress function showing an increase of lettuce height by lowering EC from 4 to 1 dSm-1 or pH from 7.4 to 6. The highest growing media index (8.1) was determined for a mixture of composted pig manure and peat (1:1), and the lowest (2.3) for composted horse manure and peat (1:2).

  3. Recent advances in estimating nonlinear models with applications in economics and finance

    CERN Document Server

    Ma, Jun

    2013-01-01

    Featuring current research in economics, finance and management, this book surveys nonlinear estimation techniques and offers new methods and insights into nonlinear time series analysis. Covers Markov Switching Models for analyzing economics series and more.

  4. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of major significance for intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  5. A microbial model of economic trading and comparative advantage.

    Science.gov (United States)

    Enyeart, Peter J; Simpson, Zachary B; Ellington, Andrew D

    2015-01-07

    The economic theory of comparative advantage postulates that beneficial trading relationships can be arrived at by two self-interested entities producing the same goods as long as they have opposing relative efficiencies in producing those goods. The theory predicts that upon entering trade, in order to maximize consumption both entities will specialize in producing the good they can produce at higher efficiency, that the weaker entity will specialize more completely than the stronger entity, and that both will be able to consume more goods as a result of trade than either would be able to alone. We extend this theory to the realm of unicellular organisms by developing mathematical models of genetic circuits that allow trading of a common good (specifically, signaling molecules) required for growth in bacteria in order to demonstrate comparative advantage interactions. In Conception 1, the experimenter controls production rates via exogenous inducers, allowing exploration of the parameter space of specialization. In Conception 2, the circuits self-regulate via feedback mechanisms. Our models indicate that these genetic circuits can demonstrate comparative advantage, and that cooperation in such a manner is particularly favored under stringent external conditions and when the cost of production is not overly high. Further work could involve implementing the models in living bacteria and searching for naturally occurring cooperative relationships between bacteria that conform to the principles of comparative advantage. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
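The core prediction stated above, that both parties consume more under specialization and trade than in autarky, can be checked with a toy Ricardian example. The production rates and the even output split below are hypothetical illustrations, not the paper's circuit parameters.

```python
def autarky(rate_good1, rate_good2, effort=1.0):
    """Alone, an entity splits its effort 50/50 between the two goods."""
    return (0.5 * effort * rate_good1, 0.5 * effort * rate_good2)

# Hypothetical production rates (units of good per unit effort).
# A is absolutely better at both goods, but the *relative* efficiencies
# oppose each other: A's edge is in good 1, B's is in good 2.
a_rates = (4.0, 2.0)
b_rates = (1.0, 3.0)

# Autarky: each entity produces both goods for itself.
a_alone = autarky(*a_rates)   # (2.0, 1.0)
b_alone = autarky(*b_rates)   # (0.5, 1.5)

# Trade: A specializes fully in good 1, B fully in good 2, and the
# total output is split evenly (a simplification for illustration).
total = (a_rates[0], b_rates[1])          # (4.0, 3.0)
a_trade = (total[0] / 2, total[1] / 2)    # (2.0, 1.5)
b_trade = (total[0] / 2, total[1] / 2)    # (2.0, 1.5)

# Both entities consume at least as much of each good as under autarky.
assert all(t >= s for t, s in zip(a_trade, a_alone))
assert all(t >= s for t, s in zip(b_trade, b_alone))
```

Under these numbers, trade leaves A with more of good 2 and B with more of good 1 than either could produce alone, which is the gain from trade the abstract refers to.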

  6. Incorporating transportation network modeling tools within transportation economic impact studies of disasters

    Directory of Open Access Journals (Sweden)

    Yi Wen

    2014-08-01

    Full Text Available Transportation system disruption due to a disaster results in "ripple effects" throughout the entire transportation system of a metropolitan region. Many researchers have focused on the economic costs of transportation system disruptions in transportation-related industries, specifically within commerce and logistics, in the assessment of the regional economic costs. However, the foundation of an assessment of the regional economic costs of a disaster needs to include the evaluation of consumer surplus in addition to the direct cost for reconstruction of the regional transportation system. The objective of this study is to propose a method to estimate the regional consumer surplus based on indirect economic costs of a disaster on intermodal transportation systems in the context of diverting vehicles and trains. The computational methods used to assess the regional indirect economic costs sustained by the highway and railroad system can utilize readily available state departments of transportation (DOTs) and metropolitan planning organizations (MPOs) traffic models, allowing prioritization of regional recovery plans after a disaster and strengthening of infrastructure before a disaster. Hurricane Katrina is one of the most devastating hurricanes in the history of the United States. Due to the significance of Hurricane Katrina, a case study is presented to evaluate consumer surplus in the Gulf Coast Region of Mississippi. Results from the case study indicate the costs of rerouting and congestion delays in the regional highway system and the rent costs of right-of-way in the regional railroad system are major factors of the indirect costs in the consumer surplus.

  7. Soft Computing Models in Industrial and Environmental Applications

    CERN Document Server

    Abraham, Ajith; Corchado, Emilio; 7th International Conference, SOCO’12

    2013-01-01

    This volume of Advances in Intelligent and Soft Computing contains accepted papers presented at SOCO 2012, held in the beautiful and historic city of Ostrava (Czech Republic), in September 2012.   Soft computing represents a collection or set of computational techniques in machine learning, computer science and some engineering disciplines, which investigate, simulate, and analyze very complex issues and phenomena.   After a thorough peer-review process, the SOCO 2012 International Program Committee selected 75 papers which are published in these conference proceedings, representing an acceptance rate of 38%. In this edition a special emphasis was put on the organization of special sessions. Three special sessions were organized on relevant topics: Soft Computing Models for Control Theory & Applications in Electrical Engineering, Soft Computing Models for Biomedical Signals and Data Processing, and Advanced Soft Computing Methods in Computer Vision and Data Processing.   The selecti...

  8. Ablative Rocket Deflector Testing and Computational Modeling

    Science.gov (United States)

    Allgood, Daniel C.; Lott, Jeffrey W.; Raines, Nickey

    2010-01-01

    A deflector risk mitigation program was recently conducted at the NASA Stennis Space Center. The primary objective was to develop a database that characterizes the behavior of industry-grade refractory materials subjected to rocket plume impingement conditions commonly experienced on static test stands. The program consisted of short and long duration engine tests where the supersonic exhaust flow from the engine impinged on an ablative panel. Quasi time-dependent erosion depths and patterns generated by the plume impingement were recorded for a variety of different ablative materials. The erosion behavior was found to be highly dependent on the material's composition and corresponding thermal properties. For example, in the case of the HP CAST 93Z ablative material, the erosion rate actually decreased under continued thermal heating conditions due to the formation of a low thermal conductivity "crystallization" layer. The "crystallization" layer produced near the surface of the material provided an effective insulation from the hot rocket exhaust plume. To gain further insight into the complex interaction of the plume with the ablative deflector, computational fluid dynamic modeling was performed in parallel to the ablative panel testing. The results from the current study demonstrated that locally high heating occurred due to shock reflections. These localized regions of shock-induced heat flux resulted in non-uniform erosion of the ablative panels. In turn, it was observed that the non-uniform erosion exacerbated the localized shock heating causing eventual plume separation and reversed flow for long duration tests under certain conditions. Overall, the flow simulations compared very well with the available experimental data obtained during this project.

  9. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  10. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

    in with our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory......, localisation services and many others. These technologies can be classified under the name of ubiquitous systems. The term Ubiquitous System dates back to 1991 when Mark Weiser at Xerox PARC Lab first referred to it in writing. He envisioned a future where computing technologies would have been melted...

  11. The emerging role of cloud computing in molecular modelling.

    Science.gov (United States)

    Ebejer, Jean-Paul; Fulle, Simone; Morris, Garrett M; Finn, Paul W

    2013-07-01

    There is a growing recognition of the importance of cloud computing for large-scale and data-intensive applications. The distinguishing features of cloud computing and their relationship to other distributed computing paradigms are described, as are the strengths and weaknesses of the approach. We review the use made to date of cloud computing for molecular modelling projects and the availability of front ends for molecular modelling applications. Although the use of cloud computing technologies for molecular modelling is still in its infancy, we demonstrate its potential by presenting several case studies. Rapid growth can be expected as more applications become available and costs continue to fall; cloud computing can make a major contribution not just in terms of the availability of on-demand computing power, but could also spur innovation in the development of novel approaches that utilize that capacity in more effective ways. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  13. BIOREACTOR ECONOMICS, SIZE AND TIME OF OPERATION (BEST) COMPUTER SIMULATOR FOR DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS

    Science.gov (United States)

    BEST (bioreactor economics, size and time of operation) is an Excel™ spreadsheet-based model that is used in conjunction with the public domain geochemical modeling software, PHREEQCI. The BEST model is used in the design process of sulfate-reducing bacteria (SRB) field bioreacto...

  14. Economics of End-of-Life Materials Recovery: A Study of Small Appliances and Computer Devices in Portugal.

    Science.gov (United States)

    Ford, Patrick; Santos, Eduardo; Ferrão, Paulo; Margarido, Fernanda; Van Vliet, Krystyn J; Olivetti, Elsa

    2016-05-03

    The challenges brought on by the increasing complexity of electronic products, and the criticality of the materials these devices contain, present an opportunity for maximizing the economic and societal benefits derived from recovery and recycling. Small appliances and computer devices (SACD), including mobile phones, contain significant amounts of precious metals including gold and platinum, the present value of which should serve as a key economic driver for many recycling decisions. However, a detailed analysis is required to estimate the economic value that is unrealized by incomplete recovery of these and other materials, and to ascertain how such value could be reinvested to improve recovery processes. We present a dynamic product flow analysis for SACD throughout Portugal, a European Union member, including annual data detailing product sales and industrial-scale preprocessing data for recovery of specific materials from devices. We employ preprocessing facility and metals pricing data to identify losses, and develop an economic framework around the value of recycling including uncertainty. We show that significant economic losses occur during preprocessing (over $70 M USD unrecovered in computers and mobile phones, 2006-2014) due to operations that fail to target high value materials, and characterize preprocessing operations according to material recovery and total costs.
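
    The paper's own data and framework are not reproduced in this record; as a hedged sketch of the unrecovered-value idea it describes, the value lost in preprocessing can be approximated as mass flow times material concentration times market price times (1 - recovery rate). All figures and names below are invented for illustration:

```python
# Hedged sketch (not the paper's data or model): estimating the market
# value of a precious metal left unrecovered by preprocessing.

def unrecovered_value(device_mass_kg, ppm, price_per_g, recovery_rate):
    """Value of contained metal that preprocessing fails to recover."""
    grams_contained = device_mass_kg * 1000.0 * ppm / 1e6  # ppm by mass
    return grams_contained * price_per_g * (1.0 - recovery_rate)

# Illustrative example: gold in 1,000 t of mobile phones at 300 ppm,
# priced at $60/g, with 70% recovered during preprocessing
loss = unrecovered_value(1_000_000, 300, 60.0, 0.70)
print(round(loss))  # -> 5400000 (dollars of gold value unrecovered)
```

    Summing such terms over materials, device categories, and years is what produces the kind of aggregate loss figure the abstract reports.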

  15. Integrated Multiscale Modeling of Molecular Computing Devices

    Energy Technology Data Exchange (ETDEWEB)

    Jerzy Bernholc

    2011-02-03

    Photolithography will someday reach a miniaturization limit, forcing designers of Si-based electronics to pursue increased performance by other means. Any alternative approach would have the unenviable task of matching the ability of Si technology to pack more than a billion interconnected and addressable devices on a chip the size of a thumbnail. Nevertheless, the prospect of developing alternative approaches to fabricating electronic devices has spurred an ever-increasing pace of fundamental research. One promising possibility is molecular electronics (ME): self-assembled molecular-based electronic systems composed of single-molecule devices in ultra-dense, ultra-fast molecular-sized components. This project focused on developing accurate, reliable theoretical modeling capabilities for describing molecular electronics devices. The participants in the project are given in Table 1. The primary outcomes of this fundamental computational science grant are publications in the open scientific literature. As listed below, 62 papers have been published from this project. In addition, the research has been the subject of more than 100 invited talks at conferences, including several plenary or keynote lectures. Many of the goals of the original proposal were completed. Specifically, the multidisciplinary group developed a unique set of capabilities and tools for investigating electron transport in fabricated and self-assembled nanostructures at multiple length and time scales.

  16. Public sector administration of ecological economics systems using mediated modeling.

    Science.gov (United States)

    van den Belt, Marjan; Kenyan, Jennifer R; Krueger, Elizabeth; Maynard, Alison; Roy, Matthew Galen; Raphael, Ian

    2010-01-01

    In today's climate of government outsourcing and multiple stakeholder involvement in public sector management and service delivery, it is more important than ever to rethink and redesign the structure of how policy decisions are made, implemented, monitored, and adapted to new realities. The traditional command-and-control approach is now less effective because an increasing amount of responsibility to deliver public goods and services falls on networks of nongovernment agencies. Even though public administrators are seeking new decision-making models in an increasingly more complex environment, the public sector currently only sparsely utilizes Mediated Modeling (MM). There is growing evidence, however, that by employing MM and similar tools, public interest networks can be better equipped to deal with their long-term viability while maintaining the short-term needs of their clients. However, it may require a shift in organizational culture within and between organizations to achieve the desired results. This paper explores the successes and barriers to implementing MM and similar tools in the public sector and offers insights into utilizing them through a review of case studies and interdisciplinary literature. We aim to raise a broader interest in MM and similar tools among public sector administrators at various administrative levels. We focus primarily, but not exclusively, on those cases operating at the interface of ecology and socio-economic systems.

  17. On the capabilities and computational costs of neuron models.

    Science.gov (United States)

    Skocik, Michael J; Long, Lyle N

    2014-08-01

    We review the Hodgkin-Huxley, Izhikevich, and leaky integrate-and-fire neuron models in regular spiking modes, solved with the forward Euler, fourth-order Runge-Kutta, and exponential Euler methods, and determine the time steps and corresponding computational costs required to make the solutions accurate. We conclude that the leaky integrate-and-fire model requires the fewest computations, and that the Hodgkin-Huxley and Izhikevich models are comparable in computational cost.
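
    As a hedged illustration of the cheapest of the three models, here is a minimal leaky integrate-and-fire neuron stepped with forward Euler. All parameter values are invented for illustration, not taken from the paper:

```python
# Sketch: leaky integrate-and-fire (LIF) neuron, forward Euler time-stepping.
# One subtraction, one multiply-add, and one comparison per step, which is
# why LIF is the cheapest of the models compared above.

def simulate_lif(i_ext=1.5, dt=0.1, t_max=100.0,
                 tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Return spike times (ms) for a constant input current (illustrative units)."""
    v = v_rest
    spikes = []
    for n in range(int(t_max / dt)):
        # Forward Euler update of dv/dt = (-(v - v_rest) + i_ext) / tau
        v += dt * (-(v - v_rest) + i_ext) / tau
        if v >= v_thresh:          # threshold crossing: record spike, reset
            spikes.append(n * dt)
            v = v_reset
    return spikes

print(len(simulate_lif()))  # number of spikes in 100 ms of simulated time
```

    Shrinking `dt` brings the Euler solution closer to the exact interspike interval, at proportionally higher cost; that accuracy-versus-cost trade-off is exactly what the paper quantifies across solvers and models.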

  18. Global Stability of an Epidemic Model of Computer Virus

    OpenAIRE

    Yang, Xiaofan; Liu, Bei; Gan, Chenquan

    2014-01-01

    With the rapid popularization of the Internet, computers can enter or leave the Internet increasingly frequently. In fact, no antivirus software can detect and remove all sorts of computer viruses. This implies that viruses would persist on the Internet. To better understand the spread of computer viruses in these situations, a new propagation model is established and analyzed. The unique equilibrium of the model is globally asymptotically stable, in accordance with the reality. A parameter a...
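
    The record does not reproduce the model's equations; as a hedged stand-in, a generic SIS-style compartment model shows how a globally stable endemic equilibrium of persistent infection can arise. The parameters and functional form are illustrative assumptions, not the paper's model:

```python
# Sketch of a generic SIS virus-propagation model integrated with forward
# Euler: i is the fraction of infected computers, beta the infection rate,
# gamma the cure rate. Not the model from the paper.

def simulate_sis(beta=0.3, gamma=0.1, i0=0.01, dt=0.01, t_max=200.0):
    """Integrate di/dt = beta*i*(1 - i) - gamma*i and return the final i."""
    i = i0
    for _ in range(int(t_max / dt)):
        i += dt * (beta * i * (1.0 - i) - gamma * i)
    return i

# With beta > gamma the infection settles at the endemic equilibrium
# i* = 1 - gamma/beta (here about 0.667), echoing the finding that
# viruses persist on the Internet despite antivirus software.
print(round(simulate_sis(), 3))
```

    When `beta < gamma` the same iteration drives the infected fraction to zero, so the sign of `beta - gamma` plays the role of the epidemic threshold in such models.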

  19. A Hydro-Economic Approach to Representing Water Resources Impacts in Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Kirshen, Paul H.; Strzepek, Kenneth, M.

    2004-01-14

    Grant Number DE-FG02-98ER62665, Office of Energy Research of the U.S. Department of Energy. Abstract: Many Integrated Assessment Models (IAMs) divide the world into a small number of highly aggregated regions. Non-OECD countries are aggregated geographically into continental and multi-continental regions, or economically by development level. Current research suggests that these large-scale aggregations cannot accurately represent potential water resources-related climate change impacts. In addition, IAMs do not explicitly model the flow regulation impacts of reservoir and groundwater systems, the economics of water supply, or the demand for water in economic activities. Using the International Model for Policy Analysis of Agricultural Commodities and Trade (IMPACT) model of the International Food Policy Research Institute (IFPRI) as a case study, this research implemented a set of methodologies to provide accurate representation of water resource climate change impacts in Integrated Assessment Models. There were also detailed examinations of key issues related to aggregated modeling, including: modeling water consumption versus water withdrawals; ground and surface water interactions; development of reservoir cost curves; modeling of surface areas of aggregated reservoirs for estimating evaporation losses; and evaluating the importance of spatial scale in river basin modeling. The major findings include: - Continental, national, or even large-scale river basin aggregation of water supplies and demands does not accurately capture the impacts of climate change on the water and agricultural sectors in IAMs. - Fortunately, there now exist gridded approaches (0.5 x 0.5 degrees) to model streamflows in a global analysis. The gridded approach to hydrologic modeling allows flexibility in aligning basin boundaries with national boundaries. This, combined with GIS tools, high-speed computers, and the growing availability of socio-economic gridded databases, allows assignment of

  20. Ocean Modeling and Visualization on Massively Parallel Computer

    Science.gov (United States)

    Chao, Yi; Li, P. Peggy; Wang, Ping; Katz, Daniel S.; Cheng, Benny N.

    1997-01-01

    Climate modeling is one of the grand challenges of computational science, and ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change.

  1. A Computational Model of Crowds for Collective Intelligence

    OpenAIRE

    Prpic, John; Jackson, Piper; Nguyen, Thai

    2014-01-01

    In this work, we present a high-level computational model of IT-mediated crowds for collective intelligence. We introduce the Crowd Capital perspective as an organizational-level model of collective intelligence generation from IT-mediated crowds, and specify a computational system including agents, forms of IT, and organizational knowledge.

  2. Airfoil Computations using the γ - Reθ Model

    DEFF Research Database (Denmark)

    Sørensen, Niels N.

    computations. Based on this, an estimate of the error in the computations is determined to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64- 018, NACA64-218, NACA64-418 and NACA64-618 and the results...

  3. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    Full Text Available In this paper, we present the methodology for the introduction to scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.

  4. Using Computational Simulations to Confront Students' Mental Models

    Science.gov (United States)

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  5. Generation of river discharge using water balance computer model ...

    African Journals Online (AJOL)

    The paper presents a study on river discharge generation using a water balance computer model. The results show that the computer program designed gave a good prediction of the recorded discharge within a 95% confidence interval. The model is therefore recommended for other catchments with ...

  6. Overview of ASC Capability Computing System Governance Model

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott W. [Los Alamos National Laboratory

    2012-07-11

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  7. Three-dimensional computer modeling of slag cement hydration

    NARCIS (Netherlands)

    Chen, Wei; Brouwers, Jos; Shui, Z.H.

    2007-01-01

    A newly developed version of a three-dimensional computer model for simulating the hydration and microstructure development of slag cement pastes is presented in this study. It is based on a 3-D computer model for Portland cement hydration (CEMHYD3D) which was originally developed at NIST, taken

  8. Modeling the Economic Impacts of Large Deployments on Local Communities

    Science.gov (United States)

    2008-12-01

  9. The Economics of Presenteeism: A discrete choice & count model framework

    OpenAIRE

    Pedersen, Kjeld Møller; Skagen, Kristian

    2014-01-01

    There are three levels in this paper: A search for economic theories about presenteeism, a search for appropriate econometric approaches, and finally empirical results based on a unique Danish cross sectional data set. There are two economic approaches to presenteeism: 1. Productivity losses and 2. labor supply. The first is part of the indirect cost component in cost-of-illness studies and economic evaluation. There are two core questions in the productivity loss literature: Measurement of p...

  10. An integrated economic model of multiple types and uses of water

    Science.gov (United States)

    Luckmann, Jonas; Grethe, Harald; McDonald, Scott; Orlov, Anton; Siddig, Khalid

    2014-05-01

    Water scarcity is an increasing problem in many parts of the world, and the management of water has become an important issue on the political economy agenda in many countries. As water is used in most economic activities and the allocation of water is often a complex problem involving different economic agents and sectors, Computable General Equilibrium (CGE) models have proven useful for analyzing water allocation problems, although their adaptation to include water is still relatively undeveloped. This paper provides a description of an integrated water-focused CGE model (STAGE_W) that includes multiple types and uses of water and, for the first time, the reclamation of wastewater as well as the provision of brackish groundwater as separate, independent activities with specific cost structures. The insights provided by the model are illustrated with an application to the Israeli water sector, assuming that freshwater resources available to the economy are cut by 50%. We analyze how the Israeli economy copes with this shock if it reduces potable water supply compared with further investments in the desalination sector. The results demonstrate that the effects on the economy are slightly negative under both scenarios. Counterintuitively, the provision of additional potable water to the economy through desalination does not substantively reduce the negative outcomes. This is mainly due to the high costs of desalination, which are currently subsidized, with the distribution of the negative welfare effect over household groups dependent on how these subsidies are financed.

  11. Computer System for the Prognosis of Physical and Economic Indicators of Production and Services

    Directory of Open Access Journals (Sweden)

    Elio David Zaldívar-Linares

    2015-12-01

    Full Text Available Early prognosis of physical and economic indicators is very important for an enterprise. The application of response functions through multiple linear regression analysis contributes to their better determination and to the refinement of planning, which constitutes a considerable benefit to the country. This work describes the preparation of a computer system that overcomes the disadvantages of professional programs by allowing greater automation of the calculation of estimates in the business environment. The system, SICEP1, lets the user build, solve, analyze, and apply regression models, and has been validated against the professional Statistical Package for the Social Sciences (SPSS). The system has been used in selected production entities of Santiago de Cuba province during the 2010-2011 harvest, favorably impacting the precision of planning and producing an 11% reduction in production costs.
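
    The abstract describes fitting response functions by multiple linear regression; a minimal sketch of that core step using ordinary least squares follows. The data are invented and this is not the SICEP1 code:

```python
import numpy as np

# Hedged sketch: fit a response function y ~ b0 + b1*x1 + b2*x2 by
# ordinary least squares, the core computation behind regression-based
# prognosis of production indicators. Data values are illustrative.

def fit_response_function(X, y):
    """Return OLS coefficients (intercept first) for y = b0 + X @ b."""
    A = np.column_stack([np.ones(len(X)), X])       # prepend intercept column
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares solve
    return coeffs

# Example: an output indicator driven by two planning indicators
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0], [5.0, 5.0]])
y = 2.0 + 0.5 * X[:, 0] + 1.5 * X[:, 1]             # exact linear relation
print(np.round(fit_response_function(X, y), 3))     # -> [2.    0.5   1.5 ]
```

    A production system like the one described would add diagnostics (residual analysis, confidence intervals) around this fit before the coefficients are used for planning.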

  12. Computational intelligence applications in modeling and control

    CERN Document Server

    Vaidyanathan, Sundarapandian

    2015-01-01

    The development of computational intelligence (CI) systems was inspired by observable and imitable aspects of the intelligent activity of human beings and nature. The essence of systems based on computational intelligence is to process and interpret data of various natures, so CI is strictly connected with the increase of available data as well as the capabilities for processing them, mutually supportive factors. Developed theories of computational intelligence were quickly applied in many fields of engineering, data analysis, forecasting, biomedicine and others. They are used in image and sound processing and identification, signal processing, multidimensional data visualization, steering of objects, analysis of lexicographic data, request systems in banking, diagnostic systems, expert systems and many other practical implementations. This book consists of 16 contributed chapters by subject experts who are specialized in the various topics addressed in this book. The special chapters have been brought ...

  13. Implementing and assessing computational modeling in introductory mechanics

    CERN Document Server

    Caballero, Marcos D; Schatz, Michael F

    2011-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term 1357 students in this course solved a suite of fourteen computational modeling homework questions delivered using an online commercial course management system. Their proficiency with computational modeling was evaluated in a proctored environment using a novel central force problem. The majority of students (60.4%) successfully completed the evaluation. Analysis of erroneous student-submitted programs indicated that a small set of student errors explained why most programs failed. We discuss the design and implementation of the computational modeling homework and evaluation, the results from the evaluation and the implications for instruction in computational modeling in introductory STEM courses.

  14. A computational model of the human hand 93-ERI-053

    Energy Technology Data Exchange (ETDEWEB)

    Hollerbach, K.; Axelrod, T.

    1996-03-01

    The objectives of the Computational Hand Modeling project were to prove the feasibility of applying the Laboratory's NIKE3D finite element code to orthopaedic problems. Because of the great complexity of anatomical structures and the nonlinearity of their behavior, we have focused on a subset of joints of the hand and lower extremity and have developed algorithms to model their behavior. The algorithms developed here solve fundamental problems in computational biomechanics and can be expanded to describe any other joints of the human body. This kind of computational modeling has never successfully been attempted before, due in part to a lack of biomaterials data and a lack of computational resources. With the computational resources available at the National Laboratories and the collaborative relationships we have established with experimental and other modeling laboratories, we have been in a position to pursue our innovative approach to biomechanical and orthopedic modeling.

  15. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with application in several fields of engineering, like automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, on a multidisciplinary approach. Authors from five countries and 16 different research centers contribute with their expertise in both the fundamentals and real problems applications based upon their strong background on modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  16. Quantitative Model for Economic Analyses of Information Security Investment in an Enterprise Information System

    Directory of Open Access Journals (Sweden)

    Bojanc Rok

    2012-11-01

    Full Text Available The paper presents a mathematical model for the optimal security-technology investment evaluation and decision-making processes based on the quantitative analysis of security risks and digital asset assessments in an enterprise. The model makes use of the quantitative analysis of different security measures that counteract individual risks by identifying the information system processes in an enterprise and the potential threats. The model comprises the target security levels for all identified business processes and the probability of a security accident together with the possible loss the enterprise may suffer. The selection of security technology is based on the efficiency of selected security measures. Economic metrics are applied for the efficiency assessment and comparative analysis of different protection technologies. Unlike the existing models for evaluation of the security investment, the proposed model allows direct comparison and quantitative assessment of different security measures. The model allows deep analyses and computations providing quantitative assessments of different options for investments, which translate into recommendations facilitating the selection of the best solution and the decision-making thereof. The model was tested using empirical examples with data from real business environment.
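
    The paper's exact model is not reproduced in this record. A common quantitative pattern it resembles is comparing security measures by annualized loss expectancy (ALE) and return on security investment (ROSI); the following hedged sketch uses invented figures and measure names:

```python
# Hedged sketch (not the paper's model): ranking security measures by
# return on security investment. All figures are invented for illustration.

def rosi(ale_before, ale_after, annual_cost):
    """ROSI = (risk reduction - cost) / cost."""
    risk_reduction = ale_before - ale_after
    return (risk_reduction - annual_cost) / annual_cost

# ALE = annual rate of occurrence * single loss expectancy
ale_no_control = 0.4 * 250_000        # expected loss of $100,000 per year
measures = {
    "firewall upgrade": {"ale_after": 40_000, "cost": 15_000},
    "staff training":   {"ale_after": 70_000, "cost": 5_000},
}
for name, m in measures.items():
    print(name, round(rosi(ale_no_control, m["ale_after"], m["cost"]), 2))
# firewall upgrade -> 3.0, staff training -> 5.0
```

    A direct, quantitative comparison of this kind, extended with target security levels per business process, is what the proposed model contributes over purely qualitative evaluations.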

  17. A Simple Model of Entrepreneurship for Principles of Economics Courses

    Science.gov (United States)

    Gunter, Frank R.

    2012-01-01

    The critical roles of entrepreneurs in creating, operating, and destroying markets, as well as their importance in driving long-term economic growth are still generally either absent from principles of economics texts or relegated to later chapters. The primary difficulties in explaining entrepreneurship at the principles level are the lack of a…

  18. A Panel Threshold Model of Tourism Specialization and Economic Development

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); T. Khamkaew (Tanchanok); M.J. McAleer (Michael)

    2009-01-01

    The significant impact of international tourism in stimulating economic growth is especially important from a policy perspective. For this reason, the relationship between international tourism and economic growth would seem to be an interesting empirical issue. In particular, if there

  19. Modeling the Development of Vocational Competence: A Psychometric Model for Economic Domains

    Science.gov (United States)

    Klotz, Viola Katharina; Winther, Esther; Festner, Dagmar

    2015-01-01

    This article discusses the development of vocational competence through economic vocational educational training (VET) from a theoretical and psychometric perspective. Most assessment and competence models tend to adopt a state perspective toward assessments of competence and carve out different structures of competence for diverse vocational…

  20. Computational compliance criteria in water hammer modelling

    Science.gov (United States)

    Urbanowicz, Kamil

    2017-10-01

    Among the many numerical methods (finite difference, finite element, finite volume, etc.) used to solve the system of partial differential equations describing unsteady pipe flow, the method of characteristics (MOC) is the most widely appreciated. With its help, it is possible to examine the effect of numerical discretisation carried out over the pipe length. Based on the tests performed in this study, it was noticed that convergence of the calculation results occurred on a rectangular grid with the division of each pipe of the analysed system into at least 10 elements. Therefore, it is advisable to introduce computational compliance criteria (CCC), which will be responsible for optimal discretisation of the examined system. The results of this study, based on the assumption of various values of the Courant-Friedrichs-Lewy (CFL) number, also indicate that the CFL number should be equal to one for optimal computational results. Application of the CCC criterion to the authors' own and commercial computer programs based on the method of characteristics will guarantee fast simulations and the necessary computational coherence.
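
    The two recommendations above (at least 10 reaches per pipe, CFL equal to one) can be sketched as a simple grid-selection helper for an MOC solver. The function name and pipe data are illustrative assumptions, not from the paper:

```python
# Hedged sketch: choose an MOC space step and time step so that each pipe
# has at least 10 computational reaches and CFL = a*dt/dx = 1 exactly.

def moc_grid(pipe_length, wave_speed, min_reaches=10):
    """Return (number of reaches, dx, dt) for a single pipe."""
    n = min_reaches                # the study suggests n >= 10 for convergence
    dx = pipe_length / n
    dt = dx / wave_speed           # makes CFL = wave_speed * dt / dx = 1
    return n, dx, dt

# Illustrative pipe: 100 m long, pressure wave speed 1000 m/s
n, dx, dt = moc_grid(pipe_length=100.0, wave_speed=1000.0)
print(n, dx, dt)                   # -> 10 10.0 0.01
print(1000.0 * dt / dx)            # CFL -> 1.0
```

    In a multi-pipe system the shared `dt` is usually set by the most restrictive pipe, and the other pipes' reach counts are adjusted (or interpolation is introduced) to keep each CFL number at or near one, which is the computational coherence the CCC criterion aims to guarantee.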